From e856a48ea892b6ceb5d8ca52736a8732b5df6883 Mon Sep 17 00:00:00 2001
From: Jack Westwood
Date: Mon, 2 Sep 2024 13:38:10 +0100
Subject: [PATCH] Adds manage multiple clusters recipe

Fix links in similarity search recipe
---
 docs/recipes.adoc                            |   4 +-
 docs/recipes/managing_multiple_clusters.adoc | 164 +++++++++++++++++++
 docs/recipes/similarity_search.adoc          |  22 +-
 3 files changed, 178 insertions(+), 12 deletions(-)
 create mode 100644 docs/recipes/managing_multiple_clusters.adoc

diff --git a/docs/recipes.adoc b/docs/recipes.adoc
index aa86193f..056a95a2 100644
--- a/docs/recipes.adoc
+++ b/docs/recipes.adoc
@@ -5,6 +5,8 @@
 Welcome to the recipes section of the Couchbase Shell `cbsh` documentation.
 Here you can find some examples of the more complicated tasks that can be performed using `cbsh`.
 
+include::recipes/managing_multiple_clusters.adoc[]
+
 include::recipes/similarity_search.adoc[]
 
-include::recipes/useful-snippets.adoc[]
\ No newline at end of file
+include::recipes/useful-snippets.adoc[]
diff --git a/docs/recipes/managing_multiple_clusters.adoc b/docs/recipes/managing_multiple_clusters.adoc
new file mode 100644
index 00000000..31b1611a
--- /dev/null
+++ b/docs/recipes/managing_multiple_clusters.adoc
@@ -0,0 +1,164 @@
+== Managing multiple clusters
+
+CBShell is a powerful tool for interacting with fleets composed of a mix of self-managed and Capella clusters.
+
+Say we have the following four clusters registered with CBShell:
+
+```
+👤 Charlie 🏠 obligingfaronmoller in ☁️ default._default._default
+> cb-env managed
+╭───┬────────┬───────┬────────────┬───────────────┬──────────────────────┬─────────────────╮
+│ # │ active │ tls   │ identifier │ username      │ capella_organization │ project         │
+├───┼────────┼───────┼────────────┼───────────────┼──────────────────────┼─────────────────┤
+│ 0 │ false  │ true  │ systemtest │ Administrator │ my-org               │ CBShell Testing │
+│ 1 │ false  │ false │ localdev   │ Administrator │                      │                 │
+│ 2 │ false  │ true  │ prod       │ Administrator │ my-org               │ CBShell Testing │
+│ 3 │ true   │ true  │ ci         │ Administrator │ my-org               │ CBShell Testing │
+╰───┴────────┴───────┴────────────┴───────────────┴──────────────────────┴─────────────────╯
+```
+
+There is one self-managed cluster (localdev) and three Capella clusters.
+Imagine that we want to perform some general health checks on this set of clusters.
+A good starting point is the https://couchbase.sh/docs/#_listing_nodes[nodes] command with the https://couchbase.sh/docs/#_working_with_clusters[clusters] flag.
+ +[options="nowrap"] +``` +๐Ÿ‘ค Charlie ๐Ÿ  localdev in ๐Ÿ—„ travel-sample._default._default +> nodes --clusters * +โ•ญโ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ•ฎ +โ”‚ # โ”‚ cluster โ”‚ hostname โ”‚ status โ”‚ services โ”‚ version โ”‚ os โ”‚ memory_total โ”‚ memory_free โ”‚ ... โ”‚ +โ”œโ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”ค +โ”‚ 0 โ”‚ localdev โ”‚ 192.168.107.128:8091 โ”‚ healthy โ”‚ search,indexing,kv,query โ”‚ 7.6.2-3505-enterprise โ”‚ aarch64-unknown-linux-gnu โ”‚ 6201221120 โ”‚ 2841657344 โ”‚ ... โ”‚ +โ”‚ 1 โ”‚ localdev โ”‚ 192.168.107.129:8091 โ”‚ healthy โ”‚ search,indexing,kv,query โ”‚ 7.6.2-3505-enterprise โ”‚ aarch64-unknown-linux-gnu โ”‚ 6201221120 โ”‚ 2842959872 โ”‚ ... 
โ”‚ +โ”‚ 2 โ”‚ localdev โ”‚ 192.168.107.130:8091 โ”‚ healthy โ”‚ search,indexing,kv,query โ”‚ 7.6.2-3505-enterprise โ”‚ aarch64-unknown-linux-gnu โ”‚ 6201221120 โ”‚ 2843160576 โ”‚ ... โ”‚ +โ”‚ 3 โ”‚ prod โ”‚ svc-dqi-node-001.lhb4l06lajhydwmk.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16776548352 โ”‚ 15518982144 โ”‚ ... โ”‚ +โ”‚ 4 โ”‚ prod โ”‚ svc-dqi-node-002.lhb4l06lajhydwmk.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16776548352 โ”‚ 15518420992 โ”‚ ... โ”‚ +โ”‚ 5 โ”‚ prod โ”‚ svc-dqi-node-003.lhb4l06lajhydwmk.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16776544256 โ”‚ 15501099008 โ”‚ ... โ”‚ +โ”‚ 6 โ”‚ ci โ”‚ svc-dqi-node-001.fwplhqyopu9pgolq.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16277504000 โ”‚ 14538944512 โ”‚ ... โ”‚ +โ”‚ 7 โ”‚ ci โ”‚ svc-dqi-node-002.fwplhqyopu9pgolq.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16277504000 โ”‚ 14559510528 โ”‚ ... โ”‚ +โ”‚ 8 โ”‚ ci โ”‚ svc-dqi-node-003.fwplhqyopu9pgolq.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16277504000 โ”‚ 14565412864 โ”‚ ... โ”‚ +โ”‚ 9 โ”‚ systemtest โ”‚ svc-dqi-node-001.lyl8kbhzdovyqhv.cloud.couchbase.com:8091 โ”‚ healthy โ”‚ indexing,kv,query โ”‚ 7.6.2-3721-enterprise โ”‚ x86_64-pc-linux-gnu โ”‚ 16766582784 โ”‚ 15491842048 โ”‚ ... 
โ”‚ +โ•ฐโ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ•ฏ +``` + +This gives us plenty of information, but sometimes it can be a bit difficult to read. +We can make things much easier with some simple reformatting. +To focus on the free memory that each cluster has, we can https://www.nushell.sh/commands/docs/select.html[select] just the relevant columns: + +``` +๐Ÿ‘ค Charlie ๐Ÿ  localdev in ๐Ÿ—„ travel-sample._default._default +> nodes --clusters * | select cluster memory_free +โ•ญโ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ•ฎ +โ”‚ # โ”‚ cluster โ”‚ memory_free โ”‚ +โ”œโ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ค +โ”‚ 0 โ”‚ localdev โ”‚ 2841657344 โ”‚ +โ”‚ 1 โ”‚ localdev โ”‚ 2842959872 โ”‚ +โ”‚ 2 โ”‚ localdev โ”‚ 2843160576 โ”‚ +โ”‚ 3 โ”‚ prod โ”‚ 15518982144 โ”‚ +โ”‚ 4 โ”‚ prod โ”‚ 15518420992 โ”‚ +โ”‚ 5 โ”‚ prod โ”‚ 15501099008 โ”‚ +โ”‚ 6 โ”‚ ci โ”‚ 14538944512 โ”‚ +โ”‚ 7 โ”‚ ci โ”‚ 14559510528 โ”‚ +โ”‚ 8 โ”‚ ci โ”‚ 14565412864 โ”‚ +โ”‚ 9 โ”‚ systemtest โ”‚ 15491842048 โ”‚ +โ•ฐโ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ•ฏ +``` + +We could then convert the `memory_free` column from bytes to gigabytes as follows: + +[options="nowrap"] +``` +๐Ÿ‘ค Charlie ๐Ÿ  localdev in ๐Ÿ—„ travel-sample._default._default +> nodes 
--clusters * | each {|n| $n | update memory_free ($n.memory_free * 1B)} | select cluster memory_free
+╭───┬─────────────┬─────────────╮
+│ # │ cluster     │ memory_free │
+├───┼─────────────┼─────────────┤
+│ 0 │ localdev    │     2.6 GiB │
+│ 1 │ localdev    │     2.6 GiB │
+│ 2 │ localdev    │     2.6 GiB │
+│ 3 │ prod        │    14.5 GiB │
+│ 4 │ prod        │    14.5 GiB │
+│ 5 │ prod        │    14.4 GiB │
+│ 6 │ ci          │    13.5 GiB │
+│ 7 │ ci          │    13.6 GiB │
+│ 8 │ ci          │    13.6 GiB │
+│ 9 │ systemtest  │    14.4 GiB │
+╰───┴─────────────┴─────────────╯
+```
+
+We do this by iterating over each node and https://www.nushell.sh/commands/docs/update.html[updating] the value in the `memory_free` column: multiplying the current value by `1B` converts it into Nushell's built-in https://www.nushell.sh/book/types_of_data.html#file-sizes[File Size] type, which renders as a human-readable size.
+
+While it is somewhat useful to know the free memory that each cluster has, it would be more useful for our health check to know the memory utilization of each cluster.
+Unfortunately the info returned by `nodes` does not include the memory utilization; however, there are two columns that can be used to calculate it: `memory_free` and `memory_total`.
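
As an aside, the arithmetic those two columns enable is easy to sanity-check outside the shell. A minimal Python sketch of the same calculation; the node figures are illustrative placeholders, not real cluster output:

```python
# Sketch of the utilization calculation, outside cbsh.
# Placeholder figures, not real cluster output.
nodes = [
    {"cluster": "localdev", "memory_total": 6_201_221_120, "memory_free": 2_841_657_344},
    {"cluster": "prod", "memory_total": 16_776_548_352, "memory_free": 15_518_982_144},
]

for n in nodes:
    # utilization % = (total - free) / total * 100
    n["utilization"] = round(
        (n["memory_total"] - n["memory_free"]) / n["memory_total"] * 100, 2
    )

# Mirror `sort-by utilization --reverse`: highest utilization first.
nodes.sort(key=lambda n: n["utilization"], reverse=True)
print([(n["cluster"], n["utilization"]) for n in nodes])
```

The cbsh pipeline that follows performs the same calculation natively, using `insert` to add the column and `sort-by` to order the rows.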
+
+[options="nowrap"]
+```
+👤 Charlie 🏠 localdev in 🗄 travel-sample._default._default
+> nodes --clusters * | each {|n| $n | insert utilization ((($n.memory_total - $n.memory_free) / $n.memory_total) * 100 ) } | select cluster utilization | sort-by utilization --reverse
+╭───┬────────────┬─────────────╮
+│ # │ cluster    │ utilization │
+├───┼────────────┼─────────────┤
+│ 0 │ localdev   │       54.32 │
+│ 1 │ localdev   │       54.32 │
+│ 2 │ localdev   │       54.28 │
+│ 3 │ ci         │       10.71 │
+│ 4 │ ci         │       10.60 │
+│ 5 │ ci         │       10.50 │
+│ 6 │ prod       │        7.61 │
+│ 7 │ systemtest │        7.59 │
+│ 8 │ prod       │        7.52 │
+│ 9 │ prod       │        7.49 │
+╰───┴────────────┴─────────────╯
+```
+
+For https://www.nushell.sh/commands/docs/each.html[each] of the nodes we add a new column called utilization, and we calculate the percentage of memory used with:
+
+```
+(($n.memory_total - $n.memory_free) / $n.memory_total) * 100
+```
+
+Finally we https://www.nushell.sh/commands/docs/sort-by.html[sort-by] utilization in descending order.
+
+The results of such resource checks can be useful when we are deciding where to create new resources on our clusters.
+For example, imagine that we want to create a 1GB bucket on any one of our clusters.
+
+First, we could just try to create it on the active cluster:
+
+[options="nowrap"]
+```
+👤 Charlie 🏠 localdev in 🗄 travel-sample._default._default
+> buckets create BigBucket 1000
+Error:   × Unexpected status code
+   ╭─[entry #8:1:1]
+ 1 │ buckets create BigBucket 1000
+   ·                ───────┬──────
+   ·                       ╰──
+   ╰────
+  help: Unexpected status code: 400, body: {"errors":{"ramQuota":"RAM quota specified is too large to be provisioned into this cluster."},"summaries":{"ramSummary":
+        {"total":1610612736,"otherBuckets":1610612736,"nodesCount":3,"perNodeMegs":1000,"thisAlloc":3145728000,"thisUsed":0,"free":-3145728000},"hddSummary":
+        {"total":183855980544,"otherData":27159966105,"otherBuckets":418430976,"thisUsed":0,"free":156277583463}}}
+```
+
+This failed since the https://couchbase.sh/docs/#_cb_env_cluster[active cluster] doesn't have enough memory to support such a large bucket.
+We can use `nodes` to find the cluster with the most free memory and create the bucket there:
+
+[options="nowrap"]
+```
+👤 Charlie 🏠 localdev in 🗄 travel-sample._default._default
+> nodes --clusters * | sort-by memory_free --reverse | first | get cluster | buckets create BigBucket 1000 --clusters $in
+```
+
+Here we have fetched the nodes for all the registered clusters, sorted them by free memory in descending order, and taken the cluster name of the top result.
+Then we pipe that cluster name into the `buckets create` command, using `$in` to access the piped value; since no error is returned, the bucket was created successfully.
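
The selection step in that pipeline amounts to picking the node with the most free memory and reading off its cluster name. A minimal Python sketch of the equivalent logic, using placeholder figures rather than real cluster output:

```python
# Placeholder rows standing in for `nodes --clusters *` output.
nodes = [
    {"cluster": "localdev", "memory_free": 2_841_657_344},
    {"cluster": "prod", "memory_free": 15_518_982_144},
    {"cluster": "ci", "memory_free": 14_538_944_512},
]

# Equivalent of `sort-by memory_free --reverse | first | get cluster`:
# the node with the most free memory determines the target cluster.
target = max(nodes, key=lambda n: n["memory_free"])["cluster"]
print(target)
```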
+
+To double-check the success, and to see where our bucket was created, we can run:
+
+[options="nowrap"]
+```
+👤 Charlie 🏠 localdev in 🗄 travel-sample._default._default
+> buckets --clusters * | where name == "BigBucket"
+╭───┬─────────┬───────────┬───────────┬──────────┬──────────────────────┬────────────┬───────────────┬───────┬────────────╮
+│ # │ cluster │ name      │ type      │ replicas │ min_durability_level │ ram_quota  │ flush_enabled │ cloud │ max_expiry │
+├───┼─────────┼───────────┼───────────┼──────────┼──────────────────────┼────────────┼───────────────┼───────┼────────────┤
+│ 0 │ prod    │ BigBucket │ couchbase │        1 │ none                 │ 1000.0 MiB │ false         │ true  │          0 │
+╰───┴─────────┴───────────┴───────────┴──────────┴──────────────────────┴────────────┴───────────────┴───────┴────────────╯
+```
+
+
+
diff --git a/docs/recipes/similarity_search.adoc b/docs/recipes/similarity_search.adoc
index 8f202448..7e00991f 100644
--- a/docs/recipes/similarity_search.adoc
+++ b/docs/recipes/similarity_search.adoc
@@ -1,7 +1,7 @@
 == Similarity Search
 
-The <<_vector_commands, vector commands>> can be used to enrich your existing data and allow you to experiment with the value that similarity search can add.
-Before you can follow this recipe you'll need to <<_cb_env_llm,configure a llm>> for use with the shell.
+The https://couchbase.sh/docs/#_vector_commands[vector commands] can be used to enrich your existing data and allow you to experiment with the value that similarity search can add.
+Before you can follow this recipe you'll need to https://couchbase.sh/docs/#_cb_env_llm[configure an LLM] for use with the shell.
 
 Next you'll need a set of data; for this example we'll be using the travel-sample data set that you can load with:
 
@@ -25,11 +25,11 @@ Embedding batch 3/3
 ╰───┴───────────┴─────────┴────────┴──────────┴─────────╯
 ```
 
-Here we have used <<_query, query>> to get all the landmark doc ids and bodies.
-Then we have enriched all of these with the embedding generated from the `content` field, see <<_vector_enrich_doc, vector enrich-doc>> for details.
-Finally we pipe the output directly into <<_mutating, doc upsert>> to overwrite the original landmark documents with our enriched versions.
+Here we have used https://couchbase.sh/docs/#_query_commands[query] to get all the landmark doc ids and bodies.
+Then we have enriched all of these with the embedding generated from the `content` field; see https://couchbase.sh/docs/#_vector_enrich_doc[vector enrich-doc] for details.
+Finally we pipe the output directly into https://couchbase.sh/docs/#_mutating[doc upsert] to overwrite the original landmark documents with our enriched versions.
-Now that we have a set of docs containing vectors we can create a vector index over them using <<_vector_create_index, vector create-index>>: +Now that we have a set of docs containing vectors we can create a vector index over them using https://couchbase.sh/docs/#_vector_create_index[vector create-index]: ``` ๐Ÿ‘ค Charlie ๐Ÿ  remote in โ˜๏ธ travel-sample._default._default @@ -37,7 +37,7 @@ Now that we have a set of docs containing vectors we can create a vector index o ``` Once the index has finished building we can use it to perform similarity searches over all of the contentVector fields. -This is done using the <<_vector_search, vector search>> command as follows: +This is done using the https://couchbase.sh/docs/#_vector_search[vector search] command as follows: [options="nowrap"] ``` @@ -54,7 +54,7 @@ This is done using the <<_vector_search, vector search>> command as follows: โ•ฐโ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ•ฏ ``` -Here we have used <<_subdoc_get, subdoc get>> to get the contentVector field from `landmark_10019`, which is why the most similar result is `landmark_10019` the vector is the same. +Here we have used https://couchbase.sh/docs/#_subdoc_get[subdoc get] to get the contentVector field from `landmark_10019`, which is why the most similar result is `landmark_10019`: the vector is the same. 
Once we have this list of results from the vector search we can use the ids to inspect the source documents: [options="nowrap"] @@ -98,10 +98,10 @@ Once we have this list of results from the vector search we can use the ids to i โ•ฐโ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ•ฏ ``` -Here we could have used <<_reading, doc get>> to get the whole of the documents, but to keep things tidy we've used another `subdoc get` to retrieved the name, address and content fields. +Here we could have used https://couchbase.sh/docs/#_reading[doc get] to get the whole of the documents, but to keep things tidy we've used another `subdoc get` to retrieved the name, address and content fields. As you can see by examining the results they all have semantically similar content fields. -Another way that CBShell can be used to generate embeddings is from plain text with <<_vector_enrich_text, vector enrich-text>>: +Another way that CBShell can be used to generate embeddings is from plain text with https://couchbase.sh/docs/#_vector_enrich_text[vector enrich-text]: [options="nowrap"] ``` @@ -122,4 +122,4 @@ Embedding batch 1/1 Here we have done another similarity search using the same index, but our source vector is the result of embedding the phrase "physical exercise". 
 One important detail to remember is that the embedding generated from `vector enrich-text` must have the same dimension as those over which the index was created; otherwise `vector search` will return no results.
-See <<_vector_enrich-text, vector enrich-text>> for how to specify the dimension of the generated embeddings.
+See https://couchbase.sh/docs/#_vector_enrich_text[vector enrich-text] for how to specify the dimension of the generated embeddings.