add alibabacloud ai search notebook #324
base: main
Conversation
Found 1 changed notebook. Review the changes at https://app.gitnotebooks.com/elastic/elasticsearch-labs/pull/324
@elastic/search-experiences-team please review
@@ -0,0 +1,492 @@
{
Commented on notebook notebooks/alibabacloud-ai-search/inference-alibabacloud-ai-search.ipynb
Cell 2 Line 5
# Requirements
For this example, you will need:
- An Elastic deployment with minimum **4GB machine learning node**
We use Alibaba's Cloud, right? So we don't need an ML node here, as the work is proxied through to Alibaba.
Thank you for your suggestion! I have removed the restriction on the 4GB machine learning node.
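For context, since the embedding work is proxied to Alibaba Cloud rather than run on a local ML node, the notebook only needs to register an inference endpoint that points at the service. Below is a minimal sketch of what that step could look like, sent as a raw REST call through the Python client; the endpoint name, service_id, host, and workspace values are placeholders, and the notebook's actual cell may use a dedicated inference helper instead. It assumes `client` is an already connected `Elasticsearch` instance.

```python
import getpass

# Placeholder credentials: replace with your own AlibabaCloud AI Search details.
api_key = getpass.getpass("AlibabaCloud AI Search API key: ")

# Register a text_embedding inference endpoint backed by the
# alibabacloud-ai-search service. Elasticsearch proxies embedding requests
# to Alibaba Cloud, so no machine learning node is required.
client.perform_request(
    "PUT",
    "/_inference/text_embedding/alibabacloud_ai_search_embeddings",
    headers={"Content-Type": "application/json", "Accept": "application/json"},
    body={
        "service": "alibabacloud-ai-search",
        "service_settings": {
            "api_key": api_key,
            "service_id": "ops-text-embedding-001",  # example model ID
            "host": "default-xxx.platform-cn-shanghai.opensearch.aliyuncs.com",
            "workspace": "default",
        },
    },
)
```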
"# Create the client instance\n", | ||
"client = Elasticsearch(\n", | ||
" # For local development\n", | ||
" # hosts=[\"http://localhost:9200\"]\n", |
You could also use getpass for storing the hosts variable like this, if you'd like:
hosts=getpass.getpass("Host: ")
Thanks for the suggestion! I've used getpass to store the hosts to improve security.
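For reference, here is a minimal sketch of how the client-setup cell could look with getpass applied to both the host and the API key; the variable names are illustrative, not the notebook's exact cell.

```python
import getpass

from elasticsearch import Elasticsearch

# Prompt for connection details instead of hard-coding them in the notebook.
ELASTIC_HOST = getpass.getpass("Host: ")      # e.g. https://your-deployment:9243
ELASTIC_API_KEY = getpass.getpass("API key: ")

# Create the client instance
client = Elasticsearch(
    hosts=[ELASTIC_HOST],
    api_key=ELASTIC_API_KEY,
)

# Confirm the connection
print(client.info())
```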
💚 CLA has been signed
Force-pushed from fc4dd2d to 1b1ac85
@@ -0,0 +1,483 @@
{
Commented on notebook notebooks/alibabacloud-ai-search/inference-alibabacloud-ai-search.ipynb
Cell 2 Line 7
# Requirements
For this example, you will need:
- An Elastic deployment:
- we'll be using [Alibaba Elasticsearch](https://www.aliyun.com/product/bigdata/elasticsearch) for this example.
- Elasticsearch 8.16 or above
Is this correct, given that 8.16 has not been released yet? Should this be 8.15, or is this a notebook that should be published on release of 8.16?
This notebook is related to the following pull request: elastic/elasticsearch#111181, which is labeled as version 8.16.0. The inference API in version 8.15 does not support the Alibaba Cloud AI Search Model, so I believe this notebook should be published with the release of version 8.16.
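Since the service is only available from 8.16, a quick way to confirm the target cluster is new enough is a sketch like the one below, assuming `client` is the connected Elasticsearch instance from the setup cell.

```python
# Check that the cluster is at least 8.16 before creating the inference endpoint.
version = client.info()["version"]["number"]
major, minor = (int(part) for part in version.split(".")[:2])
if (major, minor) < (8, 16):
    raise RuntimeError(
        f"Elasticsearch {version} is too old: the alibabacloud-ai-search "
        "inference service requires 8.16 or above."
    )
```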
OK, great. Approving for now, but I will add a note that this is on hold to merge until the 8.16 release. cc @JessicaGarson
Ah, you also need to run pre-commit to format the notebook: https://github.com/elastic/elasticsearch-labs/blob/main/CONTRIBUTING.md#pre-commit-hook
Thank you for the reminder! I've already run the pre-commit hook to format the notebook. Let me know if you need anything else!
Related to https://www.elastic.co/guide/en/elasticsearch/reference/master/infer-service-alibabacloud-ai-search.html
This PR adds a Jupyter notebook that contains an end-to-end example of using the Inference API with the AlibabaCloud AI Search service.
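To illustrate what "end-to-end" means here, the sketch below shows one way the flow could fit together once the inference endpoint exists: map a `semantic_text` field to the endpoint, index a document, and run a semantic query. The index name, endpoint ID, and sample text are placeholders, and the notebook itself may structure these steps differently.

```python
# Create an index whose "content" field is embedded through the inference
# endpoint registered earlier (placeholder ID "alibabacloud_ai_search_embeddings").
client.indices.create(
    index="alibabacloud-ai-search-demo",  # hypothetical index name
    mappings={
        "properties": {
            "content": {
                "type": "semantic_text",
                "inference_id": "alibabacloud_ai_search_embeddings",
            }
        }
    },
)

# Index a document; embeddings are generated at ingest time via the endpoint.
client.index(
    index="alibabacloud-ai-search-demo",
    document={"content": "Alibaba Cloud AI Search provides hosted embedding models."},
)
client.indices.refresh(index="alibabacloud-ai-search-demo")

# Run a semantic query against the same field.
response = client.search(
    index="alibabacloud-ai-search-demo",
    query={"semantic": {"field": "content", "query": "hosted embedding models"}},
)
print(response["hits"]["hits"])
```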