[Bug]: Cache Needs to be inited #619
Comments
refer to: #585 (comment)
Okay, that works, thank you, but now I get this error. Is this because of the new OpenAI endpoints, or am I doing something wrong?
@theinhumaneme This seems to be the wrong format of the custom embedding function.
@theinhumaneme Or, you can show the `embed_query` func; maybe I can give you some advice.
there is no … Here's the link
@theinhumaneme
Okay, thank you, I will look into the openai library.
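For context on the format mismatch discussed above: GPTCache expects an embedding object that exposes a `to_embeddings` method and a `dimension` property (e.g. `gptcache.embedding.OpenAI`), while LangChain's `OpenAIEmbeddings` exposes `embed_query` instead. Below is a minimal sketch of the two interfaces; the import paths and constructor arguments are assumptions based on the GPTCache and LangChain releases current at the time of this issue.

```python
import numpy as np

# GPTCache's own OpenAI embedding wrapper: exposes to_embeddings() and dimension
from gptcache.embedding import OpenAI as GPTCacheOpenAI

gc_encoder = GPTCacheOpenAI(model="text-embedding-ada-002")  # API key taken from the environment
vec = gc_encoder.to_embeddings("hello world")                # numpy array of length gc_encoder.dimension

# LangChain's OpenAI embedding wrapper: exposes embed_query() returning a plain list
from langchain.embeddings import OpenAIEmbeddings

lc_encoder = OpenAIEmbeddings()
vec2 = np.array(lc_encoder.embed_query("hello world"))       # list[float] -> numpy array
```

Passing an object with only the second interface where GPTCache expects the first would produce the `to_embeddings` error described in the issue body below.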
Current Behavior
I get a stack trace.
Expected Behavior
I should be able to use the cache normally
Steps To Reproduce
Environment
No response
Anything else?
I get this error when I use `set_llm_cache()` from LangChain. It works fine when I use it normally, i.e. with `init`, but it fails when I try to embed my text using the OpenAI embeddings: I get an error stating that `to_embeddings` doesn't exist. When I change the code in the function to `embed_query`, I get an unexpected `extra_params` passed error. Thank you :D
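One way to reconcile the two interfaces is to wrap `embed_query` in a plain function that also accepts the extra keyword arguments GPTCache passes to its `embedding_func`, and hand that function to `cache_obj.init` inside the `init_gptcache` callback that LangChain's `GPTCache` cache class takes. The sketch below follows that pattern; it is only an illustration, not the project's official fix, and names such as `embedding_func`, `init_gptcache`, and the probe-based `dimension` are my own. Check the exact import paths and signatures against your installed GPTCache and LangChain versions.

```python
import numpy as np

from langchain.globals import set_llm_cache
from langchain.cache import GPTCache as LangChainGPTCache
from langchain.embeddings import OpenAIEmbeddings

from gptcache import Cache
from gptcache.manager import CacheBase, VectorBase, get_data_manager
from gptcache.processor.pre import get_prompt
from gptcache.similarity_evaluation.distance import SearchDistanceEvaluation

lc_embeddings = OpenAIEmbeddings()  # exposes embed_query(text) -> list[float]


def embedding_func(text, **kwargs):
    # GPTCache calls embedding_func(data, extra_param=...), so the wrapper must
    # accept and ignore extra keyword args before delegating to embed_query.
    return np.array(lc_embeddings.embed_query(text), dtype="float32")


# Size the vector store from one probe embedding (1536 for text-embedding-ada-002).
dimension = len(lc_embeddings.embed_query("dimension probe"))

data_manager = get_data_manager(
    CacheBase("sqlite"),
    VectorBase("faiss", dimension=dimension),
)


def init_gptcache(cache_obj: Cache, llm: str):
    # LangChain calls this once per LLM string to initialise the cache object.
    cache_obj.init(
        pre_embedding_func=get_prompt,
        embedding_func=embedding_func,
        data_manager=data_manager,
        similarity_evaluation=SearchDistanceEvaluation(),
    )


set_llm_cache(LangChainGPTCache(init_gptcache))
```

With a wrapper like this, GPTCache never looks for `to_embeddings` on the LangChain object, and the extra keyword argument it passes no longer reaches `embed_query` directly, which should avoid both errors described above.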