Does PaperQA require the LLM 'stream' parameter set to 'True'? #720
I'm trying to speed up answers; I read that caching is a solution for this purpose. I set up LiteLLM as follows (an illustrative sketch is shown below):
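(The original snippet was not preserved in this export. A minimal sketch of the kind of setup described, assuming paper-qa's `Settings` with an `llm_config` forwarded to a LiteLLM router; the model name, `api_base`, and the `stream`/`caching` flags are illustrative placeholders, not the reporter's exact config:)

```python
from paperqa import Settings

# Illustrative only -- the exact config from the original report was not captured.
# Assumes paper-qa forwards llm_config entries to LiteLLM.
settings = Settings(
    llm="ollama/llama3.1:70b",
    llm_config={
        "model_list": [
            {
                "model_name": "ollama/llama3.1:70b",
                "litellm_params": {
                    "model": "ollama/llama3.1:70b",
                    "api_base": "http://localhost:11434",
                    "stream": True,   # intended: streamed responses
                    "caching": True,  # intended: cached responses
                },
            }
        ],
    },
)
```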
But it seems 'stream' and 'caching' stay turned off (see the LiteLLM debug message below):
Did I miss something? @dosu
The 'stream' parameter in PaperQA is intended to enable streaming responses from Llama models, allowing responses to be sent in chunks as they are generated. This should be set to 'True' if you want streamed output. Similarly, for the 'caching' parameter, ensure that it is correctly configured and supported by the version of the Ollama API you are using. If both 'stream' and 'caching' are not being set as expected, check for any configuration validation or logging that might indicate why these settings are not being applied. Additionally, verify that there are no overriding settings elsewhere in your setup that might be affecting these parameters.
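(One way to check whether the backend itself honors these flags, independent of PaperQA, is to call LiteLLM directly. A hedged sketch, assuming a LiteLLM version where `Cache` is importable from `litellm.caching` and an Ollama server on the default local port:)

```python
import litellm
from litellm import completion
from litellm.caching import Cache  # import path may vary across litellm versions

# In-memory cache; repeated identical calls should be served from cache.
litellm.cache = Cache()

messages = [{"role": "user", "content": "Say hello."}]

# Streaming check: chunks should arrive as they are generated.
for chunk in completion(
    model="ollama/llama3.1:70b",
    api_base="http://localhost:11434",
    messages=messages,
    stream=True,
):
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)

# Caching check: the second identical call should hit the cache (much faster).
r1 = completion(model="ollama/llama3.1:70b", api_base="http://localhost:11434",
                messages=messages, caching=True)
r2 = completion(model="ollama/llama3.1:70b", api_base="http://localhost:11434",
                messages=messages, caching=True)
```

If streaming and caching behave correctly here but not through PaperQA, the flags are likely being overridden somewhere in the PaperQA-to-LiteLLM hand-off rather than by Ollama itself.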
Hello,
I currently use PaperQA with Llama3.1:70b served by Ollama.
I can see in the debug logs that the 'stream' parameter is set to 'False' even though I pass it as 'True':
Does PaperQA require this parameter to be True?
Best regards.