Following the demo script:

```python
response = llm(prompt_text)
```

the call fails with the traceback below:

```
    860 if not isinstance(prompt, str):
    861     raise ValueError(
    862         "Argument `prompt` is expected to be a string. Instead found "
    863         f"{type(prompt)}. If you want to run the LLM on multiple prompts, use "
    864         "`generate` instead."
    865     )
    866 return (
--> 867     self.generate(
    868         [prompt],
    869         stop=stop,
    870         callbacks=callbacks,
    871         tags=tags,
    872         metadata=metadata,
    873         **kwargs,
    874     )
    875     .generations[0][0]
    876     .text
    877 )
...
     55     log_output=self.log_output,
     56     **kwargs,
     57 )
TypeError: Service.__init__() got an unexpected keyword argument 'log_output'
```
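For context, the error pattern here is that `**kwargs` is forwarded down the call chain until it reaches a constructor (`Service.__init__`) whose signature does not accept `log_output`. A minimal sketch of that failure mode, using hypothetical names (`Service`, `start_service` are illustrative, not the actual library classes):

```python
class Service:
    # Hypothetical stand-in: note there is no `log_output` parameter.
    def __init__(self, host="localhost"):
        self.host = host

def start_service(**kwargs):
    # kwargs are passed straight through, so any key the constructor
    # does not declare (e.g. `log_output`) raises a TypeError here.
    return Service(**kwargs)

try:
    start_service(log_output=True)
except TypeError as e:
    print(e)
```

This suggests a version mismatch: the caller was written against a `Service` that accepted `log_output`, while the installed version no longer (or does not yet) declare it.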