[Feature] Any plan to support async and streaming? #233
Comments
Yes, we have plans to add async routes. We are now planning for 2024 Q1/Q2, so we should have better estimates soon. As for streaming, can you elaborate a bit more? What are you trying to achieve?
Same here, we definitely need faster upsert methods. A bulk upload is taking forever.
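Until faster bulk upserts land in the library, a common workaround is to split the documents into batches and upsert several batches concurrently. A minimal sketch of that pattern, assuming nothing about Canopy's actual API: the `upsert` callable here is a hypothetical stand-in for whatever client method you use.

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(items, size):
    """Split a list into consecutive batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def bulk_upsert(docs, upsert, batch_size=100, workers=4):
    """Upsert `docs` in batches, running up to `workers` batches in parallel.

    `upsert` is a hypothetical callable taking one batch of documents;
    substitute your client's real upsert method. Returns the batch count.
    """
    batches = chunked(docs, batch_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves batch order and re-raises any worker exception
        list(pool.map(upsert, batches))
    return len(batches)
```

Thread-based concurrency helps here because the work is network-bound; the right `batch_size` depends on the backend's request-size limits.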
@usamasaleem1 @hustxx That said, we will start working on this next week. We will update this issue as we progress 🙏
I think a streaming chatbot feature is essential. I implemented it on my Anyscale inference endpoint and now I'm trying to move it to Canopy, but I couldn't find any documentation about it. For my Anyscale chatbot I followed these docs: https://docs.endpoints.anyscale.com/examples/openai-chat-agent/
@Evanrsl Canopy already supports streaming responses. It's already built into the codebase and implements the same interface as the OpenAI chat completion API. If …
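Since the interface mirrors the OpenAI chat completion API, consuming a streamed response looks the same as with OpenAI's client: iterate over chunks and accumulate each text delta. A minimal sketch of that consumption loop; the `fake_stream` generator below is a stand-in for the real streamed API call (which would be an OpenAI-compatible request with `stream=True` pointed at your server), since the exact client setup isn't shown in this thread.

```python
def fake_stream():
    """Stand-in for a streamed chat completion: yields text deltas the
    way an OpenAI-style API yields chunk.choices[0].delta.content."""
    for delta in ["Hel", "lo, ", "world", "!"]:
        yield delta

def consume_stream(chunks, on_delta=print):
    """Accumulate streamed deltas into the full reply, calling
    `on_delta` for each piece (e.g. to render tokens as they arrive)."""
    parts = []
    for delta in chunks:
        if delta:  # streams may emit empty/None deltas (e.g. role-only chunks)
            on_delta(delta)
            parts.append(delta)
    return "".join(parts)

reply = consume_stream(fake_stream())
```

The same loop works unchanged whichever OpenAI-compatible backend produces the chunks; only the call that creates the stream differs.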
@scottmx81 Thanks for your explanation. What about the model parameter? Just to make sure, is my code correct?
@Evanrsl The value for …
Is this your first time submitting a feature request?
Describe the feature
We will need async and streaming to integrate canopy into our app, do you have any timeline when those will be supported?
Describe alternatives you've considered
No response
Who will this benefit?
No response
Are you interested in contributing this feature?
No response
Anything else?
No response