Handling multiple message responses? #83
Hi @ckcollab. You are on the right track. By default - the
Let me know if this helps.
Thanks for the blazing fast response! Here's my latest attempt, which appears to mostly work, but I can't seem to make the DeepChat component enter a "waiting for response" state when I send a message any more. I'm sure there has to be a simple call that enables the "display loading bubble"?
That's a good point. When I originally designed the interface for websockets I did not consider the need for a loading bubble as the intent was for messages to be fully asynchronous (meaning that the user could send multiple messages without the need to wait/load for a response from the server).
You can simply add the
You can also use the
If you want the same loading animation bubble as the native Deep Chat loading message bubble, it is a little bit trickier. You will first have to use the
Then you will need to set
Finally, for the animation bubbles to have color, you will need to set the following style properties in your project's CSS:
The above properties can of course be further customised to suit your preferences.
Change the response code to something like this:
In Deep Chat, you will also need to change the following:
Set
Set
Of course you can change everything here to your preference. Let me know if you need any further assistance. Thanks!
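The code snippets from this comment were not preserved in the thread, but for context, here is a minimal sketch of roughly the websocket-style custom handler pattern that the steps above build on. This is an assumption on my part rather than a copy of the original answer: `chatElementRef` is a placeholder reference to the `deep-chat` element, the URL is a placeholder, the `connect` property was called `request` in older releases, and the loading-bubble settings discussed above are omitted because their names were not preserved here.

```js
// Sketch of a websocket-style custom handler for <deep-chat>.
chatElementRef.connect = {
  websocket: true,
  handler: (_, signals) => {
    const websocket = new WebSocket('ws://localhost:8080'); // placeholder URL
    websocket.onopen = () => signals.onOpen(); // allow the user to send messages
    websocket.onclose = () => signals.onClose(); // block sending when disconnected
    websocket.onmessage = (message) => {
      // each server payload is a Response object, e.g. {text: '...'}
      signals.onResponse(JSON.parse(message.data));
    };
    signals.newUserMessage.listener = (body) => {
      // called whenever the user sends a message
      websocket.send(JSON.stringify(body));
    };
  },
};
```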
Thanks so much for the great support, this is already working quite well, much appreciated!
Happy to hear it worked for you @ckcollab!
@OvidijusParsiunas Coming back to this, I think it would be very helpful if we could return multiple messages without using websockets. It's a very common pattern by now that the LLM responds in multiple messages, for example when using tools. My workaround is to put the text of multiple messages into one, but it would be nice if Deep Chat could receive multiple messages from the backend.
Hi @pietz. Deep Chat has grown a lot since the initial release, and I agree that it should probably be able to handle multiple message responses. I am currently working on a couple of other features, but I will open up this issue and work on it as soon as I can.
Thank you very much!
Hi, I have updated the codebase to accept the Response format as either an object or an array of Response objects. This is now available in
To note, multiple responses are not available in stream mode due to the nature of how messages are populated. Please let me know if you have any issues with this update.
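To illustrate the update, here is a minimal sketch of a custom handler passing several messages at once, assuming `signals.onResponse` accepts an array of Response objects as described above; the handler wiring, `chatElementRef`, and the message text are illustrative rather than taken from the original post.

```js
// Sketch: each Response object in the array is rendered as its own message bubble.
chatElementRef.connect = {
  handler: async (body, signals) => {
    // `body` holds the user's message(s); replace this with a real backend call
    signals.onResponse([
      {text: 'Let me look that up with the search tool...'},
      {text: 'Here is what I found.'},
    ]);
  },
};
```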
Hey there, thanks so much for releasing this -- much appreciated!
I'm having a bit of trouble rendering multiple message responses I'm receiving from my backend.
Here's my Vue template code:
And here's my `sendMessage` helper function, which attempts to fire off multiple signals:

This only ever prints the first message. I assume `signals.onResponse` is only ever meant to be called once? It's not apparent to me how to make this call multiple times... probably missing something very simple. Appreciate any help, thanks!
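For anyone who lands here with the same problem: the original `sendMessage` code was not preserved above, but based on the array support described in the update, the likely approach is to pass all of the backend's messages to `signals.onResponse` in a single call rather than invoking it once per message. A hedged sketch, with the endpoint and response shape purely illustrative:

```js
// Hypothetical handler: endpoint and response shape are illustrative only.
async function sendMessage(body, signals) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify(body),
  });
  const messages = await res.json(); // e.g. [{text: 'first'}, {text: 'second'}]
  // pass the whole array in one onResponse call instead of calling it per message
  signals.onResponse(messages);
}
```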