
Handling multiple message responses? #83

Open
ckcollab opened this issue Dec 30, 2023 · 9 comments

@ckcollab

Hey there, thanks so much for releasing this -- much appreciated!

I'm having a bit of trouble rendering multiple message responses I'm receiving from my backend.

Here's my Vue template code:

<deep-chat
  :demo="true"
  :initialMessages="thread.messages"
  :request="{
    handler: sendMessage
  }"
/>

And here's my sendMessage helper function, which attempts to fire off multiple signals:

const sendMessage = async (body, signals) => {
  try {
    const response = await useRequest(`/threads/${thread.value.id}/message/`, {
      method: 'POST',
      body: {
        message: body.messages[0].text,
      }
    })

    response.forEach((message) => {
      signals.onResponse(message)
    })
  } catch (e) {
    useErrorHandler(e)
  }
}

This only ever prints the first message. I assume signals.onResponse is only meant to be called once? It's not apparent to me how to make this call multiple times... probably missing something very simple. Appreciate any help, thanks!

@OvidijusParsiunas
Owner

Hi @ckcollab.

You are on the right track. By default, the handler function is intended to handle one text response per request.
If you want more dynamic behaviour where you return multiple response text messages, I would instead recommend using the websocket handler. All you'll need to do is set the websocket property to true and then call signals.onOpen(); at the start of the handler, as it is triggered when the component loads up. There is a good example in the Websocket tab of the handler function documentation. You can change it to something like this:

// Vue component
:request="{
  websocket: true,
  handler: sendMessage
}"

// sendMessage variable
const sendMessage = async (body, signals) => {
  try {
    signals.onOpen(); // enables the user to send messages
    const response = await useRequest(`/threads/${thread.value.id}/message/`, {
      method: 'POST',
      body: {
        message: body.messages[0].text,
      }
    })
    response.forEach((message) => {
      signals.onResponse(message); // displays a text message from the server
    })
  } catch (e) {
    useErrorHandler(e)
  }
}
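To see the multi-message flow without the component itself, here is a minimal mock. It is a sketch that assumes nothing beyond the signal names used in this thread (onOpen, onResponse); fakeServer is a hypothetical stand-in for the useRequest backend call:

```javascript
// Mock of the deep-chat signals object: collects what would be rendered.
const displayed = [];
const signals = {
  opened: false,
  onOpen() { this.opened = true; },                 // lets the user start sending messages
  onResponse(message) { displayed.push(message); }  // one chat bubble per call
};

// Stand-in for the backend: one request yields several messages.
const fakeServer = () => [{ text: 'First reply' }, { text: 'Second reply' }];

const sendMessage = (body, signals) => {
  signals.onOpen();
  const response = fakeServer();
  response.forEach((message) => signals.onResponse(message));
};

sendMessage({ messages: [{ text: 'hi' }] }, signals);
// displayed now holds two separate messages, one bubble each
```

The key point is simply that each onResponse call produces its own bubble, which the websocket-style handler permits any number of times.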

Let me know if this helps.

@OvidijusParsiunas OvidijusParsiunas self-assigned this Dec 30, 2023
@OvidijusParsiunas OvidijusParsiunas added the advice Information how to use/implement the component label Dec 30, 2023
@ckcollab
Author

Thanks for the blazing fast response!

Here's my latest attempt, which appears to mostly work, but the DeepChat component no longer enters a "waiting for response" state when I send a message. I'm sure there must be a simple call that enables the loading bubble?

// Component..
<deep-chat
  :demo="true"
  :initialMessages="thread.messages"
  :request="{
    websocket: true,
    handler: chatEventHandler
  }"
/>

// Handler..
const chatEventHandler = async (_, signals) => {
  signals.onOpen(); // enables the user to send messages, allows us to handle multiple msgs "websocket style"

  signals.newUserMessage.listener = async (body) => {
    try {
      const response = await useRequest(`/threads/${thread.value.id}/message/`, {
        method: 'POST',
        body: {
          message: body.messages[0].text,
        }
      })

      response.forEach((message) => {
        signals.onResponse(message)
      })
    } catch (e) {
      useErrorHandler(e)
    }
  };
}

@OvidijusParsiunas
Owner

OvidijusParsiunas commented Dec 31, 2023

That's a good point. When I originally designed the interface for websockets I did not consider the need for a loading bubble as the intent was for messages to be fully asynchronous (meaning that the user could send multiple messages without the need to wait/load for a response from the server).
Having looked at the code, there are two core ways that you can go about this:

  1. You can create your own custom loading message:

You can simply send signals.onResponse({text: 'Loading...'}); while you are waiting for the server, and then add the overwrite property to the first signal containing the real response so that it overwrites the loading message, e.g.

signals.onResponse({text: responseText, overwrite: true});

You can also use the disableSubmitButton method to prevent the user from being able to send messages.

If you want the same loading animation bubble as the native deep chat loading message bubble, it is a little bit more tricky. You will first have to use the html property in your response messages instead of text, and the loading message will have to be:

signals.onResponse({html: '<div class="loading-message-text custom-loading"><div class="dots-flashing"></div></div>'});

Then you will need to set htmlClassUtilities with:

{
  'custom-loading': {
    styles: {
      default: {
        padding: '0.18em 0.1em 0.1em 0.6em',
      },
    },
  },
}

Finally, for the animation bubbles to have color, you will need to set the following style properties in your project's css style:

:root {
  --message-dots-color: #848484;
  --message-dots-color-fade: #55555533;
}

The above properties can of course be further customised to suit your preferences.
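The loading-then-overwrite pattern can be illustrated with a small mock message list. This is a sketch of the behaviour described above, not the component's internals:

```javascript
// Mock bubble list: overwrite: true replaces the previous bubble,
// mirroring the described overwrite behaviour.
const bubbles = [];
const onResponse = (msg) => {
  if (msg.overwrite && bubbles.length > 0) {
    bubbles[bubbles.length - 1] = msg;  // replace the loading bubble
  } else {
    bubbles.push(msg);                  // append a new bubble
  }
};

onResponse({ text: 'Loading...' });                           // shown while waiting
onResponse({ text: 'Here is the answer', overwrite: true });  // replaces it
// bubbles now contains only the final answer
```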

  2. You can go back to using your original implementation, but instead of responding with text you can respond with html, which lets you split the response into multiple smaller messages just the way you need:

Change the response code to something like this:

const responseHTML = response.map((message) => {
  return `<div class="custom-text-message">${message.text}</div>`;
}).join('');
signals.onResponse({html: responseHTML});

In Deep Chat, you will also need to change the following:

Set messageStyles:

{
  html: {
    ai: {
      bubble: {backgroundColor: 'white', margin: '0', padding: '0'},
    },
  },
}

Set htmlClassUtilities:

{
  'custom-text-message': {
    styles: {
      default: {
        backgroundColor: '#e4e6eb',
        borderRadius: '10px',
        padding: '0.42em 0.55em',
        marginTop: '10px',
      },
    },
  },
}

Of course, you can change everything here to your preference.
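As a sanity check on the mapping step in option 2, joining the mapped divs produces a single html string; response here is a hypothetical stand-in for the server reply:

```javascript
// Stand-in server reply with two messages.
const response = [{ text: 'Step 1 done' }, { text: 'Step 2 done' }];

// Map each message to a styled div and join into one html payload.
const responseHTML = response
  .map((message) => `<div class="custom-text-message">${message.text}</div>`)
  .join('');
// responseHTML is one string containing both divs, ready for an html response
```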

Let me know if you need any further assistance. Thanks!

@ckcollab
Author

Thanks so much for the great support, this is already working quite well, much appreciated!

@OvidijusParsiunas
Owner

Happy to hear it worked for you @ckcollab!

@pietz

pietz commented Jan 8, 2025

@OvidijusParsiunas Coming back to this, I think it would be very helpful if we could return multiple messages without using websockets. It's a very common pattern by now that the LLM responds in multiple messages, for example when using tools. My workaround is to put the text of multiple messages into one, but it would be nice if deep chat could receive multiple messages from the backend.

@OvidijusParsiunas
Owner

Hi @pietz. Deep Chat has grown a lot since the initial release, and I agree that it should probably be able to handle multiple message responses. I am currently working on a couple of other features, but I will open up this issue and work on it as soon as I can.
I will update this thread when I get started on it!

@OvidijusParsiunas OvidijusParsiunas added the enhancement New feature or request label Jan 8, 2025
@pietz

pietz commented Jan 8, 2025

Thank you very much!

@OvidijusParsiunas
Owner

Hi, I have updated the codebase so that the Response format is accepted as either a single object or an array of Response objects. This is now available in deep-chat-dev and deep-chat-react-dev version 9.0.225. These packages work exactly the same as the core ones, except their names are different.

To note, multiple responses are not available when streaming, due to the nature of how messages are populated.
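For code consuming both shapes, the new contract can be normalised with a one-line helper. This is a hypothetical illustration of the object-or-array change, not part of the library:

```javascript
// Hypothetical helper: accept a single Response object or an array of them,
// and always work with an array internally.
const normalize = (response) => (Array.isArray(response) ? response : [response]);

const single = normalize({ text: 'one message' });
const multiple = normalize([{ text: 'first' }, { text: 'second' }]);
// single and multiple are both arrays of Response-shaped objects
```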

Please let me know if you have any issues with this update.
I will keep this issue open until this feature has been published in the main packages.
Thank you!
