Added queue for the user message with limit of 5 #1835
Conversation
Caution
Changes requested ❌
Reviewed everything up to c6098d2 in 2 minutes and 50 seconds.
- Reviewed 218 lines of code in 3 files
- Skipped 0 files when reviewing
- Skipped posting 5 draft comments; view those below
1. apps/studio/src/lib/editor/engine/chat/index.ts:120
- Draft comment: processMessageQueue is invoked without awaiting its execution. Verify chaining of asynchronous queue processing to avoid race conditions. (A sketch of this pattern appears after the draft comments below.)
- Reason this comment was not posted: Decided after close inspection that this draft comment was likely wrong and/or not actionable (usefulness confidence = 20% vs. threshold = 50%). The processMessageQueue method has built-in guards against race conditions: it checks isWaiting before processing and won't process if another message is being handled, and the queue is protected by MobX observables. Not awaiting means the function returns earlier, which looks intentional so that queued messages can be processed in the background. The comment raises a valid point about async execution, and there could be edge cases where all queued messages should be processed before continuing, but the code appears deliberately designed to handle async queue processing safely, so the missing await looks intentional rather than a bug. The comment should be deleted: the current implementation has proper safeguards and the async behavior appears intentional.
2. apps/studio/src/lib/editor/engine/chat/index.ts:112
- Draft comment: Duplication in queue management logic between sendNewMessage and processMessage; consider centralizing it to avoid inconsistencies.
- Reason this comment was not posted: Comment was on unchanged code.
3. apps/studio/src/routes/editor/EditPanel/ChatTab/ChatInput.tsx:462
- Draft comment: The MAX_QUEUE_SIZE constant is redefined locally; consider importing it from ChatManager for consistency.
- Reason this comment was not posted: Decided after close inspection that this draft comment was likely wrong and/or not actionable (usefulness confidence = 10% vs. threshold = 50%). The comment makes an assumption about code organization that I can't verify: while having shared constants in a central location can be good practice, I have no evidence that ChatManager exists or that this constant should be defined there, and the current local definition is clear and serves its purpose. Making speculative suggestions about code organization without clear evidence is not helpful. The comment could be correct if a ChatManager file does define this constant, but without access to the full codebase, keeping it would be speculative and could send the author on a wild goose chase. Delete this comment, since there is no clear evidence that the constant exists elsewhere or that moving it would be beneficial.
4. apps/studio/src/lib/editor/engine/chat/index.ts:62
- Draft comment: Consider replacing the recursive processMessageQueue with an iterative loop. Even though the current limit is small, an iterative approach avoids potential stack issues if the queue size ever increases. (See the iterative-drain sketch after the draft comments below.)
- Reason this comment was not posted: Confidence changes required: 33% <= threshold 50%. None.
5. apps/studio/src/routes/editor/EditPanel/ChatTab/ChatInput.tsx:32
- Draft comment: Avoid duplicating the MAX_QUEUE_SIZE constant. Consider exposing the maxQueueSize from ChatManager to prevent inconsistencies in the future. (See the constant-sharing sketch after the draft comments below.)
- Reason this comment was not posted: Decided after close inspection that this draft comment was likely wrong and/or not actionable (usefulness confidence = 20% vs. threshold = 50%). While the suggestion makes sense architecturally, I don't have evidence that MAX_QUEUE_SIZE is actually duplicated elsewhere; the comment assumes there must be a similar constant in ChatManager, but we can't see the ChatManager code, and the local constant could be intentional if the UI wants to be more restrictive than the backend. Moving constants to a central location is generally good practice, but a comment needs strong evidence of an actual problem, and without seeing ChatManager this is an assumption about duplication that may not be true. Delete the comment: while architecturally sensible, we lack concrete evidence that this constant is actually duplicated or that moving it would be beneficial.
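For context on draft comment 1, here is a minimal sketch of the fire-and-forget pattern it describes, under the assumption that ChatManager tracks an isWaiting flag and a bounded queue. The class, fields, and method bodies are illustrative only, not the actual code from this PR:

```ts
// Illustrative sketch only; not the actual ChatManager implementation.
class ChatQueueSketch {
    isWaiting = false;
    private queuedMessages: string[] = [];

    async sendNewMessage(content: string): Promise<void> {
        if (this.isWaiting) {
            // Busy: enqueue instead of sending immediately.
            this.queuedMessages.push(content);
            return;
        }
        await this.processMessage(content);
        // Fire-and-forget: intentionally not awaited so queued messages
        // drain in the background after this call returns.
        void this.processMessageQueue();
    }

    private async processMessageQueue(): Promise<void> {
        // Guard: skip if another message is already being handled.
        if (this.isWaiting || this.queuedMessages.length === 0) {
            return;
        }
        const next = this.queuedMessages.shift()!;
        await this.processMessage(next);
        await this.processMessageQueue();
    }

    private async processMessage(content: string): Promise<void> {
        this.isWaiting = true;
        try {
            // Stand-in for the real AI round trip.
            await new Promise((resolve) => setTimeout(resolve, 10));
            console.log('processed:', content);
        } finally {
            this.isWaiting = false;
        }
    }
}
```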
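Draft comment 4 suggests an iterative drain instead of recursion. A hedged sketch of what that might look like, with a hypothetical QueueHost shape standing in for ChatManager:

```ts
interface QueueHost {
    isWaiting: boolean;
    queuedMessages: string[];
    processMessage(content: string): Promise<void>;
}

// Sketch: drain the queue with a loop instead of recursion, so no extra
// call frames accumulate even if the queue limit ever grows beyond 5.
async function drainMessageQueue(host: QueueHost): Promise<void> {
    while (!host.isWaiting && host.queuedMessages.length > 0) {
        const next = host.queuedMessages.shift()!;
        await host.processMessage(next);
    }
}
```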
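If the duplication flagged in draft comments 3 and 5 is real, one conventional fix is a single shared constant that both ChatManager and ChatInput.tsx import. The module path below is hypothetical and only sketches the idea:

```ts
// Hypothetical shared module (e.g. chat/constants.ts) so that ChatManager
// and ChatInput.tsx read one value instead of each defining their own.
export const MAX_QUEUE_SIZE = 5;

// Consumers would then import it, for example:
//   import { MAX_QUEUE_SIZE } from '.../chat/constants';
//   const queueFull = queuedMessages.length >= MAX_QUEUE_SIZE;
```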
Hello, I am migrating the desktop app to a new repository. This repository will now focus on the web version. This PR has been migrated to the new repository: onlook-dev#1. Sorry for the inconvenience; rest assured your work here will also be adapted for the web version (if it hasn't been already).
Description
Added a queue of size 5 for messages sent while the AI is still processing the user's previous message.
Related Issues
Related to #1768
Type of Change
Important
Introduces a message queue in ChatManager with a limit of 5, queuing messages when busy and processing them sequentially.
- Message queue added to ChatManager with a limit of 5 messages.
- Messages are queued while isWaiting is true and processed sequentially.
- ChatInput.tsx updates when messages are queued.
- processMessageQueue() and processMessage() in ChatManager handle message queuing and processing.
- sendNewMessage() in ChatManager queues messages if busy.
- New QueuedMessage type in conversation/index.ts for queued messages.

This description was created by Ellipsis for c6098d2. You can customize this summary. It will automatically update as commits are pushed.
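To make the summary above concrete, here is a rough sketch of how the pieces might fit together. The QueuedMessage fields and the rejection behavior are assumptions for illustration, not copied from the PR:

```ts
// Sketch only: field names and behavior are assumptions, not the PR's code.
type QueuedMessage = {
    content: string;
    queuedAt: number;
};

const MAX_QUEUE_SIZE = 5;
const queuedMessages: QueuedMessage[] = [];

// While the assistant is busy (isWaiting), enqueue up to the limit of 5;
// beyond that the message is rejected and ChatInput.tsx can surface feedback.
function queueMessage(content: string, isWaiting: boolean): boolean {
    if (!isWaiting) {
        return false; // not busy: caller should send immediately instead
    }
    if (queuedMessages.length >= MAX_QUEUE_SIZE) {
        return false; // queue full: caller can disable input or warn
    }
    queuedMessages.push({ content, queuedAt: Date.now() });
    return true;
}
```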