
[Bug] Output Display, Window Size, and Chat List #42

Open
candyknife opened this issue Dec 23, 2024 · 4 comments
Labels
bug Something isn't working

@candyknife
First of all, I'd like to express my sincere gratitude for developing such a fantastic macOS native LLM chat application. It's great to see a dedicated client for macOS, and I appreciate the effort you've put into it.

I've been using macai and have a few suggestions that I believe could enhance the user experience:

  1. Streaming Output Display: When the LLM is generating a response in a streaming fashion, the display doesn't always show the latest line of text. Currently it stays pinned at the first line of the response, so users have to wait for the streaming output to finish and then manually scroll down to read the complete answer. It would be great if the display could automatically scroll down to show the latest line of text as it's being generated.

  2. Window Size Persistence: The application doesn't remember the adjusted window size, and the default window size is quite small. After manually resizing the window and closing the application, it reverts to the default size when reopened. It would be very helpful if the application could remember the user-adjusted window size.

  3. Chat List Scrolling Lag: The left-side chat list has a noticeable lag when scrolling through the conversations. This makes the experience feel less smooth. Optimizing the scrolling performance of the chat list would significantly improve the overall user experience.

@candyknife candyknife added the bug Something isn't working label Dec 23, 2024
@Renset
Owner

Renset commented Dec 26, 2024

@candyknife, thank you for reporting these bugs.

  1. This one is a bit controversial: LLM output speed is quite high nowadays, often much higher than reading speed. But I understand the problem and will try to improve it somehow.
  2. Actually, this is an unexpected problem, and I can't reproduce it. Can you tell me your macai version, macOS version, and your chip (Apple or Intel)?
  3. I cannot reproduce this either, so any additional information is very much appreciated.

@Renset Renset changed the title [Issue] Output Display, Window Size, and Chat List [Bug] Output Display, Window Size, and Chat List Dec 26, 2024
@candyknife
Author

Additional information:

  • macai version: 2.0.1
  • macOS version: Sequoia 15.2 (24C101)
  • chip: Apple M2

This screen recording reproduces the window size bug. I hope it will be helpful.
https://github.com/user-attachments/assets/915bad42-2948-4c58-a4dd-860976e2e7fb

@Renset
Owner

Renset commented Dec 28, 2024

@candyknife thanks, this is really helpful!
The window size is currently saved when you quit the whole application, not when you merely close the window; quitting instead of closing may work for you as a workaround. I'll also try to make the size save when the window is closed.

@FreedomCoder-dev

FreedomCoder-dev commented Dec 30, 2024

Auto-scroll should only activate while the scroll view is at the bottom of the chat; if the user scrolls up, the feature stops interfering. To keep scrolling smooth, we can cache an auto-scroll flag and animate scrolling to the end of the conversation until the user interacts with the scroll view.
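The approach above could be sketched in SwiftUI roughly as follows. This is a minimal illustration, not macai's actual code: the view, `messages`, and the `"bottomAnchor"` id are all hypothetical names.

```swift
import SwiftUI

// Sketch: auto-scroll that follows streaming output only while the
// user is pinned to the bottom of the conversation.
struct ChatScrollView: View {
    let messages: [String]
    @State private var isPinnedToBottom = true  // cached auto-scroll flag

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                LazyVStack(alignment: .leading) {
                    ForEach(messages.indices, id: \.self) { i in
                        Text(messages[i])
                    }
                    // Invisible marker used as the scroll target.
                    Color.clear.frame(height: 1).id("bottomAnchor")
                }
            }
            // Any manual drag disables auto-scroll so it stops interfering.
            .simultaneousGesture(
                DragGesture().onChanged { _ in isPinnedToBottom = false }
            )
            .onChange(of: messages) { _ in
                // While pinned, animate to the end as new tokens stream in.
                if isPinnedToBottom {
                    withAnimation {
                        proxy.scrollTo("bottomAnchor", anchor: .bottom)
                    }
                }
            }
        }
    }
}
```

Re-enabling auto-scroll when the user scrolls back to the bottom is left out for brevity; it would need content-offset tracking (e.g. via a `GeometryReader` in a coordinate space) to detect when the view is near the bottom again and reset the flag.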
