Cycle through suggestions #21
Is there a way to cycle through code completion suggestions? You know, similar to what GitHub Copilot can do with Alt (Option) + ] and Alt (Option) + [, and then Tab to accept it.

Comments
No, this is a Copilot feature. Regular LLMs don't provide multiple solutions. In theory we could generate multiple solutions as well with prompt engineering, but I have not tried it yet. What the plugin does is use FIM tokens to complete the current code fragment only. Nothing more.
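For readers unfamiliar with FIM, here is a minimal sketch of how such a fill-in-the-middle prompt is assembled, assuming CodeLlama-style special tokens (`<PRE>`, `<SUF>`, `<MID>`); the exact token strings vary per model and are an assumption here, not necessarily what this plugin sends.

```python
# Hypothetical sketch of fill-in-the-middle (FIM) prompting.
# The special tokens below follow the CodeLlama convention; other
# completion models (StarCoder, DeepSeek-Coder, ...) use different ones.

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a FIM prompt from the code before and after the cursor."""
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

prefix = "def add(a, b):\n    "
suffix = "\n\nprint(add(1, 2))\n"
prompt = build_fim_prompt(prefix, suffix)
# The model is asked to generate only the middle part and stops at its
# end-of-infill token; the plugin then inserts that text at the cursor.
print(prompt)
```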
Another idea could be to use multiple different LLMs simultaneously, so we could cycle through the different solutions from multiple models. The question is how many LLMs you can load into VRAM on consumer hardware, though. Completion models are normally smaller than chat models, so this could work.
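A rough sketch of that multi-model idea, assuming each model sits behind its own llama.cpp-style HTTP `/completion` endpoint; the URLs, endpoint name, and payload fields are illustrative assumptions, not this plugin's actual API.

```python
# Hypothetical sketch: query several locally served completion models in
# parallel and collect one suggestion per model to cycle through later.
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

ENDPOINTS = [
    "http://127.0.0.1:8080/completion",  # model A
    "http://127.0.0.1:8081/completion",  # model B
]

def fetch_completion(url: str, prompt: str) -> str:
    # Payload fields mimic a llama.cpp-style server and are assumptions.
    payload = json.dumps({"prompt": prompt, "n_predict": 64}).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())["content"]

def gather_suggestions(prompt: str) -> list[str]:
    # One suggestion per model; the editor can then cycle through the list.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda u: fetch_completion(u, prompt), ENDPOINTS))
```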
I think prompt engineering would be easier to implement while still keeping the requirements sane (running a single model instance). We could use a list to store subsequent suggestions and cycle through it to mimic the GitHub Copilot behavior. Don't get me wrong, I absolutely enjoy this project and am grateful it exists, but this seems like a feature that is necessary in a real development environment.
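A minimal sketch of that single-model variant: sample the same prompt several times (with a nonzero temperature so the results differ), store the results in a list, and cycle through them. The class and names here are hypothetical, not existing plugin code.

```python
# Hypothetical sketch: keep several sampled completions for the same
# prompt and cycle through them, roughly mimicking Copilot's
# next/previous-suggestion behavior.
class SuggestionRing:
    def __init__(self, suggestions: list[str]):
        self.suggestions = suggestions
        self.index = 0

    def current(self) -> str:
        return self.suggestions[self.index]

    def next(self) -> str:       # e.g. bound to Alt+]
        self.index = (self.index + 1) % len(self.suggestions)
        return self.current()

    def previous(self) -> str:   # e.g. bound to Alt+[
        self.index = (self.index - 1) % len(self.suggestions)
        return self.current()

# Usage: generate N samples from the same FIM prompt, then cycle with
# next()/previous() and accept the current one (e.g. on Tab).
ring = SuggestionRing(["return a + b", "return sum((a, b))", "return int(a) + int(b)"])
print(ring.next())
```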