What about Copilot support?
#4037
-
I know it would fit better as a plugin once the plugin ecosystem is ready, but nowadays Copilot is nearly essential for a lot of people. Has someone done an integration, or is anyone planning one?
Replies: 20 comments 86 replies
-
It's unlikely to be accepted as a feature in core until there is a widely-adopted specification designed for editors. To my knowledge, such a standard has yet to be attempted. However, there has been movement in terms of external LSPs that provide similar functionality to what direct integration could provide: lsp-ai and helix-gpt. Beyond that, it has been deferred to the plugin system. While the initial MVP does not contain the necessary API for an integration akin to VSCode's, the API will be further built upon in subsequent PRs.
-
Thank you for the discussion so far. I understand that the Helix team may not currently have plans to add GitHub Copilot integration to the main editor codebase, and that it may be better suited as a plugin in the future.

However, I would like to emphasize the importance of making the plugin system accessible enough for users to create their own plugins, including ones that integrate with external tools like GitHub Copilot. A more accessible plugin system would allow Helix to expand its functionality and cater to a wider range of developers with varying needs.

Additionally, I believe that a GitHub Copilot integration would be a valuable addition to Helix, and it may require some work on the Helix side to make it possible. I understand that there may be technical challenges and limitations to creating such an integration, but I would like to encourage the Helix team to explore the possibilities and work towards making it a reality. Thank you for your consideration.
-
IMHO, top-notch AI pair-programming assistance is a feature that will very soon be fundamental for any serious coder.
-
Dropping by to say that no Copilot support was, and is, absolutely the dealbreaker for me. I have exactly one third-party plugin installed in Vim, and it is Copilot support. That is how absurdly useful and generalizable this tool is to me. I use it everywhere I possibly can, for every kind of language. It is my weapon of first resort. If Helix supported Copilot I would switch in a heartbeat and never look back. The zero-config experience of Helix is far, far nicer than Vim's, and let's not even get into Neovim. I understand that it does not fit the ethos of the project, and I also understand that you were never crazy about including a plugin system at the start. I am here to let you know that you ignore this feature at your peril. And that if, on the other hand, you built in a
-
I've made a draft PR for this: #6865
-
I understand things have moved fast, but this is really a key feature for editors in the current day. Helix is the best: you folks get that modern features should be baked into the editor in an ergonomic way so developers can focus on productivity. Don't turn around and do the opposite with AI! Yes, it's a fast-moving technology with no standard, one that may have flown in from off the radar (for an opinionated text-editor creator, perhaps), but that's exactly why we need the wisdom of the Helix team in implementing it. I'm confident you can not only implement this well, but do it better than any other editor. With everybody wanting to do something with AI today, hopefully you can find a passionate contributor to take on the challenge. It'd be something to be proud of!
-
Not Copilot, but similar. I use this right now in my config and it works. It would be better if I could get some feedback while the command is working in the background:

```toml
A-c = [":pipe sgpt --code --temperature 0.3 --no-cache 'Replace this code with a better version and complete it.'"]
A-C = [
  ":sh echo working...",
  ":pipe-to cat > /tmp/helix-gpt",
  ":append-output cat /tmp/helix-gpt | sgpt --code --temperature 0.3 --no-cache 'Finish this code. Start typing from where I left.'",
  ":sh echo done!",
]
```

Based on: https://github.com/TheR1D/shell_gpt
-
Building on my other answer, here is a quick way to emulate Copilot with just a keybinding:

```toml
C-n = [
  ":insert-output echo '# FILL'",
  "join_selections",
  # grab surrounding lines as context for the prompt
  "extend_line_above",
  "extend_line_above",
  "extend_line_above",
  "extend_line_above",
  "extend_line_above",
  "extend_line_above",
  "extend_line_above",
  "extend_line_above",
  "extend_line_below",
  "extend_line_below",
  "extend_line_below",
  "extend_line_below",
  "extend_line_below",
  "extend_line_below",
  "extend_line_below",
  ":pipe sgpt --code --model gpt-3.5-turbo --temperature 0.3 --no-cache 'Using this code, fill the line having the comment \"# FILL\". Return the whole code, including previous and next lines and tabs as in the input.'",
]
```

Dependency:
-
OK, I understand that Copilot will not be part of Helix core. But apparently we can use a Copilot LSP: https://github.com/TerminalFi/LSP-copilot. Does anyone have a working config for this LSP, and can somebody explain the current workflow for using Copilot in Helix?
-
Plugging an LSP code assistant that I've been working on: llmvm-codeassist. It can perform code completions with various models and include context, such as type definitions, in completion requests. So far I've only tested it with Helix and Rust.
-
I made https://github.com/efugier/smartcat over the weekend. It's not completion, but it covers most of my use cases. It basically makes LLMs Unix citizens so that you can pipe stuff into them, and because Helix is already a top-tier Unix citizen, they work great together! E.g. select a function, hit
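For illustration, a binding along these lines could wire a pipe-friendly tool like smartcat into Helix. This is a hypothetical sketch: it assumes smartcat installs an `sc` binary that takes a prompt argument and reads the code from stdin, and the prompt text is something you would customize.

```toml
# config.toml sketch; `sc` invocation and prompt are illustrative
[keys.select]
A-i = ":pipe sc \"refactor this for readability\""
```

The nice property of the `:pipe` approach is that any stdin-to-stdout tool works, with zero editor-side integration.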
-
Another tool that's worth playing around with: aichat
-
If anybody is still looking for a solution, I created an LSP assistant to do this: https://github.com/leona/helix-gpt, with support for Copilot and OpenAI models. I wasn't really happy with any of the current options and thought it would be something fun to make. It currently does straight code completion, no suggestions/explanations, and there's no need to bind any keys. Right now it uses the

The prompt could do with improvements, as it can sometimes include more than what you asked for. You could switch to GPT-4 and that would solve it, but I personally like the speed and cost of the turbo model.
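For anyone wondering what wiring a tool like this into Helix involves, here is a minimal languages.toml sketch. The language and companion language server shown are illustrative; the exact command, flags, and API-key environment variables are documented in the helix-gpt README.

```toml
# languages.toml sketch; names are illustrative
[language-server.gpt]
command = "helix-gpt"

[[language]]
name = "python"
language-servers = ["pylsp", "gpt"]
```

Because Helix merges any server listed in `language-servers` into its completion pipeline, no extra keybindings are needed once the server starts.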
-
I apologize for going a bit off-topic here, but as this seems to have gotten a bit heated, I'd like to bring some positivity to it.

Background

My history of editors (professionally, at least) looks like this:

I am reasonably happy with LazyVim; it gives me a "Full IDE" type of experience and does pretty well as a polyglot editor. However, it requires an entire repo of configuration files and countless plugins to get it there. Also, I miss Emacs's TRAMP mode. I'm always looking for new tools, so when I heard about Helix, I had to try it out. What amazes me the most is how similar an experience I can get to LazyVim, with zero plugins and practically no configuration (except some keybinds I can't live without). I think some of this power comes from its lack of plugins, and more specifically from how it can pick up any LSPs you have in your path and give you configuration-free language support. I'd love to live in a world where I can simply add

Back to Copilot

The things that actually pushed me over the edge moving from JetBrains to NeoVim were:

So, with that in mind:

I couldn't agree more! We are humble users of your burgeoning application. We are not clients, we have not bought your product, and you owe. us. nothing.

If you have tried using AI assistants and they didn't work for you, fair enough. However, if you haven't given them a solid try, I think you are doing yourself a disservice, as I (and others, it seems) have grown to rely on these tools to save time and keystrokes, which (IMO) is what mode-based editors are all about.

Again, an incredibly fair point. To be brutally honest, at this point in my life I am too busy with my main job and my side hustle to dive into something like this. However, I see a ton of promise in this project, and I'd love to make Helix my main editor.

The Call To Action (TL;DR)

I can't commit time to this project, but I'd be happy to contribute some bounties if that's something the maintainers would be open to. I'd be willing to pay $100 USD (maybe more) to get this feature supported "out of the box". Whether it be an LSP-based solution or whatever: if all I have to do is configure some URLs / API keys, I'm happy. I realize that amount doesn't begin to cover the work that would go into this, but I hope there are some other devs in this thread who would be willing to pitch in enough to make it viable and worthy of your time.

Either way, thanks for the awesome product! I do think Helix's model of editing is "better" than the traditional Vim style, and I'm really looking forward to seeing where it goes from here. Thank you
-
I use Helix full-time in a professional context. There are a few things I miss from previous editors/IDEs but, since I'm already in the terminal, I can usually find a satisfactory workaround. Seeing my colleagues with AI code assistants write boilerplate code in a fraction of the time it takes me, however, makes me question whether I'm doing myself, or rather the thing I'm paid for, my productivity, a disservice.

That said, I think this thread collectively underestimates how much tweaking goes into a genuinely useful AI code assistant. I'm talking AST parses of the code, tons of prompt engineering, potential re-prompting for consistency, output validation. Then there are the more obvious steps, like integrating with a broad set of (mostly proprietary) LLM web APIs. There is some standardization on the OpenAI spec (which is kind of a win, except for giving OpenAI a disproportionate say in shaping the future of LLM APIs). Also, it's a moving target, obviously.

The way I see it, this thread can and should be split up into several related but separate discussions:

Probably even this categorisation doesn't cover everything. I'd love to see progress on this topic. AI code assistants aren't going away, I think. Their usefulness today can absolutely be debated, as can their copyright issues, certainly, but it's getting harder to ignore them. We can't expect the core Helix team to pick up this immense task on their own, though. We need a way to split it up so that we can all contribute some pieces of the puzzle.
-
I sort of wonder if "in" the editor is the way to go here. I'm imagining some frame "around" the editor, in the style of tmux or zellij (or perhaps as a zellij plugin). This way the AI could consider more than just the editor contents:

You'd want to specify which panes contain context that you want considered, which one has the editor that you want suggestions in, and where to put the suggestions until they're accepted. Most of this could be editor-agnostic, although you might want a plugin for the editor which:

So the frame need not change the way the editor looks at all. The content of the suggestion would appear nearby, and a hotkey would apply it to the open file. This means doing much less on a per-editor basis. A dedicated and standardized suggestions API, via plugin or otherwise, would be nice, but the other option is that the AI could just write chars to /dev/tty and apply the suggestion in the same way a human would.
-
I agree that a larger context is needed, as I believe I already stated somewhere earlier on.

This suggests that Helix should be able to do something similar to what LSPs do to get a larger context for AIs, and that an AI that only looks at the current code snippet to generate something like standard code for an SQL prepare statement (or whatever) is ridiculously limited. I don't think LLMs like GPT-4o are up to any kind of creative support yet, but they do understand context and can help with trivialities if they get the right information and instruction. Getting the right information in the right amount is the real problem here, but it is likely solvable. Llama 3 with 8B parameters, which is practical to run locally, simply experiences instant amnesia if you feed it too much context.

[1] https://clang.llvm.org/docs/JSONCompilationDatabase.html
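The compilation-database idea (the Clang JSON compilation database linked at [1]) can be sketched as a small heuristic for choosing which files to feed an LLM as context. This is illustrative Python, not anything Helix ships; `context_files` is a hypothetical helper, and a real assistant would rank by include relationships or symbol overlap rather than just directory proximity.

```python
import json
from pathlib import Path

def context_files(db_path, current_file, limit=3):
    """Heuristic sketch: pick a few other translation units from a Clang
    compilation database (compile_commands.json) to feed an LLM as context.
    Prefers files that live in the same directory as current_file."""
    entries = json.loads(Path(db_path).read_text())
    cur = Path(current_file).resolve()
    candidates = []
    for entry in entries:
        # per the spec, "file" may be relative to "directory";
        # Path() keeps absolute "file" paths as-is
        f = Path(entry["directory"], entry["file"])
        if f.resolve() != cur and f.resolve().parent == cur.parent:
            candidates.append(str(f))
    return candidates[:limit]
```

Even a crude filter like this matters: as noted above, small local models degrade quickly when over-fed, so selecting less but more relevant context is the whole game.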
-
Unfortunately, without this feature, Helix will just be another old editor. Users who came over from Vim will go back to Vim (including me). Look at the huge number of users that Cursor has attracted just by having a good LLM integration.
-
I don't know what goes into supporting the likes of Copilot and Supermaven, but Neovim seemed to do it pretty easily. I could be wrong, but it seems to me that it's a philosophical rather than a technical decision not to integrate what is arguably the biggest text-editing innovation of our lifetime. It means I can only use Helix for the odd config edit and not on my real projects, because the automation of boilerplate is a convenience I can't go without now that it's standard in every major editor. I don't use any of the built-in chat garbage; if I want anything like that, I will use the LLM websites. I think Copilot and a Telescope-like live-grep search are all that's missing from Helix being a real option for software development, and I don't understand why it's not a priority, unless you don't care about getting more users, which is fair enough. Thankfully Zed exists in the meantime.