@@ -27,17 +27,18 @@
 2. [Install plugin to QtCreator](#install-plugin-to-qtcreator)
 3. [Configure for Anthropic Claude](#configure-for-anthropic-claude)
 4. [Configure for OpenAI](#configure-for-openai)
-4. [Configure for Mistral AI](#configure-for-mistral-ai)
-4. [Configure for Google AI](#configure-for-google-ai)
-5. [Configure for Ollama](#configure-for-ollama)
-6. [System Prompt Configuration](#system-prompt-configuration)
-7. [File Context Features](#file-context-features)
-9. [QtCreator Version Compatibility](#qtcreator-version-compatibility)
-10. [Development Progress](#development-progress)
-11. [Hotkeys](#hotkeys)
-12. [Troubleshooting](#troubleshooting)
-13. [Support the Development](#support-the-development-of-qodeassist)
-14. [How to Build](#how-to-build)
+5. [Configure for Mistral AI](#configure-for-mistral-ai)
+6. [Configure for Google AI](#configure-for-google-ai)
+7. [Configure for Ollama](#configure-for-ollama)
+8. [Configure for llama.cpp](#configure-for-llamacpp)
+9. [System Prompt Configuration](#system-prompt-configuration)
+10. [File Context Features](#file-context-features)
+11. [QtCreator Version Compatibility](#qtcreator-version-compatibility)
+12. [Development Progress](#development-progress)
+13. [Hotkeys](#hotkeys)
+14. [Troubleshooting](#troubleshooting)
+15. [Support the Development](#support-the-development-of-qodeassist)
+16. [How to Build](#how-to-build)
 
 ## Overview
 
@@ -51,6 +52,7 @@
 - Automatic syncing with open editor files (optional)
 - Support for multiple LLM providers:
   - Ollama
+  - llama.cpp
   - OpenAI
   - Anthropic Claude
   - LM Studio
@@ -184,6 +186,18 @@ You're all set! QodeAssist is now ready to use in Qt Creator.
 <img width="824" alt="Ollama Settings" src="https://github.com/user-attachments/assets/ed64e03a-a923-467a-aa44-4f790e315b53" />
 </details>
 
+## Configure for llama.cpp
+1. Open Qt Creator settings and navigate to the QodeAssist section
+2. Go to the General tab and configure:
+   - Set "llama.cpp" as the provider for code completion and/or chat assistant
+   - Set the llama.cpp URL (e.g. http://localhost:8080)
+   - Fill in the model name
+   - Choose a template for the model (e.g. "llama.cpp FIM" for any model with FIM support)
+<details>
+  <summary>Example of llama.cpp settings (click to expand)</summary>
+  <img width="829" alt="llama.cpp Settings" src="https://github.com/user-attachments/assets/8c75602c-60f3-49ed-a7a9-d3c972061ea2" />
+</details>
+
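+These settings assume a llama.cpp server is already listening at the configured URL. A minimal sketch of starting one, assuming a recent llama.cpp build (which names the server binary `llama-server`); the model path below is a placeholder, and code completion additionally needs a FIM-capable model such as a Qwen2.5-Coder build:
+
+```bash
+# Placeholder model path; keep --port in sync with the URL configured above.
+llama-server -m ./models/qwen2.5-coder-7b-instruct-q4_k_m.gguf --port 8080
+```
+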
 ## System Prompt Configuration
 
 The plugin comes with default system prompts optimized for chat and instruct models, as these currently provide better results for code assistance. If you prefer using FIM (Fill-in-Middle) models, you can easily customize the system prompt in the settings.
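 For context, a FIM model completes the gap between a prefix and a suffix rather than replying to a chat transcript, which is why it calls for a different prompt than the chat-oriented defaults. An illustration of the request shape, shown here with Qwen2.5-Coder's sentinel tokens purely as an example (other FIM models use different sentinels, e.g. CodeLlama's <PRE>/<SUF>/<MID>); the model generates the missing middle, here `int b`:
 
 ```
 <|fim_prefix|>int add(int a, <|fim_suffix|>) {
     return a + b;
 }<|fim_middle|>
 ```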