posts/ai-stack-tutorial/ #326
Replies: 7 comments 5 replies
-
Thanks for sharing, Tim! Quick note: I was unable to get Open WebUI working with SearXNG, even though the containers were able to connect (docker exec into the webui container, curl the searxng one, no problems). All I got in the WebUI was a 403. I tried it with and without the search?q= parameter and always got the same result. The fix for me was to edit the SearXNG settings.yml file and, under formats, add json as an option (only html is there by default). After that it instantly worked. This is also mentioned in the Open WebUI documentation for enabling the SearXNG integration.
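For reference, the relevant part of settings.yml ends up looking something like this (a minimal sketch; the exact surrounding keys depend on your SearXNG config):

```yaml
# searxng/settings.yml (excerpt)
search:
  formats:
    - html
    - json   # required for Open WebUI's web search integration; html-only by default
```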
-
I already mentioned it in the Twitch chat yesterday, but I've got it all running now (except the Home Assistant integration; need to do that next).
-
For me, the Ollama container wouldn't use the graphics card and failed with the error "CPU does not have minimum vector extensions ...." because AVX is missing on my CPU. Thank you Tim for this tutorial!
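For anyone else whose container can't see the GPU: assuming an NVIDIA card with the NVIDIA Container Toolkit installed, the usual compose-level sketch for exposing it to the Ollama service looks like this (note that the AVX error above is a CPU-side requirement, so this alone won't fix it on hardware without AVX):

```yaml
services:
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all           # or a specific GPU count
              capabilities: [gpu]
```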
-
Thanks Tim for this tutorial! Biggest takeaway... aside from the fact that we now have a local AI setup... btop! I remember when htop was new and cool! I just set it up for GPU stats and my mind was blown. Actually, that might make for good content: monitoring a local system and collecting stats.
-
Thanks for the amazing resource. You seem to use the following environment variables everywhere:
Are you sure those are supported environment variables in these images? I can't find any mention of them in either the SearXNG or Open WebUI repos. I try to run most containers as a non-root user, but both SearXNG and Open WebUI create files as root despite using these variables, so I just wanted to check. Thanks again.
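For context, per-image UID/GID environment variables are a convention of specific image maintainers (linuxserver.io images, for example) rather than a Docker feature, so many images simply ignore them. A hedged, generic alternative is compose's user: key, though whether a given image tolerates running as an arbitrary UID depends on its entrypoint:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    # Run the container process as a non-root user directly; the UID:GID here
    # is hypothetical and must own any bind-mounted data directories.
    user: "1000:1000"
```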
-
Awesome job as always! Maybe a dumb question, but where is the Traefik server running in this setup? Does it have to be set up on the same machine and use the same external Docker network? I got the Traefik + wildcard certs container set up and running (another awesome video!) but now I'm scratching my head a little over how to tie these together.
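For context, a hedged sketch of how a service typically joins an existing Traefik setup: a plain external Docker network is host-local, so Traefik and this stack would need to run on the same Docker host to share it. Wiring a service into a pre-existing network named traefik would look roughly like this (hostname and router name are hypothetical):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    labels:
      - traefik.enable=true
      - traefik.http.routers.open-webui.rule=Host(`chat.example.com`)   # hypothetical host
      - traefik.http.routers.open-webui.entrypoints=websecure
      - traefik.http.services.open-webui.loadbalancer.server.port=8080  # Open WebUI's internal port
    networks:
      - traefik

networks:
  traefik:
    external: true
```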
-
Awesome job on this guide! I was able to get most of this up and running, but I noticed this guide wasn't as detailed as your Traefik guide. Would you consider doing a follow-up or providing additional details, especially on pulling models and linking them to the Docker instance? For example, instructions on how to get the Llama files and configure them to run in Docker from the /opt/stacks/ai-stack/ollama/models directory. Additionally, I noticed that if you followed the Traefik guide, the network should be set to proxy, unless there's another reason for adding a new network that I might not be understanding. Example:

```yaml
traefik:
  external: true
```

Would love your insights or updates to make it clearer for everyone. Thanks again for the excellent work!
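On the model question, a hedged note: with the official Ollama image, models pulled via the CLI inside the container (e.g. `docker compose exec ollama ollama pull llama3`) are stored under /root/.ollama, so if that directory is bind-mounted to the host, the files persist automatically and no manual download is needed. A sketch of that mount, with the host path assumed from the comment above rather than confirmed from the guide:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      # Ollama's default data dir; pulled models land in .../models on the host.
      - /opt/stacks/ai-stack/ollama:/root/.ollama
```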
-
In this tutorial we’ll walk through my local, private, self-hosted AI stack so that you can run it too. If you’re looking for an overview of this stack, you can check out the video here: Self-Hosted AI That’s Actually Useful
https://technotim.live/posts/ai-stack-tutorial/