🚀 Local DeepSeek-r1 Power with Ollama!
Hey everyone,
We've just rolled out a new release packed with awesome updates:
- Browser-Use Upgrade: We're now fully compatible with the latest `browser-use` version 0.1.29! 🎉
- Local Ollama Integration: Get ready for completely local and private AI with support for the incredible `deepseek-r1` model via Ollama! 🏠
Before You Dive In:
- Update Code: Don't forget to `git pull` to grab the latest code changes.
- Reinstall Dependencies: Run `pip install -r requirements.txt` to ensure all your dependencies are up to date.
Important Notes on `deepseek-r1`:
- Model Size Matters: We've found that `deepseek-r1:14b` and larger models work exceptionally well! Smaller models may not provide the best experience, so we recommend sticking with the larger options. 🤔
How to Get Started with Ollama and `deepseek-r1`:
- Install Ollama: Head over to the Ollama website and download/install Ollama on your system. 💻
- Run `deepseek-r1`: Open your terminal and run the command `ollama run deepseek-r1:14b` (or a larger model if you prefer).
- WebUI Setup: Launch the WebUI following the instructions. Here's a crucial step: uncheck "Use Vision" and set "Max Actions per Step" to 1. ✅
- Enjoy! You're now all set to experience the power of local `deepseek-r1`. Have fun! 🥳
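If you want to sanity-check that Ollama is serving the model before launching the WebUI, here's a minimal Python sketch that talks to Ollama's default local endpoint (`http://localhost:11434`) using its standard `/api/generate` request shape. The WebUI does this for you; this is only for verifying your setup directly. The function names are our own, and the model tag is just the one recommended above:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes a stock install; adjust if you changed the port).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "deepseek-r1:14b") -> dict:
    """Build a non-streaming generate request for the local Ollama API."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "deepseek-r1:14b") -> str:
    """Send the prompt to the local Ollama server and return the generated text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `ollama run deepseek-r1:14b` already running, calling `generate("Say hello in one short sentence.")` should return a short reply; if it raises a connection error, Ollama isn't up yet.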
Happy Chinese New Year! 🏮