
🚀 Local DeepSeek-r1 Power with Ollama!

Released by @warmshao on 28 Jan, 12:52

Hey everyone,

We've just rolled out a new release packed with awesome updates:

  1. Browser-Use Upgrade: We're now fully compatible with the latest browser-use version 0.1.29! 🎉
  2. Local Ollama Integration: Get ready for completely local and private AI with support for the incredible deepseek-r1 model via Ollama! 🏠

Before You Dive In:

  • Update Code: Don't forget to git pull to grab the latest code changes.
  • Reinstall Dependencies: Run pip install -r requirements.txt to ensure all your dependencies are up to date. (Both commands are collected in the snippet below.)
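
Here are both update steps in one copy-paste block (the two commands come straight from the notes above); the pip show line is just an optional extra to confirm the browser-use upgrade landed:

```bash
# Update the code and reinstall dependencies (run from your local clone).
git pull
pip install -r requirements.txt

# Optional: confirm the browser-use upgrade took effect.
# This release targets browser-use 0.1.29, so the version printed here
# should be 0.1.29 (or newer).
pip show browser-use
```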

Important Notes on deepseek-r1:

  • Model Size Matters: We've found that deepseek-r1:14b and larger models work exceptionally well! Smaller models may not provide the best experience, so we recommend sticking with the larger options (an example pull command follows). 🤔
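
If you want to grab the weights ahead of time, here's a minimal sketch using the Ollama CLI. deepseek-r1:14b is the tag recommended above; the larger tag is only a suggestion, so check the deepseek-r1 page in Ollama's model library for the sizes actually available and what your hardware can handle:

```bash
# Pre-download the recommended model so the first WebUI run doesn't stall.
ollama pull deepseek-r1:14b

# If you have the memory for it, a larger tag tends to behave even better.
# (Tag name is an example; see Ollama's model library for available sizes.)
# ollama pull deepseek-r1:32b
```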

How to Get Started with Ollama and deepseek-r1:

  1. Install Ollama: Head over to ollama.com and download/install Ollama on your system. 💻
  2. Run deepseek-r1: Open your terminal and run the command: ollama run deepseek-r1:14b (or a larger model if you prefer).
  3. WebUI Setup: Launch the WebUI following the instructions. Here's a crucial step: Uncheck "Use Vision" and set "Max Actions per Step" to 1. ✅
  4. Enjoy! You're now all set to experience the power of local deepseek-r1 (a quick sanity check is sketched below if something doesn't respond). Have fun! 🥳
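
If the agent seems stuck, here's a quick way to check that Ollama is up and the model is actually serving, assuming Ollama's default local API on port 11434:

```bash
# List the models Ollama has downloaded; deepseek-r1:14b should appear here.
ollama list

# Ask the local Ollama server for a one-off completion (default port 11434).
# "stream": false returns a single JSON response instead of a token stream.
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-r1:14b",
  "prompt": "Say hello in one short sentence.",
  "stream": false
}'
```

If the curl call comes back with a JSON body containing a "response" field, the model is serving locally and the WebUI should be able to reach it.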

Happy Chinese New Year! 🏮