
Releases: Luxadevi/Ollama-Colab-Integration

Latest Ollama-Colab

17 Dec 08:02

Ollama-Colab-Integration V4 with latest Ollama-Companion

We are working to bring a lot of tools together into a powerful, multi-purpose LLM stack that is usable in many scenarios.

Ollama-Companion Github

Downloads come from the Ollama-Companion branch "Collab-Installer".

Older notebooks also remain usable.

What's new in Ollama-Companion V2

Manage Ollama from a user-friendly WebUI and quantize models automatically to GGUF format.

🌟 Streamlit Integration
Transitioned from Gradio to Streamlit for a dynamic UI.
Streamlined Streamlit workflow for intuitive model interaction.

🔄 PyTorch Model Conversion and Quantization
Convert PyTorch models directly from the WebUI.
Intuitive quantization tool in the WebUI for various computational needs.
Schedule and manage multiple conversions and quantizations.
Upload the resulting GGUF files to Hugging Face for easy sharing.
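Under the hood, PyTorch-to-GGUF conversion and quantization typically follow the llama.cpp workflow. A minimal command-line sketch of that pipeline (script and binary names vary between llama.cpp versions, and the paths and repo names here are illustrative, not the companion's exact internals):

```shell
# Convert a Hugging Face PyTorch model directory to an f16 GGUF file
# (the helper script is convert_hf_to_gguf.py in recent llama.cpp
# versions; older trees shipped it as convert.py)
python convert_hf_to_gguf.py ./my-model --outfile my-model-f16.gguf

# Quantize the f16 GGUF down to 4-bit Q4_K_M (the binary is called
# llama-quantize in recent builds, quantize in older ones)
./llama-quantize my-model-f16.gguf my-model-q4_k_m.gguf Q4_K_M

# Optionally push the quantized file to a Hugging Face repo
# (requires huggingface_hub and a valid token)
huggingface-cli upload my-username/my-model my-model-q4_k_m.gguf
```

The WebUI's scheduler essentially queues runs of this pipeline so several conversions and quantizations can proceed without manual babysitting.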

🗨️ Enhanced Chat Interface with Vision Model Support
Quick model switching in the chat for efficient language model interactions.
New support for vision models: upload images and engage with LLaVA for analysis and recognition.
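Behind the chat interface, vision requests go through Ollama's standard generate API, which accepts base64-encoded images. An illustrative call, assuming Ollama is running locally with a LLaVA model pulled:

```shell
# Send an image to a LLaVA model via Ollama's API (default port 11434);
# the "images" field takes base64-encoded image data
curl http://localhost:11434/api/generate -d '{
  "model": "llava",
  "prompt": "What is in this picture?",
  "images": ["<base64-encoded image data>"],
  "stream": false
}'
```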

🚀 Enhanced Model Management Features
Advanced ModelFile Templater for easy parameter customization.
Detailed model insights including licensing and parameters.
Simplified public endpoint management for original and OpenAI models.

🔐 Secure Cloudflared Tunneling
Independent Cloudflared tunneling for secure endpoint creation.
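A quick tunnel like the one the companion creates can also be opened by hand, pointing cloudflared at Ollama's default port:

```shell
# Expose the local Ollama API through an ephemeral trycloudflare.com URL
# (no Cloudflare account required for quick tunnels)
cloudflared tunnel --url http://localhost:11434
```

cloudflared prints the generated public URL to the terminal; anything sent to it is forwarded to the local Ollama instance.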

🔄 LiteLLM Proxy Integration
Direct control and automated polling for LiteLLM proxy management.
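For reference, the LiteLLM proxy can also be started manually in front of an Ollama model and then queried with OpenAI-style requests. The model name and port below are illustrative:

```shell
# Start the LiteLLM proxy in front of a local Ollama model
litellm --model ollama/llama2 --port 8000

# Query it with an OpenAI-compatible chat completion request
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "ollama/llama2",
       "messages": [{"role": "user", "content": "Hello"}]}'
```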

Public endpoints and WebUI

28 Nov 23:11
587a438

Exciting news: No more reliance on third-party apps to run Ollama on a Colab instance! We're thrilled to unveil the Ollama-Colab-Integration V3, more powerful than ever before!

This major update introduces 'Ollama Companion,' a user-friendly WebUI built to enhance your experience with Ollama-Colab-Integration.

Key features include:

Effortless one-click setup for public endpoints.
Smooth integration with an OpenAI-compatible proxy through LiteLLM.
Learn more about all the features of Ollama Companion and the new Ollama-Colab-Integration on our GitHub page.