
I recently discovered how to run large language models locally and thought I would share.




ai-llm-local

This project is a work in progress.

My goal with this project was to develop a straightforward solution that allows me and others to interact with freely available large language models without the expense of cloud resources.

To accomplish this, I'm primarily using Jupyter Notebooks and Ollama. Since I'm still fairly new to Python and I want to learn more about Python applications and web interfaces, I've also started two Python scripts that allow for interaction from a webpage using Gradio and Flask.
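As a rough sketch of what a Gradio script like this could look like: the function below sends a prompt to Ollama's default local HTTP endpoint and returns the model's reply. The model name (`llama3`) and helper names are illustrative assumptions, not the project's actual code.

```python
# Sketch: a minimal Gradio front end for a locally running Ollama server.
# Assumes Ollama is installed and serving on its default port (11434).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its response text."""
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def launch_ui():
    """Wire the prompt function to a simple Gradio text interface.

    Call launch_ui() with Ollama running (`ollama serve`) and a model pulled.
    """
    import gradio as gr  # assumed installed per the prerequisites
    gr.Interface(fn=ask_ollama, inputs="text", outputs="text",
                 title="Local LLM chat").launch()
```

A Flask version would look similar: the same `ask_ollama` call behind a route instead of a Gradio interface.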


Prerequisites

  • Python 3.12
  • Jupyter Notebook
  • Ollama
  • Gradio
  • Flask

Installing

A step-by-step guide to getting a development environment running:

  1. Clone the repository
  2. Download and install Ollama
    (https://ollama.com/)
  3. Install the Python dependencies using:
    'pip install -r requirements.txt'
  4. Open either of the Jupyter Notebooks:
    • llm_notebook_all_models
    • llm_notebook_select_model
  5. Further instructions are inside the notebooks.
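Once Ollama is installed and running, the notebooks can talk to it from Python. A minimal sketch of that interaction, using Ollama's local HTTP API to list which models are installed (the helper names here are assumptions, not the notebooks' actual code):

```python
# Sketch: query a locally running Ollama server for its installed models.
# GET /api/tags is Ollama's endpoint for listing local models.
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local address

def model_names(tags_response: dict) -> list[str]:
    """Extract model names from the JSON returned by GET /api/tags."""
    return [m["name"] for m in tags_response.get("models", [])]

def list_local_models() -> list[str]:
    """Ask the running Ollama server which models are installed locally.

    Requires `ollama serve` to be running; e.g. print(list_local_models()).
    """
    with urllib.request.urlopen(f"{OLLAMA_HOST}/api/tags") as resp:
        return model_names(json.loads(resp.read()))
```

This mirrors the split between the two notebooks: one could iterate over every name returned here, while the other prompts a single selected model.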
