# ai-llm-local

This project is a work in progress.

The goal of this project is to provide a straightforward way for me and others to interact with freely available large language models locally, without the expense of cloud resources.

To accomplish this, I'm primarily using Jupyter Notebooks and Ollama, a tool for running large language models locally. Since I'm still fairly new to Python and want to learn more about Python applications and web interfaces, I've also started two Python scripts that allow interaction from a webpage using Gradio and Flask; a rough sketch of the Gradio approach is shown below.
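
As an illustration of the web-interface idea (not the project's actual script), the sketch below wires a minimal Gradio page to a local Ollama model. The model name `llama3` and the use of the `ollama` Python client are assumptions; substitute whatever model you have pulled.

```python
# Hypothetical minimal sketch, not the repository's actual Gradio script:
# a simple web page that forwards a prompt to a locally running Ollama server.
import gradio as gr
import ollama  # assumes the ollama Python client is installed


def ask(prompt: str) -> str:
    # Single-turn chat request to the local Ollama daemon (default port 11434).
    response = ollama.chat(
        model="llama3",  # placeholder; use any model you have pulled
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]


demo = gr.Interface(fn=ask, inputs="text", outputs="text", title="Local LLM chat")

if __name__ == "__main__":
    demo.launch()  # serves a local web page, typically at http://127.0.0.1:7860
```

A Flask version would follow the same pattern, with the `ask` function behind a route instead of a Gradio interface.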


## Prerequisites

- Python 3.12
- Jupyter Notebook
- Ollama
- Gradio
- Flask

## Installing

A step-by-step guide to getting a development environment running:

1. Clone the repository.
2. Download and install Ollama from https://ollama.com/.
3. Install the Python dependencies:
   `pip install -r requirements.txt`
4. Open either of the Jupyter Notebooks:
   - llm_notebook_all_models
   - llm_notebook_select_model
5. Follow the further instructions inside the notebooks; a rough sketch of the kind of interaction they perform is shown below.
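
For orientation only, here is a hedged sketch of the kind of cell the notebooks might run, assuming the `ollama` Python client is among the installed dependencies. The model name `llama3` is a placeholder for whatever model you have pulled locally.

```python
# Hypothetical sketch, not the notebooks' actual code: list the models the local
# Ollama install knows about, then send one prompt to a chosen model.
import ollama

# Roughly what a notebook covering all models might iterate over.
for entry in ollama.list()["models"]:
    print(entry)

# Roughly what a "select a model" notebook might do for a single model.
reply = ollama.chat(
    model="llama3",  # placeholder; substitute any model you have pulled
    messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
)
print(reply["message"]["content"])
```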