A collection of Jupyter notebooks demonstrating the use of LlamaIndex, LangChain, Ollama, and the Transformers library for building generative AI applications in Python.
- Introduction
- Notebooks
- Prerequisites
- Installation
- Usage
- Slides
- Project: AI chatbot with RAG
- License
- Contact
This repository contains Jupyter notebooks that provide practical examples of using various frameworks for generative AI in Python. The notebooks are designed to help you understand and apply these tools in your own projects.
- Notebook: transformers_llms_usa_case_types.ipynb
- Description: Common examples of using the Transformers library with models from huggingface.co.
- Description: This notebook shows usage of the Transformers library with the same models under different values of generation parameters such as temperature, top_k, and no_repeat_ngram_size.
- Description: This notebook shows how to add examples to the prompt (few-shot prompting). An LLM provided with examples can produce significantly better results.
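The idea can be sketched in plain Python: a few labeled examples are prepended to the user's query so the model can infer the task format. The review texts, labels, and template below are hypothetical placeholders, not taken from the notebook.

```python
# Illustrative sketch of few-shot prompting: labeled examples precede the query.
EXAMPLES = [
    ("I loved this movie!", "positive"),
    ("The plot was dull and predictable.", "negative"),
]

def build_few_shot_prompt(query: str) -> str:
    """Prepend labeled examples so the model can infer the task format."""
    lines = ["Classify the sentiment of each review as positive or negative.", ""]
    for review, label in EXAMPLES:
        lines.append(f"Review: {review}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = build_few_shot_prompt("Great acting and a touching story.")
print(prompt)
```

The resulting string ends with an unfinished `Sentiment:` line, inviting the model to complete it with a label in the same format as the examples.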
- Description: This notebook shows how to fine-tune an LLM. Fine-tuning is an advanced technique that modifies model parameters to perform better on a specific task.
- Notebook: langchain_basic_examples.ipynb
- Description: "Get started" examples of using LLMs with the LangChain framework.
- Description: Chaining LLM tasks helps build advanced AI applications that can handle a sequence of tasks or reasoning steps.
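The chaining idea can be shown without any framework: each step's output feeds the next step's input. Plain functions stand in for LLM calls here so the sketch runs on its own; LangChain provides the same composition pattern with real model calls.

```python
# Conceptual sketch of chaining: each step's output becomes the next step's input.
from functools import reduce

def extract_topic(text: str) -> str:
    # Stand-in for an LLM call that extracts the main topic.
    return text.split()[0].lower()

def write_title(topic: str) -> str:
    # Stand-in for an LLM call that drafts a title for the topic.
    return f"An Introduction to {topic.capitalize()}"

def chain(*steps):
    """Compose steps left to right, like piping prompts between LLM calls."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

pipeline = chain(extract_topic, write_title)
print(pipeline("Transformers are a neural network architecture."))
# → An Introduction to Transformers
```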
- Notebook: langchain_templates_and_output_parsers.ipynb
- Description: This notebook presents LangChain prompt templates and output parsers. Prompt templates help structure prompts and instruct the LLM more precisely for specific tasks. Output parsers ensure that the type and formatting of returned values are always as expected.
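Both ideas can be sketched with the standard library alone: a template is a reusable prompt with named placeholders, and a parser coerces the model's raw text into a predictable Python type. LangChain's template and parser classes provide richer versions of both; the model reply below is hypothetical.

```python
# Minimal sketch of a prompt template plus an output parser.
from string import Template

# Template: a reusable prompt with named placeholders.
prompt_template = Template(
    "List three $category, separated by commas. Answer only with the list."
)

def parse_comma_list(raw: str) -> list[str]:
    """Output parser: coerce the model's raw text into a list of strings."""
    return [item.strip() for item in raw.split(",") if item.strip()]

prompt = prompt_template.substitute(category="programming languages")
reply = "Python, Rust, Go"  # hypothetical model reply
print(parse_comma_list(reply))  # → ['Python', 'Rust', 'Go']
```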
- Description: Storing and summarizing conversation history in a structured form.
- Description: This notebook presents LangChain criteria evaluators, which evaluate generated output with another LLM using defined categories.
- Description: Agents use an LLM to determine which actions to perform and in what order. Agents can use a set of predefined tools to solve complex tasks.
- Notebook: llamaindex_example.ipynb
- Description: This notebook shows how to run the LlamaIndex framework locally to create a virtual AI assistant based on RAG (Retrieval-Augmented Generation).
- Notebook: ollama_example.ipynb
- Description: Ollama is a tool for running LLMs in a local environment via an API. This provides simplicity and flexibility for building AI/LLM/RAG-based applications.
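A minimal sketch of talking to a locally running Ollama server over its REST API, using only the standard library. It assumes Ollama is installed, running on its default port, and that a model (here `llama3`, an assumption) has already been pulled.

```python
# Sketch of calling a local Ollama server's /api/generate endpoint.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Request body for /api/generate; stream=False returns one JSON reply."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama server):
# print(generate("llama3", "Explain RAG in one sentence."))
```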
- Notebook: vector_database.ipynb
- Description: This notebook presents how to create embeddings and store them in a vector database.
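What a vector database does can be illustrated with a toy in-memory store: keep (text, embedding) pairs and retrieve the entries closest to a query by cosine similarity. Real embeddings come from a model; the tiny hand-made vectors below are placeholders.

```python
# Toy in-memory vector store: store embeddings, retrieve by cosine similarity.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

store: list[tuple[str, list[float]]] = [
    ("cats are small pets", [0.9, 0.1, 0.0]),
    ("dogs love to play fetch", [0.8, 0.3, 0.1]),
    ("the stock market fell today", [0.0, 0.1, 0.9]),
]

def search(query_embedding: list[float], k: int = 1) -> list[str]:
    """Return the k stored texts whose embeddings are closest to the query."""
    ranked = sorted(
        store,
        key=lambda item: cosine_similarity(query_embedding, item[1]),
        reverse=True,
    )
    return [text for text, _ in ranked[:k]]

print(search([0.1, 0.0, 0.95]))  # → ['the stock market fell today']
```

A real vector database adds persistence and fast approximate nearest-neighbor indexing, but the store-and-rank idea is the same.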
- Notebook: evaluation_rouge_bleu.ipynb
- Description: This notebook presents evaluation techniques for LLM output such as ROUGE, BLEU, and METEOR. All presented metrics compare the words in AI-generated text against a human-provided reference text.
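The word-comparison principle these metrics share can be shown with a hand-rolled recall-style unigram overlap, the idea behind ROUGE-1. This is only an illustration; in practice an evaluation library should be used.

```python
# Illustrative ROUGE-1-style recall: clipped unigram overlap with a reference.
from collections import Counter

def rouge1_recall(candidate: str, reference: str) -> float:
    """Fraction of reference words that also appear in the candidate (counts clipped)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum(min(cand[w], ref[w]) for w in ref)
    return overlap / sum(ref.values())

score = rouge1_recall("the cat sat on the mat", "the cat is on the mat")
print(round(score, 2))  # 5 of 6 reference words overlap → 0.83
```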
- Python 3.7 or higher
- Jupyter Notebook or JupyterLab
- Libraries installed as specified in requirements.txt
Clone the repository and install the required packages:
git clone https://github.com/rzarno/course-generative-ai-python.git
cd course-generative-ai-python
pip install -r requirements.txt
Open the notebooks using Jupyter:
jupyter lab
Navigate to the notebook you wish to explore and run the cells sequentially.
Find definitions and examples of working with LLMs in Python in the presentations in the SLIDES directory. Currently available in Polish (English translation in progress).
1. Architecture and types of large language models.pdf
2. Usage of LLM and RAG - case study.pdf
3. Practising with LLMs, prompt engineering.pdf
4. Frameworks for LLMs - LangChain, Ollama, LlamaIndex, HuggingFace.pdf
5. RAG - Retrieval Augmented Generation.pdf
6. LLM Results evaluation.pdf
7. Parameter-Efficient Fine-Tuning (PEFT), LoRA i RLHF.pdf
An AI chatbot built with the Streamlit framework: project-ai-chatbot-rag-langchain, based on https://github.com/shashankdeshpande/langchain-chatbot.
This project is licensed under the MIT License—see the LICENSE file for details.
For any questions or feedback, please open an issue or contact rzarno.