
ProtoLLM


Intro

ProtoLLM is an open-source framework for rapid prototyping of LLM-based applications.

ProtoLLM features

  • Rapid prototyping of LLM-based information retrieval systems using RAG:
    Implementations of architectural patterns for interacting with different databases and web-service interfaces; methods for optimising RAG pipelines to eliminate redundancy.
  • Development and integration of LLM applications that connect external services and models through a plugin system:
    Integration with AutoML solutions for predictive tasks; structured output generation and validation.
  • Ensemble methods and multi-agent approaches that improve the efficiency of LLMs (see the sketch after this list):
    Combining arbitrary LLMs into ensembles to improve generation quality, with automatic selection of the ensemble composition; support for agent models and ensemble pipelines.
  • Generation of complex synthetic data for further training and improvement of LLMs:
    Generating examples from existing models and datasets; evolutionary optimisation to increase the diversity of examples; integration with Label Studio.
  • Interoperability with various LLM providers:
    Support for native models (GigaChat, YandexGPT, vsegpt, etc.) and interaction with open-source models deployed locally.
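The ensemble item above can be illustrated with a minimal, self-contained sketch of majority voting over several models. Every name in this snippet is hypothetical plain Python written for illustration only; it does not reflect ProtoLLM's actual API (see the examples directory for real usage).

from collections import Counter
from typing import Callable, List

LLM = Callable[[str], str]  # any callable that maps a prompt to a completion

def ensemble_vote(models: List[LLM], prompt: str) -> str:
    """Query every model and return the most frequent answer (majority vote)."""
    answers = [model(prompt) for model in models]
    return Counter(answers).most_common(1)[0][0]

# Stand-in "models" for demonstration; in practice these would wrap
# providers such as GigaChat, YandexGPT, or a locally deployed model.
models = [lambda p: "42", lambda p: "42", lambda p: "forty-two"]
print(ensemble_vote(models, "What is the answer?"))  # prints "42"

ProtoLLM's own ensemble tooling additionally handles automatic selection of the ensemble composition, which a simple majority vote like this does not.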

Installation


The simplest way to install ProtoLLM is using pip:

$ pip install protollm

Tool modules can also be installed separately:

$ pip install protollm-worker

$ pip install protollm-api

$ pip install protollm-sdk
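
After installation, a quick sanity check with the Python standard library shows which distributions are present. This snippet only assumes the package names used in the pip commands above:

from importlib.metadata import version, PackageNotFoundError

# Report the installed version of each ProtoLLM distribution, if present.
for dist in ("protollm", "protollm-worker", "protollm-api", "protollm-sdk"):
    try:
        print(f"{dist}: {version(dist)}")
    except PackageNotFoundError:
        print(f"{dist}: not installed")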

Project Structure

The latest stable release of ProtoLLM is in the master branch.

The repository includes the following directories:

  • Package protollm contains the main modules. It is the core of the ProtoLLM framework;
  • Package protollm_tools contains auxiliary tools with specific dependencies;
  • Package examples includes several how-to use cases that are a good starting point for discovering how ProtoLLM works;
  • All unit and integration tests are located in the test directory;
  • The sources of the documentation are in the docs directory.

Contribution Guide

  • The contribution guide is available in this repository.

Acknowledgments

We thank the contributors for their important work and the participants of numerous scientific conferences and workshops for their valuable advice and suggestions.

Supported by

The study is supported by the Research Center "Strong Artificial Intelligence in Industry" of ITMO University as part of the center's program "Framework for rapid application prototyping based on large language models".

Contacts

Papers about ProtoLLM-based solutions:

  • Zakharov K. et al. Forecasting Population Migration in Small Settlements Using Generative Models under Conditions of Data Scarcity // Smart Cities. 2024. Vol. 7, No. 5. pp. 2495-2513.
  • Kovalchuk M. A. et al. SemConvTree: Semantic Convolutional Quadtrees for Multi-Scale Event Detection in Smart City // Smart Cities. 2024. Vol. 7, No. 5. pp. 2763-2780.
  • Kalyuzhnaya A. et al. LLM Agents for Smart City Management: Enhancing Decision Support through Multi-Agent AI Systems. 2024. Under review.
