Inference Gateway Documentation


This repository contains the documentation website for Inference Gateway, an open-source API gateway for Large Language Models (LLMs) that provides a unified interface for accessing multiple AI providers.

About Inference Gateway

Inference Gateway offers a unified API layer for multiple LLM providers, including OpenAI, DeepSeek, Anthropic, Groq, Cohere, Ollama, and more. It exposes a single, consistent interface and abstracts away the differences between each provider's API.
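
Because the interface is consistent, one request shape works across providers. The sketch below is purely illustrative; the port, endpoint path, provider prefix, and model name are assumptions and are not documented in this README:

# Hypothetical request against a locally running gateway; the URL,
# provider prefix, and model name are assumptions, not documented values.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}]
  }'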

Development

This documentation site is built with Next.js.

# Install dependencies
npm install

# Start development server
npm run dev

You can use the devcontainer for a consistent development environment; it comes preconfigured with all the necessary tools and extensions.
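
To check changes against a production build, the standard Next.js scripts can be used; whether this repository keeps the default script names in package.json is an assumption:

# Build and serve a production bundle (assumes the default Next.js
# "build" and "start" scripts are defined in package.json)
npm run build
npm run start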

Contributing

Contributions to improve the documentation are welcome! You can:

  1. Edit existing MDX files in the markdown directory
  2. Add new documentation pages, as shown in the sketch after this list
  3. Improve the site's design and functionality
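
For example, a new page can be added by creating an MDX file in the markdown directory. This is a hypothetical sketch; the file name and frontmatter fields are assumptions, so check existing pages for the actual conventions:

# Hypothetical: file name and frontmatter fields are illustrative only
cat > markdown/my-new-page.mdx <<'EOF'
---
title: My New Page
---

Page content goes here.
EOF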

License

This project is licensed under the MIT License.