Bridges Meta's Llama Stack serve endpoint via Nginx JS transform to expose an OpenAI-compatible endpoint.
This project provides a bridge between Meta's Llama Stack serve endpoint and an OpenAI-compatible API. It uses Nginx with JavaScript transformations to adapt requests and responses between the two APIs.
- Configurable Upstream Host and Port: Easily set the upstream Llama Stack host and port via a `.env` file.
- Transformation Logic: Converts Llama Stack responses to match the OpenAI API specification.
- Dockerized Deployment: Simplifies deployment using Docker and Docker Compose.
- Docker and Docker Compose installed on your system.
- An instance of Llama Stack running and accessible.
git clone https://github.com/matthewhand/llama-stack-bridge.git
cd llama-stack-bridge
Copy the `.env.sample` file to `.env` and modify it to match your setup.
cp .env.sample .env
Edit the `.env` file to set the upstream Llama Stack host and port:
UPSTREAM_HOST=host.docker.internal
UPSTREAM_PORT=42069
- `UPSTREAM_HOST`: The hostname or IP address where your Llama Stack instance is running.
- `UPSTREAM_PORT`: The port on which your Llama Stack instance is listening.
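Docker Compose reads these variables from `.env` and passes them to the Nginx container. As a rough illustration of that wiring (the repository's actual `docker-compose.yaml` may differ; the internal port and service name below are assumptions):

```yaml
# Illustrative only; see the repository's docker-compose.yaml for the real file.
services:
  bridge:
    build: ./nginx              # image containing nginx.conf and transform.js
    ports:
      - "42070:80"              # host port the bridge listens on
    env_file:
      - .env                    # supplies UPSTREAM_HOST / UPSTREAM_PORT
    extra_hosts:
      - "host.docker.internal:host-gateway"  # lets the container reach the host
```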
Start the Nginx bridge using Docker Compose:
docker-compose up -d
This command builds the Docker image and starts the Nginx container.
You can test the bridge by sending a request to the OpenAI-compatible endpoint exposed by the bridge:
curl http://localhost:42070/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "your-model-id",
    "messages": [{"role": "user", "content": "Hello, world!"}]
  }'
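If the transformation succeeds, the bridge should reply in the standard OpenAI chat-completion shape. The field values below are illustrative, not actual output:

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1700000000,
  "model": "your-model-id",
  "choices": [
    {
      "index": 0,
      "message": { "role": "assistant", "content": "Hello! How can I help?" },
      "finish_reason": "stop"
    }
  ]
}
```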
The bridge expects a Llama Stack instance running and accessible at the host and port specified in your `.env` file.
To run Llama Stack, you can use the following command (adjust as necessary):
llama stack run llamastack --port 42069
This command launches Llama Stack with the `llamastack` profile on port `42069`.
For more information on setting up and running Llama Stack, refer to the official Llama Stack documentation.
Note: Setting up Llama Stack is outside the scope of this project.
- `nginx/`
  - `nginx.conf`: Nginx configuration file.
  - `transform.js`: JavaScript file containing the transformation logic.
  - `Dockerfile`: Dockerfile for building the Nginx image.
  - `cache/`: Directory for the Nginx cache.
  - `logs/`: Directory for Nginx logs.
- `docker-compose.yaml`: Docker Compose file for the bridge.
- `.env.sample`: Sample environment variables file.
- `.gitignore`: Git ignore file.
- `README.md`: Project documentation.
- `LICENSE`: License information.
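To give a feel for what `transform.js` does, here is a minimal sketch of the response mapping, written as a plain function rather than the project's actual njs handler. The upstream field names (`completion_message`, `stop_reason`) and the stop-reason values are assumptions about the Llama Stack response shape, not confirmed by this repository:

```javascript
// Illustrative sketch only -- not the project's actual transform.js, which runs
// under Nginx's njs module. Reshapes an assumed Llama Stack chat response into
// the OpenAI chat.completion format.
function toOpenAIChatCompletion(upstream, model) {
  const msg = (upstream && upstream.completion_message) || {};
  return {
    id: "chatcmpl-" + Date.now().toString(36),
    object: "chat.completion",
    created: Math.floor(Date.now() / 1000),
    model: model,
    choices: [
      {
        index: 0,
        message: { role: msg.role || "assistant", content: msg.content || "" },
        // Map the assumed upstream stop reason onto OpenAI finish_reason
        // values; this mapping is the area the "stop_reason bug" TODO refers to.
        finish_reason: msg.stop_reason === "out_of_tokens" ? "length" : "stop",
      },
    ],
  };
}
```

The real handler additionally has to parse and re-serialize the proxied body inside Nginx; this sketch covers only the field mapping.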
- Fix `stop_reason` bug
- Streaming support
This project is licensed under the MIT License. See the LICENSE file for details.