
NotebookLlaMa🦙

A fluffy and open-source alternative to NotebookLM!

(Demo video)

This project is aimed at producing a fully open-source, LlamaCloud-backed alternative to NotebookLM.

Prerequisites

This project uses uv to manage dependencies. Before you begin, make sure you have uv installed.

On macOS and Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

On Windows:

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

For more install options, see uv's official documentation.

Get it up and running!

Get the GitHub repository:

git clone https://github.com/run-llama/notebookllama

Install dependencies:

cd notebookllama/
uv sync

Configure your API keys. First, rename the .env.example file to .env:

mv .env.example .env

Then open .env and fill in your API keys.
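The exact variable names are defined in the repository's .env.example, so use those; the sketch below is purely illustrative of what a filled-in .env looks like (the key names and placeholder values here are assumptions, not the project's actual configuration):

```shell
# Illustrative .env contents — keep the variable names that .env.example
# already defines, and replace the placeholder values with your real keys.
OPENAI_API_KEY="your-openai-key-here"
LLAMACLOUD_API_KEY="your-llamacloud-key-here"
```

Keep .env out of version control; it holds secrets, and the repository's .gitignore should already exclude it.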

Next, run the following setup scripts:

uv run tools/create_llama_extract_agent.py
uv run tools/create_llama_cloud_index.py

And you're ready to set up the app!

Launch Postgres and Jaeger:

docker compose up -d
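Since the containers start in detached mode (`-d`), nothing is printed afterwards; if you want to confirm that Postgres and Jaeger actually came up before moving on, `docker compose ps` lists the running services (the service names shown depend on the repository's docker-compose file):

```shell
# List the services started by docker compose and their current state;
# both containers should report "running" (or "healthy") before you continue.
docker compose ps
```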

Run the MCP server:

uv run src/notebookllama/server.py

Now, launch the Streamlit app:

streamlit run src/notebookllama/Home.py

Important

You might need to install ffmpeg if it is not already on your system.

And start exploring the app at http://localhost:8751/.

Contributing

Contribute to this project by following the contributing guidelines.

License

This project is provided under the MIT License.