This project is aimed at producing a fully open-source, LlamaCloud-backed alternative to NotebookLM.
This project uses `uv` to manage dependencies. Before you begin, make sure you have `uv` installed.
On macOS and Linux:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

On Windows:

```bash
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```
For more install options, see `uv`'s official documentation.
Clone the GitHub repository:

```bash
git clone https://github.com/run-llama/notebookllama
```
Install the dependencies:

```bash
cd notebookllama/
uv sync
```
Modify the `.env.example` file with your API keys:

- `OPENAI_API_KEY`: find it on the OpenAI Platform
- `ELEVENLABS_API_KEY`: find it in your ElevenLabs Settings
- `LLAMACLOUD_API_KEY`: find it on the LlamaCloud Dashboard
Then rename the file to `.env`:

```bash
mv .env.example .env
```
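If you want to sanity-check the renamed file before launching anything, a small helper like the one below can report which of the three keys listed above are missing or empty. This helper is not part of the repository; it is a minimal sketch that assumes a standard `KEY=value` dotenv layout.

```python
# Hypothetical helper (not in the repo): report required keys that are
# missing or empty in a dotenv-style file.
REQUIRED_KEYS = {"OPENAI_API_KEY", "ELEVENLABS_API_KEY", "LLAMACLOUD_API_KEY"}


def check_env_file(path: str = ".env") -> set[str]:
    """Return the required keys that are absent or blank in the file at `path`."""
    found: dict[str, str] = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines, comments, and anything that is not KEY=value.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            found[key.strip()] = value.strip()
    return {key for key in REQUIRED_KEYS if not found.get(key)}


if __name__ == "__main__":
    missing = check_env_file()
    if missing:
        print("Missing or empty keys:", ", ".join(sorted(missing)))
    else:
        print("All required keys are set.")
```

An empty result means all three keys have non-empty values.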
Now, run the following setup scripts:

```bash
uv run tools/create_llama_extract_agent.py
uv run tools/create_llama_cloud_index.py
```
And you're ready to set up the app!
Launch Postgres and Jaeger:

```bash
docker compose up -d
```
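The repository's own `docker-compose.yml` is authoritative; purely for orientation, a hypothetical minimal equivalent of the two services it brings up might look like the sketch below. The image tags, ports, and credentials here are assumptions, not values from the project.

```yaml
# Hypothetical sketch only -- use the docker-compose.yml shipped in the repo.
services:
  postgres:
    image: postgres:16              # assumed image tag
    environment:
      POSTGRES_USER: llama          # placeholder credentials
      POSTGRES_PASSWORD: llama
      POSTGRES_DB: notebookllama
    ports:
      - "5432:5432"                 # default Postgres port
  jaeger:
    image: jaegertracing/all-in-one:latest
    ports:
      - "16686:16686"               # Jaeger web UI
      - "4317:4317"                 # OTLP gRPC ingest
```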
Run the MCP server:

```bash
uv run src/notebookllama/server.py
```
Now, launch the Streamlit app:

```bash
streamlit run src/notebookllama/Home.py
```
**Important**: You might need to install `ffmpeg` if you do not have it installed already.
Then start exploring the app at http://localhost:8751/.
Contributions are welcome; please follow the project's contribution guidelines.
This project is provided under the MIT License.