Commit ecc9e68: updated readme
rashadphz committed Jun 29, 2024 · 1 parent f238bdd
1 changed file: README.md (7 additions, 84 deletions)
- [x] Create a pre-built Docker Image
- [x] Add support for custom LLMs through LiteLLM
- [x] Chat History
- [x] Expert Search
- [ ] Chat with local files


- Answer questions with cloud models (OpenAI/gpt4-o, OpenAI/gpt3.5-turbo, Groq/Llama3)
- Answer questions with local models (llama3, mistral, gemma, phi3)
- Answer questions with any custom LLMs through [LiteLLM](https://litellm.vercel.app/docs/providers)
- Search with an agent that plans and executes the search for better results

## 🏃🏿‍♂️ Getting Started Locally


### Quick Start:
```
git clone https://github.com/rashadphz/farfalle.git
cd farfalle && cp .env-template .env
```
```
docker run \
  -p 8000:8000 -p 3000:3000 -p 8080:8080 \
  --add-host=host.docker.internal:host-gateway \
  ghcr.io/rashadphz/farfalle:main
```
Modify `.env` with your API keys (optional; not required if you are using Ollama).

#### Optional
- `OPENAI_API_KEY`: Your OpenAI API key. Not required if you are using Ollama.
- `SEARCH_PROVIDER`: The search provider to use. Can be `tavily`, `serper`, `bing`, or `searxng`.
- `TAVILY_API_KEY`: Your Tavily API key.
- `SERPER_API_KEY`: Your Serper API key.
- `BING_API_KEY`: Your Bing API key.
- `GROQ_API_KEY`: Your Groq API key.
- `SEARXNG_BASE_URL`: The base URL for the SearXNG instance.
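For example, a minimal `.env` for a keyless setup might look like this (illustrative values, assuming a local SearXNG instance on the 8080 port mapped above and Ollama for answers):

```shell
# Illustrative .env: SearXNG as the search provider, Ollama for answers.
# No API keys are needed for this combination.
SEARCH_PROVIDER=searxng
SEARXNG_BASE_URL=http://localhost:8080
```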

Add any environment variable to the `docker run` command like so:
```
docker run \
-e ENV_VAR_NAME1='YOUR_ENV_VAR_VALUE1' \
-e ENV_VAR_NAME2='YOUR_ENV_VAR_VALUE2' \
-p 8000:8000 -p 3000:3000 -p 8080:8080 \
--add-host=host.docker.internal:host-gateway \
ghcr.io/rashadphz/farfalle:main
```



Wait for the app to start, then visit [http://localhost:3000](http://localhost:3000).

Alternatively, follow the instructions below to clone the repo and run the app locally.


### 1. Clone the Repo

```
git clone [email protected]:rashadphz/farfalle.git
cd farfalle
```

### 2. Add Environment Variables
```
touch .env
```

Add the following variables to the .env file:

#### Search Provider
You can use Tavily, Searxng, Serper, or Bing as the search provider.

**Searxng** (No API Key Required)
```
SEARCH_PROVIDER=searxng
```

**Tavily** (Requires API Key)
```
TAVILY_API_KEY=...
SEARCH_PROVIDER=tavily
```
**Serper** (Requires API Key)
```
SERPER_API_KEY=...
SEARCH_PROVIDER=serper
```

**Bing** (Requires API Key)
```
BING_API_KEY=...
SEARCH_PROVIDER=bing
```
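The four provider options above can be sanity-checked before launch. Here is a small shell guard (a hypothetical helper, not part of the repo; the hard-coded value stands in for whatever your `.env` sets):

```shell
# Sketch: fail fast if SEARCH_PROVIDER is not one of the supported values.
SEARCH_PROVIDER=tavily   # illustrative; read from .env in practice
case "$SEARCH_PROVIDER" in
  tavily|serper|bing|searxng)
    echo "search provider: $SEARCH_PROVIDER"
    ;;
  *)
    echo "unsupported SEARCH_PROVIDER: $SEARCH_PROVIDER" >&2
    exit 1
    ;;
esac
```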


#### Optional
```
# Cloud Models
OPENAI_API_KEY=...
GROQ_API_KEY=...
# See https://litellm.vercel.app/docs/providers for the full list of supported models
CUSTOM_MODEL=...
```

### 3. Run Containers
This requires Docker Compose version 2.22.0 or later.
Start the app:
```
docker-compose -f docker-compose.dev.yaml up -d
```
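The 2.22.0 minimum above can be checked with a small version comparison before starting (a sketch using GNU `sort -V`; the hard-coded `have` value stands in for the output of `docker compose version --short`):

```shell
# Sketch: verify Docker Compose meets the 2.22.0 minimum.
required="2.22.0"
have="2.24.5"  # substitute the real output of: docker compose version --short
# sort -V orders version strings; if the required version sorts first,
# the installed version is new enough.
lowest=$(printf '%s\n%s\n' "$required" "$have" | sort -V | head -n1)
if [ "$lowest" = "$required" ]; then
  echo "ok"
else
  echo "Docker Compose $have is too old; need >= $required" >&2
fi
```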

Wait for the app to start, then visit [http://localhost:3000](http://localhost:3000) to view the app.

For custom setup instructions, see [custom-setup-instructions.md](/custom-setup-instructions.md)

