An intelligent FAQ assistant that accurately answers user queries by matching them with semantically similar questions from a predefined FAQ database. When a sufficiently close match isn't found, the system falls back to the OpenAI API to generate a relevant response.
- About
- Features
- Project Structure
- Prerequisites
- Installation
- Usage
- API Documentation
- Monitoring
- Testing
- FAQ Database
- License
AskMe is a web-based FAQ assistant powered by NLP models and PostgreSQL's pgvector
for embedding similarity search. It provides users with precise answers by querying preloaded FAQs stored in a PostgreSQL database and can be extended using OpenAI's GPT models for more complex queries.
- FAQ Querying: Get precise answers to commonly asked questions.
- Embedding Computing: Compute embeddings from the provided FAQ data for later similarity searches.
- LangChain: Utilize LangChain for NLP tasks outside of OpenAI interactions.
- OpenAI Integration: For queries that don't match any existing FAQ, the system can generate responses using OpenAI via LangChain.
- FastAPI Endpoint: Provide an API for users to submit questions and receive answers.
- Authentication: Endpoints are protected using FastAPI’s dependency-injection mechanism.
- PostgreSQL for FAQ and Embeddings: Integrated pgvector to store and query vector embeddings.
- Containerized Deployment: Full Docker support for easy deployment.
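The match-or-fallback flow described above can be sketched as follows. This is an illustrative sketch, not code from the repo: the toy embeddings and FAQ entries are placeholders, and in the real app the nearest-neighbour search would run inside PostgreSQL via pgvector rather than in Python.

```python
import math

SIMILARITY_THRESHOLD = 0.7  # mirrors the configurable threshold described below

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def answer(query_embedding, faq):
    """Return the best-matching FAQ answer, or None to signal an OpenAI fallback."""
    best_score, best_answer = max(
        (cosine_similarity(query_embedding, emb), ans) for emb, ans in faq
    )
    if best_score >= SIMILARITY_THRESHOLD:
        return best_answer
    return None  # caller falls back to OpenAI via LangChain

# Toy 3-dimensional "embeddings" standing in for real model output.
faq = [
    ([1.0, 0.0, 0.0], "Answer about pricing"),
    ([0.0, 1.0, 0.0], "Answer about shipping"),
]
print(answer([0.9, 0.1, 0.0], faq))  # close to the pricing entry
print(answer([0.5, 0.5, 0.7], faq))  # no close match -> None
```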
.
├── app
│   ├── scripts
│   │   ├── initialize_embeddings.py
│   │   └── load_faq_data.py
│   ├── services
│   │   ├── config.py
│   │   ├── embeddings.py
│   │   ├── openai_client.py
│   │   └── similarity.py
│   ├── static
│   │   ├── favicon.ico
│   │   ├── main.js
│   │   └── styles.css
│   ├── templates
│   │   ├── error.html
│   │   └── index.html
│   ├── utils
│   │   └── logger.py
│   ├── db.py
│   ├── main.py
│   ├── models.py
│   └── init.sql
├── data
│   └── faq_data.json
├── tests
├── .env_example
├── Dockerfile
├── docker-compose.yml
├── requirements.txt
└── README.md
Before you begin, ensure you have the following installed on your system:
- Python 3.10+
- Docker
Note: You will also need an OpenAI API key for the OpenAI-powered fallback responses.
To install and run the AskMe app using Docker, follow these steps:
- Clone the repository:
git clone https://github.com/lkmeta/askme.git
cd askme
- Set up environment variables:
cp .env_example .env
Edit the .env file and add the required values: your OpenAI API key, the authentication credentials, and the PostgreSQL environment variables.
Additionally, you can configure the following parameters:
EMBEDDINGS_MODEL: Define the model for embeddings (e.g., text-embedding-3-small).
SIMILARITY_THRESHOLD: Set the similarity threshold for searching (default: 0.7).
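For reference, a filled-in .env might look like the sketch below. Only EMBEDDINGS_MODEL and SIMILARITY_THRESHOLD are named in this README; the other variable names are illustrative guesses, so copy the exact keys from .env_example:

```shell
# Illustrative values only -- use the exact keys from .env_example
OPENAI_API_KEY=sk-your-key-here          # required for the OpenAI fallback
EMBEDDINGS_MODEL=text-embedding-3-small  # embeddings model
SIMILARITY_THRESHOLD=0.7                 # minimum similarity for an FAQ match
POSTGRES_USER=askme                      # guessed name -- check .env_example
POSTGRES_PASSWORD=change-me              # guessed name -- check .env_example
POSTGRES_DB=askme                        # guessed name -- check .env_example
```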
- Build the Docker image:
docker-compose build
- Run the app:
docker-compose up
Note: On the first run, init.sql is executed automatically, initializing the database with the required schema. The load_faq_data.py and initialize_embeddings.py scripts will then load the FAQs and compute the embeddings.
Open your web browser and navigate to http://localhost:8000 to access AskMe.
FastAPI automatically generates interactive API documentation. Once the application is running, access the documentation at:
http://localhost:8000/docs
To monitor the application and its services:
- Ensure you have completed the installation steps above.
- View the logs of the running Docker containers to monitor the application output:
docker-compose logs -f
Note: The -f option follows the log output in real time.
To benchmark the performance of the AskMe application, you can use Locust, a load-testing tool that helps simulate concurrent users sending requests to your API.
The locustfile.py is located inside the test directory.
- Install Locust: If you don't have Locust installed, install it using pip (preferably within a virtual environment):
pip install locust
- Navigate to the test directory:
cd test
- Run Locust: Start Locust by running:
locust -f locustfile.py
- Open the Locust web interface: Open your web browser and go to http://localhost:8089.
- Start the test: Fill in the required fields (such as the number of users) and set the Host to http://localhost:8000 to begin testing the AskMe application.
The predefined set of FAQ questions and answers used to compute the embeddings is located in data/faq_data.json. You can update this file with additional FAQs to improve the system’s accuracy.
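The exact JSON schema is not shown in this README; a plausible shape, assuming a simple list of question/answer pairs, would be:

```json
[
  {
    "question": "What is AskMe?",
    "answer": "AskMe is a web-based FAQ assistant."
  },
  {
    "question": "How are answers matched?",
    "answer": "By embedding similarity search over the stored FAQs."
  }
]
```

Check the file shipped in data/faq_data.json for the actual field names before editing.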
This project is licensed under the Apache-2.0 License.