To run the Semantic Dataset Search Engine locally, follow these steps.
Before setting up the project, ensure you have the following installed:
- Node.js and npm
- Python 3.7 or higher
- PostgreSQL with the pgvector extension (a quick availability check is sketched after this list)
- An Azure OpenAI API key
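If you want to confirm PostgreSQL is reachable and pgvector is available before going further, a minimal check is sketched below. It is not part of the project: it assumes `psycopg2-binary` is installed and a local server with the default `postgres` superuser.

```python
import psycopg2  # pip install psycopg2-binary

# Connect with default local credentials; adjust user/host/port to your setup.
conn = psycopg2.connect(dbname="postgres", user="postgres", host="localhost")
with conn.cursor() as cur:
    cur.execute("SELECT 1 FROM pg_available_extensions WHERE name = 'vector';")
    print("pgvector available" if cur.fetchone() else "pgvector NOT installed")
conn.close()
```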
- Start by cloning the repository to your local machine:

```bash
git clone https://github.com/Gitcatmeoww/HITS-system-implementation.git
cd HITS-system-implementation
```
- Create and activate a Python virtual environment. On macOS and Linux:

```bash
python3 -m venv venv
source venv/bin/activate
```

On Windows:

```bash
py -m venv venv
.\venv\Scripts\activate
```
- Install the required packages:

```bash
pip install -r requirements.txt
```
- Set up the environment variables according to `.env.example`: create a `.env` file and populate it with your configuration details, such as your Azure OpenAI API key and database credentials (an illustrative loading sketch follows below).
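How these values are consumed depends on the project code; the sketch below assumes the common `python-dotenv` pattern, and the variable names are hypothetical placeholders rather than the project's actual schema (check `.env.example` for the real keys).

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads key=value pairs from .env into the process environment

# Hypothetical names for illustration; use the keys listed in .env.example.
azure_api_key = os.getenv("AZURE_OPENAI_API_KEY")
postgres_url = os.getenv("DATABASE_URL")

if not azure_api_key or not postgres_url:
    raise RuntimeError("Missing required environment variables; check your .env file")
```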
- Initialize the database:

```bash
python backend/app/db/construct_db.py
```
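To confirm initialization succeeded, you can list the tables the script created. This is a sketch assuming `psycopg2-binary` and placeholder credentials; substitute the values from your `.env` file.

```python
import psycopg2

# Placeholders: use the database name and credentials from your .env file.
conn = psycopg2.connect(dbname="dbname", user="postgres", host="localhost")
with conn.cursor() as cur:
    cur.execute("SELECT tablename FROM pg_tables WHERE schemaname = 'public';")
    for (table,) in cur.fetchall():
        print(table)
conn.close()
```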
- Start the Flask backend server:

```bash
python backend/app/app.py
```
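Before moving on to the frontend, you can check that the backend is accepting requests. Flask defaults to port 5000; whether this project overrides that is an assumption to verify in `backend/app/app.py`.

```python
import requests  # pip install requests

# Port 5000 is Flask's default; adjust if the app configures a different one.
resp = requests.get("http://localhost:5000/")
print(resp.status_code, resp.reason)  # any HTTP response means the server is up
```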
- Navigate to the frontend application directory:

```bash
cd frontend/my-app
```

- Install the necessary npm packages:

```bash
npm install
```

- Start the frontend React application:

```bash
npm start
```
- The application should now be running at http://localhost:3000 and ready for use.
This section addresses common issues you may encounter while setting up the application, along with steps to resolve them.
- `ModuleNotFoundError: No module named 'backend.app'`:
This error indicates that Python cannot locate the required module, most likely because the necessary environment variables have not been set. Run the following from the repository root:

```bash
chmod +x setenv.sh   # Ensure the setenv.sh script is executable
source ./setenv.sh   # Execute the setenv.sh script to set the necessary environment variables
```
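You can then confirm the package is importable from the same shell. The check below assumes `setenv.sh` adds the repository root to `PYTHONPATH`, which is an inference from the error rather than documented behavior.

```python
import importlib.util

# If this prints None, the repository root is still missing from sys.path;
# re-run `source ./setenv.sh` in the shell you use to start the backend.
spec = importlib.util.find_spec("backend")
print(spec.submodule_search_locations if spec else None)
```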
- Database connection issue:
If you're experiencing difficulties connecting to the PostgreSQL database, the target database may simply not exist yet. Connect to the PostgreSQL server as a superuser, often the default `postgres` user:

```bash
psql -U postgres
```

Then create the database:

```sql
CREATE DATABASE dbname;
```
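If creating the database does not resolve the problem, a direct connection test can narrow down whether the failure is authentication, host, or port. All values below are placeholders; substitute the ones from your `.env` file.

```python
import psycopg2

try:
    conn = psycopg2.connect(
        dbname="dbname",   # placeholder: your database name
        user="postgres",   # placeholder: your database user
        host="localhost",
        port=5432,
    )
    print("Connected, server version:", conn.server_version)
    conn.close()
except psycopg2.OperationalError as exc:
    # The error message usually names the failing piece (auth, host, missing DB).
    print("Connection failed:", exc)
```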
After launching the frontend application, you will be presented with an interface that allows you to interact with the chatbot, explore datasets, and view detailed metadata for each dataset (under development).