The DIBBs Query Connector app is built with the Next.js framework and offers both a UI and a REST API for searching for a patient and viewing information tied to your case investigation.
Query Connector depends on a few external services to work properly:
- A PostgreSQL database, which stores information on conditions, value sets, FHIR server configuration, etc. Database migrations are managed using Flyway
- An OAuth-capable identity provider for authentication. For local development, we use Keycloak
- An external FHIR server. For local development, we use Aidbox
Before running the Query Connector locally, you will need to obtain API keys for the electronic Reporting and Surveillance Distribution (eRSD) and for the Unified Medical Language System (UMLS). These API keys are used to download information about reportable conditions (e.g., chlamydia, influenza, hepatitis A), medical code value sets, and mappings between them. This information is used to build queries in the Query Connector app.
Additionally, you will need a free license key for Aidbox in order to run it in dev mode for local development.
To obtain the free API and license keys, please visit the following URLs and follow the sign-up instructions.

Next, set up your `.env` file with the following command: `cp .env.sample .env`

Add your API keys as environment variables called `ERSD_API_KEY`, `UMLS_API_KEY`, and `AIDBOX_LICENSE` in the `.env` file so that they can be accessed when running the Query Connector app.
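As a sketch, the three keys can be appended to `.env` like this (placeholder values shown; substitute your real keys):

```shell
# Append placeholder entries to .env; replace each value with your actual key.
cat >> .env <<'EOF'
ERSD_API_KEY=replace-with-your-ersd-key
UMLS_API_KEY=replace-with-your-umls-key
AIDBOX_LICENSE=replace-with-your-aidbox-license
EOF
```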
The default `DATABASE_URL` value in `.env.sample` assumes you will use the database container spun up by Docker Compose. It is also possible to run a PostgreSQL server on your local machine, which may lead to modest speed increases, especially when seeding from the eRSD and UMLS APIs. If you'd like to run a local PostgreSQL server, adjust your `DATABASE_URL` as needed, e.g., `DATABASE_URL=postgresql://myuser@localhost:5432/tefca_db`. To run Flyway migrations against a local PostgreSQL server, you'll also need to adjust `flyway.conf` accordingly.
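For example, a local setup might use values like these (`myuser` and the hostname are illustrative; the `flyway.url` and `flyway.user` keys follow Flyway's standard configuration format):

```shell
# Illustrative values only; match them to your own local PostgreSQL setup.
# In .env:
DATABASE_URL=postgresql://myuser@localhost:5432/tefca_db

# In flyway.conf, the matching Flyway settings would look like:
#   flyway.url=jdbc:postgresql://localhost:5432/tefca_db
#   flyway.user=myuser
```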
We provide a Docker Compose file that spins up all of the necessary services for Query Connector, and a simple npm script for invoking both Docker Compose and running the Next.js application.
- Ensure that both Git and Node 22.x or higher are installed.
- Clone the Query Connector repository with `git clone [email protected]:CDCgov/dibbs-query-connector.git`.
- Navigate to the source folder with `cd dibbs-query-connector`.
- Install all of the Node dependencies for the Query Connector app with `npm install`.
- Run the Query Connector app on `localhost:3000` with `npm run dev`. If you are on a Windows machine, you may need to run `npm run dev-win` instead.
The containers should take a few minutes to spin up, but if all goes well, congratulations: the Query Connector app should now be running on `localhost:3000`! You can also access Aidbox at `localhost:8080` and Keycloak at `localhost:8081`.
To log in via Keycloak, make sure your `.env` is updated using the `cp` command above, then use the following credentials to log in at `localhost:8080` after spinning up the container:

Username: `qc-admin`
Password: `QcDev2024!`
- Download a copy of the Docker image from the Query Connector repository by running `docker pull ghcr.io/cdcgov/dibbs-query-connector/query-connector:latest`.
  - If you're using an M1 Mac, you'll need to tell Docker to pull the non-Apple Silicon image with `docker pull --platform linux/amd64 ghcr.io/cdcgov/dibbs-query-connector/query-connector:latest`, since we don't have an image for Apple Silicon. If you're using this setup, there might be architecture incompatibility issues the team hasn't run into, so please flag anything you encounter!
- Run the service with `docker run -p 3000:3000 ghcr.io/cdcgov/dibbs-query-connector/query-connector:latest`. If you're on a Windows machine, you may need to run `docker run -p 3000:3000 ghcr.io/cdcgov/phdi/query-connector:latest` instead.
- You will also need to run the supporting services with our Docker Compose file by running `docker compose -f docker-compose-dev.yaml up`.
To build the Docker image for the Query Connector app from source instead of downloading it from the GitHub repository, follow these steps:

- Clone the Query Connector repository with `git clone [email protected]:CDCgov/dibbs-query-connector.git`.
- Navigate to the source folder with `cd dibbs-query-connector`.
- Run `docker build -t query-connector .`.
When initializing the backend database for the first time, the Query Connector makes the value sets associated with 200+ reportable conditions available to users tasked with building queries for their jurisdiction. To run this seeding script, you'll need to obtain the UMLS and eRSD API keys using the instructions above.
To group value sets by condition and to group the conditions by type, the Query Connector obtains and organizes data from the eRSD and the VSAC in the following way:
- The Query Connector retrieves the 200+ reportable conditions from the eRSD, along with the IDs of the value sets associated with each condition.
- Using the value set IDs from the eRSD, the Query Connector retrieves the value set's comprehensive information from the VSAC, i.e., the LOINC, SNOMED, etc. codes associated with each value set ID.
- The Query Connector then organizes these value sets according to the conditions with which they're associated, making the result available to users interested in building queries. The conditions are additionally organized by category, e.g., sexually transmitted diseases or respiratory conditions, using a mapping curated by HLN Consulting.
In order to make the dev process as low-lift as possible, we want to avoid executing the db-creation scripts when booting up the application in dev mode via `npm run dev` or `npm run dev-win`. To that end, we've created a `pg_dump` file containing all the value sets, concepts, and foreign key mappings that would be extracted from a fresh pull of the eRSD and processed through our creation functions. This file, `vs_dump.sql`, is mounted into the Docker volume of our Postgres DB as an entrypoint script when running in dev mode, which means it is executed automatically when the DB is freshly spun up. You shouldn't need to do anything to facilitate this mounting or file execution.
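For illustration, the entrypoint-script mounting described above typically looks like the following compose fragment (hypothetical sketch; it assumes the standard Postgres image's `/docker-entrypoint-initdb.d` init mechanism, and the project's actual `docker-compose-dev.yaml` may differ):

```yaml
# Hypothetical compose fragment, not the project's actual file
services:
  db:
    image: postgres
    volumes:
      # SQL files in this directory run automatically when the database
      # is initialized on a fresh (empty) data volume
      - ./vs_dump.sql:/docker-entrypoint-initdb.d/vs_dump.sql
```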
If the DB extract file ever needs to be updated, you can use the following simple process:
- Start up the application on your local machine using a regular `docker compose up`, and wait for the DB to be ready.
- Load the eRSD and value sets into the DIBBs DB by using the Create Query button on the `/queryBuilding` page. Optionally, use DBeaver to verify that value sets exist in the database.
- In a fresh terminal window, run `pg_dump -U postgres -f vs_dump.sql -h localhost -p 5432 tefca_db`. If the above doesn't work, try replacing `localhost` with `0.0.0.0`.
- Enter the DB password when prompted.
- The extract file, `vs_dump.sql`, should now be created. It should automatically be located in `/query-connector`, but if it isn't, put `vs_dump.sql` there.
A Postman collection demonstrating use of the API can be found here.
The Query Connector uses Playwright Test as its end-to-end testing framework. Playwright is a browser-based testing library that enables tests to run against a variety of different browsers under a variety of different conditions. To manage this suite, Playwright creates some helpful files (and commands) that can be used to tweak its configuration.
Playwright's configuration is managed by the file `playwright.config.ts`. This file specifies which browsers to test against, configuration options for those browsers, optional mobile browser ports, retry and other utility options, and a dev webserver. Changing this file will make global changes to Playwright's operations.
By default, Playwright will look for end-to-end tests in `/e2e`.
Playwright provides a number of different ways of executing end to end tests. From the root directory, you can run several commands:
- `npm run test:playwright:ui`: Runs the end-to-end tests locally by spawning the Playwright UI mode. This starts a dev server on localhost:3000, so make sure you don't have another app instance running on that port. You'll also need a token for Aidbox set under `AIDBOX_LICENSE` in your `.env` for the Aidbox seeder to run correctly; you can sign up for a dev license at https://aidbox.app and use that in your local setup.
- `npm run test:playwright`: Runs the end-to-end tests.
- `npm run test:playwright -- --project=chromium`: Runs the tests only on Desktop Chrome.
- `npm run test:playwright -- example`: Runs the tests in a specific file (any test file whose name matches `example`).
- `npm run test:playwright -- -g "test name"`: Runs the test with the name "test name" (e.g., a test titled Query("test name") would match).
- `npm run test:playwright -- --debug`: Runs the tests in debug mode.
- `npm run test:playwright -- codegen`: Auto-generates tests with Codegen.
After running a test set locally, you can also type `npx playwright show-report` to view an HTML report page of the different test statuses and results.
Playwright is managed by an end-to-end job in the `.github/workflows/ci.yaml` file of the project root. Since it requires browser installation to test effectively, and since it operates using a framework independent of Jest, it's run separately from the unit and integration tests run through `npm run test:ci`.
For more info, see architecture.md.