The following sections describe how to develop the project locally, how to run tests, how to run and debug the app locally, and how the code is structured.
The `api` folder contains the OpenAPI specification file, which documents the API and is also used to generate the server boilerplate code.
The `cmd` folder contains the main entrypoint for the application in the `main.go` file. There is also a `default-config.yaml` file containing default values for configuring the service. These values are applied whenever the user's config file does not provide a value.
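The fallback behavior of `default-config.yaml` can be sketched as follows. This is a minimal illustration, not the project's actual configuration code: the `Config` fields and the merge logic are assumptions for the example.

```go
package main

import "fmt"

// Config holds externally configurable values. The field names here are
// illustrative, not the service's actual configuration schema.
type Config struct {
	Port        int
	DatabaseURL string
}

// applyDefaults fills every zero-valued field from the defaults,
// mirroring how default-config.yaml backs a user-supplied config file.
func applyDefaults(user, defaults Config) Config {
	if user.Port == 0 {
		user.Port = defaults.Port
	}
	if user.DatabaseURL == "" {
		user.DatabaseURL = defaults.DatabaseURL
	}
	return user
}

func main() {
	defaults := Config{Port: 8080, DatabaseURL: "postgres://localhost:5432/mapper"}
	user := Config{DatabaseURL: "postgres://db:5432/mapper"} // Port left unset
	cfg := applyDefaults(user, defaults)
	fmt.Println(cfg.Port, cfg.DatabaseURL)
}
```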
Documentation files for development and database-specific information are located in the `docs` folder.
The `tools` folder contains code-generation tools such as `oapi-codegen`, which generates the server boilerplate code.
The `test` folder contains the integration tests for the application.
The `internal` folder contains the main application code. It is further subdivided into the following packages:
- `api`: contains the code generated from the OpenAPI specification file, divided into `server_gen.go` and `types_gen.go`. The types file defines the request and response objects as structs, while the server file contains the Gin router handlers and the validation of query and path parameters.
- `config`: contains the `Config` struct, which holds all values used to configure the service externally via environment variables or the config file. The logic for loading the configuration file and environment variables, as well as falling back to default values if none are provided, is handled here.
- `utilities`: holds functions that are widely used across all packages, such as generating UUIDs, loading environment variables, etc.
- `server`: `server.go` creates the server struct, which implements the `StrictServerInterface` from the `api` package. Middlewares such as authentication and rate limiting are implemented in the `middleware` directory. The other files hold the business logic of the endpoints and are divided in the same way as the OpenAPI specification files. They contain only the business logic and return the corresponding API responses (success or different errors, depending on the endpoint). Database operations are not implemented here, in order to isolate the database logic from the API business logic. The endpoint functions only call generic operations defined in the `Datastore` interface from the `database` package, or transform methods that convert API structs to their database counterparts. This also makes it easier to test the API endpoints without a database, and the database and transform logic can be tested separately as well.
- `database`: contains all database-related operations. `datastore.go` defines the `Datastore` interface as well as `DatabaseErrors`. The interface encapsulates all database logic so that the defined functions can be called from the endpoints. The `gormQuery` folder contains an implementation of the `Datastore` interface using GORM as the ORM. The GORM models, which specify the database tables and the relations between them, are defined in the `models` folder. `gormInit` is used to create a connection to the database and auto-migrate these models. The `transform` directory defines the transformations between the API models and the database models in one central place.
The `tools` directory contains tools and code-generation scripts that are used to generate code for the project. To use these tools, just run this command:
cd tools
go generate -tags tools
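The `go generate -tags tools` entrypoint typically lives in a build-tag-guarded Go file that pins the generator in `go.mod`. The sketch below shows this common pattern; the generate directive and config file name are assumptions and may differ from the project's actual `tools` package:

```go
//go:build tools

// Package tools pins code-generation dependencies in go.mod so that
// `go generate -tags tools` can run them at a fixed version.
package tools

import (
	// Blank import keeps oapi-codegen in go.mod.
	_ "github.com/oapi-codegen/oapi-codegen/v2/cmd/oapi-codegen"
)

//go:generate go run github.com/oapi-codegen/oapi-codegen/v2/cmd/oapi-codegen --config server.cfg.yaml ../api/openapi.yaml
```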
Python scripts for importing code systems or converting SQL database dumps into importable CSV files for the mapper can be found in the `codesystem-import` directory.
Boilerplate code for the Go server is generated by `oapi-codegen` version v2.1.0. The code generation is based on the OpenAPI specification file `api/openapi.yaml`. The generated code is placed in the `internal/api` directory. A `types_gen.go` file is generated, defining the request and response objects of the OpenAPI files as Go structs. The `server_gen.go` file contains the Gin router handlers and the validation of query and path parameters.
The following section describes how to develop code for the project.
Go 1.22 is used to develop the project, so it has to be installed. Make has to be available to use the Makefiles for building the application. Docker is necessary to run the application in a containerized environment and to start the PostgreSQL database. To make the development process easier in the future, a Dev Container will be provided that contains all necessary tools and dependencies, so the project can be developed in a consistent environment.
The application can be run locally using the following command:
go run cmd/miracummapper/main.go
When the application starts, it waits until the database and Keycloak are reachable. The easiest way to create all services in a dockerized environment is to use the `docker-compose.yaml` file. Further configuration of Keycloak is documented in the Quick Start section of the Readme in this repository.
docker-compose up -d
To stop all services, use the following command:
docker-compose down
There is also a `Makefile` defined to create the binary for a Linux environment using the following command (this Makefile is also used in the build stage of the Docker container to build the binary):
make
The Docker image can also be built locally using the following command:
docker build -t miracummapper .
When you want to debug the `miracum-mapper`, first start all other services normally:
docker compose up -d miracum-postgres keycloak keycloak-postgres
Afterwards, the binary can be debugged using the `.vscode/launch.json` file. Just click on Run and Debug and select Launch Miracum Mapper.
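A minimal `launch.json` for this setup might look as follows. This is a sketch: the `program` path assumes the entrypoint mentioned above, and the actual file in the repository may differ:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Launch Miracum Mapper",
      "type": "go",
      "request": "launch",
      "mode": "debug",
      "program": "${workspaceFolder}/cmd/miracummapper/main.go"
    }
  ]
}
```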
For testing the API endpoints many tools are suitable, but a very simple way is to use the Swagger Viewer VSCode extension, which is preinstalled when using the Dev Container setup for development.
For visualizing the database structure the tool DrawSQL was used.
A Dev Container is provided to run the project in a containerized environment and make development easy as it has all prerequisites installed. To use it, you need to have Docker installed on your machine. The Dev Container Extension for VSCode is also recommended. Please refer to the Dev Container Documentation for more information.
The project uses GitHub Actions as a CI/CD pipeline. The workflow is defined in the `.github/workflows` directory. Actions from the Miracum Project `.github` repository are integrated into the workflow. The pipeline is divided into the following jobs:
- Static code analysis is performed in the lint job using the Miracum Project `standard-lint.yaml` workflow. Additionally, a `check-oapi-codegen` job is defined, which checks whether the code generated from the OpenAPI specification file is up to date with the spec file.
- `unit-test` jobs are performed afterwards to test the individual functions of the application.
- TODO: `integration-test` will be added to test the application as a whole.
- In the `build` job the Docker image is built, scanned, and pushed to the GitHub Container Registry.