Open city profile is used to store common information (name, contact information, ...) about the citizens of the city of Helsinki.
When a citizen is using a service which is connected to the profile, the service can query for the citizen's information from the profile so that the citizen doesn't have to enter all of their data every time it is needed. The services may also provide a better user experience using the profile's data, for example by returning more relevant search results based on the citizen's interests.
The same data may also be queried by the employees of the city of Helsinki while performing their daily duties, for example using the administrative functions of services.
Open city profile is implemented using Django and it provides a GraphQL API.
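As a sketch of what a GraphQL request to such an API looks like (the field names `myProfile`, `firstName`, etc. are illustrative assumptions here, not the service's actual schema):

```python
import json

# Illustrative GraphQL query; the real schema's field names may differ.
QUERY = """
query {
  myProfile {
    firstName
    lastName
  }
}
"""


def build_graphql_payload(query: str) -> str:
    """A GraphQL HTTP request is a POST whose JSON body carries the query."""
    return json.dumps({"query": query})


payload = build_graphql_payload(QUERY)
print(payload)
```

A client would POST this payload to the `/graphql/` endpoint with an authorization token obtained from the authentication service.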
See docs/config.adoc.
## Development with Docker
Prerequisites:

- Docker engine: 18.06.0+
- Docker Compose: 1.22.0+
- Create a `docker-compose.env.yaml` file in the project folder:
  - Use `docker-compose.env.yaml.example` as a base; it does not need any changes to get the project running.
  - Change `DEBUG` and the rest of the Django settings if needed.
    - `TOKEN_AUTH_*`, settings for the Tunnistamo authentication service
  - Set entrypoint/startup variables according to taste:
    - `CREATE_SUPERUSER`, creates a superuser with credentials `admin`:`admin` ([email protected])
    - `APPLY_MIGRATIONS`, applies migrations on startup
    - `ENABLE_GRAPHIQL`, enables the GraphiQL interface for `/graphql/`
    - `ENABLE_GRAPHQL_INTROSPECTION`, enables GraphQL introspection queries
    - `SEED_DEVELOPMENT_DATA`, flushes the database and recreates the environment with fake development data (requires `APPLY_MIGRATIONS`)
    - `OIDC_CLIENT_ID`, Tunnistamo client id for enabling GDPR API authorization code flows
    - `OIDC_CLIENT_SECRET`, Tunnistamo client secret for enabling GDPR API authorization code flows
    - `GDPR_AUTH_CALLBACK_URL`, the GDPR auth callback URL; it should be the same one the UI uses for fetching the OAuth/OIDC authorization token for using the GDPR API
    - `TUNNISTAMO_API_TOKENS_URL`, the Tunnistamo URL from which the backend fetches API tokens for GDPR API use
- Run `docker-compose up`
  - The project is now running at localhost:8080
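For reference, a minimal `docker-compose.env.yaml` built from the variables above might look roughly like this (values are illustrative; the example file in the repository is the authoritative base):

```yaml
DEBUG: 1
CREATE_SUPERUSER: 1
APPLY_MIGRATIONS: 1
ENABLE_GRAPHIQL: 1
ENABLE_GRAPHQL_INTROSPECTION: 1
```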
### Optional steps
- Run migrations:
  - Taken care of by the example env
  - `docker exec profile-backend python manage.py migrate`
- Seed development data:
  - Taken care of by the example env
  - See also Seed development data below
  - `docker exec profile-backend python manage.py seed_development_data`
- Create superuser:
  - Taken care of by the example env
  - `docker exec -it profile-backend python manage.py createsuperuser`
- Set permissions for service staff members if needed:
  - Create group(s) (via Django admin) and add user(s) to the group
  - Create service permissions for the group manually via Django admin or, for example:
    - `docker exec profile-backend python manage.py add_object_permission ServiceName GroupName can_view_profiles`, where:
      - `ServiceName` is the name of the Service the permission is given for
      - `GroupName` is the name of the group to whom the permission is given
      - `can_view_profiles` is the name of the permission
  - Permissions can be removed as follows:
    - `docker exec profile-backend python manage.py remove_object_permission ServiceName GroupName can_view_profiles`
### Seed development data

- Note! This command will flush the database.
- Add all data with defaults: `docker exec profile-backend python manage.py seed_development_data`
- See `python manage.py help seed_development_data` for optional arguments
- The command will generate:
  - All available services
  - One group per service (with `can_manage_profiles` permissions)
  - One user per group (with username `{group.name}_user`)
  - Profiles
    - With user
    - With email, phone number and address
    - Connects to one random service
## Development without Docker

Prerequisites:

- PostgreSQL 13
- PostGIS 3.2
- Python 3.11

- Run `pip install -r requirements.txt`
- Run `pip install -r requirements-dev.txt` (development requirements)
To set up a database compatible with the default database settings:

Create user and database:

```shell
sudo -u postgres createuser -P -R -S open_city_profile  # use password `open_city_profile`
sudo -u postgres createdb -O open_city_profile open_city_profile
```

Allow the user to create the test database:

```shell
sudo -u postgres psql -c "ALTER USER open_city_profile CREATEDB;"
```
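The commands above match Django database settings roughly like the following sketch. The PostGIS engine is assumed from the PostGIS prerequisite, and host/port are assumptions; consult the project's settings module for the actual values:

```python
# Sketch of Django DATABASES settings compatible with the user and
# database created above. Engine, host and port are assumptions.
DATABASES = {
    "default": {
        "ENGINE": "django.contrib.gis.db.backends.postgis",
        "NAME": "open_city_profile",
        "USER": "open_city_profile",
        "PASSWORD": "open_city_profile",
        "HOST": "localhost",
        "PORT": "5432",
    }
}
```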
- Create a `.env` file: `touch .env`
- Set the `DEBUG` environment variable to `1`.
- Run `python manage.py migrate`
- Run `python manage.py createsuperuser`
- Run `python manage.py runserver 0:8000`

The project is now running at localhost:8000
This repository contains `requirements*.in` and corresponding `requirements*.txt` files for requirements handling. The `requirements*.txt` files are generated from the `requirements*.in` files with `pip-compile`.
- Add new packages to `requirements.in` or `requirements-dev.in`
- Update the `.txt` file for the changed requirements file:
  - `pip-compile requirements.in`
  - `pip-compile requirements-dev.in`
- Note: the `requirements*.txt` files added to version control are meant to be used in the containerized environment where the service is run. Because Python package dependencies are environment dependent, they need to be generated within a similar environment. This can be done by running the `pip-compile` command within Docker, for example: `docker-compose exec django pip-compile requirements.in` (the container needs to be running beforehand).
- If you want to update dependencies to their newest versions, run: `pip-compile --upgrade requirements.in`
- To install the Python requirements, run: `pip-sync requirements.txt`

Note: when updating dependencies, read the dependency update checklist to see if there's anything you need to pay attention to.
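To illustrate the relationship between the two file types (the package versions below are made up), a loose entry in `requirements.in` compiles into pinned entries, including transitive dependencies, in `requirements.txt`:

```
# requirements.in
django

# requirements.txt (generated by pip-compile; versions illustrative)
asgiref==3.7.2
    # via django
django==4.2.11
    # via -r requirements.in
```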
This project uses Ruff for code formatting and quality checking.

Basic `ruff` commands:

- lint: `ruff check`
- apply safe lint fixes: `ruff check --fix`
- check formatting: `ruff format --check`
- format: `ruff format`
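Ruff reads its configuration from `pyproject.toml` (or `ruff.toml`); a minimal sketch, not necessarily matching this project's actual configuration:

```toml
[tool.ruff]
line-length = 88

[tool.ruff.lint]
select = ["E", "F", "I"]  # pycodestyle errors, pyflakes, import sorting
```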
`pre-commit` can be used to install and run all the formatting tools as git hooks automatically before a commit.

New commit messages must adhere to the Conventional Commits specification, and line length is limited to 72 characters. When `pre-commit` is in use, `commitlint` checks new commit messages for the correct format.
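For example, a commit message following the Conventional Commits format (the type, scope and subject here are illustrative):

```
feat(profile): add phone number validation

Body lines, when present, are also kept within the 72 character
limit enforced by the commit message check.
```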
The tests require a Postgres database to connect to. Here's one way to run the tests:

- Bring the service up with `docker-compose up`. This also brings up the required Postgres server.
- Run the tests within the Django container: `docker-compose exec django pytest`.
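pytest collects functions whose names start with `test_` and runs their plain `assert` statements; a minimal self-contained sketch (the helper below is hypothetical, not part of the project):

```python
def normalize_phone_number(number: str) -> str:
    # Hypothetical helper used only to illustrate a pytest-style test.
    return number.replace(" ", "")


def test_normalize_phone_number():
    assert normalize_phone_number("+358 40 123 4567") == "+358401234567"
```

Real tests in the suite additionally need database access, which is why the Postgres server must be running.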
- Dev: https://profile-api.dev.hel.ninja/graphql/
- Test: https://profile-api.test.hel.ninja/graphql/
- Staging: https://api.hel.fi/profiili-stage/graphql/
- Production: https://api.hel.fi/profiili/graphql/
For a complete service, the following additional components are also required:

- Tunnistamo is used as the authentication service
- open-city-profile-ui provides the UI