Merge pull request #41 from OpenLXP/readme-docs
README updated
KarenAJ authored Dec 4, 2024
2 parents 6db7f86 + c54bef7 commit 7532337
Showing 2 changed files with 127 additions and 109 deletions.
232 changes: 124 additions & 108 deletions README.md
# OPENLXP-XSS

## Experience Schema Service

The Experience Schema Service (XSS) maintains referential representations of domain entities, as well as transformational mappings that describe how to convert an entity from one particular schema representation to another.

This component is responsible for managing pertinent object/record metadata schemas and the mappings for transforming records from a source metadata schema to a target metadata schema. It will also be used to store and link vocabularies from the stored schemas.

## Prerequisites
### Install Docker & docker-compose
#### Windows & MacOS
- Download and install [Docker Desktop](https://www.docker.com/products/docker-desktop) (docker compose included)

#### Linux
You can download Docker Compose binaries from the
[release page](https://github.com/docker/compose/releases) of the docker/compose repository.

Rename the relevant binary for your OS to `docker-compose` and copy it to `$HOME/.docker/cli-plugins`

Or copy it into one of these folders to install it system-wide:

* `/usr/local/lib/docker/cli-plugins` OR `/usr/local/libexec/docker/cli-plugins`
* `/usr/lib/docker/cli-plugins` OR `/usr/libexec/docker/cli-plugins`

(You might need to make the downloaded file executable with `chmod +x`.)
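
A download-and-install sequence for Linux might look like the sketch below; the release version and CPU architecture are assumptions, so substitute the ones you need:

```
COMPOSE_VERSION=v2.24.6   # assumption: pick the release you actually want
mkdir -p "$HOME/.docker/cli-plugins"
curl -SL "https://github.com/docker/compose/releases/download/${COMPOSE_VERSION}/docker-compose-linux-x86_64" \
  -o "$HOME/.docker/cli-plugins/docker-compose"
chmod +x "$HOME/.docker/cli-plugins/docker-compose"
docker compose version    # verify the plugin is picked up
```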

### Python
`Python >=3.9`: Download and install it from [Python](https://www.python.org/downloads/).

## 1. Clone the project
Clone the GitHub repository:
```
git clone https://github.com/OpenLXP/openlxp-xss.git
```

## 2. Set up your environment variables
- Create a `.env` file in the root directory
- The following environment variables are required:

| Environment Variable | Description |
| ------------------------- | ----------- |
| AWS_ACCESS_KEY_ID | The Access Key ID for AWS |
| AWS_SECRET_ACCESS_KEY | The Secret Access Key for AWS |
| AWS_DEFAULT_REGION | The region for AWS |
| DB_HOST | The host name, IP, or docker container name of the database |
| DB_NAME | The name to give the database |
| DB_PASSWORD | The password for the user to access the database |
| DB_ROOT_PASSWORD | The password for the root user to access the database, should be the same as `DB_PASSWORD` if using the root user |
| DB_USER | The name of the user to use when connecting to the database. When testing use root to allow the creation of a test database |
| DJANGO_SUPERUSER_EMAIL | The email of the superuser that will be created in the application |
| DJANGO_SUPERUSER_PASSWORD | The password of the superuser that will be created in the application |
| DJANGO_SUPERUSER_USERNAME | The username of the superuser that will be created in the application |
| LOG_PATH | The path to the log file to use |
| SECRET_KEY_VAL | The Secret Key for Django |
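
A minimal `.env` sketch is shown below; every value is a placeholder (in particular, the `DB_HOST` value assumes the database container's name from docker-compose), not something to use as-is:

```
DB_NAME=xss_db
DB_USER=root
DB_PASSWORD=changeme
DB_ROOT_PASSWORD=changeme
DB_HOST=db
DJANGO_SUPERUSER_USERNAME=admin
DJANGO_SUPERUSER_EMAIL=admin@example.com
DJANGO_SUPERUSER_PASSWORD=changeme
SECRET_KEY_VAL=replace-with-a-long-random-string
LOG_PATH=/var/log/xss.log
AWS_ACCESS_KEY_ID=<your-access-key-id>
AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
AWS_DEFAULT_REGION=us-east-1
```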

## 3. Deployment
1. Create the OpenLXP docker network. Open a terminal and run the following command in the root directory of the project
```
docker network create openlxp
```
2. Run the command below to deploy XSS along with its resources:
```
docker-compose up -d --build
```
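
Once the containers are up, a quick sanity check (assuming the default port mapping of 8000 used in the configuration step below) might look like:

```
docker-compose ps                      # the XSS and database containers should show as "Up"
curl -I http://localhost:8000/admin/   # should return an HTTP response (typically a redirect to the login page)
```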
## 4. Configuration for XSS
1. Navigate to `http://localhost:8000/admin/` in your browser and log in to the Django Admin page with the admin credentials set in your `.env` (`DJANGO_SUPERUSER_EMAIL` & `DJANGO_SUPERUSER_PASSWORD`)
2. <u>CORE</u>
    - Schema Ledgers
        1. Click on `Schema Ledgers` > `Add schema ledgers`
            - Enter the configurations below:
                - `Schema Name`: Schema file title
                - `Schema File`: Upload the schema file in the required format (JSON)
                - `Status`: Select whether the schema is Published or Retired
                - `Major Version`: Add the major value of the schema version
                - `Minor Version`: Add the minor value of the schema version
                - `Patch Version`: Add the patch value of the schema version

            **Note: Uploading a schema file in the required format to the schema ledger triggers the creation of the corresponding term set, its linked child term sets, and their terms.**

    - Transformation Ledgers
        1. Click on `Transformation Ledgers` > `Add transformation ledger`
            - Enter the configurations below:
                - `Source Schema`: Select the source term set from the drop-down
                - `Target Schema`: Select the target term set to map to from the drop-down
                - `Schema Mapping File`: Upload the schema mapping file to be referenced for mapping, in the required format (JSON)
                - `Status`: Select whether the schema mapping is Published or Retired

            **Note: Uploading a schema mapping file in the required format to the transformation ledger triggers the process of adding the mappings for the corresponding term values.**

    - Term Sets: Term sets support the concept of a vocabulary in the context of semantic linking
        1. Click on `Term set` > `Add term set`
            - Enter the configurations below:
                - `IRI`: The term set's corresponding IRI
                - `Name`: Term set title
                - `Version`: Add the version number
                - `Status`: Select whether the term set is Published or Retired

    - Child Term Sets: A child term set is a term set that contains references to other term sets (schemas)
        1. Click on `Child term sets` > `Add child term set`
            - Enter the configurations below:
                - `IRI`: The term set's corresponding IRI
                - `Name`: Term set title
                - `Status`: Select whether the term set is Published or Retired
                - `Parent term set`: Select the reference to the parent term set from the drop-down

    - Terms: A term entity can be seen as a word in our dictionary. This entity captures a unique word/term in a term set or schema.
        1. Click on `Terms` > `Add term`
            - Enter the configurations below:
                - `IRI`: The term's corresponding IRI
                - `Name`: Term title
                - `Description`: The term entity's description
                - `Status`: Select whether the term is Published or Retired
                - `Data Type`: The term entity's corresponding data type
                - `Use`: The term entity's corresponding use case
                - `Source`: The term entity's corresponding source
                - `Term set`: Select the reference to the parent term set from the drop-down
                - `Mapping`: Add mappings between term entities of different parent term sets
                - `Updated by`: User that creates/updates the term

## 5. Removing Deployment
To destroy the created resources, run the command below in your terminal:
```
docker-compose down
```
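
If you also want to remove the database volume (and with it all stored data), `docker-compose down` accepts the `-v` flag; use it only when the data can safely be discarded (this assumes the compose file declares named volumes):

```
docker-compose down -v
```
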
## APIs
**XSS contains API endpoints which can be called from other components**

Query string parameters: `name`, `version`, `iri`

**Note: This API fetches the required schema from the repository using the Name and Version, or the IRI, parameters**

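As an illustration, a schema lookup by name and version might look like the sketch below. The `/api/schemas/` path is an assumption (the exact endpoint is collapsed out of this diff view); the host and port follow the mappings example further down, and the parameter values are placeholders:

```
curl "http://localhost:8080/api/schemas/?name=<schema-name>&version=<major.minor.patch>"
```
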
Query string parameters: `sourceName`, `sourceVersion`, `sourceIRI`, `targetName`, `targetVersion`, `targetIRI`

http://localhost:8080/api/mappings/

*Note: This API fetches the required mapping schema from the repository using the Source Name, Source Version, Target Name, and Target Version, or the Source IRI and Target IRI, parameters*
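
For example, a mapping lookup by source and target name/version could look like this sketch (all values are placeholders):

```
curl "http://localhost:8080/api/mappings/?sourceName=<source-schema>&sourceVersion=<x.y.z>&targetName=<target-schema>&targetVersion=<x.y.z>"
```
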
## Update

To update an existing installation, pull the latest changes using git, then restart the application using `docker-compose restart`.

Occasionally an update may require the application to be rebuilt using `docker-compose up --build`, but this can also rebuild the database, resulting in data loss.
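
Put together, a typical update might look like the sketch below (rebuild only when a release requires it, keeping the data-loss caveat above in mind):

```
git pull
docker-compose restart
# or, when a rebuild is required:
docker-compose up --build
```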

## Testing
To run the automated tests on the application, run the command below.

Test coverage information will be stored in an htmlcov directory.

```
docker-compose --env-file .env run app sh -c "coverage run manage.py test && coverage html && flake8"
```
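
To browse the coverage report afterwards, open the generated `htmlcov/index.html` in a browser, for example:

```
xdg-open htmlcov/index.html   # Linux
open htmlcov/index.html       # macOS
```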

## Logs
Check the logs of the application in the docker container.
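
For example, to follow the logs of the `app` service (the service name used in the Testing command above):

```
docker-compose logs -f app
```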


## License

This project uses the [MIT](http://www.apache.org/licenses/LICENSE-2.0) license.

4 changes: 3 additions & 1 deletion start-app.sh
@@ -3,6 +3,8 @@

python manage.py waitdb
python manage.py migrate
python manage.py createcachetable
python manage.py collectstatic --no-input
python manage.py loaddata admin_theme_data.json
cd /opt/app/
if [ -n "$TMP_SCHEMA_DIR" ] ; then
@@ -12,4 +14,4 @@ else
fi
pwd
service clamav-daemon restart
./start-server.sh
