KG2.10.1c rollout #2380

Open
sundareswarpullela opened this issue Sep 18, 2024 · 6 comments

@sundareswarpullela (Collaborator) commented Sep 18, 2024

NOTE: To create a new issue based on this template, simply go to: https://github.com/RTXteam/RTX/issues/new?template=kg2rollout.md

THE BRANCH FOR THIS ROLLOUT IS: kg2.10.1c
THE ARAX-DATABASES.RTX.AI DIRECTORY FOR THIS ROLLOUT IS: /home/rtxconfig/KG2.10.1
Sprint changelog link: (Changelog)

Prerequisites

ssh access

To complete this workflow, you will need ssh access to:

  • arax-databases.rtx.ai
  • the self-hosted ARAX/KG2 instance, arax.ncats.io (see example configuration information below)
  • the self-hosted PloverDB instances, kg2cploverN.rtx.ai
  • the self-hosted Neo4j instances for KG2c, kg2canonicalizedN.rtx.ai
  • the self-hosted CI/CD instance, cicd.rtx.ai
  • the webserver for downloading the KG2c "lite" JSON file, kg2webhost.rtx.ai
GitHub access
  • write access to the RTXteam/PloverDB project area
  • write access to the RTXteam/RTX project area
  • write access to the ncats/translator-lfs-artifacts project area (not critical, but needed for some final archiving steps; Amy Glen and Sundar Pullela have access)
AWS access

You will need:

  • access to the AWS Console (you'll need an IAM username; ask Stephen Ramsey about getting one)
  • IAM permission to start and stop instances in EC2 via the AWS Console
  • access to the S3 bucket s3://rtx-kg2/ (ask Stephen Ramsey for access)
Slack workspaces

You will also need access to the following Slack workspaces:

  • ARAXTeam (subscribe to #deployment)
  • NCATSTranslator (subscribe to #devops-teamexpanderagent)

Example ssh config for logging in to arax.ncats.io:

```
Host arax.ncats.io
    User stephenr
    ProxyCommand ssh -i ~/.ssh/id_rsa_long -W %h:%p [email protected]
    IdentityFile ~/.ssh/id_rsa_long
    Hostname 172.31.53.16
```
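With a config like the one above in place (the username stephenr is just an example), logging in is a single command:

```bash
ssh arax.ncats.io
```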

1. Build and load KG2c:

  • merge master into the branch being used for this KG2 version (which would typically be named like KG2.X.Yc). Record this issue number in the merge message.
  • update the four hardcoded biolink version numbers in the branch (as needed):
    • in code/UI/OpenAPI/python-flask-server/openapi_server/openapi/openapi.yaml (github; local)
    • in code/UI/OpenAPI/python-flask-server/KG2/openapi_server/openapi/openapi.yaml (github; local)
    • in code/UI/OpenAPI/specifications/export/ARAX/1.5.0/openapi.yaml (github)
    • in code/UI/OpenAPI/specifications/export/KG2/1.5.0/openapi.yaml (github)
  • build a new KG2c on buildkg2c.rtx.ai from the branch (how-to is here)
    • before starting the build:
      • make sure there is enough disk space available on arax-databases.rtx.ai (need at least 100G, ideally >120G). delete old KG2 database directories as needed (warn the team on Slack in advance).
      • make sure to choose to build a new synonymizer in kg2c_config.json, as described in the how-to
    • after the build is done, verify it looks ok:
      • node_synonymizer.sqlite should be around 8-15 GB
      • make sure node_synonymizer.sqlite's last modified date is today (or whatever day the build was run)
      • make sure kg2c_lite.json.gz's last modified date is today (or whatever day the build was run)
      • the entire build runtime (synonymizer + KG2c) shouldn't have been more than 24 hours
      • the synonymizer and KG2c artifacts should have been auto-uploaded into the proper directory on arax-databases.rtx.ai (/home/rtxconfig/KG2.X.Y)
  • load the new KG2c into Neo4j at http://kg2-X-Yc.rtx.ai:7474/browser/ (how-to is here)
    • verify the correct KG2 version was uploaded by running this query: match (n {id:"RTX:KG2c"}) return n
  • update RTX/code/config_dbs.json in the branch:
    • update the synonymizer version number/path
    • update the fda_approved_drugs version number/path
    • update the autocomplete version number/path
    • update the meta_kg version number/path
    • update the kg2c sqlite version number/path
    • update the KG2pre and KG2c Neo4j endpoints
  • copy the kg2c_lite_2.X.Y.json.gz file (which you can get from the S3 bucket s3://rtx-kg2/kg2c_lite.json.gz, but CHECK THE DATE AND MD5 HASH TO BE SURE YOU ARE NOT GETTING AN OLD FILE) to the directory /home/ubuntu/nginx-document-root/ on kg2webhost.rtx.ai (see the copy/verify sketch after this list)
  • load the new KG2c into Plover (how-to is here)
  • start the new self-hosted PloverDB on kg2cploverN.rtx.ai:
    • ssh [email protected]
    • cd PloverDB && git pull origin kg2.X.Yc
    • if you have not yet built the 2.X.Y docker image/container on this instance, run:
      • ./run.sh ploverimage2.X.Y plovercontainer2.X.Y "sudo docker" (takes about an hour)
    • otherwise, simply run:
      • sudo docker start plovercontainer2.X.Y (takes about five minutes)
  • verify that Plover's regression tests pass, and fix any broken tests (note: tests must use canonical curies!); from any instance/computer, run:
    • cd PloverDB
    • pytest -v test/test.py --endpoint http://kg2cploverN.rtx.ai:9990
  • update config_dbs.json in the branch for this KG2 version in the RTX repo to point to the new Plover for the 'dev' maturity level
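A minimal sketch of the kg2c_lite copy step above, assuming the AWS CLI is configured and using the hypothetical version 2.X.Y (the ubuntu login on kg2webhost.rtx.ai is an assumption):

```bash
# Inspect the S3 object's LastModified date first, to be sure
# you are not getting an old file
aws s3api head-object --bucket rtx-kg2 --key kg2c_lite.json.gz

# Download the file and check its MD5 hash against the expected checksum
aws s3 cp s3://rtx-kg2/kg2c_lite.json.gz kg2c_lite_2.X.Y.json.gz
md5sum kg2c_lite_2.X.Y.json.gz

# Copy it to the nginx document root on kg2webhost.rtx.ai
scp kg2c_lite_2.X.Y.json.gz [email protected]:/home/ubuntu/nginx-document-root/
```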

2. Rebuild downstream databases:

The following databases should be rebuilt and copies of them should be put in /home/rtxconfig/KG2.X.Y on arax-databases.rtx.ai. Please use this kind of naming format: mydatabase_v1.0_KG2.X.Y.sqlite.
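For example, a rebuilt NGD database for a hypothetical KG2.X.Y could be copied into place like this (the filename is illustrative):

```bash
scp ngd_v1.0_KG2.X.Y.sqlite [email protected]:/home/rtxconfig/KG2.X.Y/
```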

  • NGD database (how-to is here)
  • Build CURIE NGD database @mohsenht
  • refreshed XDTD database @chunyuma
  • XDTD database @chunyuma (may be skipped - depends on the changes in this KG2 version)
  • refreshed XCRG database @chunyuma
  • XCRG database @chunyuma (may be skipped - depends on the changes in this KG2 version)

NOTE: As databases are rebuilt, RTX/code/config_dbs.json will need to be updated to point to their new paths! Push these changes to the branch for this KG2 version, unless the rollout of this KG2 version has already occurred, in which case you should push to master (but first follow the steps described here).

3. Update the ARAX codebase:

All code changes should go in the branch for this KG2 version!

  • regenerate the KG2c test triples file in the branch for this KG2 version
    • ensure the new KG2c Neo4j is currently running
    • check out the branch and pull to get the latest changes (this is important for ensuring the correct KG2c Neo4j is used)
    • run create_json_of_kp_predicate_triples.py
    • push the regenerated file to RTX/code/ARAX/KnowledgeSources/RTX_KG2c_test_triples.json
  • update Expand code as needed
  • update any other modules as needed
  • test everything together (a shell sketch follows this list):
    • check out the branch and pull to get the latest changes
    • locally set force_local = True in ARAX_expander.py (to avoid using the old KG2 API)
    • then run the entire ARAX pytest suite (i.e., pytest -v)
    • address any failing tests
  • update the KG2 and ARAX version numbers in the appropriate places (in the branch for this KG2 version)
    • Bump version on line 12 in RTX/code/UI/OpenAPI/python-flask-server/openapi_server/openapi/openapi.yaml (github; local); the major and minor release numbers are kept synchronous with the TRAPI version; just bump the patch release version (least significant digit)
    • Bump version on line 12 in RTX/code/UI/OpenAPI/python-flask-server/KG2/openapi_server/openapi/openapi.yaml (github; local); the first three digits are kept synchronous with the KG2 release version
    • Bump version on line 4 in RTX/code/UI/OpenAPI/python-flask-server/RTX_OA3_TRAPI1.4_ARAX.yaml (github; local); same as for the ARAX openapi.yaml file
    • Bump version on line 4 in RTX/code/UI/OpenAPI/python-flask-server/RTX_OA3_TRAPI1.4_KG2.yaml (github; local); same as for the KG2 openapi.yaml file
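A sketch of the "test everything together" step above, assuming a local clone of the RTX repo and the hypothetical branch name kg2.X.Yc:

```bash
# check out the branch and pull to get the latest changes
cd RTX
git checkout kg2.X.Yc
git pull origin kg2.X.Yc

# locally set force_local = True in ARAX_expander.py (don't commit this),
# then run the entire ARAX pytest suite
cd code/ARAX/test
pytest -v
```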

4. Pre-upload databases:

Before rolling out, we need to pre-upload the new databases (referenced in config_dbs.json) to arax.ncats.io and the ITRB SFTP server. These steps can be done well in advance of the rollout; it doesn't hurt anything to do them early.

  • make sure arax.ncats.io has at least 100G of disk space free; delete old KG2 databases to free up space as needed (before doing this, warn the team on the #deployment Slack channel on the ARAXTeam workspace)
  • copy the new databases from arax-databases.rtx.ai to arax.ncats.io:/translator/data/orangeboard/databases/KG2.X.Y (an example command is sketched after this list)
  • upload the new databases and their md5 checksums to ITRB's SFTP server using the steps detailed here
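The copy step above might look like the following sketch (hypothetical version KG2.X.Y; assumes the source host can ssh into arax.ncats.io):

```bash
# run from arax-databases.rtx.ai
scp /home/rtxconfig/KG2.X.Y/* \
    arax.ncats.io:/translator/data/orangeboard/databases/KG2.X.Y/
```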

5. Rollout new KG2c version to arax.ncats.io development endpoints

  • Notify the #deployment channel in the ARAXTeam Slack workspace that you are rolling out a new version of KG2c to the various arax.ncats.io development endpoints. Provide the KG2c version number in this notification.
  • for the RTXteam/RTX project, merge the master branch into the branch for this KG2 version. Record the RTX issue number (for the KG2c rollout checklist issue) in the merge message.
  • for the RTXteam/RTX project, merge this KG2 version's branch back into the master branch. Record this issue number in the merge message.
  • to roll master out to a specific ARAX or KG2 endpoint named /EEE, you would do the following steps (collected into a single shell transcript after this list):
    • If you are offsite, log into your office VPN (there are strict IP address block restrictions on client IPs that can ssh into arax.ncats.io)
    • Log in to arax.ncats.io: ssh arax.ncats.io (you need to have previously set up your username, etc., in ~/.ssh/config; see the top of this issue template for an example)
    • Enter the rtx1 container: sudo docker exec -it rtx1 bash
    • Become user rt: su - rt
    • Go to the directory of the code repo for the EEE endpoint: cd /mnt/data/orangeboard/EEE/RTX
    • Make sure it is on the master branch: git branch (should show * master)
    • Stash any updated files (this is IMPORTANT): git stash
    • Update the code: git pull origin master
    • Restore updated files: git stash pop
    • If there have been changes to requirements.txt, make sure to do pip3 install -r code/requirements.txt
    • Become superuser: exit (exiting out of your shell session as user rt should return you to a root user session)
    • Restart the service: service RTX_OpenAPI_EEE restart
    • View the STDERR logfile as the service starts up: tail -f /tmp/RTX_OpenAPI_EEE.elog
    • Test the endpoint via the web browser interface to make sure it is working
    • Query the KG2c version by entering this TRAPI query JSON into the browser UI: {"nodes": {"n00": {"ids": ["RTX:KG2c"]}}, "edges": {}} (it should return 1 result and the name of that node gives the KG2c version that is installed in the PloverDB that is being queried by the endpoint)
    • look up RTX:KG2 in the Synonyms tab in the UI
  • roll master out to the various arax.ncats.io development endpoints. Usually in this order:
    • devED
    • kg2beta
    • beta
    • kg2test
    • test
    • devLM
  • inside the Docker rtx1 container, run the pytest suite on the various ARAX development endpoints (that means devED, devLM, test, and beta):
    • cd /mnt/data/orangeboard/EEE/RTX/code/ARAX/test && pytest -v
  • update our CI/CD testing instance with the new databases:
    • ssh [email protected]
    • cd RTX
    • git pull origin master
    • If there have been changes to requirements.txt, make sure to do ~/venv3.9/bin/pip3 install -r requirements.txt
    • sudo bash
    • mkdir -m 777 /mnt/data/orangeboard/databases/KG2.X.Y
    • exit
    • ~/venv3.9/bin/python3 code/ARAX/ARAXQuery/ARAX_database_manager.py --mnt --skip-if-exists --remove_unused
    • run a Test Build through GitHub Actions, to ensure that the CI/CD is working with the updated databases; all of the pytest tests that are not skipped should pass
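For reference, here are the per-endpoint rollout steps above collected into a single shell transcript, for a hypothetical endpoint named EEE:

```bash
ssh arax.ncats.io                       # needs the ssh config from the top of this issue
sudo docker exec -it rtx1 bash          # enter the rtx1 container
su - rt                                 # become user rt
cd /mnt/data/orangeboard/EEE/RTX        # the EEE endpoint's code repo
git branch                              # should show: * master
git stash                               # stash any updated files (IMPORTANT)
git pull origin master                  # update the code
git stash pop                           # restore updated files
pip3 install -r code/requirements.txt   # only if requirements.txt changed
exit                                    # back to the root session
service RTX_OpenAPI_EEE restart         # restart the service
tail -f /tmp/RTX_OpenAPI_EEE.elog       # watch the startup log
```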

6. Final items/clean up:

  • update the current RTX GitHub changelog issue (add the rollout of this KG2 version as a changelog item)
  • delete the kg2.X.Yc branch in the RTX repo (since it has been merged into master at this point)
  • turn off the old KG2c version's Neo4j instance (if it has not already been turned off; it is likely to have been turned off when the old KG2c was rolled out)
    • determine the DNS A record hostname for kg2-X-Zc.rtx.ai (where Z is one less than the new minor release version): run nslookup kg2-X-Zc.rtx.ai (it will return either kg2canonicalized.rtx.ai or kg2canonicalized2.rtx.ai; we'll call it kg2canonicalizedN.rtx.ai).
    • message the #deployment channel in the ARAXTeam Slack workspace that you will be stopping the kg2canonicalizedN.rtx.ai Neo4j endpoint
    • ssh [email protected]
    • sudo service neo4j stop
    • In the AWS console, stop the instance kg2canonicalizedN.rtx.ai
  • turn off the old KG2c version's Plover instance (if it has not already been turned off during the previous KG2c roll-out; under normal circumstances, we turn off the self-hosted PloverDB for the new KG2c during clean-up)
    • Determine the DNS A record hostname for kg2-X-Zcplover.rtx.ai (where Z is one less than the new minor release version): run nslookup kg2-X-Zcplover.rtx.ai (it will return either kg2cplover.rtx.ai, kg2cplover2.rtx.ai, or kg2cplover3.rtx.ai; we'll call it kg2cploverN.rtx.ai).
    • message the #deployment channel in the ARAXTeam Slack workspace that you will be stopping the kg2-X-Zcplover.rtx.ai PloverDB service
    • Log into kg2cploverN.rtx.ai: ssh [email protected]
    • Stop the PloverDB container: sudo docker stop plovercontainer2.X.Z (if you are not sure of the container name, use sudo docker container ls -a to get the container name).
  • turn off the new KG2pre version's Neo4j instance (Coordinate with the KG2pre team before doing this)
  • deploy new PloverDB service into ITRB CI that is backed by the new KG2c database:
    • merge PloverDB main branch into kg2.X.Yc branch (if main has any commits ahead of kg2.X.Yc). Reference this issue (via its full GitHub URL) in the merge message.
    • merge PloverDB kg2.X.Yc branch into main branch. Reference this issue (via its full GitHub URL) in the merge message.
    • update kg_config.json in the main branch of the Plover repo to point to the new kg2c_lite_2.X.Y.json.gz file (push this change)
    • wait about 60 minutes for Jenkins to build the PloverDB project and deploy it to kg2cploverdb.ci.transltr.io
    • verify the CI Plover is running the new KG2 version by running the following test and inspecting the command line output: cd PloverDB/test && pytest -vsk test_version --endpoint https://kg2cploverdb.ci.transltr.io (a simpler spot-check is sketched at the end of this section)
    • run Plover tests to verify it's working: cd PloverDB/test && pytest -v --endpoint https://kg2cploverdb.ci.transltr.io
    • run the ARAX pytest suite with the NCATS endpoint plugged in (locally change the URL in RTX/code/config_dbs.json and set force_local = True in Expand)
    • if all tests pass, update RTX/code/config_dbs.json in the master branch to point to the ITRB Plover endpoints (all maturity levels): (dev: kg2cploverdb.ci.transltr.io; test: kg2cploverdb.test.transltr.io; prod: kg2cploverdb.transltr.io)
    • push the latest master branch code commit to the various endpoints on arax.ncats.io that you previously updated (this is in order to get the changed config_dbs.json file) and restart ARAX and KG2 services
    • check the Test Build (CI/CD tests) to make sure all non-skipped pytest tests have passed
    • turn off the self-hosted plover endpoint for the new version of KG2c
      • message the #deployment channel to notify people what you are about to do
      • ssh [email protected]
      • sudo docker container ls -a (gives you the name of the container; assume it is plovercontainer2.X.Y)
      • sudo docker stop plovercontainer2.X.Y
    • verify once more that ARAX is still working properly, even with the self-hosted new-KG2c-version PloverDB service turned off
    • delete the kg2.X.Yc branch in the PloverDB repo (since it has been merged into main at this point)
  • upload the new kg2c_lite_2.X.Y.json.gz file to the translator-lfs-artifacts repo (ask Amy Glen or Sundar Pullela, who have permission to do this)
  • upload the new kg2_nodes_not_in_sri_nn.tsv file to the translator-lfs-artifacts repo
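As a quick spot-check that the ITRB CI Plover picked up the new build, you can query its /code_version endpoint (also referenced in the comments below):

```bash
curl https://kg2cploverdb.ci.transltr.io/code_version
```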
@sundareswarpullela (Collaborator, Author):

KG2.10.1c Synonymizer build completed; I'll be poking at it a bit and then proceed with a test KG2.10.1c build, followed by the full build.

@sundareswarpullela (Collaborator, Author):

Commenced the KG2.10.1c build on buildkg2c.rtx.ai in the kg2cbuild screen session.

@amykglen (Member):

from today's AHM: @sundareswarpullela - let's make a final decision on Friday as to whether we will squeeze KG2.10.1c into Sprint 6 (ends Oct. 4). (figuring we will, but would be good to get a little more testing in before officially deciding)

@sundareswarpullela (Collaborator, Author):

All ARAX pytests are passing.

@sundareswarpullela (Collaborator, Author):

All available databases uploaded onto arax.ncats.io. Once Mohsen is done building the latest curie_ngd database, I'll upload that onto arax.ncats.io too.

@amykglen (Member) commented Oct 12, 2024:

@sundareswarpullela - looks like CI Plover finished building! https://kg2cploverdb.ci.transltr.io/code_version

I'm seeing Plover's pytest suite passing with:

```
cd PloverDB/test
pytest -v --endpoint https://kg2cploverdb.ci.transltr.io
```
