# OGC API Testbed 17 Dataset testing (D168)

## Installing the ogcapi conda environment

```
conda env create -f environment.yml
```
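For reference, a minimal sketch of the kind of environment file this command expects; the repository's actual `environment.yml` defines the full dependency set, so everything beyond the environment name and the Python 3.9 pin below is illustrative:

```yaml
# Minimal sketch only -- the real environment.yml lists the full dependencies
name: ogcapi
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
```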

The environment is not activated automatically; to activate it run:

```
conda activate ogcapi
```

Note: If using AWS for Elasticsearch, you will need to install the `elasticsearch` Python client at version 7.13.4, as release 7.14.0 no longer supports an AWS-hosted Elasticsearch instance (now called Amazon OpenSearch).
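For example, to pin the client in the activated environment:

```
pip install "elasticsearch==7.13.4"
```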

## Install pystac schemas

Install pystac with the optional validation requirement (e.g. `pip install pystac[validation]`), as the validation schemas are used.
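As a brief sketch of how those schemas get exercised (the item path is illustrative):

```python
import pystac

# Load a STAC Item and validate it against the published JSON schemas;
# validate() raises STACValidationError if the item does not conform.
item = pystac.Item.from_file("catalog/item.json")  # illustrative path
item.validate()
```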

## Install pygeometa

This implementation uses the version from the Pixalytics forked repository, so that specific version needs to be installed:

```
python -m pip install git+https://github.com/pixalytics-ltd/pygeometa.git@t17-rcatalog
```

OR, if you want to further edit the code, set up pygeometa in develop mode:

```
git clone -b t17-rcatalog https://github.com/pixalytics-ltd/pygeometa.git
cd pygeometa
python setup.py develop
```

Note: pygeometa was updated to include a Record that describes a 'dataset'; see https://github.com/cholmes/ogc-collection/blob/main/ogc-dataset-record-spec.md
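For orientation, upstream pygeometa renders an OGC API - Records record from a metadata control file (MCF) along these lines; the forked `t17-rcatalog` branch may differ in detail:

```python
from pygeometa.core import read_mcf
from pygeometa.schemas.ogcapi_records import OGCAPIRecordOutputSchema

# Read the metadata control file and render it as an OGC API - Records record
mcf_dict = read_mcf('eo4sas-record.yml')
record = OGCAPIRecordOutputSchema().write(mcf_dict)
print(record)
```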

## Install pytdml

This implementation uses updated code from the Pixalytics forked repository, so the modified version needs to be installed:

```
python -m pip install git+https://github.com/pixalytics-ltd/pytdml.git@develop
```

## Clone this repository

```
git clone https://github.com/opengeospatial/T17-API-D168.git
cd T17-API-D168
```

There is a soft link to pygeometa under the build catalog folder, so adjust that link if that repository is in a different location.
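For example, assuming the link is named `pygeometa` and you are inside the build catalog folder (paths illustrative):

```
ln -sfn /path/to/your/pygeometa pygeometa
```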

## Code folders

Before running the code, activate the conda environment: `conda activate ogcapi`

### Build catalog

Use `create_catalog.py` to create STAC or Records catalogs with the configuration stored in `test-configuration.yaml`, alongside `eo4sas-record.yml` for a Records catalog. The data referenced in these YAML files is stored in a publicly accessible AWS S3 bucket.

For example, to create a STAC collection run:

```
python create_catalog.py --collection
```

If an output directory for the catalog is not specified with `--outdir`, the folder specified in `test-configuration.yaml` will be used.
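For example, to write the collection to a specific directory (path illustrative):

```
python create_catalog.py --collection --outdir /tmp/ogcapi-catalog
```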

### Deploy catalog

Then, to upload the catalog to an Elasticsearch instance, run the following script with `es_upload_conf.yaml` defining what is uploaded:

```
python upload_esearch.py --verbose --upload
```

If you have problems connecting to Elasticsearch, use the diagnose option:

```
python upload_esearch.py --verbose --diagnose
```

Note: An example configuration file is provided as `deploy_catalog/[example]es_upload_conf.yaml`; rename it to `deploy_catalog/es_upload_conf.yaml` and edit it with the details of your Elasticsearch instance.
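For example, from the repository root:

```
cp "deploy_catalog/[example]es_upload_conf.yaml" deploy_catalog/es_upload_conf.yaml
```

The quotes stop the shell from treating the square brackets in the filename as a glob pattern.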

### utils

Utilities used to support file conversion from GeoTIFF to COG or NetCDF.
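As one possible sketch of such a conversion, GDAL (3.1+) can write a Cloud Optimized GeoTIFF (COG) directly; the utilities in this folder may use a different toolchain:

```
gdal_translate input.tif output_cog.tif -of COG -co COMPRESS=DEFLATE
```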

## Example outputs

### Static deployment via AWS S3 bucket

These are version 0-8 catalogs with multiple objects. A public-access S3 bucket has been set up that contains both the catalogs and imagery:

### Dynamic catalogs deployed using the D165 server

The D165 server has its own GitHub repository at https://github.com/opengeospatial/T17-API-D165

OGC API - Features server with three catalogs (CubeWerx alongside Elasticsearch versions of the Records and STAC GeoTIFF catalogs):

OGC API - EDR implementation with a single multi-time-step NetCDF: