The sections below detail how to run the test suite.
It is difficult if not impossible to install the PDP on a typical development workstation (particularly since the transition to Ubuntu 20.04).
To fill that gap, we've defined Docker infrastructure that allows you to
build and run a Docker container for testing that is equivalent to the
production environment. The infrastructure is in `docker/local-test/`.
## Advance prep

Do each of the following things once per workstation.

- Configure Docker user namespace mapping:
  - Clone `pdp-docker`.
  - Follow the instructions in the `pdp-docker` documentation, "Setting up Docker namespace remapping (with recommended parameters)".
- Create `docker/local-test/env-with-passwords.env` from `docker/local-run/env.env` by adding passwords for the `pcic_meta` and `crmp` databases.
## Build the image

The image need only be (re)built when:

- the project is first cloned, or
- the local-test Dockerfile changes.

To build the image:

```
docker-compose -f docker/local-test/docker-compose.yaml build
```
## Mount the gluster `/storage` volume

Mount locally to `/storage` so that those data files are accessible on your workstation:

```
sudo mount -t cifs -o [email protected] //pcic-storage.pcic.uvic.ca/storage/ /storage
```
## Start the test container

```
docker-compose -f ./docker/local-test/docker-compose.yaml run --rm local-test
```

This starts the container, installs the local codebase (which may take over a minute), and gives you a bash shell. You should see a standard bash prompt.
## Change code and run tests

Each time you wish to run tests on your local codebase, enter a suitable command at the prompt. For example:

```
pytest -v -m "not local_only" --tb=short tests -x
```

Do not stop the container until you have finished all the changes and testing you wish to do in a given session. It is far more time-efficient to run tests inside the same container (avoiding startup time) than to restart the container for each test run.

Your local codebase is mounted to the container and installed in editable/development mode (`pip install -e .`). Any code changes you make externally (in your local filesystem) are reflected "live" inside the container.
## Stop the test container

When you have completed a develop-and-test session and no longer wish to have the test container running, enter Ctrl+D at the command line. The container is stopped and automatically removed.

As noted above, running tests in the test container in read/write mode leaves problematic `__pycache__` junk behind in the host filesystem. This can be cleaned up by running:

```
py3clean .
```
## Node.js tests

All JS tests are found in the directory `pdp/static/js/__test__`.

No configuration is required to run the Node.js tests. Simply:

```
npm run test
```
When each portal is instantiated, it loads its default dataset, which is specified in each portal's JavaScript code. In order to completely and correctly load a dataset (and therefore an entire portal for testing), the front end makes several queries to data services, all of which must be mocked with information for that particular dataset.
Please note that the testing harness does not set `url_base`, and portals instantiated for testing purposes have a `$(location).href` value of `http://localhost/`. This means that portals with an "archive" functionality (displaying two different sets of data, but otherwise behaving identically, depending on the URL used to access them) will always determine, via `pdp_controls.isArchivePortal()`, that they are not currently displaying the archived data, because the word "archive" is not present in their self-perceived URL when instantiated by tests. They will therefore load the non-archive choice when loading their default dataset for testing, and that is the dataset that needs to be mocked. No portals currently have "archive" functionality, but some will in the future, as we make new portals to replace the current ones.
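To make this concrete, here is a sketch of the kind of substring check described above. `isArchivePortalSketch` is a hypothetical stand-in, not the actual `pdp_controls.isArchivePortal` implementation:

```javascript
// Hypothetical stand-in for pdp_controls.isArchivePortal(). The real
// function decides archive-ness from the page URL; under test the URL
// is always "http://localhost/", which contains no "archive".
function isArchivePortalSketch(href) {
  return href.indexOf("archive") !== -1;
}

console.log(isArchivePortalSketch("http://localhost/"));              // false
console.log(isArchivePortalSketch("http://example.org/archive/app")); // true
```

Because the check is false under test, only the non-archive default dataset ever needs mock data.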
A mockup of the backend's `catalog.json`: a JSON object with properties `{[unique_id]: data_url}`. The data URLs should use the `data_root` set in `app-test-helper.js`. The default dataset needs to have an entry, but does not need to be the only entry.
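For illustration, a minimal catalog mock might look like the following. The dataset IDs and the `dataRoot` value are made-up placeholders; in real tests the `data_root` must come from `app-test-helper.js`:

```javascript
// Hypothetical catalog.json mock: {[unique_id]: data_url}.
// "example_unique_id" and dataRoot are placeholders, not real names.
const dataRoot = "http://localhost/data";

const catalogMock = {
  example_unique_id: dataRoot + "/example_unique_id.nc",
  // Extra entries are allowed; only the default dataset is required.
  another_dataset_id: dataRoot + "/another_dataset_id.nc",
};
```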
A mockup of the backend's `menu.json`: a JSON object with portal-specific organization of datasets. The default dataset needs to be represented, but does not need to be the only entry.
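Since the menu shape is portal-specific, the following is only a guess at one plausible grouping; all names are placeholders:

```javascript
// Hypothetical menu.json mock. The actual structure varies by portal;
// this sketch shows one plausible group-to-dataset organization.
const menuMock = {
  "Example variable group": {
    example_unique_id: "Example dataset (default)",
  },
};
```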
A mockup of the backend's `metadata.json`: a JSON object with `units`, `max`, and `min` attributes for the default dataset.
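A sketch of such a mock, with made-up values (the real values must match the default dataset):

```javascript
// Hypothetical metadata.json mock for the default dataset. The portal
// reads units, max, and min; the values here are placeholders.
const metadataMock = {
  units: "degC",
  max: 45.0,
  min: -50.0,
};
```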
A mockup of a pyDAP DDS call. This needs to match the `time` metadata for the default dataset.
A mockup of a pyDAP DAS call. This probably needs to include `lat`, `lon`, `time`, and a variable matching the default dataset's variable, along with their load-bearing attributes like `units`; "extra" attributes like `REFERENCES` are probably skippable.
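As a rough sketch, a mocked DAS body is plain text in DAP attribute syntax. The variable name and attribute values below are placeholders for whatever the default dataset actually uses:

```javascript
// Hypothetical mocked DAS response body (plain text, DAP attribute
// syntax). "tasmax" and all units strings are placeholders.
const dasMock = `Attributes {
    lat {
        String units "degrees_north";
    }
    lon {
        String units "degrees_east";
    }
    time {
        String units "days since 1870-01-01";
    }
    tasmax {
        String units "degC";
    }
}`;
```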
Thankfully, full maps do not need to be loaded for tests, but some metadata is needed.
A representation of ncWMS's `GetCapabilities` query, a very long XML document. It needs to include, at minimum, the unique ID of the dataset in question and the default timestamp (which can be found in the portal's initialization JavaScript), in their usual places in the `<LAYER>` element. The XML includes a long list of available palettes; all but the default one specified in the portal's map initialization code are skippable and do not need to be mocked (`default/ferret` for most variables; `default/blueheat` for some precipitation datasets; `default/occam` for some others).