Commit

Adjust based on pre-commit
LouisK92 committed Mar 27, 2024
1 parent a8f0d53 commit 8025d0d
Showing 2 changed files with 22 additions and 15 deletions.
21 changes: 13 additions & 8 deletions tests/README.md
@@ -11,22 +11,27 @@ pytest <args>
```

## Regenerating the test data
For testing, some files were generated at some point in the past. These files shouldn't be changed. However,
there are cases in which they need to be recreated, e.g. an `anndata` update could lead to warnings when loading h5ads
saved with older versions of `anndata`, or similar issues. In this case, the test data can be regenerated by running:

1. For test data etc.:
Run the functions of the file `tests/_generate_test_files.py`. (Not implemented yet; a hypothetical sketch of such a script follows this list.)

2. For tests that compare their outputs to previously generated outputs (mainly plots):
- Run the corresponding tests
- Find out the temp directory of pytest, e.g. from `python`:

```python
import tempfile
tempfile.gettempdir()
```

The newest outputs should be in a folder like `<tempdir>/pytest-of-<user>/pytest-<pid>/test_<testname>/`
- Copy the new outputs to the corresponding test subfolder `tests/...` and overwrite the old ones (a sketch of this step follows the list).

3. Some tests don't save their outputs to files but compare them directly to reference values. In this case, there
should be a comment in the test code that explains how to regenerate the reference values, e.g. the function `test_knns_shared_comp` in `spapros/tests/evaluation/test_metrics.py`.
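
A purely hypothetical sketch of what `tests/_generate_test_files.py` could do for step 1 (the script is not implemented yet); it simply resaves an existing test h5ad with the currently installed `anndata` version, and the path is a placeholder:

```python
# Hypothetical sketch only: rewrite a test h5ad with the current anndata version
# so that loading it in tests no longer emits compatibility warnings.
import anndata as ad


def regenerate_h5ad(path: str) -> None:
    """Read an existing test h5ad and write it back with the installed anndata version."""
    adata = ad.read_h5ad(path)
    adata.write_h5ad(path)


if __name__ == "__main__":
    regenerate_h5ad("tests/<data_subfolder>/<dataset>.h5ad")  # placeholder path
```

For step 2, a minimal sketch of the copy, assuming the folder pattern above; the test directory name and the destination subfolder are placeholders, not actual repository paths:

```python
# Sketch only: copy the newest pytest plot outputs over the stored reference files.
import getpass
import shutil
import tempfile
from pathlib import Path

pytest_root = Path(tempfile.gettempdir()) / f"pytest-of-{getpass.getuser()}"
newest_run = max(pytest_root.glob("pytest-*"), key=lambda p: p.stat().st_mtime)

test_dir = newest_run / "test_<testname>"            # directory created by the test run
reference_dir = Path("tests/<reference_subfolder>")  # placeholder: reference folder to overwrite

for src in test_dir.glob("*.png"):
    shutil.copy(src, reference_dir / src.name)
```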
16 changes: 9 additions & 7 deletions tests/plotting/test_plotting.py
@@ -8,22 +8,24 @@
# Note: The figures depend on the environment in which they are created!
# Tests might fail if the compared figures were generated in different envs, e.g. the development env and the test env.


def _transform_string(s):
    """Transforms a string by replacing ': ' with '-' and ', ' with '_',
    and then removing characters that are problematic in Windows filenames.

    In the tests we use the kwargs of the functions as strings to name output files. To get a valid name on Windows
    we need to replace certain characters.
    """
    # Initial replacements
    transformed = s.replace(": ", "-").replace(", ", "_")

    # Additional removals
    for char in [":", ",", "{", "}", "'", "[", "]"]:
        transformed = transformed.replace(char, "")

    return transformed

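# Illustrative example (added for clarity, not part of the original file): kwargs
# rendered as a string become a Windows-safe file name. Runs only when the module
# is executed directly, so it does not affect the test suite.
if __name__ == "__main__":
    assert _transform_string("{'groupby': 'celltype', 'n_cols': 3}") == "groupby-celltype_n_cols-3"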

#############
# selection #
#############
