Implement pre-commit. #6

Merged · 4 commits · Apr 6, 2024
5 changes: 5 additions & 0 deletions .git-blame-ignore-revs
@@ -0,0 +1,5 @@
# For more information, see:
# https://docs.github.com/en/repositories/working-with-files/using-files/viewing-a-file#ignore-commits-in-the-blame-view

# Black code formatting of entire repository
56dd43f69d901abbba6cfb765a98dee26ff71cfc
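For local `git blame` output to also honour this file, git must be told about it once per clone. This is a standard git feature (available since git 2.23), not something this PR configures automatically; the sketch below just points git at the ignore file added above:

```bash
# Point git blame at the ignore file for all future invocations in this clone:
git config blame.ignoreRevsFile .git-blame-ignore-revs

# Alternatively, supply it for a single invocation:
git blame --ignore-revs-file .git-blame-ignore-revs README.md
```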
2 changes: 1 addition & 1 deletion .github/pull_request_template.md
@@ -11,6 +11,6 @@ A short description of the changes in this PR.
## PR Acceptance Checklist
* [ ] Jira ticket acceptance criteria met.
* [ ] `CHANGELOG.md` updated to include high level summary of PR changes.
- * [ ] `VERSION` updated if publishing a release.
+ * [ ] `docker/service_version.txt` updated if publishing a release.
Member Author: @sudha-murthy spotted this in her PR for DAS-2106.

* [ ] Tests added/updated and passing.
* [ ] Documentation updated (if needed).
20 changes: 20 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,20 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-json
- id: check-yaml
- id: check-added-large-files
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.3.4
hooks:
- id: ruff
args: ["--fix", "--show-fixes"]
- repo: https://github.com/psf/black-pre-commit-mirror
rev: 24.3.0
hooks:
- id: black-jupyter
Member Author: This is a new hook since v21.something of black. The name is a little misleading: it adds checking of Jupyter notebooks alongside the existing functionality, rather than only checking Jupyter notebooks.

args: ["--skip-string-normalization"]
language_version: python3.11
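Two routine `pre-commit` commands are worth knowing alongside this configuration (a sketch of standard usage, not steps performed in this PR): bumping the pinned `rev` values above, and running a single hook by its `id`:

```bash
# Bump every hook's pinned rev to its latest upstream tag:
pre-commit autoupdate

# Run only the black formatting hook across the whole repository:
pre-commit run black-jupyter --all-files
```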
18 changes: 12 additions & 6 deletions CHANGELOG.md
@@ -1,13 +1,19 @@
+ ## v1.0.4
+ ### 2024-04-05
+
+ This version of HOSS implements `black` code formatting across the repository.
+ There should be no functional changes in the service.

## v1.0.3
- ### 2024-3-29
+ ### 2024-03-29

This version of HOSS handles the error in the `crs_wkt` attribute in ATL19, where
the north polar CRS variable has a leading quotation mark escaped by a backslash
in the `crs_wkt` attribute. This causes errors when the projection is being
interpreted from the CRS variable attributes.

## v1.0.2
- ### 2024-2-26
+ ### 2024-02-26

This version of HOSS correctly handles edge-aligned geographic collections by
adding the attribute `cell_alignment` with the value `edge` to `hoss_config.json`
33 changes: 33 additions & 0 deletions README.md
@@ -240,6 +240,39 @@ newest release of the code (starting at the top of the file).
## vX.Y.Z
```

### pre-commit hooks:

This repository uses [pre-commit](https://pre-commit.com/) to run pre-commit
checks on the repository, enforcing some coding standard best practices. These
include:

* Removing trailing whitespace.
* Removing blank lines at the end of a file.
* Checking that JSON and YAML files are valid.
* [ruff](https://github.com/astral-sh/ruff) Python linting checks.
* [black](https://black.readthedocs.io/en/stable/index.html) Python code
  formatting checks.

To enable these checks:

```bash
# Install pre-commit Python package as part of test requirements:
pip install -r tests/pip_test_requirements.txt

# Install the git hook scripts:
pre-commit install

# (Optional) Run against all files:
pre-commit run --all-files
```

When you try to make a new commit locally, `pre-commit` will run automatically.
If any of the hooks detect non-compliance (e.g., trailing whitespace), that hook
will report a failure and, where possible, automatically fix the issue. You will
then need to review and `git add` the changes before the commit can succeed, as
sketched below.
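As an illustration (the commit message and file states here are hypothetical placeholders), the fail, fix and retry cycle looks like:

```bash
# First attempt: a hook such as trailing-whitespace fails and fixes files in place.
git commit -m "Add new feature"

# Review the automatic fixes, stage them, and commit again:
git diff
git add -u
git commit -m "Add new feature"
```

Note that `git add -u` stages only already-tracked files, which matches the usual case of hooks modifying existing files.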

Additional hooks are planned, possibly including tools such as `mypy`.

## Get in touch:

You can reach out to the maintainers of this repository via email:
2 changes: 1 addition & 1 deletion docker/service_version.txt
@@ -1 +1 @@
- 1.0.3
+ 1.0.4
2 changes: 1 addition & 1 deletion docker/tests.Dockerfile
@@ -16,7 +16,7 @@ ENV PYTHONDONTWRITEBYTECODE=1
COPY tests/pip_test_requirements.txt .
RUN conda run --name hoss pip install --no-input -r pip_test_requirements.txt

- # Copy test directory containing Python unittest suite, test data and utilities 
+ # Copy test directory containing Python unittest suite, test data and utilities
Member Author: Woop! The trailing whitespace check picked this up and automatically fixed it (note: that fix still needs to be added and committed, so we still get to review the magical changes).

COPY ./tests tests

# Set conda environment to hoss, as conda run will not stream logging.
60 changes: 41 additions & 19 deletions docs/HOSS_DAAC_Operator_Documentation.ipynb
@@ -170,8 +170,10 @@
"metadata": {},
"outputs": [],
"source": [
"temporal_range = {'start': datetime(2020, 1, 1, 0, 0, 0),\n",
" 'stop': datetime(2020, 1, 31, 23, 59, 59)}"
"temporal_range = {\n",
" 'start': datetime(2020, 1, 1, 0, 0, 0),\n",
" 'stop': datetime(2020, 1, 31, 23, 59, 59),\n",
"}"
]
},
{
@@ -273,14 +275,19 @@
"outputs": [],
"source": [
"# Define the request:\n",
"variable_subset_request = Request(collection=collection, variables=[variable_to_subset], max_results=1)\n",
"variable_subset_request = Request(\n",
" collection=collection, variables=[variable_to_subset], max_results=1\n",
")\n",
"\n",
"# Submit the request and download the results\n",
"variable_subset_job_id = harmony_client.submit(variable_subset_request)\n",
"harmony_client.wait_for_processing(variable_subset_job_id, show_progress=True)\n",
"variable_subset_outputs = [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(variable_subset_job_id, overwrite=True)]\n",
"variable_subset_outputs = [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" variable_subset_job_id, overwrite=True\n",
" )\n",
"]\n",
"\n",
"replace(variable_subset_outputs[0], 'hoss_variable_subset.nc4')\n",
"\n",
@@ -308,15 +315,22 @@
"outputs": [],
"source": [
"# Define the request:\n",
"temporal_subset_request = Request(collection=collection, temporal=temporal_range,\n",
" variables=[variable_to_subset], max_results=1)\n",
"temporal_subset_request = Request(\n",
" collection=collection,\n",
" temporal=temporal_range,\n",
" variables=[variable_to_subset],\n",
" max_results=1,\n",
")\n",
"\n",
"# Submit the request and download the results\n",
"temporal_subset_job_id = harmony_client.submit(temporal_subset_request)\n",
"harmony_client.wait_for_processing(temporal_subset_job_id, show_progress=True)\n",
"temporal_subset_outputs = [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(temporal_subset_job_id, overwrite=True)]\n",
"temporal_subset_outputs = [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" temporal_subset_job_id, overwrite=True\n",
" )\n",
"]\n",
"\n",
"replace(temporal_subset_outputs[0], 'hoss_temporal_subset.nc4')\n",
"\n",
@@ -351,14 +365,17 @@
"outputs": [],
"source": [
"# Define the request:\n",
"bbox_subset_request = Request(collection=collection, spatial=bounding_box, max_results=1)\n",
"bbox_subset_request = Request(\n",
" collection=collection, spatial=bounding_box, max_results=1\n",
")\n",
"\n",
"# Submit the request and download the results\n",
"bbox_subset_job_id = harmony_client.submit(bbox_subset_request)\n",
"harmony_client.wait_for_processing(bbox_subset_job_id, show_progress=True)\n",
"bbox_subset_outputs = [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(bbox_subset_job_id, overwrite=True)]\n",
"bbox_subset_outputs = [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(bbox_subset_job_id, overwrite=True)\n",
"]\n",
"\n",
"replace(bbox_subset_outputs[0], 'hoss_bbox_subset.nc4')\n",
"\n",
@@ -389,14 +406,19 @@
"outputs": [],
"source": [
"# Define the request:\n",
"shape_file_subset_request = Request(collection=collection, shape='shape_files/bermuda_triangle.geo.json', max_results=1)\n",
"shape_file_subset_request = Request(\n",
" collection=collection, shape='shape_files/bermuda_triangle.geo.json', max_results=1\n",
")\n",
"\n",
"# Submit the request and download the results\n",
"shape_file_subset_job_id = harmony_client.submit(shape_file_subset_request)\n",
"harmony_client.wait_for_processing(shape_file_subset_job_id, show_progress=True)\n",
"shape_file_subset_outputs = [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(shape_file_subset_job_id, overwrite=True)]\n",
"shape_file_subset_outputs = [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" shape_file_subset_job_id, overwrite=True\n",
" )\n",
"]\n",
"\n",
"replace(shape_file_subset_outputs[0], 'hoss_shape_file_subset.nc4')\n",
"# Inspect the results:\n",
90 changes: 63 additions & 27 deletions docs/HOSS_User_Documentation.ipynb
@@ -127,14 +127,19 @@
"source": [
"variables = ['atmosphere_cloud_liquid_water_content']\n",
"\n",
"variable_subset_request = Request(collection=ghrc_collection, variables=variables, granule_id=[ghrc_granule_id])\n",
"variable_subset_request = Request(\n",
" collection=ghrc_collection, variables=variables, granule_id=[ghrc_granule_id]\n",
")\n",
"variable_subset_job_id = harmony_client.submit(variable_subset_request)\n",
"\n",
"print(f'Processing job: {variable_subset_job_id}')\n",
"\n",
"for filename in [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(variable_subset_job_id, overwrite=True, directory=demo_directory)]:\n",
"for filename in [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" variable_subset_job_id, overwrite=True, directory=demo_directory\n",
" )\n",
"]:\n",
" print(f'Downloaded: {filename}')"
]
},
@@ -157,14 +162,19 @@
"source": [
"gpm_bounding_box = BBox(w=45, s=-45, e=75, n=-15)\n",
"\n",
"bbox_request = Request(collection=gpm_collection, spatial=gpm_bounding_box, granule_id=[gpm_granule_id])\n",
"bbox_request = Request(\n",
" collection=gpm_collection, spatial=gpm_bounding_box, granule_id=[gpm_granule_id]\n",
")\n",
"bbox_job_id = harmony_client.submit(bbox_request)\n",
"\n",
"print(f'Processing job: {bbox_job_id}')\n",
"\n",
"for filename in [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(bbox_job_id, overwrite=True, directory=demo_directory)]:\n",
"for filename in [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" bbox_job_id, overwrite=True, directory=demo_directory\n",
" )\n",
"]:\n",
" print(f'Downloaded: {filename}')"
]
},
@@ -196,15 +206,22 @@
"gpm_bounding_box = BBox(w=45, s=-45, e=75, n=-15)\n",
"gpm_variables = ['/Grid/precipitationCal']\n",
"\n",
"combined_request = Request(collection=gpm_collection, spatial=gpm_bounding_box,\n",
" granule_id=[gpm_granule_id], variables=gpm_variables)\n",
"combined_request = Request(\n",
" collection=gpm_collection,\n",
" spatial=gpm_bounding_box,\n",
" granule_id=[gpm_granule_id],\n",
" variables=gpm_variables,\n",
")\n",
"combined_job_id = harmony_client.submit(combined_request)\n",
"\n",
"print(f'Processing job: {combined_job_id}')\n",
"\n",
"for filename in [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(combined_job_id, overwrite=True, directory=demo_directory)]:\n",
"for filename in [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" combined_job_id, overwrite=True, directory=demo_directory\n",
" )\n",
"]:\n",
" print(f'Downloaded: {filename}')"
]
},
@@ -229,14 +246,19 @@
"source": [
"ghrc_bounding_box = BBox(w=-30, s=-50, e=30, n=0)\n",
"\n",
"edge_request = Request(collection=ghrc_collection, spatial=ghrc_bounding_box, granule_id=[ghrc_granule_id])\n",
"edge_request = Request(\n",
" collection=ghrc_collection, spatial=ghrc_bounding_box, granule_id=[ghrc_granule_id]\n",
")\n",
"edge_job_id = harmony_client.submit(edge_request)\n",
"\n",
"print(f'Processing job: {edge_job_id}')\n",
"\n",
"for filename in [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(edge_job_id, overwrite=True, directory=demo_directory)]:\n",
"for filename in [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" edge_job_id, overwrite=True, directory=demo_directory\n",
" )\n",
"]:\n",
" print(f'Downloaded: {filename}')"
]
},
@@ -268,15 +290,22 @@
"point_in_pixel_box = BBox(w=43.2222, s=-25.1111, e=43.2222, n=-25.1111)\n",
"gpm_variables = ['/Grid/precipitationCal']\n",
"\n",
"point_in_pixel_request = Request(collection=gpm_collection, spatial=point_in_pixel_box,\n",
" granule_id=[gpm_granule_id], variables=gpm_variables)\n",
"point_in_pixel_request = Request(\n",
" collection=gpm_collection,\n",
" spatial=point_in_pixel_box,\n",
" granule_id=[gpm_granule_id],\n",
" variables=gpm_variables,\n",
")\n",
"point_in_pixel_job_id = harmony_client.submit(point_in_pixel_request)\n",
"\n",
"print(f'Processing job: {point_in_pixel_job_id}')\n",
"\n",
"for filename in [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(point_in_pixel_job_id, overwrite=True, directory=demo_directory)]:\n",
"for filename in [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" point_in_pixel_job_id, overwrite=True, directory=demo_directory\n",
" )\n",
"]:\n",
" print(f'Downloaded: {filename}')"
]
},
@@ -298,15 +327,22 @@
"corner_point_box = BBox(w=160, s=20, e=160, n=20)\n",
"gpm_variables = ['/Grid/precipitationCal']\n",
"\n",
"corner_point_request = Request(collection=gpm_collection, spatial=corner_point_box,\n",
" granule_id=[gpm_granule_id], variables=gpm_variables)\n",
"corner_point_request = Request(\n",
" collection=gpm_collection,\n",
" spatial=corner_point_box,\n",
" granule_id=[gpm_granule_id],\n",
" variables=gpm_variables,\n",
")\n",
"corner_point_job_id = harmony_client.submit(corner_point_request)\n",
"\n",
"print(f'Processing job: {corner_point_job_id}')\n",
"\n",
"for filename in [file_future.result()\n",
" for file_future\n",
" in harmony_client.download_all(corner_point_job_id, overwrite=True, directory=demo_directory)]:\n",
"for filename in [\n",
" file_future.result()\n",
" for file_future in harmony_client.download_all(\n",
" corner_point_job_id, overwrite=True, directory=demo_directory\n",
" )\n",
"]:\n",
" print(f'Downloaded: {filename}')"
]
}
2 changes: 1 addition & 1 deletion docs/requirements.txt
@@ -1,4 +1,4 @@
- # 
+ #
# These requirements are used by the documentation Jupyter notebooks in the
# harmony-opendap-subsetter/docs directory.
#