GitHub Actions and ReadTheDocs integration #262

Open
aperezhortal wants to merge 27 commits into master from aperezhortal:rtd_using_gh_artifacts
Conversation

aperezhortal
Member

Recently I stumbled upon this GitHub Action (rtds-action), designed to build example galleries using GitHub Actions and the documentation using ReadTheDocs. I was curious to try this out and see how it goes (also, I wanted to play with GH artifacts 🤓).

Inspired by the rtds-action workflow, this PR adds a new GitHub Action that renders the example gallery and stores it as an artifact on GitHub. Then, it triggers a ReadTheDocs build that uses the rendered example gallery to build the documentation. This reduces the ReadTheDocs build time and allows us to keep the example gallery in the main repo and control when it is rendered. For example, by default, the RTD builds are triggered by changes in the main branch and by new releases.
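For reference, here is a minimal sketch of the kind of call the workflow's final step could make to trigger the RTD build through the project's generic incoming webhook. The secret names RTD_WEBHOOK_URL and RTD_WEBHOOK_TOKEN are hypothetical; the actual workflow may use rtds-action or a different mechanism.

```python
# Minimal sketch, not the actual workflow step: trigger a Read the Docs build
# once the rendered gallery has been uploaded as an artifact.
# RTD_WEBHOOK_URL and RTD_WEBHOOK_TOKEN are hypothetical names for GitHub
# secrets exposed as environment variables.
import os

import requests

response = requests.post(
    os.environ["RTD_WEBHOOK_URL"],  # e.g. https://readthedocs.org/api/v2/webhook/<project>/<id>/
    data={
        "token": os.environ["RTD_WEBHOOK_TOKEN"],
        # Branch to build; GITHUB_REF_NAME is set by GitHub Actions.
        "branches": os.environ.get("GITHUB_REF_NAME", "master"),
    },
)
response.raise_for_status()
```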

One advantage of the new approach using artifacts is that everything happens in the same repo. An event triggers the action, the action builds the documentation, uploads the artifact, and finally triggers Read the Docs. A disadvantage is that the artifacts are deleted after 90 days. If we rebuild the documentation from ReadTheDocs after that period for a particular tagged version, the build will fail. In that case, we need to manually trigger the GitHub Action for that tag using GitHub's REST API.
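For illustration, the manual re-trigger could look like the sketch below, assuming the workflow declares a workflow_dispatch trigger. The workflow file name (build_docs.yml), the GH_TOKEN variable, and the tag are hypothetical examples.

```python
# Hypothetical sketch of re-running the docs workflow for a tag via GitHub's
# REST API; requires a workflow_dispatch trigger in the workflow file.
import os

import requests

owner, repo, workflow = "pySTEPS", "pysteps", "build_docs.yml"  # workflow name is hypothetical
response = requests.post(
    f"https://api.github.com/repos/{owner}/{repo}/actions/workflows/{workflow}/dispatches",
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"token {os.environ['GH_TOKEN']}",  # a token with repo scope
    },
    json={"ref": "v1.4.1"},  # the tag to rebuild (example value)
)
response.raise_for_status()  # GitHub returns 204 No Content on success
```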

To decouple or not to decouple? - That is the question. 🤔

There is another PR (#251) that solves the same issue (reducing the RTD build time) by moving the example gallery to a different repository. This PR and #251 are both valid solutions to the problem. However, in my (short) experience, the workflow using artifacts was easier to develop (and debug) than the one using the decoupled gallery, since almost everything happens in a single repository.

What are your thoughts on this? @kmuehlbauer, @dnerini?

Notes

I have submitted this PR from my personal fork since it requires a different configuration for the RTD webhooks. Changing this in the pysteps repo would stop the automatic triggering of RTD for master and the other branches.

https://github.com/aperezhortal/pysteps/tree/rtd_using_gh_artifacts

Here is the example gallery

Important

Before merging this PR, we need to update the GitHub secrets and the RTD configuration.

@codecov

codecov bot commented Feb 6, 2022

Codecov Report

Merging #262 (359680f) into master (72f3bf5) will decrease coverage by 0.02%.
The diff coverage is 25.00%.

❗ Current head 359680f differs from pull request most recent head dc46166. Consider uploading reports for the commit dc46166 to get more accurate results


@@            Coverage Diff             @@
##           master     #262      +/-   ##
==========================================
- Coverage   80.92%   80.90%   -0.03%     
==========================================
  Files         142      142              
  Lines       10758    10761       +3     
==========================================
  Hits         8706     8706              
- Misses       2052     2055       +3     
Flag         Coverage Δ
unit_tests   80.90% <25.00%> (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                           Coverage Δ
pysteps/verification/spatialscores.py    85.42% <ø> (ø)
pysteps/datasets.py                      38.01% <25.00%> (-0.68%) ⬇️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 72f3bf5...dc46166.

@dnerini
Member

dnerini commented Feb 7, 2022

This looks very promising! As you say, this solution with GitHub artifacts seems very elegant and arguably easier to maintain, which is an important aspect to consider.

A disadvantage is that the artifacts are deleted after 90 days. If we re-build the documentation from ReadTheDocs after that period for a particular tagged version, the build will fail. In this case, we need to manually trigger the GitHub action for that tag using Github's REST API.

Would a schedule event be a solution for this? I assume we would need to regularly trigger the master branch only for the "latest" version of our docs, since stable releases are unlikely to need a rebuild.

@kmuehlbauer

@aperezhortal 👍 💯 Great work! I think the major selling point is the neat automation and that the rendered artifact IS actually deleted after some period of time. So no rendered branches are left in the repo clogging up space.

Two things to consider. First, reproducibility: if an artifact needs to be rebuilt after some time, it might be worthwhile to keep the environment used around somehow (maybe rendered in the docs?). Second, automation has its cost: if notebook cells render errors (e.g. broken images), this has to be captured somehow. Otherwise it will directly end up on RTD (as it does now, if no one checks).

@aperezhortal
Member Author

Would a schedule event be a solution for this? I assume we would need to regularly trigger the master branch only for the "latest" version of our docs, since stable releases are unlikely to need a rebuild.

The docs for the master branch are rendered with every push and pull request event, which will keep RTD up to date. For the manual re-triggering, I was actually thinking of the unlikely case where the documentation for a tagged version needs to be rebuilt.

@aperezhortal
Member Author

Thanks for the suggestion @kmuehlbauer!

Two things to consider. First, reproducibility: if an artifact needs to be rebuilt after some time, it might be worthwhile to keep the environment used around somehow (maybe rendered in the docs?).

Good point. I like the idea! I will try including it in RTD in some folder.

Second, automation has its cost: if notebook cells render errors (e.g. broken images), this has to be captured somehow. Otherwise it will directly end up on RTD (as it does now, if no one checks).

Yes, now the CI only deals with errors in the documentation build. That is, errors that stop the rendering of the documentation. In this case, RTD is not triggered. However, I can see small errors in the gallery going through unnoticed. I'll check how they are captured and reported.

@aperezhortal
Member Author

If notebook cells render errors (e.g. broken images), this has to be captured somehow. Otherwise, it will directly end up on RTD (as it does now if no one checks).

Commit 933ecec adds the "abort_on_example_error": True option in conf.py to abort the documentation build if an error is raised, so that RTD is not triggered. However, this only captures execution errors and ignores possible errors in the rendered images. Maybe this is not the ultimate solution, but it is a good start.
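For context, the relevant sphinx-gallery setting looks roughly like this in conf.py. The surrounding values are illustrative; only abort_on_example_error is the option added by the commit.

```python
# Excerpt from the docs conf.py (illustrative values except abort_on_example_error).
sphinx_gallery_conf = {
    "examples_dirs": "../../examples",  # where the example scripts live
    "gallery_dirs": "auto_examples",    # where the rendered gallery is written
    # Abort the Sphinx build as soon as an example raises an exception,
    # so a broken gallery never triggers the RTD build.
    "abort_on_example_error": True,
}
```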

Also, the environment used to render the examples is now saved in the auto-examples folder in the documentation. This environment can be used to re-render the gallery for a specific commit by manually checking it out and triggering the action. However, this won't be straightforward and will probably require a custom action or a temporary branch. Nonetheless, the information necessary to re-render is there for the exceptional case where we need to fix something.
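As an illustration of the environment snapshot idea (not necessarily how the PR implements it), something along these lines would write the installed package versions next to the rendered gallery:

```python
# Hypothetical sketch: record the packages used to render the examples so the
# information ships with the documentation. Paths and file names are assumptions.
import subprocess
from pathlib import Path

out_dir = Path("doc/source/auto_examples")  # illustrative gallery output folder
out_dir.mkdir(parents=True, exist_ok=True)
frozen = subprocess.run(
    ["python", "-m", "pip", "freeze"],
    capture_output=True,
    text=True,
    check=True,
).stdout
(out_dir / "environment_used.txt").write_text(frozen)
```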

@dnerini
Member

dnerini left a comment

@aperezhortal I like the solution and I think we should go ahead and deploy it. Using GitHub Actions is a very elegant solution to remove some load from the RTD build.

However, this only captures execution errors and ignores possible errors in the rendered images. Maybe this is not the ultimate solution, but it is a good start.

I think this is fine; even now we can only check manually that the examples render correctly (and it has already proven useful to spot bugs). I'd say this is one part of our QA (visual assessment) that is going to be difficult to automate.

I understand that not all details are perfectly solved yet, but as you say, this looks more than good enough for a first version. With time, I'm convinced we'll be able to improve it and fine-tune it.

Well done!
