docs: improve website organization #1147

Merged: 25 commits, Aug 13, 2024

Commits (25)
- `8a77ffa` DOC improve organisation (tomMoral, Apr 22, 2024)
- `14dbf9c` FIX pre-commit hooks (tomMoral, Apr 22, 2024)
- `89b45a0` add index for tutorials (tomMoral, Apr 22, 2024)
- `8818d70` refactor new landing page. (janfb, May 7, 2024)
- `867355a` update docs dependencies (janfb, May 7, 2024)
- `b20136f` wip: replace nav-bar dropdown with index. (janfb, Jun 3, 2024)
- `5891f82` shorter snippet, add publications to main page. (janfb, Jun 11, 2024)
- `4552d15` show faq as ordered list, format install.md. (janfb, Jun 11, 2024)
- `b19e4d2` join tutorial instructions from README and tutorials/index. (janfb, Jun 11, 2024)
- `d388422` shorten snippet, remove navigation bar dropdown details (janfb, Jun 11, 2024)
- `ca32ce1` fix snippet (janfb, Jun 11, 2024)
- `8d93d5e` fix: links and md headings. (janfb, Jun 20, 2024)
- `c7480c1` FIX small changes in index.md (tomMoral, Jul 30, 2024)
- `e5d7209` DOC expose doc on PR (tomMoral, Jul 30, 2024)
- `2f63976` FIX linter (tomMoral, Jul 30, 2024)
- `d2a7b28` FIX workflow target (tomMoral, Jul 30, 2024)
- `d1c555d` FIX check doc workflow (tomMoral, Jul 30, 2024)
- `2f30d43` Merge branch 'main' into DOC_improve_doc (janfb, Aug 6, 2024)
- `b732f6c` refactor: improve landing page and credits; update methods (janfb, Aug 6, 2024)
- `7f22346` docs: change gh action to convert nbs and deploy docs upon release (janfb, Aug 7, 2024)
- `1a17846` CLN remove mkdocs-jupyter pluggin (tomMoral, Aug 7, 2024)
- `1393d8a` DOC remove mkdocs-jupyter+add doc version control (tomMoral, Aug 7, 2024)
- `d4675c6` MTN update .gitignore (tomMoral, Aug 7, 2024)
- `9af6295` fix: griffe warnings about .md links; refactoring text. (janfb, Aug 8, 2024)
- `988afa4` fix: configure gh user in action for pushing to gh-pages (janfb, Aug 8, 2024)
58 changes: 58 additions & 0 deletions .github/workflows/build_docs.yml
@@ -0,0 +1,58 @@
name: "Build docs and deploy"
on:
push:
branches:
- main
release:
types: [ published ]

jobs:
docs:
name: Build Documentation
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
lfs: false

- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.10'

- name: Cache dependency
id: cache-dependencies
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip

- name: Install sbi and dependencies
run: |
python -m pip install --upgrade pip
python -m pip install .[doc]

- name: convert notebooks to markdown
run: |
cd docs
jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/

- name: Configure Git user for bot
run: |
git config --local user.email "github-actions[bot]@users.noreply.github.com"
git config --local user.name "github-actions[bot]"

- name: Build and deploy dev documentation upon push to main
if: ${{ github.event_name == 'push' }}
run: |
cd docs
mike deploy dev --push

- name: Build and deploy the lastest documentation upon new release
if: ${{ github.event_name == 'release' }}
run: |
cd docs
mike deploy ${{ github.event.release.name }} latest -u --push
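
A rough local equivalent of this workflow is sketched below, assuming the `doc` extra (which, as the workflow suggests, provides `jupyter`, `mkdocs`, and `mike`) has been installed with `pip install -e ".[doc]"` and that you start from the repository root:

```
# Sketch: approximate the docs build locally before pushing.
cd docs
# Convert the notebooks to markdown, mirroring the "convert notebooks to markdown" step.
jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/
# Build a "dev" version into the local gh-pages branch (no --push), then inspect it.
mike deploy dev
mike list     # list the deployed versions and aliases
mike serve    # serve the versioned site locally (default: http://localhost:8000)
```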
2 changes: 0 additions & 2 deletions .gitignore
@@ -1,8 +1,6 @@
# Project specific
.sbi_env/
*sbi-logs/
/docs/docs/tutorial/*
/docs/docs/examples/*
/docs/site/*

# Development files and python cache
48 changes: 26 additions & 22 deletions docs/docs/contribute.md
@@ -38,12 +38,13 @@ the end of every year. Additionally, we mention all contributors in the releases
propose and then working on your pull request after getting some feedback from
others.

### How to contribute
### Contribution workflow

The following steps describe all parts of the workflow for doing a contribution such as
installing locally `sbi` from source, creating a `conda` environment, setting up
your `git` repository, etc. We've taken strong inspiration from the guides for
contribution of [`scikit-learn`](https://scikit-learn.org/stable/developers/contributing.html)
The following steps describe all parts of the workflow for doing a contribution
such as installing locally `sbi` from source, creating a `conda` environment,
setting up your `git` repository, etc. We've taken strong inspiration from the
contribution guides of
[`scikit-learn`](https://scikit-learn.org/stable/developers/contributing.html)
and [`mne`](https://mne.tools/stable/development/contributing.html):

**Step 1**: [Create an account](https://github.com/) on GitHub if you do not
@@ -178,19 +179,22 @@ to also run them without `-n auto`.
When you create a PR onto `main`, our Continuous Integration (CI) actions on
GitHub will perform the following checks:

- **`ruff`** for linting and formatting (including `black`, `isort`, and `flake8`)
- **[`ruff`](https://docs.astral.sh/ruff/formatter/)** for linting and formatting
(including `black`, `isort`, and `flake8`)
- **[`pyright`](https://github.com/Microsoft/pyright)** for static type checking.
- **`pytest`** for running a subset of fast tests from our test suite.
- **[`pytest`](https://docs.pytest.org/en/stable/index.html)** for running a subset of
fast tests from our test suite.

If any of these fail, try reproducing and solving the error locally:

- **`ruff`**: Make sure you have `pre-commit` installed locally with the same
version as specified in the [requirements](pyproject.toml). Execute it
using `pre-commit run --all-files`. `ruff` tends to give informative error
messages that help you fix the problem. Note that pre-commit only detects
problems with `ruff` linting and formatting, but does not fix them. You can
fix them either by running `ruff check . --fix(linting)`, followed by
`ruff format . --fix(formatting)`, or by hand.
- **`ruff`**: Make sure you have `pre-commit` installed locally with the same version as
specified in the
[`pyproject.toml`](https://github.com/sbi-dev/sbi/blob/main/pyproject.toml). Execute it
using `pre-commit run --all-files`. `ruff` tends to give informative error messages
that help you fix the problem. Note that pre-commit only detects problems with `ruff`
linting and formatting, but does not fix them. You can fix them either by running
`ruff check . --fix` (linting), followed by `ruff format .` (formatting), or by
hand.
- **`pyright`**: Run it locally using `pyright sbi/` and ensure you are using
the same
`pyright` version as used in the CI (which is the case if you have installed
@@ -209,17 +213,17 @@ fails (xfailed).
## Contributing to the documentation
Most of the documentation for `sbi` is written in markdown and the website is
generated using `mkdocs` with `mkdocstrings`. To work on improvements of the
documentation, you should first run the command on your terminal
documentation, you should first install the `doc` dependencies:
```
pip install -e ".[doc]"
```
Then, you can run the command on your terminal
```
cd docs
jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/
mkdocs serve
```
and open a browser on the page proposed by `mkdocs`. Now, whenever you
make changes to the markdown files of the documentation, you can see the results
almost immediately in the browser.

Note that the tutorials and examples are initially written in jupyter notebooks
and then converted to markdown programatically. To do so locally, you should run
```
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorial/
jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
```
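
The CI checks described in this contribution guide can be reproduced locally with roughly the following commands. This is a sketch, not the exact CI configuration; it assumes `pre-commit`, `ruff`, `pyright`, and `pytest` (with `pytest-xdist` for `-n auto`) are installed in the development environment:

```
# Sketch: reproduce the PR checks locally before pushing.
pre-commit run --all-files   # runs the configured ruff linting/formatting hooks
ruff check . --fix           # apply automatic lint fixes
ruff format .                # apply formatting
pyright sbi/                 # static type checking, as in CI
pytest -n auto               # run tests in parallel; CI only runs a fast subset
```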
39 changes: 14 additions & 25 deletions docs/docs/credits.md
@@ -8,7 +8,8 @@ direct contributions to the codebase have been instrumental in the development o

## License

`sbi` is licensed under the [Apache License (Apache-2.0)](https://www.apache.org/licenses/LICENSE-2.0) and
`sbi` is licensed under the [Apache License
(Apache-2.0)](https://www.apache.org/licenses/LICENSE-2.0) and

> Copyright (C) 2020 Álvaro Tejero-Cantero, Jakob H. Macke, Jan-Matthis Lückmann,
> Michael Deistler, Jan F. Bölts.
@@ -20,35 +21,23 @@ direct contributions to the codebase have been instrumental in the development o
## Support

`sbi` has been supported by the German Federal Ministry of Education and Research (BMBF)
through the project ADIMEM (FKZ 01IS18052 A-D).
[ADIMEM](https://fit.uni-tuebingen.de/Project/Details?id=9199) is a collaborative
project between the groups of Jakob Macke (Uni Tübingen), Philipp Berens (Uni Tübingen),
Philipp Hennig (Uni Tübingen), and Marcel Oberlaender (caesar Bonn), which aims to develop
inference methods for mechanistic models.
through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A) and the
Tübingen AI Center (FKZ 01IS18039A). Since 2024, `sbi` has been supported by the
appliedAI Institute for Europe gGmbH.

![](static/logo_bmbf.svg)

## Important dependencies and prior art

* `sbi` is the successor to [`delfi`](https://github.com/mackelab/delfi), a Theano-based
toolbox for sequential neural posterior estimation developed at [mackelab](https://uni-tuebingen.de/en/research/core-research/cluster-of-excellence-machine-learning/research/research/cluster-research-groups/professorships/machine-learning-in-science/). If you were
using `delfi`, we strongly recommend to move your inference over to `sbi`. Please open
issues if you find unexpected behaviour or missing features. We will consider these
bugs and give them priority.

* `sbi` as a PyTorch-based toolbox started as a fork of
- `sbi` is the successor to [`delfi`](https://github.com/mackelab/delfi), a Theano-based
toolbox for sequential neural posterior estimation developed at
[mackelab](https://www.mackelab.org). If you were using `delfi`, we strongly recommend
moving your inference over to `sbi`. Please open issues if you find unexpected
behavior or missing features. We will consider these bugs and give them priority.
- `sbi` as a PyTorch-based toolbox started as a fork of
[conormdurkan/lfi](https://github.com/conormdurkan/lfi), by [Conor
M.Durkan](https://conormdurkan.github.io/).

* `sbi` uses density estimators from
[bayesiains/nflows](https://github.com/bayesiains/nsf) by [Conor
M.Durkan](https://conormdurkan.github.io/), [George
Papamakarios](https://gpapamak.github.io/) and [Artur
Bekasov](https://arturbekasov.github.io/). These are proxied through
[`pyknos`](https://github.com/mackelab/pyknos), a package focused on density estimation.

* `sbi` uses `PyTorch` and tries to align with the interfaces (e.g. for probability
- `sbi` uses `PyTorch` and tries to align with the interfaces (e.g. for probability
distributions) adopted by `PyTorch`.

* See [README.md](https://github.com/mackelab/sbi/blob/master/README.md) for a list of
publications describing the methods implemented in `sbi`.
- See [README.md](https://github.com/mackelab/sbi/blob/master/README.md) for a
list of publications describing the methods implemented in `sbi`.
2 changes: 2 additions & 0 deletions docs/docs/examples/.gitignore
@@ -0,0 +1,2 @@
*.md
*.png
24 changes: 11 additions & 13 deletions docs/docs/faq.md
@@ -1,15 +1,13 @@
# Frequently asked questions

[Can the algorithms deal with invalid data, e.g., NaN or inf?](faq/question_02_nans.md)

[What should I do when my 'posterior samples are outside of the prior support' in SNPE?](faq/question_01_leakage.md)

[When using multiple workers, I get a pickling error. Can I still use multiprocessing?](faq/question_03_pickling_error.md)

[Can I use the GPU for training the density estimator?](faq/question_04_gpu.md)

[How should I save and load objects in `sbi`?](faq/question_05_pickling.md)

[Can I stop neural network training and resume it later?](faq/question_06_resume_training.md)

[How can I use a prior that is not defined in PyTorch?](faq/question_07_custom_prior.md)
1. [What should I do when my 'posterior samples are outside of the prior support' in SNPE?](faq/question_01_leakage.md)
2. [Can the algorithms deal with invalid data, e.g., NaN or inf?](faq/question_02_nans.md)
3. [When using multiple workers, I get a pickling error. Can I still use multiprocessing?](faq/question_03_pickling_error.md)
4. [Can I use the GPU for training the density estimator?](faq/question_04_gpu.md)
5. [How should I save and load objects in `sbi`?](faq/question_05_pickling.md)
6. [Can I stop neural network training and resume it later?](faq/question_06_resume_training.md)
7. [How can I use a prior that is not defined in PyTorch?](faq/question_07_custom_prior.md)

See also [discussion page](https://github.com/sbi-dev/sbi/discussions) and [issue
tracker](https://github.com/sbi-dev/sbi/issues) on the `sbi` GitHub repository for
recent questions and problems.