Change README.md to reflect current status of the project (pyro-ppl#173)
* Update README.md
* Update README.md
* Update README.md
1 parent 89cbdd3 · commit e6b9e50 · 1 changed file with 53 additions and 21 deletions.
@@ -1,28 +1,60 @@
# NumPyro

[](https://travis-ci.com/pyro-ppl/numpyro)
[](https://numpyro.readthedocs.io/en/latest/?badge=latest)

[Pyro](https://github.com/pyro-ppl/pyro) on Numpy. This uses [JAX](https://github.com/google/jax) for autograd and JIT support. This is an early stage experimental library that is under active development, and there are likely to be many changes to the API and internal classes, as the design evolves.

Probabilistic programming with Numpy powered by [JAX](https://github.com/google/jax) for autograd and JIT compilation to GPU/CPU.

## Design Goals

## What is NumPyro?
- **Lightweight** - We do not intend to reimplement any heavy inference machinery from Pyro, but would like to provide a flexible substrate that can be built upon. We will provide support for Pyro primitives like `sample` and `param` which can be interpreted with side-effects using effect handlers. Users should be able to extend this to implement custom inference algorithms, and write their models using the familiar Numpy API.
- **Functional** - The API for the inference algorithms and other utility functions may deviate from Pyro in favor of a more *functional* style that works better with JAX. e.g. no global param store or random state.
- **Fast** - Using JAX, we aim to aggressively JIT compile intermediate computations to XLA optimized kernels. We will evaluate JIT compilation, and benchmark runtime for Hamiltonian Monte Carlo.
## Longer-term Plans

NumPyro is a small probabilistic programming library built on [JAX](https://github.com/google/jax). It essentially provides a NumPy backend for [Pyro](https://github.com/pyro-ppl/pyro), with some minor changes to the inference API and syntax. Since we use JAX, we get autograd and JIT compilation to GPU / CPU for free. This is an alpha release, and the API is subject to change as the design evolves.

NumPyro is designed to be *lightweight* and focuses on providing a flexible substrate that users can build on:

- **Pyro Primitives:** NumPyro programs can contain regular Python and NumPy code, in addition to [Pyro primitives](http://pyro.ai/examples/intro_part_i.html) like `sample` and `param`. The model code should look very similar to Pyro except for some minor differences between PyTorch and Numpy's API. See [Examples](https://github.com/pyro-ppl/numpyro/#Examples), as well as the sketches just after this list.
- **Inference algorithms:** NumPyro currently supports Hamiltonian Monte Carlo, including an implementation of the No-U-Turn Sampler (NUTS). One of the motivations for NumPyro was to speed up Hamiltonian Monte Carlo by JIT compiling the Verlet integration step, which includes multiple gradient computations. With JAX, we can compose `jit` and `grad` to compile the entire integration step into an XLA-optimized kernel. We also eliminate Python overhead by JIT compiling the entire tree-building stage in NUTS (this is possible using [Iterative NUTS](https://github.com/pyro-ppl/numpyro/wiki/Iterative-NUTS)). There is also a basic Variational Inference implementation for reparameterized distributions.
- **Distributions:** The [numpyro.distributions](https://numpyro.readthedocs.io/en/latest/distributions.html) module provides distribution classes, constraints and bijective transforms. The distribution classes wrap samplers implemented to work with JAX's [functional pseudo-random number generator](https://github.com/google/jax#random-numbers-are-different). The design of the distributions module largely follows that of [PyTorch](https://pytorch.org/docs/stable/distributions.html). A major subset of the API is implemented, and it contains most of the common distributions that exist in PyTorch. As a result, Pyro and PyTorch users can rely on the same API and batching semantics as in `torch.distributions`. In addition to distributions, `constraints` and `transforms` are very useful when operating on distribution classes with bounded support.
- **Effect handlers:** Like Pyro, primitives like `sample` and `param` can be interpreted with side effects using effect handlers from the [numpyro.handlers](https://numpyro.readthedocs.io/en/latest/handlers.html) module, and these can be easily extended to implement custom inference algorithms and inference utilities.
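
As a rough illustration of the primitives and MCMC pieces described above, here is a minimal sketch of defining a model with `sample` and running NUTS. It uses the `numpyro.infer.MCMC`/`NUTS` interface from later NumPyro releases, so the import paths may differ from the alpha API this README snapshot describes; the toy model and data are invented for illustration.

```python
# Minimal sketch (not from this README): a toy Normal model fit with NUTS.
# Import paths follow later NumPyro releases and may differ from the alpha API.
import jax.numpy as jnp
from jax import random

import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS


def model(data):
    # Priors on the latent mean and scale.
    mu = numpyro.sample("mu", dist.Normal(0.0, 10.0))
    sigma = numpyro.sample("sigma", dist.Exponential(1.0))
    # Likelihood: condition on observed data via the `obs` keyword.
    numpyro.sample("obs", dist.Normal(mu, sigma), obs=data)


data = jnp.array([1.2, 0.8, 1.5, 0.9, 1.1])
mcmc = MCMC(NUTS(model), num_warmup=500, num_samples=1000)
mcmc.run(random.PRNGKey(0), data)  # the PRNG key is passed in explicitly
mcmc.print_summary()
posterior = mcmc.get_samples()     # dict of arrays keyed by site name
```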
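
The distributions and effect-handler bullets can be sketched in the same hedged spirit: distributions draw samples from an explicit JAX PRNG key, and `handlers.seed`/`handlers.trace` expose a model's sample sites. Again, module paths follow released NumPyro versions and the tiny model here is made up.

```python
# Sketch: explicit-PRNG-key sampling and effect handlers (seed + trace).
# Module paths follow released NumPyro versions; the model is illustrative only.
from jax import random

import numpyro
import numpyro.distributions as dist
from numpyro import handlers

# Distributions take an explicit key; there is no global random state.
key = random.PRNGKey(0)
normal = dist.Normal(loc=0.0, scale=1.0)
draws = normal.sample(key, sample_shape=(1000,))
log_probs = normal.log_prob(draws)


def model():
    x = numpyro.sample("x", dist.Normal(0.0, 1.0))
    numpyro.sample("y", dist.Normal(x, 1.0))


# `seed` supplies randomness to `sample` sites; `trace` records each site.
seeded_model = handlers.seed(model, random.PRNGKey(1))
trace = handlers.trace(seeded_model).get_trace()
for name, site in trace.items():
    print(name, site["value"])
```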

## Installation

To install NumPyro with a CPU version of JAX, you can use pip:

```
pip install numpyro
```

To use NumPyro on the GPU, you will need to first [install](https://github.com/google/jax#installation) `jax` and `jaxlib` with CUDA support.

You can also install NumPyro from source:

```
git clone https://github.com/pyro-ppl/numpyro.git
# install jax/jaxlib first for CUDA support
cd numpyro
pip install -e .[dev]
```
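
After either install path, a quick way to confirm what you ended up with is to check which devices JAX sees; this is plain JAX, not a NumPyro-specific command.

```python
# Sanity check after installation: which backend did JAX pick up?
import jax
import numpyro

print(numpyro.__version__)
print(jax.devices())  # CPU devices, or GPU devices if jaxlib was built with CUDA
```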

## Examples

For some examples of specifying models and doing inference in NumPyro, see:

- [Bayesian Regression in Numpyro](https://nbviewer.jupyter.org/github/pyro-ppl/numpyro/blob/master/notebooks/bayesian_regression.ipynb) - Start here to get acquainted with writing a simple model in NumPyro, the MCMC inference API, effect handlers, and writing custom inference utilities.
- [Time Series Forecasting](https://nbviewer.jupyter.org/github/pyro-ppl/numpyro/blob/master/notebooks/time_series_forecasting.ipynb) - Illustrates how to convert `for` loops in the model to JAX's `lax.scan` primitive for fast inference (a rough standalone sketch follows this list).
- [Baseball example](https://github.com/pyro-ppl/numpyro/blob/master/examples/baseball.py) - Uses NUTS for a simple hierarchical model. Compare this with the baseball example in [Pyro](https://github.com/pyro-ppl/pyro/blob/dev/examples/baseball.py).
- [Hidden Markov Model](https://github.com/pyro-ppl/numpyro/blob/master/examples/hmm.py) in NumPyro as compared to [Stan](https://mc-stan.org/docs/2_19/stan-users-guide/hmms-section.html).
- [Variational Autoencoder](https://github.com/pyro-ppl/numpyro/blob/master/examples/vae.py) - A simple example that uses Variational Inference. See the [Pyro implementation](https://github.com/pyro-ppl/pyro/blob/dev/examples/vae/vae.py) for comparison.
- Other model examples can be found in the [examples](https://github.com/pyro-ppl/numpyro/tree/master/examples) folder.
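
As a rough standalone illustration of the `lax.scan` idea mentioned in the time series bullet above (this is not code from that notebook), here is a Python `for` loop over an AR(1)-style recursion next to its `scan` equivalent:

```python
# Rough illustration of replacing a Python for loop with jax.lax.scan.
# The AR(1) recursion and data below are invented for this sketch.
import jax.numpy as jnp
from jax import lax


def ar1_for_loop(phi, noise):
    ys, y = [], 0.0
    for eps in noise:
        y = phi * y + eps      # y_t = phi * y_{t-1} + eps_t
        ys.append(y)
    return jnp.stack(ys)


def ar1_scan(phi, noise):
    def step(y_prev, eps):
        y = phi * y_prev + eps
        return y, y            # (carry for next step, output for this step)

    _, ys = lax.scan(step, jnp.zeros(()), noise)
    return ys


noise = jnp.array([0.1, -0.2, 0.05, 0.3])
print(jnp.allclose(ar1_for_loop(0.9, noise), ar1_scan(0.9, noise)))  # True
```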

By design, the API for model specification is largely the same as Pyro's, including the distributions API. The interface for inference algorithms and other utility functions might deviate from Pyro in favor of a more *functional* style that works better with JAX, e.g. there is no global parameter store or random state.
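
To make the functional point concrete, here is a small sketch (not taken from the NumPyro docs) of a hand-written log joint density built from `numpyro.distributions`, then differentiated and JIT-compiled with plain JAX transforms; everything is an explicit function of its inputs.

```python
# Sketch of the functional style: a log joint density as a pure function,
# composed with jax.grad and jax.jit. No global parameter store or random state.
import jax
import jax.numpy as jnp

import numpyro.distributions as dist


def log_joint(mu, data):
    log_prior = dist.Normal(0.0, 10.0).log_prob(mu)
    log_lik = dist.Normal(mu, 1.0).log_prob(data).sum()
    return log_prior + log_lik


data = jnp.array([1.2, 0.8, 1.5])
grad_fn = jax.jit(jax.grad(log_joint))  # gradient w.r.t. mu, compiled by XLA
print(grad_fn(0.0, data))
```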

## Future Work

In the near term, we plan to work on the following. Please open new issues for feature requests and enhancements:

- More inference algorithms, particularly those that require second-order derivatives or use HMC.
- Integration with [Funsor](https://github.com/pyro-ppl/funsor) to support inference algorithms with delayed sampling.
- Supporting more distributions, extending the distributions API, and adding more samplers to JAX.
- Other areas motivated by Pyro's research goals and application focus, and interest from the community.

It is possible that much of this code will end up being absorbed into the Pyro project itself as an alternate Numpy backend.