A normalizing flow library for Julia.
The purpose of this package is to provide a simple and flexible interface for variational inference (VI) and normalizing flows (NF) for Bayesian computation or generative modeling. The key focus is to ensure modularity and extensibility, so that users can easily construct (e.g., define customized flow layers) and combine various components (e.g., choose different VI objectives or gradient estimates) for variational approximation of general target distributions, without being tied to specific probabilistic programming frameworks or applications.
See the documentation for more.
To install the package, run the following command in the Julia REPL:

```julia
] # enter Pkg mode
(@v1.9) pkg> add git@github.com:TuringLang/NormalizingFlows.jl.git
```

Then simply run the following command to use the package:

```julia
using NormalizingFlows
```
Normalizing flows transform a simple reference distribution $q_0$ (sometimes referred to as the base distribution) into a complex distribution $q$ using invertible functions.

In more detail, given the base distribution, usually a standard Gaussian distribution, i.e., $q_0 = \mathcal{N}(0, I)$, we apply a series of parameterized invertible transformations (called flow layers), $T_{1,\theta_1}, \dots, T_{N,\theta_N}$, yielding

```math
Z_N = T_{N, \theta_N} \circ \cdots \circ T_{1, \theta_1} (Z_0), \qquad Z_0 \sim q_0, \qquad Z_N \sim q_\theta,
```

where $\theta = (\theta_1, \dots, \theta_N)$ are the parameters to be learned, and $q_\theta$ is the resulting flow distribution. Sampling from $q_\theta$ amounts to drawing $Z_0 \sim q_0$ and sending it through the forward pass of the flow layers.
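For concreteness, here is a minimal sketch of the sampling procedure built on Bijectors.jl (listed under related packages below). The two-dimensional standard Gaussian base, the `PlanarLayer` layers, and the number of layers are illustrative assumptions, not a prescription of this package's API:

```julia
using Bijectors, Distributions, LinearAlgebra

# Base distribution q₀ = N(0, I) in two dimensions (illustrative choice).
q0 = MvNormal(zeros(2), I)

# Compose N = 3 parameterized invertible layers: T₃ ∘ T₂ ∘ T₁.
layers = reduce(∘, [PlanarLayer(2) for _ in 1:3])

# The flow distribution q_θ: the pushforward of q₀ through the layers.
flow = transformed(q0, layers)

# Sampling = forward pass: draw Z₀ ~ q₀ and push it through all layers.
x = rand(flow)
```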
Since all the transformations are invertible (technically diffeomorphic), we can evaluate the density of the flow distribution $q_\theta$ via the change-of-variables formula:

```math
q_\theta(x) = \frac{q_0\left(T_1^{-1} \circ \cdots \circ T_N^{-1}(x)\right)}{\prod_{n=1}^{N} J_n\left(T_n^{-1} \circ \cdots \circ T_N^{-1}(x)\right)}, \qquad J_n(x) = \left|\det \nabla_x T_n(x)\right|.
```

Here we drop the subscript $\theta_n$ from each $T_{n,\theta_n}$ for notational brevity. Evaluating the density of a flow distribution therefore requires computing the inverse and the Jacobian determinant of each flow layer.
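Continuing the same hypothetical setup, `logpdf` on the transformed distribution performs exactly this change-of-variables computation, inverting each layer and accumulating the log-Jacobian corrections:

```julia
using Bijectors, Distributions, LinearAlgebra

q0 = MvNormal(zeros(2), I)
layers = reduce(∘, [PlanarLayer(2) for _ in 1:3])
flow = transformed(q0, layers)

# Inverse pass: logpdf inverts each layer and accumulates log|det ∇Tₙ|.
x = rand(flow)
logq_x = logpdf(flow, x)

# The forward-pass counterpart: the log-density of a pushed-forward draw
# is the base log-density minus the accumulated log-Jacobian.
z0 = rand(q0)
z, logjac = with_logabsdet_jacobian(layers, z0)
logq_z = logpdf(q0, z0) - logjac   # equals logpdf(flow, z) up to numerics
```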
Given the feasibility of i.i.d. sampling and density evaluation, normalizing flows can be trained by minimizing some statistical distance to the target distribution $p$. The typical choice is the Kullback-Leibler (KL) divergence, which leads to the following two optimization problems:

```math
\text{Reverse KL:} \quad \arg\min_{\theta} \; \mathbb{E}_{q_\theta}\left[\log q_\theta(Z) - \log p(Z)\right]
```

and

```math
\text{Forward KL:} \quad \arg\min_{\theta} \; \mathbb{E}_{p}\left[\log p(Z) - \log q_\theta(Z)\right] = \arg\max_{\theta} \; \mathbb{E}_{p}\left[\log q_\theta(Z)\right].
```
Both problems can be solved via standard stochastic optimization algorithms, such as stochastic gradient descent (SGD) and its variants.
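As an illustration, the following self-contained sketch trains the same hypothetical planar flow by stochastic gradient descent on a Monte Carlo estimate of the reverse KL, using Zygote.jl for gradients and Optimisers.jl (listed below) for the update rule. The target `logp`, the helper names (`reverse_kl`, `train`), and all hyperparameters are made-up placeholders; the package itself provides higher-level training utilities (see the documentation):

```julia
using Bijectors, Distributions, LinearAlgebra, Optimisers, Zygote
using Statistics: mean

q0 = MvNormal(zeros(2), I)
layers = reduce(∘, [PlanarLayer(2) for _ in 1:3])

# Hypothetical unnormalized target log-density: a Gaussian centered at (2, 2).
logp(x) = -0.5 * sum(abs2, x .- 2)

# Flatten the flow parameters into a vector θ for the optimiser.
θ, restructure = Optimisers.destructure(layers)

# Monte Carlo estimate of E_{q_θ}[log q_θ(Z) - log p(Z)] via the
# reparametrization trick: Z₀ ~ q₀ is θ-free, θ enters through the layers.
function reverse_kl(θ, z0s)
    T = restructure(θ)
    mean(eachcol(z0s)) do z0
        z, logjac = with_logabsdet_jacobian(T, z0)
        (logpdf(q0, z0) - logjac) - logp(z)
    end
end

function train(θ, state; iters=1_000, batch=16)
    for _ in 1:iters
        z0s = rand(q0, batch)                    # fresh base draws each step
        g = first(Zygote.gradient(t -> reverse_kl(t, z0s), θ))
        state, θ = Optimisers.update(state, θ, g)
    end
    return θ
end

θ = train(θ, Optimisers.setup(Optimisers.Adam(1e-2), θ))
flow_trained = transformed(q0, restructure(θ))
```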
Reverse KL minimization is typically used for Bayesian computation, where one wants to approximate a posterior distribution $p$ whose density is known only up to a normalizing constant; forward KL minimization (equivalently, maximum likelihood estimation) is typically used for generative modeling, where one has access to i.i.d. samples from $p$ but not to its density.
The development roadmap includes:

- general interface development
- documentation
- more NF examples/tutorials
  - WIP: PR#11
- GPU compatibility
  - WIP: PR#25
- benchmarking
Related packages:

- Bijectors.jl: a package for defining bijective transformations, which can be used for defining customized flow layers.
- Flux.jl: a deep learning library for Julia, useful for building neural-network-based flow layers.
- Optimisers.jl: a package of gradient-based optimization rules (e.g., Adam) for training flows.
- AdvancedVI.jl: a package of variational inference algorithms for Julia.