Soss


Soss is a library for probabilistic programming.

Let's look at an example. First we'll load things:

using MeasureTheory
using Soss

MeasureTheory.jl is designed specifically with PPLs like Soss in mind, though you can also use Distributions.jl.
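
As a quick aside, MeasureTheory measures support named parameterizations and can be used directly for sampling and log-density evaluation. A minimal sketch (the variable d and the calls below are illustrative, not part of the example that follows):

d = Normal(μ = 0.0, σ = 2.0)   # explicitly named mean and standard deviation

rand(d)                # draw a sample
logdensityof(d, 1.0)   # evaluate the log-density at 1.0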

Now for a model. Here's a linear regression:

m = @model x begin
    α ~ Lebesgue(ℝ)
    β ~ Normal()
    σ ~ Exponential()
    y ~ For(x) do xj
        Normal(α + β * xj, σ)
    end
    return y
end

Next we'll generate some fake data to work with. For x-values, let's use

x = randn(20)

Now, loosely speaking, Lebesgue(ℝ) is a uniform measure over the real numbers, so we can't really sample from it. Instead, let's transform the model and make α an argument:

julia> predα = predictive(m, :α)
@model (x, α) begin
        σ ~ Exponential()
        β ~ Normal()
        y ~ For(x) do xj
                Normal(α + β * xj, σ)
            end
        return y
    end

Now we can do

julia> y = rand(predα(x=x,α=10.0))
20-element Vector{Float64}:
 10.554133456468438
  9.378065258831002
 12.873667041657287
  8.940799408080496
 10.737189595204965
  9.500536439014208
 11.327606120726893
 10.899892855024445
 10.18488773139243
 10.386969795947177
 10.382195272387214
  8.358407507910297
 10.727173015711768
 10.452311211064654
 11.076232496702387
 11.362009520020141
  9.539433052406448
 10.61851691333643
 11.586170856832645
  9.197496058151618
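
As a rough sanity check (not part of the original example), the sample mean of y should land near the α = 10.0 we passed in, since the x-values are roughly centered at zero:

using Statistics
mean(y)   # should be close to 10 for this simulated data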

Now for inference! Let's use DynamicHMC, which we have wrapped in SampleChainsDynamicHMC.

julia> using SampleChainsDynamicHMC
[ Info: Precompiling SampleChainsDynamicHMC [6d9fd711-e8b2-4778-9c70-c1dfb499d4c4]

julia> post = sample(DynamicHMCChain, m(x=x) | (y=y,))
4000-element MultiChain with 4 chains and schema (σ = Float64, β = Float64, α = Float64)
(σ = 1.0±0.15, β = 0.503±0.26, α = 10.2±0.25)
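
From here the posterior samples can be used directly. A minimal sketch, assuming (as SampleChains chains generally allow) that each parameter can be read off the chain as a property:

using Statistics
mean(post.α), std(post.α)   # posterior mean and spread of the intercept
mean(post.β .> 0)           # posterior probability that the slope is positive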

How is Soss different from Turing?

First, a fine point: When people say "the Turing PPL" they usually mean what's technically called "DynamicPPL".

  • In Soss, models are first class and can be composed or nested. For example, you can define a model and later nest it inside another model, and inference will handle both together (see the sketch after this list). DynamicPPL can also handle nested models (see this PR), though I'm not aware of a way to combine independently-defined DynamicPPL models for a single inference pass.
  • Soss has been updated to use MeasureTheory.jl, though everything from Distributions.jl is still available.
  • Soss allows model transformations. These can be used, for example, to express a predictive distribution or a Markov blanket as a new model.
  • Most of the focus of Soss is at the syntactic level; inference works in terms of "primitives" that transform the model's abstract syntax tree (AST) into new code. This brings the same kinds of benefits that Julia's macros and generated functions offer over higher-order functions alone.
  • Soss can evaluate log-densities symbolically, which can then be used to produce optimized evaluations for much faster inference. This capability is still at a relatively early stage and will become more robust as development continues.
  • The Soss team is much smaller than that of DynamicPPL, but I hope that will change (contributors welcome!).
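
To make the first point above concrete, here is a minimal sketch of nesting one model inside another. The names (prior, lines) and the composition pattern are illustrative, not taken from the example above:

prior = @model begin
    α ~ Normal()
    σ ~ Exponential()
end

lines = @model x begin
    pars ~ prior()          # nest the prior model; pars collects its variables
    y ~ For(x) do xj
        Normal(pars.α, pars.σ)
    end
    return y
end

rand(lines(x = randn(5)))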

Soss and DynamicPPL are both maturing and becoming more complete, so the above will change over time. It's also worth noting that we (the Turing team and I) hope to move toward a natural way of using these systems together to arrive at the best of both.

How can I get involved?

I'm glad you asked! There are lots of ways to contribute.

For more details, please see the documentation.

