diff --git a/previews/PR510/.documenter-siteinfo.json b/previews/PR510/.documenter-siteinfo.json index 29d2410d7..2a6e248c5 100644 --- a/previews/PR510/.documenter-siteinfo.json +++ b/previews/PR510/.documenter-siteinfo.json @@ -1 +1 @@ -{"documenter":{"julia_version":"1.11.0","generation_timestamp":"2024-10-23T13:17:39","documenter_version":"1.7.0"}} \ No newline at end of file +{"documenter":{"julia_version":"1.11.0","generation_timestamp":"2024-10-24T18:15:27","documenter_version":"1.7.0"}} \ No newline at end of file diff --git a/previews/PR510/developer/checklist/index.html b/previews/PR510/developer/checklist/index.html index 3d1f87bec..1b5231f4c 100644 --- a/previews/PR510/developer/checklist/index.html +++ b/previews/PR510/developer/checklist/index.html @@ -20,4 +20,4 @@ Either of those should automatically publish a new version to the Julia registry. - Once registered, the `TagBot.yml` workflow should create a tag, and rebuild the documentation for this tag. - - These steps can take quite a bit of time (1 hour or more), so don't be surprised if the new documentation takes a while to appear. + - These steps can take quite a bit of time (1 hour or more), so don't be surprised if the new documentation takes a while to appear. diff --git a/previews/PR510/developer/contributing/index.html b/previews/PR510/developer/contributing/index.html index 860e5a06e..1611e2c4d 100644 --- a/previews/PR510/developer/contributing/index.html +++ b/previews/PR510/developer/contributing/index.html @@ -1,5 +1,5 @@ -Contributing · EpiAware.jl

Contributing

This page details some of the guidelines that should be followed when contributing to this package. It is adapted from Documenter.jl.

Branches

release-* branches are used for tagged minor versions of this package. This follows the same approach used in the main Julia repository, albeit on a much more modest scale.

Please open pull requests against the master branch rather than any of the release-* branches whenever possible.

Backports

Bug fixes are backported to the release-* branches using git cherry-pick -x by an EpiAware member and will become available in point releases of that particular minor version of the package.

Feel free to nominate commits that should be backported by opening an issue. Requests for new point releases to be tagged in METADATA.jl can also be made in the same way.

release-* branches

  • Each new minor version x.y.0 gets a branch called release-x.y (a protected branch).
  • New versions are usually tagged only from the release-x.y branches.
  • For patch releases, changes get backported to the release-x.y branch via a single PR with the standard name "Backports for x.y.z" and label "Type: Backport". The PR message links to all the PRs that are providing commits to the backport. The PR gets merged as a merge commit (i.e. not squashed).
  • The old release-* branches may be removed once they have outlived their usefulness.
  • Patch version milestones are used to keep track of which PRs get backported etc.

Style Guide

Follow the style of the surrounding text when making changes. When adding new features, please try to stick to the SciML style guide, which this project follows, whenever applicable.

Tests

Unit tests

As is conventional for Julia packages, unit tests are located at test/*.jl with the entrypoint test/runtests.jl.

End to end testing

Tests that build example package docs from source and inspect the results (end to end tests) are located in /test/examples. The main entry points are test/examples/make.jl for building and test/examples/test.jl for doing some basic checks on the generated outputs.

Pluto usage in showcase documentation

Some of the showcase examples in EpiAware/docs/src/showcase use Pluto.jl notebooks for the underlying computation. The output of the notebooks is rendered into HTML for inclusion in the documentation in two steps:

  1. PlutoStaticHTML.jl converts the notebook with output into a machine-readable .md format.
  2. Documenter.jl renders the .md file into HTML for inclusion in the documentation during the build process.

For other examples of using Pluto to generate documentation see the examples shown here.

Running Pluto notebooks from EpiAware locally

To run the Pluto.jl scripts in the EpiAware documentation directly from the source code, follow these steps:

  1. Install Pluto.jl locally. We recommend using the version of Pluto that is pinned in the Project.toml file defining the documentation environment.
  2. Clone the EpiAware repository.
  3. Start Pluto.jl either from the REPL (see the Pluto.jl documentation) or from the command line with the shell script EpiAware/docs/pluto-scripts.sh.
  4. From the Pluto.jl interface, navigate to the Pluto.jl script you want to run.

Contributing to Pluto notebooks in EpiAware documentation

Modifying an existing Pluto notebook

Committing changes to the Pluto.jl notebooks in the EpiAware documentation is the same as committing changes to any other part of the repository. However, please note that we expect the following conventions for the environment management of the notebooks:

  1. Use the environment determined by the Project.toml file in the EpiAware/docs directory. If you want extra packages, add them to this environment.
  2. Ensure that the version of EpiAware used in these notebooks is the version of EpiAware on the branch being pull-requested into main. To do this, use the Pkg.develop function.

To do this you can use the following code snippet in the Pluto notebook:

# Determine the relative path to the `EpiAware/docs` directory
+Contributing · EpiAware.jl

Contributing

This page details some of the guidelines that should be followed when contributing to this package. It is adapted from Documenter.jl.

Branches

release-* branches are used for tagged minor versions of this package. This follows the same approach used in the main Julia repository, albeit on a much more modest scale.

Please open pull requests against the master branch rather than any of the release-* branches whenever possible.

Backports

Bug fixes are backported to the release-* branches using git cherry-pick -x by an EpiAware member and will become available in point releases of that particular minor version of the package.

Feel free to nominate commits that should be backported by opening an issue. Requests for new point releases to be tagged in METADATA.jl can also be made in the same way.

release-* branches

  • Each new minor version x.y.0 gets a branch called release-x.y (a protected branch).
  • New versions are usually tagged only from the release-x.y branches.
  • For patch releases, changes get backported to the release-x.y branch via a single PR with the standard name "Backports for x.y.z" and label "Type: Backport". The PR message links to all the PRs that are providing commits to the backport. The PR gets merged as a merge commit (i.e. not squashed).
  • The old release-* branches may be removed once they have outlived their usefulness.
  • Patch version milestones are used to keep track of which PRs get backported etc.

Style Guide

Follow the style of the surrounding text when making changes. When adding new features, please try to stick to the SciML style guide, which this project follows, whenever applicable.

Tests

Unit tests

As is conventional for Julia packages, unit tests are located at test/*.jl with the entrypoint test/runtests.jl.

End to end testing

Tests that build example package docs from source and inspect the results (end to end tests) are located in /test/examples. The main entry points are test/examples/make.jl for building and test/examples/test.jl for doing some basic checks on the generated outputs.

Benchmarking

Benchmarking is orchestrated using PkgBenchmark.jl along with a GitHub action that uses BenchmarkCI.jl. The benchmarks are located in benchmarks/ and the main entry point is benchmarks/runbenchmarks.jl.

The main function in the benchmark environment is make_epiaware_suite, which calls TuringBenchmarking.make_turing_suite on a set of Turing models generated by EpiAware, benchmarking their sampling with the following autodiff backends (a sketch of composing such a suite is shown after the list):

  • ForwardDiff.jl.
  • ReverseDiff.jl: With compile = false.
  • ReverseDiff.jl: With compile = true.
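
A minimal sketch of how such a suite might be composed with TuringBenchmarking.jl is shown below. The toy model, the adbackends keyword, and the ADTypes backend constructors are assumptions based on TuringBenchmarking.jl's documented interface rather than the exact contents of make_epiaware_suite; see benchmarks/runbenchmarks.jl for the real implementation.

using BenchmarkTools, Turing, TuringBenchmarking, ADTypes

# Toy Turing model standing in for a model generated by EpiAware.
@model function toy_latent(y)
    σ ~ truncated(Normal(0, 1); lower = 0)
    y ~ Normal(0, σ)
end

# Benchmark log-density and gradient evaluation under the three AD backends
# listed above; `adbackends` is assumed to accept ADTypes backend objects.
suite = BenchmarkGroup()
suite["toy_latent"] = TuringBenchmarking.make_turing_suite(
    toy_latent(1.0);
    adbackends = [
        AutoForwardDiff(),
        AutoReverseDiff(; compile = false),
        AutoReverseDiff(; compile = true)
    ]
)
results = run(suite; verbose = true)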

Benchmarking "gotchas"

Models with no parameters

In EpiAware we do expose some models that do not have parameters. For example, Poisson sampling with a transformation on a fixed mean process, implemented by TransformObservationModel(NegativeBinomialError()), has no sampleable parameters (although it does contribute log-likelihood as part of a wider model). This causes TuringBenchmarking.make_turing_suite to throw an error, as it expects all models to have parameters.
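
One hedged workaround, sketched below, is to skip any model for which make_turing_suite throws rather than failing the whole benchmark build; the fixed_mean_counts model and the mdls collection are illustrative stand-ins, not EpiAware code.

using Turing, BenchmarkTools, TuringBenchmarking

# Illustrative model with no sampleable parameters: the only `~` statement is
# against the observed data, so nothing is left to sample.
@model function fixed_mean_counts(y)
    y ~ Poisson(10.0)
end

mdls = ["fixed_mean_counts" => fixed_mean_counts(3)]

suite = BenchmarkGroup()
for (name, mdl) in mdls
    try
        suite[name] = TuringBenchmarking.make_turing_suite(mdl)
    catch err
        # Parameter-free models make make_turing_suite throw; skip them.
        @warn "Skipping benchmark for model with no sampleable parameters" name err
    end
end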

Pluto usage in showcase documentation

Some of the showcase examples in EpiAware/docs/src/showcase use Pluto.jl notebooks for the underlying computation. The output of the notebooks is rendered into HTML for inclusion in the documentation in two steps:

  1. PlutoStaticHTML.jl converts the notebook with output into a machine-readable .md format.
  2. Documenter.jl renders the .md file into HTML for inclusion in the documentation during the build process.

For other examples of using Pluto to generate documentation see the examples shown here.

Running Pluto notebooks from EpiAware locally

To run the Pluto.jl scripts in the EpiAware documentation directly from the source code, follow these steps:

  1. Install Pluto.jl locally. We recommend using the version of Pluto that is pinned in the Project.toml file defining the documentation environment.
  2. Clone the EpiAware repository.
  3. Start Pluto.jl either from the REPL (see the Pluto.jl documentation) or from the command line with the shell script EpiAware/docs/pluto-scripts.sh.
  4. From the Pluto.jl interface, navigate to the Pluto.jl script you want to run.

Contributing to Pluto notebooks in EpiAware documentation

Modifying an existing Pluto notebook

Committing changes to the Pluto.jl notebooks in the EpiAware documentation is the same as committing changes to any other part of the repository. However, please note that we expect the following conventions for the environment management of the notebooks:

  1. Use the environment determined by the Project.toml file in the EpiAware/docs directory. If you want extra packages, add them to this environment.
  2. Ensure that the version of EpiAware used in these notebooks is the version of EpiAware on the branch being pull-requested into main. To do this, use the Pkg.develop function.

To do this you can use the following code snippet in the Pluto notebook:

# Determine the relative path to the `EpiAware/docs` directory
 docs_dir = dirname(dirname(dirname(dirname(@__DIR__))))
 # Determine the relative path to the `EpiAware` package directory
 pkg_dir = dirname(docs_dir)
@@ -7,4 +7,4 @@
 using Pkg: Pkg
 Pkg.activate(docs_dir)
 Pkg.develop(; path = pkg_dir)
-Pkg.instantiate()

Adding a new Pluto notebook

Adding a new Pluto.jl notebook to the EpiAware documentation is the same as adding any other file to the repository. However, in addition to following the guidelines for modifying an existing notebook, please note that the new notebook must be added to the set of notebook builds using build in the EpiAware/docs/make.jl file. This will generate an .md file with the same name as the notebook, which can be rendered when makedocs is run. For this document to be added to the overall documentation, the path to the .md file must be added to the Pages array defined in EpiAware/docs/pages.jl.

+Pkg.instantiate()

Adding a new Pluto notebook

Adding a new Pluto.jl notebook to the EpiAware documentation is the same as adding any other file to the repository. However, in addition to following the guidelines for modifying an existing notebook, please note that the new notebook must be added to the set of notebook builds using build in the EpiAware/docs/make.jl file. This will generate an .md file with the same name as the notebook, which can be rendered when makedocs is run. For this document to be added to the overall documentation, the path to the .md file must be added to the Pages array defined in EpiAware/docs/pages.jl.
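
As an illustrative sketch only (the actual grouping, titles, and paths in EpiAware/docs/pages.jl will differ), a Documenter-style Pages entry for a newly rendered notebook might look like:

# Hypothetical excerpt of a pages.jl-style structure; match the real entries.
pages = [
    "Showcase" => [
        "showcase/existing-example.md",
        "showcase/my-new-notebook.md"  # entry for the newly added notebook
    ]
]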

diff --git a/previews/PR510/developer/index.html b/previews/PR510/developer/index.html index fda0350a8..43ab263c4 100644 --- a/previews/PR510/developer/index.html +++ b/previews/PR510/developer/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

Developer documentation

Welcome to the EpiAware developer documentation! This section is designed to help you get started with developing the package.

+Overview · EpiAware.jl

Developer documentation

Welcome to the EpiAware developer documentation! This section is designed to help you get started with developing the package.

diff --git a/previews/PR510/getting-started/explainers/index.html b/previews/PR510/getting-started/explainers/index.html index c608e2b4e..4bc94b886 100644 --- a/previews/PR510/getting-started/explainers/index.html +++ b/previews/PR510/getting-started/explainers/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

Explainers

This section contains a series of explainers that provide a detailed overview of the EpiAware platform and its features. These explainers are designed to help you understand the platform and its capabilities, and to provide you with the information you need to get started using EpiAware. See the sidebar for the list of explainers.

+Overview · EpiAware.jl

Explainers

This section contains a series of explainers that provide a detailed overview of the EpiAware platform and its features. These explainers are designed to help you understand the platform and its capabilities, and to provide you with the information you need to get started using EpiAware. See the sidebar for the list of explainers.

diff --git a/previews/PR510/getting-started/explainers/inference/index.html b/previews/PR510/getting-started/explainers/inference/index.html index a4033165f..84c323cbd 100644 --- a/previews/PR510/getting-started/explainers/inference/index.html +++ b/previews/PR510/getting-started/explainers/inference/index.html @@ -1,2 +1,2 @@ -Inference · EpiAware.jl
+Inference · EpiAware.jl
diff --git a/previews/PR510/getting-started/explainers/interfaces/index.html b/previews/PR510/getting-started/explainers/interfaces/index.html index 28b2a6ae9..01c5f8345 100644 --- a/previews/PR510/getting-started/explainers/interfaces/index.html +++ b/previews/PR510/getting-started/explainers/interfaces/index.html @@ -1,2 +1,2 @@ -Interfaces · EpiAware.jl

Interfaces

We support two primary workflows for using the package:

  • EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.
  • Turing interface: A lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.

See the getting started section for tutorials on each of these workflows.

EpiProblem

Each module of the overall epidemiological model we are interested in is a Turing Model in its own right. In this section, we compose the individual models into the full epidemiological model using the EpiProblem struct.

The constructor for an EpiProblem requires:

  • An epi_model.
  • A latent_model.
  • An observation_model.
  • A tspan.

The tspan sets the range of the time index for the models.

Turing interface

The Turing interface is a lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.

+Interfaces · EpiAware.jl

Interfaces

We support two primary workflows for using the package:

  • EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.
  • Turing interface: A lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.

See the getting started section for tutorials on each of these workflows.

EpiProblem

Each module of the overall epidemiological model we are interested in is a Turing Model in its own right. In this section, we compose the individual models into the full epidemiological model using the EpiProblem struct.

The constructor for an EpiProblem requires:

  • An epi_model.
  • A latent_model.
  • An observation_model.
  • A tspan.

The tspan sets the range of the time index for the models; a minimal construction sketch follows.
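
As a minimal sketch (assuming epi, latent, and obs have already been constructed as concrete epidemiological, latent, and observation models, for example as in the getting-started tutorials, and that the keyword constructor shown in the EpiAware docs is available), an EpiProblem covering 100 time steps could be composed as:

using EpiAware

# `epi`, `latent` and `obs` are assumed to be concrete subtypes of
# AbstractEpiModel, AbstractLatentModel and AbstractObservationModel.
epi_prob = EpiProblem(
    epi_model = epi,
    latent_model = latent,
    observation_model = obs,
    tspan = (1, 100)
)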

Turing interface

The Turing interface is a lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.

diff --git a/previews/PR510/getting-started/explainers/intro/index.html b/previews/PR510/getting-started/explainers/intro/index.html index 0edc79bf3..38c268e93 100644 --- a/previews/PR510/getting-started/explainers/intro/index.html +++ b/previews/PR510/getting-started/explainers/intro/index.html @@ -60,4 +60,4 @@ E-->|sample...NUTS...| G G-->H -H-->I +H-->I diff --git a/previews/PR510/getting-started/explainers/julia/index.html b/previews/PR510/getting-started/explainers/julia/index.html index 1d6599f8d..dd5fdecab 100644 --- a/previews/PR510/getting-started/explainers/julia/index.html +++ b/previews/PR510/getting-started/explainers/julia/index.html @@ -36,4 +36,4 @@ }, "julia.liveTestFile": "path/to/runtests.jl", "julia.environmentPath": "path/to/project/directory", -}

These settings configure basic code formatting and whitespace handling for Julia files, as well as setting the path to the project's test file and the path to the project directory for the environment.

The VS-Code command Julia: Start REPL will start a REPL in the TERMINAL tab of the editor with the environment set to the project directory, and the Testing tab will detect the tests defined for the project.

Literate programming with Julia in EpiAware

It's common to develop technical computing projects using a literate programming style, where code and documentation are interwoven. Julia supports this style of programming through a number of packages; in EpiAware we recommend the following:

We use Pluto for interactive development and Quarto for generating reports and academic articles. Both tools are useful for developing reproducible workflows.

+}

These settings configure basic code formatting and whitespace handling for Julia files, as well as setting the path to the project's test file and the path to the project directory for the environment.

The VS-Code command Julia: Start REPL will start a REPL in the TERMINAL tab of the editor with the environment set to the project directory, and the Testing tab will detect the tests defined for the project.

Literate programming with Julia in EpiAware

It's common to develop technical computing projects using a literate programming style, where code and documentation are interwoven. Julia supports this style of programming through a number of packages; in EpiAware we recommend the following:

We use Pluto for interactive development and Quarto for generating reports and academic articles. Both tools are useful for developing reproducible workflows.

diff --git a/previews/PR510/getting-started/explainers/latent-models/index.html b/previews/PR510/getting-started/explainers/latent-models/index.html index 0e0b5eb80..fd45dca0c 100644 --- a/previews/PR510/getting-started/explainers/latent-models/index.html +++ b/previews/PR510/getting-started/explainers/latent-models/index.html @@ -1,2 +1,2 @@ -Latent models · EpiAware.jl
+Latent models · EpiAware.jl
diff --git a/previews/PR510/getting-started/explainers/modelling-infections/index.html b/previews/PR510/getting-started/explainers/modelling-infections/index.html index 374c8693a..e58fba75d 100644 --- a/previews/PR510/getting-started/explainers/modelling-infections/index.html +++ b/previews/PR510/getting-started/explainers/modelling-infections/index.html @@ -1,2 +1,2 @@ -Modelling infections · EpiAware.jl
+Modelling infections · EpiAware.jl
diff --git a/previews/PR510/getting-started/explainers/observation-models/index.html b/previews/PR510/getting-started/explainers/observation-models/index.html index 505800465..ea55a1c15 100644 --- a/previews/PR510/getting-started/explainers/observation-models/index.html +++ b/previews/PR510/getting-started/explainers/observation-models/index.html @@ -1,2 +1,2 @@ -Observation models · EpiAware.jl
+Observation models · EpiAware.jl
diff --git a/previews/PR510/getting-started/faq/index.html b/previews/PR510/getting-started/faq/index.html index 3bd8dfc3c..d4529c50e 100644 --- a/previews/PR510/getting-started/faq/index.html +++ b/previews/PR510/getting-started/faq/index.html @@ -1,2 +1,2 @@ -Frequently asked questions · EpiAware.jl

Frequently asked questions

This page contains a list of frequently asked questions about the EpiAware package. If you have a question that is not answered here, please open a discussion on the GitHub repository.

    Pluto notebooks

    In some of the showcase examples in EpiAware/docs/src/showcase we use Pluto.jl notebooks for the underlying computation. As well as reading the code blocks and output of the notebooks in this documentation, you can also run these notebooks by cloning EpiAware and running the notebooks with Pluto.jl (for further details see developer notes).

    It should be noted that Pluto.jl notebooks are reactive, meaning that changing a code block re-runs all downstream code, with the downstream dependencies determined by a tree of dependent code blocks. This is different from the standard Julia REPL and some other notebook formats (e.g. .ipynb). In Pluto each code block is either a single line of code or is encapsulated by let ... end or begin ... end. The difference between let ... end blocks and begin ... end blocks is that a let ... end block only adds the final output/return value of the block to scope, like an anonymous function, whereas begin ... end executes each line and adds defined variables to scope.

    For installation instructions and more information and documentation on Pluto.jl see the Pluto.jl documentation.

    Manipulating EpiAware model specifications

    Modular model construction

    One of the key features of EpiAware is the ability to specify models as components of a larger model. This is useful for specifying models that are shared across multiple EpiProblems or for specifying models that are used in multiple methods. You can see examples of this approach in our showcases.

    Remaking models

    An alternative to modular model construction is to remake models with different parameters. This can be useful for comparing models with different parameters or for comparing models with different priors. Whilst we don't have a built-in function for this, we recommend the Accessors.jl package for this purpose. For examples of how to use this package see the documentation.

    Working with Turing.jl models

    DynamicPPL.jl

    Whilst Turing.jl is the front end of the Turing.jl ecosystem, it is not the only package that can be used to work with Turing.jl models. DynamicPPL.jl is the part of the ecosystem that deals with defining, running, and manipulating models.

    Conditioning and deconditioning models

    DynamicPPL supports the condition function (aliased with |) to fix values as known observations in the model (i.e. fixing values on the left-hand side of ~ definitions). This is useful for fixing parameters to known values or for conditioning the model on data. The decondition function can be used to remove these conditions. Internally this is what apply_method(::EpiProblem, ...) does to condition the user-supplied EpiProblem on data. See more here.

    Fixing and unfixing models

    Similarly to conditioning and deconditioning models, DynamicPPL supports fixing and unfixing models via the fix and unfix functions. Fixing is essentially saying that variables are constants (i.e. replacing the right-hand side of ~ with a value and changing the ~ to a =). A common use of this would be to simplify a prespecified model, for example making the variance of a random walk known rather than estimated from the data. We also use this functionality in apply_method(::EpiProblem, ...) to allow users to simplify EpiProblems on the fly. See more here.

    Tools for working with MCMCChains objects

    MCMCChains.jl

    MCMCChains.jl is the package from which MCMCChains is imported. It provides a number of useful functions for working with MCMCChains chain objects. These include functions for summarising, plotting, and manipulating chains. Below is a list of some of the most useful functions.

    • plot: Plots trace and density plots for each parameter in the chain object.
    • histogram: Plots histograms for each parameter in the chain object by chain.
    • get: Accesses the values of a parameter/s in the chain object.
    • DataFrames.DataFrame converts a chain into a wide format DataFrame.
    • describe: Prints the summary statistics of the chain object.

    There are many more functions available in the MCMCChains.jl package. For a full list of functions, see the documentation.

    ArviZ.jl

    An alternative to MCMCChains.jl is the ArviZ.jl package. ArviZ.jl is a Julia meta-package for exploratory analysis of Bayesian models. It is part of the ArviZ project, which also includes a related Python package.

    ArviZ.jl uses an InferenceData object to store the results of a Bayesian analysis. This object can be created from an MCMCChains chain object using the from_mcmcchains function. The InferenceData object can then be used to create a range of plots and summaries of the model. This is particularly useful as it allows you to specify the indices of your parameters (for example, you could use dates for time parameters).

    In addition to this useful functionality, from_mcmcchains can also be used to combine posterior predictions with prior predictions, prior information, and the log likelihood of the model (see here for an example of this). This unlocks a range of useful diagnostics and plots that can be used to assess the model.

    There is a lot of functionality in ArviZ.jl and it is worth exploring the documentation to see what is available.

    +Frequently asked questions · EpiAware.jl

    Frequently asked questions

    This page contains a list of frequently asked questions about the EpiAware package. If you have a question that is not answered here, please open a discussion on the GitHub repository.

      Pluto notebooks

      In some of the showcase examples in EpiAware/docs/src/showcase we use Pluto.jl notebooks for the underlying computation. As well as reading the code blocks and output of the notebooks in this documentation, you can also run these notebooks by cloning EpiAware and running the notebooks with Pluto.jl (for further details see developer notes).

      It should be noted that Pluto.jl notebooks are reactive, meaning that changing a code block re-runs all downstream code, with the downstream dependencies determined by a tree of dependent code blocks. This is different from the standard Julia REPL and some other notebook formats (e.g. .ipynb). In Pluto each code block is either a single line of code or is encapsulated by let ... end or begin ... end. The difference between let ... end blocks and begin ... end blocks is that a let ... end block only adds the final output/return value of the block to scope, like an anonymous function, whereas begin ... end executes each line and adds defined variables to scope.
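
      For example (plain Julia semantics, as they behave in Pluto cells):

      # A `let` block keeps x and y local; only the block's final value (here 5)
      # becomes the cell's output, so other cells cannot refer to x or y.
      let
          x = 2
          y = 3
          x + y
      end

      # A `begin` block evaluates each line and adds a and b to the notebook's
      # scope, so other cells can refer to them.
      begin
          a = 2
          b = 3
          a + b
      end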

      For installation instructions and more information and documentation on Pluto.jl see the Pluto.jl documentation.

      Manipulating EpiAware model specifications

      Modular model construction

      One of the key features of EpiAware is the ability to specify models as components of a larger model. This is useful for specifying models that are shared across multiple EpiProblems or for specifying models that are used in multiple methods. You can see examples of this approach in our showcases.

      Remaking models

      An alternative to modular model construction is to remake models with different parameters. This can be useful for comparing models with different parameters or for comparing models with different priors. Whilst we don't have a built-in function for this, we recommend the Accessors.jl package for this purpose. For examples of how to use this package see the documentation.
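
      As a small sketch of the remaking pattern (the ToyLatentModel struct and its fields are illustrative, not EpiAware types), Accessors.jl's @set macro returns a copy of an immutable object with one field replaced:

      using Accessors

      # Toy immutable model specification standing in for an EpiAware model.
      struct ToyLatentModel
          std_prior_mean::Float64
          order::Int
      end

      spec = ToyLatentModel(0.1, 1)

      # `@set` leaves `spec` unchanged and returns a modified copy.
      new_spec = @set spec.std_prior_mean = 0.5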

      Working with Turing.jl models

      DynamicPPL.jl

      Whilst Turing.jl is the front end of the Turing.jl ecosystem, it is not the only package that can be used to work with Turing.jl models. DynamicPPL.jl is the part of the ecosystem that deals with defining, running, and manipulating models.

      Conditioning and deconditioning models

      DynamicPPL supports the condition function (aliased with |) to fix values as known observations in the model (i.e. fixing values on the left-hand side of ~ definitions). This is useful for fixing parameters to known values or for conditioning the model on data. The decondition function can be used to remove these conditions. Internally this is what apply_method(::EpiProblem, ...) does to condition the user-supplied EpiProblem on data. See more here.
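
      A small self-contained Turing.jl sketch of conditioning with | and removing the condition again with decondition:

      using Turing, DynamicPPL

      @model function toy_model()
          μ ~ Normal(0, 1)
          y ~ Normal(μ, 1)
      end

      mdl = toy_model()

      # Condition y to an observed value; `|` is an alias for `condition`.
      conditioned_mdl = mdl | (y = 1.5,)

      # Remove the conditioning again, so y is once more treated as random.
      unconditioned_mdl = decondition(conditioned_mdl)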

      Fixing and unfixing models

      Similarly to conditioning and deconditioning models, DynamicPPL supports fixing and unfixing models via the fix and unfix functions. Fixing is essentially saying that variables are constants (i.e. replacing the right-hand side of ~ with a value and changing the ~ to a =). A common use of this would be to simplify a prespecified model, for example making the variance of a random walk known rather than estimated from the data. We also use this functionality in apply_method(::EpiProblem, ...) to allow users to simplify EpiProblems on the fly. See more here.
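
      And a matching sketch for fix and unfix, which treat a variable as a known constant rather than as an observation:

      using Turing, DynamicPPL

      @model function toy_walk()
          σ ~ truncated(Normal(0, 0.1); lower = 0)
          ε ~ Normal(0, σ)
          return ε
      end

      mdl = toy_walk()

      # Fix the standard deviation to a known constant (it is no longer sampled).
      fixed_mdl = fix(mdl, (σ = 0.05,))

      # Undo the fix so σ is estimated from the data again.
      unfixed_mdl = unfix(fixed_mdl)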

      Tools for working with MCMCChains objects

      MCMCChains.jl

      MCMCChains.jl is the package from which MCMCChains is imported. It provides a number of useful functions for working with MCMCChains chain objects. These include functions for summarising, plotting, and manipulating chains. Below is a list of some of the most useful functions (a short usage sketch follows the list).

      • plot: Plots trace and density plots for each parameter in the chain object.
      • histogram: Plots histograms for each parameter in the chain object by chain.
      • get: Accesses the values of a parameter/s in the chain object.
      • DataFrames.DataFrame converts a chain into a wide format DataFrame.
      • describe: Prints the summary statistics of the chain object.
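
      For instance, on a small synthetic chain (a sketch; plot and histogram additionally require a plotting backend such as StatsPlots.jl):

      using MCMCChains, DataFrames

      # Synthetic chain: 200 draws, 2 parameters, 4 chains.
      chn = Chains(randn(200, 2, 4), [:mu, :sigma])

      describe(chn)             # summary statistics and quantiles
      mu_draws = get(chn, :mu)  # access the draws of a single parameter
      df = DataFrame(chn)       # wide-format DataFrame of the draws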

      There are many more functions available in the MCMCChains.jl package. For a full list of functions, see the documentation.

      ArviZ.jl

      An alternative to MCMCChains.jl is the ArviZ.jl package. ArviZ.jl is a Julia meta-package for exploratory analysis of Bayesian models. It is part of the ArviZ project, which also includes a related Python package.

      ArviZ.jl uses an InferenceData object to store the results of a Bayesian analysis. This object can be created from an MCMCChains chain object using the from_mcmcchains function. The InferenceData object can then be used to create a range of plots and summaries of the model. This is particularly useful as it allows you to specify the indices of your parameters (for example, you could use dates for time parameters).

      In addition to this useful functionality, from_mcmcchains can also be used to combine posterior predictions with prior predictions, prior information, and the log likelihood of the model (see here for an example of this). This unlocks a range of useful diagnostics and plots that can be used to assess the model.
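
      A minimal sketch of this workflow (reusing a synthetic chain as a stand-in for a fitted model; the optional keyword groups are named in comments only, so check the ArviZ.jl documentation for the exact signature):

      using ArviZ, MCMCChains

      chn = Chains(randn(200, 2, 4), [:mu, :sigma])

      # Convert the chain to an InferenceData object; keyword arguments such as
      # posterior_predictive, prior, and log_likelihood groups can be attached
      # for the richer diagnostics described above.
      idata = from_mcmcchains(chn)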

      There is a lot of functionality in ArviZ.jl and it is worth exploring the documentation to see what is available.

      diff --git a/previews/PR510/getting-started/index.html b/previews/PR510/getting-started/index.html index 0b58f72c1..75776a48e 100644 --- a/previews/PR510/getting-started/index.html +++ b/previews/PR510/getting-started/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

      Getting started

      Note that this section of the documentation is still under construction. Please see replications for the most up-to-date information. Please feel free to contribute to the documentation by submitting a pull request.

      Welcome to the EpiAware documentation! This section is designed to help you get started with the package. It includes a frequently asked questions (FAQ) section, a series of explainers that provide a detailed overview of the platform and its features, and tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of topics.

      +Overview · EpiAware.jl

      Getting started

      Note that this section of the documentation is still under construction. Please see replications for the most up-to-date information. Please feel free to contribute to the documentation by submitting a pull request.

      Welcome to the EpiAware documentation! This section is designed to help you get started with the package. It includes a frequently asked questions (FAQ) section, a series of explainers that provide a detailed overview of the platform and its features, and tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of topics.

      diff --git a/previews/PR510/getting-started/installation/index.html b/previews/PR510/getting-started/installation/index.html index 12fe34cf5..1d3575117 100644 --- a/previews/PR510/getting-started/installation/index.html +++ b/previews/PR510/getting-started/installation/index.html @@ -1,2 +1,2 @@ -Installation · EpiAware.jl

      Installation

      Eventually, EpiAware is likely to be added to the Julia registry. Until then, you can install it from the /EpiAware sub-directory of this repository by running the following command in the Julia REPL:

      using Pkg; Pkg.add(url="https://github.com/CDCgov/Rt-without-renewal", subdir="EpiAware")
      +Installation · EpiAware.jl

      Installation

      Eventually, EpiAware is likely to be added to the Julia registry. Until then, you can install it from the /EpiAware sub-directory of this repository by running the following command in the Julia REPL:

      using Pkg; Pkg.add(url="https://github.com/CDCgov/Rt-without-renewal", subdir="EpiAware")
      diff --git a/previews/PR510/getting-started/quickstart/index.html b/previews/PR510/getting-started/quickstart/index.html index 84fbac130..68fced66e 100644 --- a/previews/PR510/getting-started/quickstart/index.html +++ b/previews/PR510/getting-started/quickstart/index.html @@ -1,2 +1,2 @@ -Quickstart · EpiAware.jl

      Quickstart

      Get up and running with EpiAware in just a few minutes using this quickstart guide.

      +Quickstart · EpiAware.jl

      Quickstart

      Get up and running with EpiAware in just a few minutes using this quickstart guide.

      diff --git a/previews/PR510/getting-started/tutorials/censored-obs/index.html b/previews/PR510/getting-started/tutorials/censored-obs/index.html index e297edf54..5512390fd 100644 --- a/previews/PR510/getting-started/tutorials/censored-obs/index.html +++ b/previews/PR510/getting-started/tutorials/censored-obs/index.html @@ -274,7 +274,7 @@

      [Posterior draws preview for naive_fit: columns iteration, chain, mu, sigma, lp, n_steps, is_accept, acceptance_rate, ...; 10 sample rows shown in the rendered notebook]
      summarize(naive_fit)
      -
      parameters      mean        std         mcse   ess_bulk   ess_tail      rhat   ess_per_sec
      mu          0.584129  0.0704067   0.00151449    2153.02    1487.54   1.00003       311.987
      sigma        3.17766  0.0495775   0.00113542    1905.04    1306.94   1.00127       276.053
      +
      parameters      mean        std         mcse   ess_bulk   ess_tail      rhat   ess_per_sec
      mu          0.584129  0.0704067   0.00151449    2153.02    1487.54   1.00003       310.726
      sigma        3.17766  0.0495775   0.00113542    1905.04    1306.94   1.00127       274.937
      let
           f = pairplot(naive_fit)
      @@ -335,7 +335,7 @@ 

      We see that the model has converged and the diagnostics look good. We also see that the posterior means are very near the true parameters and the 90% credible intervals include the true parameters.

      -
      + diff --git a/previews/PR510/getting-started/tutorials/index.html b/previews/PR510/getting-started/tutorials/index.html index 1d6aa9bce..6ef2ce729 100644 --- a/previews/PR510/getting-started/tutorials/index.html +++ b/previews/PR510/getting-started/tutorials/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

      Tutorials

      This section contains tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of tutorials.

      +Overview · EpiAware.jl

      Tutorials

      This section contains tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of tutorials.

      diff --git a/previews/PR510/getting-started/tutorials/multiple-infection-processes/index.html b/previews/PR510/getting-started/tutorials/multiple-infection-processes/index.html index 8684bdb0a..f43b71f5b 100644 --- a/previews/PR510/getting-started/tutorials/multiple-infection-processes/index.html +++ b/previews/PR510/getting-started/tutorials/multiple-infection-processes/index.html @@ -1,2 +1,2 @@ -Multiple infection processes · EpiAware.jl
      +Multiple infection processes · EpiAware.jl
      diff --git a/previews/PR510/getting-started/tutorials/multiple-observation-models/index.html b/previews/PR510/getting-started/tutorials/multiple-observation-models/index.html index 2b0eafe6a..763aca7d4 100644 --- a/previews/PR510/getting-started/tutorials/multiple-observation-models/index.html +++ b/previews/PR510/getting-started/tutorials/multiple-observation-models/index.html @@ -1,2 +1,2 @@ -Multiple observation models · EpiAware.jl +Multiple observation models · EpiAware.jl diff --git a/previews/PR510/getting-started/tutorials/nowcasting/index.html b/previews/PR510/getting-started/tutorials/nowcasting/index.html index 9db8e1090..cba4e8d3b 100644 --- a/previews/PR510/getting-started/tutorials/nowcasting/index.html +++ b/previews/PR510/getting-started/tutorials/nowcasting/index.html @@ -1,2 +1,2 @@ -Nowcasting · EpiAware.jl +Nowcasting · EpiAware.jl diff --git a/previews/PR510/getting-started/tutorials/partial-pooling/index.html b/previews/PR510/getting-started/tutorials/partial-pooling/index.html index 0f8cd70b5..e83d7ee6a 100644 --- a/previews/PR510/getting-started/tutorials/partial-pooling/index.html +++ b/previews/PR510/getting-started/tutorials/partial-pooling/index.html @@ -1,2 +1,2 @@ -Partial pooling · EpiAware.jl +Partial pooling · EpiAware.jl diff --git a/previews/PR510/getting-started/tutorials/simple-renewal-with-delays/index.html b/previews/PR510/getting-started/tutorials/simple-renewal-with-delays/index.html index 76f8ba286..8181b1158 100644 --- a/previews/PR510/getting-started/tutorials/simple-renewal-with-delays/index.html +++ b/previews/PR510/getting-started/tutorials/simple-renewal-with-delays/index.html @@ -1,2 +1,2 @@ -Simple renewal with delays · EpiAware.jl +Simple renewal with delays · EpiAware.jl diff --git a/previews/PR510/index.html b/previews/PR510/index.html index 5667c8d96..27a3800a9 100644 --- a/previews/PR510/index.html +++ b/previews/PR510/index.html @@ -1,2 +1,2 @@ -EpiAware.jl: Real-time infectious disease monitoring · EpiAware.jl

      EpiAware.jl

      Infectious disease situational awareness modelling toolkit for Julia.

      Where to start

      +EpiAware.jl: Real-time infectious disease monitoring · EpiAware.jl

      EpiAware.jl

      Infectious disease situational awareness modelling toolkit for Julia.

      Where to start

      diff --git a/previews/PR510/lib/EpiAwareBase/index.html b/previews/PR510/lib/EpiAwareBase/index.html index ed61847fc..f204ff3a0 100644 --- a/previews/PR510/lib/EpiAwareBase/index.html +++ b/previews/PR510/lib/EpiAwareBase/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

      EpiAwareBase.jl

      This package provides the core functionality for the EpiAware ecosystem. It is a dependency of all other EpiAware packages.

      API

        +Overview · EpiAware.jl

        EpiAwareBase.jl

        This package provides the core functionality for the EpiAware ecosystem. It is a dependency of all other EpiAware packages.

        API

          diff --git a/previews/PR510/lib/EpiAwareBase/internals/index.html b/previews/PR510/lib/EpiAwareBase/internals/index.html index f963cc6dd..0bdb0438c 100644 --- a/previews/PR510/lib/EpiAwareBase/internals/index.html +++ b/previews/PR510/lib/EpiAwareBase/internals/index.html @@ -1,2 +1,2 @@ -Internal API · EpiAware.jl
          +Internal API · EpiAware.jl
          diff --git a/previews/PR510/lib/EpiAwareBase/public/index.html b/previews/PR510/lib/EpiAwareBase/public/index.html index 3ac8ba427..c3b2a64ec 100644 --- a/previews/PR510/lib/EpiAwareBase/public/index.html +++ b/previews/PR510/lib/EpiAwareBase/public/index.html @@ -1,5 +1,5 @@ -Public API · EpiAware.jl

          Public Documentation

          Documentation for EpiAwareBase.jl's public interface.

          See the Internals section of the manual for internal package docs covering all submodules.

          Contents

          Index

          Public API

          EpiAware.EpiAwareBase.AbstractEpiOptMethodType
          abstract type AbstractEpiOptMethod <: AbstractEpiMethod

          Abstract supertype for inference/generative methods that are based on optimization, e.g. MAP estimation or variational inference.


          Fields

          source
          EpiAware.EpiAwareBase.EpiAwareObservablesType
          struct EpiAwareObservables

          The EpiAwareObservables struct represents the observables used in the EpiAware model.

          Fields

          • model: The model used for the observables.
          • data: The data used for the observables.
          • samples: Samples from the posterior distribution.
          • generated: The generated observables.

          Fields

          • model::Any

          • data::Any

          • samples::Any

          • generated::Any

          source
          EpiAware.EpiAwareBase.EpiMethodType
          struct EpiMethod{O<:AbstractEpiOptMethod, S<:AbstractEpiSamplingMethod} <: AbstractEpiMethod

          EpiMethod represents a method for performing EpiAware inference and/or generative modelling, which combines a sequence of optimization steps to pass initialisation information to a sampler method.


          Fields

          • pre_sampler_steps::Vector{O} where O<:AbstractEpiOptMethod: Pre-sampler optimization steps.

          • sampler::AbstractEpiSamplingMethod: Sampler method.

          source
          EpiAware.EpiAwareBase.EpiProblemType
          struct EpiProblem{E<:AbstractEpiModel, L<:AbstractLatentModel, O<:AbstractObservationModel} <: AbstractEpiProblem

          Defines an inference/generative modelling problem for case data.

          EpiProblem wraps the underlying components of an epidemiological model:

          • epi_model: An epidemiological model for unobserved infections.
          • latent_model: A latent model for underlying latent process.
          • observation_model: An observation model for observed cases.

          Along with a tspan tuple for the time span of the case data.


          Fields

          • epi_model::AbstractEpiModel: Epidemiological model for unobserved infections.

          • latent_model::AbstractLatentModel: Latent model for underlying latent process.

          • observation_model::AbstractObservationModel: Observation model for observed cases.

          • tspan::Tuple{Int64, Int64}: Time span for either inference or generative modelling of case time series.

          source
          EpiAware.EpiAwareBase._apply_methodFunction
          _apply_method(
          +Public API · EpiAware.jl

          Public Documentation

          Documentation for EpiAwareBase.jl's public interface.

          See the Internals section of the manual for internal package docs covering all submodules.

          Contents

          Index

          Public API

          EpiAware.EpiAwareBase.AbstractEpiOptMethodType
          abstract type AbstractEpiOptMethod <: AbstractEpiMethod

          Abstract supertype for inference/generative methods that are based on optimization, e.g. MAP estimation or variational inference.


          Fields

          source
          EpiAware.EpiAwareBase.EpiAwareObservablesType
          struct EpiAwareObservables

          The EpiAwareObservables struct represents the observables used in the EpiAware model.

          Fields

          • model: The model used for the observables.
          • data: The data used for the observables.
          • samples: Samples from the posterior distribution.
          • generated: The generated observables.

          Fields

          • model::Any

          • data::Any

          • samples::Any

          • generated::Any

          source
          EpiAware.EpiAwareBase.EpiMethodType
          struct EpiMethod{O<:AbstractEpiOptMethod, S<:AbstractEpiSamplingMethod} <: AbstractEpiMethod

          EpiMethod represents a method for performing EpiAware inference and/or generative modelling, which combines a sequence of optimization steps to pass initialisation information to a sampler method.


          Fields

          • pre_sampler_steps::Vector{O} where O<:AbstractEpiOptMethod: Pre-sampler optimization steps.

          • sampler::AbstractEpiSamplingMethod: Sampler method.

          source
          EpiAware.EpiAwareBase.EpiProblemType
          struct EpiProblem{E<:AbstractEpiModel, L<:AbstractLatentModel, O<:AbstractObservationModel} <: AbstractEpiProblem

          Defines an inference/generative modelling problem for case data.

          EpiProblem wraps the underlying components of an epidemiological model:

          • epi_model: An epidemiological model for unobserved infections.
          • latent_model: A latent model for underlying latent process.
          • observation_model: An observation model for observed cases.

          Along with a tspan tuple for the time span of the case data.


          Fields

          • epi_model::AbstractEpiModel: Epidemiological model for unobserved infections.

          • latent_model::AbstractLatentModel: Latent model for underlying latent process.

          • observation_model::AbstractObservationModel: Observation model for observed cases.

          • tspan::Tuple{Int64, Int64}: Time span for either inference or generative modelling of case time series.

          source
          EpiAware.EpiAwareBase._apply_methodFunction
          _apply_method(
               model::AbstractEpiModel,
               method::AbstractEpiMethod;
               ...
          @@ -10,18 +10,18 @@
               prev_result;
               kwargs...
           )
          -

          Apply the inference/generative method method to the AbstractEpiModel object mdl.

          Arguments

          • model::AbstractEpiModel: The model to apply the method to.
          • method::AbstractEpiMethod: The epidemiological method to apply.
          • prev_result: The previous result of the method.
          • kwargs: Additional keyword arguments passed to the method.

          Returns

          • nothing: If no concrete implementation is defined for the given method.
          source
          EpiAware.EpiAwareBase.apply_methodMethod
          apply_method(
          +

          Apply the inference/generative method method to the AbstractEpiModel object mdl.

          Arguments

          • model::AbstractEpiModel: The model to apply the method to.
          • method::AbstractEpiMethod: The epidemiological method to apply.
          • prev_result: The previous result of the method.
          • kwargs: Additional keyword arguments passed to the method.

          Returns

          • nothing: If no concrete implementation is defined for the given method.
          source
          EpiAware.EpiAwareBase.apply_methodMethod
          apply_method(
               model,
               method,
               data;
               kwargs...
           ) -> EpiAwareObservables
          -

          Wrap the _apply_method function by calling it with the given model, method, data, and optional keyword arguments (kwargs). The resulting solution is then passed to the generated_observables function, along with the model and input data, to compute the generated observables.

          Arguments

          • model: The model to apply the method to.
          • method: The method to apply to the model.
          • data: The data to pass to the apply_method function.
          • kwargs: Optional keyword arguments to pass to the apply_method function.

          Returns

          The generated observables computed from the solution.

          source
          EpiAware.EpiAwareBase.apply_methodMethod
          apply_method(
          +

          Wrap the _apply_method function by calling it with the given model, method, data, and optional keyword arguments (kwargs). The resulting solution is then passed to the generated_observables function, along with the model and input data, to compute the generated observables.

          Arguments

          • model: The model to apply the method to.
          • method: The method to apply to the model.
          • data: The data to pass to the apply_method function.
          • kwargs: Optional keyword arguments to pass to the apply_method function.

          Returns

          The generated observables computed from the solution.

          source
          EpiAware.EpiAwareBase.apply_methodMethod
          apply_method(
               epiproblem::EpiProblem,
               method::AbstractEpiMethod,
               data;
          @@ -29,43 +29,43 @@
               condition_parameters,
               kwargs...
           ) -> EpiAwareObservables
          -

          Run the EpiAware algorithm to estimate the parameters of an epidemiological model.

          Arguments

          • epiproblem::EpiProblem: An EpiProblem object specifying the epidemiological problem.
          • method::EpiMethod: An EpiMethod object specifying the inference method.
          • data: The observed data used for inference.

          Keyword Arguments

          • fix_parameters::NamedTuple: A NamedTuple of fixed parameters for the model.
          • condition_parameters::NamedTuple: A NamedTuple of conditioned parameters for the model.
          • kwargs...: Additional keyword arguments passed to the inference methods.

          Returns

          • A NamedTuple with a samples field which is the output of applying methods and a model field with the model used. Optionally, a gens field with the generated quantities from the model if that makes sense with the inference method.
          source
          EpiAware.EpiAwareBase.broadcast_nMethod
          broadcast_n(
          +

          Run the EpiAware algorithm to estimate the parameters of an epidemiological model.

          Arguments

          • epiproblem::EpiProblem: An EpiProblem object specifying the epidemiological problem.
          • method::EpiMethod: An EpiMethod object specifying the inference method.
          • data: The observed data used for inference.

          Keyword Arguments

          • fix_parameters::NamedTuple: A NamedTuple of fixed parameters for the model.
          • condition_parameters::NamedTuple: A NamedTuple of conditioned parameters for the model.
          • kwargs...: Additional keyword arguments passed to the inference methods.

          Returns

          • A NamedTuple with a samples field which is the output of applying methods and a model field with the model used. Optionally, a gens field with the generated quantities from the model if that makes sense with the inference method.
          source
          EpiAware.EpiAwareBase.broadcast_nMethod
          broadcast_n(
               broadcast_rule::AbstractBroadcastRule,
               latent,
               n,
               period
           )
          -

          This function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.

          The broadcast_n function returns the length of the latent periods to generate using the given broadcast_rule. Which model of broadcasting is implemented is set by the type of broadcast_rule. If no implementation is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.

          source
          EpiAware.EpiAwareBase.broadcast_ruleMethod
          broadcast_rule(
          +

          This function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.

          The broadcast_n function returns the length of the latent periods to generate using the given broadcast_rule. Which model of broadcasting is implemented is set by the type of broadcast_rule. If no implementation is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.

          source
          EpiAware.EpiAwareBase.broadcast_ruleMethod
          broadcast_rule(
               broadcast_rule::AbstractBroadcastRule,
               n,
               period
           )
          -

          This function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.

          The broadcast_rule function implements a model of broadcasting a latent process. Which model of broadcasting is implemented is set by the type of broadcast_rule. If no implementation is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.

          source
          EpiAware.EpiAwareBase.condition_modelMethod
          condition_model(
          +

          This function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.

          The broadcast_rule function implements a model of broadcasting a latent process. Which model of broadcasting is implemented is set by the type of broadcast_rule. If no implementation is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.

          source
          EpiAware.EpiAwareBase.condition_modelMethod
          condition_model(
               model,
               fix_parameters,
               condition_parameters
           ) -> Any
          -

          Condition a model on fixed (i.e. to a value) and conditioned (i.e. to data) parameters.

          Returns

          • model: The conditioned model.
          source
          EpiAware.EpiAwareBase.generate_epiawareMethod
          generate_epiaware(
               y_t,
               time_step,
               epi_model::AbstractEpiModel,
               latent_model::AbstractLatentModel,
               observation_model::AbstractObservationModel
           )
          -

          Create an epi-aware model using the specified epi_model, latent_model, and observation_model.

          Arguments

          • y_t: The observed data.
          • time_steps: The time steps.
          • epi_model: An abstract epi model.
          • latent_model: An abstract latent model.
          • observation_model: An abstract observation model.

          Returns

          • nothing
          source
          EpiAware.EpiAwareBase.generate_epiawareMethod
          generate_epiaware(epiproblem::EpiProblem, data) -> Any
          -

          Generate an epi-aware model given an EpiProblem and data.

          Arguments

          • epiproblem: Epi problem specification.
          • data: Observed data.

          Returns

          A tuple containing the generated quantities of the epi-aware model.

          source
          EpiAware.EpiAwareBase.generate_latentMethod
          generate_latent(latent_model::AbstractLatentModel, n) -> Any
          -

          Constructor function for a latent process path $Z_t$ of length n.

          The generate_latent function implements a model of generating a latent process. Which model for generating the latent process is implemented is set by the type of latent_model. If no implementation is defined for the type of latent_model, then EpiAware will pass a warning and return nothing.

          Interface to Turing.jl probabilistic programming language (PPL)

          Apart from the no implementation fallback method, the generate_latent implementation function should return a constructor function for a DynamicPPL.Model object. Sample paths of $Z_t$ are generated quantities of the constructed model. Priors for model parameters are fields of epi_model.

          source
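A hedged usage sketch, assuming RandomWalk() is one of the latent models provided elsewhere in EpiAware (it is not documented on this page), and following the rand/generated_quantities pattern used in the generate_latent_infs examples later in these docs:

```julia
using EpiAware, Turing

latent_mdl = generate_latent(RandomWalk(), 10)   # Turing model for a length-10 latent path
θ = rand(latent_mdl)                             # sample parameters and noise from the prior
Z_t = generated_quantities(latent_mdl, θ)        # the sampled latent path Z_t
```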
          EpiAware.EpiAwareBase.generate_latent_infsMethod
          generate_latent_infs(
          +

Create an epi-aware model using the specified epi_model, latent_model, and observation_model.

          Arguments

          • y_t: The observed data.
          • time_steps: The time steps.
          • epi_model: An abstract epi model.
          • latent_model: An abstract latent model.
          • observation_model: An abstract observation model.

          Returns

          • nothing
          source
          EpiAware.EpiAwareBase.generate_epiawareMethod
          generate_epiaware(epiproblem::EpiProblem, data) -> Any
          +

          Generate an epi-aware model given an EpiProblem and data.

          Arguments

          • epiproblem: Epi problem specification.
          • data: Observed data.

          Returns

          A tuple containing the generated quantities of the epi-aware model.

          source
          EpiAware.EpiAwareBase.generate_latentMethod
          generate_latent(latent_model::AbstractLatentModel, n) -> Any
          +

          Constructor function for a latent process path $Z_t$ of length n.

The generate_latent function implements a model of generating a latent process. Which model of the latent process is implemented is determined by the type of latent_model. If no implementation is defined for the type of latent_model, then EpiAware will issue a warning and return nothing.

Interface to the Turing.jl probabilistic programming language (PPL)

          Apart from the no implementation fallback method, the generate_latent implementation function should return a constructor function for a DynamicPPL.Model object. Sample paths of $Z_t$ are generated quantities of the constructed model. Priors for model parameters are fields of epi_model.

          source
          EpiAware.EpiAwareBase.generate_latent_infsMethod
          generate_latent_infs(
               epi_model::AbstractEpiModel,
               Z_t
           ) -> Any
          -

Constructor function for unobserved/latent infections based on the type of epi_model <: AbstractEpiModel and a latent process path $Z_t$.

The generate_latent_infs function implements a model of generating unobserved/latent infections conditional on a latent process. Which model of unobserved/latent infections is implemented is determined by the type of epi_model. If no implementation is defined for the given epi_model, then EpiAware will issue a warning and return nothing.

Interface to the Turing.jl probabilistic programming language (PPL)

          Apart from the no implementation fallback method, the generate_latent_infs implementation function returns a constructor function for a DynamicPPL.Model object where the unobserved/latent infections are a generated quantity. Priors for model parameters are fields of epi_model.

          source
          EpiAware.EpiAwareBase.generate_observationsMethod
          generate_observations(
          +

Constructor function for unobserved/latent infections based on the type of epi_model <: AbstractEpiModel and a latent process path $Z_t$.

The generate_latent_infs function implements a model of generating unobserved/latent infections conditional on a latent process. Which model of unobserved/latent infections is implemented is determined by the type of epi_model. If no implementation is defined for the given epi_model, then EpiAware will issue a warning and return nothing.

Interface to the Turing.jl probabilistic programming language (PPL)

          Apart from the no implementation fallback method, the generate_latent_infs implementation function returns a constructor function for a DynamicPPL.Model object where the unobserved/latent infections are a generated quantity. Priors for model parameters are fields of epi_model.

          source
          EpiAware.EpiAwareBase.generate_observationsMethod
          generate_observations(
               obs_model::AbstractObservationModel,
               y_t,
               Y_t
           ) -> Any
          -

          Constructor function for generating observations based on the given observation model.

The generate_observations function implements a model of generating observations based on the given observation model. Which observation model is implemented is determined by the type of obs_model. If no implementation is defined for the given obs_model, then EpiAware will issue a warning and return nothing.

          source
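A hedged sketch, assuming PoissonError() is one of the observation models provided elsewhere in EpiAware and that passing missing for y_t simulates observations; neither assumption comes from the docstring above:

```julia
using EpiAware, Turing

Y_t = fill(100.0, 10)                                   # expected latent counts
obs_mdl = generate_observations(PoissonError(), missing, Y_t)
θ = rand(obs_mdl)                                       # prior-predictive draw of the observations
```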
          EpiAware.EpiAwareBase.generated_observablesMethod
          generated_observables(
          +

          Constructor function for generating observations based on the given observation model.

The generate_observations function implements a model of generating observations based on the given observation model. Which observation model is implemented is determined by the type of obs_model. If no implementation is defined for the given obs_model, then EpiAware will issue a warning and return nothing.

          source
          EpiAware.EpiAwareBase.generated_observablesMethod
          generated_observables(
               model,
               data,
               solution
           ) -> EpiAwareObservables
          -

Generate observables from a given model and solution and return them as an EpiAwareObservables struct.

          Arguments

          • model: The model used for generating observables.
          • data: The data used for generating observables.
          • solution: The solution used for generating observables.

          Returns

An instance of the EpiAwareObservables struct with the provided model, data, solution and, if specified, the generated observables.

          source
          +

Generate observables from a given model and solution and return them as an EpiAwareObservables struct.

          Arguments

          • model: The model used for generating observables.
          • data: The data used for generating observables.
          • solution: The solution used for generating observables.

          Returns

An instance of the EpiAwareObservables struct with the provided model, data, solution and, if specified, the generated observables.

          source
          diff --git a/previews/PR510/lib/EpiAwareUtils/index.html b/previews/PR510/lib/EpiAwareUtils/index.html index 87ee00bdf..b65a87ed5 100644 --- a/previews/PR510/lib/EpiAwareUtils/index.html +++ b/previews/PR510/lib/EpiAwareUtils/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl
          +Overview · EpiAware.jl
          diff --git a/previews/PR510/lib/EpiAwareUtils/internals/index.html b/previews/PR510/lib/EpiAwareUtils/internals/index.html index be48c9a52..51e15e1ec 100644 --- a/previews/PR510/lib/EpiAwareUtils/internals/index.html +++ b/previews/PR510/lib/EpiAwareUtils/internals/index.html @@ -10,7 +10,7 @@ prev_result; kwargs... ) -> Any -

          Implements direct sampling from a Turing model.

          source
          EpiAware.EpiAwareBase._apply_methodFunction
          _apply_method(
          +

          Implements direct sampling from a Turing model.

          source
          EpiAware.EpiAwareBase._apply_methodFunction
          _apply_method(
               model::DynamicPPL.Model,
               method::AbstractEpiMethod;
               ...
          @@ -21,49 +21,49 @@
               prev_result;
               kwargs...
           ) -> Any
          -

Apply the inference/generative method method to the Model object model.

          Arguments

          • model::AbstractEpiModel: The model to apply the method to.
          • method::AbstractEpiMethod: The epidemiological method to apply.
          • prev_result: The previous result of the method.
          • kwargs: Additional keyword arguments passed to the method.

          Returns

          • nothing: If no concrete implementation is defined for the given method.
          source
          EpiAware.EpiAwareBase._apply_methodMethod
          _apply_method(
          +

Apply the inference/generative method method to the Model object model.

          Arguments

          • model::AbstractEpiModel: The model to apply the method to.
          • method::AbstractEpiMethod: The epidemiological method to apply.
          • prev_result: The previous result of the method.
          • kwargs: Additional keyword arguments passed to the method.

          Returns

          • nothing: If no concrete implementation is defined for the given method.
          source
          EpiAware.EpiAwareBase._apply_methodMethod
          _apply_method(
               model::DynamicPPL.Model,
               method::EpiMethod,
               prev_result;
               kwargs...
           ) -> Any
          -

          Apply steps defined by an EpiMethod to a model object.

          This function applies the steps defined by an EpiMethod object to a Model object. It iterates over the pre-sampler steps defined in the EpiMethod object and recursively applies them to the model. Finally, it applies the sampler step defined in the EpiMethod object to the model. The prev_result argument is used to pass the result obtained from applying the previous steps, if any.

          Arguments

          • method::EpiMethod: The EpiMethod object containing the steps to be applied.
          • model::Model: The model object to which the steps will be applied.
          • prev_result: The previous result obtained from applying the steps. Defaults to nothing.
          • kwargs...: Additional keyword arguments that can be passed to the steps.

          Returns

          • prev_result: The result obtained after applying the steps.
          source
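A hedged sketch of composing an EpiMethod to be passed to _apply_method; the keyword construction and the field names pre_sampler_steps and sampler are assumptions, and DirectSample (documented under EpiAwareUtils) is used as the sampling step:

```julia
using EpiAware

# Assumed keyword construction: no pre-sampler steps, direct sampling as the final step
method = EpiMethod(pre_sampler_steps = [], sampler = DirectSample())

# Given some DynamicPPL.Model `mdl`, the steps would then be applied recursively via
# EpiAware.EpiAwareBase._apply_method(mdl, method)
```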
          EpiAware.EpiAwareBase._apply_methodMethod
          _apply_method(
          +

          Apply steps defined by an EpiMethod to a model object.

          This function applies the steps defined by an EpiMethod object to a Model object. It iterates over the pre-sampler steps defined in the EpiMethod object and recursively applies them to the model. Finally, it applies the sampler step defined in the EpiMethod object to the model. The prev_result argument is used to pass the result obtained from applying the previous steps, if any.

          Arguments

          • method::EpiMethod: The EpiMethod object containing the steps to be applied.
          • model::Model: The model object to which the steps will be applied.
          • prev_result: The previous result obtained from applying the steps. Defaults to nothing.
          • kwargs...: Additional keyword arguments that can be passed to the steps.

          Returns

          • prev_result: The result obtained after applying the steps.
          source
          EpiAware.EpiAwareBase._apply_methodMethod
          _apply_method(
               model::DynamicPPL.Model,
               method::EpiMethod;
               kwargs...
           ) -> Any
          -

Apply a method to a model without previous results.

          Arguments

          • model::Model: The model to apply the method to.
          • method::EpiMethod: The method to apply.
          • kwargs...: Additional keyword arguments.

          Returns

          • The result of applying the method to the model.
          source
          EpiAware.EpiAwareBase.condition_modelMethod
          condition_model(
          +

Apply a method to a model without previous results.

          Arguments

          • model::Model: The model to apply the method to.
          • method::EpiMethod: The method to apply.
          • kwargs...: Additional keyword arguments.

          Returns

          • The result of applying the method to the model.
          source
          EpiAware.EpiAwareBase.condition_modelMethod
          condition_model(
               model::DynamicPPL.Model,
               fix_parameters::NamedTuple,
               condition_parameters::NamedTuple
           ) -> Any
          -

          Apply the condition to the model by fixing the specified parameters and conditioning on the others.

          Arguments

          • model::Model: The model to be conditioned.
          • fix_parameters::NamedTuple: The parameters to be fixed.
          • condition_parameters::NamedTuple: The parameters to be conditioned on.

          Returns

          • _model: The conditioned model.
          source
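A rough sketch of the fix-then-condition behaviour described above, using the standard DynamicPPL fix and condition functions; this illustrates the idea rather than reproducing the package's actual implementation:

```julia
using Turing, DynamicPPL

@model function toy()
    μ ~ Normal()
    σ ~ truncated(Normal(); lower = 0)
    y ~ Normal(μ, σ)
end

mdl = toy()
# Fix σ to a value and condition y on data, mirroring a call like
# condition_model(mdl, (σ = 1.0,), (y = 0.3,))
_mdl = DynamicPPL.condition(DynamicPPL.fix(mdl, (σ = 1.0,)), (y = 0.3,))
```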
          EpiAware.EpiAwareBase.generate_epiawareMethod
          generate_epiaware(
          +

          Apply the condition to the model by fixing the specified parameters and conditioning on the others.

          Arguments

          • model::Model: The model to be conditioned.
          • fix_parameters::NamedTuple: The parameters to be fixed.
          • condition_parameters::NamedTuple: The parameters to be conditioned on.

          Returns

          • _model: The conditioned model.
          source
          EpiAware.EpiAwareBase.generate_epiawareMethod
          generate_epiaware(
               y_t,
               time_steps,
               epi_model::AbstractTuringEpiModel;
               latent_model,
               observation_model
           )
          -

          Generate an epi-aware model given the observed data and model specifications.

          Arguments

          • y_t: Observed data.
          • time_steps: Number of time steps.
          • epi_model: A Turing Epi model specification.
          • latent_model: A Turing Latent model specification.
          • observation_model: A Turing Observation model specification.

          Returns

A DynamicPPL.Model object.

          source
          EpiAware.EpiAwareBase.generated_observablesMethod
          generated_observables(
          +

          Generate an epi-aware model given the observed data and model specifications.

          Arguments

          • y_t: Observed data.
          • time_steps: Number of time steps.
          • epi_model: A Turing Epi model specification.
          • latent_model: A Turing Latent model specification.
          • observation_model: A Turing Observation model specification.

          Returns

A DynamicPPL.Model object.

          source
          EpiAware.EpiAwareBase.generated_observablesMethod
          generated_observables(
               model::DynamicPPL.Model,
               data,
               solution::Union{NamedTuple, MCMCChains.Chains}
           ) -> EpiAwareObservables
          -

          Generate observables from a given model and solution including generated quantities.

          source
          EpiAware.EpiAwareUtils._apply_direct_sampleMethod
          _apply_direct_sample(
          +

          Generate observables from a given model and solution including generated quantities.

          source
          EpiAware.EpiAwareUtils._apply_direct_sampleMethod
          _apply_direct_sample(
               model,
               method,
               n_samples::Int64;
               kwargs...
           ) -> Any
          -

          Sample the model directly using Turing.Prior() and a NamedTuple of the sampled random variables along with generated quantities.

          source
          EpiAware.EpiAwareUtils._apply_direct_sampleMethod
          _apply_direct_sample(
          +

          Sample the model directly using Turing.Prior() and a NamedTuple of the sampled random variables along with generated quantities.

          source
          EpiAware.EpiAwareUtils._apply_direct_sampleMethod
          _apply_direct_sample(
               model,
               method,
               n_samples::Nothing
           ) -> Any
          -

          Sample the model directly using rand and return a single set of sampled random variables.

          source
          EpiAware.EpiAwareUtils._check_and_give_tsMethod
          _check_and_give_ts(
          +

          Sample the model directly using rand and return a single set of sampled random variables.

          source
          EpiAware.EpiAwareUtils._check_and_give_tsMethod
          _check_and_give_ts(
               dist::Distributions.Distribution,
               Δd,
               D,
               upper
           ) -> Any
          -

          Internal function to check censored_pmf arguments and return the time steps of the rightmost limits of the censor intervals.

          source
          +

          Internal function to check censored_pmf arguments and return the time steps of the rightmost limits of the censor intervals.

          source diff --git a/previews/PR510/lib/EpiAwareUtils/public/index.html b/previews/PR510/lib/EpiAwareUtils/public/index.html index a15f8c926..ef59b29d0 100644 --- a/previews/PR510/lib/EpiAwareUtils/public/index.html +++ b/previews/PR510/lib/EpiAwareUtils/public/index.html @@ -1,5 +1,5 @@ -Public API · EpiAware.jl

          Public Documentation

Documentation for EpiAwareUtils.jl's public interface.

          See the Internals section of the manual for internal package docs covering all submodules.

          Contents

          Index

          Public API

          EpiAware.EpiAwareUtils.DirectSampleType
          struct DirectSample <: AbstractEpiSamplingMethod

          Sample directly from a Turing model.


          Fields

• n_samples::Union{Nothing, Int64}: Number of samples from a model. If an integer is provided, the model is sampled n_samples times using Turing.Prior(), returning an MCMCChains.Chains object. If nothing, the model is sampled once, returning a NamedTuple of the sampled random variables along with generated quantities.
          source
          EpiAware.EpiAwareUtils.HalfNormalType
          struct HalfNormal{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Continuous}

          Create a half-normal prior distribution with the specified mean.

          Arguments:

          • μ: The mean of the half-normal distribution.

          Returns:

          • A HalfNormal distribution with the specified mean.

          Examples:

          using EpiAware, Distributions
          +Public API · EpiAware.jl

          Public Documentation

Documentation for EpiAwareUtils.jl's public interface.

          See the Internals section of the manual for internal package docs covering all submodules.

          Contents

          Index

          Public API

          EpiAware.EpiAwareUtils.DirectSampleType
          struct DirectSample <: AbstractEpiSamplingMethod

          Sample directly from a Turing model.


          Fields

• n_samples::Union{Nothing, Int64}: Number of samples from a model. If an integer is provided, the model is sampled n_samples times using Turing.Prior(), returning an MCMCChains.Chains object. If nothing, the model is sampled once, returning a NamedTuple of the sampled random variables along with generated quantities.
          source
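A hedged construction sketch (the keyword constructor is an assumption; DirectSample may equally take n_samples positionally):

```julia
using EpiAware

single = DirectSample()                 # one draw, returned as a NamedTuple
many   = DirectSample(n_samples = 100)  # 100 prior draws, returned as MCMCChains.Chains
```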
          EpiAware.EpiAwareUtils.HalfNormalType
          struct HalfNormal{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Continuous}

          Create a half-normal prior distribution with the specified mean.

          Arguments:

          • μ: The mean of the half-normal distribution.

          Returns:

          • A HalfNormal distribution with the specified mean.

          Examples:

          using EpiAware, Distributions
           
           hn = HalfNormal(1.0)
           # output
          @@ -15,7 +15,7 @@
           # output
           1.0
          var(hn)
           # output
          -0.5707963267948966

          Fields

          • μ::Real
          source
          EpiAware.EpiAwareUtils.SafeNegativeBinomialType
          struct SafeNegativeBinomial{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Discrete}

Create a negative binomial distribution with the specified mean that avoids InexactError when the mean is too large.

          Parameterisation:

          We are using a mean and cluster factorization of the negative binomial distribution such that the variance to mean relationship is:

          \[\sigma^2 = \mu + \alpha^2 \mu^2\]

The reason for this parameterisation is that at sufficiently large mean values (i.e. r > 1 / p) p is approximately equal to the standard fluctuation of the distribution; e.g. if p = 0.05 we expect typical fluctuations of samples from the negative binomial to be about 5% of the mean when the mean is notably larger than 20. Otherwise, we expect approximately Poisson noise. In our opinion, this parameterisation is useful for specifying the distribution in a way that makes it easier to reason about priors for p.

          Arguments:

• r: The number of successes, although this can be extended to a continuous number.
          • p: Success rate.

          Returns:

          • A SafeNegativeBinomial distribution with the specified mean.

          Examples:

          using EpiAware, Distributions
          +0.5707963267948966

          Fields

          • μ::Real
          source
          EpiAware.EpiAwareUtils.SafeIntValuedType
          struct SafeIntValued <: Distributions.ValueSupport

A type to represent integer-valued distributions; the purpose of this type is to avoid problems with the eltype function when there are rand calls in the model.


          Fields

          source
          EpiAware.EpiAwareUtils.SafeNegativeBinomialType
          struct SafeNegativeBinomial{T<:Real} <: Distributions.UnivariateDistribution{SafeIntValued}

Create a negative binomial distribution with the specified mean that avoids InexactError when the mean is too large.

          Parameterisation:

          We are using a mean and cluster factorization of the negative binomial distribution such that the variance to mean relationship is:

          \[\sigma^2 = \mu + \alpha^2 \mu^2\]

The reason for this parameterisation is that at sufficiently large mean values (i.e. r > 1 / p) p is approximately equal to the standard fluctuation of the distribution; e.g. if p = 0.05 we expect typical fluctuations of samples from the negative binomial to be about 5% of the mean when the mean is notably larger than 20. Otherwise, we expect approximately Poisson noise. In our opinion, this parameterisation is useful for specifying the distribution in a way that makes it easier to reason about priors for p.

          Arguments:

• r: The number of successes, although this can be extended to a continuous number.
          • p: Success rate.

          Returns:

          • A SafeNegativeBinomial distribution with the specified mean.

          Examples:

          using EpiAware, Distributions
           
           bigμ = exp(48.0) #Large value of μ
           σ² = bigμ + 0.05 * bigμ^2 #Large variance
          @@ -33,7 +33,7 @@
           # output
           7.016735912097631e20
          var(d)
           # output
          -2.4617291430060293e40

          Fields

          • r::Real

          • p::Real

          source
          EpiAware.EpiAwareUtils.SafePoissonType
          struct SafePoisson{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Discrete}

Create a Poisson distribution with the specified mean that avoids InexactError when the mean is too large.

          Arguments:

          • λ: The mean of the Poisson distribution.

          Returns:

          • A SafePoisson distribution with the specified mean.

          Examples:

          using EpiAware, Distributions
          +2.4617291430060293e40

          Fields

          • r::Real

          • p::Real

          source
          EpiAware.EpiAwareUtils.SafePoissonType
          struct SafePoisson{T<:Real} <: Distributions.UnivariateDistribution{SafeIntValued}

Create a Poisson distribution with the specified mean that avoids InexactError when the mean is too large.

          Arguments:

          • λ: The mean of the Poisson distribution.

          Returns:

          • A SafePoisson distribution with the specified mean.

          Examples:

          using EpiAware, Distributions
           
           bigλ = exp(48.0) #Large value of λ
           d = SafePoisson(bigλ)
          @@ -46,7 +46,7 @@
           # output
           7.016735912097631e20
          var(d)
           # output
          -7.016735912097631e20

          Fields

          • λ::Real
          source
          EpiAware.EpiAwareUtils.get_param_arrayMethod
          get_param_array(chn::MCMCChains.Chains) -> Any
           

          Extract a parameter array from a Chains object chn that matches the shape of number of sample and chain pairs in chn.

          Arguments

          • chn::Chains: The Chains object containing the MCMC samples.

          Returns

• param_array: An array of parameter samples, where each element corresponds to a single MCMC sample as a NamedTuple.

          Example

          Sampling from a simple model which has both scalar and vector quantity random variables across 4 chains.

          using Turing, MCMCChains, EpiAware
           
           @model function testmodel()
          @@ -172,7 +172,7 @@
           mdl = testmodel()
           chn = sample(mdl, Prior(), MCMCSerial(), 2, 1, progress=false)
           
          -A = get_param_array(chn)
          source
          EpiAware.EpiAwareUtils.get_stateMethod
          get_state(
               acc_step::AbstractAccumulationStep,
               initial_state,
               state
          @@ -186,15 +186,14 @@
           
           # Returns
           - `state`: The combination of the initial state and the last element of
          -  each accumulated state.
          source
          EpiAware.EpiAwareUtils.prefix_submodelMethod
          prefix_submodel(
               model::AbstractModel,
               fn::Function,
               prefix::String,
               kwargs...
           ) -> Any
           

          Generate a submodel with an optional prefix. A lightweight wrapper around the @submodel macro from DynamicPPL.jl.

          Arguments

          • model::AbstractModel: The model to be used.
          • fn::Function: The Turing @model function to be applied to the model.
          • prefix::String: The prefix to be used. If the prefix is an empty string, the submodel is created without a prefix.

          Returns

          • submodel: The returns from the submodel are passed through.

          Examples

          using EpiAware, DynamicPPL
          -
          -submodel = prefix_submodel(FixedIntercept(0.1), generate_latent, string(1), 2)

          We can now draw a sample from the submodel.

          rand(submodel)
          source
          EpiAware.EpiAwareUtils.scanMethod
          scan(f::AbstractModel, init, xs) -> Tuple{Any, Any}
          -

          Apply f to each element of xs and accumulate the results.

          f must be a callable on a sub-type of AbstractModel.

          Design note

scan is restricted to AbstractModel sub-types to ensure: 1. that compiler specialization is activated, and 2. that potential compiler overhead from specialisation on f <: Function is avoided.

          Arguments

          • f: A callable/functor that takes two arguments, carry and x, and returns a new carry and a result y.
          • init: The initial value for the carry variable.
          • xs: An iterable collection of elements.

          Returns

          • ys: An array containing the results of applying f to each element of xs.
          • carry: The final value of the carry variable after processing all elements of xs.

          Examples

```jldoctest
using EpiAware

struct Adder <: EpiAwareBase.AbstractModel end

function (a::Adder)(carry, x)
    carry + x, carry + x
end

scan(Adder(), 0, 1:5)
# output
([1, 3, 6, 10, 15], 15)
```

          source
          EpiAware.EpiAwareUtils.spread_drawsMethod
          spread_draws(chn::MCMCChains.Chains) -> DataFrames.DataFrame
          -
          spread_draws(chn::Chains)

          Converts a Chains object into a DataFrame in tidybayes format.

          Arguments

          • chn::Chains: The Chains object to be converted.

          Returns

          • df::DataFrame: The converted DataFrame.
          source
          EpiAware.EpiAwareUtils.∫FMethod
          ∫F(dist, t, Δd) -> Any
          -

Calculate the CDF of the random variable X + U where X has cumulative distribution function F and U is a uniform random variable on [0, Δd).

          This is used in solving for censored CDFs and PMFs using numerical quadrature.

          source
          +submodel = prefix_submodel(FixedIntercept(0.1), generate_latent, string(1), 2)

          We can now draw a sample from the submodel.

          rand(submodel)
          source
          EpiAware.EpiAwareUtils.scanMethod
          scan(f::AbstractModel, init, xs) -> Tuple{Any, Any}
          +

          Apply f to each element of xs and accumulate the results.

          f must be a callable on a sub-type of AbstractModel.

          Design note

scan is restricted to AbstractModel sub-types to ensure: 1. that compiler specialization is activated, and 2. that potential compiler overhead from specialisation on f <: Function is avoided.

          Arguments

          • f: A callable/functor that takes two arguments, carry and x, and returns a new carry and a result y.
          • init: The initial value for the carry variable.
          • xs: An iterable collection of elements.

          Returns

          • ys: An array containing the results of applying f to each element of xs.
          • carry: The final value of the carry variable after processing all elements of xs.

          Examples

```jldoctest
using EpiAware

struct Adder <: EpiAwareBase.AbstractModel end

function (a::Adder)(carry, x)
    carry + x, carry + x
end

scan(Adder(), 0, 1:5)
# output
([1, 3, 6, 10, 15], 15)
```

          source
          EpiAware.EpiAwareUtils.spread_drawsMethod
          spread_draws(chn::MCMCChains.Chains) -> DataFrames.DataFrame
          +
          spread_draws(chn::Chains)

          Converts a Chains object into a DataFrame in tidybayes format.

          Arguments

          • chn::Chains: The Chains object to be converted.

          Returns

          • df::DataFrame: The converted DataFrame.
          source
          EpiAware.EpiAwareUtils.∫FMethod
          ∫F(dist, t, Δd) -> Any
          +

Calculate the CDF of the random variable X + U where X has cumulative distribution function F and U is a uniform random variable on [0, Δd).

          This is used in solving for censored CDFs and PMFs using numerical quadrature.

          source
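For reference, a short derivation of the censored CDF described above, assuming $U \sim \mathrm{Uniform}[0, \Delta d)$ is independent of $X$:

\[F_{X+U}(t) = \Pr(X + U \le t) = \frac{1}{\Delta d} \int_0^{\Delta d} F(t - u)\, \mathrm{d}u,\]

which is the quantity approximated by numerical quadrature.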
          diff --git a/previews/PR510/lib/EpiInfModels/index.html b/previews/PR510/lib/EpiInfModels/index.html index 85c1bb8c0..f0f4eb6fc 100644 --- a/previews/PR510/lib/EpiInfModels/index.html +++ b/previews/PR510/lib/EpiInfModels/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

          EpiInfModels.jl

          This package provides infectious disease transmission models for the EpiAware ecosystem.

          API

            +Overview · EpiAware.jl

            EpiInfModels.jl

            This package provides infectious disease transmission models for the EpiAware ecosystem.

            API

              diff --git a/previews/PR510/lib/EpiInfModels/internals/index.html b/previews/PR510/lib/EpiInfModels/internals/index.html index 47d2b1fba..29a64c0f8 100644 --- a/previews/PR510/lib/EpiInfModels/internals/index.html +++ b/previews/PR510/lib/EpiInfModels/internals/index.html @@ -1,5 +1,5 @@ -Internal API · EpiAware.jl

              Internal Documentation

              Documentation for EpiInfModels.jl's internal interface.

              Contents

              Index

              Internal API

              EpiAware.EpiInfModels.ConstantRenewalStepType
              struct ConstantRenewalStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep

              The renewal process iteration/step function struct with constant generation interval.

              Note that the generation interval is stored in reverse order.


              Fields

              • rev_gen_int::Vector
              source
              EpiAware.EpiInfModels.ConstantRenewalStepMethod
              function (recurrent_step::ConstantRenewalStep)(recent_incidence, Rt)

              Implement the Renewal model iteration/step function, with constant generation interval.

              Mathematical specification

              The new incidence is given by

              \[I_t = R_t \sum_{i=1}^{n-1} I_{t-i} g_i\]

              where I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.

              Arguments

• recent_incidence: Array of recent incidence values, ordered from least recent to most recent.
              • Rt: Reproduction number.

              Returns

              • Updated incidence array.
              source
              EpiAware.EpiInfModels.ConstantRenewalWithPopulationStepType
              struct ConstantRenewalWithPopulationStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep

              The renewal process iteration/step function struct with constant generation interval and a fixed population size.

              Note that the generation interval is stored in reverse order.


              Fields

              • rev_gen_int::Vector

              • pop_size::Any

              source
              EpiAware.EpiInfModels.ConstantRenewalWithPopulationStepMethod
              function (recurrent_step::ConstantRenewalWithPopulationStep)(recent_incidence_and_available_sus, Rt)

Callable on a RenewalWithPopulation struct for computing new incidence based on recent incidence, Rt and depletion of susceptibles.

              Mathematical specification

              The new incidence is given by

              \[I_t = {S_{t-1} / N} R_t \sum_{i=1}^{n-1} I_{t-i} g_i\]

              where I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.

              Arguments

• recent_incidence_and_available_sus: A tuple with an array of recent incidence values and the remaining susceptible/available individuals.

• Rt: Reproduction number.

Returns

• Vector containing the updated incidence array and the new recent_incidence_and_available_sus value.

              source
              EpiAware.EpiAwareBase.generate_latent_infsMethod
              generate_latent_infs(
              +Internal API · EpiAware.jl

              Internal Documentation

              Documentation for EpiInfModels.jl's internal interface.

              Contents

              Index

              Internal API

              EpiAware.EpiInfModels.ConstantRenewalStepType
              struct ConstantRenewalStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep

              The renewal process iteration/step function struct with constant generation interval.

              Note that the generation interval is stored in reverse order.


              Fields

              • rev_gen_int::Vector
              source
              EpiAware.EpiInfModels.ConstantRenewalStepMethod
              function (recurrent_step::ConstantRenewalStep)(recent_incidence, Rt)

              Implement the Renewal model iteration/step function, with constant generation interval.

              Mathematical specification

              The new incidence is given by

              \[I_t = R_t \sum_{i=1}^{n-1} I_{t-i} g_i\]

              where I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.

              Arguments

• recent_incidence: Array of recent incidence values, ordered from least recent to most recent.
              • Rt: Reproduction number.

              Returns

              • Updated incidence array.
              source
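A minimal sketch of the step described above. The package implements this as a callable struct; the plain function below, and the assumption that the carry is updated by dropping the oldest value and appending the new incidence, are illustrative only:

```julia
using LinearAlgebra

# recent_incidence is ordered least -> most recent and rev_gen_int stores the generation
# interval in reverse, so the dot product pairs I_{t-i} with g_i as in the formula above
function renewal_step_sketch(recent_incidence, Rt, rev_gen_int)
    I_t = Rt * dot(recent_incidence, rev_gen_int)
    return vcat(recent_incidence[2:end], I_t)   # updated incidence array
end

renewal_step_sketch([10.0, 12.0, 15.0], 1.5, reverse([0.2, 0.3, 0.5]))  # last entry ≈ 17.4
```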
              EpiAware.EpiInfModels.ConstantRenewalWithPopulationStepType
              struct ConstantRenewalWithPopulationStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep

              The renewal process iteration/step function struct with constant generation interval and a fixed population size.

              Note that the generation interval is stored in reverse order.


              Fields

              • rev_gen_int::Vector

              • pop_size::Any

              source
              EpiAware.EpiInfModels.ConstantRenewalWithPopulationStepMethod
              function (recurrent_step::ConstantRenewalWithPopulationStep)(recent_incidence_and_available_sus, Rt)

Callable on a RenewalWithPopulation struct for computing new incidence based on recent incidence, Rt and depletion of susceptibles.

              Mathematical specification

              The new incidence is given by

              \[I_t = {S_{t-1} / N} R_t \sum_{i=1}^{n-1} I_{t-i} g_i\]

              where I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.

              Arguments

• recent_incidence_and_available_sus: A tuple with an array of recent incidence values and the remaining susceptible/available individuals.

• Rt: Reproduction number.

Returns

• Vector containing the updated incidence array and the new recent_incidence_and_available_sus value.

              source
              EpiAware.EpiAwareBase.generate_latent_infsMethod
              generate_latent_infs(
                   epi_model::AbstractTuringRenewal,
                   _Rt
               ) -> Any
              @@ -18,7 +18,7 @@
               #Sample random parameters from prior
               θ = rand(latent_inf)
               #Get unobserved infections as a generated quantities from the model
              -I_t = generated_quantities(latent_inf, θ)
              source
              EpiAware.EpiAwareBase.generate_latent_infsMethod
              generate_latent_infs(
                   epi_model::DirectInfections,
                   Z_t
               ) -> Any
              @@ -37,7 +37,7 @@
               #Sample random parameters from prior
               θ = rand(latent_inf)
               #Get unobserved infections as a generated quantities from the model
              -I_t = generated_quantities(latent_inf, θ)
              source
              EpiAware.EpiAwareBase.generate_latent_infsMethod
              generate_latent_infs(epi_model::ExpGrowthRate, rt) -> Any
               

              Implement the generate_latent_infs function for the ExpGrowthRate model.

              Example usage with ExpGrowthRate type of model for unobserved infection process

              generate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.

              First, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.

              using Distributions, Turing, EpiAware
               gen_int = [0.2, 0.3, 0.5]
               g = exp
              @@ -53,7 +53,7 @@
               #Sample random parameters from prior
               θ = rand(latent_inf)
               #Get unobserved infections as a generated quantities from the model
              -I_t = generated_quantities(latent_inf, θ)
              source
              EpiAware.EpiAwareBase.generate_latent_infsMethod
              generate_latent_infs(
                   epi_model::ODEProcess,
                   params::ODEParams
               ) -> Any
              @@ -78,29 +78,29 @@
                       sol2infs = sol -> sol[1, :])
               
               # Generate the latent infections
              -I_t = generate_latent_infs(expgrowth_model, params)()
              source
              EpiAware.EpiAwareUtils.get_stateMethod
              get_state(
                   acc_step::EpiAware.EpiInfModels.ConstantRenewalStep,
                   initial_state,
                   state
               ) -> Any
              -

              Method to get the state of the accumulation for a ConstantRenewalStep object.

              source
              EpiAware.EpiAwareUtils.get_stateMethod
              get_state(
                   acc_step::EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep,
                   initial_state,
                   state
               ) -> Any
              -

              Method to get the state of the accumulation for a ConstantRenewalWithPopulationStep object.

              source
              EpiAware.EpiInfModels.make_renewal_initMethod
              make_renewal_init(epi_model::Renewal, I₀, Rt₀) -> Any
              -

              Create the initial state of the Renewal model.

              Arguments

              • epi_model::Renewal: The Renewal model.
              • I₀: The initial number of infected individuals.
              • Rt₀: The initial time-varying reproduction number.

              Returns

              The initial vector of infected individuals.

              source
              EpiAware.EpiInfModels.neg_MGFMethod
              neg_MGF(r, w::AbstractVector) -> Any
              -

              Compute the negative moment generating function (MGF) for a given rate r and weights w.

              Arguments

              • r: The rate parameter.
              • w: An abstract vector of weights.

              Returns

              The value of the negative MGF.

              source
              EpiAware.EpiInfModels.make_renewal_initMethod
              make_renewal_init(epi_model::Renewal, I₀, Rt₀) -> Any
              +

              Create the initial state of the Renewal model.

              Arguments

              • epi_model::Renewal: The Renewal model.
              • I₀: The initial number of infected individuals.
              • Rt₀: The initial time-varying reproduction number.

              Returns

              The initial vector of infected individuals.

              source
              EpiAware.EpiInfModels.neg_MGFMethod
              neg_MGF(r, w::AbstractVector) -> Any
              +

              Compute the negative moment generating function (MGF) for a given rate r and weights w.

              Arguments

              • r: The rate parameter.
              • w: An abstract vector of weights.

              Returns

              The value of the negative MGF.

              source
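The docstring above does not give the formula. Under the definition conventionally used for the growth-rate/reproduction-number relationship (an assumption here, not stated above), the negative MGF of weights $w$ evaluated at rate $r$ is

\[\mathrm{neg\_MGF}(r, w) = \sum_{i=1}^{n} w_i e^{-r i},\]

so that, for generation-interval weights, the reproduction number and growth rate are linked approximately by $\mathcal{R} \approx 1 / \mathrm{neg\_MGF}(r, w)$.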
              EpiAware.EpiInfModels.renewal_init_stateMethod
              renewal_init_state(
                   recurrent_step::EpiAware.EpiInfModels.ConstantRenewalStep,
                   I₀,
                   r_approx,
                   len_gen_int
               ) -> Any
              -

              Constructs the initial conditions for a renewal model with ConstantRenewalStep type of step function.

              source
              EpiAware.EpiInfModels.renewal_init_stateMethod
              renewal_init_state(
                   recurrent_step::EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep,
                   I₀,
                   r_approx,
                   len_gen_int
               ) -> Any
              -

              Constructs the initial conditions for a renewal model with ConstantRenewalWithPopulationStep type of step function.

              source
              +

              Constructs the initial conditions for a renewal model with ConstantRenewalWithPopulationStep type of step function.

              source
              diff --git a/previews/PR510/lib/EpiInfModels/public/index.html b/previews/PR510/lib/EpiInfModels/public/index.html index c42d23149..dfeb086cd 100644 --- a/previews/PR510/lib/EpiInfModels/public/index.html +++ b/previews/PR510/lib/EpiInfModels/public/index.html @@ -1,5 +1,5 @@ -Public API · EpiAware.jl

              Public Documentation

              Documentation for EpiInfModels.jl's public interface.

              See the Internals section of the manual for internal package docs covering all submodules.

              Contents

              Index

              Public API

              EpiAware.EpiInfModels.DirectInfectionsType
              struct DirectInfections{S<:Distributions.Sampleable} <: AbstractTuringEpiModel

              Model unobserved/latent infections as a transformation on a sampled latent process.

              Mathematical specification

              If $Z_t$ is a realisation of the latent model, then the unobserved/latent infections are given by

              \[I_t = g(\hat{I}_0 + Z_t).\]

              where $g$ is a transformation function and the unconstrained initial infections $\hat{I}_0$ are sampled from a prior distribution.

              DirectInfections are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of $\hat{I}_0$. The default initialisation_prior is Normal().

              Constructors

              • DirectInfections(; data, initialisation_prior)

              Example usage with generate_latent_infs

              generate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.

              First, we construct a DirectInfections struct with an EpiData object, an initialisation prior and a transformation function.

              using Distributions, Turing, EpiAware
              +Public API · EpiAware.jl

              Public Documentation

              Documentation for EpiInfModels.jl's public interface.

              See the Internals section of the manual for internal package docs covering all submodules.

              Contents

              Index

              Public API

              EpiAware.EpiInfModels.DirectInfectionsType
              struct DirectInfections{S<:Distributions.Sampleable} <: AbstractTuringEpiModel

              Model unobserved/latent infections as a transformation on a sampled latent process.

              Mathematical specification

              If $Z_t$ is a realisation of the latent model, then the unobserved/latent infections are given by

              \[I_t = g(\hat{I}_0 + Z_t).\]

              where $g$ is a transformation function and the unconstrained initial infections $\hat{I}_0$ are sampled from a prior distribution.

              DirectInfections are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of $\hat{I}_0$. The default initialisation_prior is Normal().

              Constructors

              • DirectInfections(; data, initialisation_prior)

              Example usage with generate_latent_infs

              generate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.

              First, we construct a DirectInfections struct with an EpiData object, an initialisation prior and a transformation function.

              using Distributions, Turing, EpiAware
               gen_int = [0.2, 0.3, 0.5]
               g = exp
               
              @@ -14,7 +14,7 @@
               #Sample random parameters from prior
               θ = rand(latent_inf)
               #Get unobserved infections as a generated quantities from the model
              -I_t = generated_quantities(latent_inf, θ)

              Fields

              • data::EpiData: Epidata object.

              • initialisation_prior::Distributions.Sampleable: Prior distribution for the initialisation of the infections. Default is Normal().

              source
              EpiAware.EpiInfModels.EpiDataType
              struct EpiData{T<:Real, F<:Function}

              The EpiData struct represents epidemiological data used in infectious disease modeling.

              Constructors

• EpiData(gen_int, transformation::Function). Constructs an EpiData object with discrete generation interval gen_int and transformation function transformation.

• EpiData(;gen_distribution::ContinuousDistribution, D_gen, Δd = 1.0, transformation::Function = exp). Constructs an EpiData object with double interval censoring discretisation of the continuous next generation interval distribution gen_distribution with additional right truncation at D_gen. Δd sets the interval width (default = 1.0). transformation sets the transformation function.

              Examples

Construction directly from a discrete generation interval and transformation function:

              using EpiAware
              +I_t = generated_quantities(latent_inf, θ)

              Fields

              • data::EpiData: Epidata object.

              • initialisation_prior::Distributions.Sampleable: Prior distribution for the initialisation of the infections. Default is Normal().

              source
              EpiAware.EpiInfModels.EpiDataType
              struct EpiData{T<:Real, F<:Function}

              The EpiData struct represents epidemiological data used in infectious disease modeling.

              Constructors

• EpiData(gen_int, transformation::Function). Constructs an EpiData object with discrete generation interval gen_int and transformation function transformation.

• EpiData(;gen_distribution::ContinuousDistribution, D_gen, Δd = 1.0, transformation::Function = exp). Constructs an EpiData object with double interval censoring discretisation of the continuous next generation interval distribution gen_distribution with additional right truncation at D_gen. Δd sets the interval width (default = 1.0). transformation sets the transformation function.

              Examples

Construction directly from a discrete generation interval and transformation function:

              using EpiAware
               gen_int = [0.2, 0.3, 0.5]
               g = exp
               data = EpiData(gen_int, g)

              Construction from continuous distribution for generation interval.

              using Distributions
              @@ -22,7 +22,7 @@
               gen_distribution = Uniform(0.0, 10.0)
               
               data = EpiData(;gen_distribution
              -    D_gen = 10.0)

              Fields

              • gen_int::Vector{T} where T<:Real: Discrete generation interval.

              • len_gen_int::Integer: Length of the discrete generation interval.

              • transformation::Function: Transformation function defining constrained and unconstrained domain bijections.

              source
              EpiAware.EpiInfModels.ExpGrowthRateType
              struct ExpGrowthRate{S<:Distributions.Sampleable} <: AbstractTuringEpiModel

Model unobserved/latent infections as arising from a time-varying exponential growth rate $r_t$, which is generated by a latent process.

              Mathematical specification

              If $Z_t$ is a realisation of the latent model, then the unobserved/latent infections are given by

              \[I_t = g(\hat{I}_0) \exp(Z_t).\]

              where $g$ is a transformation function and the unconstrained initial infections $\hat{I}_0$ are sampled from a prior distribution.

              ExpGrowthRate are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of $\hat{I}_0$. The default initialisation_prior is Normal().

              Constructor

              • ExpGrowthRate(; data, initialisation_prior).

              Example usage with generate_latent_infs

              generate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.

              First, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.

              using Distributions, Turing, EpiAware
              +    D_gen = 10.0)

              Fields

              • gen_int::Vector{T} where T<:Real: Discrete generation interval.

              • len_gen_int::Integer: Length of the discrete generation interval.

              • transformation::Function: Transformation function defining constrained and unconstrained domain bijections.

              source
              EpiAware.EpiInfModels.ExpGrowthRateType
              struct ExpGrowthRate{S<:Distributions.Sampleable} <: AbstractTuringEpiModel

Model unobserved/latent infections as arising from a time-varying exponential growth rate $r_t$, which is generated by a latent process.

              Mathematical specification

              If $Z_t$ is a realisation of the latent model, then the unobserved/latent infections are given by

              \[I_t = g(\hat{I}_0) \exp(Z_t).\]

              where $g$ is a transformation function and the unconstrained initial infections $\hat{I}_0$ are sampled from a prior distribution.

              ExpGrowthRate are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of $\hat{I}_0$. The default initialisation_prior is Normal().

              Constructor

              • ExpGrowthRate(; data, initialisation_prior).

              Example usage with generate_latent_infs

              generate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.

              First, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.

              using Distributions, Turing, EpiAware
               gen_int = [0.2, 0.3, 0.5]
               g = exp
               
              @@ -37,12 +37,12 @@
               #Sample random parameters from prior
               θ = rand(latent_inf)
               #Get unobserved infections as a generated quantities from the model
              -I_t = generated_quantities(latent_inf, θ)

              Fields

              • data::EpiData

              • initialisation_prior::Distributions.Sampleable

              source
              EpiAware.EpiInfModels.ODEParamsType
              struct ODEParams{T}

              A structure to hold the initial condition and parameters for an ODE (Ordinary Differential Equation) process. params::ODEParams is used in the method generate_latent_infs(epi_model::ODEProcess, params::ODEParams)

              Constructors

              • ODEParams(; u0::VecOrMat, p::VecOrMat): Create an ODEParams object with the initial condition(s) u0 and parameters p.

              Example

              using EpiAware
              +I_t = generated_quantities(latent_inf, θ)

              Fields

              • data::EpiData

              • initialisation_prior::Distributions.Sampleable

              source
              EpiAware.EpiInfModels.ODEParamsType
              struct ODEParams{T}

              A structure to hold the initial condition and parameters for an ODE (Ordinary Differential Equation) process. params::ODEParams is used in the method generate_latent_infs(epi_model::ODEProcess, params::ODEParams)

              Constructors

              • ODEParams(; u0::VecOrMat, p::VecOrMat): Create an ODEParams object with the initial condition(s) u0 and parameters p.

              Example

              using EpiAware
               params = ODEParams(; u0 = ones(10), p = [2, 3])
               
               # output
               
              -ODEParams{Float64}([1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0], [2.0, 3.0])

              Fields

              • u0::VecOrMat: The initial condition(s) for the ODE, which can be a vector or matrix of type T.

              • p::VecOrMat: The parameters for the ODE, which can be a vector or matrix of type T.

              source
              EpiAware.EpiInfModels.ODEProcessType
              struct ODEProcess{P<:SciMLBase.ODEProblem, T, S, F<:Function} <: AbstractTuringEpiModel

              A structure representing an infection process modeled by an Ordinary Differential Equation (ODE).

              Background

The purpose of this structure is to define the behaviour of modelling an infection process using an ODE. We use the SciML ecosystem to define and solve the ODE. For ODEProcess structs we restrict attention to a limited class of ODE problems:

              • The initial condition u0 must be a vector or matrix.
              • The parameters p must be a vector or matrix.
• The output of the ODE should be interpreted as the infection incidence at each time point in ts via the function sol2infs, which maps the solution object sol of the ODE solve to infection counts.

              Constructors

• ODEProcess(prob::ODEProblem; ts, solver, sol2infs): Create an ODEProcess object with the ODE problem prob, time points ts, solver solver, and function sol2infs.

              Example

              using EpiAware, OrdinaryDiffEq
               r = log(2) / 7 # Growth rate corresponding to 7 day doubling time
               u0 = [1.0]
               p = [r]
 1.9999990356297939
 2.2081789476865237
 2.438027196361022
 2.6918002758361723
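
A hedged sketch of assembling an ODEProcess for the exponential-growth ODE dI/dt = r I, using the keyword constructor listed above; the expgrowth function, solver choice and sol2infs mapping are illustrative rather than the package's own example:

using EpiAware, OrdinaryDiffEq
expgrowth(du, u, p, t) = (du[1] = p[1] * u[1])   # dI/dt = r * I
prob = ODEProblem(expgrowth, [1.0], (0.0, 10.0), [log(2) / 7])
infs = ODEProcess(prob; ts = collect(0.0:1.0:10.0), solver = Tsit5(),
    sol2infs = sol -> Array(sol)[1, :])          # map the ODE solution to incidence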

              Fields

              • prob::SciMLBase.ODEProblem: The ODE problem instance, where P is a subtype of ODEProblem.

              • ts::Vector: A vector of time points, where T is the type of the time points.

              • solver::Any: The solver used for the ODE problem.

              • sol2infs::Function: A function that maps the solution object of the ODE to infection counts.

              source
              EpiAware.EpiInfModels.RenewalType
              struct Renewal{E, S<:Distributions.Sampleable, A} <: AbstractTuringRenewal

              Model unobserved/latent infections as due to time-varying Renewal model with reproduction number $\mathcal{R}_t$ which is generated by a latent process.

              Mathematical specification

              If $Z_t$ is a realisation of the latent model, then the unobserved/latent infections are given by

\[\begin{align}
\mathcal{R}_t &= g(Z_t),\\
I_t &= \mathcal{R}_t \sum_{i=1}^{n-1} I_{t-i} g_i, \qquad t \geq 1, \\
I_t &= g(\hat{I}_0) \exp(r(\mathcal{R}_1) t), \qquad t \leq 0.
\end{align}\]

#Sample random parameters from prior
θ = rand(latent_inf)
#Get unobserved infections as a generated quantities from the model
I_t = generated_quantities(latent_inf, θ)
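
The renewal recursion in the specification above can be illustrated directly; a minimal sketch with a fixed reproduction number path and a three-element generation interval (all values illustrative, not the package's recurrent step):

g_int = [0.2, 0.3, 0.5]            # discretised generation interval g_i
R = fill(1.5, 10)                  # reproduction number path R_t
I = vcat(ones(3), zeros(10))       # three initial infections, then t = 1,…,10
for t in 1:10
    I[t + 3] = R[t] * sum(I[t + 3 - i] * g_int[i] for i in 1:3)
end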


              Fields

              • data::Any

              • initialisation_prior::Distributions.Sampleable

              • recurrent_step::Any

              source
              EpiAware.EpiInfModels.R_to_rMethod
              R_to_r(
                   R₀,
                   w::Array{T<:AbstractFloat, 1};
                   newton_steps,
                   Δd
               ) -> Any

              This function computes an approximation to the exponential growth rate r given the reproductive ratio R₀ and the discretized generation interval w with discretized interval width Δd. This is based on the implicit solution of

              \[G(r) - {1 \over R_0} = 0.\]

              where

              \[G(r) = \sum_{i=1}^n w_i e^{-r i}.\]

              is the negative moment generating function (MGF) of the generation interval distribution.

The two-step approximation is based on: 1. Direct solution of the implicit equation under a small-r approximation. 2. Improving the approximation using Newton's method for a fixed number of steps newton_steps.
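
A minimal numerical sketch of this two-step scheme (not the package implementation; the small-r starting value and the handling of Δd are illustrative):

function approx_R_to_r(R0, w; newton_steps = 4, Δd = 1.0)
    idx = 1:length(w)
    G(r) = sum(w .* exp.(-r .* idx .* Δd))            # the generation-interval sum above
    dG(r) = -sum(w .* idx .* Δd .* exp.(-r .* idx .* Δd))
    mean_gi = sum(w .* idx) * Δd
    r = (R0 - 1) / (R0 * mean_gi)                      # small-r starting guess
    for _ in 1:newton_steps
        r -= (G(r) - 1 / R0) / dG(r)                   # Newton update on G(r) - 1/R0 = 0
    end
    return r
end
approx_R_to_r(1.5, [0.2, 0.3, 0.5])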

              Returns:

              • The approximate value of r.
              source
              EpiAware.EpiInfModels.expected_RtMethod
              expected_Rt(
                   data::EpiData,
                   infections::Vector{<:Real}
               ) -> Any
               
               data = EpiData([0.2, 0.3, 0.5], exp)
               infections = [100, 200, 300, 400, 500]
expected_Rt(data, infections)
              source
              EpiAware.EpiInfModels.r_to_RMethod
              r_to_R(r, w::AbstractVector) -> Any
              r_to_R(r, w)

              Compute the reproductive ratio given exponential growth rate r and discretized generation interval w.

              Arguments

              • r: The exponential growth rate.
              • w: discretized generation interval.

              Returns

              • The reproductive ratio.
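
Per the same generation-interval relationship used by R_to_r above, the forward map is a one-line computation (illustrative sketch, not the package code):

r, w = 0.1, [0.2, 0.3, 0.5]
R0 = 1 / sum(w .* exp.(-r .* (1:length(w))))   # R = 1 / G(r)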
              source
              diff --git a/previews/PR510/lib/EpiInference/index.html b/previews/PR510/lib/EpiInference/index.html index 6782fa660..18add11d1 100644 --- a/previews/PR510/lib/EpiInference/index.html +++ b/previews/PR510/lib/EpiInference/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl
              diff --git a/previews/PR510/lib/EpiInference/internals/index.html b/previews/PR510/lib/EpiInference/internals/index.html index 00bfe8bac..ee88c4631 100644 --- a/previews/PR510/lib/EpiInference/internals/index.html +++ b/previews/PR510/lib/EpiInference/internals/index.html @@ -10,7 +10,7 @@ prev_result; kwargs... ) -> Any -

              Apply a ManyPathfinder method to a DynamicPPL.Model object.

              If prev_result is a vector of real numbers, then the ManyPathfinder method is applied with the initial values set to prev_result. Otherwise, the ManyPathfinder method is run with default initial values generated.

              source
              EpiAware.EpiAwareBase._apply_methodFunction
              _apply_method(
                   model::DynamicPPL.Model,
                   method::NUTSampler;
                   ...
                   prev_result;
                   kwargs...
               ) -> Any

              Apply NUTS sampling to a DynamicPPL.Model object with prev_result representing any initial results to use for sampler initialisation.

              source
              EpiAware.EpiInference._apply_nutsMethod
              _apply_nuts(model, method, prev_result; kwargs...) -> Any

              No initialisation NUTS.

              source
              EpiAware.EpiInference._apply_nutsMethod
              _apply_nuts(
                   model,
                   method,
                   prev_result::Pathfinder.PathfinderResult;
                   kwargs...
               ) -> Any

              Initialise NUTS with initial parameters from a Pathfinder result.

              source
              EpiAware.EpiInference._continue_manypathfinder!Method
              _continue_manypathfinder!(
                   pfs,
                   mdl::DynamicPPL.Model;
                   max_tries,
                   nruns,
                   kwargs...
               )

              Continue running the pathfinder algorithm until a pathfinder succeeds or the maximum number of tries is reached.

              Arguments

              • pfs: An array of pathfinder objects.
              • mdl::DynamicPPL.Model: The model to perform inference on.
              • max_tries: The maximum number of tries to run the pathfinder algorithm. Default is Inf.
              • nruns: The number of times to run the pathfinder function.
              • kwargs...: Additional keyword arguments passed to pathfinder.

              Returns

              • pfs: The updated array of pathfinder objects.
              source
              EpiAware.EpiInference._get_best_elbo_pathfinderMethod
              _get_best_elbo_pathfinder(pfs) -> Any

              Selects the pathfinder with the highest ELBO estimate from a list of pathfinders.

              Arguments

              • pfs: A list of pathfinders results or Symbol values indicating failure.

              Returns

              The pathfinder with the highest ELBO estimate.

              source
              EpiAware.EpiInference._run_manypathfinderMethod
              _run_manypathfinder(mdl::DynamicPPL.Model; nruns, kwargs...)

              Run pathfinder multiple times and store the results in an array. Fails safely.

              Arguments

              • mdl::DynamicPPL.Model: The Turing model to be used for inference.
              • nruns: The number of times to run the pathfinder function.
              • kwargs...: Additional keyword arguments passed to pathfinder.

              Returns

              An array of PathfinderResult objects or Symbol values indicating success or failure.

              source
              diff --git a/previews/PR510/lib/EpiInference/public/index.html b/previews/PR510/lib/EpiInference/public/index.html index 3a7be0c6f..8a48fd046 100644 --- a/previews/PR510/lib/EpiInference/public/index.html +++ b/previews/PR510/lib/EpiInference/public/index.html @@ -1,5 +1,5 @@ -Public API · EpiAware.jl

              Public Documentation

              Documentation for EpiInference.jl's public interface.

              See the Internals section of the manual for internal package docs covering all submodules.

              Contents

              Index

              Public API

              EpiAware.EpiInference.ManyPathfinderType
              struct ManyPathfinder <: AbstractEpiOptMethod

              A variational inference method that runs manypathfinder.


              Fields

              • ndraws::Int64: Number of draws per pathfinder run.

              • nruns::Int64: Number of many pathfinder runs.

              • maxiters::Int64: Maximum number of optimization iterations for each run.

              • max_tries::Int64: Maximum number of tries if all runs fail.

              source
              EpiAware.EpiInference.NUTSamplerType
              struct NUTSampler{A<:ADTypes.AbstractADType, E<:AbstractMCMC.AbstractMCMCEnsemble, M} <: AbstractEpiSamplingMethod

              A NUTS method for sampling from a DynamicPPL.Model object.

              The NUTSampler struct represents using the No-U-Turn Sampler (NUTS) to sample from the distribution defined by a DynamicPPL.Model.


              Fields

              • target_acceptance::Float64: The target acceptance rate for the sampler.

              • adtype::ADTypes.AbstractADType: The automatic differentiation type used for computing gradients.

              • mcmc_parallel::AbstractMCMC.AbstractMCMCEnsemble: The parallelization strategy for the MCMC sampler.

              • nchains::Int64: The number of MCMC chains to run.

              • max_depth::Int64: Tree depth limit for the NUTS sampler.

              • Δ_max::Float64: Divergence threshold for the NUTS sampler.

              • init_ϵ::Float64: The initial step size for the NUTS sampler.

              • ndraws::Int64: The number of samples to draw from each chain.

              • metricT::Any: The metric type to use for the HMC sampler.

              • nadapts::Int64: number of adaptation steps

              source
              EpiAware.EpiInference.manypathfinderMethod
              manypathfinder(
                   mdl::DynamicPPL.Model,
                   ndraws;
                   nruns,
                   max_tries,
                   kwargs...
               ) -> Any

Run multiple instances of the pathfinder algorithm and return the pathfinder run with the largest ELBO estimate.

              Arguments

              • mdl::DynamicPPL.Model: The model to perform inference on.
              • nruns::Int: The number of pathfinder runs to perform.
              • ndraws::Int: The number of draws per pathfinder run, readjusted to be at least as large as the number of chains.
              • nchains::Int: The number of chains that will be initialised by pathfinder draws.
              • maxiters::Int: The maximum number of optimizer iterations per pathfinder run.
              • max_tries::Int: The maximum number of extra tries to find a valid pathfinder result.
              • kwargs...: Additional keyword arguments passed to pathfinder.

              Returns

              • best_pfs::PathfinderResult: Best pathfinder result by estimated ELBO.
              source
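
A hedged usage sketch with a toy Turing model (the model, observed value and keyword settings are illustrative, and manypathfinder is assumed to be exported as listed on this page; exact defaults may differ):

using Turing, EpiAware
@model function toy_normal()
    μ ~ Normal(0, 1)
    2.0 ~ Normal(μ, 1)    # a single observed data point
end
best_pf = manypathfinder(toy_normal(), 100; nruns = 4, maxiters = 50, max_tries = 10)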
              diff --git a/previews/PR510/lib/EpiLatentModels/index.html b/previews/PR510/lib/EpiLatentModels/index.html index 0e957d751..59224e2d7 100644 --- a/previews/PR510/lib/EpiLatentModels/index.html +++ b/previews/PR510/lib/EpiLatentModels/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl
              diff --git a/previews/PR510/lib/EpiLatentModels/internals/index.html b/previews/PR510/lib/EpiLatentModels/internals/index.html index 1708a2a84..70f993f0b 100644 --- a/previews/PR510/lib/EpiLatentModels/internals/index.html +++ b/previews/PR510/lib/EpiLatentModels/internals/index.html @@ -1,14 +1,14 @@ -Internal API · EpiAware.jl

              Internal Documentation

              Documentation for EpiLatentModels.jl's internal interface.

              Contents

              Index

              Internal API

              EpiAware.EpiLatentModels.ARStepType
              struct ARStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep

              The autoregressive (AR) step function struct


              Fields

              • damp_AR::AbstractVector{<:Real}
              source
              EpiAware.EpiAwareBase.broadcast_nMethod
              broadcast_n(_::RepeatBlock, n, period) -> Any

A function that returns the length of the latent periods to generate using the RepeatBlock rule, which is equal to n divided by the period, rounded up to the nearest integer.

              Arguments

              • rule::RepeatBlock: The broadcasting rule.
              • n: The number of samples to generate.
              • period: The period of the broadcast.
              source
              EpiAware.EpiAwareBase.broadcast_nMethod
              broadcast_n(_::RepeatEach, n, period) -> Any

              A function that returns the length of the latent periods to generate using the RepeatEach rule which is equal to the period.

              Arguments

              • rule::RepeatEach: The broadcasting rule.
              • n: The number of samples to generate.
              • period: The period of the broadcast.

              Returns

              • m: The length of the latent periods to generate.
              source
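
An illustrative comparison of the two rules described above (not the package internals): for n samples with a given period, RepeatEach generates period latent values, while RepeatBlock generates n divided by the period rounded up.

n, period = 10, 7
m_each = period            # RepeatEach
m_block = cld(n, period)   # RepeatBlock: n / period rounded up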
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(latent_model::AR, n) -> Any

              Generate a latent AR series.

              Arguments

              • latent_model::AR: The AR model.
              • n::Int: The length of the AR series.

              Returns

              • ar::Vector{Float64}: The generated AR series.

              Notes

              • The length of damp_prior and init_prior must be the same.
• n must be greater than the order of the autoregressive process.
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(model::BroadcastLatentModel, n) -> Any

              Generates latent periods using the specified model and n number of samples.

              Arguments

              • model::BroadcastLatentModel: The broadcast latent model.
              • n::Any: The number of samples to generate.

              Returns

              • broadcasted_latent: The generated broadcasted latent periods.
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(
                   latent_models::CombineLatentModels,
                   n
               ) -> Any

              Generate latent variables using a combination of multiple latent models.

              Arguments

              • latent_models::CombineLatentModels: An instance of the CombineLatentModels type representing the collection of latent models.
              • n: The number of latent variables to generate.

              Returns

              • The combined latent variables generated from all the models.

              Example
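
A usage sketch adapted from the CombineLatentModels type docstring on the public API page:

using EpiAware, Distributions
combined_model = CombineLatentModels([Intercept(Normal(2, 0.2)), AR()])
latent_model = generate_latent(combined_model, 10)
latent_model()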

              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(latent_models::ConcatLatentModels, n) -> Any

              Generate latent variables by concatenating multiple latent models.

              Arguments

              • latent_models::ConcatLatentModels: An instance of the ConcatLatentModels type representing the collection of latent models.
              • n: The number of latent variables to generate.

              Returns

              • concatenated_latents: The combined latent variables generated from all the models.
              • latent_aux: A tuple containing the auxiliary latent variables generated from each individual model.
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(latent_model::DiffLatentModel, n) -> Any
               

              Generate a Turing model for n-step latent process $Z_t$ using a differenced latent model defined by latent_model.

              Arguments

              • latent_model::DiffLatentModel: The differential latent model.
              • n: The length of the latent variables.

              Turing model specifications

              Sampled random variables

              • latent_init: The initial latent process variables.
              • Other random variables defined by model<:AbstractTuringLatentModel field of the undifferenced model.

              Generated quantities

              • A tuple containing the generated latent process as its first argument and a NamedTuple of sampled auxiliary variables as second argument.

              Example usage with DiffLatentModel model constructor

              generate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.

              First, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.

              using Distributions, EpiAware
               rw = RandomWalk(Normal(0.0, 1.0), truncated(Normal(0.0, 0.05), 0.0, Inf))

              Then, we can use DiffLatentModel to construct a DiffLatentModel for d-fold differenced process with rw as the undifferenced latent process.

              We have two constructor options for DiffLatentModel. The first option is to supply a common prior distribution for the initial terms and specify d as follows:

              diff_model = DiffLatentModel(rw, Normal(); d = 2)

              Or we can supply a vector of priors for the initial terms and d is inferred as follows:

              diff_model2 = DiffLatentModel(;undiffmodel = rw, init_priors = [Normal(), Normal()])

              Then, we can use generate_latent to construct a Turing model for the differenced latent process generating a length n process,

              # Construct a Turing model
               n = 100
 difference_mdl = generate_latent(diff_model, n)

Now we can use the Turing PPL API to sample underlying parameters and generate the unobserved latent process.

#Sample random parameters from prior
               θ = rand(difference_mdl)
               #Get a sampled latent process as a generated quantity from the model
               (Z_t, _) = generated_quantities(difference_mdl, θ)
Z_t
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(latent_model::FixedIntercept, n) -> Any

              Generate a latent intercept series with a fixed intercept value.

              Arguments

              • latent_model::FixedIntercept: The fixed intercept latent model.
              • n: The number of latent variables to generate.

              Returns

              • latent_vars: An array of length n filled with the fixed intercept value.
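
A minimal sketch (the intercept value and names are illustrative; the generated quantity is assumed to follow the (values, auxiliary) pattern used in the RandomWalk example below):

using EpiAware, Turing
fi = FixedIntercept(0.1)
fi_mdl = generate_latent(fi, 5)
θ = rand(fi_mdl)
latent_vars, _ = generated_quantities(fi_mdl, θ)   # five copies of 0.1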
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(obs_model::HierarchicalNormal, n) -> Any
              function EpiAwareBase.generate_latent(obs_model::HierarchicalNormal, n)

              Generate latent variables from the hierarchical normal distribution.

              Arguments

              • obs_model::HierarchicalNormal: The hierarchical normal distribution model.
              • n: Number of latent variables to generate.

              Returns

              • η_t: Generated latent variables.
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(latent_model::Intercept, n) -> Any

              Generate a latent intercept series.

              Arguments

              • latent_model::Intercept: The intercept model.
              • n::Int: The length of the intercept series.

              Returns

              • intercept::Vector{Float64}: The generated intercept series.
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(latent_model::RandomWalk, n) -> Any
               

              Implement the generate_latent function for the RandomWalk model.

              Example usage of generate_latent with RandomWalk type of latent process model

              using Distributions, Turing, EpiAware
               
               # Create a RandomWalk model
               rw_model = generate_latent(rw, 10)

              Now we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.

              #Sample random parameters from prior
               θ = rand(rw_model)
               #Get random walk sample path as a generated quantities from the model
Z_t, _ = generated_quantities(rw_model, θ)
              source
              EpiAware.EpiAwareBase.generate_latentMethod
              generate_latent(model::TransformLatentModel, n) -> Any
              generate_latent(model::TransformLatentModel, n)

              Generate latent variables using the specified TransformLatentModel.

              Arguments

              • model::TransformLatentModel: The TransformLatentModel to generate latent variables from.
              • n: The number of latent variables to generate.

              Returns

              • The transformed latent variables.
              source
              diff --git a/previews/PR510/lib/EpiLatentModels/public/index.html b/previews/PR510/lib/EpiLatentModels/public/index.html index 0392ffb44..f920b629b 100644 --- a/previews/PR510/lib/EpiLatentModels/public/index.html +++ b/previews/PR510/lib/EpiLatentModels/public/index.html @@ -1,39 +1,39 @@ -Public API · EpiAware.jl

              Public Documentation

              Documentation for EpiLatentModels.jl's public interface.

              See the Internals section of the manual for internal package docs covering all submodules.

              Contents

              Index

              Public API

              EpiAware.EpiLatentModels.ARType
              struct AR{D<:Distributions.Sampleable, S<:Distributions.Sampleable, I<:Distributions.Sampleable, P<:Int64} <: AbstractTuringLatentModel

              The autoregressive (AR) model struct.

              Constructors

              • AR(damp_prior::Distribution, std_prior::Distribution, init_prior::Distribution; p::Int = 1): Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model can also be specified.

              • AR(; damp_priors::Vector{D} = [truncated(Normal(0.0, 0.05))], std_prior::Distribution = truncated(Normal(0.0, 0.05), 0.0, Inf), init_priors::Vector{I} = [Normal()]) where {D <: Distribution, I <: Distribution}: Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model is determined by the length of the damp_priors vector.

              • AR(damp_prior::Distribution, std_prior::Distribution, init_prior::Distribution, p::Int): Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model is explicitly specified.

              Examples

              using Distributions
               using EpiAware
               ar = AR()
               ar_model = generate_latent(ar, 10)
rand(ar_model)

              Fields

              • damp_prior::Distributions.Sampleable: Prior distribution for the damping coefficients.

              • std_prior::Distributions.Sampleable: Prior distribution for the standard deviation.

              • init_prior::Distributions.Sampleable: Prior distribution for the initial conditions

              • p::Int64: Order of the AR model.

              source
              EpiAware.EpiLatentModels.BroadcastLatentModelType
              struct BroadcastLatentModel{M<:AbstractTuringLatentModel, P<:Integer, B<:AbstractBroadcastRule} <: AbstractTuringLatentModel

              The BroadcastLatentModel struct represents a latent model that supports broadcasting of latent periods.

              Constructors

• BroadcastLatentModel(; model::M, period::Int, broadcast_rule::B): Constructs a BroadcastLatentModel with the given model, period, and broadcast_rule.
              • BroadcastLatentModel(model::M, period::Int, broadcast_rule::B): An alternative constructor that allows the model, period, and broadcast_rule to be specified without keyword arguments.

              Examples

              using EpiAware, Turing
               each_model = BroadcastLatentModel(RandomWalk(), 7, RepeatEach())
               gen_each_model = generate_latent(each_model, 10)
               rand(gen_each_model)
               
               block_model = BroadcastLatentModel(RandomWalk(), 3, RepeatBlock())
               gen_block_model = generate_latent(block_model, 10)
rand(gen_block_model)

              Fields

              • model::AbstractTuringLatentModel: The underlying latent model.

              • period::Integer: The period of the broadcast.

              • broadcast_rule::AbstractBroadcastRule: The broadcast rule to be applied.

              source
              EpiAware.EpiLatentModels.CombineLatentModelsType
              struct CombineLatentModels{M<:(AbstractVector{<:AbstractTuringLatentModel}), P<:(AbstractVector{<:String})} <: AbstractTuringLatentModel

              The CombineLatentModels struct.

This struct is used to combine multiple latent models into a single latent model. If prefixes are supplied, each model is wrapped with PrefixLatentModel.

              Constructors

              • CombineLatentModels(models::M, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{<:String}}: Constructs a CombineLatentModels instance with specified models and prefixes, ensuring that there are at least two models and the number of models and prefixes are equal.
• CombineLatentModels(models::M) where {M <: AbstractVector{<:AbstractTuringLatentModel}}: Constructs a CombineLatentModels instance with specified models, automatically generating prefixes for each model. The automatic prefixes are of the form Combine.1, Combine.2, etc.

              Examples

              using EpiAware, Distributions
               combined_model = CombineLatentModels([Intercept(Normal(2, 0.2)), AR()])
               latent_model = generate_latent(combined_model, 10)
latent_model()

              Fields

              • models::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models

              • prefixes::AbstractVector{<:String}: A vector of prefixes for the latent models

              source
              EpiAware.EpiLatentModels.ConcatLatentModelsType
              struct ConcatLatentModels{M<:(AbstractVector{<:AbstractTuringLatentModel}), N<:Int64, F<:Function, P<:(AbstractVector{<:String})} <: AbstractTuringLatentModel

              The ConcatLatentModels struct.

              This struct is used to concatenate multiple latent models into a single latent model.

              Constructors

              • ConcatLatentModels(models::M, no_models::I, dimension_adaptor::F, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, I <: Int, F <: Function, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, number of models, dimension adaptor, and prefixes.
              • ConcatLatentModels(models::M, dimension_adaptor::F; prefixes::P = "Concat." * string.(1:length(models))) where {M <: AbstractVector{<:AbstractTuringLatentModel}, F <: Function}: Constructs a ConcatLatentModels instance with specified models and dimension adaptor. The number of models is automatically determined as are the prefixes (of the form Concat.1, Concat.2, etc.) by default.
• ConcatLatentModels(models::M; dimension_adaptor::Function, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, dimension adaptor, prefixes, and automatically determines the number of models. The default dimension adaptor is equal_dimensions. The default prefixes are of the form Concat.1, Concat.2, etc.
              • ConcatLatentModels(; models::M, dimension_adaptor::Function, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, dimension adaptor, prefixes, and automatically determines the number of models. The default dimension adaptor is equal_dimensions. The default prefixes are of the form Concat.1, Concat.2, etc.

              Examples

              using EpiAware, Distributions
               combined_model = ConcatLatentModels([Intercept(Normal(2, 0.2)), AR()])
               latent_model = generate_latent(combined_model, 10)
latent_model()

              Fields

              • models::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models

              • no_models::Int64: The number of models in the collection

• dimension_adaptor::Function: The dimension function for the latent variables. By default this divides the number of latent variables by the number of models and returns a vector of dimensions, rounding up the first element and rounding down the rest (see the sketch after this list).

              • prefixes::AbstractVector{<:String}: A vector of prefixes for the latent models
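
An illustrative split for n = 10 latent variables over 3 models, matching the default behaviour described in the dimension_adaptor field above (not the package's equal_dimensions code):

n, m = 10, 3
dims = vcat(cld(n, m), fill(fld(n, m), m - 1))   # [4, 3, 3]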

              source
              EpiAware.EpiLatentModels.DiffLatentModelType
              struct DiffLatentModel{M<:AbstractTuringLatentModel, P<:Distributions.Distribution} <: AbstractTuringLatentModel

              Model the latent process as a d-fold differenced version of another process.

              Mathematical specification

              Let $\Delta$ be the differencing operator. If $\tilde{Z}_t$ is a realisation of undifferenced latent model supplied to DiffLatentModel, then the differenced process is given by,

              \[\Delta^{(d)} Z_t = \tilde{Z}_t, \quad t = d+1, \ldots.\]

              We can recover $Z_t$ by applying the inverse differencing operator $\Delta^{-1}$, which corresponds to the cumulative sum operator cumsum in Julia, d-times. The d initial terms $Z_1, \ldots, Z_d$ are inferred.
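
An illustrative check of this relationship for d = 2 (values arbitrary): differencing twice and then undoing each difference with cumsum, re-attaching the initial terms, recovers the original series.

Z = [1.0, 2.5, 4.0, 6.0, 9.0]
dZ = diff(Z)                        # Δ Z
d2Z = diff(dZ)                      # Δ² Z
dZ_rec = cumsum(vcat(dZ[1], d2Z))   # undo one difference using the initial term dZ₁
Z_rec = cumsum(vcat(Z[1], dZ_rec))  # undo the second using the initial term Z₁
Z_rec == Z                          # true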

              Constructors

              • DiffLatentModel(latent_model, init_prior_distribution::Distribution; d::Int) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. All initial terms have common prior init_prior_distribution.
              • DiffLatentModel(;model, init_priors::Vector{D} where {D <: Distribution}) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. The d initial terms have priors given by the vector init_priors, therefore length(init_priors) sets d.

              Example usage with generate_latent

              generate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.

              First, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.

              using Distributions, EpiAware
              +latent_model()

              Fields

              • models::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models

              • no_models::Int64: The number of models in the collection

              • dimension_adaptor::Function: The dimension function for the latent variables. By default this divides the number of latent variables by the number of models and returns a vector of dimensions rounding up the first element and rounding down the rest.

              • prefixes::AbstractVector{<:String}: A vector of prefixes for the latent models

              source
              EpiAware.EpiLatentModels.DiffLatentModelType
              struct DiffLatentModel{M<:AbstractTuringLatentModel, P<:Distributions.Distribution} <: AbstractTuringLatentModel

              Model the latent process as a d-fold differenced version of another process.

              Mathematical specification

              Let $\Delta$ be the differencing operator. If $\tilde{Z}_t$ is a realisation of undifferenced latent model supplied to DiffLatentModel, then the differenced process is given by,

              \[\Delta^{(d)} Z_t = \tilde{Z}_t, \quad t = d+1, \ldots.\]

              We can recover $Z_t$ by applying the inverse differencing operator $\Delta^{-1}$, which corresponds to the cumulative sum operator cumsum in Julia, d-times. The d initial terms $Z_1, \ldots, Z_d$ are inferred.

              Constructors

              • DiffLatentModel(latent_model, init_prior_distribution::Distribution; d::Int) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. All initial terms have common prior init_prior_distribution.
• DiffLatentModel(;model, init_priors::Vector{D} where {D <: Distribution}) Constructs a DiffLatentModel for d-fold differencing with model as the undifferenced latent process. The d initial terms have priors given by the vector init_priors; therefore length(init_priors) sets d.

              Example usage with generate_latent

              generate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.

              First, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.

              using Distributions, EpiAware
               rw = RandomWalk(Normal(0.0, 1.0), truncated(Normal(0.0, 0.05), 0.0, Inf))

              Then, we can use DiffLatentModel to construct a DiffLatentModel for d-fold differenced process with rw as the undifferenced latent process.

              We have two constructor options for DiffLatentModel. The first option is to supply a common prior distribution for the initial terms and specify d as follows:

              diff_model = DiffLatentModel(rw, Normal(); d = 2)

              Or we can supply a vector of priors for the initial terms and d is inferred as follows:

diff_model2 = DiffLatentModel(;model = rw, init_priors = [Normal(), Normal()])

              Then, we can use generate_latent to construct a Turing model for the differenced latent process generating a length n process,

              # Construct a Turing model
               n = 100
               difference_mdl = generate_latent(diff_model, n)

              Now we can use the Turing PPL API to sample underlying parameters and generate the unobserved latent process.

              #Sample random parameters from prior
               θ = rand(difference_mdl)
               #Get a sampled latent process as a generated quantity from the model
               (Z_t, _) = generated_quantities(difference_mdl, θ)
              -Z_t

              Fields

              • model::AbstractTuringLatentModel: Underlying latent model for the differenced process

              • init_prior::Distributions.Distribution: The prior distribution for the initial latent variables.

              • d::Int64: Number of times differenced.

              source
              EpiAware.EpiLatentModels.FixedInterceptType
              struct FixedIntercept{F<:Real} <: AbstractTuringIntercept

              A variant of the Intercept struct that represents a fixed intercept value for a latent model.

              Constructors

              • FixedIntercept(intercept) : Constructs a FixedIntercept instance with the specified intercept value.
              • FixedIntercept(; intercept) : Constructs a FixedIntercept instance with the specified intercept value using named arguments.

              Examples

              using EpiAware
              +Z_t

              Fields

              • model::AbstractTuringLatentModel: Underlying latent model for the differenced process

              • init_prior::Distributions.Distribution: The prior distribution for the initial latent variables.

              • d::Int64: Number of times differenced.

              source
              EpiAware.EpiLatentModels.FixedInterceptType
              struct FixedIntercept{F<:Real} <: AbstractTuringIntercept

              A variant of the Intercept struct that represents a fixed intercept value for a latent model.

              Constructors

              • FixedIntercept(intercept) : Constructs a FixedIntercept instance with the specified intercept value.
              • FixedIntercept(; intercept) : Constructs a FixedIntercept instance with the specified intercept value using named arguments.

              Examples

              using EpiAware
               fi = FixedIntercept(2.0)
               fi_model = generate_latent(fi, 10)
              -fi_model()

              Fields

              • intercept::Real
              source
              EpiAware.EpiLatentModels.HierarchicalNormalType
              struct HierarchicalNormal{R<:Real, D<:Distributions.Sampleable} <: AbstractTuringLatentModel

              The HierarchicalNormal struct represents a non-centered hierarchical normal distribution.

              Constructors

              • HierarchicalNormal(mean, std_prior): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior.
              • HierarchicalNormal(; mean = 0.0, std_prior = truncated(Normal(0,1), 0, Inf)): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior using named arguments and with default values.

              Examples

              using Distributions, EpiAware
              +fi_model()

              Fields

              • intercept::Real
              source
              EpiAware.EpiLatentModels.HierarchicalNormalType
              struct HierarchicalNormal{R<:Real, D<:Distributions.Sampleable} <: AbstractTuringLatentModel

              The HierarchicalNormal struct represents a non-centered hierarchical normal distribution.

              Constructors

              • HierarchicalNormal(mean, std_prior): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior.
              • HierarchicalNormal(; mean = 0.0, std_prior = truncated(Normal(0,1), 0, Inf)): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior using named arguments and with default values.

              Examples

              using Distributions, EpiAware
               hnorm = HierarchicalNormal(0.0, truncated(Normal(0, 1), 0, Inf))
               hnorm_model = generate_latent(hnorm, 10)
              -hnorm_model()

              Fields

              • mean::Real

              • std_prior::Distributions.Sampleable

              source
              EpiAware.EpiLatentModels.InterceptType
              struct Intercept{D<:Distributions.Sampleable} <: AbstractTuringIntercept

              The Intercept struct is used to model the intercept of a latent process. It broadcasts a single intercept value to a length n latent process.

              Constructors

              • Intercept(intercept_prior)
              • Intercept(; intercept_prior)

              Examples

              using Distributions, Turing, EpiAware
              +hnorm_model()

              Fields

              • mean::Real

              • std_prior::Distributions.Sampleable

              source
              EpiAware.EpiLatentModels.InterceptType
              struct Intercept{D<:Distributions.Sampleable} <: AbstractTuringIntercept

              The Intercept struct is used to model the intercept of a latent process. It broadcasts a single intercept value to a length n latent process.

              Constructors

              • Intercept(intercept_prior)
              • Intercept(; intercept_prior)

              Examples

              using Distributions, Turing, EpiAware
               int = Intercept(Normal(0, 1))
               int_model = generate_latent(int, 10)
               rand(int_model)
              -int_model()

              Fields

              • intercept_prior::Distributions.Sampleable: Prior distribution for the intercept.
              source
              EpiAware.EpiLatentModels.PrefixLatentModelType
              struct PrefixLatentModel{M<:AbstractTuringLatentModel, P<:String} <: AbstractTuringLatentModel
              Generate a latent model with a prefix. A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.
              +int_model()

              Fields

              • intercept_prior::Distributions.Sampleable: Prior distribution for the intercept.
              source
              EpiAware.EpiLatentModels.PrefixLatentModelType
              struct PrefixLatentModel{M<:AbstractTuringLatentModel, P<:String} <: AbstractTuringLatentModel
              Generate a latent model with a prefix. A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.
               
               # Constructors
               - `PrefixLatentModel(model::M, prefix::P)`: Create a `PrefixLatentModel` with the latent model `model` and the prefix `prefix`.
              @@ -45,7 +45,7 @@
               latent_model = PrefixLatentModel(model = HierarchicalNormal(), prefix = "Test")
               mdl = generate_latent(latent_model, 10)
               rand(mdl)
              -```

              Fields

              • model::AbstractTuringLatentModel: The latent model

              • prefix::String: The prefix for the latent model

              source
              EpiAware.EpiLatentModels.RandomWalkType
              struct RandomWalk{D<:Distributions.Sampleable, S<:Distributions.Sampleable} <: AbstractTuringLatentModel

              Model latent process $Z_t$ as a random walk.

              Mathematical specification

The random walk $Z_t$ is specified as a parametric transformation of the white noise sequence $(\epsilon_t)_{t\geq 1}$,

\[Z_t = Z_0 + \sigma \sum_{i = 1}^t \epsilon_i\]

              Constructing a random walk requires specifying:

              • An init_prior as a prior for $Z_0$. Default is Normal().
              • A std_prior for $\sigma$. The default is HalfNormal with a mean of 0.25.

              Constructors

              • RandomWalk(; init_prior, std_prior)

              Example usage with generate_latent

              generate_latent can be used to construct a Turing model for the random walk $Z_t$.

              First, we construct a RandomWalk struct with priors,

              using Distributions, Turing, EpiAware
              +```

              Fields

              • model::AbstractTuringLatentModel: The latent model

              • prefix::String: The prefix for the latent model

              source
              EpiAware.EpiLatentModels.RandomWalkType
              struct RandomWalk{D<:Distributions.Sampleable, S<:Distributions.Sampleable} <: AbstractTuringLatentModel

              Model latent process $Z_t$ as a random walk.

              Mathematical specification

The random walk $Z_t$ is specified as a parametric transformation of the white noise sequence $(\epsilon_t)_{t\geq 1}$,

\[Z_t = Z_0 + \sigma \sum_{i = 1}^t \epsilon_i\]

              Constructing a random walk requires specifying:

              • An init_prior as a prior for $Z_0$. Default is Normal().
              • A std_prior for $\sigma$. The default is HalfNormal with a mean of 0.25.

              Constructors

              • RandomWalk(; init_prior, std_prior)

              Example usage with generate_latent

              generate_latent can be used to construct a Turing model for the random walk $Z_t$.

              First, we construct a RandomWalk struct with priors,

              using Distributions, Turing, EpiAware
               
               # Create a RandomWalk model
               rw = RandomWalk(init_prior = Normal(2., 1.),
              @@ -53,7 +53,7 @@
               rw_model = generate_latent(rw, 10)

Now we can use the Turing PPL API to sample underlying parameters and generate the unobserved random walk process.

              #Sample random parameters from prior
               θ = rand(rw_model)
#Get random walk sample path as a generated quantity from the model
              -Z_t, _ = generated_quantities(rw_model, θ)

              Fields

              • init_prior::Distributions.Sampleable

              • std_prior::Distributions.Sampleable

              source
              EpiAware.EpiLatentModels.RecordExpectedLatentType
              struct RecordExpectedLatent{M<:AbstractTuringLatentModel} <: AbstractTuringLatentModel

              Record a variable (using the Turing := syntax) in a latent model.

              # Fields
              +Z_t, _ = generated_quantities(rw_model, θ)

              Fields

              • init_prior::Distributions.Sampleable

              • std_prior::Distributions.Sampleable

              source
              EpiAware.EpiLatentModels.RecordExpectedLatentType
              struct RecordExpectedLatent{M<:AbstractTuringLatentModel} <: AbstractTuringLatentModel

              Record a variable (using the Turing := syntax) in a latent model.

              # Fields
               - `model::AbstractTuringLatentModel`: The latent model to dispatch to.
               
               # Constructors
              @@ -67,27 +67,27 @@
               mdl = RecordExpectedLatent(FixedIntercept(0.1))
               gen_latent = generate_latent(mdl, 1)
               sample(gen_latent, Prior(), 10)
              -```

              Fields

              • model::AbstractTuringLatentModel
              source
              EpiAware.EpiLatentModels.RepeatBlockType
              struct RepeatBlock <: AbstractBroadcastRule

              RepeatBlock is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.

              It repeats the latent process in blocks of size period. An example of this rule is to repeat the latent process in blocks of size 7 to model a weekly process (though for this we also provide the broadcast_weekly helper function).

              Examples

              using EpiAware
              +```

              Fields

              • model::AbstractTuringLatentModel
              source
              EpiAware.EpiLatentModels.RepeatBlockType
              struct RepeatBlock <: AbstractBroadcastRule

              RepeatBlock is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.

              It repeats the latent process in blocks of size period. An example of this rule is to repeat the latent process in blocks of size 7 to model a weekly process (though for this we also provide the broadcast_weekly helper function).

              Examples

              using EpiAware
               rule = RepeatBlock()
               latent = [1, 2, 3, 4, 5]
               n = 10
               period = 2
              -broadcast_rule(rule, latent, n, period)

              Fields

              source
              EpiAware.EpiLatentModels.RepeatEachType
              struct RepeatEach <: AbstractBroadcastRule

              RepeatEach is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.

It repeats the latent process at each period. An example of this rule is to repeat the latent process at each day of the week (though for this we also provide the broadcast_dayofweek helper function).

              Examples

              using EpiAware
              +broadcast_rule(rule, latent, n, period)

              Fields

              source
              EpiAware.EpiLatentModels.RepeatEachType
              struct RepeatEach <: AbstractBroadcastRule

              RepeatEach is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.

It repeats the latent process at each period. An example of this rule is to repeat the latent process at each day of the week (though for this we also provide the broadcast_dayofweek helper function).

              Examples

              using EpiAware
               rule = RepeatEach()
               latent = [1, 2]
               n = 10
               period = 2
              -broadcast_rule(rule, latent, n, period)

              Fields

              source
              EpiAware.EpiLatentModels.TransformLatentModelType
              struct TransformLatentModel{M<:AbstractTuringLatentModel, F<:Function} <: AbstractTuringLatentModel

              The TransformLatentModel struct represents a latent model that applies a transformation function to the latent variables generated by another latent model.

              Constructors

              • TransformLatentModel(model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function.
              • TransformLatentModel(; model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function using named arguments.

              Example

              using EpiAware, Distributions
              +broadcast_rule(rule, latent, n, period)

              Fields

              source
              EpiAware.EpiLatentModels.TransformLatentModelType
              struct TransformLatentModel{M<:AbstractTuringLatentModel, F<:Function} <: AbstractTuringLatentModel

              The TransformLatentModel struct represents a latent model that applies a transformation function to the latent variables generated by another latent model.

              Constructors

              • TransformLatentModel(model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function.
              • TransformLatentModel(; model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function using named arguments.

              Example

              using EpiAware, Distributions
               trans = TransformLatentModel(Intercept(Normal(2, 0.2)), x -> x .|> exp)
               trans_model = generate_latent(trans, 5)
              -trans_model()

              Fields

              • model::AbstractTuringLatentModel: The latent model to transform.

              • trans_function::Function: The transformation function.

              source
              EpiAware.EpiAwareBase.broadcast_ruleMethod
              broadcast_rule(_::RepeatBlock, latent, n, period) -> Any
              -

              broadcast_rule is a function that applies the RepeatBlock rule to the latent process latent to generate n samples.

              Arguments

              • rule::RepeatBlock: The broadcasting rule.
              • latent::Vector: The latent process.
              • n: The number of samples to generate.
              • period: The period of the broadcast.

              Returns

              • latent: The generated broadcasted latent periods.
              source
              EpiAware.EpiAwareBase.broadcast_ruleMethod
              broadcast_rule(_::RepeatEach, latent, n, period) -> Any
              -

              broadcast_rule is a function that applies the RepeatEach rule to the latent process latent to generate n samples.

              Arguments

              • rule::RepeatEach: The broadcasting rule.
              • latent::Vector: The latent process.
              • n: The number of samples to generate.
              • period: The period of the broadcast.

              Returns

              • latent: The generated broadcasted latent periods.
              source
              EpiAware.EpiAwareBase.broadcast_ruleMethod
              broadcast_rule(_::RepeatBlock, latent, n, period) -> Any
              +

              broadcast_rule is a function that applies the RepeatBlock rule to the latent process latent to generate n samples.

              Arguments

              • rule::RepeatBlock: The broadcasting rule.
              • latent::Vector: The latent process.
              • n: The number of samples to generate.
              • period: The period of the broadcast.

              Returns

              • latent: The generated broadcasted latent periods.
              source
              EpiAware.EpiAwareBase.broadcast_ruleMethod
              broadcast_rule(_::RepeatEach, latent, n, period) -> Any
              +

              broadcast_rule is a function that applies the RepeatEach rule to the latent process latent to generate n samples.

              Arguments

              • rule::RepeatEach: The broadcasting rule.
              • latent::Vector: The latent process.
              • n: The number of samples to generate.
              • period: The period of the broadcast.

              Returns

              • latent: The generated broadcasted latent periods.
              source
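As a rough sketch of the difference between the two rules (the expected outputs are indicative, inferred from the rule descriptions above rather than copied from the original docs):

```julia
using EpiAware

# RepeatEach cycles the length-`period` latent vector until `n` values are produced
broadcast_rule(RepeatEach(), [1, 2], 6, 2)       # expected: [1, 2, 1, 2, 1, 2]

# RepeatBlock holds each latent value constant for a block of length `period`
broadcast_rule(RepeatBlock(), [1, 2, 3], 6, 2)   # expected: [1, 1, 2, 2, 3, 3]
```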
              EpiAware.EpiLatentModels.broadcast_dayofweekMethod
              broadcast_dayofweek(
                   model::AbstractTuringLatentModel;
                   link
               ) -> BroadcastLatentModel{TransformLatentModel{M, EpiAware.EpiLatentModels.var"#42#44"}, Int64, RepeatEach} where M<:AbstractTuringLatentModel
              -

              Constructs a BroadcastLatentModel appropriate for modelling the day of the week for a given AbstractTuringLatentModel.

              Arguments

              • model::AbstractTuringLatentModel: The latent model to be repeated.
              • link::Function: The link function used to transform the latent model before broadcasting to periodic weekly. The default is x -> 7 * softmax(x), which implements the constraint that the week effects sum to 7.

              Returns

              • BroadcastLatentModel: The broadcast latent model.
              source
              EpiAware.EpiLatentModels.broadcast_weeklyMethod
              broadcast_weekly(
              +

              Constructs a BroadcastLatentModel appropriate for modelling the day of the week for a given AbstractTuringLatentModel.

              Arguments

              • model::AbstractTuringLatentModel: The latent model to be repeated.
              • link::Function: The link function used to transform the latent model before broadcasting to periodic weekly. The default is x -> 7 * softmax(x), which implements the constraint that the week effects sum to 7.

              Returns

              • BroadcastLatentModel: The broadcast latent model.
              source
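A minimal usage sketch with the default link (the choice of HierarchicalNormal here is illustrative, not from the original docstring):

```julia
using EpiAware

# Day-of-week effect: a HierarchicalNormal latent model transformed by x -> 7 * softmax(x)
# and repeated for each day of the week
dayofweek_effect = broadcast_dayofweek(HierarchicalNormal())
dow_mdl = generate_latent(dayofweek_effect, 14)
dow_mdl()
```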
              EpiAware.EpiLatentModels.broadcast_weeklyMethod
              broadcast_weekly(
                   model::AbstractTuringLatentModel
               ) -> BroadcastLatentModel{<:AbstractTuringLatentModel, Int64, RepeatBlock}
              -

              Constructs a BroadcastLatentModel appropriate for modelling piecewise constant weekly processes for a given AbstractTuringLatentModel.

              Arguments

              • model::AbstractTuringLatentModel: The latent model to be repeated.

              Returns

              • BroadcastLatentModel: The broadcast latent model.
              source
              EpiAware.EpiLatentModels.equal_dimensionsMethod
              equal_dimensions(n::Int64, m::Int64) -> Vector{Int64}
              -

              Return a vector of dimensions that are equal or as close as possible, given the total number of elements n and the number of dimensions m. The default dimension adaptor for ConcatLatentModels.

              Arguments

              • n::Int: The total number of elements.
              • m::Int: The number of dimensions.

              Returns

              • dims::AbstractVector{Int}: A vector of dimensions, where the first element is the ceiling of n / m and the remaining elements are the floor of n / m.
              source
              +

              Constructs a BroadcastLatentModel appropriate for modelling piecewise constant weekly processes for a given AbstractTuringLatentModel.

              Arguments

              • model::AbstractTuringLatentModel: The latent model to be repeated.

              Returns

              • BroadcastLatentModel: The broadcast latent model.
              source
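A minimal usage sketch (the choice of RandomWalk here is illustrative):

```julia
using EpiAware

# Piecewise-constant weekly process: each random walk value is held for a block of 7 days
weekly_rw = broadcast_weekly(RandomWalk())
weekly_mdl = generate_latent(weekly_rw, 21)
weekly_mdl()
```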
              EpiAware.EpiLatentModels.equal_dimensionsMethod
              equal_dimensions(n::Int64, m::Int64) -> Vector{Int64}
              +

              Return a vector of dimensions that are equal or as close as possible, given the total number of elements n and the number of dimensions m. The default dimension adaptor for ConcatLatentModels.

              Arguments

              • n::Int: The total number of elements.
              • m::Int: The number of dimensions.

              Returns

              • dims::AbstractVector{Int}: A vector of dimensions, where the first element is the ceiling of n / m and the remaining elements are the floor of n / m.
              source
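For example (a sketch; the function is accessed via the EpiLatentModels submodule):

```julia
using EpiAware

# Split 10 latent variables across 3 models: ceil(10 / 3) for the first, floor(10 / 3) for the rest
EpiAware.EpiLatentModels.equal_dimensions(10, 3)  # expected: [4, 3, 3]
```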
              diff --git a/previews/PR510/lib/EpiObsModels/index.html b/previews/PR510/lib/EpiObsModels/index.html index e4a8fb086..8456d078f 100644 --- a/previews/PR510/lib/EpiObsModels/index.html +++ b/previews/PR510/lib/EpiObsModels/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl
              +Overview · EpiAware.jl
              diff --git a/previews/PR510/lib/EpiObsModels/internals/index.html b/previews/PR510/lib/EpiObsModels/internals/index.html index c4ba93729..da5aa6582 100644 --- a/previews/PR510/lib/EpiObsModels/internals/index.html +++ b/previews/PR510/lib/EpiObsModels/internals/index.html @@ -1,38 +1,43 @@ -Internal API · EpiAware.jl

              Internal Documentation

              Documentation for EpiObsModels.jl's internal interface.

              Contents

              Index

              Internal API

              EpiAware.EpiObsModels.LDStepType
              struct LDStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep

              The LatentDelay step function struct


              Fields

              • rev_pmf::AbstractVector{<:Real}
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
              +Internal API · EpiAware.jl

              Internal Documentation

              Documentation for EpiObsModels.jl's internal interface.

              Contents

              Index

              Internal API

              EpiAware.EpiObsModels.LDStepType
              struct LDStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep

              The LatentDelay step function struct


              Fields

              • rev_pmf::AbstractVector{<:Real}
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
                   obs_model::AbstractTuringObservationErrorModel,
                   y_t,
                   Y_t
               ) -> Any
              -

Generates observations from an observation error model. It supports missing values in the observations (y_t) and expected observations (Y_t) that are shorter than the observations; in that case it assumes the expected observations correspond to the last length(Y_t) elements of y_t. It also pads the expected observations with a small value (1e-6) to mitigate potential numerical issues.

              It dispatches to the observation_error function to generate the observation error distribution which uses priors generated by generate_observation_error_priors submodel. For most observation error models specific implementations of observation_error and generate_observation_error_priors are required but a specific implementation of generate_observations is not required.

              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
              +

Generates observations from an observation error model. It supports missing values in the observations (y_t) and expected observations (Y_t) that are shorter than the observations; in that case it assumes the expected observations correspond to the last length(Y_t) elements of y_t. It also pads the expected observations with a small value (1e-6) to mitigate potential numerical issues.

              It dispatches to the observation_error function to generate the observation error distribution which uses priors generated by generate_observation_error_priors submodel. For most observation error models specific implementations of observation_error and generate_observation_error_priors are required but a specific implementation of generate_observations is not required.

              source
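A minimal sketch of calling this fallback through a concrete error model (PoissonError, documented on the public API page); the padding and trimming described above happen internally:

```julia
using EpiAware

# Missing observations y_t with a length-5 vector of expected observations Y_t
obs_mdl = generate_observations(PoissonError(), missing, fill(10.0, 5))
obs_mdl()  # draws y_t from the observation error distribution
```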
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
                   obs_model::Ascertainment,
                   y_t,
                   Y_t
               ) -> Any
              -

Generates observations based on the Ascertainment observation model.

              Arguments

              • obs_model::Ascertainment: The Ascertainment model.
              • y_t: The current state of the observations.
              • Y_t: The expected observations.

              Returns

              • y_t: The updated observations.
              • expected_aux: Additional expected observation-related variables.
              • obs_aux: Additional observation-related variables.
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
              +

Generates observations based on the Ascertainment observation model.

              Arguments

              • obs_model::Ascertainment: The Ascertainment model.
              • y_t: The current state of the observations.
              • Y_t: The expected observations.

              Returns

              • y_t: The updated observations.
              • expected_aux: Additional expected observation-related variables.
              • obs_aux: Additional observation-related variables.
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
                   obs_model::LatentDelay,
                   y_t,
                   Y_t
               ) -> Any
              -

              Generates observations based on the LatentDelay observation model.

              Arguments

              • obs_model::LatentDelay: The LatentDelay observation model.
              • y_t: The current observations.
              • Y_t: The expected observations.

              Returns

              • y_t: The updated observations.
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
              +

              Generates observations based on the LatentDelay observation model.

              Arguments

              • obs_model::LatentDelay: The LatentDelay observation model.
              • y_t: The current observations.
              • Y_t: The expected observations.

              Returns

              • y_t: The updated observations.
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
                   obs_model::StackObservationModels,
                   y_t::NamedTuple,
                   Y_t::AbstractVector
               ) -> Any
              -

Generate observations from a stack of observation models. Maps Y_t to a NamedTuple of the same length as y_t, assuming a one-to-many mapping.

              Arguments

              • obs_model::StackObservationModels: The stack of observation models.
              • y_t::NamedTuple: The observed values.
              • Y_t::AbstractVector: The expected values.
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
              +

Generate observations from a stack of observation models. Maps Y_t to a NamedTuple of the same length as y_t, assuming a one-to-many mapping.

              Arguments

              • obs_model::StackObservationModels: The stack of observation models.
              • y_t::NamedTuple: The observed values.
              • Y_t::AbstractVector: The expected values.
              source
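A small sketch of the one-to-many case, reusing the stacked cases/deaths example from the StackObservationModels docstring (the observed values below are hypothetical):

```julia
using EpiAware

obs = StackObservationModels((cases = PoissonError(), deaths = NegativeBinomialError()))
y_t = (cases = fill(10, 10), deaths = fill(2, 10))          # hypothetical observations
gen_obs = generate_observations(obs, y_t, fill(100.0, 10))  # one shared expected series Y_t
gen_obs()
```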
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
                   obs_model::StackObservationModels,
                   y_t::NamedTuple,
                   Y_t::NamedTuple
               ) -> Any
              -

Generate observations from a stack of observation models. Assumes a one-to-one mapping between y_t and Y_t.

              Arguments

              • obs_model::StackObservationModels: The stack of observation models.
              • y_t::NamedTuple: The observed values.
              • Y_t::NamedTuple: The expected values.
              source
              EpiAware.EpiAwareUtils.get_stateMethod
              get_state(
              +

Generate observations from a stack of observation models. Assumes a one-to-one mapping between y_t and Y_t.

              Arguments

              • obs_model::StackObservationModels: The stack of observation models.
              • y_t::NamedTuple: The observed values.
              • Y_t::NamedTuple: The expected values.
              source
              EpiAware.EpiAwareBase.generate_observationsMethod
              generate_observations(
              +    obs::TransformObservationModel,
              +    y_t,
              +    Y_t
              +) -> Any
              +

              Generates observations or accumulates log-likelihood based on the TransformObservationModel.

              Arguments

              • obs::TransformObservationModel: The TransformObservationModel.
              • y_t: The current state of the observations.
              • Y_t: The expected observations.

              Returns

              • y_t: The updated observations.
              source
              EpiAware.EpiObsModels.NegativeBinomialMeanClustMethod
              NegativeBinomialMeanClust(μ, α) -> SafeNegativeBinomial
              -

              Compute the mean-cluster factor negative binomial distribution.

              Arguments

              • μ: The mean of the distribution.
              • α: The clustering factor parameter.

              Returns

              A NegativeBinomial distribution object.

              source
              EpiAware.EpiObsModels.NegativeBinomialMeanClustMethod
              NegativeBinomialMeanClust(μ, α) -> SafeNegativeBinomial
              +

              Compute the mean-cluster factor negative binomial distribution.

              Arguments

              • μ: The mean of the distribution.
              • α: The clustering factor parameter.

              Returns

              A NegativeBinomial distribution object.

              source
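A small sketch (the qualified name is used since this is an internal helper):

```julia
using EpiAware

# Mean 10 with cluster factor 0.05; returns a SafeNegativeBinomial distribution
d = EpiAware.EpiObsModels.NegativeBinomialMeanClust(10.0, 0.05)
rand(d)
```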
              EpiAware.EpiObsModels.generate_observation_kernelMethod
              generate_observation_kernel(
                   delay_int,
                   time_horizon;
                   partial
               ) -> Any
              -

              Generate an observation kernel matrix based on the given delay interval and time horizon.

              Arguments

              • delay_int::Vector{Float64}: The delay PMF vector.
              • time_horizon::Int: The number of time steps of the observation period.
              • partial::Bool: Whether to generate a partial observation kernel matrix.

              Returns

              • K::SparseMatrixCSC{Float64, Int}: The observation kernel matrix.
              source
              +

              Generate an observation kernel matrix based on the given delay interval and time horizon.

              Arguments

              • delay_int::Vector{Float64}: The delay PMF vector.
              • time_horizon::Int: The number of time steps of the observation period.
              • partial::Bool: Whether to generate a partial observation kernel matrix.

              Returns

              • K::SparseMatrixCSC{Float64, Int}: The observation kernel matrix.
              source
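A small sketch with a hypothetical three-day delay PMF (again using the qualified name since this is an internal helper):

```julia
using EpiAware

delay_pmf = [0.2, 0.5, 0.3]  # hypothetical delay PMF
K = EpiAware.EpiObsModels.generate_observation_kernel(delay_pmf, 5; partial = false)
```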
              diff --git a/previews/PR510/lib/EpiObsModels/public/index.html b/previews/PR510/lib/EpiObsModels/public/index.html index 0092c2bf7..51cb4f7fe 100644 --- a/previews/PR510/lib/EpiObsModels/public/index.html +++ b/previews/PR510/lib/EpiObsModels/public/index.html @@ -1,20 +1,20 @@ -Public API · EpiAware.jl

              Public Documentation

              Documentation for EpiObsModels.jl's public interface.

              See the Internals section of the manual for internal package docs covering all submodules.

              Contents

              Index

              Public API

              EpiAware.EpiObsModels.AggregateType
              struct Aggregate{M<:AbstractTuringObservationModel, I<:(AbstractVector{<:Int64}), J<:(AbstractVector{<:Bool})} <: AbstractTuringObservationModel

              Aggregates observations over a specified time period. For efficiency it also only passes the aggregated observations to the submodel. The aggregation vector is internally broadcasted to the length of the observations and the present vector is broadcasted to the length of the aggregation vector using broadcast_n.

              Fields

              • model::AbstractTuringObservationModel: The submodel to use for the aggregated observations.
              • aggregation::AbstractVector{<: Int}: The number of time periods to aggregate over.
              • present::AbstractVector{<: Bool}: A vector of booleans indicating whether the observation is present or not.

              Constructors

              • Aggregate(model, aggregation): Constructs an Aggregate object and automatically sets the present field.
              • Aggregate(; model, aggregation): Constructs an Aggregate object and automatically sets the present field using named keyword arguments

              Examples

              using EpiAware
              +Public API · EpiAware.jl

              Public Documentation

              Documentation for EpiObsModels.jl's public interface.

              See the Internals section of the manual for internal package docs covering all submodules.

              Contents

              Index

              Public API

              EpiAware.EpiObsModels.AggregateType
              struct Aggregate{M<:AbstractTuringObservationModel, I<:(AbstractVector{<:Int64}), J<:(AbstractVector{<:Bool})} <: AbstractTuringObservationModel

              Aggregates observations over a specified time period. For efficiency it also only passes the aggregated observations to the submodel. The aggregation vector is internally broadcasted to the length of the observations and the present vector is broadcasted to the length of the aggregation vector using broadcast_n.

              Fields

              • model::AbstractTuringObservationModel: The submodel to use for the aggregated observations.
              • aggregation::AbstractVector{<: Int}: The number of time periods to aggregate over.
              • present::AbstractVector{<: Bool}: A vector of booleans indicating whether the observation is present or not.

              Constructors

              • Aggregate(model, aggregation): Constructs an Aggregate object and automatically sets the present field.
              • Aggregate(; model, aggregation): Constructs an Aggregate object and automatically sets the present field using named keyword arguments

              Examples

              using EpiAware
               weekly_agg = Aggregate(PoissonError(), [0, 0, 0, 0, 7, 0, 0])
               gen_obs = generate_observations(weekly_agg, missing, fill(1, 28))
              -gen_obs()

              Fields

              • model::AbstractTuringObservationModel

              • aggregation::AbstractVector{<:Int64}

              • present::AbstractVector{<:Bool}

              source
              EpiAware.EpiObsModels.AscertainmentType
              struct Ascertainment{M<:AbstractTuringObservationModel, T<:AbstractTuringLatentModel, F<:Function, P<:String} <: AbstractTuringObservationModel

The Ascertainment struct represents an observation model that incorporates an ascertainment model. If a latent_prefix is supplied, the latent_model is wrapped in a call to PrefixLatentModel.

              Constructors

              • Ascertainment(model::M, latent_model::T, transform::F, latent_prefix::P) where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, transform function, and latent prefix.
              • Ascertainment(; model::M, latent_model::T, transform::F = (Y_t, x) -> xexpy.(Y_t, x), latent_prefix::P = "Ascertainment") where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, optional transform function (default: (Y_t, x) -> xexpy.(Y_t, x)), and optional latent prefix (default: "Ascertainment").

              Examples

              using EpiAware, Turing
              +gen_obs()

              Fields

              • model::AbstractTuringObservationModel

              • aggregation::AbstractVector{<:Int64}

              • present::AbstractVector{<:Bool}

              source
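As a sketch of what the constructor does with the aggregation vector, the present field presumably flags the non-zero aggregation positions (an assumption, not stated in the original docstring):

```julia
using EpiAware

weekly_agg = Aggregate(PoissonError(), [0, 0, 0, 0, 7, 0, 0])
weekly_agg.present  # expected (assumption): [false, false, false, false, true, false, false]
```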
              EpiAware.EpiObsModels.AscertainmentType
              struct Ascertainment{M<:AbstractTuringObservationModel, T<:AbstractTuringLatentModel, F<:Function, P<:String} <: AbstractTuringObservationModel

The Ascertainment struct represents an observation model that incorporates an ascertainment model. If a latent_prefix is supplied, the latent_model is wrapped in a call to PrefixLatentModel.

              Constructors

              • Ascertainment(model::M, latent_model::T, transform::F, latent_prefix::P) where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, transform function, and latent prefix.
              • Ascertainment(; model::M, latent_model::T, transform::F = (Y_t, x) -> xexpy.(Y_t, x), latent_prefix::P = "Ascertainment") where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, optional transform function (default: (Y_t, x) -> xexpy.(Y_t, x)), and optional latent prefix (default: "Ascertainment").

              Examples

              using EpiAware, Turing
               obs = Ascertainment(model = NegativeBinomialError(), latent_model = FixedIntercept(0.1))
               gen_obs = generate_observations(obs, missing, fill(100, 10))
              -rand(gen_obs)

              Fields

              • model::AbstractTuringObservationModel: The underlying observation model.

              • latent_model::AbstractTuringLatentModel: The latent model.

              • transform::Function: The function used to transform Y_t and the latent model output.

              • latent_prefix::String

              source
              EpiAware.EpiObsModels.LatentDelayType
              struct LatentDelay{M<:AbstractTuringObservationModel, T<:(AbstractVector{<:Real})} <: AbstractTuringObservationModel

              The LatentDelay struct represents an observation model that introduces a latent delay in the observations. It is a subtype of AbstractTuringObservationModel.

              Note that the LatentDelay observation model shortens the expected observation vector by the length of the delay distribution and this is then passed to the underlying observation model. This is to prevent fitting to partially observed data.

              Fields

              • model::M: The underlying observation model.
              • rev_pmf::T: The probability mass function (PMF) representing the delay distribution reversed.

              Constructors

              • LatentDelay(model::M, distribution::C; D = nothing, Δd = 1.0) where {M <: AbstractTuringObservationModel, C <: ContinuousDistribution}: Constructs a LatentDelay object with the given underlying observation model and continuous distribution. The D parameter specifies the right truncation of the distribution; the default D = nothing indicates that the distribution should be truncated at its 99th percentile, rounded to the nearest multiple of Δd. The Δd parameter specifies the width of each delay interval.

              • LatentDelay(model::M, pmf::T) where {M <: AbstractTuringObservationModel, T <: AbstractVector{<:Real}}: Constructs a LatentDelay object with the given underlying observation model and delay PMF.

              Examples

              using Distributions, Turing, EpiAware
              +rand(gen_obs)

              Fields

              • model::AbstractTuringObservationModel: The underlying observation model.

              • latent_model::AbstractTuringLatentModel: The latent model.

              • transform::Function: The function used to transform Y_t and the latent model output.

              • latent_prefix::String

              source
              EpiAware.EpiObsModels.LatentDelayType
              struct LatentDelay{M<:AbstractTuringObservationModel, T<:(AbstractVector{<:Real})} <: AbstractTuringObservationModel

              The LatentDelay struct represents an observation model that introduces a latent delay in the observations. It is a subtype of AbstractTuringObservationModel.

              Note that the LatentDelay observation model shortens the expected observation vector by the length of the delay distribution and this is then passed to the underlying observation model. This is to prevent fitting to partially observed data.

              Fields

              • model::M: The underlying observation model.
              • rev_pmf::T: The probability mass function (PMF) representing the delay distribution reversed.

              Constructors

              • LatentDelay(model::M, distribution::C; D = nothing, Δd = 1.0) where {M <: AbstractTuringObservationModel, C <: ContinuousDistribution}: Constructs a LatentDelay object with the given underlying observation model and continuous distribution. The D parameter specifies the right truncation of the distribution; the default D = nothing indicates that the distribution should be truncated at its 99th percentile, rounded to the nearest multiple of Δd. The Δd parameter specifies the width of each delay interval.

              • LatentDelay(model::M, pmf::T) where {M <: AbstractTuringObservationModel, T <: AbstractVector{<:Real}}: Constructs a LatentDelay object with the given underlying observation model and delay PMF.

              Examples

              using Distributions, Turing, EpiAware
               obs = LatentDelay(NegativeBinomialError(), truncated(Normal(5.0, 2.0), 0.0, Inf))
               obs_model = generate_observations(obs, missing, fill(10, 30))
              -obs_model()

              Fields

              • model::AbstractTuringObservationModel

              • rev_pmf::AbstractVector{<:Real}

              source
              EpiAware.EpiObsModels.NegativeBinomialErrorType
              struct NegativeBinomialError{S<:Distributions.Sampleable} <: AbstractTuringObservationErrorModel

              The NegativeBinomialError struct represents an observation model for negative binomial errors. It is a subtype of AbstractTuringObservationModel.

              Constructors

              • NegativeBinomialError(; cluster_factor_prior::Distribution = HalfNormal(0.1)): Constructs a NegativeBinomialError object with default values for the cluster factor prior.
              • NegativeBinomialError(cluster_factor_prior::Distribution): Constructs a NegativeBinomialError object with a specified cluster factor prior.

              Examples

              using Distributions, Turing, EpiAware
              +obs_model()

              Fields

              • model::AbstractTuringObservationModel

              • rev_pmf::AbstractVector{<:Real}

              source
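The second (PMF) constructor can be sketched with a hypothetical delay PMF supplied directly:

```julia
using EpiAware

delay_pmf = [0.1, 0.4, 0.3, 0.2]  # hypothetical daily delay PMF (sums to 1)
obs = LatentDelay(PoissonError(), delay_pmf)
obs_model = generate_observations(obs, missing, fill(10.0, 30))
obs_model()
```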
              EpiAware.EpiObsModels.NegativeBinomialErrorType
              struct NegativeBinomialError{S<:Distributions.Sampleable} <: AbstractTuringObservationErrorModel

              The NegativeBinomialError struct represents an observation model for negative binomial errors. It is a subtype of AbstractTuringObservationModel.

              Constructors

              • NegativeBinomialError(; cluster_factor_prior::Distribution = HalfNormal(0.1)): Constructs a NegativeBinomialError object with default values for the cluster factor prior.
              • NegativeBinomialError(cluster_factor_prior::Distribution): Constructs a NegativeBinomialError object with a specified cluster factor prior.

              Examples

              using Distributions, Turing, EpiAware
               nb = NegativeBinomialError()
               nb_model = generate_observations(nb, missing, fill(10, 10))
              -rand(nb_model)

              Fields

              • cluster_factor_prior::Distributions.Sampleable: The prior distribution for the cluster factor.
              source
              EpiAware.EpiObsModels.PoissonErrorType
              struct PoissonError <: AbstractTuringObservationErrorModel

              The PoissonError struct represents an observation model for Poisson errors. It is a subtype of AbstractTuringObservationErrorModel.

              Constructors

              • PoissonError(): Constructs a PoissonError object.

              Examples

              using Distributions, Turing, EpiAware
              +rand(nb_model)

              Fields

              • cluster_factor_prior::Distributions.Sampleable: The prior distribution for the cluster factor.
              source
              EpiAware.EpiObsModels.PoissonErrorType
              struct PoissonError <: AbstractTuringObservationErrorModel

              The PoissonError struct represents an observation model for Poisson errors. It is a subtype of AbstractTuringObservationErrorModel.

              Constructors

              • PoissonError(): Constructs a PoissonError object.

              Examples

              using Distributions, Turing, EpiAware
               poi = PoissonError()
               poi_model = generate_observations(poi, missing, fill(10, 10))
              -rand(poi_model)

              Fields

              source
              EpiAware.EpiObsModels.PrefixObservationModelType
              struct PrefixObservationModel{M<:AbstractTuringObservationModel, P<:String} <: AbstractTuringObservationModel
              Generate an observation model with a prefix. A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.
              +rand(poi_model)

              Fields

              source
              EpiAware.EpiObsModels.PrefixObservationModelType
              struct PrefixObservationModel{M<:AbstractTuringObservationModel, P<:String} <: AbstractTuringObservationModel
              Generate an observation model with a prefix. A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.
               
               # Constructors
               - `PrefixObservationModel(model::M, prefix::P)`: Create a `PrefixObservationModel` with the observation model `model` and the prefix `prefix`.
              @@ -26,7 +26,7 @@
 observation_model = PrefixObservationModel(PoissonError(), "Test")
 obs = generate_observations(observation_model, missing, fill(10, 10))
               rand(obs)
              -```

              Fields

              • model::AbstractTuringObservationModel: The observation model

              • prefix::String: The prefix for the observation model

              source
              EpiAware.EpiObsModels.RecordExpectedObsType
              struct RecordExpectedObs{M<:AbstractTuringObservationModel} <: AbstractTuringObservationModel

              Record a variable (using the Turing := syntax) in the observation model.

              # Fields
              +```

              Fields

              • model::AbstractTuringObservationModel: The observation model

              • prefix::String: The prefix for the observation model

              source
              EpiAware.EpiObsModels.RecordExpectedObsType
              struct RecordExpectedObs{M<:AbstractTuringObservationModel} <: AbstractTuringObservationModel

              Record a variable (using the Turing := syntax) in the observation model.

              # Fields
               - `model::AbstractTuringObservationModel`: The observation model to dispatch to.
               
               # Constructors
              @@ -40,7 +40,7 @@
               mdl = RecordExpectedObs(NegativeBinomialError())
               gen_obs = generate_observations(mdl, missing, fill(100, 10))
               sample(gen_obs, Prior(), 10)
              -```

              Fields

              • model::AbstractTuringObservationModel
              source
              EpiAware.EpiObsModels.StackObservationModelsType
              struct StackObservationModels{M<:(AbstractVector{<:AbstractTuringObservationModel}), N<:(AbstractVector{<:AbstractString})} <: AbstractTuringObservationModel

A stack of observation models that are looped over to generate observations for each model in the stack. Note that the model names are used to prefix the parameters in each model (so for a model named cases with a parameter y_t, the parameter in the model will be cases.y_t). Inside the constructor PrefixObservationModel is wrapped around each observation model.

              Constructors

              • StackObservationModels(models::Vector{<:AbstractTuringObservationModel}, model_names::Vector{<:AbstractString}): Construct a StackObservationModels object with a vector of observation models and a vector of model names.
                • `StackObservationModels(; models::Vector{<:AbstractTuringObservationModel},
                model_names::Vector{<:AbstractString}): Construct aStackObservationModels` object with a vector of observation models and a vector of model names.
              • StackObservationModels(models::NamedTuple{names, T}): Construct a StackObservationModels object with a named tuple of observation models. The model names are automatically generated from the keys of the named tuple.

              Example

              using EpiAware, Turing
              +```

              Fields

              • model::AbstractTuringObservationModel
              source
              EpiAware.EpiObsModels.StackObservationModelsType
              struct StackObservationModels{M<:(AbstractVector{<:AbstractTuringObservationModel}), N<:(AbstractVector{<:AbstractString})} <: AbstractTuringObservationModel

A stack of observation models that are looped over to generate observations for each model in the stack. Note that the model names are used to prefix the parameters in each model (so for a model named cases with a parameter y_t, the parameter in the model will be cases.y_t). Inside the constructor PrefixObservationModel is wrapped around each observation model.

              Constructors

              • StackObservationModels(models::Vector{<:AbstractTuringObservationModel}, model_names::Vector{<:AbstractString}): Construct a StackObservationModels object with a vector of observation models and a vector of model names.
                • `StackObservationModels(; models::Vector{<:AbstractTuringObservationModel},
                model_names::Vector{<:AbstractString}): Construct aStackObservationModels` object with a vector of observation models and a vector of model names.
              • StackObservationModels(models::NamedTuple{names, T}): Construct a StackObservationModels object with a named tuple of observation models. The model names are automatically generated from the keys of the named tuple.

              Example

              using EpiAware, Turing
               
               obs = StackObservationModels(
                   (cases = PoissonError(), deaths = NegativeBinomialError())
              @@ -54,7 +54,11 @@
               cases_y_t
               
               deaths_y_t = group(samples, "deaths.y_t")
              -deaths_y_t

              Fields

              • models::AbstractVector{<:AbstractTuringObservationModel}: A vector of observation models.

              • model_names::AbstractVector{<:AbstractString}: A vector of observation model names

              source
              EpiAware.EpiObsModels.ascertainment_dayofweekMethod
              ascertainment_dayofweek(
              +deaths_y_t

              Fields

              • models::AbstractVector{<:AbstractTuringObservationModel}: A vector of observation models.

              • model_names::AbstractVector{<:AbstractString}: A vector of observation model names

              source
              EpiAware.EpiObsModels.TransformObservationModelType
              struct TransformObservationModel{M<:AbstractTuringObservationModel, F<:Function} <: AbstractTuringObservationModel

              The TransformObservationModel struct represents an observation model that applies a transformation function to the expected observations before passing them to the underlying observation model.

              Fields

              • model::M: The underlying observation model.
              • transform::F: The transformation function applied to the expected observations.

              Constructors

              • TransformObservationModel(model::M, transform::F = x -> log1pexp.(x)) where {M <: AbstractTuringObservationModel, F <: Function}: Constructs a TransformObservationModel instance with the specified observation model and a default transformation function.
              • TransformObservationModel(; model::M, transform::F = x -> log1pexp.(x)) where {M <: AbstractTuringObservationModel, F <: Function}: Constructs a TransformObservationModel instance using named arguments.
              • TransformObservationModel(model::M; transform::F = x -> log1pexp.(x)) where {M <: AbstractTuringObservationModel, F <: Function}: Constructs a TransformObservationModel instance with the specified observation model and a default transformation function.

              Example

              using EpiAware, Distributions, LogExpFunctions
              +
              +trans_obs = TransformObservationModel(NegativeBinomialError())
              +gen_obs = generate_observations(trans_obs, missing, fill(10.0, 30))
              +gen_obs()

              Fields

              • model::AbstractTuringObservationModel: The underlying observation model.

              • transform::Function: The transformation function. The default is log1pexp, which is the softplus transformation.

              source
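              A hedged sketch of supplying a custom transform via the keyword constructor listed above; here exp is used so the expected observations are interpreted on the log scale (this variant is illustrative and not taken from the package docstrings):

              using EpiAware

              # Exponentiate the expected observations before passing them to the error model.
              trans_obs = TransformObservationModel(NegativeBinomialError(); transform = x -> exp.(x))
              gen_obs = generate_observations(trans_obs, missing, fill(2.0, 30))
              gen_obs()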
              EpiAware.EpiObsModels.ascertainment_dayofweekMethod
              ascertainment_dayofweek(
                   model::AbstractTuringObservationModel;
                   latent_model,
                   transform,
              @@ -64,27 +68,27 @@
               obs = ascertainment_dayofweek(PoissonError())
               gen_obs = generate_observations(obs, missing, fill(100, 14))
               gen_obs()
              -rand(gen_obs)
              source
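              A hedged sketch of overriding the keyword arguments listed in the signature above; the defaults noted in the comments are assumptions based on the docstring text elsewhere on this page:

              using EpiAware

              # Day-of-week ascertainment with an explicitly supplied latent model and prefix.
              obs = ascertainment_dayofweek(NegativeBinomialError();
                  latent_model = HierarchicalNormal(),   # assumed default latent model
                  latent_prefix = "DayofWeek")           # assumed default prefix
              gen_obs = generate_observations(obs, missing, fill(100, 14))
              gen_obs()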
              EpiAware.EpiObsModels.generate_observation_error_priorsMethod
              generate_observation_error_priors(
                   obs_model::AbstractTuringObservationErrorModel,
                   y_t,
                   Y_t
               ) -> Any
              -

              Generates priors for the observation error model. This should return a named tuple containing the priors required for generating the observation error distribution.

              source
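              A hypothetical sketch of how a custom error model might implement this interface together with observation_error (documented below). The NormalError struct, the obs_std parameter, and the assumption that the priors function is a Turing submodel whose named tuple is passed on as trailing positional arguments are all illustrative assumptions, not taken from the package docstrings:

              using EpiAware, Turing, Distributions
              import EpiAware.EpiObsModels: generate_observation_error_priors, observation_error

              # Hypothetical error model with a normally distributed observation error.
              # AbstractTuringObservationErrorModel is assumed to be exported by EpiAware.
              struct NormalError{D <: Distribution} <: AbstractTuringObservationErrorModel
                  std_prior::D
              end

              # Priors submodel: returns a named tuple of sampled priors (assumed pattern).
              @model function generate_observation_error_priors(obs_model::NormalError, y_t, Y_t)
                  obs_std ~ obs_model.std_prior
                  return (; obs_std)
              end

              # Error distribution given the expected observation and the sampled priors
              # (assumed to be passed as trailing positional arguments).
              function observation_error(obs_model::NormalError, Y_t, obs_std)
                  return Normal(Y_t, obs_std)
              end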
              EpiAware.EpiObsModels.generate_observation_error_priorsMethod
              generate_observation_error_priors(
                   obs_model::NegativeBinomialError,
                   Y_t,
                   y_t
               ) -> Any
              -

              Generates observation error priors based on the NegativeBinomialError observation model. This function generates the cluster factor prior for the negative binomial error model.

              source
              EpiAware.EpiObsModels.observation_errorMethod
              observation_error(
              +

              Generates observation error priors based on the NegativeBinomialError observation model. This function generates the cluster factor prior for the negative binomial error model.

              source
              EpiAware.EpiObsModels.observation_errorMethod
              observation_error(
                   obs_model::AbstractTuringObservationErrorModel,
                   Y_t
               ) -> SafePoisson
              -

              The observation error distribution for the observation error model. This function should return the distribution for the observation error given the expected observation value Y_t and the priors generated by generate_observation_error_priors.

              source
              EpiAware.EpiObsModels.observation_errorMethod
              observation_error(
              +

              The observation error distribution for the observation error model. This function should return the distribution for the observation error given the expected observation value Y_t and the priors generated by generate_observation_error_priors.

              source
              EpiAware.EpiObsModels.observation_errorMethod
              observation_error(
                   obs_model::NegativeBinomialError,
                   Y_t,
                   sq_cluster_factor
               ) -> SafeNegativeBinomial
              -

              This function generates the observation error model based on the negative binomial error model with a positive shift. It dispatches to the NegativeBinomialMeanClust distribution.

              source
              EpiAware.EpiObsModels.observation_errorMethod
              observation_error(
              +

              This function generates the observation error model based on the negative binomial error model with a positive shift. It dispatches to the NegativeBinomialMeanClust distribution.

              source
              EpiAware.EpiObsModels.observation_errorMethod
              observation_error(
                   obs_model::PoissonError,
                   Y_t
               ) -> SafePoisson
              -

              The observation error model for Poisson errors. This function generates the observation error model based on the Poisson error model.

              source
              +

              The observation error model for Poisson errors. This function generates the observation error model based on the Poisson error model.

              source
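              A small illustration of the return types documented in these observation_error methods (the numeric value 10.0 is arbitrary):

              using EpiAware

              # observation_error for PoissonError is documented above to return a SafePoisson
              # parameterised by the expected observation Y_t.
              d = EpiAware.EpiObsModels.observation_error(PoissonError(), 10.0)
              rand(d)   # a single simulated count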
              diff --git a/previews/PR510/lib/index.html b/previews/PR510/lib/index.html index 3f96aae2c..ab4e0c8f5 100644 --- a/previews/PR510/lib/index.html +++ b/previews/PR510/lib/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

              API reference

              Welcome to the EpiAware API reference! This section is designed to help you understand the API of the package, which is split into submodules.

              The EpiAware package itself contains no functions or types. Instead, it re-exports the functions and types from its submodules. See the sidebar for the list of submodules.

              +Overview · EpiAware.jl

              API reference

              Welcome to the EpiAware API reference! This section is designed to help you understand the API of the package, which is split into submodules.

              The EpiAware package itself contains no functions or types. Instead, it re-exports the functions and types from its submodules. See the sidebar for the list of submodules.
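              For example, after loading the umbrella package the submodule exports are directly available; the constructor defaults below follow the docstrings elsewhere in this reference:

              using EpiAware, Distributions

              obs = PoissonError()                              # re-exported from EpiObsModels
              rw = RandomWalk(init_prior = Normal(2.0, 1.0),
                  std_prior = HalfNormal(0.1))                  # re-exported from EpiLatentModels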

              diff --git a/previews/PR510/lib/internals/index.html b/previews/PR510/lib/internals/index.html index 974362131..24518c332 100644 --- a/previews/PR510/lib/internals/index.html +++ b/previews/PR510/lib/internals/index.html @@ -1,2 +1,2 @@ -Internal API · EpiAware.jl
              +Internal API · EpiAware.jl
              diff --git a/previews/PR510/lib/public/index.html b/previews/PR510/lib/public/index.html index b8bf79609..34b2e6c53 100644 --- a/previews/PR510/lib/public/index.html +++ b/previews/PR510/lib/public/index.html @@ -1,2 +1,2 @@ -Public API · EpiAware.jl
              +Public API · EpiAware.jl
              diff --git a/previews/PR510/objects.inv b/previews/PR510/objects.inv index 625865dd2..482238e72 100644 Binary files a/previews/PR510/objects.inv and b/previews/PR510/objects.inv differ diff --git a/previews/PR510/overview/index.html b/previews/PR510/overview/index.html index f76eeaeda..6b501de6b 100644 --- a/previews/PR510/overview/index.html +++ b/previews/PR510/overview/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

              Overview of the EpiAware Software Ecosystem

              EpiAware is not a standard toolkit for infectious disease modelling.

              It seeks to be highly modular and composable for advanced users whilst still providing opinionated workflows for those who are new to the field. Developed by the authors behind other widely used infectious disease modelling packages such as EpiNow2, epinowcast, and epidist, alongside experts in infectious disease modelling in Julia, EpiAware is designed to go beyond the capabilities of these packages by providing a more flexible and extensible framework for modelling and inference of infectious disease dynamics.

              Package Features

              • Flexible: The package is designed to be flexible, providing a consistent interface for fitting and simulating a wide range of models.
              • Modular: The package is designed to be modular, with a clear separation between the model and the data.
              • Extensible: The package is designed to be extensible, so that new model components can be composed with existing ones.
              • Consistent: The package provides a consistent interface for fitting and simulating models.
              • Efficient: The package is designed for efficient model fitting and simulation.

              Package structure

              EpiAware.jl is a wrapper around a series of submodules, each of which provides a different aspect of the package's functionality (much like the tidyverse in R). The package is designed to be modular, with a clear separation between modules and between modules and data. Currently included modules are:

              • EpiAwareBase: The core module, which provides the underlying abstract types and functions for the package.
              • EpiAwareUtils: A utility module, which provides a series of utility functions for working with the package.
              • EpiInference: An inference module, which provides a series of functions for fitting models to data. Builds on top of Turing.jl.
              • EpiInfModels: Provides tools for composing models of the disease transmission process. Builds on top of Turing.jl, in particular the DynamicPPL.jl interface.
              • EpiLatentModels: Provides tools for composing latent models such as random walks, autoregressive models, etc. Builds on top of DynamicPPL.jl. Used by all other modelling modules to define latent processes.
              • EpiObsModels: Provides tools for composing observation models, such as Poisson, Binomial, etc. Builds on top of DynamicPPL.jl.

              Using the package

              We support two primary workflows for using the package:

              • EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.
              • Turing interface: A lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.

              See the getting started section for tutorials on each of these workflows.

              +Overview · EpiAware.jl

              Overview of the EpiAware Software Ecosystem

              EpiAware is not a standard toolkit for infectious disease modelling.

              It seeks to be highly modular and composable for advanced users whilst still providing opinionated workflows for those who are new to the field. Developed by the authors behind other widely used infectious disease modelling packages such as EpiNow2, epinowcast, and epidist, alongside experts in infectious disease modelling in Julia, EpiAware is designed to go beyond the capabilities of these packages by providing a more flexible and extensible framework for modelling and inference of infectious disease dynamics.

              Package Features

              • Flexible: The package is designed to be flexible, providing a consistent interface for fitting and simulating a wide range of models.
              • Modular: The package is designed to be modular, with a clear separation between the model and the data.
              • Extensible: The package is designed to be extensible, so that new model components can be composed with existing ones.
              • Consistent: The package provides a consistent interface for fitting and simulating models.
              • Efficient: The package is designed for efficient model fitting and simulation.

              Package structure

              EpiAware.jl is a wrapper around a series of submodules, each of which provides a different aspect of the package's functionality (much like the tidyverse in R). The package is designed to be modular, with a clear separation between modules and between modules and data. Currently included modules are:

              • EpiAwareBase: The core module, which provides the underlying abstract types and functions for the package.
              • EpiAwareUtils: A utility module, which provides a series of utility functions for working with the package.
              • EpiInference: An inference module, which provides a series of functions for fitting models to data. Builds on top of Turing.jl.
              • EpiInfModels: Provides tools for composing models of the disease transmission process. Builds on top of Turing.jl, in particular the DynamicPPL.jl interface.
              • EpiLatentModels: Provides tools for composing latent models such as random walks, autoregressive models, etc. Builds on top of DynamicPPL.jl. Used by all other modelling modules to define latent processes.
              • EpiObsModels: Provides tools for composing observation models, such as Poisson, Binomial, etc. Builds on top of DynamicPPL.jl.

              Using the package

              We support two primary workflows for using the package:

              • EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.
              • Turing interface: A lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.

              See the getting started section for tutorials on each of these workflows.
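              As a rough sketch of the lower-level Turing interface, assuming a made-up delay distribution and following the generate_observations pattern used throughout the API reference (see the getting started tutorials for complete worked examples):

              using EpiAware, Turing

              # A negative binomial observation model with a reporting delay; the delay PMF is
              # illustrative only.
              obs = LatentDelay(NegativeBinomialError(), [0.2, 0.5, 0.3])
              mdl = generate_observations(obs, missing, fill(100.0, 20))
              prior_pred = sample(mdl, Prior(), 100; progress = false)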

              diff --git a/previews/PR510/release-notes/index.html b/previews/PR510/release-notes/index.html index 84e7f7377..4bb6d2f1c 100644 --- a/previews/PR510/release-notes/index.html +++ b/previews/PR510/release-notes/index.html @@ -1,2 +1,2 @@ -Release notes · EpiAware.jl
              +Release notes · EpiAware.jl
              diff --git a/previews/PR510/search_index.js b/previews/PR510/search_index.js index ff09424f1..cf1710a88 100644 --- a/previews/PR510/search_index.js +++ b/previews/PR510/search_index.js @@ -1,3 +1,3 @@ var documenterSearchIndex = {"docs": -[{"location":"getting-started/installation/#Installation","page":"Installation","title":"Installation","text":"","category":"section"},{"location":"getting-started/installation/","page":"Installation","title":"Installation","text":"Eventually, EpiAware is likely to be added to the Julia registry. Until then, you can install it from the /EpiAware sub-directory of this repository by running the following command in the Julia REPL:","category":"page"},{"location":"getting-started/installation/","page":"Installation","title":"Installation","text":"using Pkg; Pkg.add(url=\"https://github.com/CDCgov/Rt-without-renewal\", subdir=\"EpiAware\")","category":"page"},{"location":"lib/EpiInfModels/#EpiInfModels.jl","page":"Overview","title":"EpiInfModels.jl","text":"","category":"section"},{"location":"lib/EpiInfModels/","page":"Overview","title":"Overview","text":"This package provides infectious disease transmission models for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiInfModels/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiInfModels/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiInfModels/public.md\", \"lib/EpiInfModels/internals.md\"]","category":"page"},{"location":"getting-started/quickstart/#Quickstart","page":"Quickstart","title":"Quickstart","text":"","category":"section"},{"location":"getting-started/quickstart/","page":"Quickstart","title":"Quickstart","text":"Get up and running with EpiAware in just a few minutes using this quickstart guide.","category":"page"},{"location":"lib/EpiLatentModels/#EpiLatentModels.jl","page":"Overview","title":"EpiLatentModels.jl","text":"","category":"section"},{"location":"lib/EpiLatentModels/","page":"Overview","title":"Overview","text":"This package provides latent variable models for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiLatentModels/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiLatentModels/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiLatentModels/public.md\", \"lib/EpiLatentModels/internals.md\"]","category":"page"},{"location":"lib/#api-reference","page":"Overview","title":"API reference","text":"","category":"section"},{"location":"lib/","page":"Overview","title":"Overview","text":"Welcome to the EpiAware API reference! This section is designed to help you understand the API of the package which is split into submodules.","category":"page"},{"location":"lib/","page":"Overview","title":"Overview","text":"The EpiAware package itself contains no functions or types. Instead, it re-exports the functions and types from its submodules. 
See the sidebar for the list of submodules.","category":"page"},{"location":"lib/EpiLatentModels/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiLatentModels.jl's internal interface.","category":"page"},{"location":"lib/EpiLatentModels/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiLatentModels/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiLatentModels/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiLatentModels]\nPublic = false","category":"page"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiLatentModels.ARStep","page":"Internal API","title":"EpiAware.EpiLatentModels.ARStep","text":"struct ARStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep\n\nThe autoregressive (AR) step function struct\n\n\n\nFields\n\ndamp_AR::AbstractVector{<:Real}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiLatentModels.ARStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiLatentModels.ARStep","text":"The autoregressive (AR) step function for use with accumulate_scan.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.broadcast_n-Tuple{RepeatBlock, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.broadcast_n","text":"broadcast_n(_::RepeatBlock, n, period) -> Any\n\n\nA function that returns the length of the latent periods to generate using the RepeatBlock rule which is equal n divided by the period and rounded up to the nearest integer.\n\nArguments\n\nrule::RepeatBlock: The broadcasting rule.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.broadcast_n-Tuple{RepeatEach, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.broadcast_n","text":"broadcast_n(_::RepeatEach, n, period) -> Any\n\n\nA function that returns the length of the latent periods to generate using the RepeatEach rule which is equal to the period.\n\nArguments\n\nrule::RepeatEach: The broadcasting rule.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\nReturns\n\nm: The length of the latent periods to generate.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{AR, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::AR, n) -> Any\n\n\nGenerate a latent AR series.\n\nArguments\n\nlatent_model::AR: The AR model.\nn::Int: The length of the AR series.\n\nReturns\n\nar::Vector{Float64}: The generated AR series.\n\nNotes\n\nThe length of damp_prior and init_prior must be the same.\nn must be longer than the order of the 
autoregressive process.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{BroadcastLatentModel, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(model::BroadcastLatentModel, n) -> Any\n\n\nGenerates latent periods using the specified model and n number of samples.\n\nArguments\n\nmodel::BroadcastLatentModel: The broadcast latent model.\nn::Any: The number of samples to generate.\n\nReturns\n\nbroadcasted_latent: The generated broadcasted latent periods.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{CombineLatentModels, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(\n latent_models::CombineLatentModels,\n n\n) -> Any\n\n\nGenerate latent variables using a combination of multiple latent models.\n\nArguments\n\nlatent_models::CombineLatentModels: An instance of the CombineLatentModels type representing the collection of latent models.\nn: The number of latent variables to generate.\n\nReturns\n\nThe combined latent variables generated from all the models.\n\nExample\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{ConcatLatentModels, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_models::ConcatLatentModels, n) -> Any\n\n\nGenerate latent variables by concatenating multiple latent models.\n\nArguments\n\nlatent_models::ConcatLatentModels: An instance of the ConcatLatentModels type representing the collection of latent models.\nn: The number of latent variables to generate.\n\nReturns\n\nconcatenated_latents: The combined latent variables generated from all the models.\nlatent_aux: A tuple containing the auxiliary latent variables generated from each individual model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{DiffLatentModel, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::DiffLatentModel, n) -> Any\n\n\nGenerate a Turing model for n-step latent process Z_t using a differenced latent model defined by latent_model.\n\nArguments\n\nlatent_model::DiffLatentModel: The differential latent model.\nn: The length of the latent variables.\n\nTuring model specifications\n\nSampled random variables\n\nlatent_init: The initial latent process variables.\nOther random variables defined by model<:AbstractTuringLatentModel field of the undifferenced model.\n\nGenerated quantities\n\nA tuple containing the generated latent process as its first argument and a NamedTuple of sampled auxiliary variables as second argument.\n\nExample usage with DiffLatentModel model constructor\n\ngenerate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.\n\nFirst, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.\n\nusing Distributions, EpiAware\nrw = RandomWalk(Normal(0.0, 1.0), truncated(Normal(0.0, 0.05), 0.0, Inf))\n\nThen, we can use DiffLatentModel to construct a DiffLatentModel for d-fold differenced process with rw as the undifferenced latent process.\n\nWe have two constructor options for DiffLatentModel. 
The first option is to supply a common prior distribution for the initial terms and specify d as follows:\n\ndiff_model = DiffLatentModel(rw, Normal(); d = 2)\n\nOr we can supply a vector of priors for the initial terms and d is inferred as follows:\n\ndiff_model2 = DiffLatentModel(;undiffmodel = rw, init_priors = [Normal(), Normal()])\n\nThen, we can use generate_latent to construct a Turing model for the differenced latent process generating a length n process,\n\n# Construct a Turing model\nn = 100\ndifference_mdl = generate_latent(diff_model, n)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved latent process.\n\n#Sample random parameters from prior\nθ = rand(difference_mdl)\n#Get a sampled latent process as a generated quantity from the model\n(Z_t, _) = generated_quantities(difference_mdl, θ)\nZ_t\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{FixedIntercept, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::FixedIntercept, n) -> Any\n\n\nGenerate a latent intercept series with a fixed intercept value.\n\nArguments\n\nlatent_model::FixedIntercept: The fixed intercept latent model.\nn: The number of latent variables to generate.\n\nReturns\n\nlatent_vars: An array of length n filled with the fixed intercept value.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{HierarchicalNormal, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(obs_model::HierarchicalNormal, n) -> Any\n\n\nfunction EpiAwareBase.generate_latent(obs_model::HierarchicalNormal, n)\n\nGenerate latent variables from the hierarchical normal distribution.\n\nArguments\n\nobs_model::HierarchicalNormal: The hierarchical normal distribution model.\nn: Number of latent variables to generate.\n\nReturns\n\nη_t: Generated latent variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{Intercept, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::Intercept, n) -> Any\n\n\nGenerate a latent intercept series.\n\nArguments\n\nlatent_model::Intercept: The intercept model.\nn::Int: The length of the intercept series.\n\nReturns\n\nintercept::Vector{Float64}: The generated intercept series.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{RandomWalk, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::RandomWalk, n) -> Any\n\n\nImplement the generate_latent function for the RandomWalk model.\n\nExample usage of generate_latent with RandomWalk type of latent process model\n\nusing Distributions, Turing, EpiAware\n\n# Create a RandomWalk model\nrw = RandomWalk(init_prior = Normal(2., 1.),\n std_prior = HalfNormal(0.1))\n\nThen, we can use generate_latent to construct a Turing model for a 10 step random walk.\n\n# Construct a Turing model\nrw_model = generate_latent(rw, 10)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n#Sample random parameters from prior\nθ = rand(rw_model)\n#Get random walk sample path as a generated quantities from the model\nZ_t, _ = generated_quantities(rw_model, 
θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{TransformLatentModel, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(model::TransformLatentModel, n) -> Any\n\n\ngenerate_latent(model::TransformLatentModel, n)\n\nGenerate latent variables using the specified TransformLatentModel.\n\nArguments\n\nmodel::TransformLatentModel: The TransformLatentModel to generate latent variables from.\nn: The number of latent variables to generate.\n\nReturns\n\nThe transformed latent variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Documentation for EpiObsModels.jl's public interface.","category":"page"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiObsModels/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiObsModels/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiObsModels/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiObsModels]\nPrivate = false","category":"page"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels","page":"Public API","title":"EpiAware.EpiObsModels","text":"Module for defining observation models.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.Aggregate","page":"Public API","title":"EpiAware.EpiObsModels.Aggregate","text":"struct Aggregate{M<:AbstractTuringObservationModel, I<:(AbstractVector{<:Int64}), J<:(AbstractVector{<:Bool})} <: AbstractTuringObservationModel\n\nAggregates observations over a specified time period. For efficiency it also only passes the aggregated observations to the submodel. 
The aggregation vector is internally broadcasted to the length of the observations and the present vector is broadcasted to the length of the aggregation vector using broadcast_n.\n\nFields\n\nmodel::AbstractTuringObservationModel: The submodel to use for the aggregated observations.\naggregation::AbstractVector{<: Int}: The number of time periods to aggregate over.\npresent::AbstractVector{<: Bool}: A vector of booleans indicating whether the observation is present or not.\n\nConstructors\n\nAggregate(model, aggregation): Constructs an Aggregate object and automatically sets the present field.\nAggregate(; model, aggregation): Constructs an Aggregate object and automatically sets the present field using named keyword arguments\n\nExamples\n\nusing EpiAware\nweekly_agg = Aggregate(PoissonError(), [0, 0, 0, 0, 7, 0, 0])\ngen_obs = generate_observations(weekly_agg, missing, fill(1, 28))\ngen_obs()\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel\naggregation::AbstractVector{<:Int64}\npresent::AbstractVector{<:Bool}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.Ascertainment","page":"Public API","title":"EpiAware.EpiObsModels.Ascertainment","text":"struct Ascertainment{M<:AbstractTuringObservationModel, T<:AbstractTuringLatentModel, F<:Function, P<:String} <: AbstractTuringObservationModel\n\nThe Ascertainment struct represents an observation model that incorporates a ascertainment model. If a latent_prefixis supplied the latent_model is wrapped in a call to PrefixLatentModel.\n\nConstructors\n\nAscertainment(model::M, latent_model::T, transform::F, latent_prefix::P) where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, transform function, and latent prefix.\nAscertainment(; model::M, latent_model::T, transform::F = (Y_t, x) -> xexpy.(Y_t, x), latent_prefix::P = \"Ascertainment\") where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, optional transform function (default: (Y_t, x) -> xexpy.(Y_t, x)), and optional latent prefix (default: \"Ascertainment\").\n\nExamples\n\nusing EpiAware, Turing\nobs = Ascertainment(model = NegativeBinomialError(), latent_model = FixedIntercept(0.1))\ngen_obs = generate_observations(obs, missing, fill(100, 10))\nrand(gen_obs)\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel: The underlying observation model.\nlatent_model::AbstractTuringLatentModel: The latent model.\ntransform::Function: The function used to transform Y_t and the latent model output.\nlatent_prefix::String\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.LatentDelay","page":"Public API","title":"EpiAware.EpiObsModels.LatentDelay","text":"struct LatentDelay{M<:AbstractTuringObservationModel, T<:(AbstractVector{<:Real})} <: AbstractTuringObservationModel\n\nThe LatentDelay struct represents an observation model that introduces a latent delay in the observations. It is a subtype of AbstractTuringObservationModel.\n\nNote that the LatentDelay observation model shortens the expected observation vector by the length of the delay distribution and this is then passed to the underlying observation model. 
This is to prevent fitting to partially observed data.\n\nFields\n\nmodel::M: The underlying observation model.\nrev_pmf::T: The probability mass function (PMF) representing the delay distribution reversed.\n\nConstructors\n\nLatentDelay(model::M, distribution::C; D = nothing, Δd = 1.0) where {M <: AbstractTuringObservationModel, C <: ContinuousDistribution}: Constructs a LatentDelay object with the given underlying observation model and continuous distribution. The D parameter specifies the right truncation of the distribution, with default D = nothing indicates that the distribution should be truncated at its 99th percentile rounded to nearest multiple of Δd. The Δd parameter specifies the width of each delay interval.\nLatentDelay(model::M, pmf::T) where {M <: AbstractTuringObservationModel, T <: AbstractVector{<:Real}}: Constructs a LatentDelay object with the given underlying observation model and delay PMF.\n\nExamples\n\nusing Distributions, Turing, EpiAware\nobs = LatentDelay(NegativeBinomialError(), truncated(Normal(5.0, 2.0), 0.0, Inf))\nobs_model = generate_observations(obs, missing, fill(10, 30))\nobs_model()\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel\nrev_pmf::AbstractVector{<:Real}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.NegativeBinomialError","page":"Public API","title":"EpiAware.EpiObsModels.NegativeBinomialError","text":"struct NegativeBinomialError{S<:Distributions.Sampleable} <: AbstractTuringObservationErrorModel\n\nThe NegativeBinomialError struct represents an observation model for negative binomial errors. It is a subtype of AbstractTuringObservationModel.\n\nConstructors\n\nNegativeBinomialError(; cluster_factor_prior::Distribution = HalfNormal(0.1)): Constructs a NegativeBinomialError object with default values for the cluster factor prior.\nNegativeBinomialError(cluster_factor_prior::Distribution): Constructs a NegativeBinomialError object with a specified cluster factor prior.\n\nExamples\n\nusing Distributions, Turing, EpiAware\nnb = NegativeBinomialError()\nnb_model = generate_observations(nb, missing, fill(10, 10))\nrand(nb_model)\n\n\n\nFields\n\ncluster_factor_prior::Distributions.Sampleable: The prior distribution for the cluster factor.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.PoissonError","page":"Public API","title":"EpiAware.EpiObsModels.PoissonError","text":"struct PoissonError <: AbstractTuringObservationErrorModel\n\nThe PoissonError struct represents an observation model for Poisson errors. It is a subtype of AbstractTuringObservationErrorModel.\n\nConstructors\n\nPoissonError(): Constructs a PoissonError object.\n\nExamples\n\nusing Distributions, Turing, EpiAware\npoi = PoissonError()\npoi_model = generate_observations(poi, missing, fill(10, 10))\nrand(poi_model)\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.PrefixObservationModel","page":"Public API","title":"EpiAware.EpiObsModels.PrefixObservationModel","text":"struct PrefixObservationModel{M<:AbstractTuringObservationModel, P<:String} <: AbstractTuringObservationModel\n\nGenerate an observation model with a prefix. 
A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.\n\n# Constructors\n- `PrefixObservationModel(model::M, prefix::P)`: Create a `PrefixObservationModel` with the observation model `model` and the prefix `prefix`.\n- `PrefixObservationModel(; model::M, prefix::P)`: Create a `PrefixObservationModel` with the observation model `model` and the prefix `prefix`.\n\n# Examples\n```julia\nusing EpiAware\nobservation_model = PrefixObservationModel(Poisson(), \"Test\")\nobs = generate_observations(observation_model, 10)\nrand(obs)\n```\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel: The observation model\nprefix::String: The prefix for the observation model\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.RecordExpectedObs","page":"Public API","title":"EpiAware.EpiObsModels.RecordExpectedObs","text":"struct RecordExpectedObs{M<:AbstractTuringObservationModel} <: AbstractTuringObservationModel\n\nRecord a variable (using the Turing := syntax) in the observation model.\n\n# Fields\n- `model::AbstractTuringObservationModel`: The observation model to dispatch to.\n\n# Constructors\n\n- `RecordExpectedObs(model::AbstractTuringObservationModel)`: Record the expected observation from the model as `exp_y_t`.\n\n# Examples\n\n```julia\nusing EpiAware, Turing\nmdl = RecordExpectedObs(NegativeBinomialError())\ngen_obs = generate_observations(mdl, missing, fill(100, 10))\nsample(gen_obs, Prior(), 10)\n```\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.StackObservationModels","page":"Public API","title":"EpiAware.EpiObsModels.StackObservationModels","text":"struct StackObservationModels{M<:(AbstractVector{<:AbstractTuringObservationModel}), N<:(AbstractVector{<:AbstractString})} <: AbstractTuringObservationModel\n\nA stack of observation models that are looped over to generate observations for each model in the stack. Note that the model names are used to prefix the parameters in each model (so if I have a model named cases and a parameter y_t, the parameter in the model will be cases.y_t). Inside the constructor PrefixObservationModel is wrapped around each observation model.\n\nConstructors\n\nStackObservationModels(models::Vector{<:AbstractTuringObservationModel}, model_names::Vector{<:AbstractString}): Construct a StackObservationModels object with a vector of observation models and a vector of model names.\n`StackObservationModels(; models::Vector{<:AbstractTuringObservationModel},\nmodel_names::Vector{<:AbstractString}): Construct aStackObservationModels` object with a vector of observation models and a vector of model names.\nStackObservationModels(models::NamedTuple{names, T}): Construct a StackObservationModels object with a named tuple of observation models. 
The model names are automatically generated from the keys of the named tuple.\n\nExample\n\nusing EpiAware, Turing\n\nobs = StackObservationModels(\n (cases = PoissonError(), deaths = NegativeBinomialError())\n)\ny_t = (cases = missing, deaths = missing)\nobs_model = generate_observations(obs, y_t, fill(10, 10))\nrand(obs_model)\nsamples = sample(obs_model, Prior(), 100; progress = false)\n\ncases_y_t = group(samples, \"cases.y_t\")\ncases_y_t\n\ndeaths_y_t = group(samples, \"deaths.y_t\")\ndeaths_y_t\n\n\n\nFields\n\nmodels::AbstractVector{<:AbstractTuringObservationModel}: A vector of observation models.\nmodel_names::AbstractVector{<:AbstractString}: A vector of observation model names\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.ascertainment_dayofweek-Tuple{AbstractTuringObservationModel}","page":"Public API","title":"EpiAware.EpiObsModels.ascertainment_dayofweek","text":"ascertainment_dayofweek(\n model::AbstractTuringObservationModel;\n latent_model,\n transform,\n latent_prefix\n) -> Ascertainment{M, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#18#20\", String} where M<:AbstractTuringObservationModel\n\n\nCreate an Ascertainment object that models the ascertainment process based on the day of the week.\n\nArguments\n\nmodel::AbstractTuringObservationModel: The observation model to be used.\nlatent_model::AbstractTuringLatentModel: The latent model to be used. Default is HierarchicalNormal() which is a hierarchical normal distribution.\ntransform: The transform function to be used. Default is (x, y) -> x .* y.\n\nThis function is used to transform the latent model after broadcasting to periodic weekly has been applied.\n\nlatent_prefix: The prefix to be used for the latent model. Default is \"DayofWeek\".\n\nReturns\n\nAscertainment: The Ascertainment object that models the ascertainment process based on the day of the week.\n\nExamples\n\nusing EpiAware\nobs = ascertainment_dayofweek(PoissonError())\ngen_obs = generate_observations(obs, missing, fill(100, 14))\ngen_obs()\nrand(gen_obs)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.generate_observation_error_priors-Tuple{AbstractTuringObservationErrorModel, Any, Any}","page":"Public API","title":"EpiAware.EpiObsModels.generate_observation_error_priors","text":"generate_observation_error_priors(\n obs_model::AbstractTuringObservationErrorModel,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates priors for the observation error model. This should return a named tuple containing the priors required for generating the observation error distribution.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.generate_observation_error_priors-Tuple{NegativeBinomialError, Any, Any}","page":"Public API","title":"EpiAware.EpiObsModels.generate_observation_error_priors","text":"generate_observation_error_priors(\n obs_model::NegativeBinomialError,\n Y_t,\n y_t\n) -> Any\n\n\nGenerates observation error priors based on the NegativeBinomialError observation model. 
This function generates the cluster factor prior for the negative binomial error model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.observation_error-Tuple{AbstractTuringObservationErrorModel, Any}","page":"Public API","title":"EpiAware.EpiObsModels.observation_error","text":"observation_error(\n obs_model::AbstractTuringObservationErrorModel,\n Y_t\n) -> SafePoisson\n\n\nThe observation error distribution for the observation error model. This function should return the distribution for the observation error given the expected observation value Y_t and the priors generated by generate_observation_error_priors.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.observation_error-Tuple{NegativeBinomialError, Any, Any}","page":"Public API","title":"EpiAware.EpiObsModels.observation_error","text":"observation_error(\n obs_model::NegativeBinomialError,\n Y_t,\n sq_cluster_factor\n) -> SafeNegativeBinomial\n\n\nThis function generates the observation error model based on the negative binomial error model with a positive shift. It dispatches to the NegativeBinomialMeanClust distribution.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.observation_error-Tuple{PoissonError, Any}","page":"Public API","title":"EpiAware.EpiObsModels.observation_error","text":"observation_error(\n obs_model::PoissonError,\n Y_t\n) -> SafePoisson\n\n\nThe observation error model for Poisson errors. This function generates the observation error model based on the Poisson error model.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/julia/#Julia-for-EpiAware","page":"Working with Julia","title":"Julia for EpiAware","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia is a programming language aimed at technical computing. This guide is aimed at helping you set up Julia on your system and pointing towards resources for learning more.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"[!NOTE] If you are familar with other languages with tooling for technical computing (e.g. R, MATLAB, Python) these noteworthy differences may be useful.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Pages = [\"getting-started/tutorials/julia.md\"]\nDepth = 3","category":"page"},{"location":"getting-started/explainers/julia/#What-this-guide-is-and-isn't","page":"Working with Julia","title":"What this guide is and isn't","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This isn't a guide to learning the Julia programming language. Instead we providing an opinionated guide to setting up your system to use Julia effectively in project workflows aimed at people with familiarity with Julia but have maybe only developed projects in other languages (e.g. 
R, MATLAB, Python).","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"If you want to learn more about the Julia programming language, we recommend the following resources:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia Documentation - getting started.\nJulia Academy.\nJulia learning resources.\nJuliaHub.\nJulia Discourse.\nJulia Slack.","category":"page"},{"location":"getting-started/explainers/julia/#Julia-Installation-with-Juliaup","page":"Working with Julia","title":"Julia Installation with Juliaup","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Download Juliaup: This is a cross-platform installer/updater for the Julia programming language. It simplifies the process of installing and managing Julia versions. Go to the Juliaup GitHub repository or to the official Julia website for installation instructions.\nVerify Installation: Open a terminal (or Command Prompt on Windows) and type julia to start the Julia REPL (Read-Eval-Print Loop). You should see a Julia prompt julia>.","category":"page"},{"location":"getting-started/explainers/julia/#Basic-usage-of-Juliaup","page":"Working with Julia","title":"Basic usage of Juliaup","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Juliaup is a tool for managing Julia installations on your system. It allows you to install, update, and switch between different versions of Julia. Details are available at the Juliaup GitHub repository, but here are some examples of common commands:","category":"page"},{"location":"getting-started/explainers/julia/#Add-a-specific-version-of-Julia","page":"Working with Julia","title":"Add a specific version of Julia","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Juliaup default installs the latest release version of Julia. To install a specific version, use the add command followed by the version number. For example, to install Julia version 1.9.3, use the following command:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% juliaup add 1.9.3","category":"page"},{"location":"getting-started/explainers/julia/#Use-a-specific-version-of-Julia","page":"Working with Julia","title":"Use a specific version of Julia","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"To switch between different versions of Julia, use + julia-version after the julia command. For example, to use Julia version 1.9.3, use the following command:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% julia +1.9.3","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This will use the specified version of Julia for the current REPL. 
In general, adding the + julia-version flag after the julia command will execute using the specified version of Julia.","category":"page"},{"location":"getting-started/explainers/julia/#Check-versions-of-Julia-installed","page":"Working with Julia","title":"Check versions of Julia installed","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"To see a list of all the versions of Julia installed on your system, use the following command:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% juliaup list","category":"page"},{"location":"getting-started/explainers/julia/#Update-Julia-(all-versions-installed)","page":"Working with Julia","title":"Update Julia (all versions installed)","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This will update all versions of Julia installed on your system to their latest release versions.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% juliaup update","category":"page"},{"location":"getting-started/explainers/julia/#Usage-of-Julia-environments","page":"Working with Julia","title":"Usage of Julia environments","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The environment of a Julia project determines which packages, and their version, are available to the project. This is useful when you want to ensure that a project uses a specific version of a package, or when you want to isolate the project from other projects on your system. As per other languages, Julia environments are useful for managing dependencies and ensuring reproducibility.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The most common usage of environments is to create a new explicit environment for a project in a directory. This creates a Project.toml file in the directory that specifies the dependencies for the project and a Manifest.toml file that specifies the exact versions of the dependencies, and their underlying dependencies. We'll discuss how to set up a new environment for a project in the REPL section.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia environments can be stacked. This means that you can have a primary environment embedded in the stacked environment, along with secondary environment(s) that define common packages to be available to many projects.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"From a project development point of view, most commonly the project environment will be the primary environment, isolated from other project environments. And the environment of the Julia version installation (e.g. the @v1.10 env) will be a secondary environment because its in the default LOAD_PATH Julia environmental variable. You can add packages to the Julia version environment that you want to be available to all projects as we'll show in the REPL section. 
See section Recommended packages for the primary Julia environment for our recommendations.","category":"page"},{"location":"getting-started/explainers/julia/#Using-the-Julia-REPL-in-projects","page":"Working with Julia","title":"Using the Julia REPL in projects","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The Julia REPL (Read-Eval-Print Loop) is an interactive programming environment that takes single user inputs (i.e., single expressions), evaluates them, and returns the result to the user.","category":"page"},{"location":"getting-started/explainers/julia/#Package-management-programmatically-and-from-REPL","page":"Working with Julia","title":"Package management programmatically and from REPL","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia has a built-in package manager called Pkg, which is documented briefly here and in more detail here. The package manager is used to install, update, and manage Julia packages and environments.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"You can use Pkg programmatically as a normal Julia package, which is often done in scripts. For example, if we wanted to install the OrdinaryDiffEq package as part of executing a julia script, we would add the following lines to the script:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"using Pkg\nPkg.add(\"OrdinaryDiffEq\")","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"However, you can also use the package manager interactively from the REPL. In our opinion, this is the more common usage of package management in Julia project development.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"For example, to install the OrdinaryDiffEq package from the REPL you can switch to package mode by typing ] and then type add OrdinaryDiffEq. 
To exit package mode, type backspace.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"julia> ]\n(@v1.10) pkg> add OrdinaryDiffEq","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This workflow is often more convenient than the programmatic interface, especially when setting packages you want to install to the environment for your julia installation, e.g the @v1.10 environment for julia 1.10.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"By default, the environment for a julia installation is stacked as a primary environment, so that the packages you install in the julia installation environment are available to all projects.","category":"page"},{"location":"getting-started/explainers/julia/#Using-the-Julia-REPL-to-set-up-active-project-environments","page":"Working with Julia","title":"Using the Julia REPL to set up active project environments","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"To set a new active project environment, you can use the Pkg package manager from the REPL with the command activate with a local directory path. The project environment is named after the directory hosting the Project.toml file. After activating the project environment, you can manage packages to the project environment, as well as use packages from the primary stacked environment as described above.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Here is an example of how you can create a new environment for a project when the REPL working directory is in some directory /myproject, and then add OrdinaryDiffEq to the project environment:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"julia> pwd() #Check your directory\n# \"path/to/myproject\"\njulia> ]\n(@v1.10) pkg> activate .\n(myproject) pkg> add OrdinaryDiffEq","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Note that if the project directory doesn't have a Project.toml file, one will be created when you add the first package to the project environment.","category":"page"},{"location":"getting-started/explainers/julia/#Experimenting-with-Julia-from-REPL-using-a-temporary-environment","page":"Working with Julia","title":"Experimenting with Julia from REPL using a temporary environment","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"It is quite common to want to experiment with new Julia packages and code snippets. A convenient way to do this without setting up a new project environment or adding dependencies to the primary environment is to use a temporary environment. 
To do this:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"julia> ]\n(@v1.10) pkg> activate --temp\n(jl_FTIz6j) pkg> add InterestingPackage","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This will create a temporary environment, stacked with the primary environment, that is not saved to disk, and you can add packages to this environment without affecting the primary environment or any project environments. When you exit the REPL, the temporary environment will be deleted.","category":"page"},{"location":"getting-started/explainers/julia/#Recommended-packages-for-the-\"global\"-Julia-version-environment","page":"Working with Julia","title":"Recommended packages for the \"global\" Julia version environment","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"In our view, these packages are useful additions to your Julia version environment, e.g. the v1.10 environment, so that they are available to other environments.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Revise: For modifying package code and using the changes without restarting the Julia session.\nTerm: For pretty and stylized REPL output (including error messages).\nJuliaFormatter: For code formatting.\nDocumenter: For local documentation generation.\nPluto: A native Julia notebook for interactive development.\nTestEnv: For easy use of test environments for package testing.\nUnicodePlots: For simple and quick plotting in the REPL without needing to install a fully featured plotting package.","category":"page"},{"location":"getting-started/explainers/julia/#startup.jl-recommendation","page":"Working with Julia","title":"startup.jl recommendation","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Revise and Term are useful to have available in every Julia session. It is convenient to have these packages loaded automatically when you start a Julia session by adding a startup.jl file. This file should be located in the ~/.julia/config directory. Here is an example of a startup.jl file that loads Revise and Term:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"atreplinit() do repl\n # Load Revise if it is installed\n try\n @eval using Revise\n catch e\n @warn \"error while importing Revise\" e\n end\n # Load Term if it is installed\n try\n @eval using Term\n @eval install_term_repr()\n @eval install_term_stacktrace()\n catch e\n @warn \"error while importing Term\" e\n end\nend\n","category":"page"},{"location":"getting-started/explainers/julia/#Developing-a-EpiAware-project-from-VS-Code","page":"Working with Julia","title":"Developing a EpiAware-project from VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/#Julia-extension-for-VS-Code","page":"Working with Julia","title":"Julia extension for VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Visual Studio Code (VS-Code) is a popular code editor that supports Julia development. 
The Julia extension for VS-Code provides an interactive development environment that will be familiar to users of other scientific IDEs (e.g. developing R projects in RStudio or using the MATLAB application).","category":"page"},{"location":"getting-started/explainers/julia/#Features-of-the-Julia-extension-for-VS-Code","page":"Working with Julia","title":"Features of the Julia extension for VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"It is worth reading both the VS-Code documentation and the Julia extension documentation; however, here are some highlights:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia REPL: The Julia extension provides an integrated REPL in the TERMINAL pane that allows you to interact with Julia code directly from the editor. For example, you can run highlighted code snippets or code blocks delimited by ## comments in scripts.\nPlotting: By default, plots generated by featured plotting packages (e.g. Plots.jl) will be displayed in a Plot pane generated by the VS-Code editor.\nJulia Tab: The Julia extension provides a Julia tab with the following sub-tabs:\nWorkspace: This allows you to inspect the modules, functions and variables in your current REPL session. For variables that can be understood as a Table, you can view them in a tabular format from the workspace tab.\nDocumentation: This allows you to view the documentation for functions and types in the Julia standard library and any packages you have installed.\nPlot Navigator: This allows you to navigate the plots generated by the featured plotting packages.\nTesting: The Julia extension connects the Testing tab in VS-Code with Julia tests defined using the @testitem macro from the TestItems package and run with TestItemRunner.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Other standard IDE features are Code completion, Code linting, Code formatting, Debugging, and Profiling.","category":"page"},{"location":"getting-started/explainers/julia/#Recommended-settings-for-the-Julia-extension-in-VS-Code","page":"Working with Julia","title":"Recommended settings for the Julia extension in VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The settings of the Julia extension can be found by accessing Preferences: Open User Settings from the command palette in VS-Code and then searching for Julia.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"We recommend the following workspace settings saved in a file .vscode/settings.json relative to your working directory:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"{\n \"[julia]\": {\n \"editor.detectIndentation\": false,\n \"editor.insertSpaces\": true,\n \"editor.tabSize\": 4,\n \"files.insertFinalNewline\": true,\n \"files.trimFinalNewlines\": true,\n \"files.trimTrailingWhitespace\": true,\n \"editor.rulers\": [80],\n \"files.eol\": \"\\n\"\n },\n \"julia.liveTestFile\": \"path/to/runtests.jl\",\n \"julia.environmentPath\": 
\"path/to/project/directory\",\n}","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"These settings set basic code formatting and whitespace settings for Julia files, as well as setting the path to the test file for the project and the path to the project directory for the environment.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The VS-Code command Julia: Start REPL will start a REPL in TERMINAL tab in the editor with the environment set to the project directory and the Testing tab will detect the defined tests for the project.","category":"page"},{"location":"getting-started/explainers/julia/#Literate-programming-with-Julia-in-EpiAware","page":"Working with Julia","title":"Literate programming with Julia in EpiAware","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Its common to develop technical computing projects using a literate programming style, where code and documentation are interwoven. Julia supports this style of programming through a number of packages. In EpiAware we recommend the following:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Pluto: A native Julia notebook for interactive development. Pluto notebooks are reactive, meaning that the output of all cells are updated as input changes. Installation instructions are available here. Pluto notebook files have the extension .jl and can be run as scripts.\nQuarto: A literate programming tool that allows you to write documents in markdown with embedded Julia code. Installation instructions are available here. Quarto files have the extension .qmd.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"We use Pluto for interactive development and Quarto for generating reports and academic articles. 
Both tools are useful for developing reproducible workflows.","category":"page"},{"location":"getting-started/explainers/inference/#Inference","page":"Inference","title":"Inference","text":"","category":"section"},{"location":"lib/EpiInference/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Documentation for EpiInference.jl's public interface.","category":"page"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiInference/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInference/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiInference/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiInference]\nPrivate = false","category":"page"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference","page":"Public API","title":"EpiAware.EpiInference","text":"Module for defining inference methods.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference.ManyPathfinder","page":"Public API","title":"EpiAware.EpiInference.ManyPathfinder","text":"struct ManyPathfinder <: AbstractEpiOptMethod\n\nA variational inference method that runs manypathfinder.\n\n\n\nFields\n\nndraws::Int64: Number of draws per pathfinder run.\nnruns::Int64: Number of many pathfinder runs.\nmaxiters::Int64: Maximum number of optimization iterations for each run.\nmax_tries::Int64: Maximum number of tries if all runs fail.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference.NUTSampler","page":"Public API","title":"EpiAware.EpiInference.NUTSampler","text":"struct NUTSampler{A<:ADTypes.AbstractADType, E<:AbstractMCMC.AbstractMCMCEnsemble, M} <: AbstractEpiSamplingMethod\n\nA NUTS method for sampling from a DynamicPPL.Model object.\n\nThe NUTSampler struct represents using the No-U-Turn Sampler (NUTS) to sample from the distribution defined by a DynamicPPL.Model.\n\n\n\nFields\n\ntarget_acceptance::Float64: The target acceptance rate for the sampler.\nadtype::ADTypes.AbstractADType: The automatic differentiation type used for computing gradients.\nmcmc_parallel::AbstractMCMC.AbstractMCMCEnsemble: The parallelization strategy for the MCMC sampler.\nnchains::Int64: The number of MCMC chains to run.\nmax_depth::Int64: Tree depth limit for the NUTS sampler.\nΔ_max::Float64: Divergence threshold for the NUTS sampler.\ninit_ϵ::Float64: The initial step size for the NUTS sampler.\nndraws::Int64: The number of samples to draw from each chain.\nmetricT::Any: The metric type to use for the HMC sampler.\nnadapts::Int64: number of adaptation steps\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference.manypathfinder-Tuple{DynamicPPL.Model, Any}","page":"Public 
API","title":"EpiAware.EpiInference.manypathfinder","text":"manypathfinder(\n mdl::DynamicPPL.Model,\n ndraws;\n nruns,\n maxiters,\n max_tries,\n kwargs...\n) -> Any\n\n\nRun multiple instances of the pathfinder algorithm and returns the pathfinder run with the largest ELBO estimate.\n\nArguments\n\nmdl::DynamicPPL.Model: The model to perform inference on.\nnruns::Int: The number of pathfinder runs to perform.\nndraws::Int: The number of draws per pathfinder run, readjusted to be at least as large as the number of chains.\nnchains::Int: The number of chains that will be initialised by pathfinder draws.\nmaxiters::Int: The maximum number of optimizer iterations per pathfinder run.\nmax_tries::Int: The maximum number of extra tries to find a valid pathfinder result.\nkwargs...: Additional keyword arguments passed to pathfinder.\n\nReturns\n\nbest_pfs::PathfinderResult: Best pathfinder result by estimated ELBO.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Documentation for EpiLatentModels.jl's public interface.","category":"page"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiLatentModels/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiLatentModels/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiLatentModels/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiLatentModels]\nPrivate = false","category":"page"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels","page":"Public API","title":"EpiAware.EpiLatentModels","text":"Module for defining latent models.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.AR","page":"Public API","title":"EpiAware.EpiLatentModels.AR","text":"struct AR{D<:Distributions.Sampleable, S<:Distributions.Sampleable, I<:Distributions.Sampleable, P<:Int64} <: AbstractTuringLatentModel\n\nThe autoregressive (AR) model struct.\n\nConstructors\n\nAR(damp_prior::Distribution, std_prior::Distribution, init_prior::Distribution; p::Int = 1): Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model can also be specified.\nAR(; damp_priors::Vector{D} = [truncated(Normal(0.0, 0.05))], std_prior::Distribution = truncated(Normal(0.0, 0.05), 0.0, Inf), init_priors::Vector{I} = [Normal()]) where {D <: Distribution, I <: Distribution}: Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. 
The order of the AR model is determined by the length of the damp_priors vector.\nAR(damp_prior::Distribution, std_prior::Distribution, init_prior::Distribution, p::Int): Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model is explicitly specified.\n\nExamples\n\nusing Distributions\nusing EpiAware\nar = AR()\nar_model = generate_latent(ar, 10)\nrand(ar_model)\n\n\n\nFields\n\ndamp_prior::Distributions.Sampleable: Prior distribution for the damping coefficients.\nstd_prior::Distributions.Sampleable: Prior distribution for the standard deviation.\ninit_prior::Distributions.Sampleable: Prior distribution for the initial conditions\np::Int64: Order of the AR model.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.BroadcastLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.BroadcastLatentModel","text":"struct BroadcastLatentModel{M<:AbstractTuringLatentModel, P<:Integer, B<:AbstractBroadcastRule} <: AbstractTuringLatentModel\n\nThe BroadcastLatentModel struct represents a latent model that supports broadcasting of latent periods.\n\nConstructors\n\nBroadcastLatentModel(;model::M; period::Int, broadcast_rule::B): Constructs a BroadcastLatentModel with the given model, period, and broadcast_rule.\nBroadcastLatentModel(model::M, period::Int, broadcast_rule::B): An alternative constructor that allows the model, period, and broadcast_rule to be specified without keyword arguments.\n\nExamples\n\nusing EpiAware, Turing\neach_model = BroadcastLatentModel(RandomWalk(), 7, RepeatEach())\ngen_each_model = generate_latent(each_model, 10)\nrand(gen_each_model)\n\nblock_model = BroadcastLatentModel(RandomWalk(), 3, RepeatBlock())\ngen_block_model = generate_latent(block_model, 10)\nrand(gen_block_model)\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: The underlying latent model.\nperiod::Integer: The period of the broadcast.\nbroadcast_rule::AbstractBroadcastRule: The broadcast rule to be applied.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.CombineLatentModels","page":"Public API","title":"EpiAware.EpiLatentModels.CombineLatentModels","text":"struct CombineLatentModels{M<:(AbstractVector{<:AbstractTuringLatentModel}), P<:(AbstractVector{<:String})} <: AbstractTuringLatentModel\n\nThe CombineLatentModels struct.\n\nThis struct is used to combine multiple latent models into a single latent model. If a prefix is supplied wraps each model with PrefixLatentModel.\n\nConstructors\n\nCombineLatentModels(models::M, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{<:String}}: Constructs a CombineLatentModels instance with specified models and prefixes, ensuring that there are at least two models and the number of models and prefixes are equal.\nCombineLatentModels(models::M) where {M <: AbstractVector{<:AbstractTuringLatentModel}}: Constructs a CombineLatentModels instance with specified models, automatically generating prefixes for each model. 
The\n\nautomatic prefixes are of the form Combine.1, Combine.2, etc.\n\nExamples\n\nusing EpiAware, Distributions\ncombined_model = CombineLatentModels([Intercept(Normal(2, 0.2)), AR()])\nlatent_model = generate_latent(combined_model, 10)\nlatent_model()\n\n\n\nFields\n\nmodels::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models\nprefixes::AbstractVector{<:String}: A vector of prefixes for the latent models\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.ConcatLatentModels","page":"Public API","title":"EpiAware.EpiLatentModels.ConcatLatentModels","text":"struct ConcatLatentModels{M<:(AbstractVector{<:AbstractTuringLatentModel}), N<:Int64, F<:Function, P<:(AbstractVector{<:String})} <: AbstractTuringLatentModel\n\nThe ConcatLatentModels struct.\n\nThis struct is used to concatenate multiple latent models into a single latent model.\n\nConstructors\n\nConcatLatentModels(models::M, no_models::I, dimension_adaptor::F, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, I <: Int, F <: Function, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, number of models, dimension adaptor, and prefixes.\nConcatLatentModels(models::M, dimension_adaptor::F; prefixes::P = \"Concat.\" * string.(1:length(models))) where {M <: AbstractVector{<:AbstractTuringLatentModel}, F <: Function}: Constructs a ConcatLatentModels instance with specified models and dimension adaptor. The number of models is automatically determined as are the prefixes (of the form Concat.1, Concat.2, etc.) by default.\nConcatLatentModels(models::M; dimension_adaptor::Function, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, dimension adaptor, prefixes, and automatically determines the number of models.The default dimension adaptor is equal_dimensions. The default prefixes are of the form Concat.1, Concat.2, etc.\nConcatLatentModels(; models::M, dimension_adaptor::Function, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, dimension adaptor, prefixes, and automatically determines the number of models. The default dimension adaptor is equal_dimensions. The default prefixes are of the form Concat.1, Concat.2, etc.\n\nExamples\n\nusing EpiAware, Distributions\ncombined_model = ConcatLatentModels([Intercept(Normal(2, 0.2)), AR()])\nlatent_model = generate_latent(combined_model, 10)\nlatent_model()\n\n\n\nFields\n\nmodels::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models\nno_models::Int64: The number of models in the collection\ndimension_adaptor::Function: The dimension function for the latent variables. 
By default this divides the number of latent variables by the number of models and returns a vector of dimensions rounding up the first element and rounding down the rest.\nprefixes::AbstractVector{<:String}: A vector of prefixes for the latent models\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.DiffLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.DiffLatentModel","text":"struct DiffLatentModel{M<:AbstractTuringLatentModel, P<:Distributions.Distribution} <: AbstractTuringLatentModel\n\nModel the latent process as a d-fold differenced version of another process.\n\nMathematical specification\n\nLet Delta be the differencing operator. If tildeZ_t is a realisation of undifferenced latent model supplied to DiffLatentModel, then the differenced process is given by,\n\nDelta^(d) Z_t = tildeZ_t quad t = d+1 ldots\n\nWe can recover Z_t by applying the inverse differencing operator Delta^-1, which corresponds to the cumulative sum operator cumsum in Julia, d-times. The d initial terms Z_1 ldots Z_d are inferred.\n\nConstructors\n\nDiffLatentModel(latent_model, init_prior_distribution::Distribution; d::Int) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. All initial terms have common prior init_prior_distribution.\nDiffLatentModel(;model, init_priors::Vector{D} where {D <: Distribution}) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. The d initial terms have priors given by the vector init_priors, therefore length(init_priors) sets d.\n\nExample usage with generate_latent\n\ngenerate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.\n\nFirst, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.\n\nusing Distributions, EpiAware\nrw = RandomWalk(Normal(0.0, 1.0), truncated(Normal(0.0, 0.05), 0.0, Inf))\n\nThen, we can use DiffLatentModel to construct a DiffLatentModel for d-fold differenced process with rw as the undifferenced latent process.\n\nWe have two constructor options for DiffLatentModel. 
The first option is to supply a common prior distribution for the initial terms and specify d as follows:\n\ndiff_model = DiffLatentModel(rw, Normal(); d = 2)\n\nOr we can supply a vector of priors for the initial terms and d is inferred as follows:\n\ndiff_model2 = DiffLatentModel(;undiffmodel = rw, init_priors = [Normal(), Normal()])\n\nThen, we can use generate_latent to construct a Turing model for the differenced latent process generating a length n process,\n\n# Construct a Turing model\nn = 100\ndifference_mdl = generate_latent(diff_model, n)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved latent process.\n\n#Sample random parameters from prior\nθ = rand(difference_mdl)\n#Get a sampled latent process as a generated quantity from the model\n(Z_t, _) = generated_quantities(difference_mdl, θ)\nZ_t\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: Underlying latent model for the differenced process\ninit_prior::Distributions.Distribution: The prior distribution for the initial latent variables.\nd::Int64: Number of times differenced.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.FixedIntercept","page":"Public API","title":"EpiAware.EpiLatentModels.FixedIntercept","text":"struct FixedIntercept{F<:Real} <: AbstractTuringIntercept\n\nA variant of the Intercept struct that represents a fixed intercept value for a latent model.\n\nConstructors\n\nFixedIntercept(intercept) : Constructs a FixedIntercept instance with the specified intercept value.\nFixedIntercept(; intercept) : Constructs a FixedIntercept instance with the specified intercept value using named arguments.\n\nExamples\n\nusing EpiAware\nfi = FixedIntercept(2.0)\nfi_model = generate_latent(fi, 10)\nfi_model()\n\n\n\nFields\n\nintercept::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.HierarchicalNormal","page":"Public API","title":"EpiAware.EpiLatentModels.HierarchicalNormal","text":"struct HierarchicalNormal{R<:Real, D<:Distributions.Sampleable} <: AbstractTuringLatentModel\n\nThe HierarchicalNormal struct represents a non-centered hierarchical normal distribution.\n\nConstructors\n\nHierarchicalNormal(mean, std_prior): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior.\nHierarchicalNormal(; mean = 0.0, std_prior = truncated(Normal(0,1), 0, Inf)): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior using named arguments and with default values.\n\nExamples\n\nusing Distributions, EpiAware\nhnorm = HierarchicalNormal(0.0, truncated(Normal(0, 1), 0, Inf))\nhnorm_model = generate_latent(hnorm, 10)\nhnorm_model()\n\n\n\nFields\n\nmean::Real\nstd_prior::Distributions.Sampleable\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.Intercept","page":"Public API","title":"EpiAware.EpiLatentModels.Intercept","text":"struct Intercept{D<:Distributions.Sampleable} <: AbstractTuringIntercept\n\nThe Intercept struct is used to model the intercept of a latent process. 
It broadcasts a single intercept value to a length n latent process.\n\nConstructors\n\nIntercept(intercept_prior)\nIntercept(; intercept_prior)\n\nExamples\n\nusing Distributions, Turing, EpiAware\nint = Intercept(Normal(0, 1))\nint_model = generate_latent(int, 10)\nrand(int_model)\nint_model()\n\n\n\nFields\n\nintercept_prior::Distributions.Sampleable: Prior distribution for the intercept.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.PrefixLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.PrefixLatentModel","text":"struct PrefixLatentModel{M<:AbstractTuringLatentModel, P<:String} <: AbstractTuringLatentModel\n\nGenerate a latent model with a prefix. A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.\n\n# Constructors\n- `PrefixLatentModel(model::M, prefix::P)`: Create a `PrefixLatentModel` with the latent model `model` and the prefix `prefix`.\n- `PrefixLatentModel(; model::M, prefix::P)`: Create a `PrefixLatentModel` with the latent model `model` and the prefix `prefix`.\n\n# Examples\n```julia\nusing EpiAware\nlatent_model = PrefixLatentModel(model = HierarchicalNormal(), prefix = \"Test\")\nmdl = generate_latent(latent_model, 10)\nrand(mdl)\n```\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: The latent model\nprefix::String: The prefix for the latent model\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RandomWalk","page":"Public API","title":"EpiAware.EpiLatentModels.RandomWalk","text":"struct RandomWalk{D<:Distributions.Sampleable, S<:Distributions.Sampleable} <: AbstractTuringLatentModel\n\nModel latent process Z_t as a random walk.\n\nMathematical specification\n\nThe random walk Z_t is specified as a parameteric transformation of the white noise sequence (epsilon_t)_tgeq 1,\n\nZ_t = Z_0 + sigma sum_i = 1^t epsilon_t\n\nConstructing a random walk requires specifying:\n\nAn init_prior as a prior for Z_0. Default is Normal().\nA std_prior for sigma. 
The default is HalfNormal with a mean of 0.25.\n\nConstructors\n\nRandomWalk(; init_prior, std_prior)\n\nExample usage with generate_latent\n\ngenerate_latent can be used to construct a Turing model for the random walk Z_t.\n\nFirst, we construct a RandomWalk struct with priors,\n\nusing Distributions, Turing, EpiAware\n\n# Create a RandomWalk model\nrw = RandomWalk(init_prior = Normal(2., 1.),\n std_prior = HalfNormal(0.1))\n\nThen, we can use generate_latent to construct a Turing model for a 10 step random walk.\n\n# Construct a Turing model\nrw_model = generate_latent(rw, 10)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n#Sample random parameters from prior\nθ = rand(rw_model)\n#Get random walk sample path as a generated quantities from the model\nZ_t, _ = generated_quantities(rw_model, θ)\n\n\n\nFields\n\ninit_prior::Distributions.Sampleable\nstd_prior::Distributions.Sampleable\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RecordExpectedLatent","page":"Public API","title":"EpiAware.EpiLatentModels.RecordExpectedLatent","text":"struct RecordExpectedLatent{M<:AbstractTuringLatentModel} <: AbstractTuringLatentModel\n\nRecord a variable (using the Turing := syntax) in a latent model.\n\n# Fields\n- `model::AbstractTuringLatentModel`: The latent model to dispatch to.\n\n# Constructors\n\n- `RecordExpectedLatent(model::AbstractTuringLatentModel)`: Record the expected latent vector from the model as `exp_latent`.\n\n# Examples\n\n```julia\nusing EpiAware, Turing\nmdl = RecordExpectedLatent(FixedIntercept(0.1))\ngen_latent = generate_latent(mdl, 1)\nsample(gen_latent, Prior(), 10)\n```\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RepeatBlock","page":"Public API","title":"EpiAware.EpiLatentModels.RepeatBlock","text":"struct RepeatBlock <: AbstractBroadcastRule\n\nRepeatBlock is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.\n\nIt repeats the latent process in blocks of size period. An example of this rule is to repeat the latent process in blocks of size 7 to model a weekly process (though for this we also provide the broadcast_weekly helper function).\n\nExamples\n\nusing EpiAware\nrule = RepeatBlock()\nlatent = [1, 2, 3, 4, 5]\nn = 10\nperiod = 2\nbroadcast_rule(rule, latent, n, period)\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RepeatEach","page":"Public API","title":"EpiAware.EpiLatentModels.RepeatEach","text":"struct RepeatEach <: AbstractBroadcastRule\n\nRepeatEach is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.\n\nIt repeats the latent process at each period. 
An example of this rule is to repeat the latent process at each day of the week (though for this we also provide the dayofweek helper function).\n\nExamples\n\nusing EpiAware\nrule = RepeatEach()\nlatent = [1, 2]\nn = 10\nperiod = 2\nbroadcast_rule(rule, latent, n, period)\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.TransformLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.TransformLatentModel","text":"struct TransformLatentModel{M<:AbstractTuringLatentModel, F<:Function} <: AbstractTuringLatentModel\n\nThe TransformLatentModel struct represents a latent model that applies a transformation function to the latent variables generated by another latent model.\n\nConstructors\n\nTransformLatentModel(model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function.\nTransformLatentModel(; model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function using named arguments.\n\nExample\n\nusing EpiAware, Distributions\ntrans = TransformLatentModel(Intercept(Normal(2, 0.2)), x -> x .|> exp)\ntrans_model = generate_latent(trans, 5)\ntrans_model()\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: The latent model to transform.\ntrans_function::Function: The transformation function.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiAwareBase.broadcast_rule-Tuple{RepeatBlock, Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_rule","text":"broadcast_rule(_::RepeatBlock, latent, n, period) -> Any\n\n\nbroadcast_rule is a function that applies the RepeatBlock rule to the latent process latent to generate n samples.\n\nArguments\n\nrule::RepeatBlock: The broadcasting rule.\nlatent::Vector: The latent process.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\nReturns\n\nlatent: The generated broadcasted latent periods.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiAwareBase.broadcast_rule-Tuple{RepeatEach, Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_rule","text":"broadcast_rule(_::RepeatEach, latent, n, period) -> Any\n\n\nbroadcast_rule is a function that applies the RepeatEach rule to the latent process latent to generate n samples.\n\nArguments\n\nrule::RepeatEach: The broadcasting rule.\nlatent::Vector: The latent process.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\nReturns\n\nlatent: The generated broadcasted latent periods.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.broadcast_dayofweek-Tuple{AbstractTuringLatentModel}","page":"Public API","title":"EpiAware.EpiLatentModels.broadcast_dayofweek","text":"broadcast_dayofweek(\n model::AbstractTuringLatentModel;\n link\n) -> BroadcastLatentModel{TransformLatentModel{M, EpiAware.EpiLatentModels.var\"#42#44\"}, Int64, RepeatEach} where M<:AbstractTuringLatentModel\n\n\nConstructs a BroadcastLatentModel appropriate for modelling the day of the week for a given AbstractTuringLatentModel.\n\nArguments\n\nmodel::AbstractTuringLatentModel: The latent model to be repeated.\nlink::Function: The link function to transform the latent model before broadcasting\n\nto periodic weekly. 
Default is x -> 7 * softmax(x) which implements constraint of the sum week effects to be 7.\n\nReturns\n\nBroadcastLatentModel: The broadcast latent model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.broadcast_weekly-Tuple{AbstractTuringLatentModel}","page":"Public API","title":"EpiAware.EpiLatentModels.broadcast_weekly","text":"broadcast_weekly(\n model::AbstractTuringLatentModel\n) -> BroadcastLatentModel{<:AbstractTuringLatentModel, Int64, RepeatBlock}\n\n\nConstructs a BroadcastLatentModel appropriate for modelling piecewise constant weekly processes for a given AbstractTuringLatentModel.\n\nArguments\n\nmodel::AbstractTuringLatentModel: The latent model to be repeated.\n\nReturns\n\nBroadcastLatentModel: The broadcast latent model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.equal_dimensions-Tuple{Int64, Int64}","page":"Public API","title":"EpiAware.EpiLatentModels.equal_dimensions","text":"equal_dimensions(n::Int64, m::Int64) -> Vector{Int64}\n\n\nReturn a vector of dimensions that are equal or as close as possible, given the total number of elements n and the number of dimensions m. The default dimension adaptor for ConcatLatentModels.\n\nArguments\n\nn::Int: The total number of elements.\nm::Int: The number of dimensions.\n\nReturns\n\ndims::AbstractVector{Int}: A vector of dimensions, where the first element is the ceiling of n / m and the remaining elements are the floor of n / m.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/faq/#Frequently-asked-questions","page":"Frequently asked questions","title":"Frequently asked questions","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"This page contains a list of frequently asked questions about the EpiAware package. If you have a question that is not answered here, please open a discussion on the GitHub repository.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"Pages = [\"lib/getting-started/faq.md\"]","category":"page"},{"location":"getting-started/faq/#Pluto-notebooks","page":"Frequently asked questions","title":"Pluto notebooks","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"In some of the showcase examples in EpiAware/docs/src/showcase we use Pluto.jl notebooks for the underlying computation. As well as reading the code blocks and output of the notebooks in this documentation, you can also run these notebooks by cloning EpiAware and running the notebooks with Pluto.jl (for further details see developer notes).","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"It should be noted that Pluto.jl notebooks are reactive, meaning that they re-run downstream code after changes with downstreaming determined by a tree of dependent code blocks. This is different from the standard Julia REPL, and some other notebook formats (e.g. .ipynb). In Pluto each code block is a single lines of code or encapsulated by let ... end and begin ... end. The difference between let ... end blocks and begin ... end blocks are that the let ... 
end type of code block only adds the final output/return value of the block to scope, like an anonymous function, whereas begin ... end executes each line and adds defined variables to scope.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"For installation instructions and more information and documentation on Pluto.jl see the Pluto.jl documentation.","category":"page"},{"location":"getting-started/faq/#Manipulating-EpiAware-model-specifications","page":"Frequently asked questions","title":"Manipulating EpiAware model specifications","text":"","category":"section"},{"location":"getting-started/faq/#Modular-model-construction","page":"Frequently asked questions","title":"Modular model construction","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"One of the key features of EpiAware is the ability to specify models as components of a larger model. This is useful for specifying models that are shared across multiple EpiProblems or for specifying models that are used in multiple methods. You can see an examples of this approach in our showcases.","category":"page"},{"location":"getting-started/faq/#Remaking-models","page":"Frequently asked questions","title":"Remaking models","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"An alternative to modular model construction is to remake models with different parameters. This can be useful for comparing models with different parameters or for comparing models with different priors. Whilst we don't have a built in function for this, we recommend the Accessors.jl package for this purpose. For examples of how to use this package see the documentation.","category":"page"},{"location":"getting-started/faq/#Working-with-Turing.jl-models","page":"Frequently asked questions","title":"Working with Turing.jl models","text":"","category":"section"},{"location":"getting-started/faq/#[DynamicPPL.jl](https://github.com/TuringLang/DynamicPPL.jl)","page":"Frequently asked questions","title":"DynamicPPL.jl","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"Whilst Turing.jl is the front end of the Turing.jl ecosystem, it is not the only package that can be used to work with Turing.jl models. DynamicPPL.jl is the part of the ecosytem that deals with defining, running, and manipulating models.","category":"page"},{"location":"getting-started/faq/#Conditioning-and-deconditioning-models","page":"Frequently asked questions","title":"Conditioning and deconditioning models","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"DynamicPPL supports the condition (alased with |) to fix values as known observations in the model (i.e fixing values on the left hand side of ~ definitions). This is useful for fixing parameters to known values or for conditioning the model on data. The decondition function can be used to remove these conditions. Internally this is what apply_method(::EpiProblem, ...) does to condition the user supplied EpiProblem to data. 
See more here.","category":"page"},{"location":"getting-started/faq/#Fixing-and-unfixing-models","page":"Frequently asked questions","title":"Fixing and unfixing models","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"Similarly to conditioning and deconditioning models, DynamicPPL supports fixing and unfixing models via the fix and unfix functions. Fixing is essentially saying that variables are constants (i.e. replacing the right hand side of ~ with a value and changing the ~ to a =). A common use of this would be to simplify a prespecified model, for example to make the variance of a random walk be known versus estimated from the data. We also use this functionality in apply_method(::EpiProblem, ...) to allow users to simplify EpiProblems on the fly. See more here.","category":"page"},{"location":"getting-started/faq/#Tools-for-working-with-MCMCChain-objects","page":"Frequently asked questions","title":"Tools for working with MCMCChain objects","text":"","category":"section"},{"location":"getting-started/faq/#[MCMCChain.jl](https://turinglang.org/MCMCChains.jl/stable/)","page":"Frequently asked questions","title":"MCMCChain.jl","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"MCMCChain.jl is the package from which MCMCChains is imported. It provides a number of useful functions for working with MCMCChain objects. These include functions for summarising, plotting, and manipulating chains. Below is a list of some of the most useful functions.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"plot: Plots trace and density plots for each parameter in the chain object.\nhistogram: Plots histograms for each parameter in the chain object by chain.\nget: Accesses the values of a parameter/s in the chain object.\nDataFrames.DataFrame converts a chain into a wide format DataFrame.\ndescribe: Prints the summary statistics of the chain object.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"There are many more functions available in the MCMCChain.jl package. For a full list of functions, see the documentation.","category":"page"},{"location":"getting-started/faq/#[Arviz.jl](https://julia.arviz.org/ArviZ/stable/)","page":"Frequently asked questions","title":"Arviz.jl","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"An alternative to MCMCChain.jl is the ArviZ.jl package. ArviZ.jl is a Julia meta-package for exploratory analysis of Bayesian models. It is part of the ArviZ project, which also includes a related Python package.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"ArviZ.jl uses an InferenceData object to store the results of a Bayesian analysis. This object can be created from an MCMCChain object using the from_mcmcchains function. The InferenceData object can then be used to create a range of plots and summaries of the model. 
This is particularly useful as it allows you to specify the indexes of your parameters (for example you could use dates for time parameters).","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"In addition to this useful functionality from_mcmcchains can also be used to combine posterior predictions with prior predictions, prior information and the log likelihood of the model (see here for an example of this). This unlocks a range of useful diagnostics and plots that can be used to assess the model.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"There is a lot of functionality in ArviZ.jl and it is worth exploring the documentation to see what is available.","category":"page"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"\n\n\n\n\n\n\n\n

              Example: Statistical inference for ODE-based infectious disease models

              Introduction

              What are we going to do in this Vignette

               In this vignette, we'll demonstrate how to use EpiAware in conjunction with the SciML ecosystem for Bayesian inference of infectious disease dynamics. The model and data are heavily based on Contemporary statistical inference for infectious disease models using Stan by Chatzilena et al. 2019.

              We'll cover the following key points:

              1. Defining the deterministic ODE model from Chatzilena et al section 2.2.2 using SciML ODE functionality and an EpiAware observation model.

              2. Build on this to define the stochastic ODE model from Chatzilena et al section 2.2.3 using an EpiAware observation model.

              3. Fitting the deterministic ODE model to data from an Influenza outbreak in an English boarding school.

              4. Fitting the stochastic ODE model to data from an Influenza outbreak in an English boarding school.

              What might I need to know before starting

               This vignette builds on concepts from EpiAware observation models, and familiarity with the SciML and Turing ecosystems would be useful but not essential.

              Packages used in this vignette

               Alongside the EpiAware package we will use the OrdinaryDiffEq and SciMLSensitivity packages for interfacing with the SciML ecosystem; this is a lower dependency usage of DifferentialEquations.jl that exposes, respectively, ODE solvers and adjoint methods for ODE solves, that is, methods for propagating parameter derivatives through functions containing ODE solutions. Bayesian inference will be done with NUTS from the Turing ecosystem. We will also use the CairoMakie package for plotting and DataFramesMeta for data manipulation.

              \n\n
              using EpiAware
              \n\n\n
              using Turing
              \n\n\n
              using OrdinaryDiffEq, SciMLSensitivity #ODE solvers and adjoint methods
              \n\n\n
              using Distributions, Statistics, LogExpFunctions #Statistics and special func packages
              \n\n\n
              using CSV, DataFramesMeta #Data wrangling
              \n\n\n
              using CairoMakie, PairPlots
              \n\n\n
              using ReverseDiff #Automatic differentiation backend
              \n\n\n
              begin #Date utility and set Random seed\n    using Dates\n    using Random\n    Random.seed!(1234)\nend
              \n
              TaskLocalRNG()
              \n\n\n

              Single population SIR model

               As mentioned in Chatzilena et al., disease spread is frequently modelled in terms of ODE-based models. The study population is divided into compartments, each representing a specific stage of the epidemic status; in this case, susceptible, infected, and recovered individuals.

              $$\\begin{aligned}\n{dS \\over dt} &= - \\beta \\frac{I(t)}{N} S(t) \\\\\n{dI \\over dt} &= \\beta \\frac{I(t)}{N} S(t) - \\gamma I(t) \\\\\n{dR \\over dt} &= \\gamma I(t). \\\\\n\\end{aligned}$$

              where S(t) represents the number of susceptible, I(t) the number of infected and R(t) the number of recovered individuals at time t. The total population size is denoted by N (with N = S(t) + I(t) + R(t)), β denotes the transmission rate and γ denotes the recovery rate.
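               As a point of reference (an added note, not part of the original notebook text): for this SIR model the ratio of the transmission and recovery rates gives the basic reproduction number,

               $$R_0 = {\\beta \\over \\gamma},$$

               which is why the inference model defined below returns R0 = β / γ as a generated quantity.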

              \n\n\n

              We can interface to the SciML ecosystem by writing a function with the signature:

              (du, u, p, t) -> nothing

               Where du is the derivative of the state written in place, u is the current state, p holds the parameters and t is time.

              We do this for the SIR model described above in a function called sir!:

              \n\n
              function sir!(du, u, p, t)\n    S, I, R = u\n    β, γ = p\n    du[1] = -β * I * S\n    du[2] = β * I * S - γ * I\n    du[3] = γ * I\n\n    return nothing\nend
              \n
              sir! (generic function with 1 method)
              \n\n\n

               We combine the vector field function sir! with an initial condition u0 and the integration period tspan to make an ODEProblem. We do not define the parameters; these will be defined within the inference approach.

              \n\n
              sir_prob = ODEProblem(\n    sir!,\n    N .* [0.99, 0.01, 0.0],\n    (0.0, (Date(1978, 2, 4) - Date(1978, 1, 22)).value + 1)\n)
              \n
              ODEProblem with uType Vector{Float64} and tType Float64. In-place: true\ntimespan: (0.0, 14.0)\nu0: 3-element Vector{Float64}:\n 755.37\n   7.63\n   0.0
              \n\n\n

               Note that this is analogous to the EpiProblem approach we expose from EpiAware, as used in the Mishra et al. replication. The difference is that here we are going to use ODE solvers from the SciML ecosystem to generate the dynamics of the underlying infections. In the linked example, we instead use latent process generation exposed by EpiAware as the generative process for the underlying dynamics.

              \n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/#Data-for-inference","page":"Statistical inference for ODE-based infectious disease models","title":"Data for inference","text":"","category":"section"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"
              \n

               There was a brief, but intense, outbreak of influenza within the (semi-)closed community of a boarding school reported to the British Medical Journal in 1978. The outbreak lasted from 22nd January to 4th February, and it is reported that one infected child started the epidemic, which then spread rapidly. Of the 763 children at the boarding school, 512 became ill.

               We downloaded the data for this outbreak using the R package outbreaks, which is maintained as part of the R Epidemics Consortium (RECON).

              \n\n
              data = \"https://raw.githubusercontent.com/CDCgov/Rt-without-renewal/refs/heads/main/EpiAware/docs/src/showcase/replications/chatzilena-2019/influenza_england_1978_school.csv2\" |>\n       url -> CSV.read(download(url), DataFrame) |>\n              df -> @transform(df,\n    :ts=(:date .- minimum(:date)) .|> d -> d.value + 1.0,)
              \n
               Column1  date        in_bed  convalescent  ts
               1        1978-01-22  3       0             1.0
               2        1978-01-23  8       0             2.0
               3        1978-01-24  26      0             3.0
               4        1978-01-25  76      0             4.0
               5        1978-01-26  225     9             5.0
               6        1978-01-27  298     17            6.0
               7        1978-01-28  258     105           7.0
               8        1978-01-29  233     162           8.0
               9        1978-01-30  189     176           9.0
               10       1978-01-31  128     166           10.0
               11       1978-02-01  68      150           11.0
               12       1978-02-02  29      85            12.0
               13       1978-02-03  14      47            13.0
               14       1978-02-04  4       20            14.0
              \n\n
              N = 763;
              \n\n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/#Inference-for-the-deterministic-SIR-model","page":"Statistical inference for ODE-based infectious disease models","title":"Inference for the deterministic SIR model","text":"","category":"section"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"
              \n

              The boarding school data gives the number of children \"in bed\" and \"convalescent\" on each of 14 days from 22nd Jan to 4th Feb 1978. We follow Chatzilena et al and treat the number \"in bed\" as a proxy for the number of children in the infectious (I) compartment in the ODE model.

              The full observation model is:

               $$\\begin{aligned}\nY_t &\\sim \\text{Poisson}(\\lambda_t)\\\\\n\\lambda_t &= I(t)\\\\\n\\beta &\\sim \\text{LogNormal}(\\text{logmean}=0,\\text{logstd}=1) \\\\\n\\gamma & \\sim \\text{Gamma}(\\text{shape} = 0.004, \\text{scale} = 500)\\\\\nS(0) /N &\\sim \\text{Beta}(0.5, 0.5).\n\\end{aligned}$$

               NB: Chatzilena et al. give \(\lambda_t = \int_0^t \left(\beta \frac{I(s)}{N} S(s) - \gamma I(s)\right) ds = I(t) - I(0).\) However, this doesn't match their underlying Stan code.
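               To spell out the reasoning (an added clarification, not in the original notebook): integrating the infected-compartment equation of the SIR system above from 0 to t gives

               $$\\int_0^t \\left(\\beta \\frac{I(s)}{N} S(s) - \\gamma I(s)\\right) ds = \\int_0^t {dI \\over ds} \\, ds = I(t) - I(0),$$

               so the expression quoted from the paper equals \(I(t) - I(0)\) rather than the \(I(t)\) used in the observation model above.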

              \n\n\n

              From EpiAware, we have the PoissonError struct which defines the probabilistic structure of this observation error model.

              \n\n
              obs = PoissonError()
              \n
              PoissonError()
              \n\n\n

               Now we can write the probabilistic model using the Turing PPL. Note that instead of using \(I(t)\) directly we apply the softplus transform to \(I(t)\), implemented by LogExpFunctions.log1pexp. The reason is that the solver can return small negative numbers; the softplus transform smoothly maintains positivity whilst being very close to \(I(t)\) when \(I(t) > 2\).
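               As a quick illustration of this behaviour (an added snippet, not part of the original notebook), you can check the transform at the REPL; the values in the comments are rounded:

               using LogExpFunctions
               log1pexp(-0.01)  # a slightly negative solver output is mapped to a small positive value (≈ 0.69)
               log1pexp(5.0)    # ≈ 5.007, already close to the identity
               log1pexp(100.0)  # ≈ 100.0, effectively the identity for large expected counts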

              \n\n
              @model function deterministic_ode_mdl(y_t, ts, obs, prob, N;\n        solver = AutoTsit5(Rosenbrock23())\n)\n    ##Priors##\n    β ~ LogNormal(0.0, 1.0)\n    γ ~ Gamma(0.004, 1 / 0.002)\n    S₀ ~ Beta(0.5, 0.5)\n\n    ##remake ODE model##\n    _prob = remake(prob;\n        u0 = [S₀, 1 - S₀, 0.0],\n        p = [β, γ]\n    )\n\n    ##Solve remade ODE model##\n\n    sol = solve(_prob, solver;\n        saveat = ts,\n        verbose = false)\n\n    ##log-like accumulation using obs##\n    λt = log1pexp.(N * sol[2, :]) # #expected It\n    @submodel generated_y_t = generate_observations(obs, y_t, λt)\n\n    ##Generated quantities##\n    return (; sol, generated_y_t, R0 = β / γ)\nend
              \n
              deterministic_ode_mdl (generic function with 2 methods)
              \n\n\n

              We instantiate the model in two ways:

               1. deterministic_mdl: This conditions the generative model on the observed data. We can sample from this model to find the posterior distribution of the parameters.

              2. deterministic_uncond_mdl: This doesn't condition on the data. This is useful for prior and posterior predictive modelling.

               Here we construct the Turing model directly; in the Mishra et al replication we use the EpiProblem functionality to build a Turing model under the hood. Because in this note we are using a mix of functionality from SciML and EpiAware, we construct the model we sample from directly.

              \n\n
              deterministic_mdl = deterministic_ode_mdl(data.in_bed, data.ts, obs, sir_prob, N);
              \n\n\n
              deterministic_uncond_mdl = deterministic_ode_mdl(\n    fill(missing, length(data.in_bed)), data.ts, obs, sir_prob, N);
              \n\n\n\n

              We add a useful plotting utility.

              \n\n
              function plot_predYt(data, gens; title::String, ylabel::String)\n    fig = Figure()\n    ga = fig[1, 1:2] = GridLayout()\n\n    ax = Axis(ga[1, 1];\n        title = title,\n        xticks = (data.ts[1:3:end], data.date[1:3:end] .|> string),\n        ylabel = ylabel\n    )\n    pred_Yt = mapreduce(hcat, gens) do gen\n        gen.generated_y_t\n    end |> X -> mapreduce(vcat, eachrow(X)) do row\n        quantile(row, [0.5, 0.025, 0.975, 0.1, 0.9, 0.25, 0.75])'\n    end\n\n    lines!(ax, data.ts, pred_Yt[:, 1]; linewidth = 3, color = :green, label = \"Median\")\n    band!(\n        ax, data.ts, pred_Yt[:, 2], pred_Yt[:, 3], color = (:green, 0.2), label = \"95% CI\")\n    band!(\n        ax, data.ts, pred_Yt[:, 4], pred_Yt[:, 5], color = (:green, 0.4), label = \"80% CI\")\n    band!(\n        ax, data.ts, pred_Yt[:, 6], pred_Yt[:, 7], color = (:green, 0.6), label = \"50% CI\")\n    scatter!(ax, data.in_bed, label = \"data\")\n    leg = Legend(ga[1, 2], ax; framevisible = false)\n    hidespines!(ax)\n\n    fig\nend
              \n
              plot_predYt (generic function with 1 method)
              \n\n\n

              Prior predictive sampling

              \n\n
              let\n    prior_chn = sample(deterministic_uncond_mdl, Prior(), 2000)\n    gens = generated_quantities(deterministic_uncond_mdl, prior_chn)\n    plot_predYt(data, gens;\n        title = \"Prior predictive: deterministic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n\n

              The prior predictive checking suggests that a priori our parameter beliefs are very far from the data. Approaching the inference naively can lead to poor fits.

               We do the following to mitigate this:

              1. We choose a switching ODE solver which switches between explicit (Tsit5) and implicit (Rosenbrock23) solvers. This helps avoid the ODE solver failing when the sampler tries extreme parameter values. This is the default solver = AutoTsit5(Rosenbrock23()) above.

               2. We locate the maximum likelihood point (that is, we ignore the influence of the priors) and use it as a starting point for NUTS.

              \n\n
              nmle_tries = 100
              \n
              100
              \n\n
              mle_fit = map(1:nmle_tries) do _\n    fit = try\n        maximum_likelihood(deterministic_mdl)\n    catch\n        (lp = -Inf,)\n    end\nend |>\n          fits -> (findmax(fit -> fit.lp, fits)[2], fits) |>\n                  max_and_fits -> max_and_fits[2][max_and_fits[1]]
              \n
              ModeResult with maximized lp of -67.36\n[1.8991528341217605, 0.4808836287362608, 0.9995360155493858]
              \n\n
              mle_fit.optim_result.retcode
              \n
              ReturnCode.Success = 1
              \n\n\n

               Note that we choose the best out of 100 tries for the MLE estimate, since individual optimisation attempts can fail.

               Now we sample, aiming for 1000 samples from each of 4 chains.

              \n\n
              chn = sample(\n    deterministic_mdl, NUTS(), MCMCThreads(), 1000, 4;\n    initial_params = fill(mle_fit.values.array, 4)\n)
              \n
               (Preview of the sampled chain: columns are iteration, chain, β, γ, S₀, lp, n_steps, is_accept, …. The rendered notebook shows the first 10 of the 4 × 1000 post-warmup draws, with β around 1.8 to 2.0, γ around 0.45 to 0.51, and S₀ ≈ 0.999.)
              \n\n
              describe(chn)
              \n
              2-element Vector{ChainDataFrame}:\n Summary Statistics (3 x 8)\n Quantiles (3 x 6)
              \n\n
              pairplot(chn)
              \n\n\n\n

              Posterior predictive plotting

              \n\n
              let\n    gens = generated_quantities(deterministic_uncond_mdl, chn)\n    plot_predYt(data, gens;\n        title = \"Fitted deterministic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/#Inference-for-the-Stochastic-SIR-model","page":"Statistical inference for ODE-based infectious disease models","title":"Inference for the Stochastic SIR model","text":"","category":"section"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"
              \n

               Chatzilena et al present an auto-regressive model for connecting the output of the ODE model to the illness observations. The argument is that the stochastic component of the model can absorb the noise generated by a possible mis-specification of the model.

              In their approach they consider \\(\\kappa_t = \\log \\lambda_t\\) where \\(\\kappa_t\\) evolves according to an Ornstein-Uhlenbeck process:

              $$d\\kappa_t = \\phi(\\mu_t - \\kappa_t) dt + \\sigma dB_t.$$

              Which has transition density:

              $$\\kappa_{t+1} | \\kappa_t \\sim N\\Big(\\mu_t + \\left(\\kappa_t - \\mu_t\\right)e^{-\\phi}, {\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\Big).$$

              Where \\(\\mu_t = \\log(I(t))\\).

               We modify this approach since it implies that \(\mu_t\) is treated as constant between observation times.

              Instead we redefine \\(\\kappa_t\\) as the log-residual:

              $$\\kappa_t = \\log(\\lambda_t / I(t)).$$

              With the transition density:

              $$\\kappa_{t+1} | \\kappa_t \\sim N\\Big(\\kappa_te^{-\\phi}, {\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\Big).$$

              This is an AR(1) process.
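
               Equivalently (this just restates the transition density above in recursive form), the log-residuals satisfy

               $$\\kappa_{t+1} = e^{-\\phi}\\kappa_t + \\epsilon_t, \\qquad \\epsilon_t \\sim N\\Big(0, {\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\Big),$$

               so the AR(1) damping parameter is \(e^{-\phi}\) and the innovation variance is \({\sigma^2 \over 2 \phi} \left(1 - e^{-2\phi} \right)\); these are the quantities whose prior means we match below.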

               The stochastic model is completed as follows:

               $$\\begin{aligned}\nY_t &\\sim \\text{Poisson}(\\lambda_t)\\\\\n\\lambda_t &= I(t)\\exp(\\kappa_t)\\\\\n\\beta &\\sim \\text{LogNormal}(\\text{logmean}=0,\\text{logstd}=1) \\\\\n\\gamma & \\sim \\text{Gamma}(\\text{shape} = 0.004, \\text{scale} = 500)\\\\\nS(0)/N &\\sim \\text{Beta}(0.5, 0.5)\\\\\n\\phi & \\sim \\text{HalfNormal}(0, 100) \\\\\n1 / \\sigma^2 & \\sim \\text{InvGamma}(0.1,0.1).\n\\end{aligned}$$

              \n\n\n

               We will use the AR struct from EpiAware to define the auto-regressive process in this model; it has a direct parameterisation of the AR model.

              To convert from the formulation above we sample from the priors, and define HalfNormal priors based on the sampled prior means of \\(e^{-\\phi}\\) and \\({\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\). We also add a strong prior that \\(\\kappa_1 \\approx 0\\).

              \n\n
              ϕs = rand(truncated(Normal(0, 100), lower = 0.0), 1000)
              \n
              1000-element Vector{Float64}:\n  84.27394515942191\n  13.516491690956862\n  51.07348186961277\n  37.941468070981934\n 128.41727813505105\n  43.06012859066134\n  62.31804897315879\n   ⋮\n  56.57116875489856\n 158.33706887743045\n  42.72304061442974\n   7.423694327684998\n 155.60429115685992\n  22.802727733585563
              \n\n
              σ²s = rand(InverseGamma(0.1, 0.1), 1000) .|> x -> 1 / x
              \n
              1000-element Vector{Float64}:\n 0.0016224742151858818\n 6.79221353591839e-9\n 6.207746413070522e-7\n 0.18882277475797452\n 0.0001662633660039789\n 0.1923483831345634\n 0.14764829136880042\n ⋮\n 0.06624877782984823\n 0.14836794638364514\n 0.00021895942825830565\n 2.209773387224151\n 0.06613574232694587\n 0.0026714312973339926
              \n\n
              sampled_AR_damps = ϕs .|> ϕ -> exp(-ϕ)
              \n
              1000-element Vector{Float64}:\n 2.5135680594819346e-37\n 1.3485350660539842e-6\n 6.592781044298219e-23\n 3.3283560716429985e-17\n 1.6946683748176592e-56\n 1.991699264693254e-19\n 8.622142732783223e-28\n ⋮\n 2.7005584094809084e-25\n 1.7182434846473966e-69\n 2.7900964146464195e-19\n 0.0005969397758191972\n 2.641891576222659e-68\n 1.249974556559806e-10
              \n\n
              sampled_AR_stds = map(ϕs, σ²s) do ϕ, σ²\n    (1 - exp(-2 * ϕ)) * σ² / (2 * ϕ)\nend
              \n
              1000-element Vector{Float64}:\n 9.626191179946722e-6\n 2.5125652762581625e-10\n 6.0772696376159436e-9\n 0.00248834302358464\n 6.473559026423481e-7\n 0.002233485935017376\n 0.001184635059999989\n ⋮\n 0.0005855348164793897\n 0.00046851930326718863\n 2.562545000417783e-6\n 0.14883240757631075\n 0.00021251259150776393\n 5.857701167477672e-5
              \n\n\n

               We define the AR(1) process by matching the means of HalfNormal prior distributions for the damping and standard deviation parameters to the prior means calculated from the Chatzilena et al definition.

              \n\n
              ar = AR(\n    damp_priors = [HalfNormal(mean(sampled_AR_damps))],\n    std_prior = HalfNormal(mean(sampled_AR_stds)),\n    init_priors = [Normal(0, 0.001)]\n)
              \n
              AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1)
              \n\n\n

              We can sample directly from the behaviour specified by the ar struct to do prior predictive checking on the AR(1) process.

              \n\n
              let\n    nobs = size(data, 1)\n    ar_mdl = generate_latent(ar, nobs)\n    fig = Figure()\n    ax = Axis(fig[1, 1],\n        xticks = (data.ts[1:3:end], data.date[1:3:end] .|> string),\n        ylabel = \"exp(kt)\",\n        title = \"Prior predictive sampling for relative residual in mean pred.\"\n    )\n    for i in 1:500\n        lines!(ax, ar_mdl() .|> exp, color = (:grey, 0.15))\n    end\n    fig\nend
              \n\n\n\n

              We see that the choice of priors implies an a priori belief that the extra observation noise on the mean prediction of the ODE model is fairly small, approximately 10% relative to the mean prediction.

              \n\n\n

               We can now define the probabilistic model. The stochastic model assumes a (random) time-varying ascertainment, which we implement using the Ascertainment struct from EpiAware. Note that instead of implementing an ascertainment factor exp.(κₜ) directly, which can be unstable for large values, by default Ascertainment uses the LogExpFunctions.xexpy function, which implements \(x\exp(y)\) stably for a wide range of values.
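
               For intuition (a small illustrative check with hypothetical values, not part of the original analysis), xexpy(x, y) evaluates \(x\exp(y)\):

               using LogExpFunctions

               I_t = 250.0    # a hypothetical expected count from the ODE solution
               κ_t = 0.05     # a hypothetical log-residual

               xexpy(I_t, κ_t)   # ≈ 262.8, i.e. I_t * exp(κ_t), evaluated in a numerically stable way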

              \n\n\n

               To distinguish random variables sampled by the various sub-processes, EpiAware process types create name prefixes. The default for Ascertainment is just the string \"Ascertainment\", but in this case we use the less verbose \"va\" for \"varying ascertainment\".

              \n\n
              mdl_prefix = \"va\"
              \n
              \"va\"
              \n\n\n

               Now we can construct our time-varying ascertainment model. The main keyword arguments here are model and latent_model. model sets the connection between the expected observation and the actual observation; in this case, we reuse our PoissonError model from above. latent_model sets the modification model on the expected values; in this case, we use the AR process we defined above.

              \n\n
              varying_ascertainment = Ascertainment(\n    model = obs,\n    latent_model = ar,\n    latent_prefix = mdl_prefix\n)
              \n
              Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}(PoissonError(), PrefixLatentModel{AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}, String}(AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1), \"va\"), EpiAware.EpiObsModels.var\"#10#16\"(), \"va\")
              \n\n\n

              Now we can declare the full model in the Turing PPL.

              \n\n
              @model function stochastic_ode_mdl(y_t, ts, obs, prob, N;\n        solver = AutoTsit5(Rosenbrock23())\n)\n\n    ##Priors##\n    β ~ LogNormal(0.0, 1.0)\n    γ ~ Gamma(0.004, 1 / 0.002)\n    S₀ ~ Beta(0.5, 0.5)\n\n    ##Remake ODE model##\n    _prob = remake(prob;\n        u0 = [S₀, 1 - S₀, 0.0],\n        p = [β, γ]\n    )\n\n    ##Solve ODE model##\n    sol = solve(_prob, solver;\n        saveat = ts,\n        verbose = false\n    )\n    λt = log1pexp.(N * sol[2, :])\n\n    ##Observation##\n    @submodel generated_y_t = generate_observations(obs, y_t, λt)\n\n    ##Generated quantities##\n    return (; sol, generated_y_t, R0 = β / γ)\nend
              \n
              stochastic_ode_mdl (generic function with 2 methods)
              \n\n
              stochastic_mdl = stochastic_ode_mdl(\n    data.in_bed,\n    data.ts,\n    varying_ascertainment,\n    sir_prob,\n    N\n)
              \n
              DynamicPPL.Model{typeof(stochastic_ode_mdl), (:y_t, :ts, :obs, :prob, :N), (:solver,), (), Tuple{Vector{Int64}, Vector{Float64}, Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}, ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}, Int64}, Tuple{CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}}, DynamicPPL.DefaultContext}(Main.var\"workspace#17\".stochastic_ode_mdl, (y_t = [3, 8, 26, 76, 225, 298, 258, 233, 189, 128, 68, 29, 14, 4], ts = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0], obs = Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}(PoissonError(), PrefixLatentModel{AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}, String}(AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1), \"va\"), EpiAware.EpiObsModels.var\"#10#16\"(), \"va\"), prob = ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}(ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}(Main.var\"workspace#17\".sir!, LinearAlgebra.UniformScaling{Bool}(true), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED, nothing, nothing, nothing, nothing, nothing, nothing), [755.37, 7.63, 
0.0], (0.0, 14.0), SciMLBase.NullParameters(), Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}(), SciMLBase.StandardODEProblem()), N = 763), (solver = CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}((Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!)), AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}(Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!), 10, 3, 9//10, 9//10, 2, false, 5)),), DynamicPPL.DefaultContext())
              \n\n
              stochastic_uncond_mdl = stochastic_ode_mdl(\n    fill(missing, length(data.in_bed)),\n    data.ts,\n    varying_ascertainment,\n    sir_prob,\n    N\n)
              \n
              DynamicPPL.Model{typeof(stochastic_ode_mdl), (:y_t, :ts, :obs, :prob, :N), (:solver,), (), Tuple{Vector{Missing}, Vector{Float64}, Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}, ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}, Int64}, Tuple{CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}}, DynamicPPL.DefaultContext}(Main.var\"workspace#17\".stochastic_ode_mdl, (y_t = [missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing], ts = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0], obs = Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}(PoissonError(), PrefixLatentModel{AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}, String}(AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1), \"va\"), EpiAware.EpiObsModels.var\"#10#16\"(), \"va\"), prob = ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}(ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}(Main.var\"workspace#17\".sir!, LinearAlgebra.UniformScaling{Bool}(true), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED, 
nothing, nothing, nothing, nothing, nothing, nothing), [755.37, 7.63, 0.0], (0.0, 14.0), SciMLBase.NullParameters(), Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}(), SciMLBase.StandardODEProblem()), N = 763), (solver = CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}((Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!)), AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}(Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!), 10, 3, 9//10, 9//10, 2, false, 5)),), DynamicPPL.DefaultContext())
              \n\n\n

              Prior predictive checking

              \n\n
              let\n    prior_chn = sample(stochastic_uncond_mdl, Prior(), 2000)\n    gens = generated_quantities(stochastic_uncond_mdl, prior_chn)\n    plot_predYt(data, gens;\n        title = \"Prior predictive: stochastic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n\n

               The prior predictive checking again shows misaligned prior beliefs; for example, a priori (without data) we would not expect the median prediction for the number of ill children to be about 600 out of 763 after 1 day.

               The latent process for the log-residuals \(\kappa_t\) is only weakly identified without its priors, so maximum likelihood is not useful here; instead we look for a reasonable MAP point to start NUTS from. We do this by first making an initial guess which is a mixture of:

              1. The posterior averages from the deterministic model.

              2. The prior averages of the structure parameters of the AR(1) process.

              3. Zero for the time-varying noise underlying the AR(1) process.

              \n\n
              rand(stochastic_mdl)
              \n
              (β = 1.4733099145592605, γ = 2.903750758256854e-123, S₀ = 0.29861836011258897, var\"va.σ_AR\" = 0.04830504386163741, var\"va.ar_init\" = [-0.00024863122657975786], var\"va.damp_AR\" = [0.0032571734979405884], var\"va.ϵ_t\" = [0.3072398792156006, -1.183649965567883, 2.771050948892893, -0.6366422192999562, 1.6191332959597484, 0.24589190588482895, 1.4615005554123257, 0.025353011915720307, 0.16407599045634794, 0.2628599221133207, -1.0048884450877293, 1.96700665270484, -0.7501415436101209])
              \n\n
              initial_guess = [[mean(chn[:β]),\n                     mean(chn[:γ]),\n                     mean(chn[:S₀]),\n                     mean(ar.std_prior),\n                     mean(ar.init_prior)[1],\n                     mean(ar.damp_prior)[1]]\n                 zeros(13)]
              \n
              19-element Vector{Float64}:\n 1.8942148283773665\n 0.48062141906187955\n 0.9995061985155343\n 0.0184303247003225\n 0.0\n 0.004725237126863895\n 0.0\n ⋮\n 0.0\n 0.0\n 0.0\n 0.0\n 0.0\n 0.0
              \n\n\n

              Starting from the initial guess, the MAP point is calculated rapidly in one pass.

              \n\n
              map_fit_stoch_mdl = maximum_a_posteriori(stochastic_mdl;\n    adtype = AutoReverseDiff(),\n    initial_params = initial_guess\n)
              \n
              ModeResult with maximized lp of -69.56\n[1.9168299382321734, 0.4897041462336449, 0.9995563465712941, 0.06675569386075603, 1.3740571689410578e-6, 0.0001575538604931212, 0.14269439047176274, 0.17055298256610424, -0.29859817140192074, 0.6377161540197321, -0.00838185466017144, -0.5911576835821275, 0.7987402297108667, 1.7391572409676643, 1.4382700211216297, 0.24515802269495504, -0.6799723098817362, -0.7437116100116361, -0.8064297391295364]
              \n\n\n

              Now we can run NUTS, sampling 1000 posterior draws per chain for 4 chains.

              \n\n
              chn2 = sample(\n    stochastic_mdl,\n    NUTS(; adtype = AutoReverseDiff(true)),\n    MCMCThreads(), 1000, 4;\n    initial_params = fill(map_fit_stoch_mdl.values.array, 4)\n)
              \n
               (Preview of the sampled chain for the stochastic model: columns are iteration, chain, β, γ, S₀, va.σ_AR, va.ar_init[1], va.damp_AR[1], …. The rendered notebook shows the first 10 of the 4 × 1000 post-warmup draws.)
              \n\n
              describe(chn2)
              \n
              2-element Vector{ChainDataFrame}:\n Summary Statistics (19 x 8)\n Quantiles (19 x 6)
              \n\n
              pairplot(chn2[[:β, :γ, :S₀, Symbol(mdl_prefix * \".σ_AR\"),\n    Symbol(mdl_prefix * \".ar_init[1]\"), Symbol(mdl_prefix * \".damp_AR[1]\")]])
              \n\n\n
              let\n    vars = mapreduce(vcat, 1:13) do i\n        Symbol(mdl_prefix * \".ϵ_t[$i]\")\n    end\n    pairplot(chn2[vars])\nend
              \n\n\n
              let\n    gens = generated_quantities(stochastic_uncond_mdl, chn2)\n    plot_predYt(data, gens;\n        title = \"Fitted stochastic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"EditURL = \"https://github.com/CDCgov/Rt-without-renewal/blob/main/docs/src/showcase/replications/chatzilena-2019/index.jl\"","category":"page"},{"location":"getting-started/explainers/latent-models/#Latent-models","page":"Latent models","title":"Latent models","text":"","category":"section"},{"location":"lib/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiAware.jl's internal interface.","category":"page"},{"location":"lib/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware]\nPublic = false","category":"page"},{"location":"getting-started/explainers/intro/#Introduction","page":"Introduction to EpiAware","title":"Introduction","text":"","category":"section"},{"location":"getting-started/explainers/intro/","page":"Introduction to EpiAware","title":"Introduction to EpiAware","text":"The diagram below shows the relationship between the modules in the package for a typical workflow.","category":"page"},{"location":"getting-started/explainers/intro/","page":"Introduction to EpiAware","title":"Introduction to EpiAware","text":"flowchart LR\n\nA[\"Underlying GI\nBijector\"]\n\nEpiModel[\"AbstractTuringEpiModel\n----------------------\nChoice of target\nfor latent process:\n\nDirectInfections\n ExpGrowthRate\n Renewal\"]\n\nInitModel[\"Priors for\ninitial scale of incidence\"]\n\nDataW[Data wrangling and QC]\n\n\nObsData[\"Observational Data\n---------------------\nObs. cases y_t\"]\n\nLatentProcPriors[\"Latent process priors\"]\n\nLatentProc[\"AbstractTuringLatentModel\n---------------------\nRandomWalk\"]\n\nObsModelPriors[\"Observation model priors\nchoice of delayed obs. 
model\"]\n\nObsModel[\"AbstractObservationModel\n---------------------\nDelayObservations\"]\n\nE[\"Turing model constructor\n---------------------\ngenerate_epiaware\"]\n\nG[Posterior draws]\nH[Posterior checking]\nI[Post-processing]\n\n\n\nA --> EpiData\nEpiData --> EpiModel\nInitModel --> EpiModel\nEpiModel -->E\nObsData-->E\nDataW-.->ObsData\nLatentProcPriors-->LatentProc\nLatentProc-->E\nObsModelPriors-->ObsModel\nObsModel-->E\n\n\nE-->|sample...NUTS...| G\nG-->H\nH-->I","category":"page"},{"location":"lib/EpiAwareBase/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiAwareBase.jl's internal interface.","category":"page"},{"location":"lib/EpiAwareBase/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/#Contents-2","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareBase/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiAwareBase]\nPublic = false","category":"page"},{"location":"lib/EpiInfModels/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiInfModels.jl's internal interface.","category":"page"},{"location":"lib/EpiInfModels/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInfModels/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiInfModels/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiInfModels]\nPublic = false","category":"page"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.AbstractConstantRenewalStep","page":"Internal API","title":"EpiAware.EpiInfModels.AbstractConstantRenewalStep","text":"abstract type AbstractConstantRenewalStep <: AbstractAccumulationStep\n\nAbstract type representing an accumulation iteration/step for a Renewal model with a constant generation interval.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalStep","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalStep","text":"struct ConstantRenewalStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep\n\nThe renewal process iteration/step function struct with constant 
generation interval.\n\nNote that the generation interval is stored in reverse order.\n\n\n\nFields\n\nrev_gen_int::Vector\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalStep","text":"function (recurrent_step::ConstantRenewalStep)(recent_incidence, Rt)\n\nImplement the Renewal model iteration/step function, with constant generation interval.\n\nMathematical specification\n\nThe new incidence is given by\n\nI_t = R_t sum_i=1^n-1 I_t-i g_i\n\nwhere I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.\n\nArguments\n\nrecent_incidence: Array of recent incidence values order least recent to most recent.\nRt: Reproduction number.\n\nReturns\n\nUpdated incidence array.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep","text":"struct ConstantRenewalWithPopulationStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep\n\nThe renewal process iteration/step function struct with constant generation interval and a fixed population size.\n\nNote that the generation interval is stored in reverse order.\n\n\n\nFields\n\nrev_gen_int::Vector\npop_size::Any\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep","text":"function (recurrent_step::ConstantRenewalWithPopulationStep)(recent_incidence_and_available_sus, Rt)\n\nCallable on a RenewalWithPopulation struct for compute new incidence based on recent incidence, Rt and depletion of susceptibles.\n\nMathematical specification\n\nThe new incidence is given by\n\nI_t = S_t-1 N R_t sum_i=1^n-1 I_t-i g_i\n\nwhere I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.\n\nArguments\n\nrecent_incidence_and_available_sus: A tuple with an array of recent incidence\n\nvalues and the remaining susceptible/available individuals.\n\nRt: Reproduction number.\n\nReturns\n\nVector containing the updated incidence array and the new recent_incidence_and_available_sus\n\nvalue.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{AbstractTuringRenewal, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::AbstractTuringRenewal,\n _Rt\n) -> Any\n\n\nImplement the generate_latent_infs function for the Renewal model.\n\nExample usage with Renewal type of model for unobserved infection process\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an Renewal struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an Renewal model\nrenewal_model = Renewal(data; initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of renewal_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(renewal_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{DirectInfections, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::DirectInfections,\n Z_t\n) -> Any\n\n\nImplement the generate_latent_infs function for the DirectInfections model.\n\nExample usage with DirectInfections type of model for unobserved infection process\n\nFirst, we construct a DirectInfections struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create a DirectInfections model\ndirect_inf_model = DirectInfections(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100)\nlatent_inf = generate_latent_infs(direct_inf_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{ExpGrowthRate, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(epi_model::ExpGrowthRate, rt) -> Any\n\n\nImplement the generate_latent_infs function for the ExpGrowthRate model.\n\nExample usage with ExpGrowthRate type of model for unobserved infection process\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an ExpGrowthRate model\nexp_growth_model = ExpGrowthRate(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(exp_growth_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{ODEProcess, ODEParams}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::ODEProcess,\n params::ODEParams\n) -> Any\n\n\nImplement the generate_latent_infs function for the ODEProcess model.\n\nConstructs a Turing model to generate latent infections using the specified epidemiological model and parameters.\n\nArguments\n\nepi_model::ODEProcess: The ODEProcess model containing the problem definition, time steps, solver, and solution-to-infections transformation function.\nparams::ODEParams: The initial conditions (u0) and parameters (p) for the ODE problem.\n\nGenerated quantities\n\nI_t: The latent infections generated by solving the ODE problem with the specified parameters.\n\nDetails\n\nThis function remakes the ODE problem with the provided initial conditions and parameters, solves it using the specified solver, and then transforms the solution into latent infections using the sol2infs function.\n\nExample usage\n\nusing EpiAware, OrdinaryDiffEq\nr = log(2) / 7 # Growth rate corresponding to 7 day doubling time\nu0 = [1.0]\np = [r]\nparams = ODEParams(u0 = u0, p = p)\n\n# Define the ODE problem using SciML\n# We use a simple exponential growth model\n\nfunction expgrowth(du, u, p, t)\n du[1] = p[1] * u[1]\nend\nprob = ODEProblem(expgrowth, u0, (0.0, 10.0), p)\n\n# Define the ODEProcess\n\nexpgrowth_model = ODEProcess(prob::ODEProblem; ts = 0:1:10,\n solver = Tsit5(),\n sol2infs = sol -> sol[1, :])\n\n# Generate the latent infections\nI_t = generate_latent_infs(expgrowth_model, params)()\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareUtils.get_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalStep, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::EpiAware.EpiInfModels.ConstantRenewalStep,\n initial_state,\n state\n) -> Any\n\n\nMethod to get the state of the accumulation for a ConstantRenewalStep object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareUtils.get_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep,\n initial_state,\n state\n) -> 
Any\n\n\nMethod to get the state of the accumulation for a ConstantRenewalWithPopulationStep object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.make_renewal_init-Tuple{Renewal, Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.make_renewal_init","text":"make_renewal_init(epi_model::Renewal, I₀, Rt₀) -> Any\n\n\nCreate the initial state of the Renewal model.\n\nArguments\n\nepi_model::Renewal: The Renewal model.\nI₀: The initial number of infected individuals.\nRt₀: The initial time-varying reproduction number.\n\nReturns\n\nThe initial vector of infected individuals.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.neg_MGF-Tuple{Any, AbstractVector}","page":"Internal API","title":"EpiAware.EpiInfModels.neg_MGF","text":"neg_MGF(r, w::AbstractVector) -> Any\n\n\nCompute the negative moment generating function (MGF) for a given rate r and weights w.\n\nArguments\n\nr: The rate parameter.\nw: An abstract vector of weights.\n\nReturns\n\nThe value of the negative MGF.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.oneexpy-Tuple{T} where T","page":"Internal API","title":"EpiAware.EpiInfModels.oneexpy","text":"oneexpy(y) -> Any\n\n\nVersion of LogExpFunctions.xexpy that takes a single argument y and returns exp(y).\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.renewal_init_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalStep, Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.renewal_init_state","text":"renewal_init_state(\n recurrent_step::EpiAware.EpiInfModels.ConstantRenewalStep,\n I₀,\n r_approx,\n len_gen_int\n) -> Any\n\n\nConstructs the initial conditions for a renewal model with ConstantRenewalStep type of step function.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.renewal_init_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep, Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.renewal_init_state","text":"renewal_init_state(\n recurrent_step::EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep,\n I₀,\n r_approx,\n len_gen_int\n) -> Any\n\n\nConstructs the initial conditions for a renewal model with ConstantRenewalWithPopulationStep type of step function.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/interfaces/#Interfaces","page":"Interfaces","title":"Interfaces","text":"","category":"section"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"We support two primary workflows for using the package:","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.\nTuring interface: A lower-level interface for defining and fitting models to data. 
This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"See the getting started section for tutorials on each of these workflows.","category":"page"},{"location":"getting-started/explainers/interfaces/#EpiProblem","page":"Interfaces","title":"EpiProblem","text":"","category":"section"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"Each module of the overall epidemiological model we are interested in is a Turing Model in its own right. In this section, we compose the individual models into the full epidemiological model using the EpiProblem struct.","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"The constructor for an EpiProblem requires:","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"An epi_model.\nA latent_model.\nAn observation_model.\nA tspan.","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"The tspan set the range of the time index for the models.","category":"page"},{"location":"getting-started/explainers/interfaces/#Turing-interface","page":"Interfaces","title":"Turing interface","text":"","category":"section"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"The Turing interface is a lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.","category":"page"},{"location":"getting-started/tutorials/#Tutorials","page":"Overview","title":"Tutorials","text":"","category":"section"},{"location":"getting-started/tutorials/","page":"Overview","title":"Overview","text":"This section contains tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of tutorials.","category":"page"},{"location":"overview/#overview","page":"Overview","title":"Overview of the EpiAware Software Ecosystem","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiAware is not a standard toolkit for infectious disease modelling.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"It seeks to be highly modular and composable for advanced users whilst still providing opinionated workflows for those who are new to the field. 
Developed by the authors behind other widely used infectious disease modelling packages such as EpiNow2, epinowcast, and epidist, alongside experts in infectious disease modelling in Julia,EpiAware is designed to go beyond the capabilities of these packages by providing a more flexible and extensible framework for modelling and inference of infectious disease dynamics.","category":"page"},{"location":"overview/#Package-Features","page":"Overview","title":"Package Features","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"Flexible: The package is designed to be flexible and extensible, and to provide a consistent interface for fitting and simulating models.\nModular: The package is designed to be modular, with a clear separation between the model and the data.\nExtensible: The package is designed to be extensible, with a clear separation between the model and the data.\nConsistent: The package is designed to provide a consistent interface for fitting and simulating models.\nEfficient: The package is designed to be efficient, with a clear separation between the model and the data.","category":"page"},{"location":"overview/#Package-structure","page":"Overview","title":"Package structure","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiAware.jl is a wrapper around a series of submodules, each of which provides a different aspect of the package's functionality (much like the tidyverse in R). The package is designed to be modular, with a clear separation between modules and between modules and data. Currently included modules are:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiAwareBase: The core module, which provides the underlying abstract types and functions for the package.\nEpiAwareUtils: A utility module, which provides a series of utility functions for working with the package.\nEpiInference: An inference module, which provides a series of functions for fitting models to data. Builds on top of Turing.jl.\nEpiInfModels: Provides tools for composing models of the disease transmission process. Builds on top of Turing.jl, in particular the DynamicPPL.jl interface.\nEpiLatentModels: Provides tools for composing latent models such as random walks, autoregressive models, etc. Builds on top of DynamicPPL.jl. Used by all other modelling modules to define latent processes.\nEpiObsModels: Provides tools for composing observation models, such as Poisson, Binomial, etc. Builds on top of DynamicPPL.jl.","category":"page"},{"location":"overview/#Using-the-package","page":"Overview","title":"Using the package","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"We support two primary workflows for using the package:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.\nTuring interface: A lower-level interface for defining and fitting models to data. 
This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"See the getting started section for tutorials on each of these workflows.","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiAwareUtils.jl's internal interface.","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiAwareUtils]\nPublic = false","category":"page"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::DirectSample;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::DirectSample,\n prev_result;\n kwargs...\n) -> Any\n\n\nImplements direct sampling from a Turing model.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method-2","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::AbstractEpiMethod;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::AbstractEpiMethod,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply the inference/generative method method to the Model object mdl.\n\nArguments\n\nmodel::AbstractEpiModel: The model to apply the method to.\nmethod::AbstractEpiMethod: The epidemiological method to apply.\nprev_result: The previous result of the method.\nkwargs: Additional keyword arguments passed to the method.\n\nReturns\n\nnothing: If no concrete implementation is defined for the given method.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method-Tuple{DynamicPPL.Model, EpiMethod, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::EpiMethod,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply steps defined by an EpiMethod to a model object.\n\nThis function applies the steps defined by an EpiMethod object to a Model object. It iterates over the pre-sampler steps defined in the EpiMethod object and recursively applies them to the model. Finally, it applies the sampler step defined in the EpiMethod object to the model. 
The prev_result argument is used to pass the result obtained from applying the previous steps, if any.\n\nArguments\n\nmethod::EpiMethod: The EpiMethod object containing the steps to be applied.\nmodel::Model: The model object to which the steps will be applied.\nprev_result: The previous result obtained from applying the steps. Defaults to nothing.\nkwargs...: Additional keyword arguments that can be passed to the steps.\n\nReturns\n\nprev_result: The result obtained after applying the steps.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method-Tuple{DynamicPPL.Model, EpiMethod}","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::EpiMethod;\n kwargs...\n) -> Any\n\n\nApply a method to a mode without previous results\n\nArguments\n\nmodel::Model: The model to apply the method to.\nmethod::EpiMethod: The method to apply.\nkwargs...: Additional keyword arguments.\n\nReturns\n\nThe result of applying the method to the model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase.condition_model-Tuple{DynamicPPL.Model, NamedTuple, NamedTuple}","page":"Internal API","title":"EpiAware.EpiAwareBase.condition_model","text":"condition_model(\n model::DynamicPPL.Model,\n fix_parameters::NamedTuple,\n condition_parameters::NamedTuple\n) -> Any\n\n\nApply the condition to the model by fixing the specified parameters and conditioning on the others.\n\nArguments\n\nmodel::Model: The model to be conditioned.\nfix_parameters::NamedTuple: The parameters to be fixed.\ncondition_parameters::NamedTuple: The parameters to be conditioned on.\n\nReturns\n\n_model: The conditioned model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase.generate_epiaware-Tuple{Any, Any, AbstractTuringEpiModel}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_epiaware","text":"generate_epiaware(\n y_t,\n time_steps,\n epi_model::AbstractTuringEpiModel;\n latent_model,\n observation_model\n)\n\n\nGenerate an epi-aware model given the observed data and model specifications.\n\nArguments\n\ny_t: Observed data.\ntime_steps: Number of time steps.\nepi_model: A Turing Epi model specification.\nlatent_model: A Turing Latent model specification.\nobservation_model: A Turing Observation model specification.\n\nReturns\n\nA DynamicPPPL.Model object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase.generated_observables-Tuple{DynamicPPL.Model, Any, Union{NamedTuple, MCMCChains.Chains}}","page":"Internal API","title":"EpiAware.EpiAwareBase.generated_observables","text":"generated_observables(\n model::DynamicPPL.Model,\n data,\n solution::Union{NamedTuple, MCMCChains.Chains}\n) -> EpiAwareObservables\n\n\nGenerate observables from a given model and solution including generated quantities.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareUtils._apply_direct_sample-Tuple{Any, Any, Int64}","page":"Internal API","title":"EpiAware.EpiAwareUtils._apply_direct_sample","text":"_apply_direct_sample(\n model,\n method,\n n_samples::Int64;\n kwargs...\n) -> Any\n\n\nSample the model directly using Turing.Prior() and a NamedTuple of the sampled random variables along with generated quantities.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareUtils._apply_direct_sample-Tuple{Any, Any, 
Nothing}","page":"Internal API","title":"EpiAware.EpiAwareUtils._apply_direct_sample","text":"_apply_direct_sample(\n model,\n method,\n n_samples::Nothing\n) -> Any\n\n\nSample the model directly using rand and return a single set of sampled random variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareUtils._check_and_give_ts-Tuple{Distributions.Distribution, Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils._check_and_give_ts","text":"_check_and_give_ts(\n dist::Distributions.Distribution,\n Δd,\n D,\n upper\n) -> Any\n\n\nInternal function to check censored_pmf arguments and return the time steps of the rightmost limits of the censor intervals.\n\n\n\n\n\n","category":"method"},{"location":"#EpiAware.jl","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"EpiAware.jl","text":"","category":"section"},{"location":"","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"EpiAware.jl: Real-time infectious disease monitoring","text":"Infectious disease situational awareness modelling toolkit for Julia.","category":"page"},{"location":"#Where-to-start","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"Where to start","text":"","category":"section"},{"location":"","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"EpiAware.jl: Real-time infectious disease monitoring","text":"Want to get started running code? Check out the Getting Started Tutorials.\nWhat is EpiAware? Check out our Overview.\nWant to see some end-to-end examples? Check out our EpiAware showcase.\nWant to understand the API? Check out our API Reference.\nWant to chat with someone about EpiAware? Post on our GitHub Discussions.\nWant to contribute to EpiAware? Check out our Developer documentation.\nWant to see our code? Check out our GitHub Repository.","category":"page"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"\n\n\n\n\n

              Fitting distributions using EpiAware and Turing PPL

              Introduction

              What are we going to do in this Vignette

               In this vignette, we'll demonstrate how to use EpiAwareUtils.∫F, the CDF function for censored delay distributions that underlies EpiAwareUtils.censored_pmf, in conjunction with the Turing PPL for Bayesian inference of epidemiological delay distributions. We'll cover the following key points:

              1. Simulating censored delay distribution data

              2. Fitting a naive model using Turing

              3. Evaluating the naive model's performance

              4. Fitting an improved model using censored delay functionality from EpiAware.

              5. Comparing the censored delay model's performance to the naive model

              What might I need to know before starting

               This note builds on the concepts introduced in the R/stan package primarycensoreddist, especially the Fitting distributions using primarycensoreddist and cmdstan vignette, and assumes familiarity with using Turing tools as covered in the Turing documentation.

              This note is generated using the EpiAware package locally via Pkg.develop, in the EpiAware/docs environment. It is also possible to install EpiAware using

              Pkg.add(url=\"https://github.com/CDCgov/Rt-without-renewal\", subdir=\"EpiAware\")

              Packages used in this vignette

              As well as EpiAware and Turing we will use Makie ecosystem packages for plotting and DataFramesMeta for data manipulation.

              \n\n
              let\n    docs_dir = dirname(dirname(dirname(@__DIR__)))\n    using Pkg: Pkg\n    Pkg.activate(docs_dir)\n    Pkg.instantiate()\nend
              \n\n\n\n

              The other dependencies are as follows:

              \n\n
              begin\n    using EpiAware.EpiAwareUtils: censored_pmf, censored_cdf, ∫F\n    using Random, Distributions, StatsBase #utilities for random events\n    using DataFramesMeta #Data wrangling\n    using CairoMakie, PairPlots #plotting\n    using Turing #PPL\nend
              \n\n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/#Simulating-censored-and-truncated-delay-distribution-data","page":"Fitting distributions with censored data","title":"Simulating censored and truncated delay distribution data","text":"","category":"section"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"
              \n

               We'll start by simulating some censored and truncated delay distribution data. We'll define an rpcens function for generating data.

              \n\n
              Random.seed!(123) # For reproducibility
              \n
              TaskLocalRNG()
              \n\n\n

              Define the true distribution parameters

              \n\n
              n = 2000
              \n
              2000
              \n\n
              meanlog = 1.5
              \n
              1.5
              \n\n
              sdlog = 0.75
              \n
              0.75
              \n\n
              true_dist = LogNormal(meanlog, sdlog)
              \n
              Distributions.LogNormal{Float64}(μ=1.5, σ=0.75)
              \n\n\n

              Generate varying pwindow, swindow, and obs_time lengths

              \n\n
              pwindows = rand(1:2, n)
              \n
              2000-element Vector{Int64}:\n 2\n 2\n 2\n 1\n 2\n 1\n 1\n ⋮\n 1\n 2\n 1\n 2\n 2\n 2
              \n\n
              swindows = rand(1:2, n)
              \n
              2000-element Vector{Int64}:\n 1\n 2\n 2\n 1\n 2\n 1\n 1\n ⋮\n 2\n 2\n 2\n 1\n 1\n 2
              \n\n
              obs_times = rand(8:10, n)
              \n
              2000-element Vector{Int64}:\n 10\n  9\n  9\n 10\n  9\n  8\n  8\n  ⋮\n  8\n  9\n  9\n 10\n  8\n  8
              \n\n\n

               We recreate the primary censored sampling function from primarycensoreddist, cf. the documentation here.

              \n\n
              \"\"\"\n    function rpcens(dist; pwindow = 1, swindow = 1, D = Inf, max_tries = 1000)\n\nDoes a truncated censored sample from `dist` with a uniform primary time on `[0, pwindow]`.\n\"\"\"\nfunction rpcens(dist; pwindow = 1, swindow = 1, D = Inf, max_tries = 1000)\n    T = zero(eltype(dist))\n    invalid_sample = true\n    attempts = 1\n    while (invalid_sample && attempts <= max_tries)\n        X = rand(dist)\n        U = rand() * pwindow\n        T = X + U\n        attempts += 1\n        if X + U < D\n            invalid_sample = false\n        end\n    end\n\n    @assert !invalid_sample \"censored value not found in $max_tries attempts\"\n\n    return (T ÷ swindow) * swindow\nend
              \n\n\n
              #Sample secondary time relative to beginning of primary censor window respecting the right-truncation\nsamples = map(pwindows, swindows, obs_times) do pw, sw, ot\n    rpcens(true_dist; pwindow = pw, swindow = sw, D = ot)\nend
              \n
              2000-element Vector{Float64}:\n 4.0\n 2.0\n 2.0\n 2.0\n 4.0\n 3.0\n 6.0\n ⋮\n 4.0\n 6.0\n 2.0\n 6.0\n 4.0\n 4.0
              \n\n\n

              Aggregate to unique combinations and count occurrences

              \n\n
              delay_counts = mapreduce(vcat, pwindows, swindows, obs_times, samples) do pw, sw, ot, s\n    DataFrame(\n        pwindow = pw,\n        swindow = sw,\n        obs_time = ot,\n        observed_delay = s,\n        observed_delay_upper = s + sw\n    )\nend |>\n               df -> @groupby(df, :pwindow, :swindow, :obs_time, :observed_delay,\n    :observed_delay_upper) |>\n                     gd -> @combine(gd, :n=length(:pwindow))
              \n
               row | pwindow | swindow | obs_time | observed_delay | observed_delay_upper | n
               1 | 1 | 1 | 8 | 0.0 | 1.0 | 1
               2 | 1 | 1 | 8 | 1.0 | 2.0 | 13
               3 | 1 | 1 | 8 | 2.0 | 3.0 | 32
               4 | 1 | 1 | 8 | 3.0 | 4.0 | 29
               5 | 1 | 1 | 8 | 4.0 | 5.0 | 34
               6 | 1 | 1 | 8 | 5.0 | 6.0 | 26
               7 | 1 | 1 | 8 | 6.0 | 7.0 | 19
               8 | 1 | 1 | 8 | 7.0 | 8.0 | 14
               9 | 1 | 1 | 9 | 0.0 | 1.0 | 2
               10 | 1 | 1 | 9 | 1.0 | 2.0 | 5
               ...
               80 | 2 | 2 | 10 | 8.0 | 10.0 | 22
              \n\n\n

              Compare the samples with and without secondary censoring to the true distribution and calculate empirical CDF

              \n\n
              empirical_cdf = ecdf(samples)
              \n
              ECDF{Vector{Float64}, Weights{Float64, Float64, Vector{Float64}}}([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0  …  9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0], Float64[])
              \n\n
              empirical_cdf_obs = ecdf(delay_counts.observed_delay, weights = delay_counts.n)
              \n
              ECDF{Vector{Float64}, Weights{Int64, Int64, Vector{Int64}}}([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0  …  8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 9.0, 9.0], [1, 2, 2, 13, 16, 21, 1, 13, 13, 9  …  9, 10, 9, 13, 17, 15, 12, 22, 8, 7])
              \n\n
              x_seq = range(minimum(samples), maximum(samples), 100)
              \n
              0.0:0.09090909090909091:9.0
              \n\n
              theoretical_cdf = x_seq |> x -> cdf(true_dist, x)
              \n
              100-element Vector{Float64}:\n 0.0\n 1.011597608751049e-7\n 9.643132895117507e-6\n 9.484054524759167e-5\n 0.0004058100212574347\n 0.0011393531997368723\n 0.0024911102275376566\n ⋮\n 0.8052522612515658\n 0.8091156793117527\n 0.8128920005554523\n 0.8165833494282897\n 0.8201917991805499\n 0.8237193727611859
              \n\n
              let\n    f = Figure()\n    ax = Axis(f[1, 1],\n        title = \"Comparison of Observed vs Theoretical CDF\",\n        ylabel = \"Cumulative Probability\",\n        xlabel = \"Delay\"\n    )\n    lines!(\n        ax, x_seq, empirical_cdf_obs, label = \"Empirical CDF\", color = :blue, linewidth = 2)\n    lines!(ax, x_seq, theoretical_cdf, label = \"Theoretical CDF\",\n        color = :black, linewidth = 2)\n    vlines!(ax, [mean(samples)], color = :blue, linestyle = :dash,\n        label = \"Empirical mean\", linewidth = 2)\n    vlines!(ax, [mean(true_dist)], linestyle = :dash,\n        label = \"Theoretical mean\", color = :black, linewidth = 2)\n    axislegend(position = :rb)\n\n    f\nend
              \n\n\n\n

              We've aggregated the data to unique combinations of pwindow, swindow, and obs_time and counted the number of occurrences of each observed_delay for each combination. This is the data we will use to fit our model.
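               As a quick consistency check (illustrative, using the objects defined above), the aggregated counts should account for every simulated delay:

               # All n = 2000 simulated delays should appear in the aggregated counts
               sum(delay_counts.n) == n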

              \n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/#Fitting-a-naive-model-using-Turing","page":"Fitting distributions with censored data","title":"Fitting a naive model using Turing","text":"","category":"section"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"
              \n

              We'll start by fitting a naive model using NUTS from Turing. We define the model in the Turing PPL.

              \n\n
              @model function naive_model(N, y, n)\n    mu ~ Normal(1.0, 1.0)\n    sigma ~ truncated(Normal(0.5, 1.0); lower = 0.0)\n    d = LogNormal(mu, sigma)\n\n    for i in eachindex(y)\n        Turing.@addlogprob! n[i] * logpdf(d, y[i])\n    end\nend
              \n
              naive_model (generic function with 2 methods)
              \n\n\n

               Now let's instantiate this model with data

              \n\n
              naive_mdl = naive_model(\n    size(delay_counts, 1),\n    delay_counts.observed_delay .+ 1e-6, # Add a small constant to avoid log(0)\n    delay_counts.n)
              \n
              DynamicPPL.Model{typeof(naive_model), (:N, :y, :n), (), (), Tuple{Int64, Vector{Float64}, Vector{Int64}}, Tuple{}, DynamicPPL.DefaultContext}(Main.var\"workspace#5\".naive_model, (N = 80, y = [1.0e-6, 1.000001, 2.000001, 3.000001, 4.000001, 5.000001, 6.000001, 7.000001, 1.0e-6, 1.000001  …  1.0e-6, 2.000001, 4.000001, 6.000001, 8.000001, 1.0e-6, 2.000001, 4.000001, 6.000001, 8.000001], n = [1, 13, 32, 29, 34, 26, 19, 14, 2, 5  …  13, 69, 59, 30, 12, 9, 69, 48, 29, 22]), NamedTuple(), DynamicPPL.DefaultContext())
              \n\n\n

              and now let's fit the compiled model.

              \n\n
              naive_fit = sample(naive_mdl, NUTS(), MCMCThreads(), 500, 4)
              \n
               row | iteration | chain | mu | sigma | lp | n_steps | is_accept | acceptance_rate | ...
               1 | 251 | 1 | 0.569828 | 3.16687 | -6326.42 | 3.0 | 1.0 | 0.889234
               2 | 252 | 1 | 0.51554 | 3.21602 | -6327.17 | 3.0 | 1.0 | 0.748919
               3 | 253 | 1 | 0.699124 | 3.15787 | -6327.72 | 7.0 | 1.0 | 0.895135
               4 | 254 | 1 | 0.523035 | 3.16482 | -6326.8 | 3.0 | 1.0 | 0.728358
               5 | 255 | 1 | 0.497583 | 3.17839 | -6327.16 | 3.0 | 1.0 | 0.840027
               6 | 256 | 1 | 0.548956 | 3.15474 | -6326.61 | 3.0 | 1.0 | 0.792753
               7 | 257 | 1 | 0.569335 | 3.16453 | -6326.43 | 3.0 | 1.0 | 1.0
               8 | 258 | 1 | 0.606234 | 3.2018 | -6326.53 | 3.0 | 1.0 | 0.977859
               9 | 259 | 1 | 0.530658 | 3.17006 | -6326.69 | 3.0 | 1.0 | 0.900204
               10 | 260 | 1 | 0.627117 | 3.11173 | -6327.41 | 7.0 | 1.0 | 0.761508
               ...
              \n\n
              summarize(naive_fit)
              \n
               row | parameters | mean | std | mcse | ess_bulk | ess_tail | rhat | ess_per_sec
               1 | :mu | 0.584129 | 0.0704067 | 0.00151449 | 2153.02 | 1487.54 | 1.00003 | 311.987
               2 | :sigma | 3.17766 | 0.0495775 | 0.00113542 | 1905.04 | 1306.94 | 1.00127 | 276.053
              \n\n
              let\n    f = pairplot(naive_fit)\n    vlines!(f[1, 1], [meanlog], linewidth = 4)\n    vlines!(f[2, 2], [sdlog], linewidth = 4)\n    f\nend
              \n\n\n\n

               We see that the model has converged and the diagnostics look good. However, just from the posterior summary we can see that the fit is poor: the posterior mean of mu (about 0.58) is well below the target 1.5, and the posterior mean of sigma (about 3.18) is well above the target 0.75.
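               To get a feel for how consequential this misfit is, here is a small illustrative check (our addition, plugging the approximate posterior means from the summary table above into a LogNormal) comparing the implied delay distribution with the true one:

               # Illustrative only: naive posterior means vs the true distribution
               naive_mean_dist = LogNormal(0.58, 3.18)
               median(naive_mean_dist), median(true_dist) # ≈ (1.8, 4.5): the naive median is far too short
               mean(naive_mean_dist), mean(true_dist)     # the naive mean is wildly inflated by the large sigma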

              \n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/#Fitting-an-improved-model-using-censoring-utilities","page":"Fitting distributions with censored data","title":"Fitting an improved model using censoring utilities","text":"","category":"section"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"
              \n

               We'll now fit an improved model using the ∫F function from EpiAware.EpiAwareUtils for calculating the CDF of the total delay from the beginning of the primary window to the secondary event time. This includes both the delay distribution we are making inference on and the time between the start of the primary censor window and the primary event. The ∫F function underlies the censored_pmf function from the EpiAware.EpiAwareUtils submodule.

               Using the ∫F function we can write a log-pmf function primary_censored_dist_lpmf that accounts for the primary and secondary censoring windows, as well as right truncation of the delay distribution at the maximum observation time D.

               This is the analogue of the function of the same name in primarycensoreddist: it calculates the log-probability of the secondary event occurring in the secondary censoring window, conditional on the primary event occurring in the primary censoring window, by calculating the increase in the CDF over the secondary window and rescaling by the probability of the secondary event occurring within the maximum observation time D.
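               In symbols, writing \(F_{\text{cens}}(t)\) for ∫F(dist, t, pwindow), the quantity computed below is (our rendering of the description above)

               $$\log p(y) = \log\big(F_{\text{cens}}(y_{\text{upper}}) - F_{\text{cens}}(y)\big) - \log F_{\text{cens}}(D),$$

               with the first term replaced by \(\log F_{\text{cens}}(y_{\text{upper}})\) when \(y = 0\).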

              \n\n
              function primary_censored_dist_lpmf(dist, y, pwindow, y_upper, D)\n    if y == 0.0\n        return log(∫F(dist, y_upper, pwindow)) - log(∫F(dist, D, pwindow))\n    else\n        return log(∫F(dist, y_upper, pwindow) - ∫F(dist, y, pwindow)) -\n               log(∫F(dist, D, pwindow))\n    end\nend
              \n
              primary_censored_dist_lpmf (generic function with 1 method)
              \n\n\n

              We make a new Turing model that now uses primary_censored_dist_lpmf rather than the naive uncensored and untruncated logpdf.

              \n\n
              @model function primarycensoreddist_model(y, y_upper, n, pws, Ds)\n    mu ~ Normal(1.0, 1.0)\n    sigma ~ truncated(Normal(0.5, 0.5); lower = 0.0)\n    dist = LogNormal(mu, sigma)\n\n    for i in eachindex(y)\n        Turing.@addlogprob! n[i] * primary_censored_dist_lpmf(\n            dist, y[i], pws[i], y_upper[i], Ds[i])\n    end\nend
              \n
              primarycensoreddist_model (generic function with 2 methods)
              \n\n\n

               Let's instantiate this model with data

              \n\n
              primarycensoreddist_mdl = primarycensoreddist_model(\n    delay_counts.observed_delay,\n    delay_counts.observed_delay_upper,\n    delay_counts.n,\n    delay_counts.pwindow,\n    delay_counts.obs_time\n)
              \n
              DynamicPPL.Model{typeof(primarycensoreddist_model), (:y, :y_upper, :n, :pws, :Ds), (), (), Tuple{Vector{Float64}, Vector{Float64}, Vector{Int64}, Vector{Int64}, Vector{Int64}}, Tuple{}, DynamicPPL.DefaultContext}(Main.var\"workspace#5\".primarycensoreddist_model, (y = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 0.0, 1.0  …  0.0, 2.0, 4.0, 6.0, 8.0, 0.0, 2.0, 4.0, 6.0, 8.0], y_upper = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 1.0, 2.0  …  2.0, 4.0, 6.0, 8.0, 10.0, 2.0, 4.0, 6.0, 8.0, 10.0], n = [1, 13, 32, 29, 34, 26, 19, 14, 2, 5  …  13, 69, 59, 30, 12, 9, 69, 48, 29, 22], pws = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1  …  2, 2, 2, 2, 2, 2, 2, 2, 2, 2], Ds = [8, 8, 8, 8, 8, 8, 8, 8, 9, 9  …  9, 9, 9, 9, 9, 10, 10, 10, 10, 10]), NamedTuple(), DynamicPPL.DefaultContext())
              \n\n\n

              Now let’s fit the compiled model.

              \n\n
              primarycensoreddist_fit = sample(\n    primarycensoreddist_mdl, NUTS(), MCMCThreads(), 1000, 4)
              \n
               row | iteration | chain | mu | sigma | lp | n_steps | is_accept | acceptance_rate | ...
               1 | 501 | 1 | 1.46819 | 0.771804 | -3376.41 | 3.0 | 1.0 | 1.0
               2 | 502 | 1 | 1.46877 | 0.738944 | -3375.1 | 3.0 | 1.0 | 0.999202
               3 | 503 | 1 | 1.49735 | 0.740651 | -3376.39 | 3.0 | 1.0 | 0.741842
               4 | 504 | 1 | 1.47618 | 0.762895 | -3375.61 | 3.0 | 1.0 | 0.983406
               5 | 505 | 1 | 1.48132 | 0.74067 | -3375.47 | 3.0 | 1.0 | 0.852127
               6 | 506 | 1 | 1.40746 | 0.711968 | -3375.63 | 7.0 | 1.0 | 0.914995
               7 | 507 | 1 | 1.44329 | 0.747661 | -3375.55 | 7.0 | 1.0 | 0.89498
               8 | 508 | 1 | 1.43568 | 0.734698 | -3375.17 | 3.0 | 1.0 | 0.977821
               9 | 509 | 1 | 1.42456 | 0.696408 | -3375.79 | 3.0 | 1.0 | 0.941795
               10 | 510 | 1 | 1.46966 | 0.758485 | -3375.46 | 5.0 | 1.0 | 1.0
               ...
              \n\n
              summarize(primarycensoreddist_fit)
              \n
               row | parameters | mean | std | mcse | ess_bulk | ess_tail | rhat | ess_per_sec
               1 | :mu | 1.45168 | 0.0355514 | 0.00108636 | 1087.88 | 1420.31 | 1.00169 | 60.9937
               2 | :sigma | 0.733008 | 0.0274925 | 0.00082729 | 1110.27 | 1643.86 | 1.00173 | 62.2486
              \n\n
              let\n    f = pairplot(primarycensoreddist_fit)\n    CairoMakie.vlines!(f[1, 1], [meanlog], linewidth = 3)\n    CairoMakie.vlines!(f[2, 2], [sdlog], linewidth = 3)\n    f\nend
              \n\n\n\n

              We see that the model has converged and the diagnostics look good. We also see that the posterior means are very near the true parameters and the 90% credible intervals include the true parameters.

              \n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"EditURL = \"https://github.com/CDCgov/Rt-without-renewal/blob/main/docs/src/getting-started/tutorials/censored-obs.jl\"","category":"page"},{"location":"lib/EpiInfModels/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Documentation for EpiInfModels.jl's public interface.","category":"page"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiInfModels/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInfModels/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiInfModels/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiInfModels]\nPrivate = false","category":"page"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels","page":"Public API","title":"EpiAware.EpiInfModels","text":"Module for defining epidemiological models.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.DirectInfections","page":"Public API","title":"EpiAware.EpiInfModels.DirectInfections","text":"struct DirectInfections{S<:Distributions.Sampleable} <: AbstractTuringEpiModel\n\nModel unobserved/latent infections as a transformation on a sampled latent process.\n\nMathematical specification\n\nIf Z_t is a realisation of the latent model, then the unobserved/latent infections are given by\n\nI_t = g(hatI_0 + Z_t)\n\nwhere g is a transformation function and the unconstrained initial infections hatI_0 are sampled from a prior distribution.\n\nDirectInfections are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of hatI_0. The default initialisation_prior is Normal().\n\nConstructors\n\nDirectInfections(; data, initialisation_prior)\n\nExample usage with generate_latent_infs\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct a DirectInfections struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create a DirectInfections model\ndirect_inf_model = DirectInfections(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100)\nlatent_inf = generate_latent_infs(direct_inf_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\nFields\n\ndata::EpiData: Epidata object.\ninitialisation_prior::Distributions.Sampleable: Prior distribution for the initialisation of the infections. Default is Normal().\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.EpiData","page":"Public API","title":"EpiAware.EpiInfModels.EpiData","text":"struct EpiData{T<:Real, F<:Function}\n\nThe EpiData struct represents epidemiological data used in infectious disease modeling.\n\nConstructors\n\nEpiData(gen_int, transformation::Function). Constructs an EpiData object with discrete\n\ngeneration interval gen_int and transformation function transformation.\n\nEpiData(;gen_distribution::ContinuousDistribution, D_gen, Δd = 1.0, transformation::Function = exp).\n\nConstructs an EpiData object with double interval censoring discretisation of the continuous next generation interval distribution gen_distribution with additional right truncation at D_gen. Δd sets the interval width (default = 1.0). transformation sets the transformation function\n\nExamples\n\nConstruction direct from discrete generation interval and transformation function:\n\nusing EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\ndata = EpiData(gen_int, g)\n\nConstruction from continuous distribution for generation interval.\n\nusing Distributions\n\ngen_distribution = Uniform(0.0, 10.0)\n\ndata = EpiData(;gen_distribution\n D_gen = 10.0)\n\n\n\nFields\n\ngen_int::Vector{T} where T<:Real: Discrete generation interval.\nlen_gen_int::Integer: Length of the discrete generation interval.\ntransformation::Function: Transformation function defining constrained and unconstrained domain bijections.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.ExpGrowthRate","page":"Public API","title":"EpiAware.EpiInfModels.ExpGrowthRate","text":"struct ExpGrowthRate{S<:Distributions.Sampleable} <: AbstractTuringEpiModel\n\nModel unobserved/latent infections as due to time-varying exponential growth rate r_t which is generated by a latent process.\n\nMathematical specification\n\nIf Z_t is a realisation of the latent model, then the unobserved/latent infections are given by\n\nI_t = g(hatI_0) exp(Z_t)\n\nwhere g is a transformation function and the unconstrained initial infections hatI_0 are sampled from a prior distribution.\n\nExpGrowthRate are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of hatI_0. 
The default initialisation_prior is Normal().\n\nConstructor\n\nExpGrowthRate(; data, initialisation_prior).\n\nExample usage with generate_latent_infs\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an ExpGrowthRate model\nexp_growth_model = ExpGrowthRate(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(exp_growth_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\nFields\n\ndata::EpiData\ninitialisation_prior::Distributions.Sampleable\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.ODEParams","page":"Public API","title":"EpiAware.EpiInfModels.ODEParams","text":"struct ODEParams{T}\n\nA structure to hold the initial condition and parameters for an ODE (Ordinary Differential Equation) process. params::ODEParams is used in the method generate_latent_infs(epi_model::ODEProcess, params::ODEParams)\n\nConstructors\n\nODEParams(; u0::VecOrMat, p::VecOrMat): Create an ODEParams object with the initial condition(s) u0 and parameters p.\n\nExample\n\nusing EpiAware\nparams = ODEParams(; u0 = ones(10), p = [2, 3])\n\n# output\n\nODEParams{Float64}([1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0], [2.0, 3.0])\n\n\n\nFields\n\nu0::VecOrMat: The initial condition(s) for the ODE, which can be a vector or matrix of type T.\np::VecOrMat: The parameters for the ODE, which can be a vector or matrix of type T.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.ODEProcess","page":"Public API","title":"EpiAware.EpiInfModels.ODEProcess","text":"struct ODEProcess{P<:SciMLBase.ODEProblem, T, S, F<:Function} <: AbstractTuringEpiModel\n\nA structure representing an infection process modeled by an Ordinary Differential Equation (ODE).\n\nBackground\n\nThe purpose of this structure is to define the behaviour of modelling an infection process using an ODE. We use the SciML ecosystem to define and solve the ODE. 
For ODEProcess structs we focus on defining from a restricted set of ODE problems:\n\nThe initial condition u0 must be a vector or matrix.\nThe parameters p must be a vector or matrix.\nThe output of the ODE should be interpreted as the infection incidence at each time point in\n\nts via the function sol2infs which maps the solution object sol of the ODE solve to infection counts.\n\nConstructors\n\nODEProcess(prob::ODEProblem; ts, solver, sol2infs): Create an ODEProcess\n\nobject with the ODE problem prob, time points ts, solver solver, and function sol2infs.\n\nExample\n\nusing EpiAware, OrdinaryDiffEq\nr = log(2) / 7 # Growth rate corresponding to 7 day doubling time\nu0 = [1.0]\np = [r]\nparams = ODEParams(u0 = u0, p = p)\n\n# Define the ODE problem using SciML\n# We use a simple exponential growth model\n\nfunction expgrowth(du, u, p, t)\n du[1] = p[1] * u[1]\nend\nprob = ODEProblem(expgrowth, u0, (0.0, 10.0), p)\n\n# Define the ODEProcess\n\nexpgrowth_model = ODEProcess(prob::ODEProblem; ts = 0:1:10,\n solver = Tsit5(),\n sol2infs = sol -> sol[1, :])\n\n# Generate the latent infections\nI_t = generate_latent_infs(expgrowth_model, params)()\n\n# output\n\n11-element Vector{Float64}:\n 1.0\n 1.1040895124087677\n 1.2190137467993492\n 1.3459001375697022\n 1.4859941865014936\n 1.640671113705054\n 1.8114471151863056\n 1.9999990356297939\n 2.2081789476865237\n 2.438027196361022\n 2.6918002758361723\n\n\n\nFields\n\nprob::SciMLBase.ODEProblem: The ODE problem instance, where P is a subtype of ODEProblem.\nts::Vector: A vector of time points, where T is the type of the time points.\nsolver::Any: The solver used for the ODE problem.\nsol2infs::Function: A function that maps the solution object of the ODE to infection counts.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.Renewal","page":"Public API","title":"EpiAware.EpiInfModels.Renewal","text":"struct Renewal{E, S<:Distributions.Sampleable, A} <: AbstractTuringRenewal\n\nModel unobserved/latent infections as due to time-varying Renewal model with reproduction number mathcalR_t which is generated by a latent process.\n\nMathematical specification\n\nIf Z_t is a realisation of the latent model, then the unobserved/latent infections are given by\n\nbeginalign\nmathcalR_t = g(Z_t)\nI_t = mathcalR_t sum_i=1^n-1 I_t-i g_i qquad t geq 1 \nI_t = g(hatI_0) exp(r(mathcalR_1) t) qquad t leq 0\nendalign\n\nwhere g is a transformation function and the unconstrained initial infections hatI_0 are sampled from a prior distribution. The discrete generation interval is given by g_i.\n\nr(mathcalR_1) is the exponential growth rate implied by mathcalR_1) using the implicit relationship between the exponential growth rate and the reproduction number.\n\nmathcalR sum_j geq 1 g_j exp(- r j)= 1\n\nRenewal are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of hatI_0. The default initialisation_prior is Normal().\n\nConstructors\n\nRenewal(; data, initialisation_prior). Construct a Renewal model with default update steps.\nRenewal(data; initialisation_prior). Construct a Renewal model with default update steps.\nRenewal(data, initialisation_prior, recurrent_step) Construct a Renewal model with recurrent_step update step function.\n\nExample usage with generate_latent_infs\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an Renewal struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an Renewal model\nrenewal_model = Renewal(data; initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(renewal_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\nFields\n\ndata::Any\ninitialisation_prior::Distributions.Sampleable\nrecurrent_step::Any\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.R_to_r-Union{Tuple{T}, Tuple{Any, Vector{T}}} where T<:AbstractFloat","page":"Public API","title":"EpiAware.EpiInfModels.R_to_r","text":"R_to_r(\n R₀,\n w::Array{T<:AbstractFloat, 1};\n newton_steps,\n Δd\n) -> Any\n\n\nThis function computes an approximation to the exponential growth rate r given the reproductive ratio R₀ and the discretized generation interval w with discretized interval width Δd. This is based on the implicit solution of\n\nG(r) - 1 over R_0 = 0\n\nwhere\n\nG(r) = sum_i=1^n w_i e^-r i\n\nis the negative moment generating function (MGF) of the generation interval distribution.\n\nThe two step approximation is based on: 1. Direct solution of implicit equation for a small r approximation. 2. 
Improving the approximation using Newton's method for a fixed number of steps newton_steps.\n\nReturns:\n\nThe approximate value of r.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.expected_Rt-Tuple{EpiData, Vector{<:Real}}","page":"Public API","title":"EpiAware.EpiInfModels.expected_Rt","text":"expected_Rt(\n data::EpiData,\n infections::Vector{<:Real}\n) -> Any\n\n\nCalculate the expected Rt values based on the given EpiData object and infections.\n\nR_t = fracI_tsum_i=1^n I_t-i g_i\n\nArguments\n\ndata::EpiData: An instance of the EpiData type containing generation interval data.\ninfections::Vector{<:Real}: A vector of infection data.\n\nReturns\n\nexp_Rt::Vector{Float64}: A vector of expected Rt values.\n\nExamples\n\nusing EpiAware\n\ndata = EpiData([0.2, 0.3, 0.5], exp)\ninfections = [100, 200, 300, 400, 500]\nexpected_Rt(data, infections)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.r_to_R-Tuple{Any, AbstractVector}","page":"Public API","title":"EpiAware.EpiInfModels.r_to_R","text":"r_to_R(r, w::AbstractVector) -> Any\n\n\nr_to_R(r, w)\n\nCompute the reproductive ratio given exponential growth rate r and discretized generation interval w.\n\nArguments\n\nr: The exponential growth rate.\nw: discretized generation interval.\n\nReturns\n\nThe reproductive ratio.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/#EpiAwareUtils.jl","page":"Overview","title":"EpiAwareUtils.jl","text":"","category":"section"},{"location":"lib/EpiAwareUtils/","page":"Overview","title":"Overview","text":"This package provides utility functions for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiAwareUtils/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiAwareUtils/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiAwareUtils/public.md\", \"lib/EpiAwareUtils/internals.md\"]","category":"page"},{"location":"lib/EpiInference/#EpiInference.jl","page":"Overview","title":"EpiInference.jl","text":"","category":"section"},{"location":"lib/EpiInference/","page":"Overview","title":"Overview","text":"This package provides inference algorithms for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiInference/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiInference/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiInference/public.md\", \"lib/EpiInference/internals.md\"]","category":"page"},{"location":"lib/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Documentation for EpiAware.jl's public interface.","category":"page"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/public/#Public-API","page":"Public API","title":"Public 
API","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware]\nPrivate = false","category":"page"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"\n\n\n\n\n

              Example: Early COVID-19 case data in South Korea

               In this example we use EpiAware functionality to largely recreate an epidemiological model presented in On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective, Mishra et al (2020). Mishra et al consider test-confirmed cases of COVID-19 in South Korea between January and July 2020. The components of the epidemiological model they consider are:

              $$I_t = R_t \\sum_{s\\geq 1} I_{t-s} g_s.$$

              $$G \\sim \\text{Gamma}(6.5,0.62).$$

              $$C_t \\sim \\text{NegBin}(\\text{mean} = I_t,~ \\text{overdispersion} = \\phi).$$

               In the examples below we are going to largely recreate the Mishra et al model, whilst emphasising that each component of the overall epidemiological model is, itself, a stand-alone model that can be sampled from.

              \n\n\n\n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Dependencies-for-this-notebook","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Dependencies for this notebook","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

               Now we want to import these dependencies into scope. If evaluating these code lines/blocks in the REPL, the REPL will offer to install any missing dependencies. Alternatively, you can add them to your active environment using Pkg.add.

              \n\n
              using EpiAware
              \n\n\n
              using Turing, DynamicPPL #Underlying Turing ecosystem packages to interact with models
              \n\n\n
              using Distributions, Statistics #Statistics packages
              \n\n\n
              using CSV, DataFramesMeta #Data wrangling
              \n\n\n
              using CairoMakie, PairPlots, TimeSeries #Plotting backend
              \n\n\n
              using ReverseDiff #Automatic differentiation backend
              \n\n\n
              begin #Date utility and set Random seed\n    using Dates\n    using Random\n    Random.seed!(1)\nend
              \n
              TaskLocalRNG()
              \n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Load-early-SARS-2-case-data-for-South-Korea","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Load early SARS-2 case data for South Korea","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

               First, we make sure that the data we want to analyse is in scope by downloading it from where we have saved a copy in the EpiAware repository.

               NB: The case data is curated by the covidregionaldata package. We accessed the South Korean case data using a short R script. It is possible to interface directly from a Julia session using the RCall.jl package, but we do not do so here, to reduce the number of underlying dependencies required to run this notebook.

              \n\n
              url = \"https://raw.githubusercontent.com/CDCgov/Rt-without-renewal/main/EpiAware/docs/src/showcase/replications/mishra-2020/south_korea_data.csv2\"
              \n
              \"https://raw.githubusercontent.com/CDCgov/Rt-without-renewal/main/EpiAware/docs/src/showcase/replications/mishra-2020/south_korea_data.csv2\"
              \n\n
              data = CSV.read(download(url), DataFrame)
              \n
               row | Column1 | date | cases_new | deaths_new
               1 | 1 | 2019-12-31 | 0 | 0
               2 | 2 | 2020-01-01 | 0 | 0
               3 | 3 | 2020-01-02 | 0 | 0
               4 | 4 | 2020-01-03 | 0 | 0
               5 | 5 | 2020-01-04 | 0 | 0
               6 | 6 | 2020-01-05 | 0 | 0
               7 | 7 | 2020-01-06 | 0 | 0
               8 | 8 | 2020-01-07 | 0 | 0
               9 | 9 | 2020-01-08 | 0 | 0
               10 | 10 | 2020-01-09 | 0 | 0
               ...
               214 | 214 | 2020-07-31 | 36 | 1
              \n\n\n

              Time-varying reproduction number as an AbstractLatentModel type

               EpiAware exposes an AbstractLatentModel abstract type; its purpose is to group stochastic processes which can be interpreted as generating time-varying parameters/quantities of interest, which we call latent process models.

              In the Mishra et al model the log-time varying reproductive number \\(Z_t\\) is assumed to evolve as an auto-regressive process, AR(2):

              $$\\begin{align}\nR_t &= \\exp Z_t, \\\\\nZ_t &= \\rho_1 Z_{t-1} + \\rho_2 Z_{t-2} + \\epsilon_t, \\\\\n\\epsilon_t &\\sim \\text{Normal}(0, \\sigma^*).\n\\end{align}$$

              Where \\(\\rho_1,\\rho_2\\), which are the parameters of AR process, and \\(\\epsilon_t\\) is a white noise process with standard deviation \\(\\sigma^*\\).

              \n\n\n

              In EpiAware we determine the behaviour of a latent process by choosing a concrete subtype (i.e. a struct) of AbstractLatentModel which has fields that set the priors of the various parameters required for the latent process.

              The AR process has the struct AR <: AbstractLatentModel. The user can supply the priors for \\(\\rho_1,\\rho_2\\) in the field damp_priors, for \\(\\sigma^*\\) in the field std_prior, and the initial values \\(Z_1, Z_2\\) in the field init_priors.

              \n\n\n

               We choose priors based on Mishra et al using the Distributions.jl interface to probability distributions. Note that we constrain the AR parameters to \([0,1]\), as in Mishra et al, using the truncated function.

               In Mishra et al the standard deviation of the stationary distribution of \(Z_t\), \(\sigma\), has a standard normal prior conditioned to be positive, \(\sigma \sim \mathcal{N}^+(0,1)\). The value \(\sigma^*\) was then determined from a nonlinear function of the sampled \(\sigma, ~\rho_1, ~\rho_2\) values. Since Mishra et al give sharply informative priors for \(\rho_1,~\rho_2\) (see below), we simplify by calculating \(\sigma^*\) at the prior mode of \(\rho_1,~\rho_2\). This results in a \(\sigma^* \sim \mathcal{N}^+(0, 0.5)\) prior.
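               A quick numerical check of this simplification (our own, not from Mishra et al), using the standard stationary-variance identity for an AR(2) process, shows why a scale of 0.5 is reasonable:

               # Innovation sd σ* implied by a stationary sd σ = 1 at the prior modes ρ₁ = 0.8, ρ₂ = 0.1,
               # using Var(Z) = σ*² (1 - ρ₂) / ((1 + ρ₂) ((1 - ρ₂)² - ρ₁²))
               let ρ₁ = 0.8, ρ₂ = 0.1, σ = 1.0
                   σ * sqrt((1 + ρ₂) * ((1 - ρ₂)^2 - ρ₁^2) / (1 - ρ₂)) # ≈ 0.46, close to the 0.5 scale used below
               end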

              \n\n
              ar = AR(\n    damp_priors = reverse([truncated(Normal(0.8, 0.05), 0, 1),\n        truncated(Normal(0.1, 0.05), 0, 1)]),\n    std_prior = HalfNormal(0.5),\n    init_priors = [Normal(-1.0, 0.1), Normal(-1.0, 0.5)]\n)
              \n
              AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2)
              \n\n\n

              Turing model interface to the AR process

               As mentioned above, we can use this instance of the AR latent model to construct a Turing model object which implements the probabilistic behaviour determined by ar. We do this with the constructor function exposed by EpiAware, generate_latent, which combines an AbstractLatentModel subtype struct with the number of time steps for which we want to generate the latent process.

               As a refresher, recall that a Turing.Model object can be sampled from (drawing its parameters from their priors) and called to generate the quantities it models; we use both operations below.

              As a concrete example we create a model object for the AR(2) process we specified above for 50 time steps:

              \n\n
              ar_mdl = generate_latent(ar, 50)
              \n
              Model{typeof(generate_latent), (:latent_model, :n), (), (), Tuple{AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, Int64}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_latent, (latent_model = AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), n = 50), NamedTuple(), DefaultContext())
              \n\n\n
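               For example (illustrative, using the ar_mdl object just constructed), the two operations we rely on below are drawing the model's parameters from their priors and forward-simulating the latent process:

               # Draw the model's random variables (damping parameters, σ*, initial values) from their priors
               θ = rand(ar_mdl)
               # Forward-simulate one Z_t trajectory of length 50 from the model
               Z_t = ar_mdl()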

               Ultimately, this will only be one component of the full epidemiological model. However, it is useful to visualise its probabilistic behaviour for model diagnostics and prior predictive checking.

              We can spaghetti plot generative samples from the AR(2) process with the priors specified above.

              \n\n
              plt_ar_sample = let\n    n_samples = 100\n    ar_mdl_samples = mapreduce(hcat, 1:n_samples) do _\n        ar_mdl() .|> exp #Sample Z_t trajectories for the model\n    end\n\n    fig = Figure()\n    ax = Axis(fig[1, 1];\n        yscale = log10,\n        ylabel = \"Time varying Rₜ\",\n        title = \"$(n_samples) draws from the prior Rₜ model\"\n    )\n    for col in eachcol(ar_mdl_samples)\n        lines!(ax, col, color = (:grey, 0.1))\n    end\n    fig\nend
              \n\n\n\n

              This suggests that a priori we believe that there is a few percent chance of achieving very high \\(R_t\\) values, i.e. \\(R_t \\sim 10-1000\\) is not excluded by our priors.
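               We can put a rough number on that statement with a quick Monte Carlo estimate (illustrative; the threshold of 10 is just an example):

               # Prior probability that a sampled Rₜ trajectory ever exceeds 10
               prob_high_Rt = let n_draws = 1000
                   count(any(exp.(ar_mdl()) .> 10) for _ in 1:n_draws) / n_draws
               end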

              \n\n\n

              The Renewal model as an AbstractEpiModel type

               The abstract type exposed by EpiAware for models that generate infections is called AbstractEpiModel. As with latent models, different concrete subtypes of AbstractEpiModel define different classes of infection-generating process. In this case we want to implement a renewal model.

               The Renewal <: AbstractEpiModel type of struct needs two fields: data about the generation interval of the infectious disease, in the form of an EpiData object, so that it can implement the renewal equation, and an initialisation prior for the incidence at time zero.

              In Mishra et al they use an estimate of the serial interval of SARS-CoV-2 as an estimate of the generation interval.

              \n\n
              truth_GI = Gamma(6.5, 0.62)
              \n
              Distributions.Gamma{Float64}(α=6.5, θ=0.62)
              \n\n\n

               This is a continuous representation of the generation interval distribution, whereas the infection process will be formulated in discrete daily time steps. By default, EpiAware performs double interval censoring to convert our continuous estimate of the generation interval into a discretized version \(g_t\), whilst also applying left truncation such that \(g_0 = 0\) and normalising \(\sum_t g_t = 1.\)

              The constructor for converting a continuous estimate of the generation interval distribution into a usable discrete time estimate is EpiData.

              \n\n
              model_data = EpiData(gen_distribution = truth_GI)
              \n
              EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp)
              \n\n\n
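               As a quick check (illustrative) that the discretisation behaved as described above, the discretized generation interval has probability mass on days one to eight and sums to one:

               sum(model_data.gen_int), length(model_data.gen_int) # ≈ (1.0, 8)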

              We can compare the discretized generation interval with the continuous estimate, which in this example is the serial interval estimate.

              \n\n
              let\n    fig = Figure()\n    ax = Axis(fig[1, 1];\n        xticks = 0:14,\n        xlabel = \"Days\",\n        title = \"Continuous and discrete generation intervals\"\n    )\n    barplot!(ax, model_data.gen_int;\n        label = \"Discretized next gen pmf\"\n    )\n    lines!(truth_GI;\n        label = \"Continuous serial interval\",\n        color = :green\n    )\n    axislegend(ax)\n    fig\nend
              \n\n\n\n

              The user also needs to specify a prior for the log incidence at time zero, \\(\\log I_0\\). The initial history of latent infections \\(I_{-1}, I_{-2},\\dots\\) is constructed as

              $$I_t = e^{rt} I_0,\\qquad t = 0, -1, -2,...$$

              Where the exponential growth rate \\(r\\) is determined by the initial reproductive number \\(R_1\\) via the solution to the implicit equation,

              $$R_1 = 1 \\Big{/} \\sum_{t\\geq 1} e^{-rt} g_t$$
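               EpiInfModels exposes the function R_to_r which solves this implicit equation numerically. For example (illustrative; the solver settings newton_steps and Δd below are just example values):

               # Approximate growth rate r implied by R₁ = 3 and the discretized generation interval above
               using EpiAware.EpiInfModels: R_to_r
               r_example = R_to_r(3.0, model_data.gen_int; newton_steps = 4, Δd = 1.0)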

              \n\n
              log_I0_prior = Normal(log(1.0), 1.0)
              \n
              Distributions.Normal{Float64}(μ=0.0, σ=1.0)
              \n\n
              epi = Renewal(model_data; initialisation_prior = log_I0_prior)
              \n
              Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056]))
              \n\n\n

              NB: We don't implement a background infection rate in this model.

              \n\n\n

              Turing model interface to Renewal process

               As mentioned above, we can use this instance of the Renewal latent infection model to construct a Turing Model which implements the probabilistic behaviour determined by epi. We do this with the constructor function generate_latent_infs, which combines epi with a provided \(\log R_t\) time series.

              Here we choose an example where \\(R_t\\) decreases from \\(R_t = 3\\) to \\(R_t = 0.5\\) over the course of 50 days.

R_t_fixed = [0.5 + 2.5 / (1 + exp(t - 15)) for t in 1:50]

50-element Vector{Float64}:
 2.9999979211799306
 2.9999943491892553
 2.9999846395634946
 2.99995824644538
 2.9998865053282437
 2.9996915135600344
 2.999161624673834
 ⋮
 0.5000000000002339
 0.500000000000086
 0.5000000000000316
 0.5000000000000117
 0.5000000000000043
 0.5000000000000016

              latent_inf_mdl = generate_latent_infs(epi, log.(R_t_fixed))
              Model{typeof(generate_latent_infs), (:epi_model, :_Rt), (), (), Tuple{Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}, Vector{Float64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_latent_infs, (epi_model = Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056])), _Rt = [1.0986115957278464, 1.098610405062754, 1.0986071685094998, 1.0985983707197156, 1.0985744563952262, 1.098509454567543, 1.0983327911702674, 1.097852790994088, 1.09654964358037, 1.0930193012626002  …  -0.6931471805343999, -0.6931471805505477, -0.693147180556488, -0.6931471805586734, -0.6931471805594774, -0.6931471805597732, -0.693147180559882, -0.693147180559922, -0.6931471805599366, -0.6931471805599422]), NamedTuple(), DefaultContext())
plt_epi = let
    n_samples = 100
    #Sample unconditionally the underlying parameters of the model
    epi_mdl_samples = mapreduce(hcat, 1:n_samples) do _
        latent_inf_mdl()
    end
    fig = Figure()
    ax1 = Axis(fig[1, 1];
        title = "$(n_samples) draws from renewal model with chosen Rt",
        ylabel = "Latent infections"
    )
    ax2 = Axis(fig[2, 1];
        ylabel = "Rt"
    )
    for col in eachcol(epi_mdl_samples)
        lines!(ax1, col;
            color = (:grey, 0.1)
        )
    end
    lines!(ax2, R_t_fixed;
        linewidth = 2
    )
    fig
end

              Negative Binomial Observations as an ObservationModel type

In Mishra et al latent infections were assumed to occur on their observation day with negative binomial errors; this motivates using the serial interval (the time between symptom onset in a primary and a secondary case) rather than the generation interval distribution (the time between the infection times of a primary and a secondary case).

Observation models are set in EpiAware as concrete subtypes of an ObservationModel. The negative binomial error model without observation delays is set with a NegativeBinomialError struct. In Mishra et al the overdispersion parameter \\(\\phi\\) sets the relationship between the mean and variance of the negative binomial errors,

              $$\\text{var} = \\text{mean} + {\\text{mean}^2 \\over \\phi}.$$

In EpiAware, we default to a prior on \\(\\sqrt{1/\\phi}\\) because this quantity is approximately the coefficient of variation of the observation noise and is therefore easier to reason about a priori. We call this quantity the cluster factor.
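As an illustration of this relationship (the numbers here are arbitrary examples, not model outputs): a cluster factor of 0.25 corresponds to \\(\\phi = 16\\), so for a mean of 100 expected cases the negative binomial variance is 725.

cluster_factor = 0.25
ϕ = 1 / cluster_factor^2                     # ϕ = 16.0, since the cluster factor is √(1/ϕ)
mean_cases = 100.0
var_cases = mean_cases + mean_cases^2 / ϕ    # 100 + 10000 / 16 = 725.0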

A prior for \\(\\phi\\) was not specified in Mishra et al; we select one below, but we will condition on a fixed value in the analysis that follows.

obs = NegativeBinomialError(cluster_factor_prior = HalfNormal(0.1))

NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1))

              Turing model interface to the NegativeBinomialError model

We can construct a NegativeBinomialError model implementation as a Turing Model using the EpiAware generate_observations function.

Turing uses missing arguments to indicate variables that are to be sampled. We use this to obtain a forward model that samples observations, conditional on an underlying expected observation time series.
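The following toy Turing model (unrelated to EpiAware; names are chosen for illustration) shows the pattern: passing data conditions on it, whereas passing missing turns the same variable into a quantity that is sampled when the model is called.

using Turing

@model function toy_obs(y, μ)
    y ~ Poisson(μ)
    return y
end

toy_obs(4, 10.0)                 # y = 4 is treated as observed data
forward_mdl = toy_obs(missing, 10.0)
forward_mdl()                    # y is now sampled, returning a simulated count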


              First, we set an artificial expected cases curve.

expected_cases = [1000 * exp(-(t - 15)^2 / (2 * 4)) for t in 1:30]

30-element Vector{Float64}:
 2.289734845645553e-8
 6.691586091292782e-7
 1.5229979744712628e-5
 0.0002699578503363014
 0.003726653172078671
 0.04006529739295107
 0.33546262790251186
 ⋮
 0.003726653172078671
 0.0002699578503363014
 1.5229979744712628e-5
 6.691586091292782e-7
 2.289734845645553e-8
 6.101936677605324e-10

              obs_mdl = generate_observations(obs, missing, expected_cases)
              Model{typeof(generate_observations), (:obs_model, :y_t, :Y_t), (), (:y_t,), Tuple{NegativeBinomialError{HalfNormal{Float64}}, Missing, Vector{Float64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_observations, (obs_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1)), y_t = missing, Y_t = [2.289734845645553e-8, 6.691586091292782e-7, 1.5229979744712628e-5, 0.0002699578503363014, 0.003726653172078671, 0.04006529739295107, 0.33546262790251186, 2.187491118182885, 11.108996538242305, 43.93693362340742  …  11.108996538242305, 2.187491118182885, 0.33546262790251186, 0.04006529739295107, 0.003726653172078671, 0.0002699578503363014, 1.5229979744712628e-5, 6.691586091292782e-7, 2.289734845645553e-8, 6.101936677605324e-10]), NamedTuple(), DefaultContext())
plt_obs = let
    n_samples = 100
    obs_mdl_samples = mapreduce(hcat, 1:n_samples) do _
        θ = obs_mdl() #Sample unconditionally the underlying parameters of the model
    end
    fig = Figure()
    ax = Axis(fig[1, 1];
        title = "$(n_samples) draws from neg. bin. obs model",
        ylabel = "Observed cases"
    )
    for col in eachcol(obs_mdl_samples)
        scatter!(ax, col;
            color = (:grey, 0.2)
        )
    end
    lines!(ax, expected_cases;
        color = :red,
        linewidth = 3,
        label = "Expected cases"
    )
    axislegend(ax)
    fig
end

              Composing models into an EpiProblem

              Mishra et al follows a common pattern of having an infection generation process driven by a latent process with an observation model that links the infection process to a discrete valued time series of incidence data.

              In EpiAware we provide an EpiProblem constructor for this common epidemiological model pattern.

The constructor for an EpiProblem requires an epi_model, a latent_model, an observation_model and a tspan. The tspan keyword sets the range of the time index for the models.

epi_prob = EpiProblem(epi_model = epi,
    latent_model = ar,
    observation_model = obs,
    tspan = (45, 80))
              EpiProblem{Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}, AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, NegativeBinomialError{HalfNormal{Float64}}}(Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056])), AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1)), (45, 80))
              \n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Inference-Methods","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Inference Methods","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

We make inferences on the unobserved quantities, such as \\(R_t\\), by sampling from the model conditioned on the observed data. We generate the posterior samples using the No-U-Turn Sampler (NUTS).

To make NUTS more robust we provide manypathfinder, which is built on Pathfinder variational inference from Pathfinder.jl. manypathfinder runs nruns pathfinder processes on the inference problem and returns the pathfinder run with the maximum estimated ELBO.

The composition of doing variational inference as a pre-sampler step, which gets passed to NUTS initialisation, is defined using the EpiMethod struct, where a sequence of pre-sampler steps can be defined.

              EpiMethod also allows the specification of NUTS parameters, such as type of automatic differentiation, type of parallelism and number of parallel chains to sample.

num_threads = min(10, Threads.nthreads())

1

inference_method = EpiMethod(
    pre_sampler_steps = [ManyPathfinder(nruns = 4, maxiters = 100)],
    sampler = NUTSampler(
        adtype = AutoReverseDiff(compile = true),
        ndraws = 2000,
        nchains = num_threads,
        mcmc_parallel = MCMCThreads())
)
              EpiMethod{ManyPathfinder, NUTSampler{AutoReverseDiff{true}, MCMCThreads, UnionAll}}(ManyPathfinder[ManyPathfinder(10, 4, 100, 100)], NUTSampler{AutoReverseDiff{true}, MCMCThreads, UnionAll}(0.8, AutoReverseDiff(compile=true), MCMCThreads(), 1, 10, 1000.0, 0.0, 2000, AdvancedHMC.DiagEuclideanMetric, -1))
              \n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Inference-and-analysis","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Inference and analysis","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

              We supply the data as a NamedTuple with the y_t field containing the observed data, shortened to fit the chosen tspan of epi_prob.

south_korea_data = (y_t = data.cases_new[epi_prob.tspan[1]:epi_prob.tspan[2]],
    dates = data.date[epi_prob.tspan[1]:epi_prob.tspan[2]])
              (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], dates = [Date(\"2020-02-13\"), Date(\"2020-02-14\"), Date(\"2020-02-15\"), Date(\"2020-02-16\"), Date(\"2020-02-17\"), Date(\"2020-02-18\"), Date(\"2020-02-19\"), Date(\"2020-02-20\"), Date(\"2020-02-21\"), Date(\"2020-02-22\")  …  Date(\"2020-03-10\"), Date(\"2020-03-11\"), Date(\"2020-03-12\"), Date(\"2020-03-13\"), Date(\"2020-03-14\"), Date(\"2020-03-15\"), Date(\"2020-03-16\"), Date(\"2020-03-17\"), Date(\"2020-03-18\"), Date(\"2020-03-19\")])

In the epidemiological model it is hard to separately identify the AR parameters, such as the standard deviation of the AR process, and the cluster factor of the negative binomial observation model. The reason for this identifiability problem is that the model assumes no delay between infection and observation. Therefore, on any day the data could be explained either by \\(R_t\\) changing or by observation noise, and it is not easy to disentangle greater volatility in \\(R_t\\) from higher noise in the observations.

In models with latent delays, changes in \\(R_t\\) impact the observed cases over several days, which makes it easier to disentangle trend effects from observation-to-observation fluctuations.

To counteract this problem we condition the model on a fixed cluster factor value.

fixed_cluster_factor = 0.25

0.25

EpiAware has the generate_epiaware function, which joins an EpiProblem object with the data to produce a Turing model. This Turing model composes the three unit Turing models defined above: the Renewal infection generating process, the AR latent process for \\(\\log R_t\\), and the negative binomial observation model. Therefore, we can condition on variables as with any other Turing model.

mdl = generate_epiaware(epi_prob, south_korea_data) |
      (var"obs.cluster_factor" = fixed_cluster_factor,)
              Model{typeof(generate_epiaware), (:y_t, :time_steps, :epi_model), (:latent_model, :observation_model), (), Tuple{Vector{Int64}, Int64, Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}}, Tuple{AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, NegativeBinomialError{HalfNormal{Float64}}}, ConditionContext{@NamedTuple{obs.cluster_factor::Float64}, DefaultContext}}(EpiAware.EpiAwareBase.generate_epiaware, (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], time_steps = 36, epi_model = Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056]))), (latent_model = AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), observation_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1))), ConditionContext((var\"obs.cluster_factor\" = 0.25,), DynamicPPL.DefaultContext()))

              Sampling with apply_method

The apply_method function combines the elements above (the conditioned Turing model, the inference method and the data) and returns a collection of results:

inference_results = apply_method(mdl,
    inference_method,
    south_korea_data
)
              EpiAwareObservables(Model{typeof(generate_epiaware), (:y_t, :time_steps, :epi_model), (:latent_model, :observation_model), (), Tuple{Vector{Int64}, Int64, Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}}, Tuple{AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, NegativeBinomialError{HalfNormal{Float64}}}, ConditionContext{@NamedTuple{obs.cluster_factor::Float64}, DefaultContext}}(EpiAware.EpiAwareBase.generate_epiaware, (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], time_steps = 36, epi_model = Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056]))), (latent_model = AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), observation_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1))), ConditionContext((var\"obs.cluster_factor\" = 0.25,), DynamicPPL.DefaultContext())), (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], dates = [Date(\"2020-02-13\"), Date(\"2020-02-14\"), Date(\"2020-02-15\"), Date(\"2020-02-16\"), Date(\"2020-02-17\"), Date(\"2020-02-18\"), Date(\"2020-02-19\"), Date(\"2020-02-20\"), Date(\"2020-02-21\"), Date(\"2020-02-22\")  …  Date(\"2020-03-10\"), Date(\"2020-03-11\"), Date(\"2020-03-12\"), Date(\"2020-03-13\"), Date(\"2020-03-14\"), Date(\"2020-03-15\"), Date(\"2020-03-16\"), Date(\"2020-03-17\"), Date(\"2020-03-18\"), Date(\"2020-03-19\")]), MCMC chain (2000×52×1 Array{Float64, 3}), @NamedTuple{generated_y_t::Vector{Int64}, I_t::Vector{Float64}, Z_t::Vector{Float64}}[(generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.31320720807054464, 0.29683750985248575, 0.4089938601616076, 1.3022343628622814, 1.0225626150832448, 2.8527817296411477, 10.369697939206759, 
30.004843225718137, 69.27683270045847, 197.8262544135264  …  165.82971938826418, 261.08028416983564, 128.7797349208945, 98.10236805176936, 150.39291882409293, 144.96465663477696, 85.45247839555344, 76.49701858640442, 93.82203638614712, 150.84255039241933], Z_t = [-0.9622516557378923, -0.779767809194889, -0.22894416095706988, 1.1292735506394296, 0.9415632945314946, 1.7200973931688441, 2.6194952823620414, 3.0811689176981933, 2.9825028812997374, 3.015766126931429  …  -1.1445943026275176, -0.527355948876789, -1.0005686058194918, -1.0426429725004263, -0.4220655088301653, -0.2592548294034239, -0.6147274678250605, -0.6236497063004567, -0.342583682700895, 0.25399548116615]); (generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.45814991051360804, 0.5157565397914593, 0.7485784521593867, 1.9258424522354982, 1.881579386894302, 3.0862955995411405, 23.768073533134828, 32.469112917123354, 79.08124126301547, 133.2426021603086  …  218.50970158839397, 189.3209377495298, 162.14089882072605, 74.82939396174662, 84.68530229352427, 115.38772530731804, 74.09774056573012, 78.26487650435531, 118.86475219175041, 125.40968754401513], Z_t = [-1.0088100650997336, -0.639544659082267, -0.026428595731269766, 1.106648226738017, 1.0986425367420274, 1.3272100844568537, 2.9971592948443506, 2.679686874488113, 2.5981760009741737, 2.257218180409069  …  -0.771390714439445, -0.8621512462984372, -0.8677931302838009, -1.4402257812121877, -1.093298024516366, -0.5310548172927552, -0.6966495722710407, -0.4237917262795986, 0.13517581524187788, 0.28543487337970164]); … ; (generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.5158893098958413, 0.4643438820242316, 0.573058040409358, 1.0591979428548484, 1.0216766089139648, 4.977924446799509, 10.379829593366294, 38.06015718537239, 52.785133791434795, 110.17269294072234  …  133.32317529108064, 246.07992816063788, 178.29360470154046, 124.97217683109014, 124.06804457042375, 71.61177951656336, 71.22461510270242, 101.9991809371847, 93.03147773366017, 120.5822766958333], Z_t = [-1.0210041815742075, -0.8713671807503602, -0.41324905463364925, 0.421555199727424, 0.5159404613230617, 2.051190024137349, 2.4937798436595635, 3.123698200942809, 2.5209349500859215, 2.3128754298302074  …  -1.4075716776839973, -0.7130153359849505, -0.8603864344110592, -0.9819602779371421, -0.7664180669401581, -1.1026213666869615, -0.9002959563439888, -0.3274085775203858, -0.20380639049023946, 0.1989039251891774]); (generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.504769766867223, 0.310306090042842, 0.5131669904549973, 1.5030702265370084, 2.802351779067001, 1.4698799404359122, 12.981457766592483, 20.113164564630615, 74.3333268644793, 216.22515597743123  …  253.24676521364242, 221.20271873483486, 102.18122690472096, 103.19829495194227, 116.67237754308549, 83.73332916634705, 77.75012482367323, 66.34548811778136, 91.87960122252447, 163.27415680770434], Z_t = [-0.9524412048170314, -1.2057488759480226, -0.4696039243518976, 0.839141397204691, 1.6035897556442378, 0.7535006022615907, 2.484868832014502, 2.4245438158415697, 2.9847584797143485, 3.1699589180089607  …  -0.6586714025253477, -0.6448978991097477, -1.227673886693326, -1.03657558668536, -0.7138395469441117, -0.8091533570285688, -0.6492157835649145, -0.6210108342818847, -0.13636614713928563, 0.5836160289527288]);;])
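The returned EpiAwareObservables object bundles the fitted model, the supplied data, the posterior samples and the generated quantities, so the individual components can be accessed directly (field names as in the struct shown above):

inference_results.model       # the Turing model that was sampled
inference_results.data        # the NamedTuple of data passed to apply_method
inference_results.samples     # the MCMC chain of posterior draws
inference_results.generated   # per-draw generated quantities (generated_y_t, I_t, Z_t)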

              Results and Predictive plotting

To assess the quality of the inference visually, we can plot predictive quantiles of generated case data from the version of the model that has not been conditioned on case data, using posterior parameters inferred from the version conditioned on observed data. For this purpose, we add a generated_quantiles utility function. This kind of visualisation is known as posterior predictive checking, and is a useful diagnostic tool for Bayesian inference (see here).

              We also plot the inferred \\(R_t\\) estimates from the model. We find that the EpiAware model recovers the main finding in Mishra et al; that the \\(R_t\\) in South Korea peaked at a very high value (\\(R_t \\sim 10\\) at peak) before rapidly dropping below 1 in early March 2020.

Note that, in reality, the peak \\(R_t\\) found here and in Mishra et al is unrealistically high; this might be due to a combination of factors.

              In a future note, we'll demonstrate having a time-varying ascertainment rate.

function generated_quantiles(gens, quantity, qs; transformation = x -> x)
    mapreduce(hcat, gens) do gen #loop over sampled generated quantities
        getfield(gen, quantity) |> transformation
    end |> mat -> mapreduce(hcat, qs) do q #Loop over matrix row to condense into qs
        map(eachrow(mat)) do row
            if any(ismissing, row)
                return missing
            else
                quantile(row, q)
            end
        end
    end
end

generated_quantiles (generic function with 1 method)

let
    C = south_korea_data.y_t
    D = south_korea_data.dates

    #Case unconditional model for posterior predictive sampling
    mdl_unconditional = generate_epiaware(epi_prob,
        (y_t = fill(missing, length(C)),)
    ) | (var"obs.cluster_factor" = fixed_cluster_factor,)
    posterior_gens = generated_quantities(mdl_unconditional, inference_results.samples)

    #plotting quantiles
    qs = [0.025, 0.25, 0.5, 0.75, 0.975]

    #Prediction quantiles
    predicted_y_t = generated_quantiles(posterior_gens, :generated_y_t, qs)
    predicted_R_t = generated_quantiles(
        posterior_gens, :Z_t, qs; transformation = x -> exp.(x))

    ts = D .|> d -> d - minimum(D) .|> d -> d.value + 1
    t_ticks = string.(D)
    fig = Figure()
    ax1 = Axis(fig[1, 1];
        ylabel = "Daily cases",
        xticks = (ts[1:14:end], t_ticks[1:14:end]),
        title = "Posterior predictive: Cases"
    )
    ax2 = Axis(fig[2, 1];
        yscale = log10,
        title = "Prediction: Reproduction number",
        xticks = (ts[1:14:end], t_ticks[1:14:end])
    )
    linkxaxes!(ax1, ax2)

    lines!(ax1, ts, predicted_y_t[:, 3];
        color = :purple,
        linewidth = 2,
        label = "Post. median"
    )
    band!(ax1, 1:size(predicted_y_t, 1), predicted_y_t[:, 2], predicted_y_t[:, 4];
        color = (:purple, 0.4),
        label = "50%"
    )
    band!(ax1, 1:size(predicted_y_t, 1), predicted_y_t[:, 1], predicted_y_t[:, 5];
        color = (:purple, 0.2),
        label = "95%"
    )
    scatter!(ax1, C;
        color = :black,
        label = "Actual cases")
    axislegend(ax1)

    lines!(ax2, ts, predicted_R_t[:, 3];
        color = :green,
        linewidth = 2,
        label = "Post. median"
    )
    band!(ax2, 1:size(predicted_R_t, 1), predicted_R_t[:, 2], predicted_R_t[:, 4];
        color = (:green, 0.4),
        label = "50%"
    )
    band!(ax2, 1:size(predicted_R_t, 1), predicted_R_t[:, 1], predicted_R_t[:, 5];
        color = (:green, 0.2),
        label = "95%"
    )
    axislegend(ax2)

    fig
end

              Parameter inference

              We can interrogate the sampled chains directly from the samples field of the inference_results object.
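For example (assuming the MCMCChains interface, since the samples field holds an MCMCChains.Chains object), summary statistics and the parameter names used in the pair plot below can be obtained as follows.

chn = inference_results.samples
describe(chn)                  # posterior summary statistics and quantiles
chn.name_map.parameters        # parameter names, as used in the pair plot below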

let
    sub_chn = inference_results.samples[inference_results.samples.name_map.parameters[[1:5;
                                                                                       end]]]
    fig = pairplot(sub_chn)
    lines!(fig[1, 1], ar.std_prior, label = "Prior")
    lines!(fig[2, 2], ar.init_prior.v[1], label = "Prior")
    lines!(fig[3, 3], ar.init_prior.v[2], label = "Prior")
    lines!(fig[4, 4], ar.damp_prior.v[1], label = "Prior")
    lines!(fig[5, 5], ar.damp_prior.v[2], label = "Prior")
    lines!(fig[6, 6], epi.initialisation_prior, label = "Prior")

    fig
end
              \n\n\n","category":"page"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"EditURL = \"https://github.com/CDCgov/Rt-without-renewal/blob/main/docs/src/showcase/replications/mishra-2020/index.jl\"","category":"page"},{"location":"lib/EpiAwareBase/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Documentation for EpiAwareBae.jl's public interface.","category":"page"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiAwareBase/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareBase/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiAwareBase/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiAwareBase]\nPrivate = false","category":"page"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase","page":"Public API","title":"EpiAware.EpiAwareBase","text":"Module for defining abstract epidemiological types.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractAccumulationStep","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractAccumulationStep","text":"abstract type AbstractAccumulationStep\n\nAbstract type for all accumulation steps\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractBroadcastRule","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractBroadcastRule","text":"abstract type AbstractBroadcastRule\n\nAn abstract type representing a broadcast rule.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiMethod","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiMethod","text":"abstract type AbstractEpiMethod\n\nAbstract supertype for all EpiAware inference/generative modelling methods.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiModel","text":"abstract type AbstractEpiModel <: AbstractModel\n\nThe abstract supertype for all structs that define a model for generating unobserved/latent infections.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiOptMethod","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiOptMethod","text":"abstract type AbstractEpiOptMethod <: AbstractEpiMethod\n\nAbstract supertype for infence/generative methods that are based on 
optimization, e.g. MAP estimation or variational inference.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiProblem","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiProblem","text":"abstract type AbstractEpiProblem\n\nAbstract supertype for all EpiAware problems.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiSamplingMethod","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiSamplingMethod","text":"abstract type AbstractEpiSamplingMethod <: AbstractEpiMethod\n\nAbstract supertype for infence/generative methods that are based on sampling from the posterior distribution, e.g. NUTS.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractLatentModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractLatentModel","text":"abstract type AbstractLatentModel <: AbstractModel\n\nThe abstract supertype for all structs that define a model for generating a latent process used in EpiAware models.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractObservationModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractObservationModel","text":"abstract type AbstractObservationModel <: AbstractModel\n\nA type representing an abstract observation model that is a subtype of AbstractModel.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringEpiModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringEpiModel","text":"abstract type AbstractTuringEpiModel <: AbstractEpiModel\n\nA abstract type representing a Turing-based epidemiological model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringIntercept","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringIntercept","text":"abstract type AbstractTuringIntercept <: AbstractTuringLatentModel\n\nA abstract type used to define the common interface for intercept models.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringLatentModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringLatentModel","text":"abstract type AbstractTuringLatentModel <: AbstractLatentModel\n\nA abstract type representing a Turing-based Latent model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringObservationErrorModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringObservationErrorModel","text":"abstract type AbstractTuringObservationErrorModel <: AbstractTuringObservationModel\n\nThe abstract supertype for all structs that defines a Turing-based model for generating observation errors.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringObservationModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringObservationModel","text":"abstract type AbstractTuringObservationModel <: AbstractObservationModel\n\nA abstract type representing a Turing-based observation model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringRenewal","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringRenewal","text":"abstract 
type AbstractTuringRenewal <: AbstractTuringEpiModel\n\nAbstract type for all Turing-based Renewal infection generating models.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.EpiAwareObservables","page":"Public API","title":"EpiAware.EpiAwareBase.EpiAwareObservables","text":"struct EpiAwareObservables\n\nThe EpiAwareObservables struct represents the observables used in the EpiAware model.\n\nFields\n\nmodel: The model used for the observables.\ndata: The data used for the observables.\nsamples: Samples from the posterior distribution.\ngenerated: The generated observables.\n\n\n\nFields\n\nmodel::Any\ndata::Any\nsamples::Any\ngenerated::Any\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.EpiMethod","page":"Public API","title":"EpiAware.EpiAwareBase.EpiMethod","text":"struct EpiMethod{O<:AbstractEpiOptMethod, S<:AbstractEpiSamplingMethod} <: AbstractEpiMethod\n\nEpiMethod represents a method for performing EpiAware inference and/or generative modelling, which combines a sequence of optimization steps to pass initialisation information to a sampler method.\n\n\n\nFields\n\npre_sampler_steps::Vector{O} where O<:AbstractEpiOptMethod: Pre-sampler optimization steps.\nsampler::AbstractEpiSamplingMethod: Sampler method.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.EpiProblem","page":"Public API","title":"EpiAware.EpiAwareBase.EpiProblem","text":"struct EpiProblem{E<:AbstractEpiModel, L<:AbstractLatentModel, O<:AbstractObservationModel} <: AbstractEpiProblem\n\nDefines an inference/generative modelling problem for case data.\n\nEpiProblem wraps the underlying components of an epidemiological model:\n\nepi_model: An epidemiological model for unobserved infections.\nlatent_model: A latent model for underlying latent process.\nobservation_model: An observation model for observed cases.\n\nAlong with a tspan tuple for the time span of the case data.\n\n\n\nFields\n\nepi_model::AbstractEpiModel: Epidemiological model for unobserved infections.\nlatent_model::AbstractLatentModel: Latent model for underlying latent process.\nobservation_model::AbstractObservationModel: Observation model for observed cases.\ntspan::Tuple{Int64, Int64}: Time span for either inference or generative modelling of case time series.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase._apply_method","page":"Public API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::AbstractEpiModel,\n method::AbstractEpiMethod;\n ...\n)\n_apply_method(\n model::AbstractEpiModel,\n method::AbstractEpiMethod,\n prev_result;\n kwargs...\n)\n\n\nApply the inference/generative method method to the AbstractEpiModel object mdl.\n\nArguments\n\nmodel::AbstractEpiModel: The model to apply the method to.\nmethod::AbstractEpiMethod: The epidemiological method to apply.\nprev_result: The previous result of the method.\nkwargs: Additional keyword arguments passed to the method.\n\nReturns\n\nnothing: If no concrete implementation is defined for the given method.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.apply_method-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.apply_method","text":"apply_method(\n model,\n method,\n data;\n kwargs...\n) -> EpiAwareObservables\n\n\nWrap the _apply_method function by calling it with the given model, method, data, and optional 
keyword arguments (kwargs). The resulting solution is then passed to the generated_observables function, along with the model and input data, to compute the generated observables.\n\nArguments\n\nmodel: The model to apply the method to.\nmethod: The method to apply to the model.\ndata: The data to pass to the apply_method function.\nkwargs: Optional keyword arguments to pass to the apply_method function.\n\nReturns\n\nThe generated observables computed from the solution.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.apply_method-Tuple{Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.apply_method","text":"apply_method(\n model,\n method;\n kwargs...\n) -> EpiAwareObservables\n\n\nCalls wrap_apply_method setting the data argument to nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.apply_method-Tuple{EpiProblem, AbstractEpiMethod, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.apply_method","text":"apply_method(\n epiproblem::EpiProblem,\n method::AbstractEpiMethod,\n data;\n fix_parameters,\n condition_parameters,\n kwargs...\n) -> EpiAwareObservables\n\n\nRun the EpiAware algorithm to estimate the parameters of an epidemiological model.\n\nArguments\n\nepiproblem::EpiProblem: An EpiProblem object specifying the epidemiological problem.\nmethod::EpiMethod: An EpiMethod object specifying the inference method.\ndata: The observed data used for inference.\n\nKeyword Arguments\n\nfix_parameters::NamedTuple: A NamedTuple of fixed parameters for the model.\ncondition_parameters::NamedTuple: A NamedTuple of conditioned parameters for the model.\nkwargs...: Additional keyword arguments passed to the inference methods.\n\nReturns\n\nA NamedTuple with a samples field which is the output of applying methods and a model field with the model used. Optionally, a gens field with the generated quantities from the model if that makes sense with the inference method.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.broadcast_n-Tuple{AbstractBroadcastRule, Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_n","text":"broadcast_n(\n broadcast_rule::AbstractBroadcastRule,\n latent,\n n,\n period\n)\n\n\nThis function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.\n\nThe broadcast_n function returns the length of the latent periods to generate using the given broadcast_rule. Which model of broadcasting to be implemented is set by the type of broadcast_rule. If no implemention is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.broadcast_rule-Tuple{AbstractBroadcastRule, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_rule","text":"broadcast_rule(\n broadcast_rule::AbstractBroadcastRule,\n n,\n period\n)\n\n\nThis function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.\n\nThe broadcast_rule function implements a model of broadcasting a latent process. Which model of broadcasting to be implemented is set by the type of broadcast_rule. 
If no implemention is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.condition_model-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.condition_model","text":"condition_model(\n model,\n fix_parameters,\n condition_parameters\n) -> Any\n\n\nCondition a model on fixed (i.e to a value) and conditioned (i.e to data) parameters.\n\nReturns\n\nmodel: The conditioned model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_epiaware-Tuple{Any, Any, AbstractEpiModel, AbstractLatentModel, AbstractObservationModel}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_epiaware","text":"generate_epiaware(\n y_t,\n time_step,\n epi_model::AbstractEpiModel,\n latent_model::AbstractLatentModel,\n observation_model::AbstractObservationModel\n)\n\n\nCreate an epi-aware model using the specified epimodel, latentmodel, and observation_model.\n\nArguments\n\ny_t: The observed data.\ntime_steps: The time steps.\nepi_model: An abstract epi model.\nlatent_model: An abstract latent model.\nobservation_model: An abstract observation model.\n\nReturns\n\nnothing\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_epiaware-Tuple{EpiProblem, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_epiaware","text":"generate_epiaware(epiproblem::EpiProblem, data) -> Any\n\n\nGenerate an epi-aware model given an EpiProblem and data.\n\nArguments\n\nepiproblem: Epi problem specification.\ndata: Observed data.\n\nReturns\n\nA tuple containing the generated quantities of the epi-aware model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_latent-Tuple{AbstractLatentModel, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::AbstractLatentModel, n) -> Any\n\n\nConstructor function for a latent process path Z_t of length n.\n\nThe generate_latent function implements a model of generating a latent process. Which model for generating the latent process infections is implemented is set by the type of latent_model. If no implemention is defined for the type of latent_model, then EpiAware will pass a warning and return nothing.\n\nInterface to Turing.jl probablilistic programming language (PPL)\n\nApart from the no implementation fallback method, the generate_latent implementation function should return a constructor function for a DynamicPPL.Model object. Sample paths of Z_t are generated quantities of the constructed model. Priors for model parameters are fields of epi_model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{AbstractEpiModel, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::AbstractEpiModel,\n Z_t\n) -> Any\n\n\nConstructor function for unobserved/latent infections based on the type of epi_model <: AbstractEpimodel and a latent process path Z_t.\n\nThe generate_latent_infs function implements a model of generating unobserved/latent infections conditional on a latent process. Which model of generating unobserved/latent infections to be implemented is set by the type of epi_model. 
If no implemention is defined for the given epi_model, then EpiAware will return a warning and return nothing.\n\nInterface to Turing.jl probablilistic programming language (PPL)\n\nApart from the no implementation fallback method, the generate_latent_infs implementation function returns a constructor function for a DynamicPPL.Model object where the unobserved/latent infections are a generated quantity. Priors for model parameters are fields of epi_model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_observations-Tuple{AbstractObservationModel, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::AbstractObservationModel,\n y_t,\n Y_t\n) -> Any\n\n\nConstructor function for generating observations based on the given observation model.\n\nThe generate_observations function implements a model of generating observations based on the given observation model. Which model of generating observations to be implemented is set by the type of obs_model. If no implemention is defined for the given obs_model, then EpiAware will return a warning and return nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generated_observables-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generated_observables","text":"generated_observables(\n model,\n data,\n solution\n) -> EpiAwareObservables\n\n\nGenerate observables from a given model and solution and return them as a EpiAwareObservables struct.\n\nArguments\n\nmodel: The model used for generating observables.\ndata: The data used for generating observables.\nsolution: The solution used for generating observables.\n\nReturns\n\nAn instance of EpiAwareObservables struct with the provided model, data, solution, and the generated observables if specified\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/#Explainers","page":"Overview","title":"Explainers","text":"","category":"section"},{"location":"getting-started/explainers/","page":"Overview","title":"Overview","text":"This section contains a series of explainers that provide a detailed overview of the EpiAware platform and its features. These explainers are designed to help you understand the platform and its capabilities, and to provide you with the information you need to get started using EpiAware. 
See the sidebar for the list of explainers.","category":"page"},{"location":"lib/EpiObsModels/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiObsModels.jl's internal interface.","category":"page"},{"location":"lib/EpiObsModels/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiObsModels/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiObsModels/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiObsModels]\nPublic = false","category":"page"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.LDStep","page":"Internal API","title":"EpiAware.EpiObsModels.LDStep","text":"struct LDStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep\n\nThe LatentDelay step function struct\n\n\n\nFields\n\nrev_pmf::AbstractVector{<:Real}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.LDStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiObsModels.LDStep","text":"The LatentDelay step function method for accumulate_scan.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{AbstractTuringObservationErrorModel, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::AbstractTuringObservationErrorModel,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations from an observation error model. It provides support for missing values in observations (y_t), and expected observations (Y_t) that are shorter than observations. When this is the case it assumes that the expected observations are the last length(Y_t) elements of y_t. It also pads the expected observations with a small value (1e-6) to mitigate potential numerical issues.\n\nIt dispatches to the observation_error function to generate the observation error distribution which uses priors generated by generate_observation_error_priors submodel. 
For most observation error models specific implementations of observation_error and generate_observation_error_priors are required but a specific implementation of generate_observations is not required.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{Ascertainment, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::Ascertainment,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations based on the LatentDelay observation model.\n\nArguments\n\nobs_model::Ascertainment: The Ascertainment model.\ny_t: The current state of the observations.\nY_t` : The expected observations.\n\nReturns\n\ny_t: The updated observations.\nexpected_aux: Additional expected observation-related variables.\nobs_aux: Additional observation-related variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{LatentDelay, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::LatentDelay,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations based on the LatentDelay observation model.\n\nArguments\n\nobs_model::LatentDelay: The LatentDelay observation model.\ny_t: The current observations.\nI_t: The current infection indicator.\n\nReturns\n\ny_t: The updated observations.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{StackObservationModels, NamedTuple, AbstractVector}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::StackObservationModels,\n y_t::NamedTuple,\n Y_t::AbstractVector\n) -> Any\n\n\nGenerate observations from a stack of observation models. Maps Y_t to a NamedTuple of the same length as y_t assuming a 1 to many mapping.\n\nArguments\n\nobs_model::StackObservationModels: The stack of observation models.\ny_t::NamedTuple: The observed values.\nY_t::AbstractVector: The expected values.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{StackObservationModels, NamedTuple, NamedTuple}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::StackObservationModels,\n y_t::NamedTuple,\n Y_t::NamedTuple\n) -> Any\n\n\nGenerate observations from a stack of observation models. 
Assumes a 1 to 1 mapping between y_t and Y_t.\n\nArguments\n\nobs_model::StackObservationModels: The stack of observation models.\ny_t::NamedTuple: The observed values.\nY_t::NamedTuple: The expected values.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareUtils.get_state-Tuple{EpiAware.EpiObsModels.LDStep, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::EpiAware.EpiObsModels.LDStep,\n initial_state,\n state\n) -> Any\n\n\nThe LatentDelay step function method for get_state.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.NegativeBinomialMeanClust-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiObsModels.NegativeBinomialMeanClust","text":"NegativeBinomialMeanClust(μ, α) -> SafeNegativeBinomial\n\n\nCompute the mean-cluster factor negative binomial distribution.\n\nArguments\n\nμ: The mean of the distribution.\nα: The clustering factor parameter.\n\nReturns\n\nA NegativeBinomial distribution object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.generate_observation_kernel-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiObsModels.generate_observation_kernel","text":"generate_observation_kernel(\n delay_int,\n time_horizon;\n partial\n) -> Any\n\n\nGenerate an observation kernel matrix based on the given delay interval and time horizon.\n\nArguments\n\ndelay_int::Vector{Float64}: The delay PMF vector.\ntime_horizon::Int: The number of time steps of the observation period.\npartial::Bool: Whether to generate a partial observation kernel matrix.\n\nReturns\n\nK::SparseMatrixCSC{Float64, Int}: The observation kernel matrix.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpInference.jl's internal interface.","category":"page"},{"location":"lib/EpiInference/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInference/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiInference/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiInference]\nPublic = false","category":"page"},{"location":"lib/EpiInference/internals/#EpiAware.EpiAwareBase._apply_method","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::ManyPathfinder;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::ManyPathfinder,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply a ManyPathfinder method to a DynamicPPL.Model object.\n\nIf prev_result is a vector of real numbers, then the ManyPathfinder method is applied with the initial values set to prev_result. 
Otherwise, the ManyPathfinder method is run with default initial values generated.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiInference/internals/#EpiAware.EpiAwareBase._apply_method-2","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::NUTSampler;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::NUTSampler,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply NUTS sampling to a DynamicPPL.Model object with prev_result representing any initial results to use for sampler initialisation.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._apply_nuts-Tuple{Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiInference._apply_nuts","text":"_apply_nuts(model, method, prev_result; kwargs...) -> Any\n\n\nNo initialisation NUTS.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._apply_nuts-Tuple{Any, Any, Pathfinder.PathfinderResult}","page":"Internal API","title":"EpiAware.EpiInference._apply_nuts","text":"_apply_nuts(\n model,\n method,\n prev_result::Pathfinder.PathfinderResult;\n kwargs...\n) -> Any\n\n\nInitialise NUTS with initial parameters from a Pathfinder result.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._continue_manypathfinder!-Tuple{Any, DynamicPPL.Model}","page":"Internal API","title":"EpiAware.EpiInference._continue_manypathfinder!","text":"_continue_manypathfinder!(\n pfs,\n mdl::DynamicPPL.Model;\n max_tries,\n nruns,\n kwargs...\n)\n\n\nContinue running the pathfinder algorithm until a pathfinder succeeds or the maximum number of tries is reached.\n\nArguments\n\npfs: An array of pathfinder objects.\nmdl::DynamicPPL.Model: The model to perform inference on.\nmax_tries: The maximum number of tries to run the pathfinder algorithm. Default is Inf.\nnruns: The number of times to run the pathfinder function.\nkwargs...: Additional keyword arguments passed to pathfinder.\n\nReturns\n\npfs: The updated array of pathfinder objects.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._get_best_elbo_pathfinder-Tuple{Any}","page":"Internal API","title":"EpiAware.EpiInference._get_best_elbo_pathfinder","text":"_get_best_elbo_pathfinder(pfs) -> Any\n\n\nSelects the pathfinder with the highest ELBO estimate from a list of pathfinders.\n\nArguments\n\npfs: A list of pathfinders results or Symbol values indicating failure.\n\nReturns\n\nThe pathfinder with the highest ELBO estimate.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._run_manypathfinder-Tuple{DynamicPPL.Model}","page":"Internal API","title":"EpiAware.EpiInference._run_manypathfinder","text":"_run_manypathfinder(mdl::DynamicPPL.Model; nruns, kwargs...)\n\n\nRun pathfinder multiple times and store the results in an array. 
Fails safely.\n\nArguments\n\nmdl::DynamicPPL.Model: The Turing model to be used for inference.\nnruns: The number of times to run the pathfinder function.\nkwargs...: Additional keyword arguments passed to pathfinder.\n\nReturns\n\nAn array of PathfinderResult objects or Symbol values indicating success or failure.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/modelling-infections/#Modelling-infections","page":"Modelling infections","title":"Modelling infections","text":"","category":"section"},{"location":"developer/contributing/#Contributing","page":"Contributing","title":"Contributing","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"This page details the some of the guidelines that should be followed when contributing to this package. It is adapted from Documenter.jl.","category":"page"},{"location":"developer/contributing/#Branches","page":"Contributing","title":"Branches","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"release-* branches are used for tagged minor versions of this package. This follows the same approach used in the main Julia repository, albeit on a much more modest scale.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Please open pull requests against the master branch rather than any of the release-* branches whenever possible.","category":"page"},{"location":"developer/contributing/#Backports","page":"Contributing","title":"Backports","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Bug fixes are backported to the release-* branches using git cherry-pick -x by a EpiAware member and will become available in point releases of that particular minor version of the package.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Feel free to nominate commits that should be backported by opening an issue. Requests for new point releases to be tagged in METADATA.jl can also be made in the same way.","category":"page"},{"location":"developer/contributing/#release-*-branches","page":"Contributing","title":"release-* branches","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Each new minor version x.y.0 gets a branch called release-x.y (a protected branch).\nNew versions are usually tagged only from the release-x.y branches.\nFor patch releases, changes get backported to the release-x.y branch via a single PR with the standard name \"Backports for x.y.z\" and label \"Type: Backport\". The PR message links to all the PRs that are providing commits to the backport. The PR gets merged as a merge commit (i.e. not squashed).\nThe old release-* branches may be removed once they have outlived their usefulness.\nPatch version milestones are used to keep track of which PRs get backported etc.","category":"page"},{"location":"developer/contributing/#Style-Guide","page":"Contributing","title":"Style Guide","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Follow the style of the surrounding text when making changes. When adding new features please try to stick to the following points whenever applicable. 
This project follows the SciML style guide.","category":"page"},{"location":"developer/contributing/#Tests","page":"Contributing","title":"Tests","text":"","category":"section"},{"location":"developer/contributing/#Unit-tests","page":"Contributing","title":"Unit tests","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"As is conventional for Julia packages, unit tests are located at test/*.jl with the entrypoint test/runtests.jl.","category":"page"},{"location":"developer/contributing/#End-to-end-testing","page":"Contributing","title":"End to end testing","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Tests that build example package docs from source and inspect the results (end to end tests) are located in /test/examples. The main entry points are test/examples/make.jl for building and test/examples/test.jl for doing some basic checks on the generated outputs.","category":"page"},{"location":"developer/contributing/#Pluto-usage-in-showcase-documentation","page":"Contributing","title":"Pluto usage in showcase documentation","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Some of the showcase examples in EpiAware/docs/src/showcase use Pluto.jl notebooks for the underlying computation. The output of the notebooks is rendered into HTML for inclusion in the documentation in two steps:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"PlutoStaticHTML.jl converts the notebook with output into a machine-readable .md format.\nDocumenter.jl renders the .md file into HTML for inclusion in the documentation during the build process.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"For other examples of using Pluto to generate documentation see the examples shown here.","category":"page"},{"location":"developer/contributing/#Running-Pluto-notebooks-from-EpiAware-locally","page":"Contributing","title":"Running Pluto notebooks from EpiAware locally","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"To run the Pluto.jl scripts in the EpiAware documentation directly from the source code you can do these steps:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Install Pluto.jl locally. 
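For example, one minimal way to do this from the Julia REPL (a sketch that assumes you are working from the repository root and want to use the pinned documentation environment) is:\n\n# Activate the documentation environment, which pins Pluto\nusing Pkg\nPkg.activate(\"EpiAware/docs\")\nPkg.instantiate()\n# Launch the Pluto notebook server\nusing Pluto\nPluto.run()\n\n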
We recommend using the version of Pluto that is pinned in the Project.toml file defining the documentation environment.\nClone the EpiAware repository.\nStart Pluto.jl either from REPL (see the Pluto.jl documentation) or from the command line with the shell script EpiAware/docs/pluto-scripts.sh.\nFrom the Pluto.jl interface, navigate to the Pluto.jl script you want to run.","category":"page"},{"location":"developer/contributing/#Contributing-to-Pluto-notebooks-in-EpiAware-documentation","page":"Contributing","title":"Contributing to Pluto notebooks in EpiAware documentation","text":"","category":"section"},{"location":"developer/contributing/#Modifying-an-existing-Pluto-notebook","page":"Contributing","title":"Modifying an existing Pluto notebook","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Committing changes to the Pluto.jl notebooks in the EpiAware documentation is the same as committing changes to any other part of the repository. However, please note that we expect the following features for the environment management of the notebooks:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Use the environment determined by the Project.toml file in the EpiAware/docs directory. If you want extra packages, add them to this environment.\nUse the version of EpiAware that is used in these notebooks to be the version of EpiAware on the branch being pull requested into main. To do this use the Pkg.develop function.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"To do this you can use the following code snippet in the Pluto notebook:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"# Determine the relative path to the `EpiAware/docs` directory\ndocs_dir = dirname(dirname(dirname(dirname(@__DIR__))))\n# Determine the relative path to the `EpiAware` package directory\npkg_dir = dirname(docs_dir)\n\nusing Pkg: Pkg\nPkg.activate(docs_dir)\nPkg.develop(; path = pkg_dir)\nPkg.instantiate()","category":"page"},{"location":"developer/contributing/#Adding-a-new-Pluto-notebook","page":"Contributing","title":"Adding a new Pluto notebook","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Adding a new Pluto.jl notebook to the EpiAware documentation is the same as adding any other file to the repository. However, in addition to following the guidelines for modifying an existing notebook, please note that the new notebook is added to the set of notebook builds using build in the EpiAware/docs/make.jl file. This will generate an .md of the same name as the notebook which can be rendered when makedocs is run. For this document to be added to the overall documentation the path to the .md file must be added to the Pages array defined in EpiAware/docs/pages.jl.","category":"page"},{"location":"developer/checklist/#Checklists","page":"Release checklist","title":"Checklists","text":"","category":"section"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"The purpose of this page is to collate a series of checklists for commonly performed changes to the source code of EpiAware. 
It has been adapted from Documenter.jl.","category":"page"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"In each case, copy the checklist into the description of the pull request.","category":"page"},{"location":"developer/checklist/#Making-a-release","page":"Release checklist","title":"Making a release","text":"","category":"section"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"In preparation for a release, use the following checklist. These steps should be performed on a branch with an open pull request, either for a topic branch, or for a new branch release-1.y.z (\"Release version 1.y.z\") if multiple changes have accumulated on the master branch since the last release.","category":"page"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"## Pre-release\n\n - [ ] Change the version number in `Project.toml`\n * If the release is breaking, increment MAJOR\n * If the release adds a new user-visible feature, increment MINOR\n * Otherwise (bug-fixes, documentation improvements), increment PATCH\n - [ ] Update `CHANGELOG.md`, following the existing style (in particular, make sure that the change log for this version has the correct version number and date).\n - [ ] Run `make changelog`, to make sure that all the issue references in `CHANGELOG.md` are up to date.\n - [ ] Check that the commit messages in this PR do not contain `[ci skip]`\n - [ ] Run https://github.com/JuliaDocs/Documenter.jl/actions/workflows/regression-tests.yml\n using a `workflow_dispatch` trigger to check for any changes that broke extensions.\n\n## The release\n\n - [ ] After merging the pull request, tag the release. There are two options for this:\n\n 1. [Comment `[at]JuliaRegistrator register` on the GitHub commit.](https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app)\n 2. Use [JuliaHub's package registration feature](https://help.juliahub.com/juliahub/stable/contribute/#registrator) to trigger the registration.\n\n Either of those should automatically publish a new version to the Julia registry.\n - Once registered, the `TagBot.yml` workflow should create a tag, and rebuild the documentation for this tag.\n - These steps can take quite a bit of time (1 hour or more), so don't be surprised if the new documentation takes a while to appear.","category":"page"},{"location":"lib/EpiObsModels/#EpiObsModels.jl","page":"Overview","title":"EpiObsModels.jl","text":"","category":"section"},{"location":"lib/EpiObsModels/","page":"Overview","title":"Overview","text":"This package provides observation models for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiObsModels/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiObsModels/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiObsModels/public.md\", \"lib/EpiObsModels/internals.md\"]","category":"page"},{"location":"lib/EpiAwareBase/#EpiAwareBase.jl","page":"Overview","title":"EpiAwareBase.jl","text":"","category":"section"},{"location":"lib/EpiAwareBase/","page":"Overview","title":"Overview","text":"This package provides the core functionality for the EpiAware ecosystem. 
It is a dependency of all other EpiAware packages.","category":"page"},{"location":"lib/EpiAwareBase/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiAwareBase/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiAwareBase/public.md\", \"lib/EpiAwareBase/internals.md\"]","category":"page"},{"location":"developer/#developer","page":"Overview","title":"Developer documentation","text":"","category":"section"},{"location":"developer/","page":"Overview","title":"Overview","text":"Welcome to the EpiAware developer documentation! This section is designed to help you get started with developing the package.","category":"page"},{"location":"lib/EpiAwareUtils/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Documentation for EpiAwareUtils.jl's public interface.","category":"page"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiAwareUtils/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareUtils/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiAwareUtils/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiAwareUtils]\nPrivate = false","category":"page"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils","page":"Public API","title":"EpiAware.EpiAwareUtils","text":"Module for defining utility functions.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.DirectSample","page":"Public API","title":"EpiAware.EpiAwareUtils.DirectSample","text":"struct DirectSample <: AbstractEpiSamplingMethod\n\nSample directly from a Turing model.\n\n\n\nFields\n\nn_samples::Union{Nothing, Int64}: Number of samples from a model. If an integer is provided, the model is sampled n_samples times using Turing.Prior() returning an MCMCChains.Chains object. 
If nothing, the model is sampled once returning a NamedTuple object of the sampled random variables along with generated quantities\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.HalfNormal","page":"Public API","title":"EpiAware.EpiAwareUtils.HalfNormal","text":"struct HalfNormal{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Continuous}\n\nCreate a half-normal prior distribution with the specified mean.\n\nArguments:\n\nμ: The mean of the half-normal distribution.\n\nReturns:\n\nA HalfNormal distribution with the specified mean.\n\nExamples:\n\nusing EpiAware, Distributions\n\nhn = HalfNormal(1.0)\n# output\nEpiAware.EpiAwareUtils.HalfNormal{Float64}(μ=1.0)\n\nfilter out all the values that are less than 0\n\nrand(hn)\n# output\n0.4508533245229199\n\ncdf(hn, 2)\n# output\n0.8894596502772643\n\nquantile(hn, 0.5)\n# output\n0.8453475393951495\n\nlogpdf(hn, 2)\n# output\n-3.1111166111445083\n\nmean(hn)\n# output\n1.0\n\nvar(hn)\n# output\n0.5707963267948966\n\n\n\nFields\n\nμ::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.SafeNegativeBinomial","page":"Public API","title":"EpiAware.EpiAwareUtils.SafeNegativeBinomial","text":"struct SafeNegativeBinomial{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Discrete}\n\nCreate a Negative binomial distribution with the specified mean that avoids InExactError when the mean is too large.\n\nParameterisation:\n\nWe are using a mean and cluster factorization of the negative binomial distribution such that the variance to mean relationship is:\n\nsigma^2 = mu + alpha^2 mu^2\n\nThe reason for this parameterisation is that at sufficiently large mean values (i.e. r > 1 / p) p is approximately equal to the standard fluctuation of the distribution, e.g. if p = 0.05 we expect typical fluctuations of samples from the negative binomial to be about 5% of the mean when the mean is notably larger than 20. Otherwise, we expect approximately Poisson noise. 
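For example, with mu = 1000 and alpha = 0.05 (illustrative numbers), the relationship sigma^2 = mu + alpha^2 mu^2 gives sigma^2 = 1000 + 2500 = 3500, i.e. a standard deviation of about 59, with the alpha^2 mu^2 term increasingly dominating as the mean grows. 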
In our opinion, this parameterisation is useful for specifying the distribution in a way that is easier to reason on priors for p.\n\nArguments:\n\nr: The number of successes, although this can be extended to a continous number.\np: Success rate.\n\nReturns:\n\nA SafeNegativeBinomial distribution with the specified mean.\n\nExamples:\n\nusing EpiAware, Distributions\n\nbigμ = exp(48.0) #Large value of μ\nσ² = bigμ + 0.05 * bigμ^2 #Large variance\n\n# We can calculate the success rate from the mean to variance relationship\np = bigμ / σ²\nr = bigμ * p / (1 - p)\nd = SafeNegativeBinomial(r, p)\n# output\nEpiAware.EpiAwareUtils.SafeNegativeBinomial{Float64}(r=20.0, p=2.85032816548187e-20)\n\ncdf(d, 100)\n# output\n0.0\n\nlogpdf(d, 100)\n# output\n-850.1397180331871\n\nmean(d)\n# output\n7.016735912097631e20\n\nvar(d)\n# output\n2.4617291430060293e40\n\n\n\nFields\n\nr::Real\np::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.SafePoisson","page":"Public API","title":"EpiAware.EpiAwareUtils.SafePoisson","text":"struct SafePoisson{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Discrete}\n\nCreate a Poisson distribution with the specified mean that avoids InExactError when the mean is too large.\n\nArguments:\n\nλ: The mean of the Poisson distribution.\n\nReturns:\n\nA SafePoisson distribution with the specified mean.\n\nExamples:\n\nusing EpiAware, Distributions\n\nbigλ = exp(48.0) #Large value of λ\nd = SafePoisson(bigλ)\n# output\nEpiAware.EpiAwareUtils.SafePoisson{Float64}(λ=7.016735912097631e20)\n\ncdf(d, 2)\n# output\n0.0\n\nlogpdf(d, 100)\n# output\n-7.016735912097631e20\n\nmean(d)\n# output\n7.016735912097631e20\n\nvar(d)\n# output\n7.016735912097631e20\n\n\n\nFields\n\nλ::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.accumulate_scan-Tuple{AbstractAccumulationStep, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareUtils.accumulate_scan","text":"accumulate_scan(\n acc_step::AbstractAccumulationStep,\n initial_state,\n ϵ_t\n) -> Any\n\n\nApply the `accumulate` function to the `AbstractAccumulationStep` object.\nThis is effectively a optimised version of a for loop that applies the\n`AbstractAccumulationStep` object to the input data in a single pass.\n\n# Arguments\n- `acc_step::AbstractAccumulationStep: The accumulation step function.\n- `initial_state`: The initial state of the accumulation.\n- `ϵ_t::AbstractVector{<:Real}`: The input data.\n\n# Returns\n- `state::AbstractVector{<:Real}`: The accumulated state as returned by the\n`get_state` function from the output of the `accumulate` function.\n\n# Examples\n```julia\nusing EpiAware\nstruct TestStep <: AbstractAccumulationStep\n a::Float64\nend\n\nfunction (step::TestStep)(state, ϵ)\n new_state = step.a * ϵ\n return new_state\nend\n\nacc_step = TestStep(0.5)\ninitial_state = zeros(3)\n\naccumulate_scan(acc_step, initial_state, [1.0, 2.0, 3.0])\n\nfunction get_state(acc_step::TestStep, initial_state, state)\n return state\nend\n\naccumulate_scan(acc_step, initial_state, [1.0, 2.0, 3.0])\n```\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.censored_cdf-Tuple{Distributions.Distribution}","page":"Public API","title":"EpiAware.EpiAwareUtils.censored_cdf","text":"censored_cdf(\n dist::Distributions.Distribution;\n Δd,\n D,\n upper\n) -> Any\n\n\nCreate a discrete probability cumulative distribution function (CDF) from a given distribution, assuming a 
uniform distribution over primary event times with censoring intervals of width Δd for both primary and secondary events.\n\nNB: censored_cdf returns the non-truncated CDF, i.e. the CDF without conditioning on the secondary event occuring either before or after some time.\n\nArguments\n\ndist: The distribution from which to create the PMF.\nΔd: The step size for discretizing the domain. Default is 1.0.\nD: The upper bound of the domain. Must be greater than Δd. Default D = nothing\n\nindicates that the distribution should be truncated at its upperth percentile rounded to nearest multiple of Δd.\n\nReturns\n\nA vector representing the CDF with 0.0 appended at the beginning.\n\nRaises\n\nAssertionError if the minimum value of dist is negative.\nAssertionError if Δd is not positive.\nAssertionError if D is shorter than Δd.\nAssertionError if D is not a multiple of Δd.\n\nExamples\n\nusing Distributions\nusing EpiAware.EpiAwareUtils\n\ndist = Exponential(1.0)\n\ncensored_cdf(dist; D = 10) |>\n p -> round.(p, digits=3)\n\n# output\n11-element Vector{Float64}:\n 0.0\n 0.368\n 0.767\n 0.914\n 0.969\n 0.988\n 0.996\n 0.998\n 0.999\n 1.0\n 1.0\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.censored_pmf-Tuple{Distributions.Distribution, Val{:single_censored}}","page":"Public API","title":"EpiAware.EpiAwareUtils.censored_pmf","text":"censored_pmf(\n dist::Distributions.Distribution,\n ::Val{:single_censored};\n primary_approximation_point,\n Δd,\n D\n)\n\n\nCreate a discrete probability mass function (PMF) from a given distribution, assuming that the primary event happens at primary_approximation_point * Δd within an intial censoring interval. Common single-censoring approximations are primary_approximation_point = 0 (left-hand approximation), primary_approximation_point = 1 (right-hand) and primary_approximation_point = 0.5 (midpoint).\n\nArguments\n\ndist: The distribution from which to create the PMF.\n::Val{:single_censored}: A dummy argument to dispatch to this method. The purpose of the Val\n\ntype argument is that to use single-censored approximation is an active decision.\n\nprimary_approximation_point: A approximation point for the primary time in its censoring interval.\n\nDefault is 0.5 for midpoint approximation.\n\nΔd: The step size for discretizing the domain. Default is 1.0.\nD: The upper bound of the domain. Must be greater than Δd.\n\nReturns\n\nA vector representing the PMF.\n\nRaises:\n\nAssertionError if the minimum value of dist is negative.\nAssertionError if Δd is not positive.\nAssertionError if D is not greater than Δd.\n\nExamples\n\nusing Distributions\nusing EpiAware.EpiAwareUtils\n\ndist = Exponential(1.0)\n\ncensored_pmf(dist, Val(:single_censored); D = 10) |>\n p -> round.(p, digits=3)\n\n# output\n10-element Vector{Float64}:\n 0.393\n 0.383\n 0.141\n 0.052\n 0.019\n 0.007\n 0.003\n 0.001\n 0.0\n 0.0\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.censored_pmf-Tuple{Distributions.Distribution}","page":"Public API","title":"EpiAware.EpiAwareUtils.censored_pmf","text":"censored_pmf(\n dist::Distributions.Distribution;\n Δd,\n D,\n upper\n) -> Any\n\n\nCreate a discrete probability mass function (PMF) from a given distribution, assuming a uniform distribution over primary event times with censoring intervals of width Δd for both primary and secondary events. 
The CDF for the time from the left edge of the interval containing the primary event to the secondary event is created by direct numerical integration (quadrature) of the convolution of the CDF of dist with the uniform density on [0,Δd), using the censored_cdf function. The discrete PMF for double censored delays is then found using simple differencing on the CDF.\n\nNB: censored_pmf returns a right-truncated PMF, i.e. the PMF conditioned on the secondary event occurring before or on the final secondary censoring window.\n\nArguments\n\ndist: The distribution from which to create the PMF.\nΔd: The step size for discretizing the domain. Default is 1.0.\nD: The upper bound of the domain. Must be greater than Δd. Default D = nothing\n\nindicates that the distribution should be truncated at its upperth percentile rounded to nearest multiple of Δd.\n\nReturns\n\nA vector representing the PMF.\n\nRaises\n\nAssertionError if the minimum value of dist is negative.\nAssertionError if Δd is not positive.\nAssertionError if D is shorter than Δd.\nAssertionError if D is not a multiple of Δd.\n\nExamples\n\nusing Distributions\nusing EpiAware.EpiAwareUtils\n\ndist = Exponential(1.0)\n\ncensored_pmf(dist; D = 10) |>\n p -> round.(p, digits=3)\n\n# output\n10-element Vector{Float64}:\n 0.368\n 0.4\n 0.147\n 0.054\n 0.02\n 0.007\n 0.003\n 0.001\n 0.0\n 0.0\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.get_param_array-Tuple{MCMCChains.Chains}","page":"Public API","title":"EpiAware.EpiAwareUtils.get_param_array","text":"get_param_array(chn::MCMCChains.Chains) -> Any\n\n\nExtract a parameter array from a Chains object chn that matches the shape of number of sample and chain pairs in chn.\n\nArguments\n\nchn::Chains: The Chains object containing the MCMC samples.\n\nReturns\n\nparam_array: An array of parameter samples, where each element corresponds to a single\n\nMCMC sample as a NamedTuple.\n\nExample\n\nSampling from a simple model which has both scalar and vector quantity random variables across 4 chains.\n\nusing Turing, MCMCChains, EpiAware\n\n@model function testmodel()\n y ~ Normal()\nend\nmdl = testmodel()\nchn = sample(mdl, Prior(), MCMCSerial(), 2, 1, progress=false)\n\nA = get_param_array(chn)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.get_state-Tuple{AbstractAccumulationStep, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::AbstractAccumulationStep,\n initial_state,\n state\n) -> Any\n\n\nProcesses the output of the `accumulate` function to return the final state.\n\n# Arguments\n- `acc_step::AbstractAccumulationStep`: The accumulation step function.\n- `initial_state`: The initial state of the accumulation.\n- `state`: The output of the `accumulate` function.\n\n# Returns\n- `state`: The combination of the initial state and the last element of\n each accumulated state.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.prefix_submodel-Tuple{AbstractModel, Function, String, Vararg{Any}}","page":"Public API","title":"EpiAware.EpiAwareUtils.prefix_submodel","text":"prefix_submodel(\n model::AbstractModel,\n fn::Function,\n prefix::String,\n kwargs...\n) -> Any\n\n\nGenerate a submodel with an optional prefix. 
A lightweight wrapper around the @submodel macro from DynamicPPL.jl.\n\nArguments\n\nmodel::AbstractModel: The model to be used.\nfn::Function: The Turing @model function to be applied to the model.\nprefix::String: The prefix to be used. If the prefix is an empty string, the submodel is created without a prefix.\n\nReturns\n\nsubmodel: The returns from the submodel are passed through.\n\nExamples\n\nusing EpiAware, DynamicPPL\n\nsubmodel = prefix_submodel(FixedIntercept(0.1), generate_latent, string(1), 2)\n\nWe can now draw a sample from the submodel.\n\nrand(submodel)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.scan-Union{Tuple{F}, Tuple{F, Any, Any}} where F<:AbstractModel","page":"Public API","title":"EpiAware.EpiAwareUtils.scan","text":"scan(f::AbstractModel, init, xs) -> Tuple{Any, Any}\n\n\nApply f to each element of xs and accumulate the results.\n\nf must be a callable on a sub-type of AbstractModel.\n\nDesign note\n\nscan is being restricted to AbstractModel sub-types to ensure: 1. That compiler specialization is activated 2. Also avoids potential compiler overhead from specialisation on f <: Function.\n\nArguments\n\nf: A callable/functor that takes two arguments, carry and x, and returns a new carry and a result y.\ninit: The initial value for the carry variable.\nxs: An iterable collection of elements.\n\nReturns\n\nys: An array containing the results of applying f to each element of xs.\ncarry: The final value of the carry variable after processing all elements of xs.\n\nExamples\n\nusing EpiAware\n\nstruct Adder <: EpiAwareBase.AbstractModel end\n\nfunction (a::Adder)(carry, x)\n    carry + x, carry + x\nend\n\nscan(Adder(), 0, 1:5)\n# output\n([1, 3, 6, 10, 15], 15)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.spread_draws-Tuple{MCMCChains.Chains}","page":"Public API","title":"EpiAware.EpiAwareUtils.spread_draws","text":"spread_draws(chn::MCMCChains.Chains) -> DataFrames.DataFrame\n\n\nspread_draws(chn::Chains)\n\nConverts a Chains object into a DataFrame in tidybayes format.\n\nArguments\n\nchn::Chains: The Chains object to be converted.\n\nReturns\n\ndf::DataFrame: The converted DataFrame.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.∫F-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareUtils.∫F","text":"∫F(dist, t, Δd) -> Any\n\n\nCalculate the CDF of the random variable X + U where X has cumulative distribution function F and U is a uniform random variable on [0, Δd).\n\nThis is used in solving for censored CDFs and PMFs using numerical quadrature.\n\n\n\n\n\n","category":"method"},{"location":"release-notes/","page":"Release notes","title":"Release notes","text":"EditURL = \"https://github.com/JuliaDocs/Documenter.jl/blob/master/CHANGELOG.md\"","category":"page"},{"location":"release-notes/#Release-notes","page":"Release notes","title":"Release notes","text":"","category":"section"},{"location":"release-notes/","page":"Release notes","title":"Release notes","text":"The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.","category":"page"},{"location":"release-notes/#Unreleased","page":"Release notes","title":"Unreleased","text":"","category":"section"},{"location":"release-notes/#Added","page":"Release notes","title":"Added","text":"","category":"section"},{"location":"release-notes/#Changed","page":"Release 
notes","title":"Changed","text":"","category":"section"},{"location":"release-notes/#Fixed","page":"Release notes","title":"Fixed","text":"","category":"section"},{"location":"getting-started/#getting-started","page":"Overview","title":"Getting started","text":"","category":"section"},{"location":"getting-started/","page":"Overview","title":"Overview","text":"Note that this section of the documentation is still under construction. Please see replications for the most up-to-date information. Please feel free to contribute to the documentation by submitting a pull request.","category":"page"},{"location":"getting-started/","page":"Overview","title":"Overview","text":"Welcome to the EpiAware documentation! This section is designed to help you get started with the package. It includes a frequently asked questions (FAQ) section, a series of explainers that provide a detailed overview of the platform and its features, and tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of topics.","category":"page"},{"location":"getting-started/explainers/observation-models/#Observation-models","page":"Observation models","title":"Observation models","text":"","category":"section"},{"location":"showcase/#showcase","page":"Overview","title":"EpiAware Showcase","text":"","category":"section"},{"location":"showcase/","page":"Overview","title":"Overview","text":"Here we showcase the capabilities of EpiAware in action. If you have a showcase you would like to add, please submit a pull request.","category":"page"}] +[{"location":"getting-started/installation/#Installation","page":"Installation","title":"Installation","text":"","category":"section"},{"location":"getting-started/installation/","page":"Installation","title":"Installation","text":"Eventually, EpiAware is likely to be added to the Julia registry. 
Until then, you can install it from the /EpiAware sub-directory of this repository by running the following command in the Julia REPL:","category":"page"},{"location":"getting-started/installation/","page":"Installation","title":"Installation","text":"using Pkg; Pkg.add(url=\"https://github.com/CDCgov/Rt-without-renewal\", subdir=\"EpiAware\")","category":"page"},{"location":"lib/EpiInfModels/#EpiInfModels.jl","page":"Overview","title":"EpiInfModels.jl","text":"","category":"section"},{"location":"lib/EpiInfModels/","page":"Overview","title":"Overview","text":"This package provides infectious disease transmission models for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiInfModels/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiInfModels/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiInfModels/public.md\", \"lib/EpiInfModels/internals.md\"]","category":"page"},{"location":"getting-started/quickstart/#Quickstart","page":"Quickstart","title":"Quickstart","text":"","category":"section"},{"location":"getting-started/quickstart/","page":"Quickstart","title":"Quickstart","text":"Get up and running with EpiAware in just a few minutes using this quickstart guide.","category":"page"},{"location":"lib/EpiLatentModels/#EpiLatentModels.jl","page":"Overview","title":"EpiLatentModels.jl","text":"","category":"section"},{"location":"lib/EpiLatentModels/","page":"Overview","title":"Overview","text":"This package provides latent variable models for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiLatentModels/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiLatentModels/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiLatentModels/public.md\", \"lib/EpiLatentModels/internals.md\"]","category":"page"},{"location":"lib/#api-reference","page":"Overview","title":"API reference","text":"","category":"section"},{"location":"lib/","page":"Overview","title":"Overview","text":"Welcome to the EpiAware API reference! This section is designed to help you understand the API of the package which is split into submodules.","category":"page"},{"location":"lib/","page":"Overview","title":"Overview","text":"The EpiAware package itself contains no functions or types. Instead, it re-exports the functions and types from its submodules. 
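For example, after using EpiAware you can refer to re-exported names such as RandomWalk, LatentDelay and PoissonError directly, as the code examples throughout this reference do, without loading EpiLatentModels or EpiObsModels yourself. 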
See the sidebar for the list of submodules.","category":"page"},{"location":"lib/EpiLatentModels/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiLatentModels.jl's internal interface.","category":"page"},{"location":"lib/EpiLatentModels/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiLatentModels/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiLatentModels/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiLatentModels/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiLatentModels]\nPublic = false","category":"page"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiLatentModels.ARStep","page":"Internal API","title":"EpiAware.EpiLatentModels.ARStep","text":"struct ARStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep\n\nThe autoregressive (AR) step function struct\n\n\n\nFields\n\ndamp_AR::AbstractVector{<:Real}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiLatentModels.ARStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiLatentModels.ARStep","text":"The autoregressive (AR) step function for use with accumulate_scan.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.broadcast_n-Tuple{RepeatBlock, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.broadcast_n","text":"broadcast_n(_::RepeatBlock, n, period) -> Any\n\n\nA function that returns the length of the latent periods to generate using the RepeatBlock rule which is equal n divided by the period and rounded up to the nearest integer.\n\nArguments\n\nrule::RepeatBlock: The broadcasting rule.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.broadcast_n-Tuple{RepeatEach, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.broadcast_n","text":"broadcast_n(_::RepeatEach, n, period) -> Any\n\n\nA function that returns the length of the latent periods to generate using the RepeatEach rule which is equal to the period.\n\nArguments\n\nrule::RepeatEach: The broadcasting rule.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\nReturns\n\nm: The length of the latent periods to generate.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{AR, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::AR, n) -> Any\n\n\nGenerate a latent AR series.\n\nArguments\n\nlatent_model::AR: The AR model.\nn::Int: The length of the AR series.\n\nReturns\n\nar::Vector{Float64}: The generated AR series.\n\nNotes\n\nThe length of damp_prior and init_prior must be the same.\nn must be longer than the order of the 
autoregressive process.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{BroadcastLatentModel, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(model::BroadcastLatentModel, n) -> Any\n\n\nGenerates latent periods using the specified model and n number of samples.\n\nArguments\n\nmodel::BroadcastLatentModel: The broadcast latent model.\nn::Any: The number of samples to generate.\n\nReturns\n\nbroadcasted_latent: The generated broadcasted latent periods.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{CombineLatentModels, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(\n latent_models::CombineLatentModels,\n n\n) -> Any\n\n\nGenerate latent variables using a combination of multiple latent models.\n\nArguments\n\nlatent_models::CombineLatentModels: An instance of the CombineLatentModels type representing the collection of latent models.\nn: The number of latent variables to generate.\n\nReturns\n\nThe combined latent variables generated from all the models.\n\nExample\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{ConcatLatentModels, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_models::ConcatLatentModels, n) -> Any\n\n\nGenerate latent variables by concatenating multiple latent models.\n\nArguments\n\nlatent_models::ConcatLatentModels: An instance of the ConcatLatentModels type representing the collection of latent models.\nn: The number of latent variables to generate.\n\nReturns\n\nconcatenated_latents: The combined latent variables generated from all the models.\nlatent_aux: A tuple containing the auxiliary latent variables generated from each individual model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{DiffLatentModel, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::DiffLatentModel, n) -> Any\n\n\nGenerate a Turing model for n-step latent process Z_t using a differenced latent model defined by latent_model.\n\nArguments\n\nlatent_model::DiffLatentModel: The differential latent model.\nn: The length of the latent variables.\n\nTuring model specifications\n\nSampled random variables\n\nlatent_init: The initial latent process variables.\nOther random variables defined by model<:AbstractTuringLatentModel field of the undifferenced model.\n\nGenerated quantities\n\nA tuple containing the generated latent process as its first argument and a NamedTuple of sampled auxiliary variables as second argument.\n\nExample usage with DiffLatentModel model constructor\n\ngenerate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.\n\nFirst, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.\n\nusing Distributions, EpiAware\nrw = RandomWalk(Normal(0.0, 1.0), truncated(Normal(0.0, 0.05), 0.0, Inf))\n\nThen, we can use DiffLatentModel to construct a DiffLatentModel for d-fold differenced process with rw as the undifferenced latent process.\n\nWe have two constructor options for DiffLatentModel. 
The first option is to supply a common prior distribution for the initial terms and specify d as follows:\n\ndiff_model = DiffLatentModel(rw, Normal(); d = 2)\n\nOr we can supply a vector of priors for the initial terms and d is inferred as follows:\n\ndiff_model2 = DiffLatentModel(;undiffmodel = rw, init_priors = [Normal(), Normal()])\n\nThen, we can use generate_latent to construct a Turing model for the differenced latent process generating a length n process,\n\n# Construct a Turing model\nn = 100\ndifference_mdl = generate_latent(diff_model, n)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved latent process.\n\n#Sample random parameters from prior\nθ = rand(difference_mdl)\n#Get a sampled latent process as a generated quantity from the model\n(Z_t, _) = generated_quantities(difference_mdl, θ)\nZ_t\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{FixedIntercept, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::FixedIntercept, n) -> Any\n\n\nGenerate a latent intercept series with a fixed intercept value.\n\nArguments\n\nlatent_model::FixedIntercept: The fixed intercept latent model.\nn: The number of latent variables to generate.\n\nReturns\n\nlatent_vars: An array of length n filled with the fixed intercept value.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{HierarchicalNormal, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(obs_model::HierarchicalNormal, n) -> Any\n\n\nfunction EpiAwareBase.generate_latent(obs_model::HierarchicalNormal, n)\n\nGenerate latent variables from the hierarchical normal distribution.\n\nArguments\n\nobs_model::HierarchicalNormal: The hierarchical normal distribution model.\nn: Number of latent variables to generate.\n\nReturns\n\nη_t: Generated latent variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{Intercept, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::Intercept, n) -> Any\n\n\nGenerate a latent intercept series.\n\nArguments\n\nlatent_model::Intercept: The intercept model.\nn::Int: The length of the intercept series.\n\nReturns\n\nintercept::Vector{Float64}: The generated intercept series.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{RandomWalk, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::RandomWalk, n) -> Any\n\n\nImplement the generate_latent function for the RandomWalk model.\n\nExample usage of generate_latent with RandomWalk type of latent process model\n\nusing Distributions, Turing, EpiAware\n\n# Create a RandomWalk model\nrw = RandomWalk(init_prior = Normal(2., 1.),\n std_prior = HalfNormal(0.1))\n\nThen, we can use generate_latent to construct a Turing model for a 10 step random walk.\n\n# Construct a Turing model\nrw_model = generate_latent(rw, 10)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n#Sample random parameters from prior\nθ = rand(rw_model)\n#Get random walk sample path as a generated quantities from the model\nZ_t, _ = generated_quantities(rw_model, 
θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/internals/#EpiAware.EpiAwareBase.generate_latent-Tuple{TransformLatentModel, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(model::TransformLatentModel, n) -> Any\n\n\ngenerate_latent(model::TransformLatentModel, n)\n\nGenerate latent variables using the specified TransformLatentModel.\n\nArguments\n\nmodel::TransformLatentModel: The TransformLatentModel to generate latent variables from.\nn: The number of latent variables to generate.\n\nReturns\n\nThe transformed latent variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Documentation for EpiObsModels.jl's public interface.","category":"page"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiObsModels/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiObsModels/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiObsModels/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiObsModels/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiObsModels]\nPrivate = false","category":"page"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels","page":"Public API","title":"EpiAware.EpiObsModels","text":"Module for defining observation models.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.Aggregate","page":"Public API","title":"EpiAware.EpiObsModels.Aggregate","text":"struct Aggregate{M<:AbstractTuringObservationModel, I<:(AbstractVector{<:Int64}), J<:(AbstractVector{<:Bool})} <: AbstractTuringObservationModel\n\nAggregates observations over a specified time period. For efficiency it also only passes the aggregated observations to the submodel. 
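For example, with a weekly aggregation vector such as [0, 0, 0, 0, 7, 0, 0] (as in the example below), the seven daily observations of each week are aggregated into a single count and only that aggregated value is scored by the submodel. 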
The aggregation vector is internally broadcasted to the length of the observations and the present vector is broadcasted to the length of the aggregation vector using broadcast_n.\n\nFields\n\nmodel::AbstractTuringObservationModel: The submodel to use for the aggregated observations.\naggregation::AbstractVector{<: Int}: The number of time periods to aggregate over.\npresent::AbstractVector{<: Bool}: A vector of booleans indicating whether the observation is present or not.\n\nConstructors\n\nAggregate(model, aggregation): Constructs an Aggregate object and automatically sets the present field.\nAggregate(; model, aggregation): Constructs an Aggregate object and automatically sets the present field using named keyword arguments.\n\nExamples\n\nusing EpiAware\nweekly_agg = Aggregate(PoissonError(), [0, 0, 0, 0, 7, 0, 0])\ngen_obs = generate_observations(weekly_agg, missing, fill(1, 28))\ngen_obs()\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel\naggregation::AbstractVector{<:Int64}\npresent::AbstractVector{<:Bool}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.Ascertainment","page":"Public API","title":"EpiAware.EpiObsModels.Ascertainment","text":"struct Ascertainment{M<:AbstractTuringObservationModel, T<:AbstractTuringLatentModel, F<:Function, P<:String} <: AbstractTuringObservationModel\n\nThe Ascertainment struct represents an observation model that incorporates an ascertainment model. If a latent_prefix is supplied, the latent_model is wrapped in a call to PrefixLatentModel.\n\nConstructors\n\nAscertainment(model::M, latent_model::T, transform::F, latent_prefix::P) where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, transform function, and latent prefix.\nAscertainment(; model::M, latent_model::T, transform::F = (Y_t, x) -> xexpy.(Y_t, x), latent_prefix::P = \"Ascertainment\") where {M <: AbstractTuringObservationModel, T <: AbstractTuringLatentModel, F <: Function, P <: String}: Constructs an Ascertainment instance with the specified observation model, latent model, optional transform function (default: (Y_t, x) -> xexpy.(Y_t, x)), and optional latent prefix (default: \"Ascertainment\").\n\nExamples\n\nusing EpiAware, Turing\nobs = Ascertainment(model = NegativeBinomialError(), latent_model = FixedIntercept(0.1))\ngen_obs = generate_observations(obs, missing, fill(100, 10))\nrand(gen_obs)\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel: The underlying observation model.\nlatent_model::AbstractTuringLatentModel: The latent model.\ntransform::Function: The function used to transform Y_t and the latent model output.\nlatent_prefix::String\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.LatentDelay","page":"Public API","title":"EpiAware.EpiObsModels.LatentDelay","text":"struct LatentDelay{M<:AbstractTuringObservationModel, T<:(AbstractVector{<:Real})} <: AbstractTuringObservationModel\n\nThe LatentDelay struct represents an observation model that introduces a latent delay in the observations. It is a subtype of AbstractTuringObservationModel.\n\nNote that the LatentDelay observation model shortens the expected observation vector by the length of the delay distribution and this is then passed to the underlying observation model. 
This is to prevent fitting to partially observed data.\n\nFields\n\nmodel::M: The underlying observation model.\nrev_pmf::T: The probability mass function (PMF) representing the delay distribution reversed.\n\nConstructors\n\nLatentDelay(model::M, distribution::C; D = nothing, Δd = 1.0) where {M <: AbstractTuringObservationModel, C <: ContinuousDistribution}: Constructs a LatentDelay object with the given underlying observation model and continuous distribution. The D parameter specifies the right truncation of the distribution, with default D = nothing indicates that the distribution should be truncated at its 99th percentile rounded to nearest multiple of Δd. The Δd parameter specifies the width of each delay interval.\nLatentDelay(model::M, pmf::T) where {M <: AbstractTuringObservationModel, T <: AbstractVector{<:Real}}: Constructs a LatentDelay object with the given underlying observation model and delay PMF.\n\nExamples\n\nusing Distributions, Turing, EpiAware\nobs = LatentDelay(NegativeBinomialError(), truncated(Normal(5.0, 2.0), 0.0, Inf))\nobs_model = generate_observations(obs, missing, fill(10, 30))\nobs_model()\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel\nrev_pmf::AbstractVector{<:Real}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.NegativeBinomialError","page":"Public API","title":"EpiAware.EpiObsModels.NegativeBinomialError","text":"struct NegativeBinomialError{S<:Distributions.Sampleable} <: AbstractTuringObservationErrorModel\n\nThe NegativeBinomialError struct represents an observation model for negative binomial errors. It is a subtype of AbstractTuringObservationModel.\n\nConstructors\n\nNegativeBinomialError(; cluster_factor_prior::Distribution = HalfNormal(0.1)): Constructs a NegativeBinomialError object with default values for the cluster factor prior.\nNegativeBinomialError(cluster_factor_prior::Distribution): Constructs a NegativeBinomialError object with a specified cluster factor prior.\n\nExamples\n\nusing Distributions, Turing, EpiAware\nnb = NegativeBinomialError()\nnb_model = generate_observations(nb, missing, fill(10, 10))\nrand(nb_model)\n\n\n\nFields\n\ncluster_factor_prior::Distributions.Sampleable: The prior distribution for the cluster factor.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.PoissonError","page":"Public API","title":"EpiAware.EpiObsModels.PoissonError","text":"struct PoissonError <: AbstractTuringObservationErrorModel\n\nThe PoissonError struct represents an observation model for Poisson errors. It is a subtype of AbstractTuringObservationErrorModel.\n\nConstructors\n\nPoissonError(): Constructs a PoissonError object.\n\nExamples\n\nusing Distributions, Turing, EpiAware\npoi = PoissonError()\npoi_model = generate_observations(poi, missing, fill(10, 10))\nrand(poi_model)\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.PrefixObservationModel","page":"Public API","title":"EpiAware.EpiObsModels.PrefixObservationModel","text":"struct PrefixObservationModel{M<:AbstractTuringObservationModel, P<:String} <: AbstractTuringObservationModel\n\nGenerate an observation model with a prefix. 
A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.\n\n# Constructors\n- `PrefixObservationModel(model::M, prefix::P)`: Create a `PrefixObservationModel` with the observation model `model` and the prefix `prefix`.\n- `PrefixObservationModel(; model::M, prefix::P)`: Create a `PrefixObservationModel` with the observation model `model` and the prefix `prefix`.\n\n# Examples\n```julia\nusing EpiAware\nobservation_model = PrefixObservationModel(PoissonError(), \"Test\")\nobs = generate_observations(observation_model, missing, fill(10, 10))\nrand(obs)\n```\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel: The observation model\nprefix::String: The prefix for the observation model\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.RecordExpectedObs","page":"Public API","title":"EpiAware.EpiObsModels.RecordExpectedObs","text":"struct RecordExpectedObs{M<:AbstractTuringObservationModel} <: AbstractTuringObservationModel\n\nRecord a variable (using the Turing := syntax) in the observation model.\n\n# Fields\n- `model::AbstractTuringObservationModel`: The observation model to dispatch to.\n\n# Constructors\n\n- `RecordExpectedObs(model::AbstractTuringObservationModel)`: Record the expected observation from the model as `exp_y_t`.\n\n# Examples\n\n```julia\nusing EpiAware, Turing\nmdl = RecordExpectedObs(NegativeBinomialError())\ngen_obs = generate_observations(mdl, missing, fill(100, 10))\nsample(gen_obs, Prior(), 10)\n```\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.StackObservationModels","page":"Public API","title":"EpiAware.EpiObsModels.StackObservationModels","text":"struct StackObservationModels{M<:(AbstractVector{<:AbstractTuringObservationModel}), N<:(AbstractVector{<:AbstractString})} <: AbstractTuringObservationModel\n\nA stack of observation models that are looped over to generate observations for each model in the stack. Note that the model names are used to prefix the parameters in each model (so if there is a model named cases with a parameter y_t, the parameter in the model will be cases.y_t). Inside the constructor PrefixObservationModel is wrapped around each observation model.\n\nConstructors\n\nStackObservationModels(models::Vector{<:AbstractTuringObservationModel}, model_names::Vector{<:AbstractString}): Construct a StackObservationModels object with a vector of observation models and a vector of model names.\nStackObservationModels(; models::Vector{<:AbstractTuringObservationModel}, model_names::Vector{<:AbstractString}): Construct a StackObservationModels object with a vector of observation models and a vector of model names.\nStackObservationModels(models::NamedTuple{names, T}): Construct a StackObservationModels object with a named tuple of observation models. 
The model names are automatically generated from the keys of the named tuple.\n\nExample\n\nusing EpiAware, Turing\n\nobs = StackObservationModels(\n (cases = PoissonError(), deaths = NegativeBinomialError())\n)\ny_t = (cases = missing, deaths = missing)\nobs_model = generate_observations(obs, y_t, fill(10, 10))\nrand(obs_model)\nsamples = sample(obs_model, Prior(), 100; progress = false)\n\ncases_y_t = group(samples, \"cases.y_t\")\ncases_y_t\n\ndeaths_y_t = group(samples, \"deaths.y_t\")\ndeaths_y_t\n\n\n\nFields\n\nmodels::AbstractVector{<:AbstractTuringObservationModel}: A vector of observation models.\nmodel_names::AbstractVector{<:AbstractString}: A vector of observation model names\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.TransformObservationModel","page":"Public API","title":"EpiAware.EpiObsModels.TransformObservationModel","text":"struct TransformObservationModel{M<:AbstractTuringObservationModel, F<:Function} <: AbstractTuringObservationModel\n\nThe TransformObservationModel struct represents an observation model that applies a transformation function to the expected observations before passing them to the underlying observation model.\n\nFields\n\nmodel::M: The underlying observation model.\ntransform::F: The transformation function applied to the expected observations.\n\nConstructors\n\nTransformObservationModel(model::M, transform::F = x -> log1pexp.(x)) where {M <: AbstractTuringObservationModel, F <: Function}: Constructs a TransformObservationModel instance with the specified observation model and a default transformation function.\nTransformObservationModel(; model::M, transform::F = x -> log1pexp.(x)) where {M <: AbstractTuringObservationModel, F <: Function}: Constructs a TransformObservationModel instance using named arguments.\nTransformObservationModel(model::M; transform::F = x -> log1pexp.(x)) where {M <: AbstractTuringObservationModel, F <: Function}: Constructs a TransformObservationModel instance with the specified observation model and a default transformation function.\n\nExample\n\nusing EpiAware, Distributions, LogExpFunctions\n\ntrans_obs = TransformObservationModel(NegativeBinomialError())\ngen_obs = generate_observations(trans_obs, missing, fill(10.0, 30))\ngen_obs()\n\n\n\nFields\n\nmodel::AbstractTuringObservationModel: The underlying observation model.\ntransform::Function: The transformation function. The default is log1pexp which is the softplus transformation\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.ascertainment_dayofweek-Tuple{AbstractTuringObservationModel}","page":"Public API","title":"EpiAware.EpiObsModels.ascertainment_dayofweek","text":"ascertainment_dayofweek(\n model::AbstractTuringObservationModel;\n latent_model,\n transform,\n latent_prefix\n) -> Ascertainment{M, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#18#20\", String} where M<:AbstractTuringObservationModel\n\n\nCreate an Ascertainment object that models the ascertainment process based on the day of the week.\n\nArguments\n\nmodel::AbstractTuringObservationModel: The observation model to be used.\nlatent_model::AbstractTuringLatentModel: The latent model to be used. Default is HierarchicalNormal() which is a hierarchical normal distribution.\ntransform: The transform function to be used. 
Default is (x, y) -> x .* y.\n\nThis function is used to transform the latent model after broadcasting to periodic weekly has been applied.\n\nlatent_prefix: The prefix to be used for the latent model. Default is \"DayofWeek\".\n\nReturns\n\nAscertainment: The Ascertainment object that models the ascertainment process based on the day of the week.\n\nExamples\n\nusing EpiAware\nobs = ascertainment_dayofweek(PoissonError())\ngen_obs = generate_observations(obs, missing, fill(100, 14))\ngen_obs()\nrand(gen_obs)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.generate_observation_error_priors-Tuple{AbstractTuringObservationErrorModel, Any, Any}","page":"Public API","title":"EpiAware.EpiObsModels.generate_observation_error_priors","text":"generate_observation_error_priors(\n obs_model::AbstractTuringObservationErrorModel,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates priors for the observation error model. This should return a named tuple containing the priors required for generating the observation error distribution.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.generate_observation_error_priors-Tuple{NegativeBinomialError, Any, Any}","page":"Public API","title":"EpiAware.EpiObsModels.generate_observation_error_priors","text":"generate_observation_error_priors(\n obs_model::NegativeBinomialError,\n Y_t,\n y_t\n) -> Any\n\n\nGenerates observation error priors based on the NegativeBinomialError observation model. This function generates the cluster factor prior for the negative binomial error model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.observation_error-Tuple{AbstractTuringObservationErrorModel, Any}","page":"Public API","title":"EpiAware.EpiObsModels.observation_error","text":"observation_error(\n obs_model::AbstractTuringObservationErrorModel,\n Y_t\n) -> SafePoisson\n\n\nThe observation error distribution for the observation error model. This function should return the distribution for the observation error given the expected observation value Y_t and the priors generated by generate_observation_error_priors.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.observation_error-Tuple{NegativeBinomialError, Any, Any}","page":"Public API","title":"EpiAware.EpiObsModels.observation_error","text":"observation_error(\n obs_model::NegativeBinomialError,\n Y_t,\n sq_cluster_factor\n) -> SafeNegativeBinomial\n\n\nThis function generates the observation error model based on the negative binomial error model with a positive shift. It dispatches to the NegativeBinomialMeanClust distribution.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/public/#EpiAware.EpiObsModels.observation_error-Tuple{PoissonError, Any}","page":"Public API","title":"EpiAware.EpiObsModels.observation_error","text":"observation_error(\n obs_model::PoissonError,\n Y_t\n) -> SafePoisson\n\n\nThe observation error model for Poisson errors. This function generates the observation error model based on the Poisson error model.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/julia/#Julia-for-EpiAware","page":"Working with Julia","title":"Julia for EpiAware","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia is a programming language aimed at technical computing. 
This guide is aimed at helping you set up Julia on your system and pointing towards resources for learning more.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"[!NOTE] If you are familiar with other languages with tooling for technical computing (e.g. R, MATLAB, Python), these noteworthy differences may be useful.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Pages = [\"getting-started/tutorials/julia.md\"]\nDepth = 3","category":"page"},{"location":"getting-started/explainers/julia/#What-this-guide-is-and-isn't","page":"Working with Julia","title":"What this guide is and isn't","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This isn't a guide to learning the Julia programming language. Instead, we provide an opinionated guide to setting up your system to use Julia effectively in project workflows, aimed at people who have some familiarity with Julia but have mainly developed projects in other languages (e.g. R, MATLAB, Python).","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"If you want to learn more about the Julia programming language, we recommend the following resources:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia Documentation - getting started.\nJulia Academy.\nJulia learning resources.\nJuliaHub.\nJulia Discourse.\nJulia Slack.","category":"page"},{"location":"getting-started/explainers/julia/#Julia-Installation-with-Juliaup","page":"Working with Julia","title":"Julia Installation with Juliaup","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Download Juliaup: This is a cross-platform installer/updater for the Julia programming language. It simplifies the process of installing and managing Julia versions. Go to the Juliaup GitHub repository or to the official Julia website for installation instructions.\nVerify Installation: Open a terminal (or Command Prompt on Windows) and type julia to start the Julia REPL (Read-Eval-Print Loop). You should see a Julia prompt julia>.","category":"page"},{"location":"getting-started/explainers/julia/#Basic-usage-of-Juliaup","page":"Working with Julia","title":"Basic usage of Juliaup","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Juliaup is a tool for managing Julia installations on your system. It allows you to install, update, and switch between different versions of Julia. Details are available at the Juliaup GitHub repository, but here are some examples of common commands:","category":"page"},{"location":"getting-started/explainers/julia/#Add-a-specific-version-of-Julia","page":"Working with Julia","title":"Add a specific version of Julia","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"By default, Juliaup installs the latest release version of Julia. To install a specific version, use the add command followed by the version number. 
For example, to install Julia version 1.9.3, use the following command:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% juliaup add 1.9.3","category":"page"},{"location":"getting-started/explainers/julia/#Use-a-specific-version-of-Julia","page":"Working with Julia","title":"Use a specific version of Julia","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"To switch between different versions of Julia, use + julia-version after the julia command. For example, to use Julia version 1.9.3, use the following command:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% julia +1.9.3","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This will use the specified version of Julia for the current REPL. In general, adding the + julia-version flag after the julia command will execute using the specified version of Julia.","category":"page"},{"location":"getting-started/explainers/julia/#Check-versions-of-Julia-installed","page":"Working with Julia","title":"Check versions of Julia installed","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"To see a list of all the versions of Julia installed on your system, use the following command:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% juliaup list","category":"page"},{"location":"getting-started/explainers/julia/#Update-Julia-(all-versions-installed)","page":"Working with Julia","title":"Update Julia (all versions installed)","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This will update all versions of Julia installed on your system to their latest release versions.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"% juliaup update","category":"page"},{"location":"getting-started/explainers/julia/#Usage-of-Julia-environments","page":"Working with Julia","title":"Usage of Julia environments","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The environment of a Julia project determines which packages, and their version, are available to the project. This is useful when you want to ensure that a project uses a specific version of a package, or when you want to isolate the project from other projects on your system. As per other languages, Julia environments are useful for managing dependencies and ensuring reproducibility.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The most common usage of environments is to create a new explicit environment for a project in a directory. This creates a Project.toml file in the directory that specifies the dependencies for the project and a Manifest.toml file that specifies the exact versions of the dependencies, and their underlying dependencies. 
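As a quick check of which environment is currently active and what it records, a minimal sketch using the built-in Pkg API (run from the REPL or a script) is:\n\nusing Pkg\nBase.active_project()  # path to the Project.toml of the active environment\nPkg.status()           # list the packages recorded in the active environment\n\n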
We'll discuss how to set up a new environment for a project in the REPL section.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia environments can be stacked. This means that you can have a primary environment embedded in the stacked environment, along with secondary environment(s) that define common packages to be available to many projects.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"From a project development point of view, most commonly the project environment will be the primary environment, isolated from other project environments. The environment of the Julia version installation (e.g. the @v1.10 env) will be a secondary environment because it's in the default LOAD_PATH Julia environment variable. You can add packages that you want to be available to all projects to the Julia version environment, as we'll show in the REPL section. See section Recommended packages for the primary Julia environment for our recommendations.","category":"page"},{"location":"getting-started/explainers/julia/#Using-the-Julia-REPL-in-projects","page":"Working with Julia","title":"Using the Julia REPL in projects","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The Julia REPL (Read-Eval-Print Loop) is an interactive programming environment that takes single user inputs (i.e., single expressions), evaluates them, and returns the result to the user.","category":"page"},{"location":"getting-started/explainers/julia/#Package-management-programmatically-and-from-REPL","page":"Working with Julia","title":"Package management programmatically and from REPL","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia has a built-in package manager called Pkg, which is documented briefly here and in more detail here. The package manager is used to install, update, and manage Julia packages and environments.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"You can use Pkg programmatically as a normal Julia package, which is often done in scripts. For example, if we wanted to install the OrdinaryDiffEq package as part of executing a julia script, we would add the following lines to the script:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"using Pkg\nPkg.add(\"OrdinaryDiffEq\")","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"However, you can also use the package manager interactively from the REPL. In our opinion, this is the more common usage of package management in Julia project development.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"For example, to install the OrdinaryDiffEq package from the REPL, you can switch to package mode by typing ] and then type add OrdinaryDiffEq. 
To exit package mode, type backspace.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"julia> ]\n(@v1.10) pkg> add OrdinaryDiffEq","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This workflow is often more convenient than the programmatic interface, especially when adding packages that you want to install to the environment of your julia installation, e.g. the @v1.10 environment for julia 1.10.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"By default, the environment for a julia installation is part of the default environment stack, so that the packages you install in the julia installation environment are available to all projects.","category":"page"},{"location":"getting-started/explainers/julia/#Using-the-Julia-REPL-to-set-up-active-project-environments","page":"Working with Julia","title":"Using the Julia REPL to set up active project environments","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"To set a new active project environment, you can use the Pkg package manager from the REPL by calling the activate command with a local directory path. The project environment is named after the directory hosting the Project.toml file. After activating the project environment, you can add and manage packages in the project environment, as well as use packages from the other stacked environments as described above.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Here is an example of how you can create a new environment for a project when the REPL working directory is in some directory /myproject, and then add OrdinaryDiffEq to the project environment:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"julia> pwd() #Check your directory\n# \"path/to/myproject\"\njulia> ]\n(@v1.10) pkg> activate .\n(myproject) pkg> add OrdinaryDiffEq","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Note that if the project directory doesn't have a Project.toml file, one will be created when you add the first package to the project environment.","category":"page"},{"location":"getting-started/explainers/julia/#Experimenting-with-Julia-from-REPL-using-a-temporary-environment","page":"Working with Julia","title":"Experimenting with Julia from REPL using a temporary environment","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"It is quite common to want to experiment with new Julia packages and code snippets. A convenient way to do this without setting up a new project environment or adding dependencies to the primary environment is to use a temporary environment. 
To do this:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"julia> ]\n(@v1.10) pkg> activate --temp\n(jl_FTIz6j) pkg> add InterestingPackage","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"This will create a temporary environment, stacked with the primary environment, that is not saved to disk, and you can add packages to this environment without affecting the primary environment or any project environments. When you exit the REPL, the temporary environment will be deleted.","category":"page"},{"location":"getting-started/explainers/julia/#Recommended-packages-for-the-\"global\"-Julia-version-environment","page":"Working with Julia","title":"Recommended packages for the \"global\" Julia version environment","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"In our view, these packages are useful to have in your Julia version environment (e.g. the @v1.10 env), where they will be available to other environments.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Revise: For modifying package code and using the changes without restarting the Julia session.\nTerm: For pretty and stylized REPL output (including error messages).\nJuliaFormatter: For code formatting.\nDocumenter: For local documentation generation.\nPluto: A native Julia notebook for interactive development.\nTestEnv: For easy use of test environments for package testing.\nUnicodePlots: For simple and quick plotting in the REPL without needing to install a fully featured plotting package.","category":"page"},{"location":"getting-started/explainers/julia/#startup.jl-recommendation","page":"Working with Julia","title":"startup.jl recommendation","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Revise and Term are useful to have available in every Julia session. It is convenient to have these packages loaded automatically when you start a Julia session by adding a startup.jl file. This file should be located in the ~/.julia/config directory. Here is an example of a startup.jl file that loads Revise and Term:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"atreplinit() do repl\n # Load Revise if it is installed\n try\n @eval using Revise\n catch e\n @warn \"error while importing Revise\" e\n end\n # Load Term if it is installed\n try\n @eval using Term\n @eval install_term_repr()\n @eval install_term_stacktrace()\n catch e\n @warn \"error while importing Term\" e\n end\nend\n","category":"page"},{"location":"getting-started/explainers/julia/#Developing-a-EpiAware-project-from-VS-Code","page":"Working with Julia","title":"Developing a EpiAware-project from VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/#Julia-extension-for-VS-Code","page":"Working with Julia","title":"Julia extension for VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Visual Studio Code (VS-Code) is a popular code editor that supports Julia development. 
The Julia extension for VS-Code provides an interactive development environment that will be familiar to users of other scientific IDEs (e.g. developing R projects in RStudio or using the MATLAB application).","category":"page"},{"location":"getting-started/explainers/julia/#Features-of-the-Julia-extension-for-VS-Code","page":"Working with Julia","title":"Features of the Julia extension for VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"It is worth reading both the VS-Code documentation and the Julia extension documentation, however, here are some highlights:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Julia REPL: The Julia extension provides an integrated REPL in the TERMINAL pane that allows you to interact with Julia code directly from the editor. For example, you can run code snippets from highlighting or code blocks defined by ## comments in the scripts.\nPlotting: By default, plots generated by featured plotting packages (e.g. Plots.jl) will be displayed in a Plot pane generated by the VS-Code editor.\nJulia Tab: The Julia extension provides a Julia tab with the following sub-tabs:\nWorkspace: This allows you to inspect the modules, functions and variables in your current REPL session. For variables that can be understood as a Table, you can view them in a tabular format from the workspace tab.\nDocumentation: This allows you to view the documentation for functions and types in the Julia standard library and any packages you have installed.\nPlot Navigator: This allows you to navigate the plots generated by the featured plotting packages.\nTesting: The Julia extension provides interaction between the Testing tab in VS-Code with Julia tests defined using the Julia package TestItems macro @testitem run with TestItemRunner.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Other standard IDE features are Code completion, Code linting, Code formatting, Debugging, and Profiling.","category":"page"},{"location":"getting-started/explainers/julia/#Recommended-settings-for-the-Julia-extension-in-VS-Code","page":"Working with Julia","title":"Recommended settings for the Julia extension in VS-Code","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The settings of the Julia extension can be found by accessing Preferences: Open User Settings from the command palette in VS-Code and then searching for Julia.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"We recommend the following workplace settings saved in a file .vscode/settings.json relative to your working directory:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"{\n \"[julia]\": {\n \"editor.detectIndentation\": false,\n \"editor.insertSpaces\": true,\n \"editor.tabSize\": 4,\n \"files.insertFinalNewline\": true,\n \"files.trimFinalNewlines\": true,\n \"files.trimTrailingWhitespace\": true,\n \"editor.rulers\": [80],\n \"files.eol\": \"\\n\"\n },\n \"julia.liveTestFile\": \"path/to/runtests.jl\",\n \"julia.environmentPath\": 
\"path/to/project/directory\",\n}","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"These settings set basic code formatting and whitespace settings for Julia files, as well as setting the path to the test file for the project and the path to the project directory for the environment.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"The VS-Code command Julia: Start REPL will start a REPL in TERMINAL tab in the editor with the environment set to the project directory and the Testing tab will detect the defined tests for the project.","category":"page"},{"location":"getting-started/explainers/julia/#Literate-programming-with-Julia-in-EpiAware","page":"Working with Julia","title":"Literate programming with Julia in EpiAware","text":"","category":"section"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Its common to develop technical computing projects using a literate programming style, where code and documentation are interwoven. Julia supports this style of programming through a number of packages. In EpiAware we recommend the following:","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"Pluto: A native Julia notebook for interactive development. Pluto notebooks are reactive, meaning that the output of all cells are updated as input changes. Installation instructions are available here. Pluto notebook files have the extension .jl and can be run as scripts.\nQuarto: A literate programming tool that allows you to write documents in markdown with embedded Julia code. Installation instructions are available here. Quarto files have the extension .qmd.","category":"page"},{"location":"getting-started/explainers/julia/","page":"Working with Julia","title":"Working with Julia","text":"We use Pluto for interactive development and Quarto for generating reports and academic articles. 
Both tools are useful for developing reproducible workflows.","category":"page"},{"location":"getting-started/explainers/inference/#Inference","page":"Inference","title":"Inference","text":"","category":"section"},{"location":"lib/EpiInference/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Documentation for EpiInference.jl's public interface.","category":"page"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiInference/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInference/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiInference/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiInference/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiInference]\nPrivate = false","category":"page"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference","page":"Public API","title":"EpiAware.EpiInference","text":"Module for defining inference methods.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference.ManyPathfinder","page":"Public API","title":"EpiAware.EpiInference.ManyPathfinder","text":"struct ManyPathfinder <: AbstractEpiOptMethod\n\nA variational inference method that runs manypathfinder.\n\n\n\nFields\n\nndraws::Int64: Number of draws per pathfinder run.\nnruns::Int64: Number of many pathfinder runs.\nmaxiters::Int64: Maximum number of optimization iterations for each run.\nmax_tries::Int64: Maximum number of tries if all runs fail.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference.NUTSampler","page":"Public API","title":"EpiAware.EpiInference.NUTSampler","text":"struct NUTSampler{A<:ADTypes.AbstractADType, E<:AbstractMCMC.AbstractMCMCEnsemble, M} <: AbstractEpiSamplingMethod\n\nA NUTS method for sampling from a DynamicPPL.Model object.\n\nThe NUTSampler struct represents using the No-U-Turn Sampler (NUTS) to sample from the distribution defined by a DynamicPPL.Model.\n\n\n\nFields\n\ntarget_acceptance::Float64: The target acceptance rate for the sampler.\nadtype::ADTypes.AbstractADType: The automatic differentiation type used for computing gradients.\nmcmc_parallel::AbstractMCMC.AbstractMCMCEnsemble: The parallelization strategy for the MCMC sampler.\nnchains::Int64: The number of MCMC chains to run.\nmax_depth::Int64: Tree depth limit for the NUTS sampler.\nΔ_max::Float64: Divergence threshold for the NUTS sampler.\ninit_ϵ::Float64: The initial step size for the NUTS sampler.\nndraws::Int64: The number of samples to draw from each chain.\nmetricT::Any: The metric type to use for the HMC sampler.\nnadapts::Int64: number of adaptation steps\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInference/public/#EpiAware.EpiInference.manypathfinder-Tuple{DynamicPPL.Model, Any}","page":"Public 
API","title":"EpiAware.EpiInference.manypathfinder","text":"manypathfinder(\n mdl::DynamicPPL.Model,\n ndraws;\n nruns,\n maxiters,\n max_tries,\n kwargs...\n) -> Any\n\n\nRun multiple instances of the pathfinder algorithm and returns the pathfinder run with the largest ELBO estimate.\n\nArguments\n\nmdl::DynamicPPL.Model: The model to perform inference on.\nnruns::Int: The number of pathfinder runs to perform.\nndraws::Int: The number of draws per pathfinder run, readjusted to be at least as large as the number of chains.\nnchains::Int: The number of chains that will be initialised by pathfinder draws.\nmaxiters::Int: The maximum number of optimizer iterations per pathfinder run.\nmax_tries::Int: The maximum number of extra tries to find a valid pathfinder result.\nkwargs...: Additional keyword arguments passed to pathfinder.\n\nReturns\n\nbest_pfs::PathfinderResult: Best pathfinder result by estimated ELBO.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Documentation for EpiLatentModels.jl's public interface.","category":"page"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiLatentModels/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiLatentModels/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiLatentModels/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiLatentModels/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiLatentModels]\nPrivate = false","category":"page"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels","page":"Public API","title":"EpiAware.EpiLatentModels","text":"Module for defining latent models.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.AR","page":"Public API","title":"EpiAware.EpiLatentModels.AR","text":"struct AR{D<:Distributions.Sampleable, S<:Distributions.Sampleable, I<:Distributions.Sampleable, P<:Int64} <: AbstractTuringLatentModel\n\nThe autoregressive (AR) model struct.\n\nConstructors\n\nAR(damp_prior::Distribution, std_prior::Distribution, init_prior::Distribution; p::Int = 1): Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model can also be specified.\nAR(; damp_priors::Vector{D} = [truncated(Normal(0.0, 0.05))], std_prior::Distribution = truncated(Normal(0.0, 0.05), 0.0, Inf), init_priors::Vector{I} = [Normal()]) where {D <: Distribution, I <: Distribution}: Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. 
The order of the AR model is determined by the length of the damp_priors vector.\nAR(damp_prior::Distribution, std_prior::Distribution, init_prior::Distribution, p::Int): Constructs an AR model with the specified prior distributions for damping coefficients, standard deviation, and initial conditions. The order of the AR model is explicitly specified.\n\nExamples\n\nusing Distributions\nusing EpiAware\nar = AR()\nar_model = generate_latent(ar, 10)\nrand(ar_model)\n\n\n\nFields\n\ndamp_prior::Distributions.Sampleable: Prior distribution for the damping coefficients.\nstd_prior::Distributions.Sampleable: Prior distribution for the standard deviation.\ninit_prior::Distributions.Sampleable: Prior distribution for the initial conditions\np::Int64: Order of the AR model.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.BroadcastLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.BroadcastLatentModel","text":"struct BroadcastLatentModel{M<:AbstractTuringLatentModel, P<:Integer, B<:AbstractBroadcastRule} <: AbstractTuringLatentModel\n\nThe BroadcastLatentModel struct represents a latent model that supports broadcasting of latent periods.\n\nConstructors\n\nBroadcastLatentModel(;model::M; period::Int, broadcast_rule::B): Constructs a BroadcastLatentModel with the given model, period, and broadcast_rule.\nBroadcastLatentModel(model::M, period::Int, broadcast_rule::B): An alternative constructor that allows the model, period, and broadcast_rule to be specified without keyword arguments.\n\nExamples\n\nusing EpiAware, Turing\neach_model = BroadcastLatentModel(RandomWalk(), 7, RepeatEach())\ngen_each_model = generate_latent(each_model, 10)\nrand(gen_each_model)\n\nblock_model = BroadcastLatentModel(RandomWalk(), 3, RepeatBlock())\ngen_block_model = generate_latent(block_model, 10)\nrand(gen_block_model)\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: The underlying latent model.\nperiod::Integer: The period of the broadcast.\nbroadcast_rule::AbstractBroadcastRule: The broadcast rule to be applied.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.CombineLatentModels","page":"Public API","title":"EpiAware.EpiLatentModels.CombineLatentModels","text":"struct CombineLatentModels{M<:(AbstractVector{<:AbstractTuringLatentModel}), P<:(AbstractVector{<:String})} <: AbstractTuringLatentModel\n\nThe CombineLatentModels struct.\n\nThis struct is used to combine multiple latent models into a single latent model. If a prefix is supplied wraps each model with PrefixLatentModel.\n\nConstructors\n\nCombineLatentModels(models::M, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{<:String}}: Constructs a CombineLatentModels instance with specified models and prefixes, ensuring that there are at least two models and the number of models and prefixes are equal.\nCombineLatentModels(models::M) where {M <: AbstractVector{<:AbstractTuringLatentModel}}: Constructs a CombineLatentModels instance with specified models, automatically generating prefixes for each model. 
The automatic prefixes are of the form Combine.1, Combine.2, etc.\n\nExamples\n\nusing EpiAware, Distributions\ncombined_model = CombineLatentModels([Intercept(Normal(2, 0.2)), AR()])\nlatent_model = generate_latent(combined_model, 10)\nlatent_model()\n\n\n\nFields\n\nmodels::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models\nprefixes::AbstractVector{<:String}: A vector of prefixes for the latent models\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.ConcatLatentModels","page":"Public API","title":"EpiAware.EpiLatentModels.ConcatLatentModels","text":"struct ConcatLatentModels{M<:(AbstractVector{<:AbstractTuringLatentModel}), N<:Int64, F<:Function, P<:(AbstractVector{<:String})} <: AbstractTuringLatentModel\n\nThe ConcatLatentModels struct.\n\nThis struct is used to concatenate multiple latent models into a single latent model.\n\nConstructors\n\nConcatLatentModels(models::M, no_models::I, dimension_adaptor::F, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, I <: Int, F <: Function, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, number of models, dimension adaptor, and prefixes.\nConcatLatentModels(models::M, dimension_adaptor::F; prefixes::P = \"Concat.\" * string.(1:length(models))) where {M <: AbstractVector{<:AbstractTuringLatentModel}, F <: Function}: Constructs a ConcatLatentModels instance with specified models and dimension adaptor. The number of models is automatically determined, as are the prefixes (of the form Concat.1, Concat.2, etc.) by default.\nConcatLatentModels(models::M; dimension_adaptor::Function, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, dimension adaptor, prefixes, and automatically determines the number of models. The default dimension adaptor is equal_dimensions. The default prefixes are of the form Concat.1, Concat.2, etc.\nConcatLatentModels(; models::M, dimension_adaptor::Function, prefixes::P) where {M <: AbstractVector{<:AbstractTuringLatentModel}, P <: AbstractVector{String}}: Constructs a ConcatLatentModels instance with specified models, dimension adaptor, prefixes, and automatically determines the number of models. The default dimension adaptor is equal_dimensions. The default prefixes are of the form Concat.1, Concat.2, etc.\n\nExamples\n\nusing EpiAware, Distributions\ncombined_model = ConcatLatentModels([Intercept(Normal(2, 0.2)), AR()])\nlatent_model = generate_latent(combined_model, 10)\nlatent_model()\n\n\n\nFields\n\nmodels::AbstractVector{<:AbstractTuringLatentModel}: A vector of latent models\nno_models::Int64: The number of models in the collection\ndimension_adaptor::Function: The dimension function for the latent variables. 
By default, this divides the number of latent variables by the number of models and returns a vector of dimensions, rounding up the first element and rounding down the rest.\nprefixes::AbstractVector{<:String}: A vector of prefixes for the latent models\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.DiffLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.DiffLatentModel","text":"struct DiffLatentModel{M<:AbstractTuringLatentModel, P<:Distributions.Distribution} <: AbstractTuringLatentModel\n\nModel the latent process as a d-fold differenced version of another process.\n\nMathematical specification\n\nLet $\Delta$ be the differencing operator. If $\tilde{Z}_t$ is a realisation of the undifferenced latent model supplied to DiffLatentModel, then the differenced process is given by,\n\n$\Delta^{(d)} Z_t = \tilde{Z}_t, \quad t = d+1, \ldots$\n\nWe can recover $Z_t$ by applying the inverse differencing operator $\Delta^{-1}$, which corresponds to the cumulative sum operator cumsum in Julia, $d$ times. The $d$ initial terms $Z_1, \ldots, Z_d$ are inferred.\n\nConstructors\n\nDiffLatentModel(latent_model, init_prior_distribution::Distribution; d::Int) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. All initial terms have common prior init_prior_distribution.\nDiffLatentModel(;model, init_priors::Vector{D} where {D <: Distribution}) Constructs a DiffLatentModel for d-fold differencing with latent_model as the undifferenced latent process. The d initial terms have priors given by the vector init_priors, therefore length(init_priors) sets d.\n\nExample usage with generate_latent\n\ngenerate_latent can be used to construct a Turing model for the differenced latent process. In this example, the underlying undifferenced process is a RandomWalk model.\n\nFirst, we construct a RandomWalk struct with an initial value prior and a step size standard deviation prior.\n\nusing Distributions, EpiAware\nrw = RandomWalk(Normal(0.0, 1.0), truncated(Normal(0.0, 0.05), 0.0, Inf))\n\nThen, we can use DiffLatentModel to construct a DiffLatentModel for a d-fold differenced process with rw as the undifferenced latent process.\n\nWe have two constructor options for DiffLatentModel. 
The first option is to supply a common prior distribution for the initial terms and specify d as follows:\n\ndiff_model = DiffLatentModel(rw, Normal(); d = 2)\n\nOr we can supply a vector of priors for the initial terms and d is inferred as follows:\n\ndiff_model2 = DiffLatentModel(;undiffmodel = rw, init_priors = [Normal(), Normal()])\n\nThen, we can use generate_latent to construct a Turing model for the differenced latent process generating a length n process,\n\n# Construct a Turing model\nn = 100\ndifference_mdl = generate_latent(diff_model, n)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved latent process.\n\n#Sample random parameters from prior\nθ = rand(difference_mdl)\n#Get a sampled latent process as a generated quantity from the model\n(Z_t, _) = generated_quantities(difference_mdl, θ)\nZ_t\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: Underlying latent model for the differenced process\ninit_prior::Distributions.Distribution: The prior distribution for the initial latent variables.\nd::Int64: Number of times differenced.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.FixedIntercept","page":"Public API","title":"EpiAware.EpiLatentModels.FixedIntercept","text":"struct FixedIntercept{F<:Real} <: AbstractTuringIntercept\n\nA variant of the Intercept struct that represents a fixed intercept value for a latent model.\n\nConstructors\n\nFixedIntercept(intercept) : Constructs a FixedIntercept instance with the specified intercept value.\nFixedIntercept(; intercept) : Constructs a FixedIntercept instance with the specified intercept value using named arguments.\n\nExamples\n\nusing EpiAware\nfi = FixedIntercept(2.0)\nfi_model = generate_latent(fi, 10)\nfi_model()\n\n\n\nFields\n\nintercept::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.HierarchicalNormal","page":"Public API","title":"EpiAware.EpiLatentModels.HierarchicalNormal","text":"struct HierarchicalNormal{R<:Real, D<:Distributions.Sampleable} <: AbstractTuringLatentModel\n\nThe HierarchicalNormal struct represents a non-centered hierarchical normal distribution.\n\nConstructors\n\nHierarchicalNormal(mean, std_prior): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior.\nHierarchicalNormal(; mean = 0.0, std_prior = truncated(Normal(0,1), 0, Inf)): Constructs a HierarchicalNormal instance with the specified mean and standard deviation prior using named arguments and with default values.\n\nExamples\n\nusing Distributions, EpiAware\nhnorm = HierarchicalNormal(0.0, truncated(Normal(0, 1), 0, Inf))\nhnorm_model = generate_latent(hnorm, 10)\nhnorm_model()\n\n\n\nFields\n\nmean::Real\nstd_prior::Distributions.Sampleable\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.Intercept","page":"Public API","title":"EpiAware.EpiLatentModels.Intercept","text":"struct Intercept{D<:Distributions.Sampleable} <: AbstractTuringIntercept\n\nThe Intercept struct is used to model the intercept of a latent process. 
It broadcasts a single intercept value to a length n latent process.\n\nConstructors\n\nIntercept(intercept_prior)\nIntercept(; intercept_prior)\n\nExamples\n\nusing Distributions, Turing, EpiAware\nint = Intercept(Normal(0, 1))\nint_model = generate_latent(int, 10)\nrand(int_model)\nint_model()\n\n\n\nFields\n\nintercept_prior::Distributions.Sampleable: Prior distribution for the intercept.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.PrefixLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.PrefixLatentModel","text":"struct PrefixLatentModel{M<:AbstractTuringLatentModel, P<:String} <: AbstractTuringLatentModel\n\nGenerate a latent model with a prefix. A lightweight wrapper around `EpiAwareUtils.prefix_submodel`.\n\n# Constructors\n- `PrefixLatentModel(model::M, prefix::P)`: Create a `PrefixLatentModel` with the latent model `model` and the prefix `prefix`.\n- `PrefixLatentModel(; model::M, prefix::P)`: Create a `PrefixLatentModel` with the latent model `model` and the prefix `prefix`.\n\n# Examples\n```julia\nusing EpiAware\nlatent_model = PrefixLatentModel(model = HierarchicalNormal(), prefix = \"Test\")\nmdl = generate_latent(latent_model, 10)\nrand(mdl)\n```\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: The latent model\nprefix::String: The prefix for the latent model\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RandomWalk","page":"Public API","title":"EpiAware.EpiLatentModels.RandomWalk","text":"struct RandomWalk{D<:Distributions.Sampleable, S<:Distributions.Sampleable} <: AbstractTuringLatentModel\n\nModel the latent process $Z_t$ as a random walk.\n\nMathematical specification\n\nThe random walk $Z_t$ is specified as a parametric transformation of the white noise sequence $(\epsilon_t)_{t \geq 1}$,\n\n$Z_t = Z_0 + \sigma \sum_{i=1}^t \epsilon_i$\n\nConstructing a random walk requires specifying:\n\nAn init_prior as a prior for $Z_0$. Default is Normal().\nA std_prior for $\sigma$. 
The default is HalfNormal with a mean of 0.25.\n\nConstructors\n\nRandomWalk(; init_prior, std_prior)\n\nExample usage with generate_latent\n\ngenerate_latent can be used to construct a Turing model for the random walk Z_t.\n\nFirst, we construct a RandomWalk struct with priors,\n\nusing Distributions, Turing, EpiAware\n\n# Create a RandomWalk model\nrw = RandomWalk(init_prior = Normal(2., 1.),\n std_prior = HalfNormal(0.1))\n\nThen, we can use generate_latent to construct a Turing model for a 10 step random walk.\n\n# Construct a Turing model\nrw_model = generate_latent(rw, 10)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n#Sample random parameters from prior\nθ = rand(rw_model)\n#Get random walk sample path as a generated quantities from the model\nZ_t, _ = generated_quantities(rw_model, θ)\n\n\n\nFields\n\ninit_prior::Distributions.Sampleable\nstd_prior::Distributions.Sampleable\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RecordExpectedLatent","page":"Public API","title":"EpiAware.EpiLatentModels.RecordExpectedLatent","text":"struct RecordExpectedLatent{M<:AbstractTuringLatentModel} <: AbstractTuringLatentModel\n\nRecord a variable (using the Turing := syntax) in a latent model.\n\n# Fields\n- `model::AbstractTuringLatentModel`: The latent model to dispatch to.\n\n# Constructors\n\n- `RecordExpectedLatent(model::AbstractTuringLatentModel)`: Record the expected latent vector from the model as `exp_latent`.\n\n# Examples\n\n```julia\nusing EpiAware, Turing\nmdl = RecordExpectedLatent(FixedIntercept(0.1))\ngen_latent = generate_latent(mdl, 1)\nsample(gen_latent, Prior(), 10)\n```\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RepeatBlock","page":"Public API","title":"EpiAware.EpiLatentModels.RepeatBlock","text":"struct RepeatBlock <: AbstractBroadcastRule\n\nRepeatBlock is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.\n\nIt repeats the latent process in blocks of size period. An example of this rule is to repeat the latent process in blocks of size 7 to model a weekly process (though for this we also provide the broadcast_weekly helper function).\n\nExamples\n\nusing EpiAware\nrule = RepeatBlock()\nlatent = [1, 2, 3, 4, 5]\nn = 10\nperiod = 2\nbroadcast_rule(rule, latent, n, period)\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.RepeatEach","page":"Public API","title":"EpiAware.EpiLatentModels.RepeatEach","text":"struct RepeatEach <: AbstractBroadcastRule\n\nRepeatEach is a struct that represents a broadcasting rule. It is a subtype of AbstractBroadcastRule.\n\nIt repeats the latent process at each period. 
An example of this rule is to repeat the latent process at each day of the week (though for this we also provide the dayofweek helper function).\n\nExamples\n\nusing EpiAware\nrule = RepeatEach()\nlatent = [1, 2]\nn = 10\nperiod = 2\nbroadcast_rule(rule, latent, n, period)\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.TransformLatentModel","page":"Public API","title":"EpiAware.EpiLatentModels.TransformLatentModel","text":"struct TransformLatentModel{M<:AbstractTuringLatentModel, F<:Function} <: AbstractTuringLatentModel\n\nThe TransformLatentModel struct represents a latent model that applies a transformation function to the latent variables generated by another latent model.\n\nConstructors\n\nTransformLatentModel(model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function.\nTransformLatentModel(; model, trans_function): Constructs a TransformLatentModel instance with the specified latent model and transformation function using named arguments.\n\nExample\n\nusing EpiAware, Distributions\ntrans = TransformLatentModel(Intercept(Normal(2, 0.2)), x -> x .|> exp)\ntrans_model = generate_latent(trans, 5)\ntrans_model()\n\n\n\nFields\n\nmodel::AbstractTuringLatentModel: The latent model to transform.\ntrans_function::Function: The transformation function.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiAwareBase.broadcast_rule-Tuple{RepeatBlock, Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_rule","text":"broadcast_rule(_::RepeatBlock, latent, n, period) -> Any\n\n\nbroadcast_rule is a function that applies the RepeatBlock rule to the latent process latent to generate n samples.\n\nArguments\n\nrule::RepeatBlock: The broadcasting rule.\nlatent::Vector: The latent process.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\nReturns\n\nlatent: The generated broadcasted latent periods.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiAwareBase.broadcast_rule-Tuple{RepeatEach, Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_rule","text":"broadcast_rule(_::RepeatEach, latent, n, period) -> Any\n\n\nbroadcast_rule is a function that applies the RepeatEach rule to the latent process latent to generate n samples.\n\nArguments\n\nrule::RepeatEach: The broadcasting rule.\nlatent::Vector: The latent process.\nn: The number of samples to generate.\nperiod: The period of the broadcast.\n\nReturns\n\nlatent: The generated broadcasted latent periods.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.broadcast_dayofweek-Tuple{AbstractTuringLatentModel}","page":"Public API","title":"EpiAware.EpiLatentModels.broadcast_dayofweek","text":"broadcast_dayofweek(\n model::AbstractTuringLatentModel;\n link\n) -> BroadcastLatentModel{TransformLatentModel{M, EpiAware.EpiLatentModels.var\"#42#44\"}, Int64, RepeatEach} where M<:AbstractTuringLatentModel\n\n\nConstructs a BroadcastLatentModel appropriate for modelling the day of the week for a given AbstractTuringLatentModel.\n\nArguments\n\nmodel::AbstractTuringLatentModel: The latent model to be repeated.\nlink::Function: The link function to transform the latent model before broadcasting\n\nto periodic weekly. 
Default is x -> 7 * softmax(x) which implements constraint of the sum week effects to be 7.\n\nReturns\n\nBroadcastLatentModel: The broadcast latent model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.broadcast_weekly-Tuple{AbstractTuringLatentModel}","page":"Public API","title":"EpiAware.EpiLatentModels.broadcast_weekly","text":"broadcast_weekly(\n model::AbstractTuringLatentModel\n) -> BroadcastLatentModel{<:AbstractTuringLatentModel, Int64, RepeatBlock}\n\n\nConstructs a BroadcastLatentModel appropriate for modelling piecewise constant weekly processes for a given AbstractTuringLatentModel.\n\nArguments\n\nmodel::AbstractTuringLatentModel: The latent model to be repeated.\n\nReturns\n\nBroadcastLatentModel: The broadcast latent model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiLatentModels/public/#EpiAware.EpiLatentModels.equal_dimensions-Tuple{Int64, Int64}","page":"Public API","title":"EpiAware.EpiLatentModels.equal_dimensions","text":"equal_dimensions(n::Int64, m::Int64) -> Vector{Int64}\n\n\nReturn a vector of dimensions that are equal or as close as possible, given the total number of elements n and the number of dimensions m. The default dimension adaptor for ConcatLatentModels.\n\nArguments\n\nn::Int: The total number of elements.\nm::Int: The number of dimensions.\n\nReturns\n\ndims::AbstractVector{Int}: A vector of dimensions, where the first element is the ceiling of n / m and the remaining elements are the floor of n / m.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/faq/#Frequently-asked-questions","page":"Frequently asked questions","title":"Frequently asked questions","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"This page contains a list of frequently asked questions about the EpiAware package. If you have a question that is not answered here, please open a discussion on the GitHub repository.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"Pages = [\"lib/getting-started/faq.md\"]","category":"page"},{"location":"getting-started/faq/#Pluto-notebooks","page":"Frequently asked questions","title":"Pluto notebooks","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"In some of the showcase examples in EpiAware/docs/src/showcase we use Pluto.jl notebooks for the underlying computation. As well as reading the code blocks and output of the notebooks in this documentation, you can also run these notebooks by cloning EpiAware and running the notebooks with Pluto.jl (for further details see developer notes).","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"It should be noted that Pluto.jl notebooks are reactive, meaning that they re-run downstream code after changes with downstreaming determined by a tree of dependent code blocks. This is different from the standard Julia REPL, and some other notebook formats (e.g. .ipynb). In Pluto each code block is a single lines of code or encapsulated by let ... end and begin ... end. The difference between let ... end blocks and begin ... end blocks are that the let ... 
end type of code block only adds the final output/return value of the block to scope, like an anonymous function, whereas begin ... end executes each line and adds defined variables to scope.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"For installation instructions and more information and documentation on Pluto.jl see the Pluto.jl documentation.","category":"page"},{"location":"getting-started/faq/#Manipulating-EpiAware-model-specifications","page":"Frequently asked questions","title":"Manipulating EpiAware model specifications","text":"","category":"section"},{"location":"getting-started/faq/#Modular-model-construction","page":"Frequently asked questions","title":"Modular model construction","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"One of the key features of EpiAware is the ability to specify models as components of a larger model. This is useful for specifying models that are shared across multiple EpiProblems or for specifying models that are used in multiple methods. You can see an examples of this approach in our showcases.","category":"page"},{"location":"getting-started/faq/#Remaking-models","page":"Frequently asked questions","title":"Remaking models","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"An alternative to modular model construction is to remake models with different parameters. This can be useful for comparing models with different parameters or for comparing models with different priors. Whilst we don't have a built in function for this, we recommend the Accessors.jl package for this purpose. For examples of how to use this package see the documentation.","category":"page"},{"location":"getting-started/faq/#Working-with-Turing.jl-models","page":"Frequently asked questions","title":"Working with Turing.jl models","text":"","category":"section"},{"location":"getting-started/faq/#[DynamicPPL.jl](https://github.com/TuringLang/DynamicPPL.jl)","page":"Frequently asked questions","title":"DynamicPPL.jl","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"Whilst Turing.jl is the front end of the Turing.jl ecosystem, it is not the only package that can be used to work with Turing.jl models. DynamicPPL.jl is the part of the ecosytem that deals with defining, running, and manipulating models.","category":"page"},{"location":"getting-started/faq/#Conditioning-and-deconditioning-models","page":"Frequently asked questions","title":"Conditioning and deconditioning models","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"DynamicPPL supports the condition (alased with |) to fix values as known observations in the model (i.e fixing values on the left hand side of ~ definitions). This is useful for fixing parameters to known values or for conditioning the model on data. The decondition function can be used to remove these conditions. Internally this is what apply_method(::EpiProblem, ...) does to condition the user supplied EpiProblem to data. 
See more here.","category":"page"},{"location":"getting-started/faq/#Fixing-and-unfixing-models","page":"Frequently asked questions","title":"Fixing and unfixing models","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"Similarly to conditioning and deconditioning models, DynamicPPL supports fixing and unfixing models via the fix and unfix functions. Fixing is essentially saying that variables are constants (i.e replacing the right hand side of ~ with a value and changing the ~ to a =). A common use of this would be to simplify a prespecified model, for example to make the variance of a random walk be known versus estimated from the data. We also use this functionality in apply_method(::EpiProblem, ...) to allow users to simplify EpiProblems on the fly. See more here.","category":"page"},{"location":"getting-started/faq/#Tools-for-working-with-MCMCChain-objects","page":"Frequently asked questions","title":"Tools for working with MCMCChain objects","text":"","category":"section"},{"location":"getting-started/faq/#[MCMCChain.jl](https://turinglang.org/MCMCChains.jl/stable/)","page":"Frequently asked questions","title":"MCMCChain.jl","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"MCMCChain.jl is the package from which MCMCChains is imported. It provides a number of useful functions for working with MCMCChain objects. These include functions for summarising, plotting, and manipulating chains. Below is a list of some of the most useful functions.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"plot: Plots trace and density plots for each parameter in the chain object.\nhistogram: Plots histograms for each parameter in the chain object by chain.\nget: Accesses the values of a parameter/s in the chain object.\nDataFrames.DataFrame converts a chain into a wide format DataFrame.\ndescribe: Prints the summary statistics of the chain object.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"There are many more functions available in the MCMCChain.jl package. For a full list of functions, see the documentation.","category":"page"},{"location":"getting-started/faq/#[Arviz.jl](https://julia.arviz.org/ArviZ/stable/)","page":"Frequently asked questions","title":"Arviz.jl","text":"","category":"section"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"An alternative to MCMCChain.jl is the ArviZ.jl package. ArviZ.jl is a Julia meta-package for exploratory analysis of Bayesian models. It is part of the ArviZ project, which also includes a related Python package.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"ArviZ.jl uses a InferenceData object to store the results of a Bayesian analysis. This object can be created from a MCMCChain object using the from_mcmcchains function. The InferenceData object can then be used to create a range of plots and summaries of the model. 
This is particularly useful as it allows you to specify the indexes of your parameters (for example you could use dates for time parameters).","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"In addition to this useful functionality from_mcmcchains can also be used to combine posterior predictions with prior predictions, prior information and the log likelihood of the model (see here for an example of this). This unlocks a range of useful diagnostics and plots that can be used to assess the model.","category":"page"},{"location":"getting-started/faq/","page":"Frequently asked questions","title":"Frequently asked questions","text":"There is a lot of functionality in ArviZ.jl and it is worth exploring the documentation to see what is available.","category":"page"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"\n\n\n\n\n\n\n\n

              Example: Statistical inference for ODE-based infectious disease models

              Introduction

              What are we going to do in this Vignette

              In this vignette, we'll demonstrate how to use EpiAware in conjunction with the SciML ecosystem for Bayesian inference of infectious disease dynamics. The model and data are heavily based on Contemporary statistical inference for infectious disease models using Stan by Chatzilena et al. 2019.

              We'll cover the following key points:

              1. Defining the deterministic ODE model from Chatzilena et al section 2.2.2 using SciML ODE functionality and an EpiAware observation model.

              2. Build on this to define the stochastic ODE model from Chatzilena et al section 2.2.3 using an EpiAware observation model.

              3. Fitting the deterministic ODE model to data from an Influenza outbreak in an English boarding school.

              4. Fitting the stochastic ODE model to data from an Influenza outbreak in an English boarding school.

              What might I need to know before starting

              This vignette builds on concepts from EpiAware observation models; familiarity with the SciML and Turing ecosystems would be useful but not essential.

              Packages used in this vignette

              Alongside the EpiAware package we will use the OrdinaryDiffEq and SciMLSensitivity packages for interfacing with the SciML ecosystem; this is a lower-dependency usage of DifferentialEquations.jl that exposes, respectively, ODE solvers and adjoint methods for ODE solves (that is, the method of propagating parameter derivatives through functions containing ODE solutions). Bayesian inference will be done with NUTS from the Turing ecosystem. We will also use the CairoMakie package for plotting and DataFramesMeta for data manipulation.

              \n\n
              using EpiAware
              \n\n\n
              using Turing
              \n\n\n
              using OrdinaryDiffEq, SciMLSensitivity #ODE solvers and adjoint methods
              \n\n\n
              using Distributions, Statistics, LogExpFunctions #Statistics and special func packages
              \n\n\n
              using CSV, DataFramesMeta #Data wrangling
              \n\n\n
              using CairoMakie, PairPlots
              \n\n\n
              using ReverseDiff #Automatic differentiation backend
              \n\n\n
              begin #Date utility and set Random seed\n    using Dates\n    using Random\n    Random.seed!(1234)\nend
              \n
              TaskLocalRNG()
              \n\n\n

              Single population SIR model

              As mentioned in Chatzilena et al, disease spread is frequently modelled in terms of ODE-based models. The study population is divided into compartments, each representing a specific stage of the epidemic status: in this case, susceptible, infected, and recovered individuals.

              $$\\begin{aligned}\n{dS \\over dt} &= - \\beta \\frac{I(t)}{N} S(t) \\\\\n{dI \\over dt} &= \\beta \\frac{I(t)}{N} S(t) - \\gamma I(t) \\\\\n{dR \\over dt} &= \\gamma I(t). \\\\\n\\end{aligned}$$

              where S(t) represents the number of susceptible, I(t) the number of infected and R(t) the number of recovered individuals at time t. The total population size is denoted by N (with N = S(t) + I(t) + R(t)), β denotes the transmission rate and γ denotes the recovery rate.
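
              Both models below also return the basic reproduction number of this SIR model as a generated quantity; it is the usual ratio of the transmission and recovery rates,

              $$R_0 = {\\beta \\over \\gamma}.$$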

              \n\n\n

              We can interface to the SciML ecosystem by writing a function with the signature:

              (du, u, p, t) -> nothing

              Where: du is updated in place with the time derivative of the state, u is the state vector, p collects the parameters and t is time.

              We do this for the SIR model described above in a function called sir!:

              \n\n
              function sir!(du, u, p, t)\n    S, I, R = u\n    β, γ = p\n    du[1] = -β * I * S\n    du[2] = β * I * S - γ * I\n    du[3] = γ * I\n\n    return nothing\nend
              \n
              sir! (generic function with 1 method)
              \n\n\n

              We combine the vector field function sir! with an initial condition u0 and the integration period tspan to make an ODEProblem. We do not define the parameters; these will be defined within an inference approach.

              \n\n
              sir_prob = ODEProblem(\n    sir!,\n    N .* [0.99, 0.01, 0.0],\n    (0.0, (Date(1978, 2, 4) - Date(1978, 1, 22)).value + 1)\n)
              \n
              ODEProblem with uType Vector{Float64} and tType Float64. In-place: true\ntimespan: (0.0, 14.0)\nu0: 3-element Vector{Float64}:\n 755.37\n   7.63\n   0.0
              \n\n\n

              Note that this is analogous to the EpiProblem approach we expose from EpiAware, as used in the Mishra et al replication. The difference is that here we are going to use ODE solvers from the SciML ecosystem to generate the dynamics of the underlying infections. In the linked example, we use latent process generation exposed by EpiAware as the generative process for the underlying dynamics.

              \n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/#Data-for-inference","page":"Statistical inference for ODE-based infectious disease models","title":"Data for inference","text":"","category":"section"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"
              \n

              There was a brief, but intense, outbreak of Influenza within the (semi-)closed community of a boarding school reported to the British Medical Journal in 1978. The outbreak lasted from 22nd January to 4th February and it is reported that one infected child started the epidemic, which then spread rapidly. Of the 763 children at the boarding school, 512 became ill.

              We downloaded the data for this outbreak using the R package outbreaks, which is maintained as part of the R Epidemics Consortium (RECON).

              \n\n
              data = \"https://raw.githubusercontent.com/CDCgov/Rt-without-renewal/refs/heads/main/EpiAware/docs/src/showcase/replications/chatzilena-2019/influenza_england_1978_school.csv2\" |>\n       url -> CSV.read(download(url), DataFrame) |>\n              df -> @transform(df,\n    :ts=(:date .- minimum(:date)) .|> d -> d.value + 1.0,)
              \n
              Column1   date         in_bed   convalescent   ts
              1         1978-01-22   3        0              1.0
              2         1978-01-23   8        0              2.0
              3         1978-01-24   26       0              3.0
              4         1978-01-25   76       0              4.0
              5         1978-01-26   225      9              5.0
              6         1978-01-27   298      17             6.0
              7         1978-01-28   258      105            7.0
              8         1978-01-29   233      162            8.0
              9         1978-01-30   189      176            9.0
              10        1978-01-31   128      166            10.0
              11        1978-02-01   68       150            11.0
              12        1978-02-02   29       85             12.0
              13        1978-02-03   14       47             13.0
              14        1978-02-04   4        20             14.0
              \n\n
              N = 763;
              \n\n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/#Inference-for-the-deterministic-SIR-model","page":"Statistical inference for ODE-based infectious disease models","title":"Inference for the deterministic SIR model","text":"","category":"section"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"
              \n

              The boarding school data gives the number of children \"in bed\" and \"convalescent\" on each of 14 days from 22nd Jan to 4th Feb 1978. We follow Chatzilena et al and treat the number \"in bed\" as a proxy for the number of children in the infectious (I) compartment in the ODE model.

              The full observation model is:

              $$\\begin{aligned}\nY_t &\\sim \\text{Poisson}(\\lambda_t)\\\\\n\\lambda_t &= I(t)\\\\\n\\beta &\\sim \\text{LogNormal}(\\text{logmean}=0,\\text{logstd}=1) \\\\\n\\gamma & \\sim \\text{Gamma}(\\text{shape} = 0.004, \\text{scale} = 50)\\\\\nS(0) /N &\\sim \\text{Beta}(0.5, 0.5).\n\\end{aligned}$$

              NB: Chatzilena et al give \(\lambda_t = \int_0^t \left(\beta \frac{I(s)}{N} S(s) - \gamma I(s)\right) ds = I(t) - I(0).\) However, this doesn't match their underlying Stan code.

              \n\n\n

              From EpiAware, we have the PoissonError struct which defines the probabilistic structure of this observation error model.

              \n\n
              obs = PoissonError()
              \n
              PoissonError()
              \n\n\n

              Now we can write the probabilistic model using the Turing PPL. Note that instead of using \(I(t)\) directly we apply a softplus transform to \(I(t)\), implemented by LogExpFunctions.log1pexp. The reason is that the solver can return small negative numbers; the softplus transform smoothly maintains positivity whilst being very close to \(I(t)\) when \(I(t) > 2\).
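
              As a quick numerical illustration of this choice (a minimal sketch with arbitrary values, not part of the original analysis):

              using LogExpFunctions
              log1pexp(-0.5)  # ≈ 0.47: a slightly negative solver output is mapped to a small positive value
              log1pexp(5.0)   # ≈ 5.0067: essentially the identity once I(t) is above ~2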

              \n\n
              @model function deterministic_ode_mdl(y_t, ts, obs, prob, N;\n        solver = AutoTsit5(Rosenbrock23())\n)\n    ##Priors##\n    β ~ LogNormal(0.0, 1.0)\n    γ ~ Gamma(0.004, 1 / 0.002)\n    S₀ ~ Beta(0.5, 0.5)\n\n    ##remake ODE model##\n    _prob = remake(prob;\n        u0 = [S₀, 1 - S₀, 0.0],\n        p = [β, γ]\n    )\n\n    ##Solve remade ODE model##\n\n    sol = solve(_prob, solver;\n        saveat = ts,\n        verbose = false)\n\n    ##log-like accumulation using obs##\n    λt = log1pexp.(N * sol[2, :]) # #expected It\n    @submodel generated_y_t = generate_observations(obs, y_t, λt)\n\n    ##Generated quantities##\n    return (; sol, generated_y_t, R0 = β / γ)\nend
              \n
              deterministic_ode_mdl (generic function with 2 methods)
              \n\n\n

              We instantiate the model in two ways:

              1. deterministic_mdl: This conditions the generative model on the data observation. We can sample from this model to find the posterior distribution of the parameters.

              2. deterministic_uncond_mdl: This doesn't condition on the data. This is useful for prior and posterior predictive modelling.

              Here we construct the Turing model directly; in the Mishra et al replication we use the EpiProblem functionality to build a Turing model under the hood. Because in this note we are using a mix of functionality from SciML and EpiAware, we construct the model to sample from directly.

              \n\n
              deterministic_mdl = deterministic_ode_mdl(data.in_bed, data.ts, obs, sir_prob, N);
              \n\n\n
              deterministic_uncond_mdl = deterministic_ode_mdl(\n    fill(missing, length(data.in_bed)), data.ts, obs, sir_prob, N);
              \n\n\n\n

              We add a useful plotting utility.

              \n\n
              function plot_predYt(data, gens; title::String, ylabel::String)\n    fig = Figure()\n    ga = fig[1, 1:2] = GridLayout()\n\n    ax = Axis(ga[1, 1];\n        title = title,\n        xticks = (data.ts[1:3:end], data.date[1:3:end] .|> string),\n        ylabel = ylabel\n    )\n    pred_Yt = mapreduce(hcat, gens) do gen\n        gen.generated_y_t\n    end |> X -> mapreduce(vcat, eachrow(X)) do row\n        quantile(row, [0.5, 0.025, 0.975, 0.1, 0.9, 0.25, 0.75])'\n    end\n\n    lines!(ax, data.ts, pred_Yt[:, 1]; linewidth = 3, color = :green, label = \"Median\")\n    band!(\n        ax, data.ts, pred_Yt[:, 2], pred_Yt[:, 3], color = (:green, 0.2), label = \"95% CI\")\n    band!(\n        ax, data.ts, pred_Yt[:, 4], pred_Yt[:, 5], color = (:green, 0.4), label = \"80% CI\")\n    band!(\n        ax, data.ts, pred_Yt[:, 6], pred_Yt[:, 7], color = (:green, 0.6), label = \"50% CI\")\n    scatter!(ax, data.in_bed, label = \"data\")\n    leg = Legend(ga[1, 2], ax; framevisible = false)\n    hidespines!(ax)\n\n    fig\nend
              \n
              plot_predYt (generic function with 1 method)
              \n\n\n

              Prior predictive sampling

              \n\n
              let\n    prior_chn = sample(deterministic_uncond_mdl, Prior(), 2000)\n    gens = generated_quantities(deterministic_uncond_mdl, prior_chn)\n    plot_predYt(data, gens;\n        title = \"Prior predictive: deterministic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n\n

              The prior predictive checking suggests that a priori our parameter beliefs are very far from the data. Approaching the inference naively can lead to poor fits.

              We do three things to mitigate this:

              1. We choose a switching ODE solver which switches between explicit (Tsit5) and implicit (Rosenbrock23) solvers. This helps avoid the ODE solver failing when the sampler tries extreme parameter values. This is the default solver = AutoTsit5(Rosenbrock23()) above.

              2. We locate the maximum likelihood point (that is, we ignore the influence of the priors) as a useful starting point for NUTS.

              \n\n
              nmle_tries = 100
              \n
              100
              \n\n
              mle_fit = map(1:nmle_tries) do _\n    fit = try\n        maximum_likelihood(deterministic_mdl)\n    catch\n        (lp = -Inf,)\n    end\nend |>\n          fits -> (findmax(fit -> fit.lp, fits)[2], fits) |>\n                  max_and_fits -> max_and_fits[2][max_and_fits[1]]
              \n
              ModeResult with maximized lp of -67.36\n[1.8991528341217605, 0.4808836287362608, 0.9995360155493858]
              \n\n
              mle_fit.optim_result.retcode
              \n
              ReturnCode.Success = 1
              \n\n\n

              Note that we choose the best out of 100 tries for the MLE estimators.

              Now, we sample aiming at 1000 samples for each of 4 chains.

              \n\n
              chn = sample(\n    deterministic_mdl, NUTS(), MCMCThreads(), 1000, 4;\n    initial_params = fill(mle_fit.values.array, 4)\n)
              \n
              iteration   chain   β         γ          S₀         (lp, n_steps, is_accept, ...)
              501         1       1.92453   0.498762   0.999601   ...
              502         1       1.91393   0.510477   0.999503   ...
              503         1       1.82063   0.454811   0.999413   ...
              504         1       2.00822   0.502552   0.999739   ...
              505         1       2.02514   0.461803   0.999763   ...
              506         1       1.99927   0.465928   0.999722   ...
              507         1       1.79381   0.488809   0.999205   ...
              508         1       1.79029   0.490384   0.999199   ...
              509         1       1.79489   0.471104   0.999239   ...
              510         1       1.89717   0.474568   0.999502   ...
              ...
              \n\n
              describe(chn)
              \n
              2-element Vector{ChainDataFrame}:\n Summary Statistics (3 x 8)\n Quantiles (3 x 6)
              \n\n
              pairplot(chn)
              \n\n\n\n

              Posterior predictive plotting

              \n\n
              let\n    gens = generated_quantities(deterministic_uncond_mdl, chn)\n    plot_predYt(data, gens;\n        title = \"Fitted deterministic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/#Inference-for-the-Stochastic-SIR-model","page":"Statistical inference for ODE-based infectious disease models","title":"Inference for the Stochastic SIR model","text":"","category":"section"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"
              \n

              Chatzilena et al present an auto-regressive model for connecting the outcome of the ODE model to illness observations. The argument is that the stochastic component of the model can absorb the noise generated by possible mis-specification of the model.

              In their approach they consider \\(\\kappa_t = \\log \\lambda_t\\) where \\(\\kappa_t\\) evolves according to an Ornstein-Uhlenbeck process:

              $$d\\kappa_t = \\phi(\\mu_t - \\kappa_t) dt + \\sigma dB_t.$$

              Which has transition density:

              $$\\kappa_{t+1} | \\kappa_t \\sim N\\Big(\\mu_t + \\left(\\kappa_t - \\mu_t\\right)e^{-\\phi}, {\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\Big).$$

              Where \\(\\mu_t = \\log(I(t))\\).

              We modify this approach since it implies that \(\mu_t\) is treated as constant between observation times.

              Instead we redefine \\(\\kappa_t\\) as the log-residual:

              $$\\kappa_t = \\log(\\lambda_t / I(t)).$$

              With the transition density:

              $$\\kappa_{t+1} | \\kappa_t \\sim N\\Big(\\kappa_te^{-\\phi}, {\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\Big).$$

              This is an AR(1) process.
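
              Written out, the transition density above is equivalent to

              $$\\kappa_{t+1} = e^{-\\phi}\\kappa_t + \\epsilon_t, \\qquad \\epsilon_t \\sim N\\Big(0, {\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\Big),$$

              so the AR damping parameter is \(e^{-\phi}\) and the innovation variance is \({\sigma^2 \over 2 \phi} \left(1 - e^{-2\phi} \right)\); this is the mapping used below to set priors for the AR struct.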

              The stochastic model is completed:

              $$\\begin{aligned}\nY_t &\\sim \\text{Poisson}(\\lambda_t)\\\\\n\\lambda_t &= I(t)\\exp(\\kappa_t)\\\\\n\\beta &\\sim \\text{LogNormal}(\\text{logmean}=0,\\text{logstd}=1) \\\\\n\\gamma & \\sim \\text{Gamma}(\\text{shape} = 0.004, \\text{scale} = 50)\\\\\nS(0) /N &\\sim \\text{Beta}(0.5, 0.5)\\\\\n\\phi & \\sim \\text{HalfNormal}(0, 100) \\\\\n1 / \\sigma^2 & \\sim \\text{InvGamma}(0.1,0.1).\n\\end{aligned}$$

              \n\n\n

              We will use the AR struct from EpiAware, which has a direct parameterisation of the AR model, to define the auto-regressive process in this model.

              To convert from the formulation above we sample from the priors, and define HalfNormal priors based on the sampled prior means of \\(e^{-\\phi}\\) and \\({\\sigma^2 \\over 2 \\phi} \\left(1 - e^{-2\\phi} \\right)\\). We also add a strong prior that \\(\\kappa_1 \\approx 0\\).

              \n\n
              ϕs = rand(truncated(Normal(0, 100), lower = 0.0), 1000)
              \n
              1000-element Vector{Float64}:\n  84.27394515942191\n  13.516491690956862\n  51.07348186961277\n  37.941468070981934\n 128.41727813505105\n  43.06012859066134\n  62.31804897315879\n   ⋮\n  56.57116875489856\n 158.33706887743045\n  42.72304061442974\n   7.423694327684998\n 155.60429115685992\n  22.802727733585563
              \n\n
              σ²s = rand(InverseGamma(0.1, 0.1), 1000) .|> x -> 1 / x
              \n
              1000-element Vector{Float64}:\n 0.0016224742151858818\n 6.79221353591839e-9\n 6.207746413070522e-7\n 0.18882277475797452\n 0.0001662633660039789\n 0.1923483831345634\n 0.14764829136880042\n ⋮\n 0.06624877782984823\n 0.14836794638364514\n 0.00021895942825830565\n 2.209773387224151\n 0.06613574232694587\n 0.0026714312973339926
              \n\n
              sampled_AR_damps = ϕs .|> ϕ -> exp(-ϕ)
              \n
              1000-element Vector{Float64}:\n 2.5135680594819346e-37\n 1.3485350660539842e-6\n 6.592781044298219e-23\n 3.3283560716429985e-17\n 1.6946683748176592e-56\n 1.991699264693254e-19\n 8.622142732783223e-28\n ⋮\n 2.7005584094809084e-25\n 1.7182434846473966e-69\n 2.7900964146464195e-19\n 0.0005969397758191972\n 2.641891576222659e-68\n 1.249974556559806e-10
              \n\n
              sampled_AR_stds = map(ϕs, σ²s) do ϕ, σ²\n    (1 - exp(-2 * ϕ)) * σ² / (2 * ϕ)\nend
              \n
              1000-element Vector{Float64}:\n 9.626191179946722e-6\n 2.5125652762581625e-10\n 6.0772696376159436e-9\n 0.00248834302358464\n 6.473559026423481e-7\n 0.002233485935017376\n 0.001184635059999989\n ⋮\n 0.0005855348164793897\n 0.00046851930326718863\n 2.562545000417783e-6\n 0.14883240757631075\n 0.00021251259150776393\n 5.857701167477672e-5
              \n\n\n

              We define the AR(1) process by matching the means of HalfNormal prior distributions for the damp and standard deviation parameters to the prior means calculated from the Chatzilena et al definition.

              \n\n
              ar = AR(\n    damp_priors = [HalfNormal(mean(sampled_AR_damps))],\n    std_prior = HalfNormal(mean(sampled_AR_stds)),\n    init_priors = [Normal(0, 0.001)]\n)
              \n
              AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1)
              \n\n\n

              We can sample directly from the behaviour specified by the ar struct to do prior predictive checking on the AR(1) process.

              \n\n
              let\n    nobs = size(data, 1)\n    ar_mdl = generate_latent(ar, nobs)\n    fig = Figure()\n    ax = Axis(fig[1, 1],\n        xticks = (data.ts[1:3:end], data.date[1:3:end] .|> string),\n        ylabel = \"exp(kt)\",\n        title = \"Prior predictive sampling for relative residual in mean pred.\"\n    )\n    for i in 1:500\n        lines!(ax, ar_mdl() .|> exp, color = (:grey, 0.15))\n    end\n    fig\nend
              \n\n\n\n

              We see that the choice of priors implies an a priori belief that the extra observation noise on the mean prediction of the ODE model is fairly small, approximately 10% relative to the mean prediction.

              \n\n\n

              We can now define the probabilistic model. The stochastic model assumes a (random) time-varying ascertainment, which we implement using the Ascertainment struct from EpiAware. Note that instead of implementing an ascertainment factor exp.(κₜ) directly, which can be unstable for large primal values, by default Ascertainment uses the LogExpFunctions.xexpy function, which implements \(x\exp(y)\) stably for a wide range of values.
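
              As a minimal numerical sketch (illustrative values only, not from the model), xexpy(x, y) simply returns x * exp(y); the Ascertainment model applies it to the expected value and the latent log-ascertainment:

              using LogExpFunctions
              expected, κ = 250.0, 0.05    # an illustrative expected count and log-ascertainment value
              xexpy(expected, κ)           # ≈ 262.8, the same as expected * exp(κ) at these values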

              \n\n\n

              To distinguish random variables sampled by various sub-processes, EpiAware process types create prefixes. The default for Ascertainment is just the string "Ascertainment", but in this case we use the less verbose "va" for "varying ascertainment". With this prefix, the AR parameters appear in the posterior samples below as va.σ_AR, va.ar_init[1], va.damp_AR[1] and va.ϵ_t[i].

              \n\n
              mdl_prefix = \"va\"
              \n
              \"va\"
              \n\n\n

              Now we can construct our time-varying ascertainment model. The main keyword arguments here are model and latent_model. model sets the connection between the expected observation and the actual observation. In this case, we reuse our PoissonError model from above. latent_model sets the modification model on the expected values. In this case, we use the AR process we defined above.

              \n\n
              varying_ascertainment = Ascertainment(\n    model = obs,\n    latent_model = ar,\n    latent_prefix = mdl_prefix\n)
              \n
              Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}(PoissonError(), PrefixLatentModel{AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}, String}(AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1), \"va\"), EpiAware.EpiObsModels.var\"#10#16\"(), \"va\")
              \n\n\n

              Now we can declare the full model in the Turing PPL.

              \n\n
              @model function stochastic_ode_mdl(y_t, ts, obs, prob, N;\n        solver = AutoTsit5(Rosenbrock23())\n)\n\n    ##Priors##\n    β ~ LogNormal(0.0, 1.0)\n    γ ~ Gamma(0.004, 1 / 0.002)\n    S₀ ~ Beta(0.5, 0.5)\n\n    ##Remake ODE model##\n    _prob = remake(prob;\n        u0 = [S₀, 1 - S₀, 0.0],\n        p = [β, γ]\n    )\n\n    ##Solve ODE model##\n    sol = solve(_prob, solver;\n        saveat = ts,\n        verbose = false\n    )\n    λt = log1pexp.(N * sol[2, :])\n\n    ##Observation##\n    @submodel generated_y_t = generate_observations(obs, y_t, λt)\n\n    ##Generated quantities##\n    return (; sol, generated_y_t, R0 = β / γ)\nend
              \n
              stochastic_ode_mdl (generic function with 2 methods)
              \n\n
              stochastic_mdl = stochastic_ode_mdl(\n    data.in_bed,\n    data.ts,\n    varying_ascertainment,\n    sir_prob,\n    N\n)
              \n
              DynamicPPL.Model{typeof(stochastic_ode_mdl), (:y_t, :ts, :obs, :prob, :N), (:solver,), (), Tuple{Vector{Int64}, Vector{Float64}, Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}, ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}, Int64}, Tuple{CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}}, DynamicPPL.DefaultContext}(Main.var\"workspace#17\".stochastic_ode_mdl, (y_t = [3, 8, 26, 76, 225, 298, 258, 233, 189, 128, 68, 29, 14, 4], ts = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0], obs = Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}(PoissonError(), PrefixLatentModel{AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}, String}(AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1), \"va\"), EpiAware.EpiObsModels.var\"#10#16\"(), \"va\"), prob = ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}(ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}(Main.var\"workspace#17\".sir!, LinearAlgebra.UniformScaling{Bool}(true), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED, nothing, nothing, nothing, nothing, nothing, nothing), [755.37, 7.63, 
0.0], (0.0, 14.0), SciMLBase.NullParameters(), Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}(), SciMLBase.StandardODEProblem()), N = 763), (solver = CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}((Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!)), AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}(Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!), 10, 3, 9//10, 9//10, 2, false, 5)),), DynamicPPL.DefaultContext())
              \n\n
              stochastic_uncond_mdl = stochastic_ode_mdl(\n    fill(missing, length(data.in_bed)),\n    data.ts,\n    varying_ascertainment,\n    sir_prob,\n    N\n)
              \n
              DynamicPPL.Model{typeof(stochastic_ode_mdl), (:y_t, :ts, :obs, :prob, :N), (:solver,), (), Tuple{Vector{Missing}, Vector{Float64}, Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}, ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}, Int64}, Tuple{CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}}, DynamicPPL.DefaultContext}(Main.var\"workspace#17\".stochastic_ode_mdl, (y_t = [missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing, missing], ts = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0], obs = Ascertainment{PoissonError, AbstractTuringLatentModel, EpiAware.EpiObsModels.var\"#10#16\", String}(PoissonError(), PrefixLatentModel{AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}, String}(AR{Product{Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}, HalfNormal{Float64}, DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}, Int64}(Distributions.Product{Distributions.Continuous, HalfNormal{Float64}, FillArrays.Fill{HalfNormal{Float64}, 1, Tuple{Base.OneTo{Int64}}}}(v=Fill(HalfNormal{Float64}(μ=0.004725237126863895), 1)), HalfNormal{Float64}(μ=0.0184303247003225), DistributionsAD.TuringScalMvNormal{Vector{Float64}, Float64}(m=[0.0], σ=0.001), 1), \"va\"), EpiAware.EpiObsModels.var\"#10#16\"(), \"va\"), prob = ODEProblem{Vector{Float64}, Tuple{Float64, Float64}, true, SciMLBase.NullParameters, ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}, SciMLBase.StandardODEProblem}(ODEFunction{true, SciMLBase.AutoSpecialize, typeof(sir!), LinearAlgebra.UniformScaling{Bool}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}(Main.var\"workspace#17\".sir!, LinearAlgebra.UniformScaling{Bool}(true), nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, nothing, SciMLBase.DEFAULT_OBSERVED, 
nothing, nothing, nothing, nothing, nothing, nothing), [755.37, 7.63, 0.0], (0.0, 14.0), SciMLBase.NullParameters(), Base.Pairs{Symbol, Union{}, Tuple{}, @NamedTuple{}}(), SciMLBase.StandardODEProblem()), N = 763), (solver = CompositeAlgorithm{0, Tuple{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}}, AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}}((Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!)), AutoSwitch{Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}, Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}, Rational{Int64}, Int64}(Tsit5{typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!), Static.False}(OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!, static(false)), Rosenbrock23{0, true, Nothing, typeof(OrdinaryDiffEqCore.DEFAULT_PRECS), Val{:forward}, true, nothing, typeof(OrdinaryDiffEqCore.trivial_limiter!), typeof(OrdinaryDiffEqCore.trivial_limiter!)}(nothing, OrdinaryDiffEqCore.DEFAULT_PRECS, OrdinaryDiffEqCore.trivial_limiter!, OrdinaryDiffEqCore.trivial_limiter!), 10, 3, 9//10, 9//10, 2, false, 5)),), DynamicPPL.DefaultContext())
              \n\n\n

              Prior predictive checking

              \n\n
              let\n    prior_chn = sample(stochastic_uncond_mdl, Prior(), 2000)\n    gens = generated_quantities(stochastic_uncond_mdl, prior_chn)\n    plot_predYt(data, gens;\n        title = \"Prior predictive: stochastic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n\n

              The prior predictive checking again shows misaligned prior beliefs; for example, a priori (without data) we would not expect the median prediction for the number of ill children to be about 600 out of 763 after 1 day.

              The latent process for the log-residuals \(\kappa_t\) doesn't make much sense without its priors, so rather than a maximum likelihood fit we look for a reasonable MAP point to start NUTS from. We do this by first making an initial guess which is a mixture of:

              1. The posterior averages from the deterministic model.

              2. The prior averages of the structure parameters of the AR(1) process.

              3. Zero for the time-varying noise underlying the AR(1) process.

              \n\n
              rand(stochastic_mdl)
              \n
              (β = 1.4733099145592605, γ = 2.903750758256854e-123, S₀ = 0.29861836011258897, var\"va.σ_AR\" = 0.04830504386163741, var\"va.ar_init\" = [-0.00024863122657975786], var\"va.damp_AR\" = [0.0032571734979405884], var\"va.ϵ_t\" = [0.3072398792156006, -1.183649965567883, 2.771050948892893, -0.6366422192999562, 1.6191332959597484, 0.24589190588482895, 1.4615005554123257, 0.025353011915720307, 0.16407599045634794, 0.2628599221133207, -1.0048884450877293, 1.96700665270484, -0.7501415436101209])
              \n\n
              initial_guess = [[mean(chn[:β]),\n                     mean(chn[:γ]),\n                     mean(chn[:S₀]),\n                     mean(ar.std_prior),\n                     mean(ar.init_prior)[1],\n                     mean(ar.damp_prior)[1]]\n                 zeros(13)]
              \n
              19-element Vector{Float64}:\n 1.8942148283773665\n 0.48062141906187955\n 0.9995061985155343\n 0.0184303247003225\n 0.0\n 0.004725237126863895\n 0.0\n ⋮\n 0.0\n 0.0\n 0.0\n 0.0\n 0.0\n 0.0
              \n\n\n

              Starting from the initial guess, the MAP point is calculated rapidly in one pass.

              \n\n
              map_fit_stoch_mdl = maximum_a_posteriori(stochastic_mdl;\n    adtype = AutoReverseDiff(),\n    initial_params = initial_guess\n)
              \n
              ModeResult with maximized lp of -69.56\n[1.9168299382321734, 0.4897041462336449, 0.9995563465712941, 0.06675569386075603, 1.3740571689410578e-6, 0.0001575538604931212, 0.14269439047176274, 0.17055298256610424, -0.29859817140192074, 0.6377161540197321, -0.00838185466017144, -0.5911576835821275, 0.7987402297108667, 1.7391572409676643, 1.4382700211216297, 0.24515802269495504, -0.6799723098817362, -0.7437116100116361, -0.8064297391295364]
              \n\n\n

              Now we can run NUTS, sampling 1000 posterior draws per chain for 4 chains.

              \n\n
              chn2 = sample(\n    stochastic_mdl,\n    NUTS(; adtype = AutoReverseDiff(true)),\n    MCMCThreads(), 1000, 4;\n    initial_params = fill(map_fit_stoch_mdl.values.array, 4)\n)
              \n
              iteration   chain   β         γ          S₀         va.σ_AR     va.ar_init[1]   va.damp_AR[1]   ...
              501         1       1.93059   0.487127   0.999562   0.096463    0.000140732     0.000875502     ...
              502         1       1.97072   0.48262    0.99964    0.0441607   -0.00103726     0.00110656      ...
              503         1       1.95066   0.492599   0.999639   0.0394206   0.000676331     0.00519116      ...
              504         1       1.93492   0.476734   0.999588   0.0496671   0.000670362     0.00955519      ...
              505         1       2.02558   0.480011   0.999729   0.06643     0.000671076     0.00216336      ...
              506         1       1.98063   0.466565   0.999684   0.0491065   0.000657163     0.00561317      ...
              507         1       1.81511   0.494214   0.999277   0.0164223   -0.000655107    0.00677129      ...
              508         1       2.00125   0.466108   0.999671   0.0558416   0.0034018       0.00524714      ...
              509         1       1.85778   0.48959    0.999395   0.0129403   0.00178202      0.0014243       ...
              510         1       1.75333   0.482571   0.999038   0.0136632   0.00151915      0.00445519      ...
              ...
              \n\n
              describe(chn2)
              \n
              2-element Vector{ChainDataFrame}:\n Summary Statistics (19 x 8)\n Quantiles (19 x 6)
              \n\n
              pairplot(chn2[[:β, :γ, :S₀, Symbol(mdl_prefix * \".σ_AR\"),\n    Symbol(mdl_prefix * \".ar_init[1]\"), Symbol(mdl_prefix * \".damp_AR[1]\")]])
              \n\n\n
              let\n    vars = mapreduce(vcat, 1:13) do i\n        Symbol(mdl_prefix * \".ϵ_t[$i]\")\n    end\n    pairplot(chn2[vars])\nend
              \n\n\n
              let\n    gens = generated_quantities(stochastic_uncond_mdl, chn2)\n    plot_predYt(data, gens;\n        title = \"Fitted stochastic model\",\n        ylabel = \"Number of Infected students\"\n    )\nend
              \n\n\n","category":"page"},{"location":"showcase/replications/chatzilena-2019/","page":"Statistical inference for ODE-based infectious disease models","title":"Statistical inference for ODE-based infectious disease models","text":"EditURL = \"https://github.com/CDCgov/Rt-without-renewal/blob/main/docs/src/showcase/replications/chatzilena-2019/index.jl\"","category":"page"},{"location":"getting-started/explainers/latent-models/#Latent-models","page":"Latent models","title":"Latent models","text":"","category":"section"},{"location":"lib/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiAware.jl's internal interface.","category":"page"},{"location":"lib/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware]\nPublic = false","category":"page"},{"location":"getting-started/explainers/intro/#Introduction","page":"Introduction to EpiAware","title":"Introduction","text":"","category":"section"},{"location":"getting-started/explainers/intro/","page":"Introduction to EpiAware","title":"Introduction to EpiAware","text":"The diagram below shows the relationship between the modules in the package for a typical workflow.","category":"page"},{"location":"getting-started/explainers/intro/","page":"Introduction to EpiAware","title":"Introduction to EpiAware","text":"flowchart LR\n\nA[\"Underlying GI\nBijector\"]\n\nEpiModel[\"AbstractTuringEpiModel\n----------------------\nChoice of target\nfor latent process:\n\nDirectInfections\n ExpGrowthRate\n Renewal\"]\n\nInitModel[\"Priors for\ninitial scale of incidence\"]\n\nDataW[Data wrangling and QC]\n\n\nObsData[\"Observational Data\n---------------------\nObs. cases y_t\"]\n\nLatentProcPriors[\"Latent process priors\"]\n\nLatentProc[\"AbstractTuringLatentModel\n---------------------\nRandomWalk\"]\n\nObsModelPriors[\"Observation model priors\nchoice of delayed obs. 
model\"]\n\nObsModel[\"AbstractObservationModel\n---------------------\nDelayObservations\"]\n\nE[\"Turing model constructor\n---------------------\ngenerate_epiaware\"]\n\nG[Posterior draws]\nH[Posterior checking]\nI[Post-processing]\n\n\n\nA --> EpiData\nEpiData --> EpiModel\nInitModel --> EpiModel\nEpiModel -->E\nObsData-->E\nDataW-.->ObsData\nLatentProcPriors-->LatentProc\nLatentProc-->E\nObsModelPriors-->ObsModel\nObsModel-->E\n\n\nE-->|sample...NUTS...| G\nG-->H\nH-->I","category":"page"},{"location":"lib/EpiAwareBase/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiAwareBase.jl's internal interface.","category":"page"},{"location":"lib/EpiAwareBase/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/#Contents-2","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareBase/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiAwareBase/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiAwareBase]\nPublic = false","category":"page"},{"location":"lib/EpiInfModels/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiInfModels.jl's internal interface.","category":"page"},{"location":"lib/EpiInfModels/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInfModels/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiInfModels/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiInfModels/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiInfModels]\nPublic = false","category":"page"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.AbstractConstantRenewalStep","page":"Internal API","title":"EpiAware.EpiInfModels.AbstractConstantRenewalStep","text":"abstract type AbstractConstantRenewalStep <: AbstractAccumulationStep\n\nAbstract type representing an accumulation iteration/step for a Renewal model with a constant generation interval.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalStep","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalStep","text":"struct ConstantRenewalStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep\n\nThe renewal process iteration/step function struct with constant 
generation interval.\n\nNote that the generation interval is stored in reverse order.\n\n\n\nFields\n\nrev_gen_int::Vector\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalStep","text":"function (recurrent_step::ConstantRenewalStep)(recent_incidence, Rt)\n\nImplement the Renewal model iteration/step function, with constant generation interval.\n\nMathematical specification\n\nThe new incidence is given by\n\nI_t = R_t sum_i=1^n-1 I_t-i g_i\n\nwhere I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.\n\nArguments\n\nrecent_incidence: Array of recent incidence values order least recent to most recent.\nRt: Reproduction number.\n\nReturns\n\nUpdated incidence array.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep","text":"struct ConstantRenewalWithPopulationStep{T} <: EpiAware.EpiInfModels.AbstractConstantRenewalStep\n\nThe renewal process iteration/step function struct with constant generation interval and a fixed population size.\n\nNote that the generation interval is stored in reverse order.\n\n\n\nFields\n\nrev_gen_int::Vector\npop_size::Any\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep","text":"function (recurrent_step::ConstantRenewalWithPopulationStep)(recent_incidence_and_available_sus, Rt)\n\nCallable on a RenewalWithPopulation struct for compute new incidence based on recent incidence, Rt and depletion of susceptibles.\n\nMathematical specification\n\nThe new incidence is given by\n\nI_t = S_t-1 N R_t sum_i=1^n-1 I_t-i g_i\n\nwhere I_t is the new incidence, R_t is the reproduction number, I_{t-i} is the recent incidence and g_i is the generation interval.\n\nArguments\n\nrecent_incidence_and_available_sus: A tuple with an array of recent incidence\n\nvalues and the remaining susceptible/available individuals.\n\nRt: Reproduction number.\n\nReturns\n\nVector containing the updated incidence array and the new recent_incidence_and_available_sus\n\nvalue.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{AbstractTuringRenewal, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::AbstractTuringRenewal,\n _Rt\n) -> Any\n\n\nImplement the generate_latent_infs function for the Renewal model.\n\nExample usage with Renewal type of model for unobserved infection process\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an Renewal struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an Renewal model\nrenewal_model = Renewal(data; initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of renewal_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(renewal_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{DirectInfections, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::DirectInfections,\n Z_t\n) -> Any\n\n\nImplement the generate_latent_infs function for the DirectInfections model.\n\nExample usage with DirectInfections type of model for unobserved infection process\n\nFirst, we construct a DirectInfections struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create a DirectInfections model\ndirect_inf_model = DirectInfections(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100)\nlatent_inf = generate_latent_infs(direct_inf_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{ExpGrowthRate, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(epi_model::ExpGrowthRate, rt) -> Any\n\n\nImplement the generate_latent_infs function for the ExpGrowthRate model.\n\nExample usage with ExpGrowthRate type of model for unobserved infection process\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an ExpGrowthRate model\nexp_growth_model = ExpGrowthRate(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(exp_growth_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{ODEProcess, ODEParams}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::ODEProcess,\n params::ODEParams\n) -> Any\n\n\nImplement the generate_latent_infs function for the ODEProcess model.\n\nConstructs a Turing model to generate latent infections using the specified epidemiological model and parameters.\n\nArguments\n\nepi_model::ODEProcess: The ODEProcess model containing the problem definition, time steps, solver, and solution-to-infections transformation function.\nparams::ODEParams: The initial conditions (u0) and parameters (p) for the ODE problem.\n\nGenerated quantities\n\nI_t: The latent infections generated by solving the ODE problem with the specified parameters.\n\nDetails\n\nThis function remakes the ODE problem with the provided initial conditions and parameters, solves it using the specified solver, and then transforms the solution into latent infections using the sol2infs function.\n\nExample usage\n\nusing EpiAware, OrdinaryDiffEq\nr = log(2) / 7 # Growth rate corresponding to 7 day doubling time\nu0 = [1.0]\np = [r]\nparams = ODEParams(u0 = u0, p = p)\n\n# Define the ODE problem using SciML\n# We use a simple exponential growth model\n\nfunction expgrowth(du, u, p, t)\n du[1] = p[1] * u[1]\nend\nprob = ODEProblem(expgrowth, u0, (0.0, 10.0), p)\n\n# Define the ODEProcess\n\nexpgrowth_model = ODEProcess(prob::ODEProblem; ts = 0:1:10,\n solver = Tsit5(),\n sol2infs = sol -> sol[1, :])\n\n# Generate the latent infections\nI_t = generate_latent_infs(expgrowth_model, params)()\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareUtils.get_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalStep, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::EpiAware.EpiInfModels.ConstantRenewalStep,\n initial_state,\n state\n) -> Any\n\n\nMethod to get the state of the accumulation for a ConstantRenewalStep object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiAwareUtils.get_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep,\n initial_state,\n state\n) -> 
Any\n\n\nMethod to get the state of the accumulation for a ConstantRenewalWithPopulationStep object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.make_renewal_init-Tuple{Renewal, Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.make_renewal_init","text":"make_renewal_init(epi_model::Renewal, I₀, Rt₀) -> Any\n\n\nCreate the initial state of the Renewal model.\n\nArguments\n\nepi_model::Renewal: The Renewal model.\nI₀: The initial number of infected individuals.\nRt₀: The initial time-varying reproduction number.\n\nReturns\n\nThe initial vector of infected individuals.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.neg_MGF-Tuple{Any, AbstractVector}","page":"Internal API","title":"EpiAware.EpiInfModels.neg_MGF","text":"neg_MGF(r, w::AbstractVector) -> Any\n\n\nCompute the negative moment generating function (MGF) for a given rate r and weights w.\n\nArguments\n\nr: The rate parameter.\nw: An abstract vector of weights.\n\nReturns\n\nThe value of the negative MGF.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.oneexpy-Tuple{T} where T","page":"Internal API","title":"EpiAware.EpiInfModels.oneexpy","text":"oneexpy(y) -> Any\n\n\nVersion of LogExpFunctions.xexpy that takes a single argument y and returns exp(y).\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.renewal_init_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalStep, Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.renewal_init_state","text":"renewal_init_state(\n recurrent_step::EpiAware.EpiInfModels.ConstantRenewalStep,\n I₀,\n r_approx,\n len_gen_int\n) -> Any\n\n\nConstructs the initial conditions for a renewal model with ConstantRenewalStep type of step function.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/internals/#EpiAware.EpiInfModels.renewal_init_state-Tuple{EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep, Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiInfModels.renewal_init_state","text":"renewal_init_state(\n recurrent_step::EpiAware.EpiInfModels.ConstantRenewalWithPopulationStep,\n I₀,\n r_approx,\n len_gen_int\n) -> Any\n\n\nConstructs the initial conditions for a renewal model with ConstantRenewalWithPopulationStep type of step function.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/interfaces/#Interfaces","page":"Interfaces","title":"Interfaces","text":"","category":"section"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"We support two primary workflows for using the package:","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.\nTuring interface: A lower-level interface for defining and fitting models to data. 
This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"See the getting started section for tutorials on each of these workflows.","category":"page"},{"location":"getting-started/explainers/interfaces/#EpiProblem","page":"Interfaces","title":"EpiProblem","text":"","category":"section"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"Each module of the overall epidemiological model we are interested in is a Turing Model in its own right. In this section, we compose the individual models into the full epidemiological model using the EpiProblem struct.","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"The constructor for an EpiProblem requires:","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"An epi_model.\nA latent_model.\nAn observation_model.\nA tspan.","category":"page"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"The tspan set the range of the time index for the models.","category":"page"},{"location":"getting-started/explainers/interfaces/#Turing-interface","page":"Interfaces","title":"Turing interface","text":"","category":"section"},{"location":"getting-started/explainers/interfaces/","page":"Interfaces","title":"Interfaces","text":"The Turing interface is a lower-level interface for defining and fitting models to data. This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.","category":"page"},{"location":"getting-started/tutorials/#Tutorials","page":"Overview","title":"Tutorials","text":"","category":"section"},{"location":"getting-started/tutorials/","page":"Overview","title":"Overview","text":"This section contains tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of tutorials.","category":"page"},{"location":"overview/#overview","page":"Overview","title":"Overview of the EpiAware Software Ecosystem","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiAware is not a standard toolkit for infectious disease modelling.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"It seeks to be highly modular and composable for advanced users whilst still providing opinionated workflows for those who are new to the field. 
Developed by the authors behind other widely used infectious disease modelling packages such as EpiNow2, epinowcast, and epidist, alongside experts in infectious disease modelling in Julia,EpiAware is designed to go beyond the capabilities of these packages by providing a more flexible and extensible framework for modelling and inference of infectious disease dynamics.","category":"page"},{"location":"overview/#Package-Features","page":"Overview","title":"Package Features","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"Flexible: The package is designed to be flexible and extensible, and to provide a consistent interface for fitting and simulating models.\nModular: The package is designed to be modular, with a clear separation between the model and the data.\nExtensible: The package is designed to be extensible, with a clear separation between the model and the data.\nConsistent: The package is designed to provide a consistent interface for fitting and simulating models.\nEfficient: The package is designed to be efficient, with a clear separation between the model and the data.","category":"page"},{"location":"overview/#Package-structure","page":"Overview","title":"Package structure","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiAware.jl is a wrapper around a series of submodules, each of which provides a different aspect of the package's functionality (much like the tidyverse in R). The package is designed to be modular, with a clear separation between modules and between modules and data. Currently included modules are:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiAwareBase: The core module, which provides the underlying abstract types and functions for the package.\nEpiAwareUtils: A utility module, which provides a series of utility functions for working with the package.\nEpiInference: An inference module, which provides a series of functions for fitting models to data. Builds on top of Turing.jl.\nEpiInfModels: Provides tools for composing models of the disease transmission process. Builds on top of Turing.jl, in particular the DynamicPPL.jl interface.\nEpiLatentModels: Provides tools for composing latent models such as random walks, autoregressive models, etc. Builds on top of DynamicPPL.jl. Used by all other modelling modules to define latent processes.\nEpiObsModels: Provides tools for composing observation models, such as Poisson, Binomial, etc. Builds on top of DynamicPPL.jl.","category":"page"},{"location":"overview/#Using-the-package","page":"Overview","title":"Using the package","text":"","category":"section"},{"location":"overview/","page":"Overview","title":"Overview","text":"We support two primary workflows for using the package:","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"EpiProblem: A high-level interface for defining and fitting models to data. This is the recommended way to use the package.\nTuring interface: A lower-level interface for defining and fitting models to data. 
This is the more flexible way to use the package and may also be more familiar to users of Turing.jl.","category":"page"},{"location":"overview/","page":"Overview","title":"Overview","text":"See the getting started section for tutorials on each of these workflows.","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiAwareUtils.jl's internal interface.","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiAwareUtils/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiAwareUtils/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiAwareUtils]\nPublic = false","category":"page"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::DirectSample;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::DirectSample,\n prev_result;\n kwargs...\n) -> Any\n\n\nImplements direct sampling from a Turing model.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method-2","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::AbstractEpiMethod;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::AbstractEpiMethod,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply the inference/generative method method to the Model object mdl.\n\nArguments\n\nmodel::AbstractEpiModel: The model to apply the method to.\nmethod::AbstractEpiMethod: The epidemiological method to apply.\nprev_result: The previous result of the method.\nkwargs: Additional keyword arguments passed to the method.\n\nReturns\n\nnothing: If no concrete implementation is defined for the given method.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method-Tuple{DynamicPPL.Model, EpiMethod, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::EpiMethod,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply steps defined by an EpiMethod to a model object.\n\nThis function applies the steps defined by an EpiMethod object to a Model object. It iterates over the pre-sampler steps defined in the EpiMethod object and recursively applies them to the model. Finally, it applies the sampler step defined in the EpiMethod object to the model. 
The prev_result argument is used to pass the result obtained from applying the previous steps, if any.\n\nArguments\n\nmethod::EpiMethod: The EpiMethod object containing the steps to be applied.\nmodel::Model: The model object to which the steps will be applied.\nprev_result: The previous result obtained from applying the steps. Defaults to nothing.\nkwargs...: Additional keyword arguments that can be passed to the steps.\n\nReturns\n\nprev_result: The result obtained after applying the steps.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase._apply_method-Tuple{DynamicPPL.Model, EpiMethod}","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::EpiMethod;\n kwargs...\n) -> Any\n\n\nApply a method to a mode without previous results\n\nArguments\n\nmodel::Model: The model to apply the method to.\nmethod::EpiMethod: The method to apply.\nkwargs...: Additional keyword arguments.\n\nReturns\n\nThe result of applying the method to the model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase.condition_model-Tuple{DynamicPPL.Model, NamedTuple, NamedTuple}","page":"Internal API","title":"EpiAware.EpiAwareBase.condition_model","text":"condition_model(\n model::DynamicPPL.Model,\n fix_parameters::NamedTuple,\n condition_parameters::NamedTuple\n) -> Any\n\n\nApply the condition to the model by fixing the specified parameters and conditioning on the others.\n\nArguments\n\nmodel::Model: The model to be conditioned.\nfix_parameters::NamedTuple: The parameters to be fixed.\ncondition_parameters::NamedTuple: The parameters to be conditioned on.\n\nReturns\n\n_model: The conditioned model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase.generate_epiaware-Tuple{Any, Any, AbstractTuringEpiModel}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_epiaware","text":"generate_epiaware(\n y_t,\n time_steps,\n epi_model::AbstractTuringEpiModel;\n latent_model,\n observation_model\n)\n\n\nGenerate an epi-aware model given the observed data and model specifications.\n\nArguments\n\ny_t: Observed data.\ntime_steps: Number of time steps.\nepi_model: A Turing Epi model specification.\nlatent_model: A Turing Latent model specification.\nobservation_model: A Turing Observation model specification.\n\nReturns\n\nA DynamicPPPL.Model object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareBase.generated_observables-Tuple{DynamicPPL.Model, Any, Union{NamedTuple, MCMCChains.Chains}}","page":"Internal API","title":"EpiAware.EpiAwareBase.generated_observables","text":"generated_observables(\n model::DynamicPPL.Model,\n data,\n solution::Union{NamedTuple, MCMCChains.Chains}\n) -> EpiAwareObservables\n\n\nGenerate observables from a given model and solution including generated quantities.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareUtils._apply_direct_sample-Tuple{Any, Any, Int64}","page":"Internal API","title":"EpiAware.EpiAwareUtils._apply_direct_sample","text":"_apply_direct_sample(\n model,\n method,\n n_samples::Int64;\n kwargs...\n) -> Any\n\n\nSample the model directly using Turing.Prior() and a NamedTuple of the sampled random variables along with generated quantities.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareUtils._apply_direct_sample-Tuple{Any, Any, 
Nothing}","page":"Internal API","title":"EpiAware.EpiAwareUtils._apply_direct_sample","text":"_apply_direct_sample(\n model,\n method,\n n_samples::Nothing\n) -> Any\n\n\nSample the model directly using rand and return a single set of sampled random variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/internals/#EpiAware.EpiAwareUtils._check_and_give_ts-Tuple{Distributions.Distribution, Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils._check_and_give_ts","text":"_check_and_give_ts(\n dist::Distributions.Distribution,\n Δd,\n D,\n upper\n) -> Any\n\n\nInternal function to check censored_pmf arguments and return the time steps of the rightmost limits of the censor intervals.\n\n\n\n\n\n","category":"method"},{"location":"#EpiAware.jl","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"EpiAware.jl","text":"","category":"section"},{"location":"","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"EpiAware.jl: Real-time infectious disease monitoring","text":"Infectious disease situational awareness modelling toolkit for Julia.","category":"page"},{"location":"#Where-to-start","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"Where to start","text":"","category":"section"},{"location":"","page":"EpiAware.jl: Real-time infectious disease monitoring","title":"EpiAware.jl: Real-time infectious disease monitoring","text":"Want to get started running code? Check out the Getting Started Tutorials.\nWhat is EpiAware? Check out our Overview.\nWant to see some end-to-end examples? Check out our EpiAware showcase.\nWant to understand the API? Check out our API Reference.\nWant to chat with someone about EpiAware? Post on our GitHub Discussions.\nWant to contribute to EpiAware? Check out our Developer documentation.\nWant to see our code? Check out our GitHub Repository.","category":"page"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"\n\n\n\n\n

              Fitting distributions using EpiAware and Turing PPL

              Introduction

              What are we going to do in this Vignette

              In this vignette, we'll demonstrate how to use EpiAwareUtils.∫F, the CDF function for censored delay distributions that underlies EpiAwareUtils.censored_pmf, in conjunction with the Turing PPL for Bayesian inference of epidemiological delay distributions. We'll cover the following key points:

              1. Simulating censored delay distribution data

              2. Fitting a naive model using Turing

              3. Evaluating the naive model's performance

              4. Fitting an improved model using censored delay functionality from EpiAware.

              5. Comparing the censored delay model's performance to the naive model

              What might I need to know before starting

              This note builds on the concepts introduced in the R/stan package primarycensoreddist, especially the Fitting distributions using primarycensoreddist and cmdstan vignette, and assumes familiarity with using Turing tools as covered in the Turing documentation.

              This note is generated using the EpiAware package locally via Pkg.develop, in the EpiAware/docs environment. It is also possible to install EpiAware using

              Pkg.add(url=\"https://github.com/CDCgov/Rt-without-renewal\", subdir=\"EpiAware\")

              Packages used in this vignette

              As well as EpiAware and Turing, we will use Makie ecosystem packages for plotting and DataFramesMeta for data manipulation.

              \n\n
              let\n    docs_dir = dirname(dirname(dirname(@__DIR__)))\n    using Pkg: Pkg\n    Pkg.activate(docs_dir)\n    Pkg.instantiate()\nend
              \n\n\n\n

              The other dependencies are as follows:

              \n\n
              begin\n    using EpiAware.EpiAwareUtils: censored_pmf, censored_cdf, ∫F\n    using Random, Distributions, StatsBase #utilities for random events\n    using DataFramesMeta #Data wrangling\n    using CairoMakie, PairPlots #plotting\n    using Turing #PPL\nend
              \n\n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/#Simulating-censored-and-truncated-delay-distribution-data","page":"Fitting distributions with censored data","title":"Simulating censored and truncated delay distribution data","text":"","category":"section"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"
              \n

              We'll start by simulating some censored and truncated delay distribution data. We’ll define a rpcens function for generating data.

              \n\n
              Random.seed!(123) # For reproducibility
              \n
              TaskLocalRNG()
              \n\n\n

              Define the true distribution parameters

              \n\n
              n = 2000
              \n
              2000
              \n\n
              meanlog = 1.5
              \n
              1.5
              \n\n
              sdlog = 0.75
              \n
              0.75
              \n\n
              true_dist = LogNormal(meanlog, sdlog)
              \n
              Distributions.LogNormal{Float64}(μ=1.5, σ=0.75)
              \n\n\n

              Generate varying pwindow, swindow, and obs_time lengths

              \n\n
              pwindows = rand(1:2, n)
              \n
              2000-element Vector{Int64}:\n 2\n 2\n 2\n 1\n 2\n 1\n 1\n ⋮\n 1\n 2\n 1\n 2\n 2\n 2
              \n\n
              swindows = rand(1:2, n)
              \n
              2000-element Vector{Int64}:\n 1\n 2\n 2\n 1\n 2\n 1\n 1\n ⋮\n 2\n 2\n 2\n 1\n 1\n 2
              \n\n
              obs_times = rand(8:10, n)
              \n
              2000-element Vector{Int64}:\n 10\n  9\n  9\n 10\n  9\n  8\n  8\n  ⋮\n  8\n  9\n  9\n 10\n  8\n  8
              \n\n\n

              We recreate the primary censored sampling function from primarycensoreddist; cf. the documentation here.

              \n\n
              \"\"\"\n    function rpcens(dist; pwindow = 1, swindow = 1, D = Inf, max_tries = 1000)\n\nDoes a truncated censored sample from `dist` with a uniform primary time on `[0, pwindow]`.\n\"\"\"\nfunction rpcens(dist; pwindow = 1, swindow = 1, D = Inf, max_tries = 1000)\n    T = zero(eltype(dist))\n    invalid_sample = true\n    attempts = 1\n    while (invalid_sample && attempts <= max_tries)\n        X = rand(dist)\n        U = rand() * pwindow\n        T = X + U\n        attempts += 1\n        if X + U < D\n            invalid_sample = false\n        end\n    end\n\n    @assert !invalid_sample \"censored value not found in $max_tries attempts\"\n\n    return (T ÷ swindow) * swindow\nend
              \n\n\n
              #Sample secondary time relative to beginning of primary censor window respecting the right-truncation\nsamples = map(pwindows, swindows, obs_times) do pw, sw, ot\n    rpcens(true_dist; pwindow = pw, swindow = sw, D = ot)\nend
              \n
              2000-element Vector{Float64}:\n 4.0\n 2.0\n 2.0\n 2.0\n 4.0\n 3.0\n 6.0\n ⋮\n 4.0\n 6.0\n 2.0\n 6.0\n 4.0\n 4.0
              \n\n\n

              Aggregate to unique combinations and count occurrences

              \n\n
              delay_counts = mapreduce(vcat, pwindows, swindows, obs_times, samples) do pw, sw, ot, s\n    DataFrame(\n        pwindow = pw,\n        swindow = sw,\n        obs_time = ot,\n        observed_delay = s,\n        observed_delay_upper = s + sw\n    )\nend |>\n               df -> @groupby(df, :pwindow, :swindow, :obs_time, :observed_delay,\n    :observed_delay_upper) |>\n                     gd -> @combine(gd, :n=length(:pwindow))
              \n
                   pwindow  swindow  obs_time  observed_delay  observed_delay_upper    n
                1        1        1         8             0.0                   1.0    1
                2        1        1         8             1.0                   2.0   13
                3        1        1         8             2.0                   3.0   32
                4        1        1         8             3.0                   4.0   29
                5        1        1         8             4.0                   5.0   34
                6        1        1         8             5.0                   6.0   26
                7        1        1         8             6.0                   7.0   19
                8        1        1         8             7.0                   8.0   14
                9        1        1         9             0.0                   1.0    2
               10        1        1         9             1.0                   2.0    5
               ...
               80        2        2        10             8.0                  10.0   22
              \n\n\n

              Compare the samples with and without secondary censoring to the true distribution and calculate the empirical CDF

              \n\n
              empirical_cdf = ecdf(samples)
              \n
              ECDF{Vector{Float64}, Weights{Float64, Float64, Vector{Float64}}}([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0  …  9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0, 9.0], Float64[])
              \n\n
              empirical_cdf_obs = ecdf(delay_counts.observed_delay, weights = delay_counts.n)
              \n
              ECDF{Vector{Float64}, Weights{Int64, Int64, Vector{Int64}}}([0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0  …  8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 8.0, 9.0, 9.0], [1, 2, 2, 13, 16, 21, 1, 13, 13, 9  …  9, 10, 9, 13, 17, 15, 12, 22, 8, 7])
              \n\n
              x_seq = range(minimum(samples), maximum(samples), 100)
              \n
              0.0:0.09090909090909091:9.0
              \n\n
              theoretical_cdf = x_seq |> x -> cdf(true_dist, x)
              \n
              100-element Vector{Float64}:\n 0.0\n 1.011597608751049e-7\n 9.643132895117507e-6\n 9.484054524759167e-5\n 0.0004058100212574347\n 0.0011393531997368723\n 0.0024911102275376566\n ⋮\n 0.8052522612515658\n 0.8091156793117527\n 0.8128920005554523\n 0.8165833494282897\n 0.8201917991805499\n 0.8237193727611859
              \n\n
              let\n    f = Figure()\n    ax = Axis(f[1, 1],\n        title = \"Comparison of Observed vs Theoretical CDF\",\n        ylabel = \"Cumulative Probability\",\n        xlabel = \"Delay\"\n    )\n    lines!(\n        ax, x_seq, empirical_cdf_obs, label = \"Empirical CDF\", color = :blue, linewidth = 2)\n    lines!(ax, x_seq, theoretical_cdf, label = \"Theoretical CDF\",\n        color = :black, linewidth = 2)\n    vlines!(ax, [mean(samples)], color = :blue, linestyle = :dash,\n        label = \"Empirical mean\", linewidth = 2)\n    vlines!(ax, [mean(true_dist)], linestyle = :dash,\n        label = \"Theoretical mean\", color = :black, linewidth = 2)\n    axislegend(position = :rb)\n\n    f\nend
              \n\n\n\n

              We've aggregated the data to unique combinations of pwindow, swindow, and obs_time and counted the number of occurrences of each observed_delay for each combination. This is the data we will use to fit our model.

              \n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/#Fitting-a-naive-model-using-Turing","page":"Fitting distributions with censored data","title":"Fitting a naive model using Turing","text":"","category":"section"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"
              \n

              We'll start by fitting a naive model using NUTS from Turing. We define the model in the Turing PPL.

              \n\n
              @model function naive_model(N, y, n)\n    mu ~ Normal(1.0, 1.0)\n    sigma ~ truncated(Normal(0.5, 1.0); lower = 0.0)\n    d = LogNormal(mu, sigma)\n\n    for i in eachindex(y)\n        Turing.@addlogprob! n[i] * logpdf(d, y[i])\n    end\nend
              \n
              naive_model (generic function with 2 methods)
              \n\n\n

              Now let's instantiate this model with data.

              \n\n
              naive_mdl = naive_model(\n    size(delay_counts, 1),\n    delay_counts.observed_delay .+ 1e-6, # Add a small constant to avoid log(0)\n    delay_counts.n)
              \n
              DynamicPPL.Model{typeof(naive_model), (:N, :y, :n), (), (), Tuple{Int64, Vector{Float64}, Vector{Int64}}, Tuple{}, DynamicPPL.DefaultContext}(Main.var\"workspace#5\".naive_model, (N = 80, y = [1.0e-6, 1.000001, 2.000001, 3.000001, 4.000001, 5.000001, 6.000001, 7.000001, 1.0e-6, 1.000001  …  1.0e-6, 2.000001, 4.000001, 6.000001, 8.000001, 1.0e-6, 2.000001, 4.000001, 6.000001, 8.000001], n = [1, 13, 32, 29, 34, 26, 19, 14, 2, 5  …  13, 69, 59, 30, 12, 9, 69, 48, 29, 22]), NamedTuple(), DynamicPPL.DefaultContext())
              \n\n\n

              Now let's fit the compiled model.

              \n\n
              naive_fit = sample(naive_mdl, NUTS(), MCMCThreads(), 500, 4)
              \n
                   iteration  chain        mu    sigma        lp  n_steps  is_accept  acceptance_rate  ...
                1        251      1  0.569828  3.16687  -6326.42      3.0        1.0         0.889234
                2        252      1  0.51554   3.21602  -6327.17      3.0        1.0         0.748919
                3        253      1  0.699124  3.15787  -6327.72      7.0        1.0         0.895135
                4        254      1  0.523035  3.16482  -6326.8       3.0        1.0         0.728358
                5        255      1  0.497583  3.17839  -6327.16      3.0        1.0         0.840027
                6        256      1  0.548956  3.15474  -6326.61      3.0        1.0         0.792753
                7        257      1  0.569335  3.16453  -6326.43      3.0        1.0         1.0
                8        258      1  0.606234  3.2018   -6326.53      3.0        1.0         0.977859
                9        259      1  0.530658  3.17006  -6326.69      3.0        1.0         0.900204
               10        260      1  0.627117  3.11173  -6327.41      7.0        1.0         0.761508
               ...
              \n\n
              summarize(naive_fit)
              \n
                  parameters      mean        std        mcse  ess_bulk  ess_tail     rhat  ess_per_sec
               1         :mu  0.584129  0.0704067  0.00151449   2153.02   1487.54  1.00003      310.726
               2      :sigma  3.17766   0.0495775  0.00113542   1905.04   1306.94  1.00127      274.937
              \n\n
              let\n    f = pairplot(naive_fit)\n    vlines!(f[1, 1], [meanlog], linewidth = 4)\n    vlines!(f[2, 2], [sdlog], linewidth = 4)\n    f\nend
              \n\n\n\n

              We see that the model has converged and the diagnostics look good. However, the posterior summary alone shows that the fit is poor: the posterior mean of mu is well below the target value of 1.5 and the posterior mean of sigma is well above the target value of 0.75.
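              As a quick numerical check of this bias, we can compare the posterior means with the values used to simulate the data (a minimal sketch, assuming the usual MCMCChains behaviour of indexing a fitted chain by parameter symbol, e.g. naive_fit[:mu]):

              using Statistics # for mean

              # Posterior means of the naive fit, pooled over all chains
              naive_mu = mean(naive_fit[:mu])
              naive_sigma = mean(naive_fit[:sigma])

              # Compare with the simulation targets defined above
              println("mu:    posterior mean ≈ ", round(naive_mu, digits = 2), " vs true ", meanlog)
              println("sigma: posterior mean ≈ ", round(naive_sigma, digits = 2), " vs true ", sdlog)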

              \n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/#Fitting-an-improved-model-using-censoring-utilities","page":"Fitting distributions with censored data","title":"Fitting an improved model using censoring utilities","text":"","category":"section"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"
              \n

              We'll now fit an improved model using the ∫F function from EpiAware.EpiAwareUtils, which calculates the CDF of the total delay from the beginning of the primary window to the secondary event time. This total delay includes both the delay distribution we are making inference on and the time between the start of the primary censor window and the primary event. The ∫F function underlies the censored_pmf function from the EpiAware.EpiAwareUtils submodule.

              Using the ∫F function we can write a log-pmf function primary_censored_dist_lpmf that accounts for both the primary and secondary censoring windows and for right truncation at the maximum observation time D.

              This is the analogue of the function of the same name in primarycensoreddist: it calculates the log-probability of the secondary event occurring in the secondary censoring window, conditional on the primary event occurring in the primary censoring window, by taking the increase in the CDF over the secondary window and rescaling by the probability of the secondary event occurring within the maximum observation time D.
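              In symbols, writing F(t, p_w) for ∫F(dist, t, p_w) (the CDF of the total delay T evaluated at t for a primary censoring window of width p_w), the quantity computed below is

              \log P(y \le T < y_{\mathrm{upper}} \mid T < D) = \log\bigl(F(y_{\mathrm{upper}}, p_w) - F(y, p_w)\bigr) - \log F(D, p_w)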

              \n\n
              function primary_censored_dist_lpmf(dist, y, pwindow, y_upper, D)\n    if y == 0.0\n        return log(∫F(dist, y_upper, pwindow)) - log(∫F(dist, D, pwindow))\n    else\n        return log(∫F(dist, y_upper, pwindow) - ∫F(dist, y, pwindow)) -\n               log(∫F(dist, D, pwindow))\n    end\nend
              \n
              primary_censored_dist_lpmf (generic function with 1 method)
              \n\n\n

              We make a new Turing model that now uses primary_censored_dist_lpmf rather than the naive uncensored and untruncated logpdf.

              \n\n
              @model function primarycensoreddist_model(y, y_upper, n, pws, Ds)\n    mu ~ Normal(1.0, 1.0)\n    sigma ~ truncated(Normal(0.5, 0.5); lower = 0.0)\n    dist = LogNormal(mu, sigma)\n\n    for i in eachindex(y)\n        Turing.@addlogprob! n[i] * primary_censored_dist_lpmf(\n            dist, y[i], pws[i], y_upper[i], Ds[i])\n    end\nend
              \n
              primarycensoreddist_model (generic function with 2 methods)
              \n\n\n

              Let's instantiate this model with data.

              \n\n
              primarycensoreddist_mdl = primarycensoreddist_model(\n    delay_counts.observed_delay,\n    delay_counts.observed_delay_upper,\n    delay_counts.n,\n    delay_counts.pwindow,\n    delay_counts.obs_time\n)
              \n
              DynamicPPL.Model{typeof(primarycensoreddist_model), (:y, :y_upper, :n, :pws, :Ds), (), (), Tuple{Vector{Float64}, Vector{Float64}, Vector{Int64}, Vector{Int64}, Vector{Int64}}, Tuple{}, DynamicPPL.DefaultContext}(Main.var\"workspace#5\".primarycensoreddist_model, (y = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 0.0, 1.0  …  0.0, 2.0, 4.0, 6.0, 8.0, 0.0, 2.0, 4.0, 6.0, 8.0], y_upper = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 1.0, 2.0  …  2.0, 4.0, 6.0, 8.0, 10.0, 2.0, 4.0, 6.0, 8.0, 10.0], n = [1, 13, 32, 29, 34, 26, 19, 14, 2, 5  …  13, 69, 59, 30, 12, 9, 69, 48, 29, 22], pws = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1  …  2, 2, 2, 2, 2, 2, 2, 2, 2, 2], Ds = [8, 8, 8, 8, 8, 8, 8, 8, 9, 9  …  9, 9, 9, 9, 9, 10, 10, 10, 10, 10]), NamedTuple(), DynamicPPL.DefaultContext())
              \n\n\n

              Now let’s fit the compiled model.

              \n\n
              primarycensoreddist_fit = sample(\n    primarycensoreddist_mdl, NUTS(), MCMCThreads(), 1000, 4)
              \n
                   iteration  chain       mu     sigma        lp  n_steps  is_accept  acceptance_rate  ...
                1        501      1  1.46819  0.771804  -3376.41      3.0        1.0         1.0
                2        502      1  1.46877  0.738944  -3375.1       3.0        1.0         0.999202
                3        503      1  1.49735  0.740651  -3376.39      3.0        1.0         0.741842
                4        504      1  1.47618  0.762895  -3375.61      3.0        1.0         0.983406
                5        505      1  1.48132  0.74067   -3375.47      3.0        1.0         0.852127
                6        506      1  1.40746  0.711968  -3375.63      7.0        1.0         0.914995
                7        507      1  1.44329  0.747661  -3375.55      7.0        1.0         0.89498
                8        508      1  1.43568  0.734698  -3375.17      3.0        1.0         0.977821
                9        509      1  1.42456  0.696408  -3375.79      3.0        1.0         0.941795
               10        510      1  1.46966  0.758485  -3375.46      5.0        1.0         1.0
               ...
              \n\n
              summarize(primarycensoreddist_fit)
              \n
                  parameters      mean        std        mcse  ess_bulk  ess_tail     rhat  ess_per_sec
               1         :mu  1.45168   0.0355514  0.00108636   1087.88   1420.31  1.00169      60.7926
               2      :sigma  0.733008  0.0274925  0.00082729   1110.27   1643.86  1.00173      62.0434
              \n\n
              let\n    f = pairplot(primarycensoreddist_fit)\n    CairoMakie.vlines!(f[1, 1], [meanlog], linewidth = 3)\n    CairoMakie.vlines!(f[2, 2], [sdlog], linewidth = 3)\n    f\nend
              \n\n\n\n

              We see that the model has converged and the diagnostics look good. The posterior means are now very close to the true parameters, and the 90% credible intervals contain them.
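              For example, the 90% equal-tailed credible intervals can be read off the pooled posterior draws and compared against the true values meanlog = 1.5 and sdlog = 0.75 (a minimal sketch, again assuming MCMCChains-style indexing of the chain by parameter symbol):

              using Statistics # for quantile

              # 90% equal-tailed credible intervals from the pooled posterior draws
              ci_mu = quantile(vec(primarycensoreddist_fit[:mu]), [0.05, 0.95])
              ci_sigma = quantile(vec(primarycensoreddist_fit[:sigma]), [0.05, 0.95])

              println("90% CrI for mu:    ", ci_mu, " (true value ", meanlog, ")")
              println("90% CrI for sigma: ", ci_sigma, " (true value ", sdlog, ")")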

              \n\n","category":"page"},{"location":"getting-started/tutorials/censored-obs/","page":"Fitting distributions with censored data","title":"Fitting distributions with censored data","text":"EditURL = \"https://github.com/CDCgov/Rt-without-renewal/blob/main/docs/src/getting-started/tutorials/censored-obs.jl\"","category":"page"},{"location":"lib/EpiInfModels/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Documentation for EpiInfModels.jl's public interface.","category":"page"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiInfModels/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInfModels/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiInfModels/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiInfModels/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiInfModels]\nPrivate = false","category":"page"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels","page":"Public API","title":"EpiAware.EpiInfModels","text":"Module for defining epidemiological models.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.DirectInfections","page":"Public API","title":"EpiAware.EpiInfModels.DirectInfections","text":"struct DirectInfections{S<:Distributions.Sampleable} <: AbstractTuringEpiModel\n\nModel unobserved/latent infections as a transformation on a sampled latent process.\n\nMathematical specification\n\nIf Z_t is a realisation of the latent model, then the unobserved/latent infections are given by\n\nI_t = g(hatI_0 + Z_t)\n\nwhere g is a transformation function and the unconstrained initial infections hatI_0 are sampled from a prior distribution.\n\nDirectInfections are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of hatI_0. The default initialisation_prior is Normal().\n\nConstructors\n\nDirectInfections(; data, initialisation_prior)\n\nExample usage with generate_latent_infs\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct a DirectInfections struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create a DirectInfections model\ndirect_inf_model = DirectInfections(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100)\nlatent_inf = generate_latent_infs(direct_inf_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\nFields\n\ndata::EpiData: Epidata object.\ninitialisation_prior::Distributions.Sampleable: Prior distribution for the initialisation of the infections. Default is Normal().\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.EpiData","page":"Public API","title":"EpiAware.EpiInfModels.EpiData","text":"struct EpiData{T<:Real, F<:Function}\n\nThe EpiData struct represents epidemiological data used in infectious disease modeling.\n\nConstructors\n\nEpiData(gen_int, transformation::Function). Constructs an EpiData object with discrete\n\ngeneration interval gen_int and transformation function transformation.\n\nEpiData(;gen_distribution::ContinuousDistribution, D_gen, Δd = 1.0, transformation::Function = exp).\n\nConstructs an EpiData object with double interval censoring discretisation of the continuous next generation interval distribution gen_distribution with additional right truncation at D_gen. Δd sets the interval width (default = 1.0). transformation sets the transformation function\n\nExamples\n\nConstruction direct from discrete generation interval and transformation function:\n\nusing EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\ndata = EpiData(gen_int, g)\n\nConstruction from continuous distribution for generation interval.\n\nusing Distributions\n\ngen_distribution = Uniform(0.0, 10.0)\n\ndata = EpiData(;gen_distribution\n D_gen = 10.0)\n\n\n\nFields\n\ngen_int::Vector{T} where T<:Real: Discrete generation interval.\nlen_gen_int::Integer: Length of the discrete generation interval.\ntransformation::Function: Transformation function defining constrained and unconstrained domain bijections.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.ExpGrowthRate","page":"Public API","title":"EpiAware.EpiInfModels.ExpGrowthRate","text":"struct ExpGrowthRate{S<:Distributions.Sampleable} <: AbstractTuringEpiModel\n\nModel unobserved/latent infections as due to time-varying exponential growth rate r_t which is generated by a latent process.\n\nMathematical specification\n\nIf Z_t is a realisation of the latent model, then the unobserved/latent infections are given by\n\nI_t = g(hatI_0) exp(Z_t)\n\nwhere g is a transformation function and the unconstrained initial infections hatI_0 are sampled from a prior distribution.\n\nExpGrowthRate are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of hatI_0. 
The default initialisation_prior is Normal().\n\nConstructor\n\nExpGrowthRate(; data, initialisation_prior).\n\nExample usage with generate_latent_infs\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an ExpGrowthRate struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an ExpGrowthRate model\nexp_growth_model = ExpGrowthRate(data = data, initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(exp_growth_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\nFields\n\ndata::EpiData\ninitialisation_prior::Distributions.Sampleable\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.ODEParams","page":"Public API","title":"EpiAware.EpiInfModels.ODEParams","text":"struct ODEParams{T}\n\nA structure to hold the initial condition and parameters for an ODE (Ordinary Differential Equation) process. params::ODEParams is used in the method generate_latent_infs(epi_model::ODEProcess, params::ODEParams)\n\nConstructors\n\nODEParams(; u0::VecOrMat, p::VecOrMat): Create an ODEParams object with the initial condition(s) u0 and parameters p.\n\nExample\n\nusing EpiAware\nparams = ODEParams(; u0 = ones(10), p = [2, 3])\n\n# output\n\nODEParams{Float64}([1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0], [2.0, 3.0])\n\n\n\nFields\n\nu0::VecOrMat: The initial condition(s) for the ODE, which can be a vector or matrix of type T.\np::VecOrMat: The parameters for the ODE, which can be a vector or matrix of type T.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.ODEProcess","page":"Public API","title":"EpiAware.EpiInfModels.ODEProcess","text":"struct ODEProcess{P<:SciMLBase.ODEProblem, T, S, F<:Function} <: AbstractTuringEpiModel\n\nA structure representing an infection process modeled by an Ordinary Differential Equation (ODE).\n\nBackground\n\nThe purpose of this structure is to define the behaviour of modelling an infection process using an ODE. We use the SciML ecosystem to define and solve the ODE. 
For ODEProcess structs we focus on defining from a restricted set of ODE problems:\n\nThe initial condition u0 must be a vector or matrix.\nThe parameters p must be a vector or matrix.\nThe output of the ODE should be interpreted as the infection incidence at each time point in\n\nts via the function sol2infs which maps the solution object sol of the ODE solve to infection counts.\n\nConstructors\n\nODEProcess(prob::ODEProblem; ts, solver, sol2infs): Create an ODEProcess\n\nobject with the ODE problem prob, time points ts, solver solver, and function sol2infs.\n\nExample\n\nusing EpiAware, OrdinaryDiffEq\nr = log(2) / 7 # Growth rate corresponding to 7 day doubling time\nu0 = [1.0]\np = [r]\nparams = ODEParams(u0 = u0, p = p)\n\n# Define the ODE problem using SciML\n# We use a simple exponential growth model\n\nfunction expgrowth(du, u, p, t)\n du[1] = p[1] * u[1]\nend\nprob = ODEProblem(expgrowth, u0, (0.0, 10.0), p)\n\n# Define the ODEProcess\n\nexpgrowth_model = ODEProcess(prob::ODEProblem; ts = 0:1:10,\n solver = Tsit5(),\n sol2infs = sol -> sol[1, :])\n\n# Generate the latent infections\nI_t = generate_latent_infs(expgrowth_model, params)()\n\n# output\n\n11-element Vector{Float64}:\n 1.0\n 1.1040895124087677\n 1.2190137467993492\n 1.3459001375697022\n 1.4859941865014936\n 1.640671113705054\n 1.8114471151863056\n 1.9999990356297939\n 2.2081789476865237\n 2.438027196361022\n 2.6918002758361723\n\n\n\nFields\n\nprob::SciMLBase.ODEProblem: The ODE problem instance, where P is a subtype of ODEProblem.\nts::Vector: A vector of time points, where T is the type of the time points.\nsolver::Any: The solver used for the ODE problem.\nsol2infs::Function: A function that maps the solution object of the ODE to infection counts.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.Renewal","page":"Public API","title":"EpiAware.EpiInfModels.Renewal","text":"struct Renewal{E, S<:Distributions.Sampleable, A} <: AbstractTuringRenewal\n\nModel unobserved/latent infections as due to time-varying Renewal model with reproduction number mathcalR_t which is generated by a latent process.\n\nMathematical specification\n\nIf Z_t is a realisation of the latent model, then the unobserved/latent infections are given by\n\nbeginalign\nmathcalR_t = g(Z_t)\nI_t = mathcalR_t sum_i=1^n-1 I_t-i g_i qquad t geq 1 \nI_t = g(hatI_0) exp(r(mathcalR_1) t) qquad t leq 0\nendalign\n\nwhere g is a transformation function and the unconstrained initial infections hatI_0 are sampled from a prior distribution. The discrete generation interval is given by g_i.\n\nr(mathcalR_1) is the exponential growth rate implied by mathcalR_1) using the implicit relationship between the exponential growth rate and the reproduction number.\n\nmathcalR sum_j geq 1 g_j exp(- r j)= 1\n\nRenewal are constructed by passing an EpiData object data and an initialisation_prior for the prior distribution of hatI_0. The default initialisation_prior is Normal().\n\nConstructors\n\nRenewal(; data, initialisation_prior). Construct a Renewal model with default update steps.\nRenewal(data; initialisation_prior). Construct a Renewal model with default update steps.\nRenewal(data, initialisation_prior, recurrent_step) Construct a Renewal model with recurrent_step update step function.\n\nExample usage with generate_latent_infs\n\ngenerate_latent_infs can be used to construct a Turing model for the latent infections conditional on the sample path of a latent process. 
In this example, we generate a sample of a white noise latent process.\n\nFirst, we construct an Renewal struct with an EpiData object, an initialisation prior and a transformation function.\n\nusing Distributions, Turing, EpiAware\ngen_int = [0.2, 0.3, 0.5]\ng = exp\n\n# Create an EpiData object\ndata = EpiData(gen_int, g)\n\n# Create an Renewal model\nrenewal_model = Renewal(data; initialisation_prior = Normal())\n\nThen, we can use generate_latent_infs to construct a Turing model for the unobserved infection generation model set by the type of direct_inf_model.\n\n# Construct a Turing model\nZ_t = randn(100) * 0.05\nlatent_inf = generate_latent_infs(renewal_model, Z_t)\n\nNow we can use the Turing PPL API to sample underlying parameters and generate the unobserved infections.\n\n# Sample from the unobserved infections model\n\n#Sample random parameters from prior\nθ = rand(latent_inf)\n#Get unobserved infections as a generated quantities from the model\nI_t = generated_quantities(latent_inf, θ)\n\n\n\nFields\n\ndata::Any\ninitialisation_prior::Distributions.Sampleable\nrecurrent_step::Any\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.R_to_r-Union{Tuple{T}, Tuple{Any, Vector{T}}} where T<:AbstractFloat","page":"Public API","title":"EpiAware.EpiInfModels.R_to_r","text":"R_to_r(\n R₀,\n w::Array{T<:AbstractFloat, 1};\n newton_steps,\n Δd\n) -> Any\n\n\nThis function computes an approximation to the exponential growth rate r given the reproductive ratio R₀ and the discretized generation interval w with discretized interval width Δd. This is based on the implicit solution of\n\nG(r) - 1 over R_0 = 0\n\nwhere\n\nG(r) = sum_i=1^n w_i e^-r i\n\nis the negative moment generating function (MGF) of the generation interval distribution.\n\nThe two step approximation is based on: 1. Direct solution of implicit equation for a small r approximation. 2. 
Improving the approximation using Newton's method for a fixed number of steps newton_steps.\n\nReturns:\n\nThe approximate value of r.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.expected_Rt-Tuple{EpiData, Vector{<:Real}}","page":"Public API","title":"EpiAware.EpiInfModels.expected_Rt","text":"expected_Rt(\n data::EpiData,\n infections::Vector{<:Real}\n) -> Any\n\n\nCalculate the expected Rt values based on the given EpiData object and infections.\n\nR_t = fracI_tsum_i=1^n I_t-i g_i\n\nArguments\n\ndata::EpiData: An instance of the EpiData type containing generation interval data.\ninfections::Vector{<:Real}: A vector of infection data.\n\nReturns\n\nexp_Rt::Vector{Float64}: A vector of expected Rt values.\n\nExamples\n\nusing EpiAware\n\ndata = EpiData([0.2, 0.3, 0.5], exp)\ninfections = [100, 200, 300, 400, 500]\nexpected_Rt(data, infections)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInfModels/public/#EpiAware.EpiInfModels.r_to_R-Tuple{Any, AbstractVector}","page":"Public API","title":"EpiAware.EpiInfModels.r_to_R","text":"r_to_R(r, w::AbstractVector) -> Any\n\n\nr_to_R(r, w)\n\nCompute the reproductive ratio given exponential growth rate r and discretized generation interval w.\n\nArguments\n\nr: The exponential growth rate.\nw: discretized generation interval.\n\nReturns\n\nThe reproductive ratio.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/#EpiAwareUtils.jl","page":"Overview","title":"EpiAwareUtils.jl","text":"","category":"section"},{"location":"lib/EpiAwareUtils/","page":"Overview","title":"Overview","text":"This package provides utility functions for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiAwareUtils/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiAwareUtils/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiAwareUtils/public.md\", \"lib/EpiAwareUtils/internals.md\"]","category":"page"},{"location":"lib/EpiInference/#EpiInference.jl","page":"Overview","title":"EpiInference.jl","text":"","category":"section"},{"location":"lib/EpiInference/","page":"Overview","title":"Overview","text":"This package provides inference algorithms for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiInference/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiInference/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiInference/public.md\", \"lib/EpiInference/internals.md\"]","category":"page"},{"location":"lib/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Documentation for EpiAware.jl's public interface.","category":"page"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/public/#Public-API","page":"Public API","title":"Public 
API","text":"","category":"section"},{"location":"lib/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware]\nPrivate = false","category":"page"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"\n\n\n\n\n

              Example: Early COVID-19 case data in South Korea

               In this example we use EpiAware functionality to largely recreate an epidemiological model presented in On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective, Mishra et al (2020). Mishra et al consider test-confirmed cases of COVID-19 in South Korea between January and July 2020. The components of the epidemiological model they consider are:

              $$I_t = R_t \\sum_{s\\geq 1} I_{t-s} g_s.$$

              $$G \\sim \\text{Gamma}(6.5,0.62).$$

              $$C_t \\sim \\text{NegBin}(\\text{mean} = I_t,~ \\text{overdispersion} = \\phi).$$

               In the examples below we are going to largely recreate the Mishra et al model, whilst emphasising that each component of the overall epidemiological model is, itself, a standalone model that can be sampled from.

              \n\n\n\n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Dependencies-for-this-notebook","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Dependencies for this notebook","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

               Now we want to import these dependencies into scope. If you evaluate these code lines/blocks in the REPL, the REPL will offer to install any missing dependencies. Alternatively, you can add them to your active environment using Pkg.add; see the example below.
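               For example, a minimal sketch of installing the non-standard-library packages used in this notebook (assuming they are available from your configured package registries):

               using Pkg
               Pkg.add(["EpiAware", "Turing", "DynamicPPL", "Distributions", "CSV", "DataFramesMeta",
                   "CairoMakie", "PairPlots", "TimeSeries", "ReverseDiff"])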

              \n\n
              using EpiAware
              \n\n\n
              using Turing, DynamicPPL #Underlying Turing ecosystem packages to interact with models
              \n\n\n
              using Distributions, Statistics #Statistics packages
              \n\n\n
              using CSV, DataFramesMeta #Data wrangling
              \n\n\n
              using CairoMakie, PairPlots, TimeSeries #Plotting backend
              \n\n\n
              using ReverseDiff #Automatic differentiation backend
              \n\n\n
              begin #Date utility and set Random seed\n    using Dates\n    using Random\n    Random.seed!(1)\nend
              \n
              TaskLocalRNG()
              \n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Load-early-SARS-2-case-data-for-South-Korea","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Load early SARS-2 case data for South Korea","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

               First, we make sure that the data we want to analyse is in scope by downloading it from where we have saved a copy in the EpiAware repository.

               NB: The case data is curated by the covidregionaldata package. We accessed the South Korean case data using a short R script. It is possible to interface with R directly from a Julia session using the RCall.jl package, but we do not do so here, to reduce the number of underlying dependencies required to run this notebook.

              \n\n
              url = \"https://raw.githubusercontent.com/CDCgov/Rt-without-renewal/main/EpiAware/docs/src/showcase/replications/mishra-2020/south_korea_data.csv2\"
              \n
              \"https://raw.githubusercontent.com/CDCgov/Rt-without-renewal/main/EpiAware/docs/src/showcase/replications/mishra-2020/south_korea_data.csv2\"
              \n\n
              data = CSV.read(download(url), DataFrame)
              \n
               Column1    date          cases_new    deaths_new
               1          2019-12-31    0            0
               2          2020-01-01    0            0
               3          2020-01-02    0            0
               4          2020-01-03    0            0
               5          2020-01-04    0            0
               6          2020-01-05    0            0
               7          2020-01-06    0            0
               8          2020-01-07    0            0
               9          2020-01-08    0            0
               10         2020-01-09    0            0
               ...
               214        2020-07-31    36           1
              \n\n\n

              Time-varying reproduction number as an AbstractLatentModel type

               EpiAware exposes an AbstractLatentModel abstract type, the purpose of which is to group stochastic processes that can be interpreted as generating time-varying parameters/quantities of interest, which we call latent process models.

              In the Mishra et al model the log-time varying reproductive number \\(Z_t\\) is assumed to evolve as an auto-regressive process, AR(2):

              $$\\begin{align}\nR_t &= \\exp Z_t, \\\\\nZ_t &= \\rho_1 Z_{t-1} + \\rho_2 Z_{t-2} + \\epsilon_t, \\\\\n\\epsilon_t &\\sim \\text{Normal}(0, \\sigma^*).\n\\end{align}$$

               where \\(\\rho_1, \\rho_2\\) are the parameters of the AR process and \\(\\epsilon_t\\) is a white noise process with standard deviation \\(\\sigma^*\\).

              \n\n\n

              In EpiAware we determine the behaviour of a latent process by choosing a concrete subtype (i.e. a struct) of AbstractLatentModel which has fields that set the priors of the various parameters required for the latent process.

              The AR process has the struct AR <: AbstractLatentModel. The user can supply the priors for \\(\\rho_1,\\rho_2\\) in the field damp_priors, for \\(\\sigma^*\\) in the field std_prior, and the initial values \\(Z_1, Z_2\\) in the field init_priors.

              \n\n\n

               We choose priors based on Mishra et al using the Distributions.jl interface to probability distributions. Note that we constrain the AR parameters to \\([0,1]\\), as in Mishra et al, using the truncated function.

               In Mishra et al the standard deviation of the stationary distribution of \\(Z_t\\), denoted \\(\\sigma\\), has a standard normal prior conditioned to be positive, \\(\\sigma \\sim \\mathcal{N}^+(0,1)\\). The value \\(\\sigma^*\\) was then determined from a nonlinear function of the sampled \\(\\sigma, ~\\rho_1, ~\\rho_2\\) values. Since Mishra et al give sharply informative priors for \\(\\rho_1,~\\rho_2\\) (see below), we simplify by calculating \\(\\sigma^*\\) at the prior mode of \\(\\rho_1,~\\rho_2\\). This results in a \\(\\sigma^* \\sim \\mathcal{N}^+(0, 0.5)\\) prior; see the sketch below.
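               As a rough check (a minimal sketch: we assume the nonlinear function is the standard AR(2) stationary-variance relationship, which may differ in detail from the exact calculation in Mishra et al), evaluating it at the prior modes \\(\\rho_1 = 0.8, ~\\rho_2 = 0.1\\) with stationary standard deviation \\(\\sigma = 1\\) gives a value close to 0.5:

               ρ₁, ρ₂, σ = 0.8, 0.1, 1.0
               # Stationary variance of an AR(2): σ² = σ*² (1 - ρ₂) / ((1 + ρ₂) ((1 - ρ₂)² - ρ₁²)).
               # Inverting for the innovation standard deviation σ* at σ = 1:
               σ_star = σ * sqrt((1 + ρ₂) * ((1 - ρ₂)^2 - ρ₁^2) / (1 - ρ₂))  # ≈ 0.46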

              \n\n
              ar = AR(\n    damp_priors = reverse([truncated(Normal(0.8, 0.05), 0, 1),\n        truncated(Normal(0.1, 0.05), 0, 1)]),\n    std_prior = HalfNormal(0.5),\n    init_priors = [Normal(-1.0, 0.1), Normal(-1.0, 0.5)]\n)
              \n
              AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2)
              \n\n\n

              Turing model interface to the AR process

               As mentioned above, we can use this instance of the AR latent model to construct a Turing model object which implements the probabilistic behaviour determined by ar. We do this with the constructor function exposed by EpiAware: generate_latent, which combines an AbstractLatentModel subtype struct with the number of time steps for which we want to generate the latent process.

               As a refresher, recall that the Turing.Model object has the following properties:

              As a concrete example we create a model object for the AR(2) process we specified above for 50 time steps:

              \n\n
              ar_mdl = generate_latent(ar, 50)
              \n
              Model{typeof(generate_latent), (:latent_model, :n), (), (), Tuple{AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, Int64}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_latent, (latent_model = AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), n = 50), NamedTuple(), DefaultContext())
              \n\n\n

               Ultimately, this will only be one component of the full epidemiological model. However, it is useful to visualise its probabilistic behaviour for model diagnostics and prior predictive checking.

              We can spaghetti plot generative samples from the AR(2) process with the priors specified above.

              \n\n
              plt_ar_sample = let\n    n_samples = 100\n    ar_mdl_samples = mapreduce(hcat, 1:n_samples) do _\n        ar_mdl() .|> exp #Sample Z_t trajectories for the model\n    end\n\n    fig = Figure()\n    ax = Axis(fig[1, 1];\n        yscale = log10,\n        ylabel = \"Time varying Rₜ\",\n        title = \"$(n_samples) draws from the prior Rₜ model\"\n    )\n    for col in eachcol(ar_mdl_samples)\n        lines!(ax, col, color = (:grey, 0.1))\n    end\n    fig\nend
              \n\n\n\n

              This suggests that a priori we believe that there is a few percent chance of achieving very high \\(R_t\\) values, i.e. \\(R_t \\sim 10-1000\\) is not excluded by our priors.

              \n\n\n

              The Renewal model as an AbstractEpiModel type

               The abstract type exposed by EpiAware for models that generate infections is called AbstractEpiModel. As with latent models, different concrete subtypes of AbstractEpiModel define different classes of infection-generating process. In this case we want to implement a renewal model.

               The Renewal <: AbstractEpiModel type of struct needs two fields: an EpiData object (data) containing the discretised generation interval, and a prior for the initial infections (initialisation_prior).

              In Mishra et al they use an estimate of the serial interval of SARS-CoV-2 as an estimate of the generation interval.

              \n\n
              truth_GI = Gamma(6.5, 0.62)
              \n
              Distributions.Gamma{Float64}(α=6.5, θ=0.62)
              \n\n\n

               This is a continuous representation of the generation interval distribution, whereas the infection process will be formulated in discrete daily time steps. By default, EpiAware performs double interval censoring to convert our continuous estimate of the generation interval into a discretized version \\(g_t\\), whilst also applying left truncation such that \\(g_0 = 0\\) and normalising so that \\(\\sum_t g_t = 1.\\)
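               For intuition, here is a minimal sketch of a naive daily discretisation (single-interval censoring only, not the double interval censoring that EpiAware actually applies):

               using Distributions
               naive_gi = Gamma(6.5, 0.62)
               # Probability mass on days 1..8, so that g₀ = 0 (no same-day transmission)
               g_naive = [cdf(naive_gi, t) - cdf(naive_gi, t - 1) for t in 1:8]
               g_naive ./= sum(g_naive)   # normalise so the discrete pmf sums to 1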

              The constructor for converting a continuous estimate of the generation interval distribution into a usable discrete time estimate is EpiData.

              \n\n
              model_data = EpiData(gen_distribution = truth_GI)
              \n
              EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp)
              \n\n\n

              We can compare the discretized generation interval with the continuous estimate, which in this example is the serial interval estimate.

              \n\n
              let\n    fig = Figure()\n    ax = Axis(fig[1, 1];\n        xticks = 0:14,\n        xlabel = \"Days\",\n        title = \"Continuous and discrete generation intervals\"\n    )\n    barplot!(ax, model_data.gen_int;\n        label = \"Discretized next gen pmf\"\n    )\n    lines!(truth_GI;\n        label = \"Continuous serial interval\",\n        color = :green\n    )\n    axislegend(ax)\n    fig\nend
              \n\n\n\n

              The user also needs to specify a prior for the log incidence at time zero, \\(\\log I_0\\). The initial history of latent infections \\(I_{-1}, I_{-2},\\dots\\) is constructed as

              $$I_t = e^{rt} I_0,\\qquad t = 0, -1, -2,...$$

               where the exponential growth rate \\(r\\) is determined by the initial reproductive number \\(R_1\\) via the solution to the implicit equation,

              $$R_1 = 1 \\Big{/} \\sum_{t\\geq 1} e^{-rt} g_t$$
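               EpiAware exposes R_to_r for this computation (see the EpiInfModels API). As a rough, self-contained illustration of the relationship, the sketch below solves the implicit equation by bisection for a hypothetical generation interval pmf; the helper name and values are illustrative and not part of the package:

               # Solve R = 1 / sum_t exp(-r t) g_t for r by bisection (illustrative helper, not EpiAware's R_to_r)
               function growth_rate_from_R(R, g; lo = -1.0, hi = 1.0, iters = 60)
                   G(r) = sum(g[t] * exp(-r * t) for t in eachindex(g))
                   f(r) = R * G(r) - 1   # decreasing in r; its root is the implied growth rate
                   for _ in 1:iters
                       mid = (lo + hi) / 2
                       f(mid) > 0 ? (lo = mid) : (hi = mid)
                   end
                   return (lo + hi) / 2
               end

               g_toy = [0.2, 0.3, 0.5]   # toy generation interval pmf
               r_toy = growth_rate_from_R(3.0, g_toy)   # positive growth rate when R₁ = 3
               init_history = [1.0 * exp(r_toy * t) for t in 0:-1:-3]   # I₀, I₋₁, I₋₂, I₋₃ from Iₜ = eʳᵗ I₀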

              \n\n
              log_I0_prior = Normal(log(1.0), 1.0)
              \n
              Distributions.Normal{Float64}(μ=0.0, σ=1.0)
              \n\n
              epi = Renewal(model_data; initialisation_prior = log_I0_prior)
              \n
              Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056]))
              \n\n\n

              NB: We don't implement a background infection rate in this model.

              \n\n\n

              Turing model interface to Renewal process

               As mentioned above, we can use this instance of the Renewal latent infection model to construct a Turing model which implements the probabilistic behaviour determined by epi, using the constructor function generate_latent_infs, which combines epi with a provided \\(\\log R_t\\) time series.

              Here we choose an example where \\(R_t\\) decreases from \\(R_t = 3\\) to \\(R_t = 0.5\\) over the course of 50 days.

              \n\n
              R_t_fixed = [0.5 + 2.5 / (1 + exp(t - 15)) for t in 1:50]
              \n
              50-element Vector{Float64}:\n 2.9999979211799306\n 2.9999943491892553\n 2.9999846395634946\n 2.99995824644538\n 2.9998865053282437\n 2.9996915135600344\n 2.999161624673834\n ⋮\n 0.5000000000002339\n 0.500000000000086\n 0.5000000000000316\n 0.5000000000000117\n 0.5000000000000043\n 0.5000000000000016
              \n\n
              latent_inf_mdl = generate_latent_infs(epi, log.(R_t_fixed))
              \n
              Model{typeof(generate_latent_infs), (:epi_model, :_Rt), (), (), Tuple{Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}, Vector{Float64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_latent_infs, (epi_model = Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056])), _Rt = [1.0986115957278464, 1.098610405062754, 1.0986071685094998, 1.0985983707197156, 1.0985744563952262, 1.098509454567543, 1.0983327911702674, 1.097852790994088, 1.09654964358037, 1.0930193012626002  …  -0.6931471805343999, -0.6931471805505477, -0.693147180556488, -0.6931471805586734, -0.6931471805594774, -0.6931471805597732, -0.693147180559882, -0.693147180559922, -0.6931471805599366, -0.6931471805599422]), NamedTuple(), DefaultContext())
              \n\n
              plt_epi = let\n    n_samples = 100\n    #Sample unconditionally the underlying parameters of the model\n    epi_mdl_samples = mapreduce(hcat, 1:n_samples) do _\n        latent_inf_mdl()\n    end\n    fig = Figure()\n    ax1 = Axis(fig[1, 1];\n        title = \"$(n_samples) draws from renewal model with chosen Rt\",\n        ylabel = \"Latent infections\"\n    )\n    ax2 = Axis(fig[2, 1];\n        ylabel = \"Rt\"\n    )\n    for col in eachcol(epi_mdl_samples)\n        lines!(ax1, col;\n            color = (:grey, 0.1)\n        )\n    end\n    lines!(ax2, R_t_fixed;\n        linewidth = 2\n    )\n    fig\nend
              \n\n\n\n

              Negative Binomial Observations as an ObservationModel type

               In Mishra et al latent infections were assumed to occur on their observation day with negative binomial errors; this motivates using the serial interval (the time between onset of symptoms of a primary and secondary case) rather than the generation interval distribution (the time between the infection times of a primary and secondary case).

              Observation models are set in EpiAware as concrete subtypes of an ObservationModel. The Negative binomial error model without observation delays is set with a NegativeBinomialError struct. In Mishra et al the overdispersion parameter \\(\\phi\\) sets the relationship between the mean and variance of the negative binomial errors,

              $$\\text{var} = \\text{mean} + {\\text{mean}^2 \\over \\phi}.$$

               In EpiAware, we default to a prior on \\(\\sqrt{1/\\phi}\\) because this quantity is approximately the coefficient of variation of the observation noise and is therefore easier to reason about a priori. We call this quantity the cluster factor; see the numerical illustration below.
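               A minimal numerical sketch of these relationships (the values are illustrative only):

               mean_cases = 100.0
               ϕ = 16.0
               var_cases = mean_cases + mean_cases^2 / ϕ   # 100 + 625 = 725
               cluster_factor = sqrt(1 / ϕ)                # 0.25, approximately the coefficient of variation when the mean is large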

               A prior for \\(\\phi\\) was not specified in Mishra et al; we select one below, but we will condition on a fixed value in the analysis below.

              \n\n
              obs = NegativeBinomialError(cluster_factor_prior = HalfNormal(0.1))
              \n
              NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1))
              \n\n\n

              Turing model interface to the NegativeBinomialError model

               We can construct a NegativeBinomialError model implementation as a Turing model using the EpiAware generate_observations function.

               Turing uses missing arguments to indicate variables that are to be sampled. We use this to obtain a forward model that samples observations, conditional on an underlying expected observation time series.

              \n\n\n

              First, we set an artificial expected cases curve.

              \n\n
              expected_cases = [1000 * exp(-(t - 15)^2 / (2 * 4)) for t in 1:30]
              \n
              30-element Vector{Float64}:\n 2.289734845645553e-8\n 6.691586091292782e-7\n 1.5229979744712628e-5\n 0.0002699578503363014\n 0.003726653172078671\n 0.04006529739295107\n 0.33546262790251186\n ⋮\n 0.003726653172078671\n 0.0002699578503363014\n 1.5229979744712628e-5\n 6.691586091292782e-7\n 2.289734845645553e-8\n 6.101936677605324e-10
              \n\n
              obs_mdl = generate_observations(obs, missing, expected_cases)
              \n
              Model{typeof(generate_observations), (:obs_model, :y_t, :Y_t), (), (:y_t,), Tuple{NegativeBinomialError{HalfNormal{Float64}}, Missing, Vector{Float64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_observations, (obs_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1)), y_t = missing, Y_t = [2.289734845645553e-8, 6.691586091292782e-7, 1.5229979744712628e-5, 0.0002699578503363014, 0.003726653172078671, 0.04006529739295107, 0.33546262790251186, 2.187491118182885, 11.108996538242305, 43.93693362340742  …  11.108996538242305, 2.187491118182885, 0.33546262790251186, 0.04006529739295107, 0.003726653172078671, 0.0002699578503363014, 1.5229979744712628e-5, 6.691586091292782e-7, 2.289734845645553e-8, 6.101936677605324e-10]), NamedTuple(), DefaultContext())
              \n\n
              plt_obs = let\n    n_samples = 100\n    obs_mdl_samples = mapreduce(hcat, 1:n_samples) do _\n        θ = obs_mdl() #Sample unconditionally the underlying parameters of the model\n    end\n    fig = Figure()\n    ax = Axis(fig[1, 1];\n        title = \"$(n_samples) draws from neg. bin. obs model\",\n        ylabel = \"Observed cases\"\n    )\n    for col in eachcol(obs_mdl_samples)\n        scatter!(ax, col;\n            color = (:grey, 0.2)\n        )\n    end\n    lines!(ax, expected_cases;\n        color = :red,\n        linewidth = 3,\n        label = \"Expected cases\"\n    )\n    axislegend(ax)\n    fig\nend
              \n\n\n\n

              Composing models into an EpiProblem

               Mishra et al follows a common pattern: an infection generation process driven by a latent process, with an observation model that links the infection process to a discrete-valued time series of incidence data.

              In EpiAware we provide an EpiProblem constructor for this common epidemiological model pattern.

               The constructor for an EpiProblem requires an infection-generating model (epi_model), a latent model (latent_model), an observation model (observation_model) and a time span (tspan), as shown below.

               The tspan sets the range of the time index for the models.

              \n\n
              epi_prob = EpiProblem(epi_model = epi,\n    latent_model = ar,\n    observation_model = obs,\n    tspan = (45, 80))
              \n
              EpiProblem{Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}, AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, NegativeBinomialError{HalfNormal{Float64}}}(Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056])), AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1)), (45, 80))
              \n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Inference-Methods","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Inference Methods","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

               We make inferences on the unobserved quantities, such as \\(R_t\\), by sampling from the model conditioned on the observed data. We generate the posterior samples using the No-U-Turn Sampler (NUTS).

              To make NUTS more robust we provide manypathfinder, which is built on pathfinder variational inference from Pathfinder.jl. manypathfinder runs nruns pathfinder processes on the inference problem and returns the pathfinder run with maximum estimated ELBO.

               The composition of running variational inference as a pre-sampler step, whose result is passed to the NUTS initialisation, is defined using the EpiMethod struct, in which a sequence of pre-sampler steps can be defined.

              EpiMethod also allows the specification of NUTS parameters, such as type of automatic differentiation, type of parallelism and number of parallel chains to sample.

              \n\n
              num_threads = min(10, Threads.nthreads())
              \n
              1
              \n\n
              inference_method = EpiMethod(\n    pre_sampler_steps = [ManyPathfinder(nruns = 4, maxiters = 100)],\n    sampler = NUTSampler(\n        adtype = AutoReverseDiff(compile = true),\n        ndraws = 2000,\n        nchains = num_threads,\n        mcmc_parallel = MCMCThreads())\n)
              \n
              EpiMethod{ManyPathfinder, NUTSampler{AutoReverseDiff{true}, MCMCThreads, UnionAll}}(ManyPathfinder[ManyPathfinder(10, 4, 100, 100)], NUTSampler{AutoReverseDiff{true}, MCMCThreads, UnionAll}(0.8, AutoReverseDiff(compile=true), MCMCThreads(), 1, 10, 1000.0, 0.0, 2000, AdvancedHMC.DiagEuclideanMetric, -1))
              \n\n","category":"page"},{"location":"showcase/replications/mishra-2020/#Inference-and-analysis","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"Inference and analysis","text":"","category":"section"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"
              \n

              We supply the data as a NamedTuple with the y_t field containing the observed data, shortened to fit the chosen tspan of epi_prob.

              \n\n
              south_korea_data = (y_t = data.cases_new[epi_prob.tspan[1]:epi_prob.tspan[2]],\n    dates = data.date[epi_prob.tspan[1]:epi_prob.tspan[2]])
              \n
              (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], dates = [Date(\"2020-02-13\"), Date(\"2020-02-14\"), Date(\"2020-02-15\"), Date(\"2020-02-16\"), Date(\"2020-02-17\"), Date(\"2020-02-18\"), Date(\"2020-02-19\"), Date(\"2020-02-20\"), Date(\"2020-02-21\"), Date(\"2020-02-22\")  …  Date(\"2020-03-10\"), Date(\"2020-03-11\"), Date(\"2020-03-12\"), Date(\"2020-03-13\"), Date(\"2020-03-14\"), Date(\"2020-03-15\"), Date(\"2020-03-16\"), Date(\"2020-03-17\"), Date(\"2020-03-18\"), Date(\"2020-03-19\")])
              \n\n\n

               In the epidemiological model it is hard to jointly identify the AR parameters, such as the standard deviation of the AR process, and the cluster factor of the negative binomial observation model. The reason for this identifiability problem is that the model assumes no delay between infection and observation. Therefore, on any given day the data could be explained either by \\(R_t\\) changing or by observation noise, and it is not easy to disentangle greater volatility in \\(R_t\\) from higher noise in the observations.

               In models with latent delays, changes in \\(R_t\\) impact the observed cases over several days, which makes it easier to disentangle trend effects from observation-to-observation fluctuations.

               To counteract this problem we condition the model on a fixed cluster factor value.

              \n\n
              fixed_cluster_factor = 0.25
              \n
              0.25
              \n\n\n

               EpiAware has the generate_epiaware function, which joins an EpiProblem object with the data to produce a Turing model. This Turing model composes the three unit Turing models defined above: the Renewal infection-generating process, the AR latent process for \\(\\log R_t\\), and the negative binomial observation model. Therefore, we can condition on variables as with any other Turing model.

              \n\n
              mdl = generate_epiaware(epi_prob, south_korea_data) |\n      (var\"obs.cluster_factor\" = fixed_cluster_factor,)
              \n
              Model{typeof(generate_epiaware), (:y_t, :time_steps, :epi_model), (:latent_model, :observation_model), (), Tuple{Vector{Int64}, Int64, Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}}, Tuple{AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, NegativeBinomialError{HalfNormal{Float64}}}, ConditionContext{@NamedTuple{obs.cluster_factor::Float64}, DefaultContext}}(EpiAware.EpiAwareBase.generate_epiaware, (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], time_steps = 36, epi_model = Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056]))), (latent_model = AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), observation_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1))), ConditionContext((var\"obs.cluster_factor\" = 0.25,), DynamicPPL.DefaultContext()))
              \n\n\n

              Sampling with apply_method

               The apply_method function combines the elements above: the model, the inference method and the data.

               It returns a collection of results as an EpiAwareObservables object, which bundles the model, the data, the posterior samples and the generated quantities.

              \n\n
              inference_results = apply_method(mdl,\n    inference_method,\n    south_korea_data\n)
              \n
              EpiAwareObservables(Model{typeof(generate_epiaware), (:y_t, :time_steps, :epi_model), (:latent_model, :observation_model), (), Tuple{Vector{Int64}, Int64, Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}}, Tuple{AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}, NegativeBinomialError{HalfNormal{Float64}}}, ConditionContext{@NamedTuple{obs.cluster_factor::Float64}, DefaultContext}}(EpiAware.EpiAwareBase.generate_epiaware, (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], time_steps = 36, epi_model = Renewal{EpiData{Float64, typeof(exp)}, Normal{Float64}, EpiAware.EpiInfModels.ConstantRenewalStep{Float64}}(EpiData{Float64, typeof(exp)}([0.026663134095601056, 0.14059778064943784, 0.2502660305615846, 0.24789569560506844, 0.1731751163417783, 0.09635404000022223, 0.04573437575216367, 0.019313826994143808], 8, exp), Distributions.Normal{Float64}(μ=0.0, σ=1.0), EpiAware.EpiInfModels.ConstantRenewalStep{Float64}([0.019313826994143808, 0.04573437575216367, 0.09635404000022223, 0.1731751163417783, 0.24789569560506844, 0.2502660305615846, 0.14059778064943784, 0.026663134095601056]))), (latent_model = AR{Product{Continuous, Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}, Vector{Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}}}, HalfNormal{Float64}, Product{Continuous, Normal{Float64}, Vector{Normal{Float64}}}, Int64}(Distributions.Product{Distributions.Continuous, Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}, Vector{Distributions.Truncated{Distributions.Normal{Float64}, Distributions.Continuous, Float64, Float64, Float64}}}(v=Truncated{Normal{Float64}, Continuous, Float64, Float64, Float64}[Truncated(Distributions.Normal{Float64}(μ=0.1, σ=0.05); lower=0.0, upper=1.0), Truncated(Distributions.Normal{Float64}(μ=0.8, σ=0.05); lower=0.0, upper=1.0)]), HalfNormal{Float64}(μ=0.5), Distributions.Product{Distributions.Continuous, Distributions.Normal{Float64}, Vector{Distributions.Normal{Float64}}}(v=Normal{Float64}[Distributions.Normal{Float64}(μ=-1.0, σ=0.1), Distributions.Normal{Float64}(μ=-1.0, σ=0.5)]), 2), observation_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.1))), ConditionContext((var\"obs.cluster_factor\" = 0.25,), DynamicPPL.DefaultContext())), (y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], dates = [Date(\"2020-02-13\"), Date(\"2020-02-14\"), Date(\"2020-02-15\"), Date(\"2020-02-16\"), Date(\"2020-02-17\"), Date(\"2020-02-18\"), Date(\"2020-02-19\"), Date(\"2020-02-20\"), Date(\"2020-02-21\"), Date(\"2020-02-22\")  …  Date(\"2020-03-10\"), Date(\"2020-03-11\"), Date(\"2020-03-12\"), Date(\"2020-03-13\"), Date(\"2020-03-14\"), Date(\"2020-03-15\"), Date(\"2020-03-16\"), Date(\"2020-03-17\"), Date(\"2020-03-18\"), Date(\"2020-03-19\")]), MCMC chain (2000×52×1 Array{Float64, 3}), @NamedTuple{generated_y_t::Vector{Int64}, I_t::Vector{Float64}, Z_t::Vector{Float64}}[(generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.31320720807054464, 0.29683750985248575, 0.4089938601616076, 1.3022343628622814, 1.0225626150832448, 2.8527817296411477, 10.369697939206759, 
30.004843225718137, 69.27683270045847, 197.8262544135264  …  165.82971938826418, 261.08028416983564, 128.7797349208945, 98.10236805176936, 150.39291882409293, 144.96465663477696, 85.45247839555344, 76.49701858640442, 93.82203638614712, 150.84255039241933], Z_t = [-0.9622516557378923, -0.779767809194889, -0.22894416095706988, 1.1292735506394296, 0.9415632945314946, 1.7200973931688441, 2.6194952823620414, 3.0811689176981933, 2.9825028812997374, 3.015766126931429  …  -1.1445943026275176, -0.527355948876789, -1.0005686058194918, -1.0426429725004263, -0.4220655088301653, -0.2592548294034239, -0.6147274678250605, -0.6236497063004567, -0.342583682700895, 0.25399548116615]); (generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.45814991051360804, 0.5157565397914593, 0.7485784521593867, 1.9258424522354982, 1.881579386894302, 3.0862955995411405, 23.768073533134828, 32.469112917123354, 79.08124126301547, 133.2426021603086  …  218.50970158839397, 189.3209377495298, 162.14089882072605, 74.82939396174662, 84.68530229352427, 115.38772530731804, 74.09774056573012, 78.26487650435531, 118.86475219175041, 125.40968754401513], Z_t = [-1.0088100650997336, -0.639544659082267, -0.026428595731269766, 1.106648226738017, 1.0986425367420274, 1.3272100844568537, 2.9971592948443506, 2.679686874488113, 2.5981760009741737, 2.257218180409069  …  -0.771390714439445, -0.8621512462984372, -0.8677931302838009, -1.4402257812121877, -1.093298024516366, -0.5310548172927552, -0.6966495722710407, -0.4237917262795986, 0.13517581524187788, 0.28543487337970164]); … ; (generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.5158893098958413, 0.4643438820242316, 0.573058040409358, 1.0591979428548484, 1.0216766089139648, 4.977924446799509, 10.379829593366294, 38.06015718537239, 52.785133791434795, 110.17269294072234  …  133.32317529108064, 246.07992816063788, 178.29360470154046, 124.97217683109014, 124.06804457042375, 71.61177951656336, 71.22461510270242, 101.9991809371847, 93.03147773366017, 120.5822766958333], Z_t = [-1.0210041815742075, -0.8713671807503602, -0.41324905463364925, 0.421555199727424, 0.5159404613230617, 2.051190024137349, 2.4937798436595635, 3.123698200942809, 2.5209349500859215, 2.3128754298302074  …  -1.4075716776839973, -0.7130153359849505, -0.8603864344110592, -0.9819602779371421, -0.7664180669401581, -1.1026213666869615, -0.9002959563439888, -0.3274085775203858, -0.20380639049023946, 0.1989039251891774]); (generated_y_t = [0, 0, 0, 1, 1, 1, 15, 34, 75, 190  …  131, 242, 114, 110, 107, 76, 74, 84, 93, 152], I_t = [0.504769766867223, 0.310306090042842, 0.5131669904549973, 1.5030702265370084, 2.802351779067001, 1.4698799404359122, 12.981457766592483, 20.113164564630615, 74.3333268644793, 216.22515597743123  …  253.24676521364242, 221.20271873483486, 102.18122690472096, 103.19829495194227, 116.67237754308549, 83.73332916634705, 77.75012482367323, 66.34548811778136, 91.87960122252447, 163.27415680770434], Z_t = [-0.9524412048170314, -1.2057488759480226, -0.4696039243518976, 0.839141397204691, 1.6035897556442378, 0.7535006022615907, 2.484868832014502, 2.4245438158415697, 2.9847584797143485, 3.1699589180089607  …  -0.6586714025253477, -0.6448978991097477, -1.227673886693326, -1.03657558668536, -0.7138395469441117, -0.8091533570285688, -0.6492157835649145, -0.6210108342818847, -0.13636614713928563, 0.5836160289527288]);;])
              \n\n\n

              Results and Predictive plotting

               To assess the quality of the inference visually, we can plot predictive quantiles for generated case data from the version of the model that has not been conditioned on case data, using posterior parameters inferred from the version that was conditioned on the observed data. For this purpose, we add a generated_quantiles utility function. This kind of visualisation is known as posterior predictive checking, and is a useful diagnostic tool for Bayesian inference (see here).

              We also plot the inferred \\(R_t\\) estimates from the model. We find that the EpiAware model recovers the main finding in Mishra et al; that the \\(R_t\\) in South Korea peaked at a very high value (\\(R_t \\sim 10\\) at peak) before rapidly dropping below 1 in early March 2020.

               Note that, in reality, the peak \\(R_t\\) found here and in Mishra et al is unrealistically high; this might be due to a combination of factors:

              In a future note, we'll demonstrate having a time-varying ascertainment rate.

              \n\n
              function generated_quantiles(gens, quantity, qs; transformation = x -> x)\n    mapreduce(hcat, gens) do gen #loop over sampled generated quantities\n        getfield(gen, quantity) |> transformation\n    end |> mat -> mapreduce(hcat, qs) do q #Loop over matrix row to condense into qs\n        map(eachrow(mat)) do row\n            if any(ismissing, row)\n                return missing\n            else\n                quantile(row, q)\n            end\n        end\n    end\nend
              \n
              generated_quantiles (generic function with 1 method)
              \n\n
              let\n    C = south_korea_data.y_t\n    D = south_korea_data.dates\n\n    #Case unconditional model for posterior predictive sampling\n    mdl_unconditional = generate_epiaware(epi_prob,\n        (y_t = fill(missing, length(C)),)\n    ) | (var\"obs.cluster_factor\" = fixed_cluster_factor,)\n    posterior_gens = generated_quantities(mdl_unconditional, inference_results.samples)\n\n    #plotting quantiles\n    qs = [0.025, 0.25, 0.5, 0.75, 0.975]\n\n    #Prediction quantiles\n    predicted_y_t = generated_quantiles(posterior_gens, :generated_y_t, qs)\n    predicted_R_t = generated_quantiles(\n        posterior_gens, :Z_t, qs; transformation = x -> exp.(x))\n\n    ts = D .|> d -> d - minimum(D) .|> d -> d.value + 1\n    t_ticks = string.(D)\n    fig = Figure()\n    ax1 = Axis(fig[1, 1];\n        ylabel = \"Daily cases\",\n        xticks = (ts[1:14:end], t_ticks[1:14:end]),\n        title = \"Posterior predictive: Cases\"\n    )\n    ax2 = Axis(fig[2, 1];\n        yscale = log10,\n        title = \"Prediction: Reproduction number\",\n        xticks = (ts[1:14:end], t_ticks[1:14:end])\n    )\n    linkxaxes!(ax1, ax2)\n\n    lines!(ax1, ts, predicted_y_t[:, 3];\n        color = :purple,\n        linewidth = 2,\n        label = \"Post. median\"\n    )\n    band!(ax1, 1:size(predicted_y_t, 1), predicted_y_t[:, 2], predicted_y_t[:, 4];\n        color = (:purple, 0.4),\n        label = \"50%\"\n    )\n    band!(ax1, 1:size(predicted_y_t, 1), predicted_y_t[:, 1], predicted_y_t[:, 5];\n        color = (:purple, 0.2),\n        label = \"95%\"\n    )\n    scatter!(ax1, C;\n        color = :black,\n        label = \"Actual cases\")\n    axislegend(ax1)\n\n    lines!(ax2, ts, predicted_R_t[:, 3];\n        color = :green,\n        linewidth = 2,\n        label = \"Post. median\"\n    )\n    band!(ax2, 1:size(predicted_R_t, 1), predicted_R_t[:, 2], predicted_R_t[:, 4];\n        color = (:green, 0.4),\n        label = \"50%\"\n    )\n    band!(ax2, 1:size(predicted_R_t, 1), predicted_R_t[:, 1], predicted_R_t[:, 5];\n        color = (:green, 0.2),\n        label = \"95%\"\n    )\n    axislegend(ax2)\n\n    fig\nend
              \n\n\n\n

              Parameter inference

              We can interrogate the sampled chains directly from the samples field of the inference_results object.

              \n\n
              let\n    sub_chn = inference_results.samples[inference_results.samples.name_map.parameters[[1:5;\n                                                                                       end]]]\n    fig = pairplot(sub_chn)\n    lines!(fig[1, 1], ar.std_prior, label = \"Prior\")\n    lines!(fig[2, 2], ar.init_prior.v[1], label = \"Prior\")\n    lines!(fig[3, 3], ar.init_prior.v[2], label = \"Prior\")\n    lines!(fig[4, 4], ar.damp_prior.v[1], label = \"Prior\")\n    lines!(fig[5, 5], ar.damp_prior.v[2], label = \"Prior\")\n    lines!(fig[6, 6], epi.initialisation_prior, label = \"Prior\")\n\n    fig\nend
              \n\n\n","category":"page"},{"location":"showcase/replications/mishra-2020/","page":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","title":"On the derivation of the renewal equation from an age-dependent branching process: an epidemic modelling perspective","text":"EditURL = \"https://github.com/CDCgov/Rt-without-renewal/blob/main/docs/src/showcase/replications/mishra-2020/index.jl\"","category":"page"},{"location":"lib/EpiAwareBase/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Documentation for EpiAwareBae.jl's public interface.","category":"page"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiAwareBase/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareBase/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiAwareBase/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiAwareBase/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiAwareBase]\nPrivate = false","category":"page"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase","page":"Public API","title":"EpiAware.EpiAwareBase","text":"Module for defining abstract epidemiological types.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractAccumulationStep","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractAccumulationStep","text":"abstract type AbstractAccumulationStep\n\nAbstract type for all accumulation steps\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractBroadcastRule","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractBroadcastRule","text":"abstract type AbstractBroadcastRule\n\nAn abstract type representing a broadcast rule.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiMethod","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiMethod","text":"abstract type AbstractEpiMethod\n\nAbstract supertype for all EpiAware inference/generative modelling methods.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiModel","text":"abstract type AbstractEpiModel <: AbstractModel\n\nThe abstract supertype for all structs that define a model for generating unobserved/latent infections.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiOptMethod","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiOptMethod","text":"abstract type AbstractEpiOptMethod <: AbstractEpiMethod\n\nAbstract supertype for infence/generative methods that are based on 
optimization, e.g. MAP estimation or variational inference.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiProblem","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiProblem","text":"abstract type AbstractEpiProblem\n\nAbstract supertype for all EpiAware problems.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractEpiSamplingMethod","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractEpiSamplingMethod","text":"abstract type AbstractEpiSamplingMethod <: AbstractEpiMethod\n\nAbstract supertype for infence/generative methods that are based on sampling from the posterior distribution, e.g. NUTS.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractLatentModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractLatentModel","text":"abstract type AbstractLatentModel <: AbstractModel\n\nThe abstract supertype for all structs that define a model for generating a latent process used in EpiAware models.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractObservationModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractObservationModel","text":"abstract type AbstractObservationModel <: AbstractModel\n\nA type representing an abstract observation model that is a subtype of AbstractModel.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringEpiModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringEpiModel","text":"abstract type AbstractTuringEpiModel <: AbstractEpiModel\n\nA abstract type representing a Turing-based epidemiological model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringIntercept","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringIntercept","text":"abstract type AbstractTuringIntercept <: AbstractTuringLatentModel\n\nA abstract type used to define the common interface for intercept models.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringLatentModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringLatentModel","text":"abstract type AbstractTuringLatentModel <: AbstractLatentModel\n\nA abstract type representing a Turing-based Latent model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringObservationErrorModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringObservationErrorModel","text":"abstract type AbstractTuringObservationErrorModel <: AbstractTuringObservationModel\n\nThe abstract supertype for all structs that defines a Turing-based model for generating observation errors.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringObservationModel","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringObservationModel","text":"abstract type AbstractTuringObservationModel <: AbstractObservationModel\n\nA abstract type representing a Turing-based observation model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.AbstractTuringRenewal","page":"Public API","title":"EpiAware.EpiAwareBase.AbstractTuringRenewal","text":"abstract 
type AbstractTuringRenewal <: AbstractTuringEpiModel\n\nAbstract type for all Turing-based Renewal infection generating models.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.EpiAwareObservables","page":"Public API","title":"EpiAware.EpiAwareBase.EpiAwareObservables","text":"struct EpiAwareObservables\n\nThe EpiAwareObservables struct represents the observables used in the EpiAware model.\n\nFields\n\nmodel: The model used for the observables.\ndata: The data used for the observables.\nsamples: Samples from the posterior distribution.\ngenerated: The generated observables.\n\n\n\nFields\n\nmodel::Any\ndata::Any\nsamples::Any\ngenerated::Any\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.EpiMethod","page":"Public API","title":"EpiAware.EpiAwareBase.EpiMethod","text":"struct EpiMethod{O<:AbstractEpiOptMethod, S<:AbstractEpiSamplingMethod} <: AbstractEpiMethod\n\nEpiMethod represents a method for performing EpiAware inference and/or generative modelling, which combines a sequence of optimization steps to pass initialisation information to a sampler method.\n\n\n\nFields\n\npre_sampler_steps::Vector{O} where O<:AbstractEpiOptMethod: Pre-sampler optimization steps.\nsampler::AbstractEpiSamplingMethod: Sampler method.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.EpiProblem","page":"Public API","title":"EpiAware.EpiAwareBase.EpiProblem","text":"struct EpiProblem{E<:AbstractEpiModel, L<:AbstractLatentModel, O<:AbstractObservationModel} <: AbstractEpiProblem\n\nDefines an inference/generative modelling problem for case data.\n\nEpiProblem wraps the underlying components of an epidemiological model:\n\nepi_model: An epidemiological model for unobserved infections.\nlatent_model: A latent model for underlying latent process.\nobservation_model: An observation model for observed cases.\n\nAlong with a tspan tuple for the time span of the case data.\n\n\n\nFields\n\nepi_model::AbstractEpiModel: Epidemiological model for unobserved infections.\nlatent_model::AbstractLatentModel: Latent model for underlying latent process.\nobservation_model::AbstractObservationModel: Observation model for observed cases.\ntspan::Tuple{Int64, Int64}: Time span for either inference or generative modelling of case time series.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase._apply_method","page":"Public API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::AbstractEpiModel,\n method::AbstractEpiMethod;\n ...\n)\n_apply_method(\n model::AbstractEpiModel,\n method::AbstractEpiMethod,\n prev_result;\n kwargs...\n)\n\n\nApply the inference/generative method method to the AbstractEpiModel object mdl.\n\nArguments\n\nmodel::AbstractEpiModel: The model to apply the method to.\nmethod::AbstractEpiMethod: The epidemiological method to apply.\nprev_result: The previous result of the method.\nkwargs: Additional keyword arguments passed to the method.\n\nReturns\n\nnothing: If no concrete implementation is defined for the given method.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.apply_method-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.apply_method","text":"apply_method(\n model,\n method,\n data;\n kwargs...\n) -> EpiAwareObservables\n\n\nWrap the _apply_method function by calling it with the given model, method, data, and optional 
keyword arguments (kwargs). The resulting solution is then passed to the generated_observables function, along with the model and input data, to compute the generated observables.\n\nArguments\n\nmodel: The model to apply the method to.\nmethod: The method to apply to the model.\ndata: The data to pass to the apply_method function.\nkwargs: Optional keyword arguments to pass to the apply_method function.\n\nReturns\n\nThe generated observables computed from the solution.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.apply_method-Tuple{Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.apply_method","text":"apply_method(\n model,\n method;\n kwargs...\n) -> EpiAwareObservables\n\n\nCalls wrap_apply_method setting the data argument to nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.apply_method-Tuple{EpiProblem, AbstractEpiMethod, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.apply_method","text":"apply_method(\n epiproblem::EpiProblem,\n method::AbstractEpiMethod,\n data;\n fix_parameters,\n condition_parameters,\n kwargs...\n) -> EpiAwareObservables\n\n\nRun the EpiAware algorithm to estimate the parameters of an epidemiological model.\n\nArguments\n\nepiproblem::EpiProblem: An EpiProblem object specifying the epidemiological problem.\nmethod::EpiMethod: An EpiMethod object specifying the inference method.\ndata: The observed data used for inference.\n\nKeyword Arguments\n\nfix_parameters::NamedTuple: A NamedTuple of fixed parameters for the model.\ncondition_parameters::NamedTuple: A NamedTuple of conditioned parameters for the model.\nkwargs...: Additional keyword arguments passed to the inference methods.\n\nReturns\n\nA NamedTuple with a samples field which is the output of applying methods and a model field with the model used. Optionally, a gens field with the generated quantities from the model if that makes sense with the inference method.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.broadcast_n-Tuple{AbstractBroadcastRule, Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_n","text":"broadcast_n(\n broadcast_rule::AbstractBroadcastRule,\n latent,\n n,\n period\n)\n\n\nThis function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.\n\nThe broadcast_n function returns the length of the latent periods to generate using the given broadcast_rule. Which model of broadcasting to be implemented is set by the type of broadcast_rule. If no implemention is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.broadcast_rule-Tuple{AbstractBroadcastRule, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.broadcast_rule","text":"broadcast_rule(\n broadcast_rule::AbstractBroadcastRule,\n n,\n period\n)\n\n\nThis function is used to define the behavior of broadcasting for a specific type of AbstractBroadcastRule.\n\nThe broadcast_rule function implements a model of broadcasting a latent process. Which model of broadcasting to be implemented is set by the type of broadcast_rule. 
If no implemention is defined for the given broadcast_rule, then EpiAware will return a warning and return nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.condition_model-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.condition_model","text":"condition_model(\n model,\n fix_parameters,\n condition_parameters\n) -> Any\n\n\nCondition a model on fixed (i.e to a value) and conditioned (i.e to data) parameters.\n\nReturns\n\nmodel: The conditioned model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_epiaware-Tuple{Any, Any, AbstractEpiModel, AbstractLatentModel, AbstractObservationModel}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_epiaware","text":"generate_epiaware(\n y_t,\n time_step,\n epi_model::AbstractEpiModel,\n latent_model::AbstractLatentModel,\n observation_model::AbstractObservationModel\n)\n\n\nCreate an epi-aware model using the specified epimodel, latentmodel, and observation_model.\n\nArguments\n\ny_t: The observed data.\ntime_steps: The time steps.\nepi_model: An abstract epi model.\nlatent_model: An abstract latent model.\nobservation_model: An abstract observation model.\n\nReturns\n\nnothing\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_epiaware-Tuple{EpiProblem, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_epiaware","text":"generate_epiaware(epiproblem::EpiProblem, data) -> Any\n\n\nGenerate an epi-aware model given an EpiProblem and data.\n\nArguments\n\nepiproblem: Epi problem specification.\ndata: Observed data.\n\nReturns\n\nA tuple containing the generated quantities of the epi-aware model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_latent-Tuple{AbstractLatentModel, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_latent","text":"generate_latent(latent_model::AbstractLatentModel, n) -> Any\n\n\nConstructor function for a latent process path Z_t of length n.\n\nThe generate_latent function implements a model of generating a latent process. Which model for generating the latent process infections is implemented is set by the type of latent_model. If no implemention is defined for the type of latent_model, then EpiAware will pass a warning and return nothing.\n\nInterface to Turing.jl probablilistic programming language (PPL)\n\nApart from the no implementation fallback method, the generate_latent implementation function should return a constructor function for a DynamicPPL.Model object. Sample paths of Z_t are generated quantities of the constructed model. Priors for model parameters are fields of epi_model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_latent_infs-Tuple{AbstractEpiModel, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_latent_infs","text":"generate_latent_infs(\n epi_model::AbstractEpiModel,\n Z_t\n) -> Any\n\n\nConstructor function for unobserved/latent infections based on the type of epi_model <: AbstractEpimodel and a latent process path Z_t.\n\nThe generate_latent_infs function implements a model of generating unobserved/latent infections conditional on a latent process. Which model of generating unobserved/latent infections to be implemented is set by the type of epi_model. 
If no implemention is defined for the given epi_model, then EpiAware will return a warning and return nothing.\n\nInterface to Turing.jl probablilistic programming language (PPL)\n\nApart from the no implementation fallback method, the generate_latent_infs implementation function returns a constructor function for a DynamicPPL.Model object where the unobserved/latent infections are a generated quantity. Priors for model parameters are fields of epi_model.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generate_observations-Tuple{AbstractObservationModel, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::AbstractObservationModel,\n y_t,\n Y_t\n) -> Any\n\n\nConstructor function for generating observations based on the given observation model.\n\nThe generate_observations function implements a model of generating observations based on the given observation model. Which model of generating observations to be implemented is set by the type of obs_model. If no implemention is defined for the given obs_model, then EpiAware will return a warning and return nothing.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareBase/public/#EpiAware.EpiAwareBase.generated_observables-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareBase.generated_observables","text":"generated_observables(\n model,\n data,\n solution\n) -> EpiAwareObservables\n\n\nGenerate observables from a given model and solution and return them as a EpiAwareObservables struct.\n\nArguments\n\nmodel: The model used for generating observables.\ndata: The data used for generating observables.\nsolution: The solution used for generating observables.\n\nReturns\n\nAn instance of EpiAwareObservables struct with the provided model, data, solution, and the generated observables if specified\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/#Explainers","page":"Overview","title":"Explainers","text":"","category":"section"},{"location":"getting-started/explainers/","page":"Overview","title":"Overview","text":"This section contains a series of explainers that provide a detailed overview of the EpiAware platform and its features. These explainers are designed to help you understand the platform and its capabilities, and to provide you with the information you need to get started using EpiAware. 
See the sidebar for the list of explainers.","category":"page"},{"location":"lib/EpiObsModels/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpiObsModels.jl's internal interface.","category":"page"},{"location":"lib/EpiObsModels/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiObsModels/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiObsModels/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiObsModels/internals/","page":"Internal API","title":"Internal API","text":"Modules = [EpiAware.EpiObsModels]\nPublic = false","category":"page"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.LDStep","page":"Internal API","title":"EpiAware.EpiObsModels.LDStep","text":"struct LDStep{D<:(AbstractVector{<:Real})} <: AbstractAccumulationStep\n\nThe LatentDelay step function struct\n\n\n\nFields\n\nrev_pmf::AbstractVector{<:Real}\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.LDStep-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiObsModels.LDStep","text":"The LatentDelay step function method for accumulate_scan.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{AbstractTuringObservationErrorModel, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::AbstractTuringObservationErrorModel,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations from an observation error model. It provides support for missing values in observations (y_t), and expected observations (Y_t) that are shorter than observations. When this is the case it assumes that the expected observations are the last length(Y_t) elements of y_t. It also pads the expected observations with a small value (1e-6) to mitigate potential numerical issues.\n\nIt dispatches to the observation_error function to generate the observation error distribution which uses priors generated by generate_observation_error_priors submodel. 
For most observation error models specific implementations of observation_error and generate_observation_error_priors are required but a specific implementation of generate_observations is not required.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{Ascertainment, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::Ascertainment,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations based on the LatentDelay observation model.\n\nArguments\n\nobs_model::Ascertainment: The Ascertainment model.\ny_t: The current state of the observations.\nY_t` : The expected observations.\n\nReturns\n\ny_t: The updated observations.\nexpected_aux: Additional expected observation-related variables.\nobs_aux: Additional observation-related variables.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{LatentDelay, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::LatentDelay,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations based on the LatentDelay observation model.\n\nArguments\n\nobs_model::LatentDelay: The LatentDelay observation model.\ny_t: The current observations.\nI_t: The current infection indicator.\n\nReturns\n\ny_t: The updated observations.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{StackObservationModels, NamedTuple, AbstractVector}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::StackObservationModels,\n y_t::NamedTuple,\n Y_t::AbstractVector\n) -> Any\n\n\nGenerate observations from a stack of observation models. Maps Y_t to a NamedTuple of the same length as y_t assuming a 1 to many mapping.\n\nArguments\n\nobs_model::StackObservationModels: The stack of observation models.\ny_t::NamedTuple: The observed values.\nY_t::AbstractVector: The expected values.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{StackObservationModels, NamedTuple, NamedTuple}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs_model::StackObservationModels,\n y_t::NamedTuple,\n Y_t::NamedTuple\n) -> Any\n\n\nGenerate observations from a stack of observation models. 
Assumes a 1 to 1 mapping between y_t and Y_t.\n\nArguments\n\nobs_model::StackObservationModels: The stack of observation models.\ny_t::NamedTuple: The observed values.\nY_t::NamedTuple: The expected values.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareBase.generate_observations-Tuple{TransformObservationModel, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareBase.generate_observations","text":"generate_observations(\n obs::TransformObservationModel,\n y_t,\n Y_t\n) -> Any\n\n\nGenerates observations or accumulates log-likelihood based on the TransformObservationModel.\n\nArguments\n\nobs::TransformObservationModel: The TransformObservationModel.\ny_t: The current state of the observations.\nY_t: The expected observations.\n\nReturns\n\ny_t: The updated observations.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiAwareUtils.get_state-Tuple{EpiAware.EpiObsModels.LDStep, Any, Any}","page":"Internal API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::EpiAware.EpiObsModels.LDStep,\n initial_state,\n state\n) -> Any\n\n\nThe LatentDelay step function method for get_state.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.NegativeBinomialMeanClust-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiObsModels.NegativeBinomialMeanClust","text":"NegativeBinomialMeanClust(μ, α) -> SafeNegativeBinomial\n\n\nCompute the mean-cluster factor negative binomial distribution.\n\nArguments\n\nμ: The mean of the distribution.\nα: The clustering factor parameter.\n\nReturns\n\nA NegativeBinomial distribution object.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiObsModels/internals/#EpiAware.EpiObsModels.generate_observation_kernel-Tuple{Any, Any}","page":"Internal API","title":"EpiAware.EpiObsModels.generate_observation_kernel","text":"generate_observation_kernel(\n delay_int,\n time_horizon;\n partial\n) -> Any\n\n\nGenerate an observation kernel matrix based on the given delay interval and time horizon.\n\nArguments\n\ndelay_int::Vector{Float64}: The delay PMF vector.\ntime_horizon::Int: The number of time steps of the observation period.\npartial::Bool: Whether to generate a partial observation kernel matrix.\n\nReturns\n\nK::SparseMatrixCSC{Float64, Int}: The observation kernel matrix.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#Internal-Documentation","page":"Internal API","title":"Internal Documentation","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Documentation for EpInference.jl's internal interface.","category":"page"},{"location":"lib/EpiInference/internals/#Contents","page":"Internal API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiInference/internals/#Index","page":"Internal API","title":"Index","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Pages = [\"internals.md\"]","category":"page"},{"location":"lib/EpiInference/internals/#Internal-API","page":"Internal API","title":"Internal API","text":"","category":"section"},{"location":"lib/EpiInference/internals/","page":"Internal API","title":"Internal API","text":"Modules = 
[EpiAware.EpiInference]\nPublic = false","category":"page"},{"location":"lib/EpiInference/internals/#EpiAware.EpiAwareBase._apply_method","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::ManyPathfinder;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::ManyPathfinder,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply a ManyPathfinder method to a DynamicPPL.Model object.\n\nIf prev_result is a vector of real numbers, then the ManyPathfinder method is applied with the initial values set to prev_result. Otherwise, the ManyPathfinder method is run with default initial values generated.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiInference/internals/#EpiAware.EpiAwareBase._apply_method-2","page":"Internal API","title":"EpiAware.EpiAwareBase._apply_method","text":"_apply_method(\n model::DynamicPPL.Model,\n method::NUTSampler;\n ...\n) -> Any\n_apply_method(\n model::DynamicPPL.Model,\n method::NUTSampler,\n prev_result;\n kwargs...\n) -> Any\n\n\nApply NUTS sampling to a DynamicPPL.Model object with prev_result representing any initial results to use for sampler initialisation.\n\n\n\n\n\n","category":"function"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._apply_nuts-Tuple{Any, Any, Any}","page":"Internal API","title":"EpiAware.EpiInference._apply_nuts","text":"_apply_nuts(model, method, prev_result; kwargs...) -> Any\n\n\nNo initialisation NUTS.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._apply_nuts-Tuple{Any, Any, Pathfinder.PathfinderResult}","page":"Internal API","title":"EpiAware.EpiInference._apply_nuts","text":"_apply_nuts(\n model,\n method,\n prev_result::Pathfinder.PathfinderResult;\n kwargs...\n) -> Any\n\n\nInitialise NUTS with initial parameters from a Pathfinder result.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._continue_manypathfinder!-Tuple{Any, DynamicPPL.Model}","page":"Internal API","title":"EpiAware.EpiInference._continue_manypathfinder!","text":"_continue_manypathfinder!(\n pfs,\n mdl::DynamicPPL.Model;\n max_tries,\n nruns,\n kwargs...\n)\n\n\nContinue running the pathfinder algorithm until a pathfinder succeeds or the maximum number of tries is reached.\n\nArguments\n\npfs: An array of pathfinder objects.\nmdl::DynamicPPL.Model: The model to perform inference on.\nmax_tries: The maximum number of tries to run the pathfinder algorithm. 
Default is Inf.\nnruns: The number of times to run the pathfinder function.\nkwargs...: Additional keyword arguments passed to pathfinder.\n\nReturns\n\npfs: The updated array of pathfinder objects.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._get_best_elbo_pathfinder-Tuple{Any}","page":"Internal API","title":"EpiAware.EpiInference._get_best_elbo_pathfinder","text":"_get_best_elbo_pathfinder(pfs) -> Any\n\n\nSelects the pathfinder with the highest ELBO estimate from a list of pathfinders.\n\nArguments\n\npfs: A list of pathfinders results or Symbol values indicating failure.\n\nReturns\n\nThe pathfinder with the highest ELBO estimate.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiInference/internals/#EpiAware.EpiInference._run_manypathfinder-Tuple{DynamicPPL.Model}","page":"Internal API","title":"EpiAware.EpiInference._run_manypathfinder","text":"_run_manypathfinder(mdl::DynamicPPL.Model; nruns, kwargs...)\n\n\nRun pathfinder multiple times and store the results in an array. Fails safely.\n\nArguments\n\nmdl::DynamicPPL.Model: The Turing model to be used for inference.\nnruns: The number of times to run the pathfinder function.\nkwargs...: Additional keyword arguments passed to pathfinder.\n\nReturns\n\nAn array of PathfinderResult objects or Symbol values indicating success or failure.\n\n\n\n\n\n","category":"method"},{"location":"getting-started/explainers/modelling-infections/#Modelling-infections","page":"Modelling infections","title":"Modelling infections","text":"","category":"section"},{"location":"developer/contributing/#Contributing","page":"Contributing","title":"Contributing","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"This page details the some of the guidelines that should be followed when contributing to this package. It is adapted from Documenter.jl.","category":"page"},{"location":"developer/contributing/#Branches","page":"Contributing","title":"Branches","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"release-* branches are used for tagged minor versions of this package. This follows the same approach used in the main Julia repository, albeit on a much more modest scale.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Please open pull requests against the master branch rather than any of the release-* branches whenever possible.","category":"page"},{"location":"developer/contributing/#Backports","page":"Contributing","title":"Backports","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Bug fixes are backported to the release-* branches using git cherry-pick -x by a EpiAware member and will become available in point releases of that particular minor version of the package.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Feel free to nominate commits that should be backported by opening an issue. 
Requests for new point releases to be tagged in METADATA.jl can also be made in the same way.","category":"page"},{"location":"developer/contributing/#release-*-branches","page":"Contributing","title":"release-* branches","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Each new minor version x.y.0 gets a branch called release-x.y (a protected branch).\nNew versions are usually tagged only from the release-x.y branches.\nFor patch releases, changes get backported to the release-x.y branch via a single PR with the standard name \"Backports for x.y.z\" and label \"Type: Backport\". The PR message links to all the PRs that are providing commits to the backport. The PR gets merged as a merge commit (i.e. not squashed).\nThe old release-* branches may be removed once they have outlived their usefulness.\nPatch version milestones are used to keep track of which PRs get backported etc.","category":"page"},{"location":"developer/contributing/#Style-Guide","page":"Contributing","title":"Style Guide","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Follow the style of the surrounding text when making changes. When adding new features please try to stick to the following points whenever applicable. This project follows the SciML style guide.","category":"page"},{"location":"developer/contributing/#Tests","page":"Contributing","title":"Tests","text":"","category":"section"},{"location":"developer/contributing/#Unit-tests","page":"Contributing","title":"Unit tests","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"As is conventional for Julia packages, unit tests are located at test/*.jl with the entrypoint test/runtests.jl.","category":"page"},{"location":"developer/contributing/#End-to-end-testing","page":"Contributing","title":"End to end testing","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Tests that build example package docs from source and inspect the results (end to end tests) are located in /test/examples. 
The main entry points are test/examples/make.jl for building and test/examples/test.jl for doing some basic checks on the generated outputs.","category":"page"},{"location":"developer/contributing/#Benchmarking","page":"Contributing","title":"Benchmarking","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Benchmarking is orchestrated using PkgBenchmark.jl along with a GitHub action that uses BenchmarkCI.jl. The benchmarks are located in benchmarks/ and the main entry point is benchmarks/runbenchmarks.jl.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"The main function in the benchmark environment is make_epiaware_suite which calls TuringBenchmarking.make_turing_suite on a set of Turing models generated by EpiAware, benchmarking their sampling with the following autodiff backends:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"ForwardDiff.jl.\nReverseDiff.jl: With compile = false.\nReverseDiff.jl: With compile = true.","category":"page"},{"location":"developer/contributing/#Benchmarking-\"gotchas\"","page":"Contributing","title":"Benchmarking \"gotchas\"","text":"","category":"section"},{"location":"developer/contributing/#Models-with-no-parameters","page":"Contributing","title":"Models with no parameters","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"In EpiAware we do expose some models that do not have parameters, for example, Poisson sampling with a transformation on a fixed mean process implemented by TransformObservationModel(NegativeBinomialError()) has no sampleable parameters (although it does contribute log-likelihood as part of a wider model). This causes TuringBenchmarking.make_turing_suite to throw an error as it expects all models to have parameters.","category":"page"},{"location":"developer/contributing/#Pluto-usage-in-showcase-documentation","page":"Contributing","title":"Pluto usage in showcase documentation","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Some of the showcase examples in EpiAware/docs/src/showcase use Pluto.jl notebooks for the underlying computation. The output of the notebooks is rendered into HTML for inclusion in the documentation in two steps:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"PlutoStaticHTML.jl converts the notebook with output into a machine-readable .md format.\nDocumenter.jl renders the .md file into HTML for inclusion in the documentation during the build process.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"For other examples of using Pluto to generate documentation see the examples shown here.","category":"page"},{"location":"developer/contributing/#Running-Pluto-notebooks-from-EpiAware-locally","page":"Contributing","title":"Running Pluto notebooks from EpiAware locally","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"To run the Pluto.jl scripts in the EpiAware documentation directly from the source code you can do these steps:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Install Pluto.jl locally. 
We recommend using the version of Pluto that is pinned in the Project.toml file defining the documentation environment.\nClone the EpiAware repository.\nStart Pluto.jl either from REPL (see the Pluto.jl documentation) or from the command line with the shell script EpiAware/docs/pluto-scripts.sh.\nFrom the Pluto.jl interface, navigate to the Pluto.jl script you want to run.","category":"page"},{"location":"developer/contributing/#Contributing-to-Pluto-notebooks-in-EpiAware-documentation","page":"Contributing","title":"Contributing to Pluto notebooks in EpiAware documentation","text":"","category":"section"},{"location":"developer/contributing/#Modifying-an-existing-Pluto-notebook","page":"Contributing","title":"Modifying an existing Pluto notebook","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Committing changes to the Pluto.jl notebooks in the EpiAware documentation is the same as committing changes to any other part of the repository. However, please note that we expect the following features for the environment management of the notebooks:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Use the environment determined by the Project.toml file in the EpiAware/docs directory. If you want extra packages, add them to this environment.\nUse the version of EpiAware that is used in these notebooks to be the version of EpiAware on the branch being pull requested into main. To do this use the Pkg.develop function.","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"To do this you can use the following code snippet in the Pluto notebook:","category":"page"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"# Determine the relative path to the `EpiAware/docs` directory\ndocs_dir = dirname(dirname(dirname(dirname(@__DIR__))))\n# Determine the relative path to the `EpiAware` package directory\npkg_dir = dirname(docs_dir)\n\nusing Pkg: Pkg\nPkg.activate(docs_dir)\nPkg.develop(; path = pkg_dir)\nPkg.instantiate()","category":"page"},{"location":"developer/contributing/#Adding-a-new-Pluto-notebook","page":"Contributing","title":"Adding a new Pluto notebook","text":"","category":"section"},{"location":"developer/contributing/","page":"Contributing","title":"Contributing","text":"Adding a new Pluto.jl notebook to the EpiAware documentation is the same as adding any other file to the repository. However, in addition to following the guidelines for modifying an existing notebook, please note that the new notebook is added to the set of notebook builds using build in the EpiAware/docs/make.jl file. This will generate an .md of the same name as the notebook which can be rendered when makedocs is run. For this document to be added to the overall documentation the path to the .md file must be added to the Pages array defined in EpiAware/docs/pages.jl.","category":"page"},{"location":"developer/checklist/#Checklists","page":"Release checklist","title":"Checklists","text":"","category":"section"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"The purpose of this page is to collate a series of checklists for commonly performed changes to the source code of EpiAware. 
It has been adapted from Documenter.jl.","category":"page"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"In each case, copy the checklist into the description of the pull request.","category":"page"},{"location":"developer/checklist/#Making-a-release","page":"Release checklist","title":"Making a release","text":"","category":"section"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"In preparation for a release, use the following checklist. These steps should be performed on a branch with an open pull request, either for a topic branch, or for a new branch release-1.y.z (\"Release version 1.y.z\") if multiple changes have accumulated on the master branch since the last release.","category":"page"},{"location":"developer/checklist/","page":"Release checklist","title":"Release checklist","text":"## Pre-release\n\n - [ ] Change the version number in `Project.toml`\n * If the release is breaking, increment MAJOR\n * If the release adds a new user-visible feature, increment MINOR\n * Otherwise (bug-fixes, documentation improvements), increment PATCH\n - [ ] Update `CHANGELOG.md`, following the existing style (in particular, make sure that the change log for this version has the correct version number and date).\n - [ ] Run `make changelog`, to make sure that all the issue references in `CHANGELOG.md` are up to date.\n - [ ] Check that the commit messages in this PR do not contain `[ci skip]`\n - [ ] Run https://github.com/JuliaDocs/Documenter.jl/actions/workflows/regression-tests.yml\n using a `workflow_dispatch` trigger to check for any changes that broke extensions.\n\n## The release\n\n - [ ] After merging the pull request, tag the release. There are two options for this:\n\n 1. [Comment `[at]JuliaRegistrator register` on the GitHub commit.](https://github.com/JuliaRegistries/Registrator.jl#via-the-github-app)\n 2. Use [JuliaHub's package registration feature](https://help.juliahub.com/juliahub/stable/contribute/#registrator) to trigger the registration.\n\n Either of those should automatically publish a new version to the Julia registry.\n - Once registered, the `TagBot.yml` workflow should create a tag, and rebuild the documentation for this tag.\n - These steps can take quite a bit of time (1 hour or more), so don't be surprised if the new documentation takes a while to appear.","category":"page"},{"location":"lib/EpiObsModels/#EpiObsModels.jl","page":"Overview","title":"EpiObsModels.jl","text":"","category":"section"},{"location":"lib/EpiObsModels/","page":"Overview","title":"Overview","text":"This package provides observation models for the EpiAware ecosystem.","category":"page"},{"location":"lib/EpiObsModels/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiObsModels/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiObsModels/public.md\", \"lib/EpiObsModels/internals.md\"]","category":"page"},{"location":"lib/EpiAwareBase/#EpiAwareBase.jl","page":"Overview","title":"EpiAwareBase.jl","text":"","category":"section"},{"location":"lib/EpiAwareBase/","page":"Overview","title":"Overview","text":"This package provides the core functionality for the EpiAware ecosystem. 
It is a dependency of all other EpiAware packages.","category":"page"},{"location":"lib/EpiAwareBase/#API","page":"Overview","title":"API","text":"","category":"section"},{"location":"lib/EpiAwareBase/","page":"Overview","title":"Overview","text":"Pages = [\"lib/EpiAwareBase/public.md\", \"lib/EpiAwareBase/internals.md\"]","category":"page"},{"location":"developer/#developer","page":"Overview","title":"Developer documentation","text":"","category":"section"},{"location":"developer/","page":"Overview","title":"Overview","text":"Welcome to the EpiAware developer documentation! This section is designed to help you get started with developing the package.","category":"page"},{"location":"lib/EpiAwareUtils/public/#Public-Documentation","page":"Public API","title":"Public Documentation","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Documentation for EpiAwareUtils.jl's public interface.","category":"page"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"See the Internals section of the manual for internal package docs covering all submodules.","category":"page"},{"location":"lib/EpiAwareUtils/public/#Contents","page":"Public API","title":"Contents","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]\nDepth = 2:2","category":"page"},{"location":"lib/EpiAwareUtils/public/#Index","page":"Public API","title":"Index","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Pages = [\"public.md\"]","category":"page"},{"location":"lib/EpiAwareUtils/public/#Public-API","page":"Public API","title":"Public API","text":"","category":"section"},{"location":"lib/EpiAwareUtils/public/","page":"Public API","title":"Public API","text":"Modules = [EpiAware.EpiAwareUtils]\nPrivate = false","category":"page"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils","page":"Public API","title":"EpiAware.EpiAwareUtils","text":"Module for defining utility functions.\n\n\n\n\n\n","category":"module"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.DirectSample","page":"Public API","title":"EpiAware.EpiAwareUtils.DirectSample","text":"struct DirectSample <: AbstractEpiSamplingMethod\n\nSample directly from a Turing model.\n\n\n\nFields\n\nn_samples::Union{Nothing, Int64}: Number of samples from a model. If an integer is provided, the model is sampled n_samples times using Turing.Prior() returning an MCMCChains.Chains object. 
If nothing, the model is sampled once returning a NamedTuple object of the sampled random variables along with generated quantities\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.HalfNormal","page":"Public API","title":"EpiAware.EpiAwareUtils.HalfNormal","text":"struct HalfNormal{T<:Real} <: Distributions.Distribution{Distributions.Univariate, Distributions.Continuous}\n\nCreate a half-normal prior distribution with the specified mean.\n\nArguments:\n\nμ: The mean of the half-normal distribution.\n\nReturns:\n\nA HalfNormal distribution with the specified mean.\n\nExamples:\n\nusing EpiAware, Distributions\n\nhn = HalfNormal(1.0)\n# output\nEpiAware.EpiAwareUtils.HalfNormal{Float64}(μ=1.0)\n\nfilter out all the values that are less than 0\n\nrand(hn)\n# output\n0.4508533245229199\n\ncdf(hn, 2)\n# output\n0.8894596502772643\n\nquantile(hn, 0.5)\n# output\n0.8453475393951495\n\nlogpdf(hn, 2)\n# output\n-3.1111166111445083\n\nmean(hn)\n# output\n1.0\n\nvar(hn)\n# output\n0.5707963267948966\n\n\n\nFields\n\nμ::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.SafeDiscreteUnivariateDistribution","page":"Public API","title":"EpiAware.EpiAwareUtils.SafeDiscreteUnivariateDistribution","text":"A constant alias for Distribution{Univariate, SafeIntValued}. This type represents a univariate distribution with real-valued outcomes.\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.SafeIntValued","page":"Public API","title":"EpiAware.EpiAwareUtils.SafeIntValued","text":"struct SafeIntValued <: Distributions.ValueSupport\n\nA type to represent real-valued distributions, the purpose of this type is to avoid problems with the eltype function when having rand calls in the model.\n\n\n\nFields\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.SafeNegativeBinomial","page":"Public API","title":"EpiAware.EpiAwareUtils.SafeNegativeBinomial","text":"struct SafeNegativeBinomial{T<:Real} <: Distributions.UnivariateDistribution{SafeIntValued}\n\nCreate a Negative binomial distribution with the specified mean that avoids InExactError when the mean is too large.\n\nParameterisation:\n\nWe are using a mean and cluster factorization of the negative binomial distribution such that the variance to mean relationship is:\n\nsigma^2 = mu + alpha^2 mu^2\n\nThe reason for this parameterisation is that at sufficiently large mean values (i.e. r > 1 / p) p is approximately equal to the standard fluctuation of the distribution, e.g. if p = 0.05 we expect typical fluctuations of samples from the negative binomial to be about 5% of the mean when the mean is notably larger than 20. Otherwise, we expect approximately Poisson noise. 
In our opinion, this parameterisation is useful for specifying the distribution in a way that is easier to reason on priors for p.\n\nArguments:\n\nr: The number of successes, although this can be extended to a continous number.\np: Success rate.\n\nReturns:\n\nA SafeNegativeBinomial distribution with the specified mean.\n\nExamples:\n\nusing EpiAware, Distributions\n\nbigμ = exp(48.0) #Large value of μ\nσ² = bigμ + 0.05 * bigμ^2 #Large variance\n\n# We can calculate the success rate from the mean to variance relationship\np = bigμ / σ²\nr = bigμ * p / (1 - p)\nd = SafeNegativeBinomial(r, p)\n# output\nEpiAware.EpiAwareUtils.SafeNegativeBinomial{Float64}(r=20.0, p=2.85032816548187e-20)\n\ncdf(d, 100)\n# output\n0.0\n\nlogpdf(d, 100)\n# output\n-850.1397180331871\n\nmean(d)\n# output\n7.016735912097631e20\n\nvar(d)\n# output\n2.4617291430060293e40\n\n\n\nFields\n\nr::Real\np::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.SafePoisson","page":"Public API","title":"EpiAware.EpiAwareUtils.SafePoisson","text":"struct SafePoisson{T<:Real} <: Distributions.UnivariateDistribution{SafeIntValued}\n\nCreate a Poisson distribution with the specified mean that avoids InExactError when the mean is too large.\n\nArguments:\n\nλ: The mean of the Poisson distribution.\n\nReturns:\n\nA SafePoisson distribution with the specified mean.\n\nExamples:\n\nusing EpiAware, Distributions\n\nbigλ = exp(48.0) #Large value of λ\nd = SafePoisson(bigλ)\n# output\nEpiAware.EpiAwareUtils.SafePoisson{Float64}(λ=7.016735912097631e20)\n\ncdf(d, 2)\n# output\n0.0\n\nlogpdf(d, 100)\n# output\n-7.016735912097631e20\n\nmean(d)\n# output\n7.016735912097631e20\n\nvar(d)\n# output\n7.016735912097631e20\n\n\n\nFields\n\nλ::Real\n\n\n\n\n\n","category":"type"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.accumulate_scan-Tuple{AbstractAccumulationStep, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareUtils.accumulate_scan","text":"accumulate_scan(\n acc_step::AbstractAccumulationStep,\n initial_state,\n ϵ_t\n) -> Any\n\n\nApply the `accumulate` function to the `AbstractAccumulationStep` object.\nThis is effectively a optimised version of a for loop that applies the\n`AbstractAccumulationStep` object to the input data in a single pass.\n\n# Arguments\n- `acc_step::AbstractAccumulationStep: The accumulation step function.\n- `initial_state`: The initial state of the accumulation.\n- `ϵ_t::AbstractVector{<:Real}`: The input data.\n\n# Returns\n- `state::AbstractVector{<:Real}`: The accumulated state as returned by the\n`get_state` function from the output of the `accumulate` function.\n\n# Examples\n```julia\nusing EpiAware\nstruct TestStep <: AbstractAccumulationStep\n a::Float64\nend\n\nfunction (step::TestStep)(state, ϵ)\n new_state = step.a * ϵ\n return new_state\nend\n\nacc_step = TestStep(0.5)\ninitial_state = zeros(3)\n\naccumulate_scan(acc_step, initial_state, [1.0, 2.0, 3.0])\n\nfunction get_state(acc_step::TestStep, initial_state, state)\n return state\nend\n\naccumulate_scan(acc_step, initial_state, [1.0, 2.0, 3.0])\n```\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.censored_cdf-Tuple{Distributions.Distribution}","page":"Public API","title":"EpiAware.EpiAwareUtils.censored_cdf","text":"censored_cdf(\n dist::Distributions.Distribution;\n Δd,\n D,\n upper\n) -> Any\n\n\nCreate a discrete probability cumulative distribution function (CDF) from a given distribution, assuming a uniform distribution over 
primary event times with censoring intervals of width Δd for both primary and secondary events.\n\nNB: censored_cdf returns the non-truncated CDF, i.e. the CDF without conditioning on the secondary event occuring either before or after some time.\n\nArguments\n\ndist: The distribution from which to create the PMF.\nΔd: The step size for discretizing the domain. Default is 1.0.\nD: The upper bound of the domain. Must be greater than Δd. Default D = nothing\n\nindicates that the distribution should be truncated at its upperth percentile rounded to nearest multiple of Δd.\n\nReturns\n\nA vector representing the CDF with 0.0 appended at the beginning.\n\nRaises\n\nAssertionError if the minimum value of dist is negative.\nAssertionError if Δd is not positive.\nAssertionError if D is shorter than Δd.\nAssertionError if D is not a multiple of Δd.\n\nExamples\n\nusing Distributions\nusing EpiAware.EpiAwareUtils\n\ndist = Exponential(1.0)\n\ncensored_cdf(dist; D = 10) |>\n p -> round.(p, digits=3)\n\n# output\n11-element Vector{Float64}:\n 0.0\n 0.368\n 0.767\n 0.914\n 0.969\n 0.988\n 0.996\n 0.998\n 0.999\n 1.0\n 1.0\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.censored_pmf-Tuple{Distributions.Distribution, Val{:single_censored}}","page":"Public API","title":"EpiAware.EpiAwareUtils.censored_pmf","text":"censored_pmf(\n dist::Distributions.Distribution,\n ::Val{:single_censored};\n primary_approximation_point,\n Δd,\n D\n)\n\n\nCreate a discrete probability mass function (PMF) from a given distribution, assuming that the primary event happens at primary_approximation_point * Δd within an intial censoring interval. Common single-censoring approximations are primary_approximation_point = 0 (left-hand approximation), primary_approximation_point = 1 (right-hand) and primary_approximation_point = 0.5 (midpoint).\n\nArguments\n\ndist: The distribution from which to create the PMF.\n::Val{:single_censored}: A dummy argument to dispatch to this method. The purpose of the Val\n\ntype argument is that to use single-censored approximation is an active decision.\n\nprimary_approximation_point: A approximation point for the primary time in its censoring interval.\n\nDefault is 0.5 for midpoint approximation.\n\nΔd: The step size for discretizing the domain. Default is 1.0.\nD: The upper bound of the domain. Must be greater than Δd.\n\nReturns\n\nA vector representing the PMF.\n\nRaises:\n\nAssertionError if the minimum value of dist is negative.\nAssertionError if Δd is not positive.\nAssertionError if D is not greater than Δd.\n\nExamples\n\nusing Distributions\nusing EpiAware.EpiAwareUtils\n\ndist = Exponential(1.0)\n\ncensored_pmf(dist, Val(:single_censored); D = 10) |>\n p -> round.(p, digits=3)\n\n# output\n10-element Vector{Float64}:\n 0.393\n 0.383\n 0.141\n 0.052\n 0.019\n 0.007\n 0.003\n 0.001\n 0.0\n 0.0\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.censored_pmf-Tuple{Distributions.Distribution}","page":"Public API","title":"EpiAware.EpiAwareUtils.censored_pmf","text":"censored_pmf(\n dist::Distributions.Distribution;\n Δd,\n D,\n upper\n) -> Any\n\n\nCreate a discrete probability mass function (PMF) from a given distribution, assuming a uniform distribution over primary event times with censoring intervals of width Δd for both primary and secondary events. 
The CDF for the time from the left edge of the interval containing the primary event to the secondary event is created by direct numerical integration (quadrature) of the convolution of the CDF of dist with the uniform density on [0,Δd), using the censored_cdf function. The discrete PMF for double censored delays is then found using simple differencing on the CDF.\n\nNB: censored_pmf returns a right-truncated PMF, i.e. the PMF conditioned on the secondary event occurring before or on the final secondary censoring window.\n\nArguments\n\ndist: The distribution from which to create the PMF.\nΔd: The step size for discretizing the domain. Default is 1.0.\nD: The upper bound of the domain. Must be greater than Δd. Default D = nothing\n\nindicates that the distribution should be truncated at its upperth percentile rounded to nearest multiple of Δd.\n\nReturns\n\nA vector representing the PMF.\n\nRaises\n\nAssertionError if the minimum value of dist is negative.\nAssertionError if Δd is not positive.\nAssertionError if D is shorter than Δd.\nAssertionError if D is not a multiple of Δd.\n\nExamples\n\nusing Distributions\nusing EpiAware.EpiAwareUtils\n\ndist = Exponential(1.0)\n\ncensored_pmf(dist; D = 10) |>\n p -> round.(p, digits=3)\n\n# output\n10-element Vector{Float64}:\n 0.368\n 0.4\n 0.147\n 0.054\n 0.02\n 0.007\n 0.003\n 0.001\n 0.0\n 0.0\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.get_param_array-Tuple{MCMCChains.Chains}","page":"Public API","title":"EpiAware.EpiAwareUtils.get_param_array","text":"get_param_array(chn::MCMCChains.Chains) -> Any\n\n\nExtract a parameter array from a Chains object chn that matches the shape of number of sample and chain pairs in chn.\n\nArguments\n\nchn::Chains: The Chains object containing the MCMC samples.\n\nReturns\n\nparam_array: An array of parameter samples, where each element corresponds to a single\n\nMCMC sample as a NamedTuple.\n\nExample\n\nSampling from a simple model which has both scalar and vector quantity random variables across 4 chains.\n\nusing Turing, MCMCChains, EpiAware\n\n@model function testmodel()\n y ~ Normal()\nend\nmdl = testmodel()\nchn = sample(mdl, Prior(), MCMCSerial(), 2, 1, progress=false)\n\nA = get_param_array(chn)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.get_state-Tuple{AbstractAccumulationStep, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareUtils.get_state","text":"get_state(\n acc_step::AbstractAccumulationStep,\n initial_state,\n state\n) -> Any\n\n\nProcesses the output of the `accumulate` function to return the final state.\n\n# Arguments\n- `acc_step::AbstractAccumulationStep`: The accumulation step function.\n- `initial_state`: The initial state of the accumulation.\n- `state`: The output of the `accumulate` function.\n\n# Returns\n- `state`: The combination of the initial state and the last element of\n each accumulated state.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.prefix_submodel-Tuple{AbstractModel, Function, String, Vararg{Any}}","page":"Public API","title":"EpiAware.EpiAwareUtils.prefix_submodel","text":"prefix_submodel(\n model::AbstractModel,\n fn::Function,\n prefix::String,\n kwargs...\n) -> Any\n\n\nGenerate a submodel with an optional prefix. 
A lightweight wrapper around the @submodel macro from DynamicPPL.jl.\n\nArguments\n\nmodel::AbstractModel: The model to be used.\nfn::Function: The Turing @model function to be applied to the model.\nprefix::String: The prefix to be used. If the prefix is an empty string, the submodel is created without a prefix.\n\nReturns\n\nsubmodel: The returns from the submodel are passed through.\n\nExamples\n\nusing EpiAware, DynamicPPL\nsubmodel = prefix_submodel(FixedIntercept(0.1), generate_latent, string(1), 2)\n\nWe can now draw a sample from the submodel.\n\nrand(submodel)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.scan-Union{Tuple{F}, Tuple{F, Any, Any}} where F<:AbstractModel","page":"Public API","title":"EpiAware.EpiAwareUtils.scan","text":"scan(f::AbstractModel, init, xs) -> Tuple{Any, Any}\n\n\nApply f to each element of xs and accumulate the results.\n\nf must be a callable on a sub-type of AbstractModel.\n\nDesign note\n\nscan is being restricted to AbstractModel sub-types to ensure: 1. That compiler specialization is activated 2. Also avoids potential compiler overhead from specialisation on f<: Function.\n\nArguments\n\nf: A callable/functor that takes two arguments, carry and x, and returns a new carry and a result y.\ninit: The initial value for the carry variable.\nxs: An iterable collection of elements.\n\nReturns\n\nys: An array containing the results of applying f to each element of xs.\ncarry: The final value of the carry variable after processing all elements of xs.\n\nExamples\n\n```jldoctest using EpiAware\n\nstruct Adder <: EpiAwareBase.AbstractModel end function (a::Adder)(carry, x) carry + x, carry + x end\n\nscan(Adder(), 0, 1:5) #output ([1, 3, 6, 10, 15], 15)\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.spread_draws-Tuple{MCMCChains.Chains}","page":"Public API","title":"EpiAware.EpiAwareUtils.spread_draws","text":"spread_draws(chn::MCMCChains.Chains) -> DataFrames.DataFrame\n\n\nspread_draws(chn::Chains)\n\nConverts a Chains object into a DataFrame in tidybayes format.\n\nArguments\n\nchn::Chains: The Chains object to be converted.\n\nReturns\n\ndf::DataFrame: The converted DataFrame.\n\n\n\n\n\n","category":"method"},{"location":"lib/EpiAwareUtils/public/#EpiAware.EpiAwareUtils.∫F-Tuple{Any, Any, Any}","page":"Public API","title":"EpiAware.EpiAwareUtils.∫F","text":"∫F(dist, t, Δd) -> Any\n\n\nCalculate the CDF of the random variable X + U where X has cumulative distriubtion function F and U is a uniform random variable on [0, Δd).\n\nThis is used in solving for censored CDFs and PMFs using numerical quadrature.\n\n\n\n\n\n","category":"method"},{"location":"release-notes/","page":"Release notes","title":"Release notes","text":"EditURL = \"https://github.com/JuliaDocs/Documenter.jl/blob/master/CHANGELOG.md\"","category":"page"},{"location":"release-notes/#Release-notes","page":"Release notes","title":"Release notes","text":"","category":"section"},{"location":"release-notes/","page":"Release notes","title":"Release notes","text":"The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.","category":"page"},{"location":"release-notes/#Unreleased","page":"Release notes","title":"Unreleased","text":"","category":"section"},{"location":"release-notes/#Added","page":"Release notes","title":"Added","text":"","category":"section"},{"location":"release-notes/#Changed","page":"Release 
notes","title":"Changed","text":"","category":"section"},{"location":"release-notes/#Fixed","page":"Release notes","title":"Fixed","text":"","category":"section"},{"location":"getting-started/#getting-started","page":"Overview","title":"Getting started","text":"","category":"section"},{"location":"getting-started/","page":"Overview","title":"Overview","text":"Note that this section of the documentation is still under construction. Please see replications for the most up-to-date information. Please feel free to contribute to the documentation by submitting a pull request.","category":"page"},{"location":"getting-started/","page":"Overview","title":"Overview","text":"Welcome to the EpiAware documentation! This section is designed to help you get started with the package. It includes a frequently asked questions (FAQ) section, a series of explainers that provide a detailed overview of the platform and its features, and tutorials that will help you get started with EpiAware for specific tasks. See the sidebar for the list of topics.","category":"page"},{"location":"getting-started/explainers/observation-models/#Observation-models","page":"Observation models","title":"Observation models","text":"","category":"section"},{"location":"showcase/#showcase","page":"Overview","title":"EpiAware Showcase","text":"","category":"section"},{"location":"showcase/","page":"Overview","title":"Overview","text":"Here we showcase the capabilities of EpiAware in action. If you have a showcase you would like to add, please submit a pull request.","category":"page"}] } diff --git a/previews/PR510/showcase/index.html b/previews/PR510/showcase/index.html index cf8ffb371..59a40bb46 100644 --- a/previews/PR510/showcase/index.html +++ b/previews/PR510/showcase/index.html @@ -1,2 +1,2 @@ -Overview · EpiAware.jl

              EpiAware Showcase

              Here we showcase the capabilities of EpiAware in action. If you have a showcase you would like to add, please submit a pull request.

              +Overview · EpiAware.jl

              EpiAware Showcase

              Here we showcase the capabilities of EpiAware in action. If you have a showcase you would like to add, please submit a pull request.

              diff --git a/previews/PR510/showcase/replications/chatzilena-2019/index.html b/previews/PR510/showcase/replications/chatzilena-2019/index.html index 5b9b2f771..2e8eb95da 100644 --- a/previews/PR510/showcase/replications/chatzilena-2019/index.html +++ b/previews/PR510/showcase/replications/chatzilena-2019/index.html @@ -541,4 +541,4 @@

              - + diff --git a/previews/PR510/showcase/replications/mishra-2020/index.html b/previews/PR510/showcase/replications/mishra-2020/index.html index 245512481..ee4c7ed4a 100644 --- a/previews/PR510/showcase/replications/mishra-2020/index.html +++ b/previews/PR510/showcase/replications/mishra-2020/index.html @@ -435,4 +435,4 @@

              - +