From 761a8cac8ff5bef8b3d08fc21354be4f2c8d6a75 Mon Sep 17 00:00:00 2001
From: autodocs
Date: Sun, 24 Feb 2019 11:15:59 +0000
Subject: [PATCH] build based on e920baf

---
 dev/api/index.html | 32 ++++++++++++++++----------------
 1 file changed, 16 insertions(+), 16 deletions(-)

diff --git a/dev/api/index.html b/dev/api/index.html
index 9e57af22a..feb9e9055 100644
--- a/dev/api/index.html
+++ b/dev/api/index.html
@@ -1,52 +1,52 @@
API · BAT

API

Types

Functions

Documentation

AbstractDensity

The following functions must be implemented for subtypes:

  • BAT.nparams
  • BAT.unsafe_density_logval

In some cases, it may be desirable to override the default implementations of the functions

  • BAT.exec_capabilities
  • BAT.unsafe_density_logval!
source
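A minimal sketch of this interface (the type name and the toy density below are made up for illustration; it is assumed that the two required functions suffice for basic use):

using BAT

# Illustrative subtype: an uncorrelated standard normal in `dim` dimensions.
struct MyStdNormal <: BAT.AbstractDensity
    dim::Int
end

# Required: number of parameters (dimensionality).
BAT.nparams(d::MyStdNormal) = d.dim

# Required: unchecked log-density value for a single parameter vector.
function BAT.unsafe_density_logval(
    d::MyStdNormal,
    params::AbstractVector{<:Real},
    exec_context::BAT.ExecContext = BAT.ExecContext()
)
    -0.5 * sum(abs2, params) - 0.5 * d.dim * log(2π)
end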
AbstractMCMCCallback <: Function

Subtypes (e.g. X) must support

(::X)(level::Integer, chain::MCMCIterator) => nothing
(::X)(level::Integer, tuner::AbstractMCMCTuner) => nothing

to be compatible with mcmc_iterate!, mcmc_tune_burnin!, etc.

source
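For illustration (the type and counter field below are made up; it is assumed that subtyping AbstractMCMCCallback plus the two call methods above is sufficient), a callback that simply counts how often it is invoked:

using BAT

mutable struct CallCounter <: BAT.AbstractMCMCCallback
    n::Int   # number of times the callback has been invoked
end
CallCounter() = CallCounter(0)

(cb::CallCounter)(level::Integer, chain::BAT.MCMCIterator) = (cb.n += 1; nothing)
(cb::CallCounter)(level::Integer, tuner::BAT.AbstractMCMCTuner) = (cb.n += 1; nothing)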
AbstractProposalDist

The following functions must be implemented for subtypes:

  • BAT.distribution_logpdf
  • BAT.proposal_rand!
  • BAT.nparams, returning the number of parameters (i.e. dimensionality).
  • LinearAlgebra.issymmetric, indicating whether p(a -> b) == p(b -> a) holds true.

In some cases, it may be desirable to override the default implementation of BAT.distribution_logpdf!.

source
BAT.DataSet - Type.
DataSet{T<:AbstractFloat, I<:Integer}

Holds the MCMC output. For construction, use the constructor: DataSet{T<:Real}(data::Matrix{T}, logprob::Vector{T}, weights::Vector{T})

Variables

  • 'data' : A P x N array of N data points with P parameters.
  • 'logprob' : The logarithmic probability for each sample, stored in an array
  • 'weights' : How often each sample occurred. Set to an array of ones if working directly on MCMC output
  • 'ids' : Array which is used to assign each sample to a batch, required for the covariance-weighted uncertainty estimation
  • 'sortids' : an array of indices which stores the original ordering of the samples (the space partitioning tree reorders the samples), required to calculate an effective sample size.
  • 'N' : number of samples
  • 'P' : number of parameters
  • 'nsubsets' : the number of batches
  • 'iswhitened' : a boolean value which indicates whether the data set is whitened
  • 'isnew' : a boolean value which indicates whether the data set was swapped out with a different one (it is possible to redo the integration with a different sample set using previously generated hyper-rectangles)
  • 'partitioningtree' : The space partitioning tree, used to efficiently identify samples in a point cloud
  • 'startingIDs' : The Hyper-Rectangle Seed Samples are stored in this array
  • 'tolerance' : A threshold required for the hyper-rectangle creation process.
source
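A hedged construction sketch using fabricated MCMC-like output (it is assumed that the documented three-argument constructor fills in the remaining fields):

using BAT

P, N = 2, 1000
data = randn(P, N)                                            # P x N: N samples with P parameters
logprob = [-0.5 * sum(abs2, view(data, :, i)) for i in 1:N]   # log-probability per sample
weights = ones(N)                                             # unit weights for raw MCMC output

dataset = BAT.DataSet(data, logprob, weights)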
struct ExecCapabilities
     nthreads::Int
     threadsafe::Bool
     nprocs::Int
     remotesafe::Bool
end

Specifies the execution capabilities of functions that support an ExecContext argument.

nthreads specifies the maximum number of threads the function can utilize efficiently, internally. If nthreads <= 1, the function implementation is single-threaded. nthreads == 0 indicates that the function is cheap and that, when used in combination with other functions, their capabilities should dominate.

threadsafe specifies whether the function is thread-safe and can be run on multiple threads in parallel by the caller.

nprocs specifies the maximum number of worker processes the function can utilize efficiently, internally. If nprocs <= 1, the function cannot make use of worker processes internally. nprocs == 0 carries the equivalent meaning to nthreads == 0.

remotesafe specifies that the function can be run on a remote thread; it implies that the function arguments can be (de-)serialized safely.

Functions with an ExecContext argument should announce their capabilities via methods of exec_capabilities. Functions should, ideally, either support internal multithreading (nthreads > 1) or be thread-safe (threadsafe == true). Likewise, functions should either utilize worker processes (nprocs > 1) internally or support remote execution (remotesafe == true) by the caller.

source
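As an example of announcing capabilities (a sketch; my_logdensity and the use of the default field-order constructor are assumptions), a function that is single-threaded internally, thread-safe, free of internal worker-process use, and remote-safe could declare:

using BAT

my_logdensity(params::AbstractVector{<:Real}) = -0.5 * sum(abs2, params)

# Single-threaded internally, thread-safe, no internal worker processes, remote-safe.
BAT.exec_capabilities(::typeof(my_logdensity), params::AbstractVector{<:Real}) =
    BAT.ExecCapabilities(1, true, 1, true)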
BAT.ExecContext - Type.
struct ExecContext
     use_threads::Bool
     onprocs::Vector{Int64}
end

Functions that take an ExecContext argument must limit their use of threads and processes accordingly. Depending on use_threads, the function may use all (or only a single) thread(s) on each process in onprocs (in addition to the current thread on the current process).

The caller may choose to change the ExecContext from call to call, based on execution time and latency measurements, etc.

Functions can announce their BAT.ExecCapabilities via exec_capabilities.

source
GRConvergence

Gelman-Rubin $$maximum(R^2)$$ convergence test.

source
GenericDensity{F} <: AbstractDensity

Constructors:

GenericDensity(log_f, nparams::Int)

Turns the logarithmic density function log_f into a BAT-compatible AbstractDensity. log_f must support

`log_f(params::AbstractVector{<:Real})::Real`

with length(params) == nparams.

It must be safe to execute log_f in parallel on multiple threads and processes.

source
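A short usage sketch combining GenericDensity with nparams and density_logval (documented further below); the log-density here is a toy example:

using BAT

log_f(params::AbstractVector{<:Real}) = -0.5 * sum(abs2, params)   # thread-safe, no shared state

density = BAT.GenericDensity(log_f, 3)     # wrap log_f as a density with 3 parameters

BAT.nparams(density)                       # 3
BAT.density_logval(density, zeros(3))      # log density at the origin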
BAT.HMIData - Type.
HMIData{T<:AbstractFloat, I<:Integer}

Includes all the information about the integration process, including a list of hyper-rectangles, the results of the whitening transformation, the starting ids, and the average number of points and volume of the created hyper-rectangles.

Variables

  • 'dataset1' : Data Set 1
  • 'dataset2' : Data Set 2
  • 'whiteningresult' : contains the whitening matrix and its determinant, required to scale the final integral estimate
  • 'volumelist1' : An array of integration volumes created using dataset1, but filled with samples from dataset2
  • 'volumelist2' : An array of integration volumes created using dataset2, but filled with samples from dataset1
  • 'cubelist1' : An array of small hyper-cubes created around seeding samples of dataset 1
  • 'cubelist2' : An array of small hyper-cubes created around seeding samples of dataset 2
  • 'iterations1' : The number of volume-adapting iterations used for creating volumelist1
  • 'iterations2' : The number of volume-adapting iterations used for creating volumelist2
  • 'rejectedrects1' : An array of ids, indicating which hyper-rectangles of volumelist1 were rejected due to trimming
  • 'rejectedrects2' : An array of ids, indicating which hyper-rectangles of volumelist2 were rejected due to trimming
  • 'integralestimates' : A dictionary containing the final integral estimates with uncertainty estimation using different uncertainty estimators. Also includes all intermediate results required for the integral estimate combination
source
BAT.HMISettings - Type.
HMISettings

Holds the settings for the hm_integrate function. There are several default constructors available: HMIFastSettings(), HMIStandardSettings(), HMIPrecisionSettings().

Variables

  • 'whitening_method::Symbol' : which whitening method to use
  • 'max_startingIDs::Integer' : influences how many starting ids are allowed to be generated
  • 'maxstartingIDsfraction::AbstractFloat' : how many points are considered as possible starting points as a fraction of total points available
  • 'rect_increase::AbstractFloat' : describes the percentage increase/decrease of the rectangle volume during hyper-rectangle creation. Low values can increase the precision if enough points are available, but can cause systematically wrong results if not enough points are available.
  • 'useallrects::Bool' : All rectangles are used for the integration process, no matter how big their overlap is. If enabled, the rectangles are weighted by their overlap.
  • 'useMultiThreading' : activate multithreading support.
  • 'warning_minstartingids' : the required minimum number of starting samples
  • 'dotrimming' : determines whether the integral estimates are trimmed (1σ trim) before combining them into a final result (more robust)
  • 'uncertaintyestimators' : A dictionary of different uncertainty estimator functions. Currently three functions are available: hmcombineresultslegacy! (outdated, overestimates uncertainty significantly in higher dimensions), hmcombineresultscovweighted! (very fast) and hmcombineresults_analyticestimation! (recommended)

end

source
IntegrationVolume{T<:AbstractFloat, I<:Integer}

Variables

  • 'pointcloud' : holds the point cloud of the integration volume
  • 'spatialvolume' : the boundaries of the integration volume
  • 'volume' : the volume

Holds the point cloud and the spatial volume for integration.

source
IntegrationVolume(dataset::DataSet{T, I}, spvol::HyperRectVolume{T}, searchpts::Bool = true)::IntegrationVolume{T, I}

creates an integration region by calculating the point cloud and the volume of the spatial volume.

source
OnlineMvMean{T<:AbstractFloat} <: AbstractVector{T}

Multi-variate mean implemented via Kahan-Babuška-Neumaier summation.

source
OnlineUvMean{T<:AbstractFloat}

Univariate mean implemented via Kahan-Babuška-Neumaier summation.

source
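For orientation, the Kahan-Babuška-Neumaier compensated summation underlying these online means can be written in a few lines of plain Julia; this is the textbook algorithm, not BAT's internal implementation:

# Textbook Kahan-Babuška-Neumaier compensated summation (illustration only).
function kbn_sum(xs)
    s = 0.0    # running sum
    c = 0.0    # compensation for lost low-order bits
    for x in xs
        t = s + x
        if abs(s) >= abs(x)
            c += (s - t) + x   # low-order bits of x were lost in s + x
        else
            c += (x - t) + s   # low-order bits of s were lost in s + x
        end
        s = t
    end
    s + c
end

kbn_mean(xs) = kbn_sum(xs) / length(xs)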
BAT.PointCloud - Type.
PointCloud{T<:AbstractFloat, I<:Integer}

Stores the information about the points of, e.g., a HyperRectVolume

Variables

  • 'maxLogProb' : The maximum log. probability of one of the points inside the hyper-rectangle
  • 'minLogProb' : The minimum log. probability of one of the points inside the hyper-rectangle
  • 'maxWeightProb' : the weighted max. log. probability
  • 'minWeightProb' : the weighted min. log. probability
  • 'probfactor' : The probability factor of the hyper-rectangle
  • 'probweightfactor' : The weighted probability factor
  • 'points' : The number of points inside the hyper-rectangle
  • 'pointIDs' : the IDs of the points inside the hyper-rectangle, might be empty because it is optional and costs performance
  • 'searchres' : used to boost performance
source
BAT.PointCloud - Method.
PointCloud{T<:AbstractFloat, I<:Integer}(dataset::DataSet{T, I}, hyperrect::HyperRectVolume{T}, searchpts::Bool = false)::PointCloud

creates a point cloud by searching the data tree for points which are inside the hyper-rectangle. The parameter searchpts determines whether an array of the point IDs is created as well

source
WhiteningResult{T<:AbstractFloat}

Stores the information obtained during the Whitening Process

Variables

  • 'determinant' : The determinant of the whitening matrix
  • 'targetprobfactor' : The suggested target probability factor
  • 'whiteningmatrix' : The whitening matrix
  • 'meanvalue' : the mean vector of the input data
source
BAT.bat_sampler - Function.
bat_sampler(d::Distribution)

Tries to return a BAT-compatible sampler for Distribution d. A sampler is BAT-compatible if it supports random number generation using an arbitrary AbstractRNG:

rand(rng::AbstractRNG, s::SamplerType)
rand!(rng::AbstractRNG, s::SamplerType, x::AbstractArray)

If no specific method of bat_sampler is defined for the type of d, it will default to sampler(d), which may or may not return a BAT-compatible sampler.

source
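A usage sketch (assuming Distributions.jl is available and that the sampler returned for a Normal supports the rand/rand! methods above):

using BAT, Distributions, Random

rng = MersenneTwister(42)
d = Normal(0.0, 1.0)

s = BAT.bat_sampler(d)             # BAT-compatible sampler for d
x = rand(rng, s)                   # a single draw
xs = Vector{Float64}(undef, 100)
rand!(rng, s, xs)                  # fill xs with draws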
BAT.density_logval - Function.
density_logval(
     density::AbstractDensity,
     params::AbstractVector{<:Real},
     exec_context::ExecContext = ExecContext()
)

Version of density_logval for a single parameter vector.

Do not implement density_logval directly for subtypes of AbstractDensity, implement BAT.unsafe_density_logval instead.

See ExecContext for thread-safety requirements.

source
BAT.density_logval! - Function.
density_logval!(
     r::AbstractVector{<:Real},
     density::AbstractDensity,
     params::VectorOfSimilarVectors{<:Real},
     exec_context::ExecContext = ExecContext()
)

Compute log of values of a density function for multiple parameter value vectors.

Input:

  • density: density function
  • params: parameter values
  • exec_context: Execution context

Output is stored in

  • r: Vector of log-result values

Array size requirements:

axes(params, 1) == axes(r, 1)

Note: density_logval! must not be called with out-of-bounds parameter vectors (see param_bounds). The result of density_logval! for parameter vectors that are out of bounds is implicitly -Inf, but for performance reasons the output is left undefined: density_logval! may fail or store arbitrary values in r.

Do not implement density_logval! directly for subtypes of AbstractDensity, implement BAT.unsafe_density_logval! instead.

See ExecContext for thread-safety requirements.

source
distribution_logpdf(
     pdist::AbstractProposalDist,
     params_new::AbstractVector,
     params_old::AbstractVector
)

Analogous to distribution_logpdf!, but for a single parameter vector.

source
distribution_logpdf!(
     p::AbstractArray,
     pdist::AbstractProposalDist,
     params_new::Union{AbstractVector,VectorOfSimilarVectors},
     params_old::Union{AbstractVector,VectorOfSimilarVectors}
)

Returns log(PDF) value of pdist for transitioning from old to new parameter values for multiple parameter sets.

Input:

  • params_new: New parameter values (column vectors)
  • params_old: Old parameter values (column vectors)

Output is stored in

  • p: Array of PDF values, length must match, shape is ignored

Array size requirements:

  • size(params_old, 1) == size(params_new, 1) == length(pdist)
  • size(params_old, 2) == size(params_new, 2) or size(params_old, 2) == 1
  • size(params_new, 2) == length(p)

Implementations of distribution_logpdf! must be thread-safe.

source
BAT.fromuhc! - Function.
fromuhc!(Y::AbstractVector, X::AbstractVector, vol::SpatialVolume)
fromuhc!(Y::VectorOfSimilarVectors, X::VectorOfSimilarVectors, vol::SpatialVolume)

Bijective transformation of coordinates X within the unit hypercube to coordinates Y in vol. If X and Y are matrices, the transformation is applied to the column vectors. Use Y === X to transform in-place.

Use inv(fromuhc!) to get the inverse transformation.

source
BAT.fromuhc - Method.
fromuhc(X::AbstractVector, vol::SpatialVolume)
fromuhc(X::VectorOfSimilarVectors, vol::SpatialVolume)

Bijective transformation from unit hypercube to vol. See fromuhc!.

Use inv(fromuhc) to get the inverse transformation.

source
BAT.fromui - Function.
y = fromui(x::Real, lo::Real, hi::Real)
y = fromui(x::Real, lo_hi::ClosedInterval{<:Real})

Linear bijective transformation from the unit interval (i.e. x ∈ 0..1) to y ∈ lo..hi.

Use inv(fromui) to get the inverse transformation.

Use @inbounds to disable range checking on the input value.

source
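Concretely this is the linear map y = lo + x * (hi - lo); for example (values chosen for illustration):

using BAT

y = BAT.fromui(0.25, -1.0, 3.0)    # -1.0 + 0.25 * (3.0 - (-1.0)) == 0.0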
BAT.hm_init! - Method.

function hm_init!(result, settings)

Sets the global multithreading setting and ensures that a minimum number of samples, dependent on the number of dimensions, are provided.

source
BAT.hm_integrate! - Method.

function hm_integrate!(result, settings = HMIPrecisionSettings())

This function starts the adaptive harmonic mean integration. See arXiv:1808.08051 for more details. It needs an HMIData struct as input, which holds the samples (in the form of a dataset), the integration volumes, and other properties required for the integration, as well as the final result.

source

function hm_whiteningtransformation!(result, settings)

Applies a whitening transformation to the samples. A custom whitening method can be used by overriding settings.whitening_function!

source
issymmetric_around_origin(d::Distribution)

Returns true (resp. false) if the Distribution is symmetric (resp. non-symmetric) around the origin.

source
BAT.log_volume - Function.
log_volume(vol::SpatialVolume)

Get the logarithm of the volume of the space in vol.

source
BAT.nparams - Function.
nparams(X::Union{AbstractParamBounds,MCMCIterator,...})

Get the number of parameters of X.

source
BAT.param_bounds - Method.
param_bounds(density::AbstractDensity)::AbstractParamBounds

Get the parameter bounds of density. See density_logval! for the implications and handling of the bounds.

Use

new_density = density[bounds::ParamVolumeBounds]

to create a new density function with additional bounds.

source
BAT.proposal_rand! - Function.
function proposal_rand!(
     rng::AbstractRNG,
     pdist::GenericProposalDist,
     params_new::Union{AbstractVector,VectorOfSimilarVectors},
     params_old::Union{AbstractVector,VectorOfSimilarVectors}
)

Generate one or multiple proposed parameter vectors, based on one or multiple previous parameter vectors.

Input:

  • rng: Random number generator to use
  • pdist: Proposal distribution to use
  • params_old: Old parameter values (vector or column vectors, if a matrix)

Output is stored in

  • params_new: New parameter values (vector or column vectors, if a matrix)

The caller must guarantee:

  • size(params_old, 1) == size(params_new, 1)
  • size(params_old, 2) == size(params_new, 2) or size(params_old, 2) == 1
  • params_new !== params_old (no aliasing)

Implementations of proposal_rand! must be thread-safe.

source
BAT.spatialvolume - Function.
spatialvolume(b::ParamVolumeBounds)::SpatialVolume

Returns the spatial volume that defines the parameter bounds.

source
MCMCCallbackWrapper{F} <: AbstractMCMCCallback

Wraps a callable object to turn it into an AbstractMCMCCallback.

Constructor:

MCMCCallbackWrapper(f::Any)

f needs to support the call syntax of an AbstractMCMCCallback.

source
SearchResult{T<:AbstractFloat, I<:Integer}

Stores the results of the space partitioning tree's search function

Variables

  • 'pointIDs' : the IDs of samples found, might be empty because it is optional
  • 'points' : The number of points found.
  • 'maxLogProb' : the maximum log. probability of the points found.
  • 'minLogProb' : the minimum log. probability of the points found.
  • 'maxWeightProb' : the weighted maximum log. probability found.
  • 'minWeightProb' : the weighted minimum log. probability found.
source
BAT.apply_bounds - Function.
apply_bounds(x::Real, interval::ClosedInterval, boundary_type::BoundsType)

Specify lower and upper bound via interval.

source
BAT.apply_bounds! - Function.
apply_bounds!(params::AbstractVector, bounds::AbstractParamBounds)

Apply bounds to parameters params.

source
BAT.apply_bounds - Method.
apply_bounds(x::<:Real, lo::<:Real, hi::<:Real, boundary_type::BoundsType)

Apply lower/upper bound lo/hi to value x. boundary_type may be hard_bounds, cyclic_bounds or reflective_bounds.

source
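To illustrate what the three boundary types mean, here is a plain-Julia re-implementation of the idea (illustration only, not BAT's apply_bounds; hard bounds are shown here as clamping, while BAT may instead treat such points as invalid):

# Illustration of the three boundary behaviours for a scalar x and bounds lo..hi.
function illustrate_bounds(x::Real, lo::Real, hi::Real, boundary_type::Symbol)
    width = hi - lo
    if boundary_type == :hard
        clamp(x, lo, hi)                       # hard: stop at the limit
    elseif boundary_type == :cyclic
        lo + mod(x - lo, width)                # cyclic: wrap around
    else  # :reflective
        r = mod(x - lo, 2 * width)             # reflective: fold back at the limits
        r <= width ? lo + r : hi - (r - width)
    end
end

illustrate_bounds(1.2, 0.0, 1.0, :cyclic)      # ≈ 0.2
illustrate_bounds(1.2, 0.0, 1.0, :reflective)  # ≈ 0.8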
BAT.autocrl - Method.
autocrl(xv::AbstractVector{T}, kv::AbstractVector{Int} = Vector{Int}())

autocorrelation := Σ Cov[x_i, x_(i+k)] / Var[x]

Computes the autocorrelations at various lags k of the input vector (time series) xv. The vector kv is the collection of lags to take into account.

source
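Written out, the quantity being estimated at lag k is Cov[x_i, x_(i+k)] / Var[x]; a plain-Julia sketch (illustrative, not BAT's estimator):

# Sample autocorrelation at lag k (illustration of the quantity above).
function autocor_at_lag(xv::AbstractVector{<:Real}, k::Integer)
    n = length(xv)
    m = sum(xv) / n
    cov_k = sum((xv[i] - m) * (xv[i + k] - m) for i in 1:(n - k)) / n
    var0 = sum((x - m)^2 for x in xv) / n
    cov_k / var0
end

autocor_at_lag(randn(1000), 1)   # close to 0 for white noise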
create_hypercube{T<:Real}(origin::Vector{T}, edgelength::T)::HyperRectVolume

creates a hypercube shaped spatial volume

source

This function creates a hyper-rectangle around each starting sample. It starts by building a hyper-cube and subsequently adapts each face individually, thus turning the hyper-cube into a hyper-rectangle. The faces are adjusted in a way to match the shape of the distribution as best as possible.

source
effective_sample_size(params::AbstractArray, weights::AbstractVector; with_weights=true)

Effective size estimation for a (multidimensional) ElasticArray. By default applies the Kish approximation with the weights available, but can be turned off (with_weights=false).

source
effective_sample_size(samples::DensitySampleVector; with_weights=true)

Effective size estimation for a (multidimensional) DensitySampleVector. By default applies the Kish approximation with the weights available, but can be turned off (with_weights=false).

source

Effective size estimation for a vector of samples xv. If a weight vector w is provided, the Kish approximation is applied.

By default computes the autocorrelation up to the square root of the number of entries in the vector, unless an explicit list of lags is provided (kv).

source
eval_density_logval!(...)

Internal function to first apply bounds and then evaluate density.

Guarantees that for out-of-bounds parameters:

  • density_logval is not called
  • log value of density is set to (resp. returned as) -Inf
source
BAT.eval_prior_posterior_logval!(...)

Internal function to first apply bounds to the parameters and then compute the prior and posterior log values.

source
BAT.exec_capabilities - Function.
exec_capabilities(f, args...)::ExecCapabilities

Determines the execution capabilities of a function f that supports an ExecContext argument, when called with arguments args.... The ExecContext argument itself is excluded from args... for exec_capabilities.

Before calling f, the caller must use

exec_capabilities(f, args...)

to determine the execution capabilities of f with the intended arguments, and take the resulting ExecCapabilities into account. If f is not thread-safe (but remote-safe), and the caller needs to run it on multiple threads, the caller may deep-copy the function arguments.

source
find_hypercube_centers(dataset::DataSet{T, I}, whiteningresult::WhiteningResult, settings::HMISettings)::Vector{I}

finds possible starting points for the hyperrectangle creation

source
BAT.gr_Rsqr - Method.
gr_Rsqr(stats::AbstractVector{<:MCMCBasicStats})

Gelman-Rubin $$R^2$$ for all parameters.

source
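For orientation, the textbook Gelman-Rubin statistic for m chains of length n compares the within-chain variance W with the between-chain variance B (BAT's gr_Rsqr may differ in details):

$$\hat{R}^2 = \frac{\frac{n-1}{n} W + \frac{1}{n} B}{W}$$

Values close to 1 indicate that the chains have converged to the same distribution.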

This function assigns each thread its own hyper-rectangle to build, if in multithreading-mode.

source
BAT.initial_params! - Function.
BAT.initial_params!(
+)

Generate one or multiple proposed parameter vectors, based on one or multiple previous parameter vectors.

Input:

  • rng: Random number generator to use
  • pdist: Proposal distribution to use
  • params_old: Old parameter values (vector or column vectors, if a matrix)

Output is stored in

  • params_new: New parameter values (vector or column vectors, if a matrix)

The caller must guarantee:

  • size(params_old, 1) == size(params_new, 1)
  • size(params_old, 2) == size(params_new, 2) or size(params_old, 2) == 1
  • params_new !== params_old (no aliasing)

Implementations of proposal_rand! must be thread-safe.

source
BAT.spatialvolumeFunction.
spatialvolume(b::ParamVolumeBounds)::SpatialVolume

Returns the spatial volume that defines the parameter bounds.

source
MCMCCallbackWrapper{F} <: AbstractMCMCCallback

Wraps a callable object to turn it into an AbstractMCMCCallback.

Constructor:

MCMCCallbackWrapper(f::Any)

f needs to support the call syntax of an AbstractMCMCCallback.

source
SearchResult{T<:AbstractFloat, I<:Integer}

Stores the results of the space partitioning tree's search function

Variables

  • 'pointIDs' : the IDs of samples found, might be empty because it is optional
  • 'points' : The number of points found.
  • 'maxLogProb' : the maximum log. probability of the points found.
  • 'minLogProb' : the minimum log. probability of the points found.
  • 'maxWeightProb' : the weighted minimum log. probability found.
  • 'minWeightProb' : the weighted maximum log. probfactor found.
source
BAT.apply_boundsFunction.
apply_bounds(x::Real, interval::ClosedInterval, boundary_type::BoundsType)

Specify lower and upper bound via interval.

source
BAT.apply_bounds!Function.
apply_bounds!(params::AbstractVector, bounds::AbstractParamBounds)

Apply bounds to parameters params.

source
BAT.apply_boundsMethod.
apply_bounds(x::<:Real, lo::<:Real, hi::<:Real, boundary_type::BoundsType)

Apply lower/upper bound lo/hi to value x. boundary_type may be hard_bounds, cyclic_bounds or reflective_bounds.

source
BAT.autocrlMethod.
autocrl(xv::AbstractVector{T}, kv::AbstractVector{Int} = Vector{Int}())

autocorrelation := Σ Cov[xi,x(i+k)]/Var[x]

Computes the autocorrelations at various leg k of the input vector (time series) xv. The vector kv is the collections of lags to take into account

source
create_hypercube{T<:Real}(origin::Vector{T}, edgelength::T)::HyperRectVolume

creates a hypercube shaped spatial volume

source

This function creates a hyper-rectangle around each starting sample. It starts by building a hyper-cube and subsequently adapts each face individually, thus turning the hyper-cube into a hyper-rectangle. The faces are adjusted in a way to match the shape of the distribution as best as possible.

source
effective_sample_size(params::AbstractArray, weights::AbstractVector; with_weights=true)

Effective size estimation for a (multidimensional) ElasticArray. By default applies the Kish approximation with the weigths available, but can be turned off (with_weights=false).

source
effective_sample_size(samples::DensitySampleVector; with_weights=true)

Effective size estimation for a (multidimensional) DensitySampleVector. By default applies the Kish approximation with the weigths available, but can be turned off (with_weights=false).

source

Effective size estimation for a vector of samples xv. If a weight vector w is provided, the Kish approximation is applied.

By default computes the autocorrelation up to the square root of the number of entries in the vector, unless an explicit list of lags is provided (kv).

source
eval_density_logval!(...)

Internal function to first apply bounds and then evaluate density.

Guarantees that for out-of-bounds parameters:

  • density_logval is not called
  • log value of density is set to (resp. returned as) -Inf
source
BAT.eval_prior_posterior_logval!(...)

Internal function to first apply bounds to the parameters and then compute prior and posterior log valued.

source
BAT.exec_capabilitiesFunction.
exec_capabilities(f, args...)::ExecCapabilities

Determines the execution capabilities of a function f that supports an ExecContext argument, when called with arguments args.... The ExecContext argument itself is excluded from args... for exec_capabilities.

Before calling f, the caller must use

exec_capabilities(f, args...)

to determine the execution capabilities of f with the intended arguments, and take the resulting ExecCapabilities into account. If f is not thread-safe (but remote-safe), and the caller needs to run it on multiple threads, the caller may deep-copy the function arguments.

source
find_hypercube_centers(dataset::DataSet{T, I}, whiteningresult::WhiteningResult, settings::HMISettings)::Vector{I}

finds possible starting points for the hyperrectangle creation

source
BAT.gr_RsqrMethod.
gr_Rsqr(stats::AbstractVector{<:MCMCBasicStats})

Gelman-Rubin $$R^2$$ for all parameters.

source

This function assigns each thread its own hyper-rectangle to build, if in multithreading-mode.

source
BAT.initial_params!Function.
BAT.initial_params!(
     params::Union{AbstractVector{<:Real},VectorOfSimilarVectors{<:Real}},
     rng::AbstractRNG,
     model::AbstractBayesianModel,
     algorithm::MCMCAlgorithm
)::typeof(params)

Fill params with random initial parameters suitable for model and algorithm. The default implementation will try to draw the initial parameters from the prior of the model.

source
create_hypercube!{T<:Real}(origin::Vector{T}, edgelength::T)::HyperRectVolume

resizes a hypercube shaped spatial volume

source
modify_integrationvolume!(intvol::IntegrationVolume{T, I}, dataset::DataSet{T, I}, spvol::HyperRectVolume{T}, searchpts::Bool = true)

updates an integration volume with new boundaries. Recalculates the pointcloud and volume.

source
BAT.unsafe_density_logval(
     density::AbstractDensity,
     params::AbstractVector{<:Real},
     exec_context::ExecContext = ExecContext()
)

Unsafe variant of density_logval; implementations may rely on

  • size(params, 1) == nparams(density)

The caller must ensure that these conditions are met!

source
BAT.unsafe_density_logval!(
     r::AbstractVector{<:Real},
     density::AbstractDensity,
     params::VectorOfSimilarVectors{<:Real},
     exec_context::ExecContext
)

Unsafe variant of density_logval!; implementations may rely on

  • size(params, 1) == nparams(density)
  • size(params, 2) == length(r)

The caller must ensure that these conditions are met!

source
wgt_effective_sample_size(w::AbstractVector{T})

Kish's approximation for weighted-sample effective_sample_size estimation. Computes the weighting factor for weighted samples, where w is the vector of weights.

source
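Kish's approximation has a simple closed form; a plain-Julia sketch of the effective sample size it implies (illustrative of the quantity, not necessarily BAT's exact return value):

# Kish's effective sample size: ESS = (Σ w)^2 / Σ w^2
kish_ess(w::AbstractVector{<:Real}) = sum(w)^2 / sum(abs2, w)

kish_ess(ones(100))          # 100.0: unit weights give the nominal sample size
kish_ess([1.0, 1.0, 10.0])   # ≈ 1.41: a dominant weight sharply reduces the ESS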
Base.intersect - Method.
intersect(a::ExecCapabilities, b::ExecCapabilities)

Get the intersection of the execution capabilities of a and b, i.e. the ExecCapabilities that should be used when two functions are used in combination (e.g. in sequence).

source