From 70fef735603f24d8c38c681effe4d82df4ef7471 Mon Sep 17 00:00:00 2001
From: "Documenter.jl"
Date: Tue, 12 Sep 2023 15:06:48 +0000
Subject: [PATCH] build based on bc0d0dd

---
 dev/api/index.html                        | 14 ++++----
 dev/generated/augmentations.ipynb         | 30 ++++++++--------
 dev/generated/augmentations/index.html    | 12 +++----
 dev/generated/example.ipynb               |  2 +-
 dev/generated/example/index.html          |  2 +-
 dev/generated/heatmapping.ipynb           |  2 +-
 dev/generated/heatmapping/index.html      |  2 +-
 dev/generated/lrp/basics.ipynb            |  4 +--
 dev/generated/lrp/basics/index.html       | 42 +++++++++++------------
 dev/generated/lrp/composites/index.html   |  2 +-
 dev/generated/lrp/custom_layer/index.html |  2 +-
 dev/generated/lrp/custom_rules/index.html |  2 +-
 dev/index.html                            |  2 +-
 dev/lrp/api/index.html                    | 34 +++++++++---------
 dev/lrp/developer/index.html              |  2 +-
 dev/search/index.html                     |  2 +-
 16 files changed, 78 insertions(+), 78 deletions(-)

diff --git a/dev/api/index.html b/dev/api/index.html
index 58755d06..4d302704 100644
--- a/dev/api/index.html
+++ b/dev/api/index.html
@@ -1,10 +1,10 @@
General · ExplainableAI.jl

Basic API

All methods in ExplainableAI.jl work by calling analyze on an input and an analyzer:

ExplainableAI.analyzeFunction
analyze(input, method)
-analyze(input, method, neuron_selection)

Apply the analyzer method for the given input, returning an Explanation. If neuron_selection is specified, the explanation will be calculated for that neuron. Otherwise, the output neuron with the highest activation is automatically chosen.

See also Explanation and heatmap.

Keyword arguments

  • add_batch_dim: add batch dimension to the input without allocating. Default is false.
source
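A minimal sketch of this call, assuming a small Flux model and a random input (the layer sizes and input dimension are illustrative):

```julia
using Flux
using ExplainableAI

model = Chain(Dense(100, 50, relu), Dense(50, 10))  # toy classifier without output softmax
input = rand(Float32, 100)

analyzer = Gradient(model)
expl  = analyze(input, analyzer)     # explains the highest-activated output neuron
expl5 = analyze(input, analyzer, 5)  # explains output neuron 5 instead
```

The returned Explanation carries the analyzer output in its fields, e.g. expl.val and expl.output.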
ExplainableAI.ExplanationType

Return type of analyzers when calling analyze.

Fields

  • val: numerical output of the analyzer, e.g. an attribution or gradient
  • output: model output for the given analyzer input
  • neuron_selection: neuron index used for the explanation
  • analyzer: symbol corresponding to the analyzer used, e.g. :LRP or :Gradient
  • extras: optional named tuple that can be used by analyzers to return additional information.
source
ExplainableAI.heatmapFunction
heatmap(explanation)
+analyze(input, method, neuron_selection)

Apply the analyzer method for the given input, returning an Explanation. If neuron_selection is specified, the explanation will be calculated for that neuron. Otherwise, the output neuron with the highest activation is automatically chosen.

See also Explanation and heatmap.

Keyword arguments

  • add_batch_dim: add batch dimension to the input without allocating. Default is false.
source
ExplainableAI.ExplanationType

Return type of analyzers when calling analyze.

Fields

  • val: numerical output of the analyzer, e.g. an attribution or gradient
  • output: model output for the given analyzer input
  • neuron_selection: neuron index used for the explanation
  • analyzer: symbol corresponding to the analyzer used, e.g. :LRP or :Gradient
  • extras: optional named tuple that can be used by analyzers to return additional information.
source
ExplainableAI.heatmapFunction
heatmap(explanation)
 heatmap(input, analyzer)
-heatmap(input, analyzer, neuron_selection)

Visualize explanation. Assumes Flux's WHCN convention (width, height, color channels, batch size).

See also analyze.

Keyword arguments

  • cs::ColorScheme: color scheme from ColorSchemes.jl that is applied. When calling heatmap with an Explanation or analyzer, the method default is selected. When calling heatmap with an array, the default is ColorSchemes.seismic.
  • reduce::Symbol: selects how color channels are reduced to a single number to apply a color scheme. The following methods can be selected, which are then applied over the color channels for each "pixel" in the explanation:
    • :sum: sum up color channels
    • :norm: compute 2-norm over the color channels
    • :maxabs: compute maximum(abs, x) over the color channels
    When calling heatmap with an Explanation or analyzer, the method default is selected. When calling heatmap with an array, the default is :sum.
  • rangescale::Symbol: selects how the color channel reduced heatmap is normalized before the color scheme is applied. Can be either :extrema or :centered. When calling heatmap with an Explanation or analyzer, the method default is selected. When calling heatmap with an array, the default for use with the seismic color scheme is :centered.
  • permute::Bool: Whether to flip W&H input channels. Default is true.
  • unpack_singleton::Bool: When heatmapping a batch with a single sample, setting unpack_singleton=true will return an image instead of a Vector containing a single image.

Note: keyword arguments can't be used when calling heatmap with an analyzer.

source
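A sketch of both call styles, assuming a small convolutional Flux model and a random WHCN input (model and shapes are illustrative):

```julia
using Flux
using ExplainableAI

model = Chain(Conv((3, 3), 3 => 8, relu), Flux.flatten, Dense(8 * 26 * 26, 10))
input = rand(Float32, 28, 28, 3, 1)  # WHCN: width, height, channels, batch

analyzer = InputTimesGradient(model)
h = heatmap(input, analyzer)  # convenience form; no keyword arguments here

# keyword arguments are available when heatmapping an Explanation:
expl = analyze(input, analyzer)
h2 = heatmap(expl; reduce=:norm, rangescale=:extrema)
```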

Analyzers

ExplainableAI.LRPType
LRP(model, rules)
-LRP(model, composite)

Analyze model by applying Layer-Wise Relevance Propagation. The analyzer can either be created by passing an array of LRP-rules or by passing a composite, see Composite for an example.

Keyword arguments

  • skip_checks::Bool: Skip the checks of whether the model is compatible with LRP and whether it contains an output softmax. Default is false.
  • verbose::Bool: Select whether the model checks should print a summary on failure. Default is true.

References

[1] G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
[2] W. Samek et al., Explaining Deep Neural Networks and Beyond: A Review of Methods and Applications

source
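A sketch of rule-based construction, assuming a fully connected Flux model; the specific rule assignment is illustrative (see Composite for composite-based construction):

```julia
using Flux
using ExplainableAI

model = Chain(Dense(784, 100, relu), Dense(100, 10))

# one rule per layer of the flattened model:
analyzer = LRP(model, [ZeroRule(), EpsilonRule()])
expl = analyze(rand(Float32, 784), analyzer)
```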
ExplainableAI.GradientType
Gradient(model)

Analyze model by calculating the gradient of a neuron activation with respect to the input.

source
ExplainableAI.InputTimesGradientType
InputTimesGradient(model)

Analyze model by calculating the gradient of a neuron activation with respect to the input. This gradient is then multiplied element-wise with the input.

source
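The two gradient-based analyzers above differ only by the element-wise input factor; a sketch with an illustrative model:

```julia
using Flux
using ExplainableAI

model = Chain(Dense(10, 5, relu), Dense(5, 2))
input = rand(Float32, 10)

grad = analyze(input, Gradient(model))            # raw input gradient
ixg  = analyze(input, InputTimesGradient(model))  # gradient multiplied element-wise with input
```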
ExplainableAI.SmoothGradFunction
SmoothGrad(analyzer, [n=50, std=0.1, rng=GLOBAL_RNG])
-SmoothGrad(analyzer, [n=50, distribution=Normal(0, σ²=0.01), rng=GLOBAL_RNG])

Analyze model by calculating a smoothed sensitivity map. This is done by averaging sensitivity maps of a Gradient analyzer over random samples in a neighborhood of the input, typically by adding Gaussian noise with mean 0.

References

  • Smilkov et al., SmoothGrad: removing noise by adding noise
source
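A sketch using the positional form shown above, assuming the first positional argument is the model to be wrapped; the sample count and noise level are illustrative:

```julia
using Flux
using ExplainableAI

model = Chain(Dense(10, 5, relu), Dense(5, 2))
input = rand(Float32, 10)

analyzer = SmoothGrad(model, 50, 0.1f0)  # average 50 noisy gradient samples
expl = analyze(input, analyzer)
```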
ExplainableAI.IntegratedGradientsFunction
IntegratedGradients(analyzer, [n=50])
-IntegratedGradients(analyzer, [n=50])

Analyze model by using the Integrated Gradients method.

References

  • Sundararajan et al., Axiomatic Attribution for Deep Networks
source
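Analogously, a sketch with an illustrative step count:

```julia
using Flux
using ExplainableAI

model = Chain(Dense(10, 5, relu), Dense(5, 2))
input = rand(Float32, 10)

analyzer = IntegratedGradients(model, 50)  # 50 interpolation steps toward a zero reference
expl = analyze(input, analyzer)
```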

Input augmentations

SmoothGrad and IntegratedGradients are special cases of the input augmentations NoiseAugmentation and InterpolationAugmentation, which can be applied as a wrapper to any analyzer:

ExplainableAI.NoiseAugmentationType
NoiseAugmentation(analyzer, n, [std=1, rng=GLOBAL_RNG])
-NoiseAugmentation(analyzer, n, distribution, [rng=GLOBAL_RNG])

A wrapper around analyzers that augments the input with n samples of additive noise sampled from distribution. The explanations of these augmented inputs are then averaged to return a single Explanation.

source
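A sketch of wrapping an arbitrary analyzer; passing a Normal distribution from Distributions.jl for the distribution argument is an assumption:

```julia
using Flux
using Distributions: Normal
using ExplainableAI

model = Chain(Dense(10, 5, relu), Dense(5, 2))
input = rand(Float32, 10)

# equivalent in spirit to SmoothGrad: noisy Gradient explanations, averaged
analyzer = NoiseAugmentation(Gradient(model), 50, Normal(0.0f0, 0.1f0))
expl = analyze(input, analyzer)
```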
ExplainableAI.InterpolationAugmentationType
InterpolationAugmentation(model, [n=50])

A wrapper around analyzers that augments the input with n steps of linear interpolation between the input and a reference input (typically zero(input)). The gradients w.r.t. this augmented input are then averaged and multiplied with the difference between the input and the reference input.

source
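A sketch following the signature above (the step count is illustrative):

```julia
using Flux
using ExplainableAI

model = Chain(Dense(10, 5, relu), Dense(5, 2))
input = rand(Float32, 10)

# equivalent in spirit to IntegratedGradients with a zero(input) reference
analyzer = InterpolationAugmentation(model, 50)
expl = analyze(input, analyzer)
```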

Model preparation

ExplainableAI.canonizeFunction
canonize(model)

Canonize model by flattening it and fusing BatchNorm layers into preceding Dense and Conv layers with linear activation functions.

source
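A sketch of the fusion, assuming a nested Chain whose Conv layer has a linear activation followed by a BatchNorm:

```julia
using Flux
using ExplainableAI

model = Chain(
    Chain(Conv((3, 3), 3 => 8), BatchNorm(8, relu)),  # nested Chain with BatchNorm
    Flux.flatten,
    Dense(8 * 26 * 26, 10),
)
canonized = canonize(model)  # flat Chain; BatchNorm fused into the preceding Conv
```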

Input preprocessing

ExplainableAI.preprocess_imagenetFunction
preprocess_imagenet(img)

Preprocess an image for use with Metalhead.jl's ImageNet models using PyTorch weights. Uses PyTorch's normalization constants.

source
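A sketch; the image file name is hypothetical:

```julia
using Images
using ExplainableAI

img = load("example.jpg")          # hypothetical 224×224 RGB image
input = preprocess_imagenet(img)   # normalized array for Metalhead's PyTorch-weight models
```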

Index

+heatmap(input, analyzer, neuron_selection)

Visualize explanation. Assumes Flux's WHCN convention (width, height, color channels, batch size).

See also analyze.

Keyword arguments

Note: keyword arguments can't be used when calling heatmap with an analyzer.

source

Analyzers

ExplainableAI.LRPType
LRP(model, rules)
+LRP(model, composite)

Analyze model by applying Layer-Wise Relevance Propagation. The analyzer can either be created by passing an array of LRP-rules or by passing a composite, see Composite for an example.

Keyword arguments

  • skip_checks::Bool: Skip the checks of whether the model is compatible with LRP and whether it contains an output softmax. Default is false.
  • verbose::Bool: Select whether the model checks should print a summary on failure. Default is true.

References

[1] G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
[2] W. Samek et al., Explaining Deep Neural Networks and Beyond: A Review of Methods and Applications

source
ExplainableAI.GradientType
Gradient(model)

Analyze model by calculating the gradient of a neuron activation with respect to the input.

source
ExplainableAI.InputTimesGradientType
InputTimesGradient(model)

Analyze model by calculating the gradient of a neuron activation with respect to the input. This gradient is then multiplied element-wise with the input.

source
ExplainableAI.SmoothGradFunction
SmoothGrad(analyzer, [n=50, std=0.1, rng=GLOBAL_RNG])
+SmoothGrad(analyzer, [n=50, distribution=Normal(0, σ²=0.01), rng=GLOBAL_RNG])

Analyze model by calculating a smoothed sensitivity map. This is done by averaging sensitivity maps of a Gradient analyzer over random samples in a neighborhood of the input, typically by adding Gaussian noise with mean 0.

References

  • Smilkov et al., SmoothGrad: removing noise by adding noise
source
ExplainableAI.IntegratedGradientsFunction
IntegratedGradients(analyzer, [n=50])
+IntegratedGradients(analyzer, [n=50])

Analyze model by using the Integrated Gradients method.

References

  • Sundararajan et al., Axiomatic Attribution for Deep Networks
source

Input augmentations

SmoothGrad and IntegratedGradients are special cases of the input augmentations NoiseAugmentation and InterpolationAugmentation, which can be applied as a wrapper to any analyzer:

ExplainableAI.NoiseAugmentationType
NoiseAugmentation(analyzer, n, [std=1, rng=GLOBAL_RNG])
+NoiseAugmentation(analyzer, n, distribution, [rng=GLOBAL_RNG])

A wrapper around analyzers that augments the input with n samples of additive noise sampled from distribution. The explanations of these augmented inputs are then averaged to return a single Explanation.

source
ExplainableAI.InterpolationAugmentationType
InterpolationAugmentation(model, [n=50])

A wrapper around analyzers that augments the input with n steps of linear interpolation between the input and a reference input (typically zero(input)). The gradients w.r.t. this augmented input are then averaged and multiplied with the difference between the input and the reference input.

source

Model preparation

ExplainableAI.strip_softmaxFunction
strip_softmax(model)
+strip_softmax(layer)

Remove softmax activation on layer or model if it exists.

source
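A sketch on a model whose final layer is a softmax, as required before applying LRP:

```julia
using Flux
using ExplainableAI

model = Chain(Dense(10, 5, relu), Dense(5, 2), softmax)
model = strip_softmax(model)  # drops the trailing softmax activation
```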
ExplainableAI.canonizeFunction
canonize(model)

Canonize model by flattening it and fusing BatchNorm layers into preceding Dense and Conv layers with linear activation functions.

source
ExplainableAI.flatten_modelFunction
flatten_model(model)

Flatten a Flux Chain containing Chains.

source
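A sketch with a nested Chain (layer sizes are illustrative):

```julia
using Flux
using ExplainableAI

nested = Chain(Chain(Dense(10, 5, relu), Dense(5, 5, relu)), Dense(5, 2))
flat = flatten_model(nested)  # single Chain of three Dense layers, no nesting
```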

Input preprocessing

ExplainableAI.preprocess_imagenetFunction
preprocess_imagenet(img)

Preprocess an image for use with Metalhead.jl's ImageNet models using PyTorch weights. Uses PyTorch's normalization constants.

source

Index

diff --git a/dev/generated/augmentations.ipynb b/dev/generated/augmentations.ipynb
index f9a7bca4..5b0ebb5d 100644
--- a/dev/generated/augmentations.ipynb
+++ b/dev/generated/augmentations.ipynb
[Notebook output diff elided: regenerated cell outputs, each a 28×28 Array{RGB{Float64},2} heatmap and its base64-encoded PNG/HTML preview.]
mE2Q2i4iZITFG5M3Pb5CcMzKdTZGT0xOkChWSS0ac0aCc0aCc0aDczBB3R5qmRgpvTScTpG075Pz8HHlzd4c8f/EC+fN//xn57vYW+eLzz5EuRuT9995Dvv7mG+Tm5ga5/e4WcXdkMpkg+/0eySUjKSbk7OwMqZsaaZoGMTN6Rq/kgjijQTmjQTmjQbmZIXVdI3XdILlkpJlMkM12i6RNolfoffvtt8hiPkcOhwOy3e2Qm5unyJdffonM53Pk2bNnyHvvXSKHwwF5+eoVsllvkP3DHvHa6RV689M5st6skWABaSYNUkpBnNGgnNGgnNGg3MwQC4Z0XYeklJBgATnsD0jXdUjbtshkMkH+9KevkPVmjXz11VdICAG5vLxEnjx5gnz/978jMSXkh3/8A9ltd8ibN2+QFy9fItPpBLm8uES22y3SNA1SckFKKYiZIc5oUM5oUM5oUF5KQbquQ3LOSAgB2W63yHqzRl69fIV0sUNevnyF3K9WyKtXr5C71Qr57De/QQpv/fvvf4/cfneL1E2N7B/2yG63Q2azGfLk+hpxd2Q2m9EzepOmQbouIoVCr9BzRoNyRoNyRoNy3lFyQXLOSOwi0nYtEiwg19fXSNu2yOnJKbLZbOiZIVdXV8hisUDOL86Rr7/5Gtk/7JFHjx4hoaoQrypk8WiBLJdLxGtHttstUtc1UngrhIDEFJGSC+KMBuWMBuWMBuUhBKRQEMuGlFKQKlRIjBE5OT1B9oc94u7I9fU1sl6vkcViQc+MXqF3vjxHVrZCuq5DDtstcnb2CDl7dIbUTc0xi8UCKaUghiEpJiTFRM/oOaNBOaNBOaNBOe/IOSNd1yFt1yKGIRYMadsW2T/skbZtETNDrq6ukFwy8uOPPyLn5+fIfD5HJtMJYhhSKEgIAWnbFpnNZoh7hZRCr4sd0nUdEmNECgUxDHFGg3JGg3JGg/KUEhJjRGKKSNd2iAVDVqsVkmLimJQz8sEHHyDrzZpjDEMmkwmyPF8i6/Ua8cqRs+UZklJCLi8v6RV6bdsim+0GORwOSIoJMTPEzDjGGQ3KGQ3KGQ3Kc8lIKYVeoefuSNM0yMXFBdJ1HXK+PEceXz9GptMpcn9/j/zw/Q/IZ599hvzzx38is9kM2W13yOxkhnRdh1ShQtbrNdIeWuRwOCC7hx1SSkGapkFqr5EQAsc4o0E5o0E5o0G5YUjlFWJmiJkhdVMjXeyQm6c3yHK5RAoFmZ+eInd3d8h8MeeYjz76CFmtVsh0NkVyzvQKve1hi5gZ0rUdElNEUkpI0zRIVVVIVVWIBUNyzogzGpQzGpQzGpRbMCRYQNwdCWaIWUAWiwXitSPrzRpZLpfIdrtDYheRYAGJKSLb3RYxMySlhKSYkFIKklPmmMorxN2RQkHcHamqCikUeoW3Cj1nNChnNChnNCg3M8QwxDCk8K6CBAtI27ZI7CLSdR1SSkEKBUkpIW3bIjlnxCtH3B2pQoVYMKRrO6SUgoQQEHenZ/RKKfQKvVIKUkrhGGc0KGc0KGc0KOcdhUKv0DMzeoWjDocDstvtEHdHmrrhmMlkghQKEiwgMUakUJBggV6hV3mFlFwQC8ZRhf+fwltGzxkNyhkNyhkN6v8AAWotVA/Ky9kAAAAASUVORK5CYII=", + "text/plain": "28×28 Array{RGB{Float64},2} with eltype ColorTypes.RGB{Float64}:\n RGB{Float64}(0.449503,0.447623,0.445497) … RGB{Float64}(0.450951,0.449072,0.446939)\n RGB{Float64}(0.450395,0.448515,0.446385) RGB{Float64}(0.450476,0.448597,0.446466)\n RGB{Float64}(0.454765,0.452892,0.450736) RGB{Float64}(0.449051,0.447169,0.445047)\n 
RGB{Float64}(0.449776,0.447895,0.445768) RGB{Float64}(0.44858,0.446697,0.444577)\n RGB{Float64}(0.441326,0.439433,0.437355) RGB{Float64}(0.450935,0.449056,0.446923)\n RGB{Float64}(0.43825,0.436353,0.434293) … RGB{Float64}(0.452543,0.450666,0.448523)\n RGB{Float64}(0.440395,0.4385,0.436428) RGB{Float64}(0.454369,0.452496,0.450342)\n RGB{Float64}(0.446738,0.444853,0.442743) RGB{Float64}(0.447415,0.445531,0.443418)\n RGB{Float64}(0.447177,0.445293,0.443181) RGB{Float64}(0.442936,0.441045,0.438958)\n RGB{Float64}(0.449207,0.447326,0.445202) RGB{Float64}(0.441824,0.439932,0.437851)\n ⋮ ⋱ \n RGB{Float64}(0.448184,0.446301,0.444183) RGB{Float64}(0.455155,0.453283,0.451125)\n RGB{Float64}(0.4565,0.45463,0.452464) … RGB{Float64}(0.449075,0.447193,0.445071)\n RGB{Float64}(0.452509,0.450633,0.44849) RGB{Float64}(0.445923,0.444037,0.441932)\n RGB{Float64}(0.445348,0.443461,0.44136) RGB{Float64}(0.447088,0.445204,0.443093)\n RGB{Float64}(0.441705,0.439813,0.437733) RGB{Float64}(0.451828,0.449951,0.447812)\n RGB{Float64}(0.44632,0.444435,0.442328) RGB{Float64}(0.45509,0.453218,0.45106)\n RGB{Float64}(0.446412,0.444527,0.442419) … RGB{Float64}(0.452566,0.450689,0.448546)\n RGB{Float64}(0.450806,0.448927,0.446794) RGB{Float64}(0.450252,0.448373,0.446243)\n RGB{Float64}(0.450595,0.448716,0.446584) RGB{Float64}(0.450485,0.448605,0.446474)", + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAHAAAABwCAIAAABJgmMcAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAACh5JREFUeAHtwcuOHOd5x+Hf+9VbVd0zPSeOeAKHsi3Y0SIMYglwbkJ349xZfAWhb8CSFg4kbmTTEM+c6XN1fYes/kUtOjAC1LKex//zj39ECgUxjH+mUJCSC5JzZmAMzAzxyhELhpgZUnJBcskMCv+ccZRhSC6ZYwxDzIxjCoVjnMmonMmonMmonF8wM8QwBsbAMAbGJ4FBoSAhBMQrR0IVOCbGiKSSkJILg8Ig5cQxIQQkhICYGRIsIGaGGMYxpRSklMKgMHAmo3Imo3Imo3IzQywYYhgDY1BKYVAYhBAQM0MqrxDDkJQSknNGUkpIzhnJOSM5ZyTnjIQQkFIKUnJBQhOQYAGxYEjJBUkpIblkpJSCGJ84k1E5k1E5k1F5KQVJMSE5ZwaFgQVDQgiI186gMOi6Djl0B6Q7dEhd10iwgMxmM8RnjvR9j8Q+Iof+gKSUkKZpOMoYVFWF7Ps9EvvIwBiYGVL4xJmMypmMypmMynPOSIwRiSkiwQJS1zVSuyNNXSOHvkf6vkc2mw1y6A8MCoMQArLZbhAzQ2KMiFeO9H2PPHj4AJm1M6RtG+T27g7Z7/ZIzpljLBgSLDAwBs5kVM5kVM5kVF5K4ZgQAhIsIJVXDMyQQ98jfd8jhiHz+Rw5OT1BNusNEkJAcs7IZrdFbm6eIMYndd0gbdsgt7d3yGZjSEoJaWctUkpBQghIThlp2gbZ7/eIMxmVMxmVMxmVYwwqr5A61EiwgNRNzTE5Z46JKSIxRuTQ90jOGem6DikUZHF6ilxeXCCLxRlyOHTIbr9Hlssl0vc9cnJygpgZYsGQUgrS1A2SYkIMQ5zJqJzJqJzJqDxYQKqqQiqvkFIKcnZ2xqAwODmZI8vVCqmqCukPPYPdFqnrE6RtWqQ7dEgVKuTy6gq5urpCtpstsv7735GqqpDNdoOs366Rtm2Rz64/Q2KMSIoJmc1nSC4ZcSajciajciajcoxBKQXJKSNmhgQzpPDJ7e0dEmNEVqsV0u075P6D+4i7IzFG5HRxirg7sl6vEXdHun2HmBly7949pOs6xN2Rvu+RruuQGCNSeYV0+w6pqgpxJqNyJqNyJqNywzgm5YTs93skxsgxJycnyMNHD5EHDx8gt7e3yPt375Gu65DFYoGcnpwiXdchi8UC6boO6fse+eKLL5DXb14jy+USadsWWa/XyGw245iu65D5fM4xzmRUzmRUzmRUbsEQrxzJJSOnJ6fIZrNBZrMZknJCPrz/gCyXS+Tj7UekP/TIfr9H6rpG3n94j1xeXHJM13XIbrtDXr58iRQK8uTmCdI0DdL3PbLdbpHD4YDUTc0xOWfEmYzKmYzKmYzK+YVcMpJzRnLOyOniFDk9PUV2ux1SKMhms0Fe/PgC6Q4H5OnTp8ir16+Q6+trZF2tkdu7W2S1XCG3d3fIZrNBHj96hMxmM+Ts7AxZLVdIO2uRlBLS9z1S1zXHOJNROZNROZNReUoJySkjhYLM2hnH9H2PbDdb5O2bt4jXNXJzc4P8+OIFcu/qCtlsN8jrV6+R2a9nyI8//oicn50jX/zmN8jZ2RmSckJ++uknZLlcIovFAokxIlWoEGsM6boOaZoGcSajciajciaj8lIKknNGCgXZ7XccU9c1cnF5gdzcPEEuLi+RZ8+eId9//z3y5ZdfIg8ePED+9F9/Qn73u98ipyenyK9+/Suk73vk44ePyHqzRs7PzpEPHz8gq9UKcXckpYycLRZI0zZIVVWIMxmVMxmVMxmVmxlSKEgIAamqCvHKkXbWIuv1Grm+d408efIE+f3v/x356quvkPl8hqxWa+TRo0fIH/7jD8ibN2+Rk/kc2Zkhbdsi836OGIacxTP
k48ePSCkFmc1myHw+R0IISM4ZcSajciajciajcsOQUAWkbVoGxqCpG6Tve+Ty8hL5ePsRefP6DfKXv3yLvHjxAvn666+Rvj8g159dI999+x3y+edPkR9++BHx2pGmaZDKKySXjKSUkPPzc6Sua6RpGgbGwIIhOWXEmYzKmYzKmYzKMQZN3SBN0yApJaRtG2S72yIlZz4x5K//81fk/PwM6boO2W42yM3TG+T58z8ji9NT5F++/BK5/uwa6fZ75N2798h6tUL2uz1S1zVSckEWiwWyWW8QC4Y0TYPknBFnMipnMipnMio3M445HA5ISoljdrsdEmNEDocD0jYt8vz5n5G7uzvk+X8/R0IIyP3795HHjx8jf/vpb0iKCXn5j38g280W+fDxA/L27TtkPp8h19fXyG63Q9q2RWKKHBMsIM5kVM5kVM5kVE5hEGNkUAoDM2Sz3SDr1Rp5//49EmNE3rx5g9ze3SHv3r1Dlssl8uzZvzIoDL755hvkhx9+QNq2RXa7HbLdbZH5bI48fPgAqb1GTk9OGRiDuq6RKlYMCoNSCuJMRuVMRuVMRuWFwqDwSWGQYkS6Q8cxT548QfbdHjk5PUE2my0SQkAePXyInJ9fIFeXl8i3332LdN0BOT87Q0JVIe4Vcn5+jlxdXSGz2QxZrVZIZRViZoiZISklBsbAmYzKmYzKmYzKQwiIYUhMEcklI8ECElNEzAyJMSJ1XSMP7t9HVsslsjhbIKUUpFCQq8sr5G55h/QxIt16jVxcXiKXF5fIfDZHcs7IyfwEKaUgpRSOySUjVhg4k1E5k1E5k1F5KQWJKSLdvkP6vkeqqkLcHeljj3RdhxwOB465f/8+knNGXr16hVxcXiKLxQJpZy1iGMdUXiGH/oDM8gwxY+DuyH6/R1JOSE4ZKaXwiSHOZFTOZFTOZFReckFyykjKCcklI5YNubu7Q1JKSEoZMWPw+PFjZL3eIKUUxMyQtm2Rq6srZLlaIu6OXF1eITln5OrqCokpIv2hRzbbDXLoDkgpBSkUxMwYGANnMipnMipnMiovpSCFgoQQkLqukaZukOt710gfe+Ti4gJ59OgRMpvNkLu7O+Tly5fIs397hvz888/I/GSObHdb5GR+gvSxR0IIyGq9QrabLRJjRPbdHjEMqesaCSEgVaiQQkGcyaicyaicyajczBCvHAkWGBgDd0cO/QG5eXqDXJxf8ElBThcL5Pb2FjlbnHHM559/jizvlsh8PkdSTkjOCem6A5JzRvrYIzlnJKWEzNoZYmZIqAJiwRgUBs5kVM5kVM5kVM4vhBCQUAXEMMSCIYvFAnF3ZL1ZI1eXl8h2s0ViH5FQBSSlhGw3W8SCIX3fIykmjkkpIWaGNHWDlFIYGIO6rpEqVIhhHFNKQZzJqJzJqJzJqNzMGBgDM0MM4xgzQw6HAxL7iPSHHsklI6UUJMbIMblkxHHE3ZGqqhAzQ2IfkVIKEqqAhBAQM+OoUpBCQUoqHONMRuVMRuVMRuX8H0opiJkhxieFT7quQ7bbLeKVI3VTc0zbtgwKAzNDYoxIoSDBAgNjUFUVUkpBLBhiZohhSC6ZQeH/xZmMypmMypmM6n8BLJoZsXwt49cAAAAASUVORK5CYII=", "text/html": [ - "" + "" ] }, "metadata": {}, @@ -208,10 +208,10 @@ { "output_type": "execute_result", "data": { - "text/plain": "28×28 Array{RGB{Float64},2} with eltype ColorTypes.RGB{Float64}:\n RGB{Float64}(0.351809,0.34983,0.34826) … RGB{Float64}(0.354843,0.352866,0.351279)\n RGB{Float64}(0.35032,0.348341,0.346779) RGB{Float64}(0.353631,0.351653,0.350073)\n 
RGB{Float64}(0.351409,0.34943,0.347862) RGB{Float64}(0.354959,0.352981,0.351393)\n RGB{Float64}(0.349521,0.347542,0.345984) RGB{Float64}(0.349715,0.347735,0.346176)\n RGB{Float64}(0.349647,0.347668,0.346109) RGB{Float64}(0.359971,0.357996,0.35638)\n RGB{Float64}(0.346748,0.344767,0.343225) … RGB{Float64}(0.365355,0.363383,0.361737)\n RGB{Float64}(0.343255,0.341274,0.339751) RGB{Float64}(0.366811,0.364841,0.363186)\n RGB{Float64}(0.347284,0.345304,0.343759) RGB{Float64}(0.345238,0.343257,0.341723)\n RGB{Float64}(0.342378,0.340397,0.338878) RGB{Float64}(0.336259,0.334276,0.332791)\n RGB{Float64}(0.341335,0.339353,0.33784) RGB{Float64}(0.342739,0.340757,0.339237)\n ⋮ ⋱ \n RGB{Float64}(0.364512,0.36254,0.360899) RGB{Float64}(0.347847,0.345867,0.344319)\n RGB{Float64}(0.35998,0.358005,0.356389) … RGB{Float64}(0.344506,0.342525,0.340995)\n RGB{Float64}(0.354208,0.352231,0.350647) RGB{Float64}(0.342904,0.340923,0.339401)\n RGB{Float64}(0.353607,0.351629,0.350049) RGB{Float64}(0.35015,0.348171,0.34661)\n RGB{Float64}(0.349899,0.34792,0.34636) RGB{Float64}(0.348958,0.346979,0.345424)\n RGB{Float64}(0.345451,0.343471,0.341935) RGB{Float64}(0.347499,0.345519,0.343972)\n RGB{Float64}(0.34938,0.347401,0.345844) … RGB{Float64}(0.348774,0.346794,0.34524)\n RGB{Float64}(0.349957,0.347978,0.346418) RGB{Float64}(0.349328,0.347349,0.345792)\n RGB{Float64}(0.351888,0.349909,0.348338) RGB{Float64}(0.350979,0.349,0.347434)", - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAHAAAABwCAIAAABJgmMcAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAAClFJREFUeAHtwUuPHNd5x+HfOeetqp6eG0eaoYZjM44lUSAQxwIEZe8s42/gfFd9BAsQAkeRYEYRKc7QCbvn2pe6nTerfw0XDXhTy3oe+/c//YldHGeXEAKSc0Y8O9LnHnF3JMXEToFBiolBYODZkezOwB1JlhAzQ2KISAgByZ4Rd0c8O7s4zsAZOI+MyaiMyaiMyajMcXYJISCBgIQQGAQGdVcjXdchTdsgKRkSQ0BCDIiZIYGAOI7kPiMpJST0gYExSCkhKSUke2bgDNq2Rfrc8/c5YkxGZUxGZUxGZYGAuDsSQmAQAoMQkJwz4jiDwCNnEHhkhSFVWSHZM+LuSNO0SNe1yHqzRswKpKhr5ODgADn56ASJMbJL27TIar1i4Az63DPIDIzJqIzJqIzJqMxxxN2Rru+QnDNSWIGYGeJ8qEHatkWapkFmPkNCCIhnR0IIiHtGurZDtpstg1Aje7MZsjffQ+q6RpqmQWKISNM0SNM2SIwJCXwgMDAmozImozImozI+0OeM9F3HLnt7c+Rgfx8pygLZbrfI4cEh4u4MQkBiCEiMEVmtV0jdNEjf90h2R6qyRLq+QxaLJbJcLJH1Zo3Mqhni7khVVchsb4aUZYmkmBBjMipjMipjMipzZxACA+dRShHpuhZpmoZBYODuiLsjh4eHSEoJyTkjbdsil28vkb7vkZwzcnx8jJgZsq23yHq9RvquQ2KMSN/3yGxvDynKEqnKCimKAsmeEWMyKmMyKmMyKsMdSSkhZoYEApI9I+vNGrl/uGcXS4ZcXV0hV1dXyHq9Ru7u75EYAnJ0dIRc/OoCuXh2wSAw6LoOuby8Qu7v7pC6qZG2a9nF3ZGqKpEiFEjwgBiTURmTURmTUVkIAamqCqmqCokhIjlnpM89st1skdlshtze3SE//vgj8u7dO2R/fx958cUXyOeffYYcHR0hz//hOXJ8fIycnZ4iP/30P8hqtUK+++475PLyElmv18hqvUK22y0SQ2DgDEIMiDEZlTEZlTEZlWXPSNu2SNd2SNu1yMH+ARJCQMqqRN6+fYu8XyyQ5fU18vLlS+S3n/4W+d0//Q45OztDbm5ukIODA2Q+nyMhRuTs6RmSFgl58eJzpO1apN7WyGa7QdqmRdwdadsWMTPEmIzKmIzKmIzK+EDOGen7HunaDrnLd8jHH3+MzGYz5MmTJ0iMEQkE5OknT5Hc98jZ0zPk3dUVslgskcVigSyXS2T/YB+JISIfffwRcnt3h7x8+RJZLBbI1eUVg30G2TNSFAWDEBBjMipjMipjMiqLISKeHVmv1sjN7S0yqyrEsyMXv7pAnjx5ghRFgWw2G+Tnn39Gbm9ukez/gdxc3yDr9Rq5f3hA+r5DCiuQh9UK+Zevv0Z+/fzXyHw+Rw7295FAQNq2RVarFbt0fY8Yk1EZk1EZk1FZsoRYMsQKQ2KKSIoJ+ejjj5CqrJCmaZBXr14hl28vkeVyiRw/OUa6rkfu7++RN2/eIH3fI13bInXTIF3XIXuzGWKFIU3TIClGZDabIZvNBkkpIe6OJHfEmIzKmIzKmIzKAoFdUkzI8dExUlYlcnd3h1y+vUQ22y1yc3ODbLdb5Pvvv0c++/RT5PrmBnn+/Dny9ddfI588fYp89dVXyHK5RJq2Qfq+R5aLJXJzc4O4O3J/d49kz0hZlIgVhoQYEGMyKmMyKmMyKss5IzFFJFlC5uUcqesasWTIxcUFUpQFEgjIl19+iSwWCyTEgDzcPyA//PADcnp6ipw/O0f+8K9/QN6/f49cvn2L/NcPPyDz+Rx5v3iP3N7eIquHFVIUBXJ4eIgUZYGUZYUYk1EZk1EZk1FZjJFdPDvS5x5JlpC
5zdnlxecvkBdfvED++Md/Q969e4d8/5/fI3/+87dIVVXIdrtFurZDXr16hTw8PCCbzQbpuo5BYPBw/4BcL6+Rbb1FYoiIFYZ0XYekmBBjMipjMipjMipLKSHJErK3t4fknBk4g81mgziOPKwekOPjY+Svf32FvHn9Gtlst0jd1EiMEfnvn35CVusV0rYtcnp6iiyWCyT3GdmsN0h2R8wMmad9/p4QI7sYk1EZk1EZk1FZn3sGPYOu7ZDsGam3NfL28i1SVRVydnqGfPPNN8jv//n3SFmVyJs3b5Db21vk9vYW2dubIVVZIavVCrm+vkayZ+Tdu3fI9fIa6foOKYsSKS0hhRXI0eERkt0Rd0eMyaiMyaiMyajM3REzQ0IISNd0yO3tLXJzc4sURYH85S9/QU7PzpD7u3tku90i3377LXJ5dYWsVyvk4uIC+dvf/hc5Pz9HHh7ueRSQw8MD5PzZM8Q9I33XI/P9fSSGgMQYkbquGQQGxmRUxmRUxmRUxgcCAXF3pOt7pCgKpKoq5O72Ftms18jr16+Rw8ND5PXr18i2rtlls9kgddMgi8UCubm5QZqmRs4/OUdmVYVc10vk5OQEefr0Kbv0fY/0uWcXdwbGZFTGZFTGZFQWCEjbtUhMEQmBQd3UyHa7RWJKSNO2iFlCHh4ekK7vkadPz5D3//ceuXj2DDnY30d++eUXpKoqxLMjziPHkfPzc+TZs2dIUzeI40gIAVmtVkhRFkjXdogxGZUxGZUxGZW5O9L3PbJZb5DtdovUdY2kGJGTs1OkqmZIXW+RvuuR/f19ZD6fI8dHx8jR0RHS9z3yj7/5DRJiRLquQ/quQw4Pj5CUEhJDRFJKSN3USM4Z6doO8eyIuyPGZFTGZFTGZFSWPTPoGeTsSNM2SNf1SIwR6doOqetbpKkbxMyQEAJSVhXy5OQEefLkGIkxIkdHR0jTNsj9/T1ydnaGHB4eIrPZDKlmFeLuSFmVyN3dHRJCQPq+R5xHxmRUxmRUxmRUxgfcHcm5Rzw7EgKPAoP1eo20bYtsNhvk5OQEiTEiZgnp+w7ZbrfI+SfnyP58n11SSkj2jCyXS2Q+nyNt0yLuGfHMI2dgZkhRFEjbtYgxGZUxGZUxGZW5O+LuPApICAGZzWZI7jOy2W6Qvb095OTkBDEzJMaIrFZr5PjoCIkxIrPZDHF3pOs7pKkbpG5qpCxKZLVaIcvlEimLEpnNZoi7MwgM3J2BMzAmozImozImozJ3R9wdyX2PmBlSViViyZDDw0Okms2QFCMSU0Q2mw3S9R2S3RF3R66urpBkCXF3JMWEtG2LbLZbJOeMlEWB5D4jbdci7o64M8ie2cWYjMqYjMqYjMpCCEgIAQkW2CWEgMz2ZoglQ8wMCTEwcAZt0yJ7sz2kLAtks9kgbdMizoccKYsSWa1WyHq9RsqqRFbuSFVWSFlViKXETs5OxmRUxmRUxmRUxgcCAfHg7JI9IyEEJHtGmrZBQghIDJFdirJEUkxICIFBYGAxIe6ONE3DIDCoqgqxwpAYIuI4kvseaXKPxBCREAKDwMCYjMqYjMqYjMoCAXEcCQTEeRR45O4MnIHjiJkhIQRkvj9H2rZFPDsScmAQGIQYkcCjZAkxM6TrOgaBR87A+UBgEEJAQgxIICDOI2MyKmMyKmMyqv8Hx/FgpOnDGV0AAAAASUVORK5CYII=", + "text/plain": "28×28 Array{RGB{Float64},2} with eltype ColorTypes.RGB{Float64}:\n RGB{Float64}(0.373803,0.371836,0.370143) … RGB{Float64}(0.377324,0.375361,0.373647)\n RGB{Float64}(0.375367,0.373401,0.371699) RGB{Float64}(0.377171,0.375207,0.373495)\n RGB{Float64}(0.377197,0.375233,0.37352) 
RGB{Float64}(0.375973,0.374008,0.372303)\n RGB{Float64}(0.369969,0.368,0.366328) RGB{Float64}(0.373067,0.3711,0.369411)\n RGB{Float64}(0.370823,0.368855,0.367178) RGB{Float64}(0.382107,0.380147,0.378407)\n RGB{Float64}(0.374222,0.372256,0.37056) … RGB{Float64}(0.390822,0.388869,0.38708)\n RGB{Float64}(0.375885,0.373921,0.372215) RGB{Float64}(0.382941,0.380982,0.379237)\n RGB{Float64}(0.369851,0.367882,0.36621) RGB{Float64}(0.371247,0.369279,0.3676)\n RGB{Float64}(0.366418,0.364447,0.362795) RGB{Float64}(0.354395,0.352417,0.350833)\n RGB{Float64}(0.371987,0.37002,0.368336) RGB{Float64}(0.363721,0.361749,0.360112)\n ⋮ ⋱ \n RGB{Float64}(0.378522,0.376559,0.374839) RGB{Float64}(0.366744,0.364773,0.36312)\n RGB{Float64}(0.379367,0.377405,0.37568) … RGB{Float64}(0.363909,0.361937,0.360299)\n RGB{Float64}(0.376655,0.374691,0.372981) RGB{Float64}(0.37372,0.371754,0.370061)\n RGB{Float64}(0.373796,0.37183,0.370137) RGB{Float64}(0.37974,0.377778,0.376051)\n RGB{Float64}(0.37461,0.372645,0.370947) RGB{Float64}(0.375038,0.373072,0.371372)\n RGB{Float64}(0.371529,0.369561,0.36788) RGB{Float64}(0.369221,0.367252,0.365584)\n RGB{Float64}(0.374621,0.372655,0.370957) … RGB{Float64}(0.372137,0.370169,0.368485)\n RGB{Float64}(0.374246,0.37228,0.370584) RGB{Float64}(0.373038,0.371071,0.369382)\n RGB{Float64}(0.375336,0.37337,0.371668) RGB{Float64}(0.373808,0.371842,0.370148)", + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAHAAAABwCAIAAABJgmMcAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAACoZJREFUeAHtwc1zHdWdx+HPOf3r27ovurYkbN5iYBhDIKnKZKoyi8xuWGT2mf3kf82ObCCkiqrgOAM2likkYUn3pft29zm/WX3bLLTsZT+P/elP/8udnIHzWowBSSkhOWUk5YTEEJHCCqQoCiSlhBRFgRSxQFJKiLsjKSWknJVICAGJMSLuzsC5k+MMnDu5O+I4YkxGZUxGZUxGZYGAhBAYRO7k7khOGWnbFkkpMQgMKq+QnDKSPTNwBm1qkZQzEkNAsjsSQkAKK5BAQMpZiaSUec2RnDKSyQycgePcxZiMypiMypiMyggM3B0JISCBwCAwiDEisYgMAgPPjhRWIPOjOZJSQrJnJKWENHWN5JyRnDMyqyqkqmbIcrFE5vM54u4MnEGfeqSua8SzcxfPjhiTURmTURmTUVnOGck5I947EgiIuzMIDEorkbZtkT73SFM3SNu2yPxojuz3e6RpGiTGiBzaFgm81qceCbw2ny8Qd0dSSkjXdkhKCenaDnEcscIQxxFjMipjMipjMioLBCSGiBy6FkmpR6pZhayP18hyuURyzkjbtsjhcED6vkfcHVmuVshyuUQO7QHJOSOHwwHp2g5p2w7ZbLfI7c0xst/vkbIsEccRKwxZrpZICAGJISLGZFTGZFTGZFRG4DVnEAKDGCLiONKnHinLEqmbGmnbFtlsNshuv0P2+xpZLZfIvt4jh0OLHFUV0vU9g8CgiJGBO1LXNZJzRuq6RlarFVKYIZ4dCTEgkYgYk1EZk1EZk1FZICChCMi8mCMxRqRPiYEzuL6+RhxHtrst8uz5c+Tly5fIdrtFjo+PkRgC8vY77yAPHz5ETs9OkaqqkBcvzpGrq0vk+voaCTEiZoZkz0jMGbGyREIISE4ZMSajMiajMiajsuwZKa1EisIYuCPujqSckNQmZDabIYfDAdlut8jV5SWyXK2Qf/vNb5B3330XefzRY+TRL36B3Lt/HzEz5Ouvv0Zub2+Rr/76FbLb75DNZos0dYNkzwwCg/l8gRQxIsZkVMZkVMZkVObOwLMjXe6Q7WaDxBiREAIyq2aIuyNVVSG//Phj5NNPPkHu3buH/PrXv0KO12ukKApks90iJ6enSBEjslwsEHdHPvzwQ+Ty6hI5OnqF5OxIXe+RlBKSU0JyTogxGZUxGZUxGZXFEJCcM9J2LbLb7RAzQ9548Aby/vvvI5988gmyXq+Rvu+Rl+fnyG6/R1bHx8i3//ct0nYtkvqEfPnll8jpySkSi4jMyhmSUkLefPNNZLFYIOfn58hisUTcHSnLEsk5I8ZkVMZkVMZkVBZjRNwd6boO6foe2e62SFEUyHq9Rl6ev0RevXqF1HWN3NzcIK9+eoXUTY1cXl4hTV0j+7pGck7IbFYhFxcXyH/87nfIx7/8GFmv18hbb76JlGWJ7Pd7pK5rBs6g6zvEmIzKmIzKmIzK3B0JISDzozmSjhNyfLxC1us1knNGvnnyDbLZbJC+6xHHkZwzcnFxgXRthzx7/hy5uLhAihiRfV0jMUbk7OwUmc1myNvvvI2kvkeOjo6Q7XaLeHbEce5iTEZlTEZlTEZlhIAUViBVVSEhBsSzI3XdIFdXPyH1fo80TYN0XYd8/pe/IP/ywQfIYrlAFvMF8qtPP0Xe/uwz5Pf/+Xsk54zc3Nwg8/kc+e7b75AfXv6A7PZ7pKlrpOs6pDBDZrMSKQpDjMmojMmojMmoLIaAWGFIdVQh88UcySkjdV0jpycnyHwxR2blDPnoo4+QP/7PH5EYI3Jzc4P88MMPyHq9Rh7/62Pkv//wB+TQHpDz85fI559/jtRNjfzjyT8Qx5Gbm1ukNEPefudtXiuRooiIMRmVMRm
VMRmVFVYgVhpSVRUSQ0T29R7JOSNd1yG//fffIo8fP0Y+++y/kKdP/4l88/e/I0+fPkVub28Rz45sd1vkq7/9DSliRC6vrpC+65GqqpBYROTl+Uuk73tkuVohbdshnh0pYoEYk1EZk1EZk1FZDBEJBGRWzpDdboe0h5ZBYLBYLpDN7S3S9z1yeXmFfPHFF0jf90jTHJDb21vkxYsXyOXVJbLb7ZAPP/wQ+efTp8h2u0PaQ4t0XY+s12uk7TpktVohITCojirE3RFjMipjMipjMipLKSF96pHtdos0TYP0fY9cX18jbdcii/kCefXnPyO3N7fI/fv3kBfn54jjSOoTslqtkNVqhTRNgzx58gTZ7/fI5eUl8tNPr5Db2xtksVggD+4/QBbzOTKfz5E+9dzFmIzKmIzKmIzK3B3JOSNd3yGOI03TIOfn50hd18h+t0fOzs6Qvu+Ri4sL5Pmz58iPP/6INE2DnJ6eIpeXV8gH77+PHNoD0jQNcjgckOPjNfLee+8hDx88ROaLOdIeWqTve+TQHpCcM2JMRmVMRmVMRmU5Z8TdESuMgTNwd2RWVUjTNMir62vk2bNnyPcvXiAvvv8e2e33yOnJCdL1HVIUBfLkm2+Q7779FrGyRKqqQt566y2kaWrkwRsPkBAC4u5I27VIICCeHXF3xJiMypiMypiMyvgZd0e6vkN22x1ye3uLXF9fI01dI13XIcvVCtltt8hms0FOz86QqqqQk5MTpCxLZLffI4e2RZqmQT744APkvUePkKP5EbJarhB3R0IIyKycIXVdIyEEBpmBMRmVMRmVMRmV8TM5ZaSjQ7quQ1LOyGq1QlarFVKaIW3XIk1zQH7x6BGyWi2RQEBOTk94LSDvvvMOkj0jKWUkxoj89OoVchZPkcV8jhRmDJyBuyNFUSBt1zFwR4zJqIzJqIzJqMxxpE894p0jXd8j1WyGVNUMKa1E2rZFttstYlYiRVEgNze3yHuPHiH3799HTk5OkPX6GFksFsj5i3PE3RErCiSEgNy/f4KknJDD4YAczY+QnDNSFAXinhFjMipjMipjMipzd8TdkewZCSEgMUakLGeIe0ZSTkhRGLJcLpCu7ZD79+4hVVUhfd8j6/UaOTs7Q6pZhTx88BC5uLhA+tQj1axCNpsNUs5KxMyQw+GAuDtiRYGkxMCYjMqYjMqYjMoCAXEc8eyIFQUSQ0BWyyXS9z2yWq2Qs9MzZD6fI7v9HtnttoiZIVVVIbNyhjSHBtlsNgycwXw+R+qmRtq2RQorkLZtkcIKxN2RGCPi7oi7I8ZkVMZkVMZkVObuDJxB9owEAncJISDz+Rw5OjpCYoxIWZbcJfU9EmNEQgjId8++Q2blDHF35Pj4GEk5IV3XIzknZLPZIKWVyNH8CEkpISklxHEke0aMyaiMyaiMyagshIA4jhSxYBAYhBiQclYiZoYURYGUZYkEAlJYgZgZEmJA2rZFcs5IHRrErECyZySlhDR1jXRdx12O5kdIzhkprEBSn5CUExIIiDEZlTEZlTEZlTmOBAISQkCcn3MkEBDPjiRPSE4ZCTEg7gwWiwUSQuAus3KGuDt3ORwODJxBygkJISAhBCT1CelCh6ScGDivBV4LDIzJqIzJqIzJqCwQEMeREAMSQ2DgkUFg4DiSUkLMDIlEpJrNkK7vkCIWSJ96pOs7pIgFd4kxIiknJMaIhCIgnp275Oy8lpEQAhI8MAgMjMmojMmojMmo/h+8Jq5qmqO8ZgAAAABJRU5ErkJggg==", "text/html": [ - "" + "" ] }, "metadata": {}, @@ -242,10 +242,10 @@ { "output_type": "execute_result", "data": { - "text/plain": "28×28 Array{RGB{Float64},2} with eltype ColorTypes.RGB{Float64}:\n 
RGB{Float64}(0.99999,0.99999,1.0) … RGB{Float64}(0.999993,0.999993,1.0)\n RGB{Float64}(0.999968,0.999968,1.0) RGB{Float64}(0.999982,0.999982,1.0)\n RGB{Float64}(0.999891,0.999891,1.0) RGB{Float64}(1.0,0.999997,0.999997)\n RGB{Float64}(1.0,0.999942,0.999942) RGB{Float64}(1.0,0.999958,0.999958)\n RGB{Float64}(1.0,0.999935,0.999935) RGB{Float64}(0.999972,0.999972,1.0)\n RGB{Float64}(0.999993,0.999993,1.0) … RGB{Float64}(0.999936,0.999936,1.0)\n RGB{Float64}(1.0,0.999941,0.999941) RGB{Float64}(0.999999,0.999999,1.0)\n RGB{Float64}(1.0,0.999944,0.999944) RGB{Float64}(0.99989,0.99989,1.0)\n RGB{Float64}(0.999789,0.999789,1.0) RGB{Float64}(0.999921,0.999921,1.0)\n RGB{Float64}(0.999807,0.999807,1.0) RGB{Float64}(0.999962,0.999962,1.0)\n ⋮ ⋱ \n RGB{Float64}(1.0,0.999998,0.999998) RGB{Float64}(1.0,0.999876,0.999876)\n RGB{Float64}(0.999946,0.999946,1.0) … RGB{Float64}(1.0,0.999938,0.999938)\n RGB{Float64}(0.999985,0.999985,1.0) RGB{Float64}(1.0,0.999904,0.999904)\n RGB{Float64}(0.999993,0.999993,1.0) RGB{Float64}(1.0,0.999977,0.999977)\n RGB{Float64}(1.0,0.999891,0.999891) RGB{Float64}(1.0,0.999919,0.999919)\n RGB{Float64}(1.0,0.999937,0.999937) RGB{Float64}(1.0,0.999911,0.999911)\n RGB{Float64}(1.0,0.999956,0.999956) … RGB{Float64}(1.0,0.999955,0.999955)\n RGB{Float64}(1.0,0.999957,0.999957) RGB{Float64}(1.0,0.999981,0.999981)\n RGB{Float64}(1.0,0.999998,0.999998) RGB{Float64}(1.0,0.999991,0.999991)", - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAHAAAABwCAIAAABJgmMcAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAABClJREFUeAHtwTto1QcUx/Hvzf2pMdfEqA0+4oM0UWmLDlaihgpFkNpBDFVB6iBS2kGQ0q46SJfgojhlcGlEO4gOgtQHVXSwPqY6a7mNFZFGqTWIGpv8O50TB0EMJ9dYzuejoihIcUQKJVIokUKJFEqkUCKFEimUSKFECiVSKJFCiRRKpFAihRIplEihRAolUiiRQokUSqRQIoUSKZRIoUQKJVIokUKJCWx4mFcqnz+DO34c19aG270b82zqDEx9PeNKpFAihRIplJgInj3DDQ1hytUqbmAA9+gRbsoUnISrVjH1C4cx5y6/h2lsxK1ahSuXGTORQokUSqRQopaePMEMjlQw1Wo9ZvmHwj18iGtrwz19itu2DXfrFq6vD7diBWbx2h2YefNwAwO4OXMYM5FCiRRKpFCihi5cq2Du3MFVq7jWb4Vp/GQdZrJGcO3tvFJLC2bkq68xT5/i3r/8M+7GY8yc+nrcxo24cpk3IVIokUKJFEpEKQrc3buY3x4uwGzfjhsawp04gWtqwk2axEvqeJ0D5z7CfFHBVSq4yurVuOvXcWvW4IqCsRIplEihRAolggy9KGF+/X0BprkZ19ODW7oU19XFGxkZwQ0O4r7/rsBdv46bNAlz6u7HmCVLPsd8MPU5rlxmrEQKJVIokUKJIJOfD2I+/esM7qdfMIsPH8ZULl7E7fgR8++RIxhduoQ7dgxTt28fZvrevbj583FdXbg1azBdC3EXLuBaW6dgmqYwZiKFEimUSKFElMZGXLmMO3oU08eooXXrMP8wqoVRu/bvx5w6cwazScLdvo0rCty9e7j+fkxLeztmy5YKRiKESKFECiVSKDEeNm/GbdiA2bVzJ27ePNyhQ7jVq3GtrZhNnZ2YorcXUzpwADc4iHvwANfQgKtUMCKeSKFECiVSKDHeKhXc8eO80sGDmJGihKmrY1RvL+b+jRuYuadP49rbcXv24BYtolZECiVSKJFCiYmgVMLUlXAjI7i6chkzd9Uq3LRpuJ4e3KxZvA0ihRIplEihxARQFLieHtyePX9iigebMdeWf4Npa8O1zMDV8XaIFEqkUCKFEhPA2bO4/ftx3d0LMJ99ibt/H3fyJG72bN46kUKJFEqkUGIcFAWuVOK1+vtxy5bhhodx3d24tWtxHR1MKCKFEimUSKHEOCiVeK3nz3FXr+KuXHmMWbmyCbN+Pa6jgzErClypRDiRQokUSqRQooaKAnfzJm7nTl7ShOnsxHV0EKJUYlyJFEqkUCKFEjVU+qOK6dTfmPNDKzB9P/RjBhoW8a4RKZRIoUQKJWqprw+3ZQums5NRRROmZQbvHJFCiRRKpFCillpbcTNnYhoaGKXpvMtECiVSKJFCiVrauhX34gVmeJhRk+t4l4kUSqRQIoUStdTczKtM5f9DpFAihRIp1H/JytHcsTSkWgAAAABJRU5ErkJggg==", + "text/plain": "28×28 Array{RGB{Float64},2} with eltype ColorTypes.RGB{Float64}:\n RGB{Float64}(0.999994,0.999994,1.0) … RGB{Float64}(0.999994,0.999994,1.0)\n RGB{Float64}(0.999969,0.999969,1.0) RGB{Float64}(0.999993,0.999993,1.0)\n RGB{Float64}(1.0,0.999974,0.999974) RGB{Float64}(1.0,0.999985,0.999985)\n RGB{Float64}(0.999955,0.999955,1.0) RGB{Float64}(1.0,0.999963,0.999963)\n 
RGB{Float64}(0.999988,0.999988,1.0) RGB{Float64}(0.999999,0.999999,1.0)\n RGB{Float64}(1.0,0.999799,0.999799) … RGB{Float64}(0.99992,0.99992,1.0)\n RGB{Float64}(1.0,0.999927,0.999927) RGB{Float64}(1.0,0.999991,0.999991)\n RGB{Float64}(1.0,0.999969,0.999969) RGB{Float64}(0.999878,0.999878,1.0)\n RGB{Float64}(0.999928,0.999928,1.0) RGB{Float64}(0.999941,0.999941,1.0)\n RGB{Float64}(0.999925,0.999925,1.0) RGB{Float64}(0.999951,0.999951,1.0)\n ⋮ ⋱ \n RGB{Float64}(1.0,0.999905,0.999905) RGB{Float64}(1.0,0.999831,0.999831)\n RGB{Float64}(0.999996,0.999996,1.0) … RGB{Float64}(0.999993,0.999993,1.0)\n RGB{Float64}(0.999992,0.999992,1.0) RGB{Float64}(1.0,0.999932,0.999932)\n RGB{Float64}(0.999875,0.999875,1.0) RGB{Float64}(0.999971,0.999971,1.0)\n RGB{Float64}(1.0,0.999966,0.999966) RGB{Float64}(1.0,0.999959,0.999959)\n RGB{Float64}(1.0,0.999832,0.999832) RGB{Float64}(1.0,0.999906,0.999906)\n RGB{Float64}(1.0,0.999942,0.999942) … RGB{Float64}(1.0,0.99999,0.99999)\n RGB{Float64}(1.0,0.999997,0.999997) RGB{Float64}(1.0,0.999981,0.999981)\n RGB{Float64}(1.0,0.999999,0.999999) RGB{Float64}(1.0,0.99999,0.99999)", + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAHAAAABwCAIAAABJgmMcAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAABCBJREFUeAHtwU9o1gUYwPHv+77PpvvzuhnTDTfR99WamR2isfFOjKQOjXAgDCYeQjqMIAI9CgpCIOoOHRT0NERDEA0vIxXLGQTZDoHHxK2YiOGMzEm69fr+Oj3PPAzE8ex9t3g+H0mShOBHCK6E4EoIroTgSgiuhOBKCK6E4EoIroTgSgiuhOBKCK6E4EoIroTgSgiuhOBKCK6E4EoIroTgSgiuhOBKCK6E4EoIroRF7Plz5pS5+i3m/HnM+vWYvXtRz2pWopYvZ0EJwZUQXAnBlbAYTE9jZmZQmbExzOQk5tEjTFUVRgQzNoZavm4d6uoPq1DZLKarC5PJMG9CcCUEV0JwJZTTkyeof9L1qPHxZagtbwrm4UNMLod59gyzezfm9m3M0BCmowP1+vufolpbMZOTmJYW5k0IroTgSgiuhIWWJKjvf65H3b2LuXMHs29fBpV970NUtZQwGzcyp9WrUaWBz1BPn2LyI8OYm49RLTU1mN5eTCbDqxCCKyG4EoIrwUuSYO7dQ/3yoA3V348pFjEXL2JWrMBUVfGCNC/z1bUtqJ31mNpaTF2hgLl5E9PdjUkS5ksIroTgSgiuBCfTMynUj7+2oRobMUeOYDZvxnR380pKJczUFGbf3gQzOoqprkZ9M/4OatOmj1Fv1c1gMhnmSwiuhOBKCK4EJ8tmplAfPBjGnPsO9cbQECo7MoL5ZAhVPHsWJSMjmHPnUOlDh1AN+/djcjnM1q2YQgFVaMZcv45pa6tGNVQzb0JwJQRXQnAleMlmMVVVmDNnUKeZ9e/27agpZjUx6/OjR1GXrlxB7RTBjI0xp7VrMc3NqDUbNqB27apDieBCCK6E4EoIroSF0NeH6elBfbFnD6alBZWcOIFKdXVhWltROzs7UaWTJ1HpwUHM48eY+/cxNTWY2lqUpHAnBFdCcCUEV8JCq6vDXLjAXFLHj/NSp06h/hgdRa0ZHsbk85iDBzG5HOUiBFdCcCUEV8IiViph0uk0ak1nJyabxRw7hmlqohKE4EoIroTgSqiQJGFOhw9jDhz4HZX82Yf66e0BVD6PWfUaJk1lCMGVEFwJwZVQIakU5vJlzOAgprd3Peqj3ZiJCcylS5jmZipOCK6E4EoIroQKSRLMxASmvR1TLGJ27MBs24Zpb2dREYIrIbgSgiuhQqanMTduYEZH/0Z1dDSgenow+TyLlhBcCcGVEFwJFXLrFmZgAJPJNKAKBUw+lzArxWIlBFdCcCUEV0I5jY+juuQv1LXiu6ivv/wNNVmfw6RSLAVCcCUEV0JwJZTT6dOYvj5URwcvaEStWsmSIwRXQnAlBFdCObW2YpqaUHV1zJIGljIhuBKCKyG4Esqpvx9TLKKKRUx1dZqlTAiuhOBKCK6EcmpsZC61/H8IwZUQXAnB1X8zzMhuxSOOgQAAAABJRU5ErkJggg==", "text/html": [ - "" + "" ] }, "metadata": {}, diff --git a/dev/generated/augmentations/index.html b/dev/generated/augmentations/index.html index 303c60dc..44b89fba 100644 --- a/dev/generated/augmentations/index.html +++ b/dev/generated/augmentations/index.html @@ -20,16 +20,16 @@ convert2image(MNIST, x)

Noise augmentation

The NoiseAugmentation wrapper computes explanations averaged over noisy inputs. Let's demonstrate this on the Gradient analyzer. First, we compute the heatmap of an explanation without augmentation:

analyzer = Gradient(model)
 heatmap(input, analyzer)

Now we wrap the analyzer in a NoiseAugmentation with 50 samples of noise. By default, the noise is sampled from a Gaussian distribution with mean 0 and standard deviation 1.

analyzer = NoiseAugmentation(Gradient(model), 50)
-heatmap(input, analyzer)

Note that a higher sample size is desired, as it will lead to a smoother heatmap. However, this comes at the cost of a longer computation time.

We can also set the standard deviation of the Gaussian distribution:

analyzer = NoiseAugmentation(Gradient(model), 50, 0.1)
-heatmap(input, analyzer)

When used with a Gradient analyzer, this is equivalent to SmoothGrad:

analyzer = SmoothGrad(model, 50)
-heatmap(input, analyzer)

We can also use any distribution from Distributions.jl, for example Poisson noise with rate $\lambda=0.5$:

using Distributions
+heatmap(input, analyzer)

Note that a higher sample size is desired, as it will lead to a smoother heatmap. However, this comes at the cost of a longer computation time.

We can also set the standard deviation of the Gaussian distribution:

analyzer = NoiseAugmentation(Gradient(model), 50, 0.1)
+heatmap(input, analyzer)

When used with a Gradient analyzer, this is equivalent to SmoothGrad:

analyzer = SmoothGrad(model, 50)
+heatmap(input, analyzer)
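The averaging that NoiseAugmentation and SmoothGrad perform can be sketched in a few lines of plain Julia. This is a toy illustration, not the package's implementation: `explain` stands in for an arbitrary analyzer (here the gradient of the made-up function f(x) = sum(x .^ 2), which is 2x), and `smoothed_explanation` and `σ` are names invented for this sketch.

```julia
using Statistics: mean

# Toy stand-in for an analyzer: the gradient of f(x) = sum(x .^ 2) is 2x.
explain(x) = 2 .* x

# SmoothGrad idea: average the explanation over n noisy copies of the
# input, with zero-mean Gaussian noise of standard deviation σ.
function smoothed_explanation(x, n; σ=1.0)
    return mean(explain(x .+ σ .* randn(size(x)...)) for _ in 1:n)
end

x = ones(4)
e = smoothed_explanation(x, 10_000)
# Because the noise has mean zero, the average converges to the
# noise-free explanation 2x as n grows.
```

This also shows why a larger sample size yields smoother heatmaps: the zero-mean noise averages out, at the cost of evaluating the analyzer n times.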

We can also use any distribution from Distributions.jl, for example Poisson noise with rate $\lambda=0.5$:

using Distributions
 
 analyzer = NoiseAugmentation(Gradient(model), 50, Poisson(0.5))
-heatmap(input, analyzer)

It is also possible to define your own distributions or mixture distributions.

NoiseAugmentation can be combined with any analyzer type, for example LRP:

analyzer = NoiseAugmentation(LRP(model), 50)
-heatmap(input, analyzer)

Integration augmentation

The InterpolationAugmentation wrapper computes explanations averaged over n steps of linear interpolation between the input and a reference input, which is set to zero(input) by default:

analyzer = InterpolationAugmentation(Gradient(model), 50)
+heatmap(input, analyzer)

It is also possible to define your own distributions or mixture distributions.

NoiseAugmentation can be combined with any analyzer type, for example LRP:

analyzer = NoiseAugmentation(LRP(model), 50)
+heatmap(input, analyzer)

Integration augmentation

The InterpolationAugmentation wrapper computes explanations averaged over n steps of linear interpolation between the input and a reference input, which is set to zero(input) by default:

analyzer = InterpolationAugmentation(Gradient(model), 50)
 heatmap(input, analyzer)

When used with a Gradient analyzer, this is equivalent to IntegratedGradients:

analyzer = IntegratedGradients(model, 50)
 heatmap(input, analyzer)
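The interpolation average behind this equivalence can be sketched directly. Again this is a hedged toy example, not ExplainableAI.jl's code: `grad` is the gradient of an assumed function f(x) = sum(x .^ 2), and `integrated_gradients` is a name local to this sketch.

```julia
# Toy gradient of f(x) = sum(x .^ 2).
grad(x) = 2 .* x

# Integrated Gradients idea: average gradients along the straight line
# from the reference x0 to x, then scale elementwise by (x - x0).
function integrated_gradients(x, x0, n)
    avg_grad = sum(grad(x0 .+ (k / n) .* (x .- x0)) for k in 1:n) ./ n
    return (x .- x0) .* avg_grad
end

x  = [1.0, 2.0, 3.0]
x0 = zeros(3)
attr = integrated_gradients(x, x0, 1000)
# Completeness check: the attributions sum to ≈ f(x) - f(x0) = 14.
```

Summing the attributions approximately recovers the change in model output between reference and input, the completeness property that motivates integrated gradients.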

To select a different reference input, pass it to the analyze or heatmap function using the keyword argument input_ref. Note that this is an arbitrary example for the sake of demonstration.

matrix_of_ones = ones(Float32, size(input))
 
 analyzer = InterpolationAugmentation(Gradient(model), 50)
 heatmap(input, analyzer; input_ref=matrix_of_ones)

Once again, InterpolationAugmentation can be combined with any analyzer type, for example LRP:

analyzer = InterpolationAugmentation(LRP(model), 50)
-heatmap(input, analyzer)

This page was generated using Literate.jl.

+heatmap(input, analyzer)

This page was generated using Literate.jl.

diff --git a/dev/generated/example.ipynb b/dev/generated/example.ipynb index 0892de0e..d9af649c 100644 --- a/dev/generated/example.ipynb +++ b/dev/generated/example.ipynb @@ -380,7 +380,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "20-element Vector{Matrix{ColorTypes.RGB{Float64}}}: … (verbatim dump of 20 near-white RGB heatmap matrices elided)", + "text/plain": "20-element Vector{Matrix{ColorTypes.RGB{Float64}}}: … (verbatim dump of 20 near-white RGB heatmap matrices elided)", "text/html": [ "
(a vector displayed as a row to save space)
" ] diff --git a/dev/generated/example/index.html b/dev/generated/example/index.html index 89e1ef13..ecbfc07a 100644 --- a/dev/generated/example/index.html +++ b/dev/generated/example/index.html @@ -44,4 +44,4 @@ heatmap(expl)

This heatmap shows us that the "upper loop" of the hand-drawn 9 has negative relevance with respect to the output neuron corresponding to digit 4!

Note

The output neuron can also be specified when calling heatmap:

heatmap(input, analyzer, 5)
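The same index can also be passed to analyze itself. A minimal, hedged sketch (assuming, as elsewhere in these docs, that `model` is a Flux classifier and `input` a preprocessed MNIST digit; the `Gradient` analyzer stands in for any analyzer):

```julia
analyzer = Gradient(model)          # assumed analyzer; any analyzer works
expl = analyze(input, analyzer, 5)  # explanation w.r.t. output neuron 5
heatmap(expl)                       # heatmap for that neuron
```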

Analyzing batches

ExplainableAI also supports explanations of input batches:

batchsize = 20
 xs, _ = MNIST(Float32, :test)[1:batchsize]
 batch = reshape(xs, 28, 28, 1, :) # reshape to WHCN format
-expl = analyze(batch, analyzer);
+expl = analyze(batch, analyzer);

This will return a single Explanation expl for the entire batch. Calling heatmap on expl will detect the batch dimension and return a vector of heatmaps.

heatmap(expl)
(a vector displayed as a row to save space)

For more information on heatmapping batches, refer to the heatmapping documentation.


This page was generated using Literate.jl.
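The batch workflow described above can be sketched end to end. A hedged example, assuming ExplainableAI.jl and MLDatasets are installed and that `model` is a Flux classifier trained on MNIST (an assumption; the `Gradient` analyzer stands in for any analyzer):

```julia
using ExplainableAI
using MLDatasets: MNIST

# assumed: `model` is a Flux model trained on MNIST
analyzer = Gradient(model)

batchsize = 20
xs, _ = MNIST(Float32, :test)[1:batchsize]
batch = reshape(xs, 28, 28, 1, :)  # reshape to WHCN format

expl = analyze(batch, analyzer)    # one Explanation for the whole batch
heatmaps = heatmap(expl)           # Vector of per-sample heatmaps
```

Each entry of `heatmaps` can then be displayed or saved individually.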

diff --git a/dev/generated/heatmapping.ipynb b/dev/generated/heatmapping.ipynb index 3715a9ea..e4c872c5 100644 --- a/dev/generated/heatmapping.ipynb +++ b/dev/generated/heatmapping.ipynb @@ -431,7 +431,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "100-element Vector{Matrix{ColorTypes.RGB{Float64}}}: … (verbatim dump of 100 near-white RGB heatmap matrices elided)", + "text/plain": "100-element Vector{Matrix{ColorTypes.RGB{Float64}}}: … (verbatim dump elided; section truncated)
RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); … ; RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0)]\n [RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); … ; RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0)]\n [RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); … ; RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0)]\n [RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); … ; RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0)]\n [RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); … ; RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) 
RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0)]\n [RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); … ; RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0); RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0) … RGB{Float64}(1.0,1.0,1.0) RGB{Float64}(1.0,1.0,1.0)]", "text/html": [ "
(a vector displayed as a row to save space)
" ] diff --git a/dev/generated/heatmapping/index.html b/dev/generated/heatmapping/index.html index 7500743a..47cc410d 100644 --- a/dev/generated/heatmapping/index.html +++ b/dev/generated/heatmapping/index.html @@ -27,4 +27,4 @@ batch = reshape(xs, 28, 28, 1, :); # reshape to WHCN format

The heatmap function automatically recognizes that the explanation is batched and returns a Vector of images:

heatmaps = heatmap(batch, analyzer)
(a vector displayed as a row to save space)

Images.jl's mosaic function can be used to display them in a grid:

mosaic(heatmaps; nrow=10)
Output type consistency

To obtain a singleton Vector containing a single heatmap for non-batched inputs, use the heatmap keyword argument unpack_singleton=false.
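As a sketch (assuming `input` and `analyzer` are defined as in the earlier examples), the two call styles compare as follows:

```julia
# Default: a non-batched input yields a single heatmap image.
img = heatmap(input, analyzer)

# With unpack_singleton=false, the same call returns a one-element Vector,
# matching the Vector returned for batched inputs.
imgs = heatmap(input, analyzer; unpack_singleton=false)
# imgs[1] holds the heatmap for `input`
```

This keeps downstream code type-stable when it sometimes receives batched and sometimes non-batched inputs.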

Processing heatmaps

Heatmapping makes use of the Julia-based image processing ecosystem Images.jl.

If you want to process heatmaps further, you may benefit from reading about some fundamental conventions of that ecosystem, which differ from how images are typically represented in OpenCV, MATLAB, ImageJ, or Python.
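For example, a heatmap can be converted into a plain numeric array using these conventions (a sketch using a dummy all-white image in place of a real heatmap; with ExplainableAI.jl, `heatmap(input, analyzer)` returns the same kind of image):

```julia
using Images

# Dummy 28×28 "heatmap": a matrix of RGB pixels, as Images.jl represents images.
h = fill(RGB{Float32}(1, 1, 1), 28, 28)

# Images.jl stores channels first: channelview yields a 3×28×28 numeric view.
A = channelview(h)

# Permute to the height × width × channel layout common in OpenCV/Python.
B = permutedims(A, (2, 3, 1))  # size (28, 28, 3)
```

Note that Julia arrays are column-major and indexed starting at 1, which is another point where these conventions differ from NumPy or OpenCV.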

Saving heatmaps

Since heatmaps are regular Images.jl images, they can be saved as such:

using FileIO
 
img = heatmap(input, analyzer)
save("heatmap.png", img)

This page was generated using Literate.jl.


diff --git a/dev/generated/lrp/basics.ipynb b/dev/generated/lrp/basics.ipynb index efb117ec..7dca358d 100644 --- a/dev/generated/lrp/basics.ipynb +++ b/dev/generated/lrp/basics.ipynb @@ -292,7 +292,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "11-element Vector{Array{Float32}}:\n [0.11555797 0.16567013 … -0.00074588886 -0.016081242; -0.020463338 -0.18181762 … 0.0075527276 -0.0012225641; … ; 0.05978778 0.025775693 … 0.016010145 0.0058790017; 0.027305666 0.034312364 … 0.011098867 0.0031443986;;; 0.17919934 -0.44290525 … -0.0028781097 0.0025866567; -0.17921384 -1.1027005 … 0.00529778 0.014584399; … ; 0.0065630525 0.019256484 … 0.01691892 -0.017772272; -0.006193216 -0.0006287951 … -0.0018861903 0.0041949633;;; -0.23348749 -0.14968054 … -0.017748682 0.005836729; 0.22589077 -0.11296646 … 0.0004931396 0.008539669; … ; 0.0057462584 -0.0026442981 … -0.015646847 -0.0002237301; 0.0056847427 -0.013524553 … 0.01720984 -0.004970015;;;;]\n [-0.037589494 -0.35138637 … -0.0023543204 -0.0031317736; -0.32994968 -0.343098 … -0.0023740756 -0.001340456; … ; 0.001353058 -0.045696773 … 0.024047514 -0.0043757693; 0.009031263 0.00071556837 … -0.012437812 -0.0006313002;;; 0.12511042 0.16571468 … 0.0062297885 -0.0023807054; 0.032985922 0.043249875 … 0.002118163 0.004735875; … ; -0.0 -0.0030185538 … -0.018875849 0.0036812269; -0.0 0.007889737 … 0.017898919 0.009926329;;; -0.05125717 -0.0 … 0.0042070267 0.0; 0.09910777 0.105798446 … -0.0 0.0; … ; -0.010481375 0.009564088 … 0.012331664 0.0; -0.0 -0.0 … 0.0010695708 -4.6312893f-5;;; -0.10725487 -0.032024257 … 0.0032939261 0.0; 0.0 0.0 … -5.4200027f-5 -0.0011091139; … ; -0.0 0.0047965213 … -0.0 -0.0005881093; -0.0010443022 -0.00014615235 … 0.0 -0.0;;; -0.69584686 -1.0485102 … 0.027182447 0.0036318763; 0.19321147 0.038500734 … -0.023141088 0.004430523; … ; -0.00366669 -0.0002217274 … 0.03482189 0.00024875192; 0.0018984725 -0.010518859 … 0.0038209218 0.0;;; -0.025747867 0.30286813 … 0.00014623615 -0.0029056992; 0.06188767 
⋯ (numeric relevance values omitted for brevity) ⋯
0.004792052; 0.00014768173 0.0063210945 … -0.00094137364 0.0039392924; … ; 0.0047478415 0.0077295173 … 0.009154586 0.0056523103; -0.0007516285 0.0034228517 … 0.0008037041 0.0;;; 0.0 0.0 … 0.0 -0.0027430686; 0.0 0.0 … 0.0 0.0027418195; … ; 0.0 0.0 … 0.0 0.0006357164; -0.00057337456 -0.0006643733 … 0.0 -0.006642742;;;;]\n [0.0; 0.0; … ; 0.00063571654; -0.006642742;;]\n [-0.0; 0.0; … ; -0.0; 0.0;;]\n [-0.011225047; 0.0; … ; -0.013638405; 0.0054359557;;]\n [0.0; 0.0; … ; 0.0; 0.0;;]" + "text/plain": "11-element Vector{Array{Float32}}:\n [-0.009472245 0.00584306 … -0.02032024 -0.0015154534; -0.0035235577 -0.0052569476 … -0.011869485 0.01934209; … ; 0.00017586722 -0.00037515975 … -0.0103139775 -0.014537552; 0.00068686006 -0.00034957525 … 0.01744247 -0.09619969;;; -0.0013093968 0.0008409157 … -4.5910558f-5 0.0018075189; -0.00073351956 -0.00015999204 … 0.0003710103 0.009410792; … ; 0.0053614182 -0.004057283 … -0.12897025 -0.09293466; 0.0018994694 -0.00021554832 … -0.043417353 -0.015087117;;; -0.0062978147 -0.008678315 … 0.006531787 -0.0019656986; -0.007543638 -0.0015284255 … -0.0011367024 -0.02094337; … ; 0.01157374 -0.003987183 … -0.005726347 -0.0002367196; 0.0023800305 -0.002054219 … 0.1209437 0.015001486;;;;]\n [0.0035034132 -0.0017993673 … 0.0005170571 -0.0018372213; -0.004546299 0.004019535 … -0.015946586 -0.008511239; … ; -0.0021218208 -0.0031747886 … -0.16199318 0.0049746158; -0.0 0.0018127401 … 0.054253634 0.026597403;;; -0.0 0.0 … -0.0 0.0; -0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … -0.0 0.0; -0.0 0.0 … -0.0 -0.0;;; -0.0018422215 0.0020551211 … 0.0025094042 0.005682765; 0.0009900206 1.3828961f-5 … 0.005348869 -0.0031528976; … ; -0.0013707638 0.007953098 … -0.18782364 -0.03572682; -0.0010078928 -0.00056493195 … 0.006256265 0.0077366326;;; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0019358782 … 0.0 -0.001952286; … ; 0.0 0.0009961642 … 0.0 -0.0034517506; 0.0 0.0006965524 … -0.0 -0.027705263;;; -0.0002061818 -0.0019822945 … -0.0026462881 -0.0010781613; 0.0 -0.0023050606 … 0.0064351046 
⋯ (numeric relevance values omitted for brevity) ⋯
0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0012976757 -0.006136401; … ; 0.0 -0.00012777353 … 0.008834558 0.0048288694; 0.00065701467 0.0044193636 … -0.009854828 0.0037790416;;; 0.001795972 -0.0013664576 … -1.5821686f-5 0.00021147483; 0.0027478538 0.0 … 0.0002789615 0.0010531481; … ; 0.0 0.0 … 0.0 0.0012826986; 0.0 0.0 … 0.0 0.0;;; 0.0033500406 0.0015418594 … -0.0028614996 0.0; -0.0061064786 -0.00029856386 … -0.001297864 0.0019738614; … ; -0.0015491742 0.0017975166 … 0.0025762648 0.0024479611; 0.004684416 -0.0039901133 … -0.009421431 -0.0039553186;;; 0.0004971211 -0.0008847231 … -0.0007221794 -0.00020761049; 0.0009720241 -0.00089283567 … 0.0015392612 0.0; … ; -0.0034194232 0.0010986307 … -0.003867398 0.0; -0.0002773601 -9.582106f-5 … 0.0 0.0;;; 0.0 -7.6053984f-5 … 0.001716335 0.0; -0.00055262644 -1.2947371f-5 … 0.0 0.0; … ; -0.0020563372 0.0 … 0.0 0.0; 0.0012816414 0.0 … 0.0 0.0;;;;]\n [-0.0040190583; -5.910173f-6; … ; 0.0; 0.0;;]\n [0.0; 0.0; … ; -0.0; -0.0;;]\n [0.0; 0.0; … ; -0.03575138; -0.0063902177;;]\n [0.0; 0.0; … ; 0.0; 0.0;;]" }, "metadata": {}, "execution_count": 9 @@ -322,7 +322,7 @@ { "output_type": "execute_result", "data": { - "text/plain": "11-element Vector{Array{Float32}}:\n [0.10909061 0.016985286 … -0.033392604 0.035766583; 0.0047575007 -0.10714021 … -0.012110931 0.0013750129; … ; -0.07479966 0.007444864 … 0.036861412 0.009906101; 0.00805714 -0.022324992 … 0.01221438 0.045526262;;; 0.1833335 -0.31012642 … 0.0042630946 -0.015650481; -0.101551116 -0.8807603 … 0.00069726567 -0.030251766; … ; -0.015744127 -0.03496587 … -0.015696604 -0.014254219; -0.023524927 -0.024451341 … -0.002061537 0.015031568;;; -0.12570041 -0.118935265 … 0.015687658 0.011928017; 0.14294061 -0.07356083 … 0.01841417 0.04019364; … ; 0.025564123 -0.013152767 … -0.009696721 -0.0025443526; -0.003473982 0.010800932 … 0.017462572 0.0029346414;;;;]\n [-0.036506597 -0.25065005 … 0.01619241 0.011507603; -0.22996363 -0.35731292 … -0.004716799 0.0038788142; … ; -0.033761796 0.0313675 … 
⋯ (numeric relevance values omitted for brevity) ⋯
… 0.0 0.0;;; 0.0 0.0 … 1.4059367f-5 0.0; 0.0029872751 -0.0059376317 … 0.0 9.514808f-5; … ; 0.0029633814 -0.009852298 … 0.00012038374 -0.0062369113; 0.0 0.0 … -0.0005627781 -0.0009423485;;; 0.002408507 -0.007942966 … 0.0 0.0; -0.011189836 0.020324802 … -0.004016214 0.0063461987; … ; -0.00035452866 -0.0006969604 … -0.00024563205 0.0068167793; 0.0042496896 -0.00844548 … -0.0020587514 -0.0029978447;;; 0.00039793408 -0.0044802655 … -0.0027069023 -0.004153416; 0.00039026883 0.0 … 0.009906187 -0.006654854; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0026402038 0.0;;; 0.0 0.0 … 0.0011237959 -0.007889131; 0.0 0.0 … 0.0 0.006470887; … ; 0.0 0.015827665 … -0.0074078036 -0.013098773; 0.0 0.0 … 0.0037569758 0.009128958;;; 0.0 -0.0010951807 … -0.009458962 0.010107953; 0.00017005035 -0.006325204 … 0.0037686406 0.0004751422; … ; 0.0011803741 0.0060589933 … 0.00826315 0.018243885; -0.001660798 0.010315852 … 0.002854667 0.0;;; 0.0 0.0 … 0.0 6.654783f-5; 0.0 0.0 … 0.0 0.009028647; … ; 0.0 0.0 … 0.0 0.0004901995; 0.0027190999 0.0037400024 … 0.0 -0.0011963211;;;;]\n [0.0; 0.0; … ; 0.00049019954; -0.0011963211;;]\n [-0.0; 0.0; … ; -0.013183771; 0.005037208;;]\n [-0.011272744; 0.0; … ; -0.013183771; 0.005037208;;]\n [0.0; 0.0; … ; 0.0; 0.0;;]" + "text/plain": "11-element Vector{Array{Float32}}:\n [-0.020549035 -0.024313763 … -0.011654909 0.007775537; -0.011973615 -0.017967401 … 0.000742935 0.012078945; … ; -0.0004220839 0.0030927146 … 0.0037613513 -0.021150392; -0.0070049884 0.0001017415 … -0.0021260597 0.0038619272;;; -0.0069432724 0.013849563 … 0.0014212014 0.00088907103; 0.019912457 -0.010001135 … 0.00051687524 0.0129755745; … ; -0.0026160078 0.005191513 … 0.03786127 0.008858009; 0.0024910064 0.0010605063 … 0.0069909114 0.0058669136;;; -0.012390484 -0.019654717 … -0.015254563 -0.0016446747; -0.0018130421 0.01169585 … -0.008221991 -0.014727929; … ; 0.010400448 -0.00939577 … 0.010445775 0.0012857689; 0.0027821732 -0.004455432 … -0.009897709 -0.0065635736;;;;]\n [0.0056483643 -0.005885467 … 
0.00014934714 0.002215689; -0.0025986263 -0.03158706 … -0.018832304 0.004248301; … ; -0.0006835304 -0.014681763 … 0.03903776 -0.00091486506; -0.0 -0.0002450664 … -0.009824717 0.004216825;;; -0.0 -0.0 … 0.0 0.0; 0.0 0.0 … -0.0 0.0; … ; 0.0 0.0 … 0.0 -0.0; 0.0 -0.0 … 0.0 0.0;;; 0.003933321 0.010123121 … -0.006740068 -0.0047339224; -0.0029304447 -0.017438963 … 0.015133806 -0.0064843725; … ; 0.002543267 0.0127403205 … 0.023401817 0.02234707; 0.0011551542 -0.0012508045 … 0.013438275 0.0012100725;;; -0.0 -0.0 … -0.0 -0.0; 0.0 -0.0055190884 … 0.0 6.161599f-5; … ; 0.0 0.0004768268 … 0.0 0.0036763914; 0.0 0.0004337171 … -0.0 0.017669175;;; -0.00073957007 -0.0076534683 … -0.00040722522 0.0018002691; 0.0 0.0003984671 … 0.005264323 -0.005823896; … ; -0.0 -0.005524943 … -0.019220933 0.00015176853; -0.0 0.0027773117 … 0.0 -0.0;;; -0.0031329424 -0.022559829 … -0.012698963 -0.009028534; 0.0 -0.0107013825 … -0.0037738725 -0.021549985; … ; 0.0 0.004839057 … -0.04038414 0.007365438; 0.0 0.0010742135 … 0.007641832 0.003793424;;; -0.0 -0.0005813907 … 0.0 2.3335042f-5; -0.0 0.0 … -0.0 -0.0; … ; -0.0 0.0 … -0.0 0.0; -1.5492476f-5 -0.003457145 … 0.0 0.0;;; 0.004262894 -0.0030284566 … -0.0 0.00011257526; 0.0 0.0 … -0.0 -0.0014767359; … ; -0.0 -0.0 … -0.004403744 -0.0010402704; -0.0007285346 0.0001761883 … 0.0 0.0020781679;;;;]\n [0.0 0.0 … 0.0 0.0; 0.0197936 0.0 … -0.0027767268 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0060742735 0.0 … 0.00761312 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 -0.11016627 … 0.0 -0.040695917; … ; 0.0 -0.019000238 … 0.0 0.0; 0.0 0.0 … -0.04464491 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 -0.020532416 … -0.043715484 0.0; … ; 0.018945182 0.0 … 0.0 0.0; 0.0 0.0 … 0.04611916 0.0;;; 0.0 0.0 … 0.0 0.0041568265; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … -0.00823735 0.0; 0.0 -0.0015937029 … 0.0 0.0;;; 0.0 0.022455115 … 0.0 0.0; 0.0 0.0 … -0.002604124 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0015208044 0.0 … 0.0 0.0;;; 0.0 0.00013276907 … 0.0 0.0; 0.0 0.0 … 0.0 -0.0022144152; … ; 0.0 0.0 … 0.0 0.0; 0.0 -0.0042166375 … 
-0.023264844 0.0;;; 0.0 0.0 … 0.0 -0.014498155; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.002351008; 0.0 -0.0012879117 … 0.0 0.0;;; 0.0 -0.019445531 … 0.0 0.0045063742; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.003665083 … 0.0 0.0; 0.0 0.0 … 0.0 0.00812926;;;;]\n [0.0197936 -0.0020694346 … 0.0015316486 -0.0027767268; -0.03969087 -0.012016704 … 0.007047866 0.0049179927; … ; -0.0009785222 -0.002794486 … 0.08409035 -0.0004582178; 0.0060742735 0.0069163954 … 0.0035728354 0.00761312;;; -0.11016627 -0.056297615 … 0.021033006 -0.040695917; 0.015368152 -0.02104815 … -0.04665993 -0.0045651314; … ; -0.008641394 -0.002191481 … 0.10798392 0.03323338; -0.019000238 -0.021335289 … -0.14771856 -0.04464491;;; -0.020532416 -0.056303173 … -0.002657313 -0.043715484; 0.08817319 0.0405551 … 0.019215923 -0.0095744915; … ; 0.024168054 0.021075746 … 0.22659378 0.038182043; 0.018945182 0.0075271307 … 0.01402809 0.04611916;;; -0.0 0.0 … 0.0 0.0041568265; -0.0 0.005683231 … -0.0 0.013802878; … ; 0.0 0.0 … -0.0 -0.0; -0.0015937029 -0.0 … -0.0 -0.00823735;;; 0.022455115 0.05582742 … 0.0054020504 -0.002604124; 0.0009104868 -0.0 … -0.01639251 -0.0; … ; 0.010900323 0.0 … 0.013854605 0.03485483; 0.0015208044 0.0003638873 … 0.0 -0.0;;; 0.00013276916 -0.024421964 … -0.0105713485 -0.0022144152; 3.595161f-5 0.015429195 … 0.012379932 0.00097141566; … ; -0.0021665737 -0.013680112 … 0.04755238 0.018349802; -0.0042166375 -0.0009970395 … -0.017472446 -0.023264844;;; -0.0 -0.0 … 0.0 -0.014498155; 0.00041601932 0.0 … -0.0 0.015455432; … ; -0.0015068478 0.0008932163 … 0.00013972296 0.007779165; -0.0012879117 0.0 … -0.0 0.002351008;;; -0.019445531 -0.06087045 … 0.0012524069 0.0045063742; 0.001155323 -0.018947532 … 0.037321735 0.0035189395; … ; -0.002068928 0.007519365 … -0.09728572 -0.030160356; 0.003665083 -0.0036395893 … -0.0015151352 0.00812926;;;;]\n [-0.02351312 0.024980392 … -0.0033463375 -0.0; 0.032440867 -0.12812209 … -0.008342913 -0.016858634; … ; 0.011330212 0.044863973 … 0.04262708 0.0022826912; 0.001229889 
0.024635386 … 0.045470513 0.024953863;;; -0.0 0.0 … -0.0 -0.0; -0.0 0.0 … 0.0 -0.0; … ; -0.0 -0.0 … 0.0 0.0; -0.0 0.0 … -0.0 0.0;;; 0.0 0.0 … -0.0 -0.0; -0.0 0.0 … -0.0 -0.0; … ; -0.0 -0.0 … 0.0 -0.0; 0.0 -0.0 … 0.01592344 0.0018683716;;; … ;;; -0.0060688723 0.018241748 … 0.0068790168 0.0017471414; -0.00030270763 0.03738421 … 0.00050688465 0.0077794064; … ; 0.00031178954 0.0 … 0.0055337925 -0.0046372362; 0.0 0.0 … 0.0 -0.0008482904;;; 0.017409055 -0.22810198 … 0.023047466 -0.0045939386; -0.008571354 -0.011798853 … 0.01837061 0.006338169; … ; 0.0022784194 0.01896486 … 0.02798345 0.039397836; 0.00038077493 0.00695092 … -0.0016735652 -0.010993966;;; 0.0009654596 -0.022993833 … -0.005855352 0.0030232845; -0.0092500085 -0.2037581 … -0.060987923 -0.029890358; … ; -0.0037731396 -0.020323317 … 0.04855842 0.088938326; -0.00034794226 -0.008238844 … -0.0022217906 -0.013633252;;;;]\n [-0.0052708024 0.015073608 … -0.0010439358 -0.00043269462; -0.009597907 0.0077708457 … -0.0008377079 -0.0; … ; -0.0011751978 0.0041795643 … 0.0053407983 -0.0; -0.0 -0.0010450414 … -0.02876702 -0.0;;; 0.0023009616 0.0 … 0.0 0.0; 0.0002624837 0.0058222436 … 0.0068032728 0.011719021; … ; 0.009756947 0.003401256 … 0.0 0.0; 0.014002259 0.0041687503 … 0.01721613 0.0;;; 0.0013435994 0.0 … 0.0 0.0; 0.00464327 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0017176595 -0.003035322; … ; -0.0 -0.00047206684 … -0.0033041867 0.009866126; -0.00069290114 -0.0039632763 … -0.028800922 0.13241737;;; -0.005795941 -0.013856801 … -0.0025015934 -0.0001633699; -0.005265389 -0.0 … -0.0036241766 -0.0005151005; … ; -0.0 -0.0 … -0.0 -0.0004364273; -0.0 -0.0 … -0.0 -0.0;;; -0.0046401382 -0.0025978074 … -0.0034400541 -0.0; -0.011177669 -0.012503688 … -0.011698979 -0.0021400712; … ; -0.0077818655 -0.01465179 … 0.087136894 -0.0047095916; -0.0031443997 -0.0045354124 … -0.008348406 -0.0038006855;;; -0.00074557436 -0.058345776 … -0.013401744 -0.0024764459; -0.018231818 0.029640406 … 
-0.010919472 -0.0; … ; -0.014533197 0.01982175 … -0.0020092018 -0.0; -0.00023961799 -0.00024619338 … -0.0 -0.0;;; -0.0 -7.017856f-6 … -0.015345847 -0.0; 0.0013657191 -2.5243367f-6 … -0.0 -0.0; … ; -0.0093560945 -0.0 … -0.0 -0.0; 0.0033149759 -0.0 … -0.0 -0.0;;;;]\n [-0.0012747397 0.0022214828 … -0.0016435009 0.00042234178; -0.0013297409 0.0044092005 … -4.425866f-5 0.0; … ; 0.0016340031 0.0020319629 … 0.003454011 0.0; 0.0 0.0025749651 … -0.0024029817 0.0;;; -4.741503f-5 0.0 … 0.0 0.0; -0.003388777 -0.0033911965 … 0.00021679474 0.002166495; … ; -0.0004983841 -0.0042090206 … 0.0 0.0; 0.0029321339 -0.0031190368 … 0.0037281066 0.0;;; 0.0022693886 0.0 … 0.0 0.0; 0.007169022 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … -0.00018031301 -0.006131742; … ; 0.0 -0.00014391587 … 0.0046980577 0.0056127114; -3.7542195f-6 -0.001527812 … -0.0027841546 0.0049430886;;; 0.0022782965 -0.0042575635 … 0.0022085966 0.00043441134; 0.00079193746 0.0 … -0.00019219457 0.0009953931; … ; 0.0 0.0 … 0.0 0.0019906797; 0.0 0.0 … 0.0 0.0;;; 0.0013464186 -0.0008256198 … 0.0032198639 0.0; 0.0015455859 -0.0015441959 … 0.015265144 0.0008905824; … ; -0.0053401613 0.0094594555 … 0.009697458 0.000800819; -0.0016784286 0.00028017338 … -0.002633766 0.002039907;;; 0.0003004432 -0.0046986924 … -0.0030665086 -0.00062410254; -0.0020306502 0.006191542 … -0.0012569687 0.0; … ; -0.0040762527 0.0022583888 … 0.0005537418 0.0; 0.0014613654 0.00024701114 … 0.0 0.0;;; 0.0 0.000775391 … -0.00096943136 0.0; 0.0002462572 0.0 … 0.0 0.0; … ; -0.0044969087 0.0 … 0.0 0.0; 0.00065519975 0.0 … 0.0 0.0;;;;]\n [-0.0012747397; -0.0013297409; … ; 0.0; 0.0;;]\n [0.0; 0.0; … ; -0.038909033; -0.0047464888;;]\n [0.0; 0.0; … ; -0.038909033; -0.0047464888;;]\n [0.0; 0.0; … ; 0.0; 0.0;;]" }, "metadata": {}, "execution_count": 10 diff --git a/dev/generated/lrp/basics/index.html b/dev/generated/lrp/basics/index.html index 7b87a9b6..5b69ae8a 100644 --- a/dev/generated/lrp/basics/index.html +++ 
b/dev/generated/lrp/basics/index.html @@ -118,29 +118,29 @@ expl = analyze(input, analyzer; layerwise_relevances=true) expl.extras.layerwise_relevances
11-element Vector{Array{Float32}}:
- [-0.0038831537 -0.0016364201 … -0.045573 -0.0007823985; -0.009868371 -0.005603635 … -0.010315119 -0.0063129906; … ; 0.025076304 0.009065998 … 0.0012486618 -0.0009917524; 0.0046729147 -0.026177768 … 0.001904136 -0.004317519;;; 0.012448726 -0.0038384055 … -0.047494687 -0.00054027996; 0.0073520658 -0.0015974953 … -0.09586338 0.015866395; … ; -0.0050147185 -0.021957021 … -0.008861424 0.0007192572; -0.004703007 -0.00771669 … -0.0050953464 -0.00030545314;;; -0.0038291058 0.005184213 … 0.0049437806 0.00079900335; -0.014821771 -0.0008268345 … -0.015994215 0.04018665; … ; 0.021390665 -0.0047633173 … -0.0005296806 0.00029906866; 0.0047857366 0.0012345195 … 0.00937819 -0.0020163765;;;;]
- [-0.0 -0.0015384482 … 0.0 0.008598945; 0.00038294203 -0.0029863832 … 0.0 -0.0011195089; … ; -0.0 0.0 … 0.0014450476 -0.00048536164; -0.0 0.009702566 … -0.0009883826 -0.0011691569;;; -0.00079957035 -0.0005192932 … -0.0012548858 -0.00073483173; 0.0013056269 0.0 … 0.0 0.0; … ; 0.0034405012 0.0 … 0.000369388 0.0; -0.0008780009 0.00021489015 … -0.0 0.0;;; -0.0024585316 -0.012765162 … -0.026199805 -0.00451525; -0.0003825467 -0.0022131738 … 0.041093443 -0.009561424; … ; 0.0050402256 -0.0003939805 … 0.010613688 -0.011685435; -0.002224417 0.033943884 … 0.0 0.0015255885;;; 0.0033699325 -0.0018459557 … -0.0074523115 -0.0016924344; 0.0 -2.4054882f-5 … 0.0 -0.0; … ; -0.0 0.0012737393 … -0.00039809968 -0.0; 0.0 0.0 … -0.0 -0.0;;; 0.0 0.0 … -0.0 0.0; 0.0 -0.0 … 0.0 0.0; … ; 0.0 -0.0 … 0.0 0.0; 0.0 -0.0 … -0.0 0.0;;; 0.001290221 -0.0076769157 … 0.034402575 0.0; 0.00071924576 -0.0038246913 … 0.009003081 -0.0; … ; -0.009414813 0.004746493 … 0.0006822266 0.0; -0.0032743677 0.009196395 … -0.0030326967 0.0;;; 7.008447f-5 0.0014111816 … -0.0 0.0100263115; 0.0034034092 -0.0012987872 … -0.0004759902 0.0; … ; -0.002077336 -1.1088605f-5 … 0.0 -0.00042851767; 0.0 -0.0 … -0.0 -0.0;;; 0.0 -0.0006852316 … 0.00070702797 0.0024452826; 0.0 -0.0008835746 … -0.0016289164 0.0; … ; 0.0 0.0027091035 … -0.0 -0.0014551887; -0.0018945724 -0.0 … -0.0 0.0;;;;]
- [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 -0.009811934;;; 0.0 0.0 … 0.0 0.0; 0.0 0.005894333 … -0.01743813 0.0; … ; 0.0 -0.0037458532 … 0.012704272 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0031526769 0.0 … -0.0026008259 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0059148953 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 3.4611727f-5;;; 0.0 0.0 … -4.0115297f-5 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.012927536 0.0 … 0.024350777 0.0; … ; 0.0 0.037582703 … 0.0 0.0; 0.0 0.0 … -0.0018690513 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … -0.030507525 0.0; … ; 0.0 -0.0034406525 … 0.0 0.0; 0.0 0.0 … -0.009525346 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 -0.016103031 … 0.016972253 0.0; … ; 0.0 0.04648207 … 0.01000766 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 -0.009955549 … -0.102434464 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 -0.013700821 … 0.0 -0.016326614; 0.0 0.0 … 0.0 0.0;;;;]
- [-0.0 -0.0 … 4.5440836f-5 0.0; 0.0 -0.0 … -0.0 0.019273452; … ; -0.0 -0.0 … -0.0 -0.0021772722; 0.0 -0.0059929476 … -0.0003794208 -0.009811934;;; 0.005894333 -0.06769997 … 0.00410236 -0.01743813; -0.005548533 0.019269153 … 0.020970758 0.007751715; … ; 0.015911771 -0.0637522 … 0.027622208 0.030549388; -0.0037458532 -0.020323055 … 0.009707856 0.012704272;;; 0.0031526769 0.0017287281 … 0.016420767 -0.002600826; -0.0005557502 0.0 … 0.0 0.0; … ; -0.0031944741 0.0 … 0.0 -0.0; 0.0059148953 0.005947647 … 0.0 3.4611745f-5;;; 0.0 -0.0013372982 … 0.0 -4.01153f-5; 0.0 -0.0 … -0.0 -0.0; … ; 0.0 0.0 … -0.0 0.0; -0.0 0.0 … -0.0 0.0;;; 0.012927536 0.012694595 … 0.017712053 0.024350777; 0.006257179 -0.02464207 … 0.04431523 -0.0034392711; … ; 0.035829537 -0.1511845 … -0.029067537 0.022210369; 0.037582703 0.037922822 … 0.023974525 -0.0018690513;;; 0.0 -0.0 … -0.0023233723 -0.030507525; -0.0 0.0042026946 … 0.008070867 -0.036466334; … ; -0.0 -0.012943027 … -0.0003829958 -0.019259067; -0.0034406525 0.01941959 … 0.008958577 -0.009525346;;; -0.016103031 -0.008272432 … -0.12425316 0.016972253; 0.008363366 -0.029552955 … -0.04811538 -0.13425204; … ; 0.09131794 0.082262665 … -0.026476774 -0.03532927; 0.04648207 0.04768633 … 0.032613188 0.01000766;;; -0.009955549 -0.00033852574 … 0.0027721585 -0.102434464; 0.006476457 -0.014531804 … -0.054926477 -0.008889919; … ; -0.14278576 -0.086011484 … -0.032805458 -0.030393763; -0.013700821 0.03193436 … 0.013625924 -0.016326614;;;;]
- [-0.0 0.009837505 … 0.00083321135 0.0016100388; 0.0 0.0 … -0.0 0.014751482; … ; 0.0 0.0 … 0.0 0.0024911894; 0.0 0.0 … 0.0 0.00018069432;;; 0.0 0.0 … -0.0020939503 0.00061999797; 0.0 0.0 … -0.0 -0.007820387; … ; 0.0 -0.0 … 0.0 0.0052320133; -0.0 -0.0 … -0.0 -0.0031584154;;; -0.0 0.0 … 0.0 0.0; -0.0 -0.0 … -0.0 0.0; … ; -0.0 -0.0 … 0.0 -0.0; 0.0 -0.0 … -0.0 -0.0;;; … ;;; -0.0 -0.0038078476 … 0.012222226 -0.009125978; -0.023813767 0.027776431 … -0.026417498 0.03608647; … ; 0.03555107 -0.10374649 … 0.04450404 0.021566255; -0.0011495093 -0.008259712 … -0.01589209 -0.0059936964;;; -0.0 -0.0 … 0.0 -0.005831161; 0.0 0.0 … 0.0 -0.0; … ; -0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0028391446 -0.03785916 … -0.03625422 0.0063060126; 0.0028973455 -0.005039949 … 0.050228726 -0.011402732; … ; 0.0023721536 0.02924642 … -0.00808781 0.0046986477; -0.0 -0.0 … 0.0 0.0017547001;;;;]
- [-0.0 -0.00032919343 … -0.00021266754 -0.00032708238; -0.06445535 0.016767126 … 0.022303578 -0.0035928416; … ; -0.0039565526 -0.0027064886 … -0.018819261 -0.0003234109; -0.00032918417 -0.0 … -0.0 -0.0;;; -0.0030569562 7.234953f-5 … 0.0 0.0; 0.003388972 -0.0067799618 … 0.0 -0.024337104; … ; 3.1223368f-5 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; -0.020130573 -0.02041969 … -0.0056696357 -0.0; -0.0177962 -0.030271105 … -0.011712311 -0.010745524; … ; -0.028359203 -0.03234335 … -0.022150092 -0.012624363; -0.009600129 -0.015224243 … -0.008884472 -0.0065132948;;; -0.00092679274 0.0058560036 … 0.010244188 0.007135728; -0.012662035 0.020497264 … 0.015412333 0.01572129; … ; 0.0 0.0 … 0.0027642269 0.0065141325; 0.0052442094 0.007745886 … 0.0075674076 0.00055994437;;; 0.0032030393 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0014639655 0.0024306322 … 0.0036227761 0.001239807; -0.0049681785 -0.003751059 … -0.0024018092 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.00039988832; 2.032554f-7 0.0021579608 … -0.0014047278 0.00070380286;;; 0.0 0.001394097 … 0.0025973236 0.0; 0.0020144444 0.025083313 … 0.0021311545 0.0021580863; … ; 0.0021440175 0.0011802689 … 0.031960964 -0.09744046; 0.0013966016 -0.006770011 … 0.0029344524 0.047524255;;; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 5.5300447f-5;;;;]
- [0.0 0.0044041215 … 0.00016468443 -0.0020823092; -0.0044622724 0.00801288 … 0.0083776 -0.00090109697; … ; -0.0021566907 -0.0010610776 … -0.007804811 0.0029821396; 0.002566032 0.0 … 0.0 0.0;;; -0.0023988858 -0.00047416694 … 0.0 0.0; 0.0016085053 -0.0020122884 … 0.0 -0.00089375104; … ; 0.00012068136 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.011013946 0.012182047 … 0.0019965605 0.0; 0.0135749625 0.0023635984 … 0.003702852 0.0014732066; … ; -0.014268804 0.011048073 … -0.0009001733 0.0027087235; 0.0012373454 -0.00057613675 … 0.0007663287 0.0010424253;;; -0.0024143858 -0.003202515 … -0.00039428292 -0.0061316458; -0.009534627 -0.0024973657 … -0.0014496408 -0.0005599244; … ; 0.0 0.0 … -0.00047404226 0.0013510782; -0.0017790678 -0.00085535296 … -2.1873622f-5 0.00024357504;;; 0.0019340301 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.000767038 -0.003981961 … 0.0019554533 0.0006356343; -0.0056608682 -0.0051614223 … -0.0030260044 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0002860292; 4.4010453f-6 0.001764002 … -0.0020483749 0.00016754633;;; 0.0 -0.0018095961 … 0.0004545721 0.0; -0.0024584064 0.0021633047 … 0.0012440955 0.0063662436; … ; -0.0007544176 -0.00021924358 … 0.010368153 -0.0076135676; -0.00079697766 -0.0032707427 … -0.010955581 0.003324185;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 5.804623f-5;;;;]
- [0.0; -0.0044622724; … ; 0.0; 5.8046237f-5;;]
- [0.007762578; 0.0; … ; 0.0; 0.0;;]
- [0.007762578; 0.0; … ; 0.023404092; 0.0;;]
+ [-0.015887326 -0.058208115 … 0.0098114535 -0.012057768; 0.04041165 -0.1263505 … -0.088195056 -0.021429544; … ; -0.015893165 -0.007775398 … 0.014089855 0.0023246931; -0.041423798 -0.008618472 … 0.012849731 -0.0006974514;;; 0.010146738 0.0877583 … -0.008911166 -0.0010837569; -0.054595 -0.032387927 … -0.047349416 0.0016429567; … ; 0.02343945 -0.037683394 … -0.02318272 -0.036234837; -0.012450263 0.03321401 … -0.00058343366 -0.004807259;;; 0.07339452 0.01726334 … 0.00095575396 -0.006775429; -0.046373516 -0.07596873 … 0.02703556 -0.0039925864; … ; 0.011184323 0.004022099 … -0.014383364 0.0007300727; 0.0011864809 -0.032210015 … 0.001186319 -0.014550519;;;;]
+ [0.033594508 -0.022853648 … -0.003987488 -0.002643292; -0.0035315552 -0.083302915 … 0.0380814 0.004258694; … ; 0.033580724 -0.019374073 … -0.05942129 -0.019387627; -0.0012030138 -0.0078636 … 0.0 -0.007484102;;; -0.0 -0.0 … -0.0 0.0; 0.022769306 0.0 … -0.0 -0.0028232671; … ; 0.020034978 -0.0 … 0.0008376633 -0.0; 0.0058195856 0.00042984207 … -0.004361249 0.0;;; 0.0 0.027203923 … 0.0 -0.0; 0.030208122 -0.003560185 … -0.017017744 -0.0; … ; 0.04574615 -0.0034149908 … 0.0045322846 0.03340941; 0.0022795512 0.04062475 … 0.019724915 -0.0045433645;;; -0.03126347 0.08492896 … -0.031098926 -0.0; 0.0 -0.007855231 … -0.062298786 -0.0; … ; -0.00081358413 -0.0 … 0.013453669 0.0; 0.0036518634 0.022225184 … 0.0 0.0001959285;;; -0.0 -0.0 … 0.0 0.0; 0.0 0.0 … -0.0 -0.0; … ; -0.0 -0.0 … 0.0 0.0; 0.0 -0.0 … -0.0 -0.0;;; -0.0004087577 0.0054739346 … -0.043805853 -0.0; 0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … 0.0 0.0; 0.0 0.0 … 0.0 -0.0;;; -0.020142747 -0.06256223 … -0.013590244 -0.003092245; 0.016133465 0.05321706 … 0.07917775 0.021760387; … ; 0.00035645172 0.07619165 … 0.009551053 0.021080099; 0.0 -0.0020573982 … -0.009070266 0.0040898398;;; 0.002654754 -0.011097668 … 0.051736582 -0.0024451253; 0.0331135 0.0 … -0.0427726 -0.010964387; … ; -0.006909267 -0.025459513 … 0.005522591 -0.01121097; -0.00026007922 -0.0 … 0.0 0.0;;;;]
+ [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 -0.0024885454; … ; 0.0 0.0 … 0.0 0.0; 0.018896943 0.0 … -0.0018177402 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0022484888 … 0.0 0.011426358; … ; 0.0 0.0027788489 … 0.0014827962 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; -0.01156115 0.0 … 0.0 0.0; … ; 0.0 0.0038339351 … 0.0 0.004788911; 0.0 0.0 … 0.0 0.0;;; 0.0 0.09109991 … 0.0 0.0; 0.0 0.0 … 0.014098722 0.0; … ; 0.0 0.00021707235 … -0.037048806 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; -0.013339639 0.0 … 0.0 0.004211535; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 -0.046155088 … 0.0 0.0; 0.0 0.0 … -0.012450026 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.018775655 … 0.005621827 0.0;;; 0.0 0.10543561 … -0.021568106 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.013102754 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;;;]
+ [-0.0 -0.0 … -0.0 -0.0024885454; -0.011728548 0.1433166 … 0.01917907 0.011416879; … ; 0.008560104 0.0124386735 … -0.00038832307 -0.037304215; 0.018896943 0.057280656 … -0.10222816 -0.0018177402;;; 0.0022484888 -0.0040825084 … 0.043088324 0.011426358; 0.24205594 -0.21676223 … -0.04406478 -0.0014058232; … ; -0.015637476 -0.016657049 … -0.0058364905 0.04100866; 0.0027788489 -0.004617879 … 0.03365211 0.0014827962;;; -0.0 -0.0 … -0.0 -0.0; 0.0 -0.0 … 0.0 -0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 -0.0 … 0.0 -0.0;;; -0.011561151 0.0 … -0.0 -0.0; 0.010963951 0.012599284 … 0.009639426 0.0; … ; 0.08681142 0.053400084 … -0.0017996416 -0.0; 0.0038339351 0.036833685 … 0.09189414 0.004788911;;; 0.09109991 0.08994877 … -0.024679817 0.014098722; -0.03223931 -0.07114013 … -0.07491673 -0.014295357; … ; 0.044386044 0.012457834 … 0.023939012 0.015039931; 0.00021707237 -0.0039123967 … 0.041534808 -0.037048806;;; -0.013339639 0.00079768477 … -0.0 0.004211535; -0.0 0.0 … -0.030072391 0.0; … ; 0.023036582 0.022390915 … -0.0 -0.0; 0.0 -0.0 … -0.0042284727 0.0;;; -0.046155088 0.008014408 … -0.19311084 -0.012450026; -0.101781175 0.49882904 … -0.11255018 0.030970307; … ; 0.042792518 -0.021826135 … 0.21634623 0.010314441; 0.018775655 -0.1053141 … 0.02827021 0.005621827;;; 0.10543561 0.022896456 … 0.08090765 -0.021568106; 0.22146879 -0.0497979 … -0.0 -0.00030401722; … ; -0.05411764 0.010682818 … 0.034707684 -0.00074392464; 0.013102754 0.001181196 … -0.0 -0.0;;;;]
+ [0.0 0.0 … -0.0 0.0002007623; 0.0 0.03451721 … -0.0 0.0008650921; … ; -0.0 -0.0 … 0.017249426 0.006069709; 0.0 -0.00074658456 … 0.0006835919 0.0;;; -0.011476038 -0.007214478 … 0.014629279 -0.0; -0.0074302666 0.0026045754 … -0.0 0.0; … ; -0.018737286 -0.021805203 … -0.01369714 -0.0026236256; -0.016283523 0.063708484 … -0.035271876 0.0;;; -0.009625462 0.0076420913 … -0.010928319 0.0009458132; -0.00762036 0.12305745 … 0.0073253624 0.00010691043; … ; 0.010021585 -0.0265663 … 0.017286966 -0.0023634727; -0.0136828255 -0.009553743 … 0.035156634 0.0040319897;;; … ;;; -0.0003830397 -0.0 … -0.00022442544 -0.0; -0.0 0.0038119343 … 0.008094799 -0.020282632; … ; -0.0 -0.08258002 … 0.022602575 -0.0015307105; -0.003967533 -0.0008195345 … 0.0018436875 -0.00090942765;;; -0.0 4.2121395f-5 … 0.0027686825 -0.0; -0.03230181 0.0 … -0.0 0.0; … ; -0.025422787 0.0 … 0.022466319 -0.0; -0.0 -0.0 … -0.0 0.0;;; 0.0054352726 0.00791177 … 0.03388781 -0.01241298; -0.0 0.053023413 … -0.00019181857 0.0033808816; … ; 0.0 -0.0065312646 … 0.0945164 0.009299607; -0.0 -1.7685927f-5 … -0.0 0.0034514868;;;;]
+ [0.0028202797 0.0 … 0.004169422 0.00061992597; 0.0047180755 0.0015264327 … 0.0 0.003944678; … ; 0.00050097774 0.029236468 … 0.0023108225 0.0043646833; 0.030509656 0.0066761645 … 0.059903312 0.0067497175;;; 0.0 0.0072655655 … -0.0058527794 0.010513892; 0.0 8.086838f-6 … 0.017278602 -0.009785902; … ; 0.0 2.9730805f-5 … -0.003907089 0.024693465; 0.0 2.390558f-5 … 0.0003307237 1.8548f-5;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.015810953 -0.042435136 … 0.0011410884 -0.0; 0.012366156 -0.04300754 … -0.0006218589 -0.0; … ; -0.0 -0.08496949 … -0.0 -0.00027005776; -0.00078242837 -0.0 … 0.01880706 -0.0;;; -0.0 -0.0 … -0.0 -0.005349093; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 0.0030161103; -0.0 -0.0 … -0.0 -0.0;;; 0.0 -0.002517608 … 0.012442764 -0.0035677848; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0021824033 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0;;;;]
+ [-0.0011614505 0.0 … -0.00036145048 -0.00023153734; 0.00024599046 -0.00063758914 … 0.0 -0.009922795; … ; 0.00036544687 0.013303375 … -0.004808703 0.016644664; 0.0050626695 0.00837999 … 0.022046492 -0.0010091395;;; 0.0 0.0007426466 … -0.0033568693 0.008229597; 0.0 -0.00010124607 … 0.010036692 -0.005313186; … ; 0.0 -0.0055145794 … -0.0018333677 0.015905604; 0.0 0.0005536729 … 0.00039878604 -0.0052205883;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.010816978 -0.019193588 … 0.0011050606 0.0; 0.007288073 -0.020774215 … 0.0017266205 0.0; … ; 0.0 -0.0040241205 … 0.0 0.0009949756; -0.003989051 0.0 … 0.010726973 0.0;;; 0.0 0.0 … 0.0 0.0028603142; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.00257416; 0.0 0.0 … 0.0 0.0;;; 0.0 -0.001990706 … 0.017968236 -0.0013600787; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.002281694 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;;;]
+ [-0.0011614505; 0.00024599046; … ; 0.0; 0.0;;]
+ [-0.0047784545; 0.010044609; … ; 0.0; 0.059079066;;]
+ [-0.004778455; 0.010044609; … ; 0.0; 0.059079066;;]
  [0.0; 0.0; … ; 0.0; 0.0;;]

Note that the layerwise relevances are kept only for layers in the outermost Chain of the model. When using our unflattened model, we obtain only three layerwise relevances, one for each chain in the model plus the output relevance:

analyzer = LRP(model; flatten=false) # use unflattened model
 
 expl = analyze(input, analyzer; layerwise_relevances=true)
 expl.extras.layerwise_relevances
11-element Vector{Array{Float32}}:
- [0.0010207323 -0.007980485 … -0.032674737 0.0045772777; -0.027682293 0.0058941203 … 0.0020468947 -0.011886046; … ; 0.014307547 -0.0047217556 … -0.008261258 -0.0047789533; -4.8582344f-5 0.0030975891 … -0.0030999926 -0.0014275227;;; 0.0038287179 -0.0050959843 … -0.015875975 -0.0012506621; 0.01762196 -0.00019205933 … 0.000342701 -0.0021429104; … ; -0.0010821145 0.0074184453 … -0.018994732 -0.0015100081; 0.0066144764 -0.001667912 … -0.004199393 -0.00017773638;;; -0.0029024556 0.0061603324 … 0.005302045 0.0017092457; -0.020301988 -0.0023542568 … -0.011563395 0.03955199; … ; -0.005022351 -0.0042651542 … 0.0051177396 0.00030817188; -0.004116833 -0.0051480336 … -0.0026709978 0.00047814468;;;;]
- [-0.0 -0.0022069407 … -0.0 0.002664379; 0.00010990416 -0.0032996142 … 0.0 0.0068608257; … ; 0.0 0.0 … 0.0003263809 0.0030069803; -0.0 -0.0009069561 … -0.0007290192 -0.0012759961;;; -0.001357918 -0.0024247447 … 0.0059249178 -0.00022268957; -0.0019146611 0.0 … 0.0 -0.0; … ; 0.0019866803 -0.0 … 0.00034997708 0.0; -0.00010038443 -0.0016282512 … -0.0 -0.0;;; -0.0049129566 -0.0033877972 … -0.011050789 -0.0022807335; 0.0066427714 -0.023294369 … 0.018099273 0.012962668; … ; -0.007587545 -0.005291812 … -0.01204115 -0.0075623565; -0.0015627532 -0.0022434099 … 0.0 -0.00039409686;;; -0.0006047327 0.0012370368 … 0.0048490893 -0.005150113; 0.0 -3.007865f-5 … 0.0 0.0; … ; -0.0 -0.00070633786 … -0.0010676244 -0.0; 0.0 0.0 … -0.0 -0.0;;; -0.0 0.0 … 0.0 0.0; 0.0 -0.0 … 0.0 -0.0; … ; 0.0 0.0 … 0.0 0.0; -0.0 0.0 … 0.0 0.0;;; 0.0039219316 -0.009974134 … 0.022232184 -0.0; 0.0013000884 -0.0025634805 … 0.004623608 0.0; … ; 0.0051482697 -0.0054312684 … 0.004514884 0.0; -0.0010540513 -0.0014582151 … -0.0033840511 0.0;;; -0.0004368335 0.0035060833 … 0.0 -0.000973588; 0.0015061196 0.00011707891 … 0.011551401 0.0; … ; 0.00193334 6.9175417f-6 … 0.0 0.0014759052; 0.0 0.0 … -0.0 -0.0;;; -0.0 -0.0009980901 … -0.00055147294 0.002898692; 0.0 -0.0006494175 … -0.0021454024 0.0; … ; 0.0 0.0027915372 … -0.0 -0.002086834; 8.737816f-5 -0.0 … -0.0 0.0;;;;]
- [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 -0.004800467;;; 0.0 0.0 … 0.0 0.0; 0.0 0.017522603 … 0.034752846 0.0; … ; 0.0 0.004675548 … -0.004731988 0.0; 0.0 0.0 … 0.0 0.0;;; -0.0158092 0.0 … 0.0033431738 0.0; 0.0 0.0 … 0.0 0.0; … ; -0.00095969904 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 -4.8689613f-5;;; 0.0 0.0 … 0.0037445093 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0036162864 0.0 … 0.013698082 0.0; … ; 0.0 0.01124425 … 0.0 0.0; 0.0 0.0 … 0.0027833695 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … -0.028024347 0.0; … ; 0.0 -0.00027473547 … 0.0 0.0; 0.0 0.0 … -0.014293882 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 -0.01207277 … -0.062140226 0.0; … ; 0.0 -0.010729361 … 0.007876189 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 -0.012971649 … 0.02372979 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 -0.00049811776 … 0.0 -0.02145152; 0.0 0.0 … 0.0 0.0;;;;]
- [-0.0 0.0 … 0.013893512 0.0; -0.0 -0.0 … -0.0 -0.02704144; … ; 0.0 -0.0 … -0.0 0.0012966063; 0.0 -0.022321453 … -0.00035889685 -0.004800467;;; 0.017522603 -0.062322963 … 0.0762988 0.034752846; -0.017428866 -0.019996174 … 0.059957143 -0.119387954; … ; 0.026025044 0.048225258 … 0.00820133 0.034158085; 0.004675548 -0.010452151 … 0.015543055 -0.004731988;;; -0.0158092 0.0016721005 … -0.022262266 0.0033431738; -0.0007202163 -0.0 … -0.0 -0.0; … ; 0.010006575 0.0 … 0.0 -0.0; -0.00095969904 0.0016532672 … -0.0 -4.868964f-5;;; 0.0 -0.0028197714 … -0.0 0.0037445098; 0.0 -0.0 … -0.0 -0.0; … ; 0.0 0.0 … 0.0 -0.0; -0.0 0.0 … -0.0 0.0;;; 0.0036162864 0.011475153 … -0.007317215 0.013698082; -0.014237667 -0.057123855 … -0.016851736 0.005776418; … ; -0.04622821 0.067472324 … -0.018283978 -0.027103558; 0.01124425 -0.0018985006 … -0.004256531 0.0027833695;;; 0.0 -0.0 … 0.0021832837 -0.028024347; -0.0 -0.005130376 … 0.00027432645 0.014130047; … ; 0.0 -0.031428806 … -0.004049 -0.015983729; -0.00027473547 -0.0021309366 … -0.0048160055 -0.014293882;;; -0.01207277 -0.054163843 … -0.08658156 -0.062140226; -0.025754971 0.0016448617 … -0.08806112 0.04779901; … ; -0.012978482 -8.204502f-5 … -0.0064664073 -0.008652067; -0.010729361 -0.00642478 … 0.0023604718 0.007876189;;; -0.012971649 0.045052778 … 0.12304125 0.02372979; 0.014348112 -0.03575912 … -0.11106775 -0.07331434; … ; -0.021503977 0.058245074 … -0.039259996 -0.01815788; -0.00049811776 -0.019827712 … 0.0025082864 -0.02145152;;;;]
- [-0.0 -0.0069106407 … 0.017630795 0.0053351102; 0.0 -0.0 … -0.0 -0.015785221; … ; 0.0 0.0 … 0.0 0.002176354; 0.0 0.0 … 0.0 0.00015630218;;; 0.0 0.0 … 0.0027944427 -0.0020671915; 0.0 0.0 … -0.0 0.009794469; … ; 0.0 0.0 … 0.0 0.004196387; 0.0 -0.0 … 0.0 -0.0011542102;;; -0.0 0.0 … 0.0 -0.0; 0.0 -0.0 … -0.0 -0.0; … ; -0.0 0.0 … 0.0 0.0; 0.0 -0.0 … 0.0 -0.0;;; … ;;; -0.0 -0.0025528236 … 0.008244845 0.014012203; -0.034055445 0.0057157986 … -0.08342037 -0.0433095; … ; 0.01988188 -0.0029564125 … 0.0509651 0.008807763; 0.0025244674 -0.010957105 … 0.033860177 0.0016514664;;; -0.0 0.0 … -0.0 0.009983332; -0.0 0.0 … 0.0 0.0; … ; -0.0 0.0 … -0.0 0.0; -0.0 0.0 … 0.0 0.0;;; 0.0015559589 -0.098932415 … -0.031985477 -0.008361466; -0.013876278 0.006019482 … -0.0130204195 0.019404432; … ; -0.002724205 -0.025462287 … 0.028628252 -0.011128603; -0.0 -0.0 … -0.0 -0.00043414664;;;;]
- [0.0 0.0007313333 … 0.00023889716 0.000802325; -0.1229928 0.01673135 … 0.015141603 0.00059648673; … ; -0.0012434542 0.0043078335 … -0.02793356 0.0008545891; 0.00071591506 0.0 … 0.0 0.0;;; -0.0007877251 -0.00011291591 … -0.0 -0.0; -0.001261818 0.0066601327 … -0.0 0.036132533; … ; -4.8772443f-5 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0;;; -0.019425998 -0.019744134 … -0.0045461487 -0.0; -0.016877916 -0.030849254 … -0.010450846 -0.009466191; … ; -0.028659314 -0.033238385 … -0.021658922 -0.011390429; -0.00831633 -0.014118344 … -0.0076080714 -0.0053279707;;; -0.009999247 0.00022019238 … 0.00041797463 0.00027496935; -0.0036646854 -0.0068193427 … 0.01696281 -0.017905114; … ; 0.0 0.0 … 9.77087f-5 0.00024806437; 0.00019484932 0.00030192756 … 0.00029398574 1.889332f-5;;; 0.0011164722 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0; … ; 0.0015054727 -0.00095814624 … 0.00077497464 0.00029341725; -0.00924244 -0.0033535266 … 0.0032362689 -0.0;;; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 0.00167709; -3.8251355f-7 0.0008236506 … -0.003184022 0.002083247;;; -0.0 -0.0014248195 … -0.0042323847 -0.0; -0.0025525352 -0.057414826 … -0.002824782 -0.0028911857; … ; -0.00285632 -0.045944743 … 0.0483161 0.046754297; -0.001428533 -0.040429372 … -0.0058609266 -0.023800116;;; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 0.00052386103;;;;]
- [0.0 0.0028886402 … -2.7521888f-5 -0.00186171; -0.008642377 0.0065249153 … 0.0047177873 -0.00036978873; … ; -0.0020629764 0.00058060314 … -0.012415658 0.0029499922; 0.0018105485 0.0 … 0.0 0.0;;; 0.00029362284 -0.000107320295 … 0.0 0.0; 0.0003258799 0.0020408833 … 0.0 0.0013276662; … ; -1.4509937f-5 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.011150454 0.019498734 … -0.0021007138 0.0; -0.0003544218 0.0077359406 … -0.003043904 0.004699904; … ; -0.013695506 0.009964922 … -0.006913719 0.0026891092; 0.0008233385 -0.00017595408 … 0.0030297842 0.0024972649;;; -0.0015719595 -0.004308788 … -0.0014717263 -0.005392498; -0.0013754313 -0.0039540417 … 0.005317406 -0.0048811506; … ; 0.0 0.0 … -0.00010810665 0.00040118356; -0.00029156238 -0.0029937886 … -0.0024569104 0.0;;; 0.0014592897 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0014017643 0.002015151 … 0.00056676567 0.0003718638; -0.0071956627 -0.0018875207 … 0.003443957 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0012142856; 4.6323335f-6 0.0004671621 … -0.006081901 -0.0001528387;;; 0.0 0.001480141 … -0.0058481935 0.0; -0.0052959304 -0.004699109 … 0.0014950532 0.003308977; … ; -0.00024809898 -0.010379411 … 0.023208247 0.0042587984; 0.0006641177 -0.008176546 … -4.94587f-5 -0.0010696442;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.00054987177;;;;]
- [0.0; -0.008642377; … ; 0.0; 0.0005498718;;]
- [0.0; 0.0; … ; 0.024125015; 0.0;;]
- [0.0073964978; 0.0; … ; 0.024125015; 0.0011199445;;]
+ [0.012075527 -0.033965454 … 0.0062748506 0.010914707; 0.038185876 -0.10301141 … -0.04296844 -0.012282188; … ; 0.03680307 0.0063903756 … 0.009590587 0.0029232781; 0.060611613 0.023181804 … -0.0030978534 0.005093491;;; 0.006834915 0.079793856 … -0.00652297 0.0062898556; -0.036173407 -0.021219939 … -0.040224627 0.0038746095; … ; 0.015966563 0.07940842 … 0.019815344 -0.05508882; 0.017490491 -0.024528498 … 0.00305663 0.011775665;;; 0.06534993 0.01075969 … 0.0015058452 -0.011167471; -0.056264095 -0.08691356 … 0.028782493 -0.0030964124; … ; -0.025973715 0.0027485758 … 0.011712879 0.001995061; -0.0027966953 0.06693694 … 0.018838467 -0.01121829;;;;]
+ [0.021132203 -0.011900863 … -0.003244848 -0.0037818954; -0.0054758335 -0.082392514 … 0.039451905 -0.001930054; … ; -0.07092816 0.02321414 … -0.07196308 -0.013696575; 0.0034157662 0.00878201 … 0.0 -0.004049753;;; -0.0 0.0 … -0.0 0.0; 0.017962756 0.0 … -0.0 -0.0005708499; … ; -0.04256936 -0.0 … -0.026786232 -0.0; -0.0067991745 0.010044919 … -0.0029627213 0.0;;; 0.0 0.018287307 … -0.0 0.0; 0.019332092 0.0020243446 … -0.008135518 0.0; … ; -0.06692168 0.031454932 … 0.019724559 0.03330083; -0.0026828803 -0.040699948 … 0.0332756 -0.008093928;;; -0.01440019 0.031353157 … -0.0083758505 0.0; 0.0 -0.007342835 … -0.039693568 0.0; … ; 0.0032152592 0.0 … 0.028542178 0.0; -0.009116458 -0.01642934 … 0.0 0.00026276996;;; -0.0 -0.0 … 0.0 -0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; -0.0 0.0 … -0.0 -0.0;;; -0.00038011523 0.0016684806 … -0.03226623 -0.0; 0.0 -0.0 … -0.0 -0.0; … ; 0.0 0.0 … 0.0 0.0; -0.0 -0.0 … 0.0 -0.0;;; 0.010717269 -0.08184479 … -0.0021395213 -0.004668875; 0.0040217885 0.056555618 … 0.042825513 -0.0030855576; … ; -0.00025266074 -0.092097044 … 0.016299883 0.0374941; -0.0 0.01892568 … -0.0037127272 0.0062401453;;; 0.008265171 -0.032780208 … 0.017245477 0.0016902565; 0.02613619 0.0 … -0.025403498 -0.003582134; … ; 0.013642977 0.041020185 … 0.009061118 -0.011997079; 0.00048377103 0.0 … 0.0 -0.0;;;;]
+ [0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0024356006; … ; 0.0 0.0 … 0.0 0.0; -0.04923193 0.0 … 0.00040502253 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.013409034 … 0.0 0.007944113; … ; 0.0 -0.0050453716 … -0.015503197 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; -0.011398064 0.0 … 0.0 0.0; … ; 0.0 0.016680049 … 0.0 0.0036451952; 0.0 0.0 … 0.0 0.0;;; 0.0 0.041931257 … 0.0 0.0; 0.0 0.0 … -0.014231104 0.0; … ; 0.0 0.00034277313 … -0.024753492 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; -0.0038272126 0.0 … 0.0 -0.012051777; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 -0.075145006 … 0.0 0.0; 0.0 0.0 … -0.04074316 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 -0.015887788 … 0.041650385 0.0;;; 0.0 0.052927982 … -0.011832283 0.0; 0.0 0.0 … 0.0 0.0; … ; -0.020751437 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;;;]
+ [-0.0 -0.0 … -0.0 0.0024356006; -0.010429319 0.12607297 … -0.0061041475 -0.003083829; … ; -0.027474063 -0.064783074 … -0.0011379219 -0.057110295; -0.04923193 -0.10265392 … -0.0973984 0.00040502253;;; 0.013409034 0.013160653 … 0.01726882 0.007944113; 0.119631246 -0.14587191 … -0.050687313 0.016962249; … ; 0.023661459 0.024825037 … 0.029446127 0.049719177; -0.0050453716 -0.049069487 … 0.04378974 -0.015503197;;; -0.0 -0.0 … -0.0 0.0; 0.0 -0.0 … 0.0 -0.0; … ; -0.0 -0.0 … 0.0 0.0; -0.0 0.0 … 0.0 0.0;;; -0.011398065 0.0 … -0.0 -0.0; -0.01736959 0.011223614 … 0.0056100977 -0.0; … ; -0.15844041 -0.08405198 … -0.0027374479 0.0; 0.016680049 -0.019766783 … 0.04877694 0.0036451952;;; 0.041931257 0.029733831 … -0.0025247468 -0.014231104; -0.03149809 -0.06821704 … -0.05066849 -0.014421882; … ; -0.0744649 -0.016112905 … 0.024088502 0.04548024; 0.0003427732 0.050394405 … 0.05503072 -0.024753492;;; -0.0038272126 0.00050944206 … -0.0 -0.012051777; -0.0 0.0 … -0.017726872 -0.0; … ; -0.038551215 -0.058575206 … -0.0 -0.0; -0.0 0.0 … -0.004859534 0.0;;; -0.075145006 0.038832165 … -0.040437188 -0.04074316; -0.108427234 0.33997026 … 0.0031223828 0.0065576024; … ; -0.07154113 0.08135031 … 0.025715886 -0.06642828; -0.015887788 0.23244813 … -0.0011331516 0.041650385;;; 0.052927982 0.09181431 … 0.05724064 -0.011832283; 0.15357831 -0.024482159 … -0.0 0.0015081036; … ; 0.15987286 -0.0014881686 … 0.070171796 -0.0056991717; -0.020751437 0.00404256 … -0.0 -0.0;;;;]
+ [0.0 -0.0 … 0.0 0.0005422209; 0.0 0.032469094 … 0.0 -0.0018230015; … ; 0.0 0.0 … 0.006113024 0.019767454; -0.0 0.0014404448 … 0.00052689837 0.0;;; -0.010642059 -0.008741316 … 0.0023216624 -0.0; -0.0033221976 0.013793992 … -0.0 0.0; … ; 0.039489865 0.007907649 … 0.0044166134 -0.008919255; 0.048689846 -0.14732403 … -0.040441446 0.0;;; 0.0011248641 0.048347138 … -0.045463037 -0.0006082539; -0.027674185 0.03532224 … 0.03959914 0.0013659149; … ; -0.020682504 0.04762185 … 0.03193263 0.009543797; 0.02251015 -0.005022062 … 0.028766094 0.0045430507;;; … ;;; -0.00088622066 -0.0 … -0.00125109 -0.0; -0.0 0.0041082343 … -0.0036717805 -0.0120829; … ; 0.0 0.10593838 … -0.0077199303 0.00040635068; 0.0067463364 0.000641426 … 0.009033866 -0.006938435;;; -0.0 1.6550723f-5 … 0.0015407992 -0.0; -0.034198128 0.0 … 0.0 -0.0; … ; 0.045488425 -0.0 … 0.007379008 -0.0; 0.0 0.0 … -0.0 -0.0;;; -0.005927929 -0.033522956 … 0.044964444 -0.025499523; -0.0 0.03570822 … 0.00011794532 -0.0051862826; … ; -0.0 0.0025914835 … 0.054987065 -0.035647567; 0.0 -0.0003149487 … -0.0 -0.0018737761;;;;]
+ [0.010124285 0.0 … 0.016223172 0.002010543; 0.019095352 0.0051449803 … 0.0 0.015122687; … ; 0.0016170243 0.059414644 … 0.00808145 0.017213115; 0.00961713 0.033469796 … 0.07159322 0.034287114;;; 0.0 -0.05963639 … 0.010644504 0.008845206; 0.0 1.7535249f-6 … 0.019827003 -0.023494847; … ; 0.0 1.8770475f-5 … -0.024158811 0.02483532; 0.0 4.5094246f-5 … 0.0148605425 5.3770742f-5;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; -0.028357439 -0.00075395015 … 0.0010209352 -0.0; 0.010532249 -0.018197875 … -0.00012505849 -0.0; … ; -0.0 0.16741714 … -0.0 -5.03216f-5; -0.00016340692 -0.0 … 0.013541518 -0.0;;; -0.0 -0.0 … -0.0 -0.027302753; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 -0.00020184672; -0.0 -0.0 … -0.0 -0.0;;; -0.0 0.00010945892 … 0.023903675 -0.005050329; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.001442671 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0; … ; -0.0 -0.0 … -0.0 -0.0; -0.0 -0.0 … -0.0 -0.0;;;;]
+ [0.0036867645 0.0 … 0.009077327 0.0; -0.0008476483 -0.00082719704 … 0.0 -0.0005173; … ; 0.0 -0.011804008 … -0.0019885113 0.0027204514; -0.007668133 0.019490136 … 0.004388473 -0.02339485;;; 0.0 -0.006089351 … 0.0065577906 0.0057301857; 0.0 0.0 … 0.011053315 -0.013486754; … ; 0.0 -0.00089925155 … -0.01865479 0.015318556; 0.0 0.0061494582 … 0.008147786 -0.0021142247;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; -0.016631544 -0.00010242814 … 0.0005805541 0.0; 0.0056723566 -0.008905447 … 0.0031136973 0.0; … ; 0.0 0.00805382 … 0.0 0.0004743735; 0.0023537732 0.0 … 0.0073112966 0.0;;; 0.0 0.0 … 0.0 -0.0068222573; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.00014966603; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0001645386 … 0.027381457 -0.0033824313; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;; 0.0016362044 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0;;;;]
+ [0.0036867647; -0.00084764825; … ; 0.0; 0.0;;]
+ [-0.0; 0.0; … ; 0.0; 0.0;;]
+ [-0.0063927816; 0.010211879; … ; 0.0; 0.056226533;;]
  [0.0; 0.0; … ; 0.0; 0.0;;]

Performance tips

Using LRP without a GPU

Since ExplainableAI.jl's LRP implementation makes use of Tullio.jl, analysis on the CPU can be accelerated by loading LoopVectorization.jl.

This only requires loading the LoopVectorization.jl package before ExplainableAI.jl:

using LoopVectorization
-using ExplainableAI

This page was generated using Literate.jl.

+using ExplainableAI

This page was generated using Literate.jl.

diff --git a/dev/generated/lrp/composites/index.html b/dev/generated/lrp/composites/index.html index 155704ae..74efb63e 100644 --- a/dev/generated/lrp/composites/index.html +++ b/dev/generated/lrp/composites/index.html @@ -192,4 +192,4 @@ Dropout(0.5) => PassRule(), Dense(512 => 100, relu) => EpsilonRule{Float32}(1.0f-6), ), -)

This page was generated using Literate.jl.

+)

This page was generated using Literate.jl.

diff --git a/dev/generated/lrp/custom_layer/index.html b/dev/generated/lrp/custom_layer/index.html index 0a8fee60..41122705 100644 --- a/dev/generated/lrp/custom_layer/index.html +++ b/dev/generated/lrp/custom_layer/index.html @@ -65,4 +65,4 @@ LRP(model; skip_checks=true)
LRP(
   Dense(100 => 20, unknown_activation)                    => ZeroRule(),
   Main.MyDoublingLayer() => ZeroRule(),
-)

Instead of throwing the usual ERROR: Unknown layer or activation function found in model, the LRP analyzer was created without having to register either the layer UnknownLayer or the activation function unknown_activation.


This page was generated using Literate.jl.

+)

Instead of throwing the usual ERROR: Unknown layer or activation function found in model, the LRP analyzer was created without having to register either the layer UnknownLayer or the activation function unknown_activation.


This page was generated using Literate.jl.

diff --git a/dev/generated/lrp/custom_rules/index.html b/dev/generated/lrp/custom_rules/index.html index c56a7dc5..ae8cb347 100644 --- a/dev/generated/lrp/custom_rules/index.html +++ b/dev/generated/lrp/custom_rules/index.html @@ -52,4 +52,4 @@ │ calls │ calls ┌─────────▼─────────┐ ┌─────────▼─────────┐ │ modify_parameters │ │ modify_parameters │ -└───────────────────┘ └───────────────────┘

Therefore modify_layer should only be extended for a specific rule and a specific layer type.

Advanced LRP rules

To implement custom LRP rules that require more than modify_layer, modify_input and modify_denominator, take a look at the LRP developer documentation.


This page was generated using Literate.jl.

+└───────────────────┘ └───────────────────┘

Therefore modify_layer should only be extended for a specific rule and a specific layer type.

Advanced LRP rules

To implement custom LRP rules that require more than modify_layer, modify_input and modify_denominator, take a look at the LRP developer documentation.


This page was generated using Literate.jl.

diff --git a/dev/index.html b/dev/index.html index da1a4799..905819a6 100644 --- a/dev/index.html +++ b/dev/index.html @@ -1,2 +1,2 @@ -Home · ExplainableAI.jl

ExplainableAI.jl

Explainable AI in Julia using Flux.jl.

Installation

To install this package and its dependencies, open the Julia REPL and run

julia> ]add ExplainableAI

Manual

General usage

LRP

API reference

General

LRP

+Home · ExplainableAI.jl

ExplainableAI.jl

Explainable AI in Julia using Flux.jl.

Installation

To install this package and its dependencies, open the Julia REPL and run

julia> ]add ExplainableAI

Manual

General usage

LRP

API reference

General

LRP

diff --git a/dev/lrp/api/index.html b/dev/lrp/api/index.html index 4fc6edea..19cab4e1 100644 --- a/dev/lrp/api/index.html +++ b/dev/lrp/api/index.html @@ -1,17 +1,17 @@ -LRP · ExplainableAI.jl

LRP analyzer

Refer to LRP for documentation on the LRP analyzer.

LRP rules

ExplainableAI.ZeroRuleType
ZeroRule()

LRP-$0$ rule. Commonly used on upper layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i \frac{W_{ij}a_j^k}{\sum_l W_{il}a_l^k+b_i} R_i^{k+1}\]

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
source
ExplainableAI.EpsilonRuleType
EpsilonRule([epsilon=1.0e-6])

LRP-$ϵ$ rule. Commonly used on middle layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{W_{ij}a_j^k}{\epsilon +\sum_{l}W_{il}a_l^k+b_i} R_i^{k+1}\]

Optional arguments

  • epsilon: Optional stabilization parameter, defaults to 1.0e-6.

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
source
ExplainableAI.GammaRuleType
GammaRule([gamma=0.25])

LRP-$γ$ rule. Commonly used on lower layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{(W_{ij}+\gamma W_{ij}^+)a_j^k} - {\sum_l(W_{il}+\gamma W_{il}^+)a_l^k+(b_i+\gamma b_i^+)} R_i^{k+1}\]

Optional arguments

  • gamma: Optional multiplier for added positive weights, defaults to 0.25.

References

  • G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
source
ExplainableAI.WSquareRuleType
WSquareRule()

LRP-$w²$ rule. Commonly used on the first layer when values are unbounded.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{W_{ij}^2}{\sum_l W_{il}^2} R_i^{k+1}\]

References

  • G. Montavon et al., Explaining Nonlinear Classification Decisions with Deep Taylor Decomposition
source
ExplainableAI.FlatRuleType
FlatRule()

LRP-Flat rule. Similar to the WSquareRule, but with all weights set to one and all bias terms set to zero.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{1}{\sum_l 1} R_i^{k+1} = \sum_i\frac{1}{n_i} R_i^{k+1}\]

where $n_i$ is the number of input neurons connected to the output neuron at index $i$.

References

  • S. Lapuschkin et al., Unmasking Clever Hans predictors and assessing what machines really learn
source
ExplainableAI.AlphaBetaRuleType
AlphaBetaRule([alpha=2.0, beta=1.0])

LRP-$αβ$ rule. Weights positive and negative contributions according to the parameters alpha and beta respectively. The difference $α-β$ must be equal to one. Commonly used on lower layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\left( +LRP · ExplainableAI.jl

LRP analyzer

Refer to LRP for documentation on the LRP analyzer.

LRP rules

ExplainableAI.ZeroRuleType
ZeroRule()

LRP-$0$ rule. Commonly used on upper layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i \frac{W_{ij}a_j^k}{\sum_l W_{il}a_l^k+b_i} R_i^{k+1}\]

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
source
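The propagation formula above can be illustrated numerically. The following is a minimal pure-Python sketch of the LRP-0 rule for a single dense layer (a hypothetical illustration, not the package's Julia implementation; `lrp_zero` and its arguments are made-up names):

```python
# Hypothetical sketch of the LRP-0 rule for one dense layer.
# W: weight matrix (outputs x inputs), b: bias, a: layer input,
# R: relevance arriving at the layer output.
def lrp_zero(W, b, a, R):
    # Denominator: z_i = sum_l W_il * a_l + b_i (the pre-activation)
    z = [sum(W[i][l] * a[l] for l in range(len(a))) + b[i] for i in range(len(R))]
    # R_j = sum_i W_ij * a_j / z_i * R_i
    return [sum(W[i][j] * a[j] / z[i] * R[i] for i in range(len(R)))
            for j in range(len(a))]

W = [[1.0, -1.0], [2.0, 0.0]]
b = [0.0, 0.0]
a = [1.0, 0.5]
R = [1.0, 1.0]
print(lrp_zero(W, b, a, R))
```

With zero bias the rule is conservative: the returned input relevances sum to the same total as `R`.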
ExplainableAI.EpsilonRuleType
EpsilonRule([epsilon=1.0e-6])

LRP-$ϵ$ rule. Commonly used on middle layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{W_{ij}a_j^k}{\epsilon +\sum_{l}W_{il}a_l^k+b_i} R_i^{k+1}\]

Optional arguments

  • epsilon: Optional stabilization parameter, defaults to 1.0e-6.

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
source
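As a hypothetical sketch (in Python rather than the package's Julia), the ε-rule differs from LRP-0 only by the stabilizer added to the denominator, which damps the relevance that reaches the input:

```python
# Hypothetical sketch of the LRP-epsilon rule: LRP-0 with epsilon added
# to the denominator, following the formula in the docstring above.
def lrp_epsilon(W, b, a, R, epsilon=1e-6):
    z = [epsilon + sum(W[i][l] * a[l] for l in range(len(a))) + b[i]
         for i in range(len(R))]
    return [sum(W[i][j] * a[j] / z[i] * R[i] for i in range(len(R)))
            for j in range(len(a))]

# An exaggerated epsilon absorbs part of the relevance:
print(lrp_epsilon([[1.0, 1.0]], [0.0], [1.0, 1.0], [1.0], epsilon=2.0))
```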
ExplainableAI.GammaRuleType
GammaRule([gamma=0.25])

LRP-$γ$ rule. Commonly used on lower layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{(W_{ij}+\gamma W_{ij}^+)a_j^k} + {\sum_l(W_{il}+\gamma W_{il}^+)a_l^k+(b_i+\gamma b_i^+)} R_i^{k+1}\]

Optional arguments

  • gamma: Optional multiplier for added positive weights, defaults to 0.25.

References

  • G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
source
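A hypothetical pure-Python sketch of the γ-rule for a dense layer (illustrative only; the real implementation is Julia). The weights and bias are first replaced by $W+\gamma W^+$ and $b+\gamma b^+$, then relevance is propagated as in LRP-0:

```python
# Hypothetical sketch of the LRP-gamma rule: favor positive weights by
# adding gamma times their positive part before propagating relevance.
def lrp_gamma(W, b, a, R, gamma=0.25):
    Wg = [[w + gamma * max(w, 0.0) for w in row] for row in W]   # W + gamma*W^+
    bg = [bi + gamma * max(bi, 0.0) for bi in b]                 # b + gamma*b^+
    z = [sum(Wg[i][l] * a[l] for l in range(len(a))) + bg[i] for i in range(len(R))]
    return [sum(Wg[i][j] * a[j] / z[i] * R[i] for i in range(len(R)))
            for j in range(len(a))]

print(lrp_gamma([[1.0, -1.0]], [0.0], [1.0, 0.5], [1.0]))
```

Because the same modified weights appear in numerator and denominator, the total relevance is still conserved when the bias is zero.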
ExplainableAI.WSquareRuleType
WSquareRule()

LRP-$w²$ rule. Commonly used on the first layer when values are unbounded.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{W_{ij}^2}{\sum_l W_{il}^2} R_i^{k+1}\]

References

  • G. Montavon et al., Explaining Nonlinear Classification Decisions with Deep Taylor Decomposition
source
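The w²-rule is input-independent, which a short hypothetical Python sketch makes obvious (the input `a` does not appear at all):

```python
# Hypothetical sketch of the w-square rule: relevance is split in
# proportion to squared weights only, ignoring the layer input.
def lrp_wsquare(W, R):
    return [sum(W[i][j] ** 2 / sum(w ** 2 for w in W[i]) * R[i]
                for i in range(len(R)))
            for j in range(len(W[0]))]

print(lrp_wsquare([[3.0, 4.0]], [1.0]))  # split as 9/25 and 16/25
```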
ExplainableAI.FlatRuleType
FlatRule()

LRP-Flat rule. Similar to the WSquareRule, but with all weights set to one and all bias terms set to zero.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{1}{\sum_l 1} R_i^{k+1} = \sum_i\frac{1}{n_i} R_i^{k+1}\]

where $n_i$ is the number of input neurons connected to the output neuron at index $i$.

References

  • S. Lapuschkin et al., Unmasking Clever Hans predictors and assessing what machines really learn
source
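For a dense layer, where every output neuron is connected to all inputs, the flat rule reduces to a uniform split. A hypothetical one-line Python sketch:

```python
# Hypothetical sketch of the flat rule for a dense layer: every output
# neuron spreads its relevance uniformly over its n_inputs inputs.
def lrp_flat(n_inputs, R):
    return [sum(Ri / n_inputs for Ri in R) for _ in range(n_inputs)]

print(lrp_flat(4, [1.0, 1.0]))  # each input receives 2.0 / 4 = 0.5
```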
ExplainableAI.AlphaBetaRuleType
AlphaBetaRule([alpha=2.0, beta=1.0])

LRP-$αβ$ rule. Weights positive and negative contributions according to the parameters alpha and beta respectively. The difference $α-β$ must be equal to one. Commonly used on lower layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\left( \alpha\frac{\left(W_{ij}a_j^k\right)^+}{\sum_l\left(W_{il}a_l^k+b_i\right)^+} -\beta\frac{\left(W_{ij}a_j^k\right)^-}{\sum_l\left(W_{il}a_l^k+b_i\right)^-} -\right) R_i^{k+1}\]

Optional arguments

  • alpha: Multiplier for the positive output term, defaults to 2.0.
  • beta: Multiplier for the negative output term, defaults to 1.0.

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
  • G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
source
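A hypothetical pure-Python sketch of the αβ-rule for a dense layer with zero bias (guards against empty positive or negative parts are added for illustration; this is not the package's implementation):

```python
# Hypothetical sketch of the alpha-beta rule: positive and negative
# contributions W_ij * a_j are normalized and weighted separately.
def lrp_alphabeta(W, a, R, alpha=2.0, beta=1.0):
    n_out, n_in = len(W), len(a)
    out = [0.0] * n_in
    for i in range(n_out):
        zp = sum(max(W[i][l] * a[l], 0.0) for l in range(n_in))  # sum of positive parts
        zn = sum(min(W[i][l] * a[l], 0.0) for l in range(n_in))  # sum of negative parts
        for j in range(n_in):
            c = W[i][j] * a[j]
            if zp != 0.0:
                out[j] += alpha * max(c, 0.0) / zp * R[i]
            if zn != 0.0:
                out[j] += -beta * min(c, 0.0) / zn * R[i]
    return out

print(lrp_alphabeta([[1.0, -1.0]], [1.0, 0.5], [1.0]))
```

With the default α=2, β=1 (so α−β=1), the total relevance is conserved whenever both parts are nonzero.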
ExplainableAI.ZPlusRuleType
ZPlusRule()

LRP-$z⁺$ rule. Commonly used on lower layers.

Equivalent to AlphaBetaRule(1.0f0, 0.0f0), but slightly faster. See also AlphaBetaRule.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{\left(W_{ij}a_j^k\right)^+}{\sum_l\left(W_{il}a_l^k+b_i\right)^+} R_i^{k+1}\]

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
  • G. Montavon et al., Explaining Nonlinear Classification Decisions with Deep Taylor Decomposition
source
ExplainableAI.ZBoxRuleType
ZBoxRule(low, high)

LRP-$zᴮ$-rule. Commonly used on the first layer for pixel input.

The parameters low and high should be set to the lower and upper bounds of the input features, e.g. 0.0 and 1.0 for raw image data. It is also possible to provide two arrays that match the input size.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k=\sum_i \frac{W_{ij}a_j^k - W_{ij}^{+}l_j - W_{ij}^{-}h_j} - {\sum_l W_{il}a_l^k+b_i - \left(W_{il}^{+}l_l+b_i^{+}\right) - \left(W_{il}^{-}h_l+b_i^{-}\right)} R_i^{k+1}\]

References

  • G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
source
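A hypothetical Python sketch of the zᴮ-rule for a dense layer with zero bias, using per-feature box bounds `low` and `high` (illustrative only; names are made up):

```python
# Hypothetical sketch of the z^B rule: each input contribution W_ij * a_j
# is offset by W^+ * low and W^- * high, with W^+ = max(W, 0), W^- = min(W, 0).
def lrp_zbox(W, a, R, low, high):
    n_out, n_in = len(W), len(a)
    out = [0.0] * n_in
    for i in range(n_out):
        z = sum(W[i][l] * a[l]
                - max(W[i][l], 0.0) * low[l]
                - min(W[i][l], 0.0) * high[l] for l in range(n_in))
        for j in range(n_in):
            num = (W[i][j] * a[j]
                   - max(W[i][j], 0.0) * low[j]
                   - min(W[i][j], 0.0) * high[j])
            out[j] += num / z * R[i]
    return out

print(lrp_zbox([[1.0, -1.0]], [0.5, 0.5], [1.0], low=[0.0, 0.0], high=[1.0, 1.0]))
```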
ExplainableAI.PassRuleType
PassRule()

Pass-through rule. Passes relevance through to the lower layer.

Supports layers with constant input and output shapes, e.g. reshaping layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = R_j^{k+1}\]

source
ExplainableAI.GeneralizedGammaRuleType
GeneralizedGammaRule([gamma=0.25])

Generalized LRP-$γ$ rule. Can be used on layers with leakyrelu activation functions.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac +\right) R_i^{k+1}\]

Optional arguments

  • alpha: Multiplier for the positive output term, defaults to 2.0.
  • beta: Multiplier for the negative output term, defaults to 1.0.

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
  • G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
source
ExplainableAI.ZPlusRuleType
ZPlusRule()

LRP-$z⁺$ rule. Commonly used on lower layers.

Equivalent to AlphaBetaRule(1.0f0, 0.0f0), but slightly faster. See also AlphaBetaRule.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac{\left(W_{ij}a_j^k\right)^+}{\sum_l\left(W_{il}a_l^k+b_i\right)^+} R_i^{k+1}\]

References

  • S. Bach et al., On Pixel-Wise Explanations for Non-Linear Classifier Decisions by Layer-Wise Relevance Propagation
  • G. Montavon et al., Explaining Nonlinear Classification Decisions with Deep Taylor Decomposition
source
ExplainableAI.ZBoxRuleType
ZBoxRule(low, high)

LRP-$zᴮ$-rule. Commonly used on the first layer for pixel input.

The parameters low and high should be set to the lower and upper bounds of the input features, e.g. 0.0 and 1.0 for raw image data. It is also possible to provide two arrays that match the input size.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k=\sum_i \frac{W_{ij}a_j^k - W_{ij}^{+}l_j - W_{ij}^{-}h_j} + {\sum_l W_{il}a_l^k+b_i - \left(W_{il}^{+}l_l+b_i^{+}\right) - \left(W_{il}^{-}h_l+b_i^{-}\right)} R_i^{k+1}\]

References

  • G. Montavon et al., Layer-Wise Relevance Propagation: An Overview
source
ExplainableAI.PassRuleType
PassRule()

Pass-through rule. Passes relevance through to the lower layer.

Supports layers with constant input and output shapes, e.g. reshaping layers.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = R_j^{k+1}\]

source
ExplainableAI.GeneralizedGammaRuleType
GeneralizedGammaRule([gamma=0.25])

Generalized LRP-$γ$ rule. Can be used on layers with leakyrelu activation functions.

Definition

Propagates relevance $R^{k+1}$ at layer output to $R^k$ at layer input according to

\[R_j^k = \sum_i\frac {(W_{ij}+\gamma W_{ij}^+)a_j^+ +(W_{ij}+\gamma W_{ij}^-)a_j^-} {\sum_l(W_{il}+\gamma W_{il}^+)a_j^+ +(W_{il}+\gamma W_{il}^-)a_j^- +(b_i+\gamma b_i^+)} I(z_k>0) \cdot R^{k+1}_i +\sum_i\frac {(W_{ij}+\gamma W_{ij}^-)a_j^+ +(W_{ij}+\gamma W_{ij}^+)a_j^-} {\sum_l(W_{il}+\gamma W_{il}^-)a_j^+ +(W_{il}+\gamma W_{il}^+)a_j^- +(b_i+\gamma b_i^-)} -I(z_k<0) \cdot R^{k+1}_i\]

Optional arguments

  • gamma: Optional multiplier for added positive weights, defaults to 0.25.

References

  • L. Andéol et al., Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization
source

For manual rule assignment, use ChainTuple and ParallelTuple, matching the model structure:

Composites

Applying composites

ExplainableAI.CompositeType
Composite(primitives...)
+I(z_k<0) \cdot R^{k+1}_i\]

Optional arguments

  • gamma: Optional multiplier for added positive weights, defaults to 0.25.

References

  • L. Andéol et al., Learning Domain Invariant Representations by Joint Wasserstein Distance Minimization
source

For manual rule assignment, use ChainTuple and ParallelTuple, matching the model structure:

Composites

Applying composites

ExplainableAI.CompositeType
Composite(primitives...)
 Composite(default_rule, primitives...)

Automatically constructs a list of LRP-rules by sequentially applying composite primitives.

Primitives

To apply a single rule, use:

To apply a set of rules to layers based on their type, use:

Example

Using a VGG11 model:

julia> composite = Composite(
            GlobalTypeMap(
                ConvLayer => AlphaBetaRule(),
@@ -44,7 +44,7 @@
   Dense(4096 => 4096, relu)             => EpsilonRule{Float32}(1.0f-6),
   Dropout(0.5)                          => PassRule(),
   Dense(4096 => 1000)                   => EpsilonRule{Float32}(1.0f-6),
-)
source

Composite primitives

Mapping layers to rules

Composite primitives that apply a single rule:

ExplainableAI.LayerMapType
LayerMap(index, rule)

Composite primitive that maps an LRP-rule to all layers in the model at the given index. The index can either be an integer or a tuple of integers to map a rule to a specific layer in nested Flux Chains.

See show_layer_indices to print layer indices and Composite for an example.

source
ExplainableAI.RangeMapType
RangeMap(range, rule)

Composite primitive that maps an LRP-rule to the specified positional range of layers in the model.

See Composite for an example.

source

To apply LayerMap to nested Flux Chains or Parallel layers, make use of show_layer_indices:

Mapping layers to rules based on type

Composite primitives that apply rules based on the layer type:

ExplainableAI.RangeTypeMapType
RangeTypeMap(range, map)

Composite primitive that maps layer types to LRP rules based on a list of type-rule-pairs map within the specified range of layers in the model.

See Composite for an example.

source
ExplainableAI.FirstNTypeMapType
FirstNTypeMap(n, map)

Composite primitive that maps layer types to LRP rules based on a list of type-rule-pairs map within the first n layers in the model.

See Composite for an example.

source

Union types for composites

The following exported union types can be used to define TypeMaps:

Composite presets

ExplainableAI.EpsilonGammaBoxFunction
EpsilonGammaBox(low, high; [epsilon=1.0f-6, gamma=0.25f0])

Composite using the following primitives:

julia> EpsilonGammaBox(-3.0f0, 3.0f0)
+)
source

Composite primitives

Mapping layers to rules

Composite primitives that apply a single rule:

ExplainableAI.LayerMapType
LayerMap(index, rule)

Composite primitive that maps an LRP-rule to all layers in the model at the given index. The index can either be an integer or a tuple of integers to map a rule to a specific layer in nested Flux Chains.

See show_layer_indices to print layer indices and Composite for an example.

source
ExplainableAI.RangeMapType
RangeMap(range, rule)

Composite primitive that maps an LRP-rule to the specified positional range of layers in the model.

See Composite for an example.

source

To apply LayerMap to nested Flux Chains or Parallel layers, make use of show_layer_indices:

Mapping layers to rules based on type

Composite primitives that apply rules based on the layer type:

ExplainableAI.RangeTypeMapType
RangeTypeMap(range, map)

Composite primitive that maps layer types to LRP rules based on a list of type-rule-pairs map within the specified range of layers in the model.

See Composite for an example.

source
ExplainableAI.FirstNTypeMapType
FirstNTypeMap(n, map)

Composite primitive that maps layer types to LRP rules based on a list of type-rule-pairs map within the first n layers in the model.

See Composite for an example.

source

Union types for composites

The following exported union types can be used to define TypeMaps:

Composite presets

ExplainableAI.EpsilonGammaBoxFunction
EpsilonGammaBox(low, high; [epsilon=1.0f-6, gamma=0.25f0])

Composite using the following primitives:

julia> EpsilonGammaBox(-3.0f0, 3.0f0)
 Composite(
   GlobalTypeMap(  # all layers
     Flux.Conv               => ExplainableAI.GammaRule{Float32}(0.25f0),
@@ -64,7 +64,7 @@
     Flux.ConvTranspose => ExplainableAI.ZBoxRule{Float32}(-3.0f0, 3.0f0),
     Flux.CrossCor      => ExplainableAI.ZBoxRule{Float32}(-3.0f0, 3.0f0),
  ),
-)
source
ExplainableAI.EpsilonPlusFunction
EpsilonPlus(; [epsilon=1.0f-6])

Composite using the following primitives:

julia> EpsilonPlus()
 Composite(
   GlobalTypeMap(  # all layers
     Flux.Conv               => ExplainableAI.ZPlusRule(),
@@ -79,7 +79,7 @@
     typeof(MLUtils.flatten) => ExplainableAI.PassRule(),
     typeof(identity)        => ExplainableAI.PassRule(),
  ),
-)
source
ExplainableAI.EpsilonAlpha2Beta1Function
EpsilonAlpha2Beta1(; [epsilon=1.0f-6])

Composite using the following primitives:

julia> EpsilonAlpha2Beta1()
 Composite(
   GlobalTypeMap(  # all layers
     Flux.Conv               => ExplainableAI.AlphaBetaRule{Float32}(2.0f0, 1.0f0),
@@ -94,7 +94,7 @@
     typeof(MLUtils.flatten) => ExplainableAI.PassRule(),
     typeof(identity)        => ExplainableAI.PassRule(),
  ),
-)
source
ExplainableAI.EpsilonPlusFlatFunction
EpsilonPlusFlat(; [epsilon=1.0f-6])

Composite using the following primitives:

julia> EpsilonPlusFlat()
 Composite(
   GlobalTypeMap(  # all layers
     Flux.Conv               => ExplainableAI.ZPlusRule(),
@@ -115,7 +115,7 @@
     Flux.CrossCor      => ExplainableAI.FlatRule(),
     Flux.Dense         => ExplainableAI.FlatRule(),
  ),
-)
source
ExplainableAI.EpsilonAlpha2Beta1FlatFunction
EpsilonAlpha2Beta1Flat(; [epsilon=1.0f-6])

Composite using the following primitives:

julia> EpsilonAlpha2Beta1Flat()
 Composite(
   GlobalTypeMap(  # all layers
     Flux.Conv               => ExplainableAI.AlphaBetaRule{Float32}(2.0f0, 1.0f0),
@@ -136,7 +136,7 @@
     Flux.CrossCor      => ExplainableAI.FlatRule(),
     Flux.Dense         => ExplainableAI.FlatRule(),
  ),
-)
source

Custom rules

These utilities can be used to define custom rules without writing boilerplate code. To extend these functions, explicitly import them:

ExplainableAI.modify_parametersFunction
modify_parameters(rule, parameter)

Modify parameters before computing the relevance.

Note

Use of a custom function modify_layer will overwrite functionality of modify_parameters, modify_weight and modify_bias for the implemented combination of rule and layer types. This is due to the fact that internally, modify_weight and modify_bias are called by the default implementation of modify_layer. modify_weight and modify_bias in turn call modify_parameters by default.

The default call structure looks as follows:

┌─────────────────────────────────────────┐
+)
source

Custom rules

These utilities can be used to define custom rules without writing boilerplate code. To extend these functions, explicitly import them:

ExplainableAI.modify_parametersFunction
modify_parameters(rule, parameter)

Modify parameters before computing the relevance.

Note

Use of a custom function modify_layer will overwrite functionality of modify_parameters, modify_weight and modify_bias for the implemented combination of rule and layer types. This is due to the fact that internally, modify_weight and modify_bias are called by the default implementation of modify_layer. modify_weight and modify_bias in turn call modify_parameters by default.

The default call structure looks as follows:

┌─────────────────────────────────────────┐
 │              modify_layer               │
 └─────────┬─────────────────────┬─────────┘
           │ calls               │ calls
@@ -146,7 +146,7 @@
           │ calls               │ calls
 ┌─────────▼─────────┐ ┌─────────▼─────────┐
 │ modify_parameters │ │ modify_parameters │
-└───────────────────┘ └───────────────────┘
source
ExplainableAI.modify_weightFunction
modify_weight(rule, weight)

Modify layer weights before computing the relevance.

Note

Use of a custom function modify_layer will overwrite functionality of modify_parameters, modify_weight and modify_bias for the implemented combination of rule and layer types. This is due to the fact that internally, modify_weight and modify_bias are called by the default implementation of modify_layer. modify_weight and modify_bias in turn call modify_parameters by default.

The default call structure looks as follows:

┌─────────────────────────────────────────┐
+└───────────────────┘ └───────────────────┘
source
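The call diagram above can be mimicked in a hypothetical Python sketch (the real package dispatches on Julia rule and layer types; this only mirrors the fallback chain): overriding `modify_parameters` alone affects both weights and bias, because `modify_weight` and `modify_bias` fall back to it by default.

```python
# Hypothetical sketch of the default call structure:
# modify_layer -> modify_weight / modify_bias -> modify_parameters.
def modify_parameters(rule, p):
    return p  # default: leave parameters unchanged

def modify_weight(rule, weight):
    return modify_parameters(rule, weight)

def modify_bias(rule, bias):
    return modify_parameters(rule, bias)

def modify_layer(rule, layer):
    weight, bias = layer  # layer modeled as a (weight, bias) pair
    return (modify_weight(rule, weight), modify_bias(rule, bias))

print(modify_layer("SomeRule", ([1.0, -2.0], [0.5])))
```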
ExplainableAI.modify_weightFunction
modify_weight(rule, weight)

Modify layer weights before computing the relevance.

Note

Implementing a custom modify_layer function overrides modify_parameters, modify_weight and modify_bias for the given combination of rule and layer types. This is because the default implementation of modify_layer calls modify_weight and modify_bias, which in turn call modify_parameters by default.

The default call structure looks as follows:

┌─────────────────────────────────────────┐
│              modify_layer               │
└─────────┬─────────────────────┬─────────┘
          │ calls               │ calls
┌─────────▼─────────┐ ┌─────────▼─────────┐
│   modify_weight   │ │    modify_bias    │
└─────────┬─────────┘ └─────────┬─────────┘
          │ calls               │ calls
┌─────────▼─────────┐ ┌─────────▼─────────┐
│ modify_parameters │ │ modify_parameters │
└───────────────────┘ └───────────────────┘
source
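A rule that should only touch weights can extend modify_weight directly, leaving biases to the defaults. A sketch under stated assumptions (MyWPlusRule is a hypothetical rule type; keeping only positive weights mirrors the spirit of the z⁺ rule):

```julia
using ExplainableAI
using Flux: relu

struct MyWPlusRule <: ExplainableAI.AbstractLRPRule end

# Keep only the positive part of the weights; biases remain unmodified.
ExplainableAI.modify_weight(::MyWPlusRule, w) = relu.(w)
```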
ExplainableAI.modify_biasFunction
modify_bias(rule, bias)

Modify layer bias before computing the relevance.

Note

Implementing a custom modify_layer function overrides modify_parameters, modify_weight and modify_bias for the given combination of rule and layer types. This is because the default implementation of modify_layer calls modify_weight and modify_bias, which in turn call modify_parameters by default.

The default call structure looks as follows:

┌─────────────────────────────────────────┐
│              modify_layer               │
└─────────┬─────────────────────┬─────────┘
          │ calls               │ calls
┌─────────▼─────────┐ ┌─────────▼─────────┐
│   modify_weight   │ │    modify_bias    │
└─────────┬─────────┘ └─────────┬─────────┘
          │ calls               │ calls
┌─────────▼─────────┐ ┌─────────▼─────────┐
│ modify_parameters │ │ modify_parameters │
└───────────────────┘ └───────────────────┘
source
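Many LRP rule variants ignore the bias term entirely. As an illustrative sketch (MyNoBiasRule is a hypothetical rule type), such a rule could zero the bias while leaving weights unchanged:

```julia
using ExplainableAI

struct MyNoBiasRule <: ExplainableAI.AbstractLRPRule end

# Drop the bias contribution before relevance computation.
ExplainableAI.modify_bias(::MyNoBiasRule, b) = zero(b)
```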
ExplainableAI.modify_layerFunction
modify_layer(rule, layer)

Modify layer before computing the relevance.

Note

Implementing a custom modify_layer function overrides modify_parameters, modify_weight and modify_bias for the given combination of rule and layer types. This is because the default implementation of modify_layer calls modify_weight and modify_bias, which in turn call modify_parameters by default.

The default call structure looks as follows:

┌─────────────────────────────────────────┐
│              modify_layer               │
└─────────┬─────────────────────┬─────────┘
          │ calls               │ calls
┌─────────▼─────────┐ ┌─────────▼─────────┐
│   modify_weight   │ │    modify_bias    │
└─────────┬─────────┘ └─────────┬─────────┘
          │ calls               │ calls
┌─────────▼─────────┐ ┌─────────▼─────────┐
│ modify_parameters │ │ modify_parameters │
└───────────────────┘ └───────────────────┘
source
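When a rule needs to replace a layer wholesale rather than tweak its parameters, modify_layer can be extended directly, bypassing modify_weight and modify_bias entirely. A hedged sketch (MyFlatRule is a hypothetical rule type; the uniform-weight Dense reconstruction is an illustrative assumption):

```julia
using ExplainableAI
using Flux

struct MyFlatRule <: ExplainableAI.AbstractLRPRule end

# Replace a Dense layer by one with uniform weights and no bias,
# so relevance is spread evenly over the inputs.
function ExplainableAI.modify_layer(::MyFlatRule, layer::Dense)
    w = layer.weight
    return Dense(ones(eltype(w), size(w)), false, layer.σ)
end
```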

Compatibility settings:

ExplainableAI.LRP_CONFIG.supports_layerFunction
LRP_CONFIG.supports_layer(layer)

Check whether LRP can be used on a layer or a Chain. To extend LRP to your own layers, define:

LRP_CONFIG.supports_layer(::MyLayer) = true          # for structs
LRP_CONFIG.supports_layer(::typeof(mylayer)) = true  # for functions
source
ExplainableAI.LRP_CONFIG.supports_activationFunction
LRP_CONFIG.supports_activation(σ)

Check whether LRP can be used on a given activation function. To extend LRP to your own activation functions, define:

LRP_CONFIG.supports_activation(::typeof(myactivation)) = true  # for functions
LRP_CONFIG.supports_activation(::MyActivation) = true          # for structs
source
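Putting the two compatibility settings together, registering a custom layer and activation function might look as follows (MyDoublingLayer and myrelu are hypothetical names used only for illustration):

```julia
using ExplainableAI

# Hypothetical custom layer and activation function.
struct MyDoublingLayer end
(l::MyDoublingLayer)(x) = 2 .* x

myrelu(x) = max.(0, x)

# Mark both as LRP-compatible so the model checks pass:
ExplainableAI.LRP_CONFIG.supports_layer(::MyDoublingLayer) = true
ExplainableAI.LRP_CONFIG.supports_activation(::typeof(myrelu)) = true
```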

Index

diff --git a/dev/lrp/developer/index.html b/dev/lrp/developer/index.html
index 28a59c92..e934639a 100644
--- a/dev/lrp/developer/index.html
+++ b/dev/lrp/developer/index.html
@@ -35,4 +35,4 @@
    @tullio Rᵏ[j, b] = layer.weight[i, j] * ãᵏ[j, b] / z[i, b] * Rᵏ⁺¹[i, b]
end

For maximum low-level control beyond modify_input and modify_denominator, you can also implement your own lrp! function and dispatch on individual rule types MyRule and layer types MyLayer:

function lrp!(Rᵏ, rule::MyRule, layer::MyLayer, modified_layer, aᵏ, Rᵏ⁺¹)
     Rᵏ .= ...
end
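As an illustrative instance of this dispatch pattern (MyPassthroughRule is a hypothetical rule type, not part of the library), a rule that simply passes relevance through unchanged could be written as:

```julia
using ExplainableAI

struct MyPassthroughRule <: ExplainableAI.AbstractLRPRule end

# Pass relevance through unchanged, e.g. for shape-preserving
# element-wise layers where no redistribution is needed.
function ExplainableAI.lrp!(Rᵏ, ::MyPassthroughRule, layer, modified_layer, aᵏ, Rᵏ⁺¹)
    Rᵏ .= Rᵏ⁺¹
end
```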
diff --git a/dev/search/index.html b/dev/search/index.html
index d5063908..5fd826b7 100644
--- a/dev/search/index.html
+++ b/dev/search/index.html
@@ -1,2 +1,2 @@
-Search · ExplainableAI.jl

Loading search...
