Bayes nn #5
Conversation
How far along are you with this? Should I review it or is it more of a work in progress? Also, please add a description to your PR.

Dear Alexx,
There is more work to be added.
I will inform you at which point to review.
Best regards,
Leonid Didukh

This is a preliminary set of review comments. It seems that this document still needs some work.
@@ -0,0 +1,318 @@
# Bayesian Neural Network

The usual Neural Network are optimized in way to get fixed value of weights and biases that allows the model perform specific task successfully. Instead in

"Usually, neural networks are optimized in order to get a fixed value for the weights and biases which allow the model perform a specific task successfully."
Bayesian Neural Network the weights and biases are the distribution, this type of model could be treated as a ensemble of many neural networks trained by the Bayesian inference.

"In a Bayesian neural network the weights and biases are distributed rather than fixed. This type of model could be treated as an ensemble of many neural networks, train using a Bayesian inference."
Bayesian approach for the neural networks allows to estimate the uncertainty and make the decision of the model more robust according to the input data.

"Using a Bayesian approach for the neural network training allows the analyzer to estimate the uncertainty and to make the decision of the model more robust against the input data."
### Training of NN and BNN
=== "NN"

Since the title is in the images, is this notation really needed?
![Placeholder](../images/bayes_nn/trainingNN.png)
The parameters ![formula](https://render.githubusercontent.com/render/math?math=\theta ) are optimized in order to minimaze the loss function

The formula after "parameters" isn't being rendered.
"minimaze" -> "minimize"
Put a "." at the end of the sentence.
=== "Pyro"

### Distribution and sampling

Duplicate section?
Let's consider simple linear regression as an example and compare it to the bayesian analog.

Capitalize the word Bayesian.
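
As an illustration of that comparison (a sketch under assumed priors and toy data, not the documentation's exact example), a Bayesian linear regression in Pyro where slope, intercept and noise scale get posteriors instead of point estimates:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoDiagonalNormal

# Bayesian analogue of y = w*x + b: w, b and the noise scale have priors,
# and SVI fits an approximate posterior instead of point values.
def model(x, y=None):
    w = pyro.sample("w", dist.Normal(0., 1.))
    b = pyro.sample("b", dist.Normal(0., 1.))
    sigma = pyro.sample("sigma", dist.HalfNormal(1.))
    with pyro.plate("data", x.shape[0]):
        pyro.sample("obs", dist.Normal(w * x + b, sigma), obs=y)

guide = AutoDiagonalNormal(model)
svi = SVI(model, guide, pyro.optim.Adam({"lr": 0.01}), loss=Trace_ELBO())

x = torch.linspace(0, 1, 50)
y = 2.0 * x + 0.3 + 0.05 * torch.randn(50)   # toy data
for step in range(1000):
    svi.step(x, y)
```

An ordinary least-squares fit would return single values for w and b; here the guide ends up holding an approximate posterior over them.
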
## Variational Autoencoder

The generative models could be build using the bayesian neural network.

"Generative models can be built using a Bayesian neural network. The variational autoencoder is one popular way to forma generative model."
The generating process consist of two steps:

1. Samling the latent variable from prior distribution

"Samling" -> "Sampling"
### Loss

Once we define the procedure for the generation process the Objective function should be chosen for the optimization process. In order to train the network, we maximize the ELBO (Evidence Lower Bound) objective.

"Objective" -> "objective"
@bdtmnk Can you please update this PR to fix the conflict and apply the requested changes?

…iew comments from PR cms-ml#5.
Closing this PR for lack of progress. While this PR is being closed, we feel that documenting Bayesian Neural Networks is important. Therefore the work has been rebased in PR #30 and work will continue there.