
Bayes nn #5

Closed
wants to merge 3 commits into from

Conversation

@bdtmnk commented Jul 29, 2021

No description provided.

@aperloff (Contributor)

How far along are you with this? Should I review it or is it more of a work in progress? Also, please add a description to your PR.

@bdtmnk (Author) commented Jul 30, 2021 via email

@aperloff (Contributor) left a comment

This is a preliminary set of review comments. It seems that this document still needs some work.

@@ -0,0 +1,318 @@
# Bayesian Neural Network

The usual Neural Network are optimized in way to get fixed value of weights and biases that allows the model perform specific task successfully. Instead in

"Usually, neural networks are optimized in order to get a fixed value for the weights and biases which allow the model perform a specific task successfully."

# Bayesian Neural Network

The usual Neural Network are optimized in way to get fixed value of weights and biases that allows the model perform specific task successfully. Instead in
Bayesian Neural Network the weights and biases are the distribution, this type of model could be treated as a ensemble of many neural networks trained by the Bayesian inference.

"In a Bayesian neural network the weights and biases are distributed rather than fixed. This type of model could be treated as an ensemble of many neural networks, train using a Bayesian inference."

The usual Neural Network are optimized in way to get fixed value of weights and biases that allows the model perform specific task successfully. Instead in
Bayesian Neural Network the weights and biases are the distribution, this type of model could be treated as a ensemble of many neural networks trained by the Bayesian inference.

Bayesian approach for the neural networks allows to estimate the uncertainty and make the decision of the model more robust according to the input data.

"Using a Bayesian approach for the neural network training allows the analyzer to estimate the uncertainty and to make the decision of the model more robust against the input data."



### Training of NN and BNN
=== "NN"

Since the title is in the images, is this notation really needed?

### Training of NN and BNN
=== "NN"
![Placeholder](../images/bayes_nn/trainingNN.png)
The parameters ![formula](https://render.githubusercontent.com/render/math?math=\theta ) are optimized in order to minimaze the loss function

The formula after "parameters" isn't being rendered.

"minimaze" -> "minimize"

Put a "." at the end of the sentence.


=== "Pyro"

### Distribution and sampling

Duplicate section?




Let's consider simple linear regression as an example and compare it to the bayesian analog.

Capitalize the word Bayesian.
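
As a sketch of what such a comparison might look like in Pyro (a hypothetical one-dimensional toy setup with assumed priors, not code from this PR), the ordinary model keeps a fixed weight and bias while the Bayesian analog places prior distributions on them:

```python
import torch
import torch.nn as nn
import pyro
import pyro.distributions as dist
from pyro.nn import PyroModule, PyroSample

# Ordinary linear regression: a fixed weight and bias, fitted by minimizing a loss.
deterministic = nn.Linear(1, 1)

# Bayesian analog: the weight and bias are random variables with prior distributions.
class BayesianRegression(PyroModule):
    def __init__(self):
        super().__init__()
        self.linear = PyroModule[nn.Linear](1, 1)
        self.linear.weight = PyroSample(dist.Normal(0., 1.).expand([1, 1]).to_event(2))
        self.linear.bias = PyroSample(dist.Normal(0., 10.).expand([1]).to_event(1))

    def forward(self, x, y=None):
        sigma = pyro.sample("sigma", dist.Uniform(0., 10.))  # observation noise
        mean = self.linear(x).squeeze(-1)                     # samples weight and bias
        with pyro.plate("data", x.shape[0]):
            pyro.sample("obs", dist.Normal(mean, sigma), obs=y)
        return mean
```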


## Variational Autoencoder

The generative models could be build using the bayesian neural network.

"Generative models can be built using a Bayesian neural network. The variational autoencoder is one popular way to forma generative model."


The generating process consist of two steps:

1. Samling the latent variable from prior distribution

"Samling" -> "Sampling"


### Loss

Once we define the procedure for the generation process the Objective function should be chosen for the optimization process. In order to train the network, we maximize the ELBO (Evidence Lower Bound) objective.

"Objective" -> "objective"

@aperloff (Contributor)

@bdtmnk Can you please update this PR to fix the conflict and apply the requested changes?

aperloff pushed a commit to aperloff/CMSMLDocumentation that referenced this pull request Jul 26, 2022
@aperloff (Contributor)

Closing this PR for lack of progress. While this PR is being closed, we feel that documenting Bayesian Neural Networks is important. Therefore the work has been rebased in PR #30 and work will continue there.

@aperloff closed this Jul 26, 2022