diff --git a/collections/_tutorials/2022-05-31-jean-zay-ultimate-guide.md b/collections/_tutorials/2022-05-31-jean-zay-ultimate-guide.md
index a2c9f85d..11f9a26e 100644
--- a/collections/_tutorials/2022-05-31-jean-zay-ultimate-guide.md
+++ b/collections/_tutorials/2022-05-31-jean-zay-ultimate-guide.md
@@ -6,7 +6,7 @@ date: 2022-05-31
 categories: JeanZay, Guide
 ---
 
-(Updated on 18 May 2024)
+(Updated on 11 July 2024)
 
 - [**Introduction**](#introduction)
 - [**Jean Zay account application**](#jean-zay-account-application)
@@ -81,6 +81,9 @@ the supercomputer.
 > ⚠️ Don't forget to validate your information by clicking on click on *"Valider la saisie des informations"*!
 > computing-account-validate
 
+> ⚠️ If, after validating your information, eDARI asks for a non-electronic signature, ask the secretariat (if possible) to add you to the Reseda database.
+> Once this has been done, wait a day or so, then cancel your account opening request and open a new one. You should then be able to sign electronically.
+
 After submitting the application form, you will soon receive an email to fill another online questionnaire and upload your CV.
 Congratulations! You have finally completed all the administrative procedures to request your access to Jean Zay.
 Now all you have to do is wait! 😄 Yes, you will have to wait about 1-2 months before you receive your username and password.
diff --git a/collections/_tutorials/2023-05-09-tutorial-score-based-models.md b/collections/_tutorials/2023-05-09-tutorial-score-based-models.md
index 1cf12850..36bc742b 100644
--- a/collections/_tutorials/2023-05-09-tutorial-score-based-models.md
+++ b/collections/_tutorials/2023-05-09-tutorial-score-based-models.md
@@ -33,7 +33,7 @@ categories: score-based models
 ## **Introduction**
 
 The main existing generative models can be divided in two categories :
-* **likelihood-based models**, which goal is to learn directly the probability density function. Examples of these models are autoregressive models, [normalizing flow](http://127.0.0.1:4000/tutorials/2023-01-05-tutorial_normalizing_flow.html) or [variational auto-encoders](http://127.0.0.1:4000/tutorials/2022-09-12-tutorial-vae.html).
+* **likelihood-based models**, whose goal is to learn the probability density function directly. Examples of these models are autoregressive models, [normalizing flows](https://creatis-myriad.github.io/tutorials/2023-01-05-tutorial_normalizing_flow.html) or [variational auto-encoders](https://creatis-myriad.github.io/tutorials/2022-09-12-tutorial-vae.html).
 * **implicit generative models**, for which the density distribution is implicitly learnt by the model during sampling process. This is typically GANs, which have dominated the field of image generation during several years.
 
 Such models each have their specific limitations. Likelihood-based models either have strong restrictions on the model architecture to make sure the normalizing constant of the distribution is tractable and VAEs rely on a substitutes of the likelihood the training. GANs have been historically the state-of-the-art of deep learning generative models in terms of visual quality but they do not allow density estimation and rely on adversarial learning, which is known to be particularly unstable.