color convergence on custom data #10
Hi @wileewang, how do you generate the samples? I'm having some trouble obtaining samples from the reverse process.
Hello, have you solved it? I am running into the same color problem.
Hi @zxyf1213, you are probably using the wrong function. If you have issues with the colors of the images, and you notice that the histogram is becoming Gaussian, it means that you are ADDING noise, i.e. you are using the forward process (q_sample). I do not remember exactly what I used, but try self.diffusion.p_sample_loop or a variation of it, since that applies the reverse process. Good luck.
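To illustrate the distinction made above, here is a minimal, self-contained sketch (hypothetical helper names, not the repository's actual API) of the forward process. Samples drawn this way drift toward a Gaussian histogram, which is the symptom described in this thread; generating images instead requires iterating the learned reverse (denoising) process, which in improved_diffusion is what p_sample_loop does.

```python
import numpy as np

def make_schedule(T=1000, beta_start=1e-4, beta_end=0.02):
    # Linear beta schedule and the cumulative product of (1 - beta_t),
    # as commonly used in DDPM-style diffusion models.
    betas = np.linspace(beta_start, beta_end, T)
    alphas_cumprod = np.cumprod(1.0 - betas)
    return betas, alphas_cumprod

def q_sample(x0, t, alphas_cumprod, noise):
    # FORWARD process: corrupts x0 toward pure noise.
    # At large t, the output is dominated by the noise term,
    # so pixel histograms approach N(0, 1).
    ac = alphas_cumprod[t]
    return np.sqrt(ac) * x0 + np.sqrt(1.0 - ac) * noise
```

Calling a function like this in a sampling script produces noise, not images; the fix suggested above is to call the reverse-process loop instead.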
@alezuech Thank you very much for your reply. I am training improved_diffusion on the Cityscapes dataset and I haven't changed the original code of the repository, so it isn't the case that I used the wrong sampling process. I don't know why there is a color problem when training a diffusion model on Cityscapes, and it has been bothering me for a long time. Do you have any opinion on it?
Have you resolved the problem? Thanks.
Thanks for your great work! When I train a DDPM on a dataset like Cityscapes, whose images are well known to share almost the same color/style, the colors among the generated samples are nevertheless quite diverse. More interestingly, when I adjust the U-Net to predict x_start, or when I increase the model capacity, the problem goes away. I would really appreciate any hints about this phenomenon.
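One plausible reading of the observation above: the epsilon (noise) and x_start parameterizations are algebraically equivalent via the forward-process equation x_t = sqrt(ac_t) * x0 + sqrt(1 - ac_t) * eps, but at late timesteps sqrt(ac_t) is tiny, so small epsilon errors get amplified when converted back into an x0 estimate, which can plausibly show up as global color drift. A sketch with hypothetical helper names (not the repository's code):

```python
import numpy as np

def x0_from_eps(x_t, eps_pred, ac_t):
    # Invert x_t = sqrt(ac_t)*x0 + sqrt(1-ac_t)*eps to get the
    # x_start estimate implied by an epsilon prediction.
    return (x_t - np.sqrt(1.0 - ac_t) * eps_pred) / np.sqrt(ac_t)

# Error amplification: a fixed epsilon error maps to a much larger
# x0 error when ac_t is small (late timesteps).
def x0_error_for_eps_error(eps_error, ac_t):
    return eps_error * np.sqrt(1.0 - ac_t) / np.sqrt(ac_t)
```

Predicting x_start directly sidesteps this amplification at late timesteps, which may be why that change (or extra model capacity) stabilizes the colors; this is a hypothesis consistent with the thread, not a confirmed explanation.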