Over-fitting with data generation #42
Hi,
thank you for sharing this code. This is more a general question than an issue.
Don't you think you risk over-fitting if you randomly generate masks on the same images, so that over successive iterations the net eventually sees the whole image?
I am just curious how you dealt with this problem.

Comments

Hi, it's a good question; it would be great to see other people's views. From your question: training with a different mask for each image in each iteration is a form of data augmentation, hence the network generalizes better when testing on a new image. I have used this option and my testing performance is around 0.95 SSIM. A comparison and a performance report would help us understand it better.

Recently I was dealing with another related problem. Now I am starting to think that one mask per image per epoch, plus some random augmentation, should be fine. I can close the issue, or if you prefer we can wait and see other opinions; I am curious what people say.

Let's wait and see what others say.
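For reference, the per-iteration masking discussed above can be sketched as follows. This is a minimal illustration using NumPy; the single-square hole, the `hole_size` parameter, and the function name are illustrative assumptions, not the repository's actual mask generator:

```python
import numpy as np

def random_mask(height, width, hole_size=32, rng=None):
    """Binary mask with one square hole at a random position.

    1.0 = kept pixel, 0.0 = masked-out pixel. The single-square shape
    and `hole_size` are illustrative assumptions, not the repo's
    actual mask generator.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = np.ones((height, width), dtype=np.float32)
    top = int(rng.integers(0, height - hole_size + 1))
    left = int(rng.integers(0, width - hole_size + 1))
    mask[top:top + hole_size, left:left + hole_size] = 0.0
    return mask

# Per-iteration masking: the same image is corrupted differently at every
# training step, which acts as augmentation over the mask distribution.
rng = np.random.default_rng(0)
image = np.random.default_rng(1).random((128, 128), dtype=np.float32)
for step in range(3):
    masked = image * random_mask(128, 128, rng=rng)
    # feed `masked` (and the mask itself) to the network here
```

Sampling a fresh mask per step means the network never trains against one fixed corruption of an image, while "one mask per image per epoch" simply draws from the same distribution less often; both expose the whole image over time, which is why the thread treats this as augmentation rather than leakage.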