
EMD still very large and increases throughout training #5

Open
davezdeng8 opened this issue Apr 6, 2020 · 2 comments

davezdeng8 commented Apr 6, 2020

Hi,
Similar to some of the other issues posted, I'm getting a very large EMD. I divided by the number of points but still ended up with an EMD of around 26, versus a Chamfer Distance of around 0.1. I'm working with n = 22000 points.
In addition, if I use EMD as my loss and backpropagate, the loss increases during training, whereas it decreased with Chamfer Distance. Any advice?
Thanks!

davezdeng8 changed the title from "EMD still very large" to "EMD still very large and increases throughout training" on Apr 6, 2020
daerduoCarey (Owner) commented
It is definitely possible to get a small CD and a large EMD; please check the definitions for more details. For example, let S1 = {(1, 0, 0) × 1, (0, 0, 1) × 100} and S2 = {(1, 0, 0) × 100, (0, 0, 1) × 1}. The CD between S1 and S2 is 0, but the EMD is about 0.98.
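The gap is easy to reproduce: CD only asks that every point have *some* nearby neighbor in the other set, while EMD requires a one-to-one matching, so duplicated points are penalized. Below is a minimal sketch of both metrics on the example above, using SciPy's exact Hungarian solver in place of the repo's CUDA approximation (note that normalization and distance conventions vary between implementations, so the absolute EMD value may differ from the 0.98 quoted):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def chamfer(a, b):
    """Symmetric Chamfer Distance: mean nearest-neighbor distance, both directions."""
    d = cdist(a, b)  # pairwise Euclidean distances
    return d.min(axis=1).mean() + d.min(axis=0).mean()

def emd(a, b):
    """EMD for equal-size sets: cost of the optimal bijection, averaged per point."""
    d = cdist(a, b)
    rows, cols = linear_sum_assignment(d)  # optimal one-to-one matching
    return d[rows, cols].mean()

# The owner's example: same support, wildly different multiplicities.
S1 = np.array([[1, 0, 0]] * 1 + [[0, 0, 1]] * 100, dtype=float)
S2 = np.array([[1, 0, 0]] * 100 + [[0, 0, 1]] * 1, dtype=float)

print(chamfer(S1, S2))  # 0.0 — every point has an exact twin in the other set
print(emd(S1, S2))      # large — 99 points must each travel sqrt(2)
```

Only two zero-cost pairs exist, so 99 of the 101 matched pairs each pay a distance of √2, giving a per-point EMD of 99√2/101 ≈ 1.39 under this (Euclidean, mean-normalized) convention, while the CD stays exactly 0.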

davezdeng8 (Author) commented

Thanks for your response! Do you have any idea why EMD seems harder to optimize than Chamfer Distance? I've been trying to use EMD as a loss, but even with a very small learning rate the model still diverges, whereas with Chamfer Distance it converges extremely fast. I'm not too familiar with the approximation algorithm you're using, but could it be affecting the optimization landscape?
Thanks!
