
initialized buffer/gradInputs memory and added tests for dice #913

Open

robotsorcerer wants to merge 4 commits into master
Conversation

robotsorcerer

This includes the memory initialization for buffer and gradInput. It also clarifies the definition of |X|.
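For context, a minimal sketch of what lazy buffer/gradInput allocation in an nn criterion can look like; the class skeleton and field names here are illustrative and not necessarily identical to this PR's actual DICECriterion:

```lua
require 'nn'

-- Illustrative skeleton only; the real DICECriterion may differ in detail.
local DICECriterion, parent = torch.class('nn.DICECriterion', 'nn.Criterion')

function DICECriterion:__init()
   parent.__init(self)
end

function DICECriterion:updateOutput(input, target)
   -- Lazily allocate the working buffer with the input's tensor type and shape.
   self.buffer = self.buffer or input.new()
   self.buffer:resizeAs(input)
   -- ... compute the Dice overlap into self.output using self.buffer ...
   return self.output
end

function DICECriterion:updateGradInput(input, target)
   -- Same idea for gradInput: allocate once, then resize to match the input.
   self.gradInput = self.gradInput or input.new()
   self.gradInput:resizeAs(input)
   -- ... fill self.gradInput with the analytic gradient ...
   return self.gradInput
end
```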

return gradInput
end

function DICECriterion:accGradParameters(input, gradOutput)
Member

This is not needed, as this is a Criterion. Criterions don't have accGradParameters.
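For reference, a quick sketch of how criterions are exercised in torch/nn: the framework only ever calls forward/backward (which dispatch to updateOutput/updateGradInput), so an accGradParameters method on a criterion is dead code. The criterion used below is just an example:

```lua
require 'nn'

local crit   = nn.MSECriterion()   -- any nn.Criterion behaves the same way
local input  = torch.randn(4)
local target = torch.randn(4)

local loss      = crit:forward(input, target)    -- calls updateOutput
local gradInput = crit:backward(input, target)   -- calls updateGradInput
-- There is no crit:accGradParameters(...): criterions hold no trainable parameters.
print(loss, gradInput)
```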

@soumith
Member

soumith commented Aug 7, 2016

You need to add the criterion to init.lua, otherwise it won't be loaded.
Make sure the continuous build passes.
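Concretely, something like the following (exact placement among the existing require lines is a guess):

```lua
-- in nn/init.lua, alongside the other criterion requires
require('nn.DICECriterion')
```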

@robotsorcerer
Author

Is there anything else I need to do apart from adding require('nn.DICECriterion') to init.lua? I found that running `th -lnn -e "nn.test{'DICECriterion'}"` returned the following error:

/home/lex/torch/install/share/lua/5.1/torch/Tester.lua:508: Couldn't find test 'DICECriterion'
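A likely cause, sketched below: nn.test{'DICECriterion'} looks for a test function registered under that name in the nn package's test suite, so an entry has to be added to nn/test.lua as well; the test body here is only illustrative:

```lua
-- in nn/test.lua, next to the other criterion tests
function nntest.DICECriterion()
   local input  = torch.rand(2, 5)
   local target = torch.rand(2, 5):gt(0.5):double()   -- illustrative binary target
   local crit   = nn.DICECriterion()
   local err    = crit:forward(input, target)
   mytester:assert(err == err, 'DICECriterion forward returned NaN')
   -- usually followed by a finite-difference check of updateGradInput
end
```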

@soumith
Member

soumith commented Aug 23, 2016

Now the tests fail because your backward gradients don't match the gradients from finite differences. There's likely a bug in your gradient calculation.
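For debugging that, here is a rough sketch of a central finite-difference check for a criterion's gradient (the helper name and perturbation size are arbitrary choices, not part of nn's test utilities):

```lua
require 'nn'

-- Compare crit:backward against central finite differences of crit:forward.
-- Returns the largest absolute difference between the two gradients.
local function checkCriterionGrad(crit, input, target, eps)
   eps = eps or 1e-6
   local analytic = crit:backward(input, target):clone()
   local numeric  = input.new():resizeAs(input)
   local flatIn, flatNum = input:view(-1), numeric:view(-1)
   for i = 1, flatIn:nElement() do
      local orig = flatIn[i]
      flatIn[i] = orig + eps
      local fPlus = crit:forward(input, target)
      flatIn[i] = orig - eps
      local fMinus = crit:forward(input, target)
      flatIn[i] = orig
      flatNum[i] = (fPlus - fMinus) / (2 * eps)
   end
   return (analytic - numeric):abs():max()
end

-- Example (illustrative): print(checkCriterionGrad(nn.DICECriterion(), torch.rand(10), torch.rand(10):gt(0.5):double()))
```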

@robotsorcerer
Author

That's because I added a normalizing constant to the gradient formula's denominator to avoid division errors.

Maybe I should remove it?
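For concreteness, one common way such a constant enters (a generic convention, not necessarily the exact formula in this PR): with a smoothing constant s > 0, the forward pass and the analytic gradient must be derived from the same smoothed expression, otherwise a finite-difference check will report a mismatch.

```latex
% Unsmoothed Dice overlap and a smoothed variant with constant s > 0
D(X, Y)   = \frac{2\,|X \cap Y|}{|X| + |Y|}, \qquad
D_s(X, Y) = \frac{2\,|X \cap Y| + s}{|X| + |Y| + s}
```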


@soumith
Member

soumith commented Aug 24, 2016

Can you make the normalizing constant optional in the constructor, and disable it when running the tests?
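One possible shape for that, assuming the torch.class skeleton sketched earlier in the thread (argument name and default are illustrative):

```lua
-- Optional smoothing constant; omit it (or pass 0) to disable, e.g. in the tests.
function DICECriterion:__init(smooth)
   parent.__init(self)
   self.smooth = smooth or 0
end

-- Usage:
--   local crit = nn.DICECriterion()       -- smooth = 0, matches finite differences
--   local crit = nn.DICECriterion(1e-7)   -- smoothed, for training stability
```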

@robotsorcerer
Author

Sure


@robotsorcerer
Author

torch/install/share/lua/5.1/torch/Tester.lua:508: Couldn't find test 'DICECriterion'    

Was I supposed to do some tricks in Tester.lua as well?

It seems to be taking longer than usual to make this PR work with Travis and your unit tests. Feel free to close/delete the PR. I will leave my implementation on my fork for at least a month from now, in case you or anyone else can integrate it into the unit tests.

Cheers.

@tommy-qichang

Hi Lekan,
I just want to know whether this DICECriterion will be merged or not? I need it desperately.
And will this criterion support volumetric inputs?

@robotsorcerer
Author

It should merge. If I remember correctly, I ran it on volumetric input back in the summer.

@tommy-qichang

Hi,
I think your implementation has some problems, especially in the backpropagation part.

http://campar.in.tum.de/pub/milletari2016Vnet/milletari2016Vnet.pdf

This paper gives the gradient formula, which I think is correct.
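For reference, the Dice coefficient and its gradient as given in that paper, with p_i the predicted probabilities and g_i the binary ground truth:

```latex
D = \frac{2 \sum_i p_i g_i}{\sum_i p_i^2 + \sum_i g_i^2},
\qquad
\frac{\partial D}{\partial p_j}
  = 2 \, \frac{g_j \left(\sum_i p_i^2 + \sum_i g_i^2\right)
               - 2 p_j \sum_i p_i g_i}
              {\left(\sum_i p_i^2 + \sum_i g_i^2\right)^2}
```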

@robotsorcerer
Author

robotsorcerer commented Dec 10, 2016

Thanks for spotting this. I think I plugged the wrong equation into Wolfram Alpha to generate the gradients early on. Could you send a PR?

@tommy-qichang

I'm pretty busy right now. But I can send a PR maybe next January.
