Fix: Fitter exception message #41 (#42)

Merged · 5 commits · Apr 19, 2021

Conversation

@eslam69 eslam69 commented Mar 31, 2021

Fixing issue #41

@mstimberg mstimberg left a comment

Thanks for the PR, this obviously fixes the message, but... I just realized that the if statement has another issue (not your fault, of course): the error should not be raised if optimizer is None (a default Optimizer will be chosen in that case); the closing parenthesis comes too early. Mind fixing this issue as well? Thanks again!
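A hypothetical sketch of the kind of check being discussed (the repository's exact expression and error message may differ):

```python
# Hypothetical sketch, not the repository's exact code.
# With the closing parenthesis placed too early, `optimizer is None` falls
# outside the negation, so optimizer=None raises even though None should
# select a default Optimizer:
if not (isinstance(optimizer, Optimizer)) or optimizer is None:
    raise TypeError("optimizer must be an Optimizer instance or None")

# Intended form, accepting None as well:
if not (isinstance(optimizer, Optimizer) or optimizer is None):
    raise TypeError("optimizer must be an Optimizer instance or None")
```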

eslam69 commented Apr 2, 2021

No problem, I'll fix the parentheses.

eslam69 commented Apr 2, 2021

There is a test case that expects optimizer=None to raise a TypeError.
I also found that there is no handling for the None-optimizer case; here, optimizer.initialize() will fail:

```python
if self.optimizer is None or restart:
    if start_iteration is None:
        self.iteration = 0
    self.n_samples = optimizer.initialize(self.parameter_names,
                                          popsize=self._n_samples,
                                          rounds=n_rounds,
                                          **params)
self.optimizer = optimizer
```

Should I add an extra check before that last line:

```python
if optimizer is None and self.optimizer is not None:
    optimizer = self.optimizer
```

which would reuse the optimizer of the previous fit? Or should I fix the test case?

@mstimberg
> There is a test case that expects optimizer=None to raise a TypeError.
> I also found that there is no handling for the None-optimizer case; here, optimizer.initialize() will fail:

Apologies, I thought we handled optimizer=None by making it create a NevergradOptimizer automatically, but that is actually #5 and was never implemented :-/
So either we have to revert the parentheses change (or rather, we could remove the parentheses completely to make it less confusing), or we have to fix #5 as well and make optimizer=None create a NevergradOptimizer.
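For context, fixing #5 along these lines would amount to something like the following inside fit() (a minimal sketch under assumed surrounding code; NevergradOptimizer does exist in brian2modelfitting):

```python
from brian2modelfitting import NevergradOptimizer

if optimizer is None:
    # #5: fall back to a NevergradOptimizer with its default settings
    optimizer = NevergradOptimizer()
```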

eslam69 commented Apr 6, 2021

> So either we have to revert the parentheses change (or rather, we could remove the parentheses completely to make it less confusing), or we have to fix #5 as well and make optimizer=None create a NevergradOptimizer.

I'll try fixing #5 and adding the corresponding tests for it.

eslam69 commented Apr 7, 2021

GammaFactor() requires two positional arguments, delta and time, and I have some doubts about the default values I used.

@mstimberg
> GammaFactor() requires two positional arguments, delta and time, and I have some doubts about the default values I used.

Good point. The time needs to be the same as the total duration of the run; you should be able to get it from self.duration. The delta argument can change with your neuron model and the kind of problem you are solving, but I'd go for 2*ms as the default, which is what was used in the original publications.
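As a sketch, the default metric could then be constructed inside the fitter roughly like this (assuming brian2's ms unit and the fitter's self.duration attribute mentioned above):

```python
from brian2 import ms
from brian2modelfitting import GammaFactor

# Default spike-train metric as suggested: a 2*ms coincidence window,
# with `time` set to the total duration of the run (self.duration on the fitter).
metric = GammaFactor(delta=2*ms, time=self.duration)
```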

@mstimberg
Thanks for the changes, this all looks good to me now after a superficial review. I am still out of office at the moment, but I will have a final close look at this in about two weeks before merging it.

@mstimberg mstimberg left a comment

Everything looks good to me, thanks again for the contribution.

@mstimberg mstimberg merged commit 8a9d06e into brian-team:master Apr 19, 2021