Feature/mask NaNs in training loss function #72
base: develop
Conversation
The loss function code is being altered in #70 to enable flexible configuration of loss functions.
Changes
- Revert pop if fails
This has to be merged after #137
- Allow limiting of scalars rather than turning off
- Allow indexing on the scalar arbitrarily
After discussing with @sahahner and a quick test with an ocean dataset, I approve the PR.
Approve after testing.
Variables with missing values that are imputed by the imputer should not be considered in the loss.
Solves issue 271
Describe the solution
Pass the product of the imputer NaN mask(s) to the loss in the first forward pass, and multiply the loss contributions of imputed grid points by zero.
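A minimal sketch of this masking step, using NumPy and a hypothetical `masked_mse` helper (the names and the MSE choice are illustrative, not the PR's actual implementation):

```python
import numpy as np

def masked_mse(pred: np.ndarray, target: np.ndarray, nan_mask: np.ndarray) -> float:
    """Hypothetical sketch: zero out loss contributions at imputed points.

    nan_mask is 1.0 where the original target was valid and 0.0 where it
    contained a NaN that was later filled in by the imputer.
    """
    # Multiply the per-point squared error by the mask, so imputed
    # grid points contribute exactly zero to the loss.
    sq_err = (pred - target) ** 2 * nan_mask
    # Normalise by the number of valid points, so masked entries do not
    # dilute the mean.
    return float(sq_err.sum() / nan_mask.sum())
```

For example, a point whose target was NaN before imputation no longer affects the loss, however far the prediction is from the imputed fill value.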
If the remapper is used, also apply it to the NaN mask so that the mask matches the remapped variable layout.
The NaN masks are prepared in the imputer. The remapper contains a new function to remap the NaN mask.
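Conceptually, the remapper can reuse the same index map for the mask that it applies to the data. A hedged sketch (the `remap_mask` name and index-based remapping are assumptions for illustration; the actual remapper in anemoi-models may work differently):

```python
import numpy as np

def remap_mask(nan_mask: np.ndarray, index_map: np.ndarray) -> np.ndarray:
    """Hypothetical sketch: remap the NaN mask along the variable axis.

    index_map is the same permutation/selection of variable indices that
    the remapper applies to the data tensors, so mask and data stay aligned.
    """
    return nan_mask[..., index_map]
```

This keeps each mask entry attached to the variable it was computed for, so the loss masking still zeroes out the correct grid points after remapping.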
These changes are part of anemoi-model, PR #56
Attention
This changes the default behaviour when using variables that contain NaN values that are imputed.