Use `meta` correctly in `map_blocks` to prevent dask from passing a 0d array through the regridder #4598
Turns out I was wrong the other day when I told @trexfeathers that all this needed was working out, and that it didn't need deciding first...

Investigation

Area weighted regridding

```python
new_data = map_complete_blocks(
    src_cube, regrid, (src_y_dim, src_x_dim), meshgrid_x.shape
)
```
The return dtype is set within area weighted regridding using `np.promote_types(src_data.dtype, np.float16)` (a small NumPy illustration of this promotion follows the list below). We could:
- have `_regrid_area_weighted_rectilinear_src_and_grid__perform` calculate the dtype it wants and pass this as an argument to `_regrid_area_weighted_array`.
- tightly couple `_regrid_area_weighted_rectilinear_src_and_grid__perform` and `_regrid_area_weighted_array` by computing the promotion twice, and then passing it as a kwarg to `map_complete_blocks`.
- have `_regrid_area_weighted_array` catch a 0d input array as obviously from dask and return the correct type of array to satisfy it.
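For reference, a small self-contained NumPy illustration of the promotion rule mentioned above (no Iris code involved):

```python
import numpy as np

# np.promote_types returns the smallest dtype that can safely hold both
# inputs; promoting against float16 guarantees a floating-point result
# without needlessly widening float32 sources to float64.
print(np.promote_types(np.float32, np.float16))  # float32
print(np.promote_types(np.float64, np.float16))  # float64
print(np.promote_types(np.int16, np.float16))    # float32 (int16 does not fit exactly in float16)
```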
`_regrid`

iris/lib/iris/analysis/_regrid.py, lines 1087 to 1089 in 9268ca9:

```python
data = map_complete_blocks(
    src, regrid, (y_dim, x_dim), sample_grid_x.shape
)
```
The return dtype of `_regrid` is found through a similar promotion method, though with more caveats, and it looks like it can change depending on what the `_RegularGridInterpolator` does. This gives us similar options:
- have `RectilinearRegridder.__call__` calculate the dtype it wants and pass this as an argument to `_regrid`.
- couple `RectilinearRegridder.__call__` and `_regrid` by computing the dtype twice.
- have `_regrid` catch a 0d input array as obviously from dask and return the correct type of array to satisfy it (a sketch of this option follows the note below).
N.B. Where I've referred to "dtype" above, we'd also have to think about the type of the array (whether it's masked etc.) too, but that's easier as I think it's just whatever the source data was.
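A minimal sketch of the "catch a 0d input array" option; the function name and the exact promotion rule here are illustrative, not the actual Iris code:

```python
import numpy as np

def regrid_block(data):
    # Illustrative guard only: when dask probes a map_blocks function that
    # was given no `meta`, `data` can arrive as a 0d dummy array. Hand back
    # an empty array of the type and dtype the real calculation would
    # produce, so dask can infer its metadata without any real regridding.
    if data.ndim == 0:
        dtype = np.promote_types(data.dtype, np.float16)
        if np.ma.isMaskedArray(data):
            return np.ma.empty((), dtype=dtype)
        return np.empty((), dtype=dtype)
    # ... real regridding of a full block would happen here ...
    raise NotImplementedError("real regridding is not shown in this sketch")
```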
Question
We should make the same choice in both places, and I don't think the "couple the two functions" choice is a good one, so do we:
- Pre-choose the dtype in the functions that call `map_complete_blocks` and have them pass a `meta` argument to `map_complete_blocks` (which can then pass it to `map_blocks`)? A runnable, dask-level sketch of this option follows the list.
- Let `map_blocks` throw a 0d array through the regridders, and spot it in there then return a 0d array of the right type?
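A runnable sketch of the first option at the dask level only; the Iris-side plumbing through `map_complete_blocks` is not shown, and `fake_regrid` is an assumed stand-in for the real regridder:

```python
import dask.array as da
import numpy as np

src = da.from_array(np.arange(12, dtype=np.float32).reshape(3, 4), chunks=(3, 4))

def fake_regrid(block):
    # Stand-in for the real regridder: just casts to the promoted dtype.
    return block.astype(np.promote_types(block.dtype, np.float16))

# Pre-choose the result dtype and hand dask an empty `meta` array carrying
# that dtype, so dask never needs to probe fake_regrid with a dummy array.
dtype = np.promote_types(src.dtype, np.float16)
meta = np.empty((0, 0), dtype=dtype)
result = da.map_blocks(fake_regrid, src, meta=meta)
print(result.dtype)  # float32
```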
Closed by #5989
Context
`dask.array.map_blocks` will under some circumstances pass a 0d array through the provided function when initialised, as documented in https://docs.dask.org/en/latest/generated/dask.array.map_blocks.html

In iris/lib/iris/_lazy_data.py (lines 355 to 388 in 4abaa8f) we call `map_blocks` without setting `meta`, including when handing it an area weighted regridding function (and presumably other times) that won't pass through a 0d array. We do the same elsewhere in the same file too.
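A small runnable demonstration of that behaviour (the exact shape of the probe array may vary with the dask version):

```python
import dask.array as da
import numpy as np

def regrid(block):
    # Report what dask hands in; during graph construction, when no `meta`
    # is given, this can be a dummy probe array rather than a real chunk.
    print("regrid called with shape", block.shape)
    return block * 1.0

arr = da.ones((4, 4), chunks=(2, 2))

# No `meta`: dask may call `regrid` on a dummy array up front to work out
# the output array type and dtype (the probe this issue is about).
lazy = da.map_blocks(regrid, arr)

# With `meta` supplied, dask takes the output type and dtype from it and
# should not need to probe `regrid` at all.
lazy = da.map_blocks(regrid, arr, meta=np.empty((0, 0), dtype=np.float64))
```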
Issues arising

`DeprecationWarning`s: #4574 documents a deprecation warning seen when the Iris tests are run, as the 0d array is passed in by dask and then indexed.

Suggestions

Decide what the `meta` kwarg should be set to, and set it.