
SparseArray Error during "first spatial update" - parameter exploration #152

Open

sibogg opened this issue Aug 5, 2021 · 1 comment · May be fixed by #224
sibogg commented Aug 5, 2021

Hi!

I just started to explore minian and ran into a "ValueError: All arrays must be instances of SparseArray." while running the demo pipeline notebook with the demo movie. It happened in the parameter exploration step during the first spatial update.
Here are the specs of the computer I'm using:

  • OS Ubuntu 20.04.2 LTS. Linux 5.11.0-25-generic 2021 x86_64 x86_64 x86_64 GNU/Linux
  • Architecture: x86_64; CPU op-mode(s): 32-bit, 64-bit; Byte Order: Little Endian; Address sizes: 43 bits physical, 48 bits virtual; CPU(s): 64; On-line CPU(s) list: 0-63; Thread(s) per core: 2; Core(s) per socket: 32; Socket(s): 1; NUMA node(s): 1; Vendor ID: AuthenticAMD; CPU family: 23; Model: 49; Model name: AMD EPYC 7502P 32-Core Processor; Stepping: 0; Frequency boost: enabled; CPU MHz: 1626.043; CPU max MHz: 2500.0000; CPU min MHz: 1500.0000

It looks like the pipeline notebook has been updated a few times, and I have not found any previous issue similar to the one I'm running into. Here's the error message in Jupyter:

```
ValueError                                Traceback (most recent call last)
<timed exec> in <module>

~/Code/minian/minian/cnmf.py in update_spatial(Y, A, b, C, f, sn, dl_wnd, sparse_penal, update_background, normalize, size_thres, in_memory)
    387                     cur_blk = darr.array(sparse.zeros((cur_sub.shape)))
    388                     A_new[hblk, wblk, 0] = cur_blk
--> 389         A_new = darr.block(A_new.tolist())
    390     else:
    391         A_new = update_spatial_block(

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/core.py in block(arrays, allow_unknown_chunksizes)
   3836
   3837     # concatenate innermost lists on the right, outermost on the left
-> 3838     return rec.map_reduce(
   3839         arrays,
   3840         f_reduce=lambda xs, axis: concatenate(

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/numpy_compat.py in map_reduce(self, x, f_map, f_reduce, f_kwargs, **kwargs)
    109             return f_reduce((f(xi, **next_kwargs) for xi in x), **kwargs)
    110
--> 111         return f(x, **kwargs)
    112
    113     def walk(self, x, index=()):

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/numpy_compat.py in f(x, **kwargs)
    107             else:
    108                 next_kwargs = f_kwargs(**kwargs)
--> 109             return f_reduce((f(xi, **next_kwargs) for xi in x), **kwargs)
    110
    111         return f(x, **kwargs)

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/core.py in <lambda>(xs, axis)
   3839         arrays,
   3840         f_reduce=lambda xs, axis: concatenate(
-> 3841             list(xs), axis=axis, allow_unknown_chunksizes=allow_unknown_chunksizes
   3842         ),
   3843         f_kwargs=lambda axis: dict(axis=(axis + 1)),

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/numpy_compat.py in <genexpr>(.0)
    107             else:
    108                 next_kwargs = f_kwargs(**kwargs)
--> 109             return f_reduce((f(xi, **next_kwargs) for xi in x), **kwargs)
    110
    111         return f(x, **kwargs)

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/numpy_compat.py in f(x, **kwargs)
    107             else:
    108                 next_kwargs = f_kwargs(**kwargs)
--> 109             return f_reduce((f(xi, **next_kwargs) for xi in x), **kwargs)
    110
    111         return f(x, **kwargs)

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/core.py in <lambda>(xs, axis)
   3838     return rec.map_reduce(
   3839         arrays,
-> 3840         f_reduce=lambda xs, axis: concatenate(
   3841             list(xs), axis=axis, allow_unknown_chunksizes=allow_unknown_chunksizes
   3842         ),

~/anaconda3/envs/minian/lib/python3.8/site-packages/dask/array/core.py in concatenate(seq, axis, allow_unknown_chunksizes)
   3899         type(max(seq_metas, key=lambda x: getattr(x, "__array_priority__", 0)))
   3900     )
-> 3901     meta = _concatenate(seq_metas, axis=axis)
   3902
   3903     # Promote types to match meta

~/anaconda3/envs/minian/lib/python3.8/site-packages/sparse/_common.py in concatenate(arrays, axis, compressed_axes)
   1246         from ._coo import concatenate as coo_concat
   1247
-> 1248         return coo_concat(arrays, axis)
   1249     else:
   1250         from ._compressed import concatenate as gcxs_concat

~/anaconda3/envs/minian/lib/python3.8/site-packages/sparse/_coo/common.py in concatenate(arrays, axis)
    157     from .core import COO
    158
--> 159     check_consistent_fill_value(arrays)
    160
    161     arrays = [x if isinstance(x, COO) else COO(x) for x in arrays]

~/anaconda3/envs/minian/lib/python3.8/site-packages/sparse/_utils.py in check_consistent_fill_value(arrays)
    429
    430     if not all(isinstance(s, SparseArray) for s in arrays):
--> 431         raise ValueError("All arrays must be instances of SparseArray.")
    432     if len(arrays) == 0:
    433         raise ValueError("At least one array required.")

ValueError: All arrays must be instances of SparseArray.
```
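For anyone digging into this: the final check in the traceback is easy to reproduce in isolation. The sketch below is a pure-Python stand-in, not the real `sparse` library — the `SparseArray` class here is a dummy that only mirrors the `isinstance` check in `sparse/_utils.py`. It shows how a single non-sparse block in the sequence (which could happen if one of the blocks passed to `darr.block` ends up with a dense numpy meta while the others are sparse) trips exactly this error:

```python
class SparseArray:
    """Dummy stand-in for sparse.SparseArray, for illustration only."""
    pass


def check_consistent_fill_value(arrays):
    # Mirrors the check from the traceback: every element must be a
    # SparseArray, or the sparse concatenate refuses to run.
    arrays = list(arrays)
    if not all(isinstance(s, SparseArray) for s in arrays):
        raise ValueError("All arrays must be instances of SparseArray.")
    if len(arrays) == 0:
        raise ValueError("At least one array required.")


# Two sparse blocks plus one dense/scalar block sneaking into the sequence:
blocks = [SparseArray(), SparseArray(), 0.0]
try:
    check_consistent_fill_value(blocks)
except ValueError as err:
    print(err)  # All arrays must be instances of SparseArray.
```

So one plausible reading of the traceback is that dask picked a sparse meta for the concatenation (via the `__array_priority__` selection in `dask/array/core.py`) and dispatched to `sparse`'s `concatenate`, which then found at least one block whose meta was not a `SparseArray`. That's speculation from the stack alone, not a confirmed diagnosis.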

And below is a screenshot of the cell where I run into this issue. Because of it, I cannot choose a sparse penalty for the spatial update in the very next cell (the one with the `hv.output()` command) and continue.

[screenshot: value_error]

I have restarted my kernel a few times and tried a larger `memory_limit` setting, but I still run into the same issue.
Any help would be greatly appreciated!
Thanks

@phildong
Member

Hey, sorry for taking so long to reply! I've been trying to think through what could be the cause of this, but I couldn't come up with an idea.
I don't currently own an Ubuntu 20.04 machine, but we're working on reproducing this.
In the meantime if you notice anything else related to this error please let us know!
Sorry again for not being responsive!

@ximion linked a pull request on Oct 7, 2022 that will close this issue.