
Problem reading Maxwell two datasets #1578

Open
sofiapuvogelvittini opened this issue Oct 3, 2024 · 4 comments · May be fixed by #1579
@sofiapuvogelvittini

Hello, I am getting the following error while trying to read a Maxwell Two dataset:

KeyError Traceback (most recent call last)
Cell In[9], line 1
----> 1 se.read_maxwell(h5_data,rec_name=rec_si, stream_id=well_si)

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/maxwell.py:62, in MaxwellRecordingExtractor.init(self, file_path, stream_id, stream_name, block_index, all_annotations, rec_name, install_maxwell_plugin, use_names_as_ids)
59 self.install_maxwell_plugin()
61 neo_kwargs = self.map_to_neo_kwargs(file_path, rec_name)
---> 62 NeoBaseRecordingExtractor.init(
63 self,
64 stream_id=stream_id,
65 stream_name=stream_name,
66 block_index=block_index,
67 all_annotations=all_annotations,
68 use_names_as_ids=use_names_as_ids,
69 **neo_kwargs,
70 )
72 self.extra_requirements.append("h5py")
74 # well_name is stream_id

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py:188, in NeoBaseRecordingExtractor.init(self, stream_id, stream_name, block_index, all_annotations, use_names_as_ids, **neo_kwargs)
158 def init(
159 self,
160 stream_id: Optional[str] = None,
(...)
165 **neo_kwargs: Dict[str, Any],
166 ) -> None:
167 """
168 Initialize a NeoBaseRecordingExtractor instance.
169
(...)
185
186 """
--> 188 _NeoBaseExtractor.init(self, block_index, **neo_kwargs)
190 kwargs = dict(all_annotations=all_annotations)
191 if block_index is not None:

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py:27, in _NeoBaseExtractor.init(self, block_index, **neo_kwargs)
23 def init(self, block_index, **neo_kwargs):
24
25 # Avoids double initiation of the neo reader if it was already done in the init of the child class
26 if not hasattr(self, "neo_reader"):
---> 27 self.neo_reader = self.get_neo_io_reader(self.NeoRawIOClass, **neo_kwargs)
29 if self.neo_reader.block_count() > 1 and block_index is None:
30 raise Exception(
31 "This dataset is multi-block. Spikeinterface can load one block at a time. "
32 "Use 'block_index' to select the block to be loaded."
33 )

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/spikeinterface/extractors/neoextractors/neobaseextractor.py:66, in _NeoBaseExtractor.get_neo_io_reader(cls, raw_class, **neo_kwargs)
64 neoIOclass = getattr(rawio_module, raw_class)
65 neo_reader = neoIOclass(**neo_kwargs)
---> 66 neo_reader.parse_header()
68 return neo_reader

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/neo/rawio/baserawio.py:185, in BaseRawIO.parse_header(self)
172 def parse_header(self):
173 """
174 This must parse the file header to get all stuff for fast use later on.
175
(...)
183
184 """
--> 185 self._parse_header()
186 self._check_stream_signal_channel_characteristics()
187 self.is_header_parsed = True

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/neo/rawio/maxwellrawio.py:145, in MaxwellRawIO._parse_header(self)
143 gain_uV = 3.3 / (1024 * gain) * 1e6
144 mapping = settings["mapping"]
--> 145 sigs = h5file["wells"][stream_id][self.rec_name]["groups"]["routed"]["raw"]
147 channel_ids = np.array(mapping["channel"])
148 electrode_ids = np.array(mapping["electrode"])

File h5py/_objects.pyx:54, in h5py._objects.with_phil.wrapper()

File h5py/_objects.pyx:55, in h5py._objects.with_phil.wrapper()

File ~/anaconda3/envs/si_env/lib/python3.11/site-packages/h5py/_hl/group.py:357, in Group.getitem(self, name)
355 raise ValueError("Invalid HDF5 object reference")
356 elif isinstance(name, (bytes, str)):
--> 357 oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
358 else:
359 raise TypeError("Accessing a group is done with bytes or str, "
360 "not {}".format(type(name)))

File h5py/_objects.pyx:54, in h5py._objects.with_phil.wrapper()

File h5py/_objects.pyx:55, in h5py._objects.with_phil.wrapper()

File h5py/h5o.pyx:189, in h5py.h5o.open()

KeyError: "Unable to synchronously open object (object 'routed' doesn't exist)"

I am not sure what this error means, and interestingly, for other recordings from the same setup I don't get any error.

Here you can find the respective dataset and the Jupyter notebook I am using, "SC2_all_wells_latest.ipynb":
https://drive.google.com/drive/folders/1CoqJTnrn4_qIQEU_rZEc3MuNhaxgHln3?usp=sharing

I've also added another folder, 'data_that_works', which contains a recording of the same plate at a later time point. That one can be read without errors.

Could you please help me figure out what may be going wrong here? Thanks a lot!
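The KeyError above means the HDF5 group wells/&lt;stream_id&gt;/&lt;rec_name&gt;/groups/routed is missing from the file. A quick way to see which wells actually contain routed signals is to walk the file with h5py (a diagnostic sketch, not code from the thread; the demo file layout below is synthetic):

```python
# Diagnostic sketch: for each well in a Maxwell v2 HDF5 file, list which
# recordings actually contain the groups/routed/raw dataset that neo's
# MaxwellRawIO expects to find.
import h5py

def wells_with_signals(h5file):
    """Return {well_id: [rec_names whose groups/routed/raw exists]}."""
    return {
        well_id: [rec for rec in well if "groups/routed/raw" in well[rec]]
        for well_id, well in h5file["wells"].items()
    }

# Demo on an in-memory file mimicking the layout; well001 has no routed
# signals, which is the situation that triggers the KeyError.
with h5py.File("demo.h5", "w", driver="core", backing_store=False) as f:
    f.create_dataset("wells/well000/rec0000/groups/routed/raw", data=[0.0])
    f.create_group("wells/well001/rec0000/groups")
    print(wells_with_signals(f))  # {'well000': ['rec0000'], 'well001': []}
```

Any well that maps to an empty list here would fail the lookup in maxwellrawio.py line 145.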

@alejoe91 alejoe91 linked a pull request Oct 3, 2024 that will close this issue
@alejoe91

alejoe91 commented Oct 3, 2024

@sofiapuvogelvittini the problem with the dataset is that some of the well*** fields have no signals associated with them.

I made a PR with a fix: #1579. Can you test it out?

@sofiapuvogelvittini

I am still encountering the same error after switching to the branch :/

@alejoe91

alejoe91 commented Oct 3, 2024

Weird! I was able to open both datasets with the fix. How did you install from the PR branch? Make sure you restart the notebook kernel!
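For anyone hitting the same snag: one way to install directly from a PR branch is pip's git support (the repository URL below is an assumption; check the PR page for the repo the fix actually targets):

```shell
# Install the code from PR #1579 via pip's VCS support; the repo URL is
# assumed here, confirm it against the PR page before running.
pip install --upgrade "git+https://github.com/NeuralEnsemble/python-neo.git@refs/pull/1579/head"

# Then restart the Jupyter kernel so the newly installed version is the
# one that gets imported.
```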

@sofiapuvogelvittini

I am sorry, I think I was not installing it properly. I have now managed to install it correctly, and it's working with both datasets! Thank you for your help!
