Update _widget.py
Updated chunking to 'auto' instead of per-time-step chunks, so larger data can be processed at the cost of speed. Chunk sizes of roughly 6 GB were a problem on a machine with 16 GB of RAM.
Macl-I authored Aug 19, 2024
1 parent d0168f6 commit cb49860
Showing 1 changed file with 4 additions and 3 deletions.
7 changes: 4 additions & 3 deletions src/napari_fast4dreg/_widget.py
@@ -142,9 +142,10 @@ def run_pipeline(image,
     ref_channel = len(data[0])

     # read in raw data as dask array
-    new_shape = (np.shape(data)[0],1,np.shape(data)[-3],np.shape(data)[-2],np.shape(data)[-1])
-    data = data.rechunk(new_shape)
-
+    #new_shape = (np.shape(data)[0],1,np.shape(data)[-3],np.shape(data)[-2],np.shape(data)[-1])
+    data = data.rechunk('auto')
+    new_shape = data.chunksize
+
     # write data to tmp_file
     data = write_tmp_data_to_disk(tmp_path, data, new_shape)
     print('Imge imported')
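
For illustration, here is a minimal, self-contained sketch (not the plugin's code) of the trade-off this commit makes: chunking a dask array by whole time steps versus letting rechunk('auto') bound the chunk size. The 5D shape and the 128 MiB limit below are illustrative assumptions (128 MiB is also dask's default array.chunk-size).

import dask
import dask.array as da
import numpy as np

# Hypothetical 5D stack (T, C, Z, Y, X); one time step is roughly 6 GB at float64.
data = da.zeros((10, 1, 180, 2048, 2048), dtype=np.float64,
                chunks=(1, 1, 180, 2048, 2048))  # old scheme: one chunk per time step
print(data.chunksize)  # (1, 1, 180, 2048, 2048) -> ~6 GB per chunk when materialized

# New scheme: 'auto' keeps every chunk under dask's array.chunk-size limit,
# so peak memory stays bounded at the cost of more, smaller tasks.
with dask.config.set({"array.chunk-size": "128MiB"}):
    data = data.rechunk("auto")
new_shape = data.chunksize  # as in the commit, the chosen chunk shape is reused downstream
print(new_shape)

In the commit itself, new_shape = data.chunksize replaces the hand-built per-time-step shape, so write_tmp_data_to_disk receives whatever chunking dask chose.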

0 comments on commit cb49860
