
Deadtime correction: add LUT option #115

Open
toqduj opened this issue Oct 8, 2022 · 6 comments
Labels: enhancement (New feature or request)
toqduj commented Oct 8, 2022

Eiger detectors provide a LUT as part of their metadata for the deadtime correction. The deadtime is normally applied by the detector, but this can be switched off. Switching it off allows for Poisson statistics to be calculated on the raw detected counts rather than (incorrectly) on the corrected counts. After this step, the deadtime correction with the LUT can be applied.

Since the LUT only has to be loaded once into the Python space, passing this object handle to the deadtime correction is not expected to be much of a performance issue.

Question on preference: Is it better to make this a separate alternative deadtime correction method (deadtime_lut.py or something), or to make the LUT an optional parameter in the current method, overriding the parameter-based deadtime calculation when supplied?

Our Eiger R detector has a LUT that tops out at some point. Once the values top out, the mask should probably be set to its NeXus-defined over-limit mask value for those pixels that are out of LUT.
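Applying such a LUT could be as simple as integer indexing into the table, with out-of-LUT pixels flagged as suggested above; a minimal numpy sketch (the function name and the sentinel value are illustrative assumptions, not an existing API, and the NeXus over-limit convention should be checked against the standard):

```python
import numpy as np

def deadtime_correct_lut(frame, lut, cutoff, overload=np.iinfo(np.int32).max):
    """Apply a detected-counts -> corrected-counts LUT to an integer frame.

    Pixels whose detected counts exceed the LUT cutoff are set to an
    over-limit sentinel so they can be masked downstream. All names here
    are illustrative, not the package's real API.
    """
    frame = np.asarray(frame)
    over = frame > cutoff
    # Clip so indexing stays within the table, then look up corrected counts.
    corrected = lut[np.clip(frame, 0, len(lut) - 1)]
    corrected[over] = overload
    return corrected, over
```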

@garryod garryod self-assigned this Oct 10, 2022

garryod commented Oct 12, 2022

Hi Brian,

I hadn't realised a lookup table was provided by the instrument, but it seems to be the way to go if available. Unfortunately I don't have any such tables to hand at the moment but have spoken to Tim about extracting them from the instrument. I will get back to you with my further thoughts once this is done.

My preference would be for a separate method to be implemented for dead-time correction with lookup tables, as it then leaves open the possibility of having a common function which delegates to either of the existing methods based on the supplied inputs.
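The common delegating function described here might look roughly like this (all names are hypothetical sketches of the proposed entry point, not the package's existing API):

```python
from typing import Optional

import numpy as np

def correct_deadtime(frames: np.ndarray,
                     count_time: Optional[float] = None,
                     dead_time: Optional[float] = None,
                     lut: Optional[np.ndarray] = None) -> np.ndarray:
    """Delegate to the LUT-based or parameter-based deadtime correction.

    Hypothetical common entry point: if a LUT is supplied it takes
    precedence, otherwise the parameter-based method is used.
    """
    if lut is not None:
        # Detected counts index directly into the corrected-counts table.
        return lut[np.clip(frames, 0, len(lut) - 1)]
    if count_time is None or dead_time is None:
        raise ValueError("Supply either a LUT or count_time and dead_time.")
    # Placeholder for the existing parameter-based (Lambert W) method.
    raise NotImplementedError
```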

Interpolation and extrapolation behaviour is an interesting one. If needed it may be possible to make use of the functions which reside within scipy.interpolate for interpolation, some of which support extrapolation.
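For reference, `scipy.interpolate.interp1d` supports linear extrapolation beyond the tabulated range via `fill_value="extrapolate"` (the table values here are toy numbers, not real deadtime data):

```python
import numpy as np
from scipy.interpolate import interp1d

# Toy detected -> corrected table; extrapolate linearly beyond its end.
detected = np.array([0.0, 100.0, 200.0, 300.0])
corrected = np.array([0.0, 105.0, 220.0, 350.0])
f = interp1d(detected, corrected, kind="linear", fill_value="extrapolate")

print(f(150.0))  # interpolated between table points
print(f(400.0))  # linearly extrapolated past the last point
```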


toqduj commented Oct 12, 2022

Interpolation wouldn't be necessary: it's a LUT mapping detected counts -> corrected counts, starting from 0 or 1. So there'll be some uncertainty there too. I'll dig out an example Eiger datafile and tack it onto this comment.
USAXS_30.zip
I think an example is here: /entry/instrument/detector/detectorSpecific/detectorModule_001/countrate_correction_lookup_table
But I also think I see why I never implemented it; it seems to be module-specific, which makes sense when you think about it, but also means we'd need to assign each LUT an ROI, perhaps. What do you think? There's also a separate table, which probably will need interpolation, and will likely be much slower.

As for limits, there is a separate value of
/entry/instrument/detector/detectorSpecific/detectorModule_001/countrate_correction_count_cutoff
which is probably the upper limit beyond which the correction can no longer be applied. I'd not extrapolate this LUT for the user if at all possible, if they want larger LUTs, they'll have to extrapolate themselves with their own assumptions. Note that this data comes from the cheap R model, which probably doesn't come with extended dynamic range.
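Reading both datasets with h5py would look roughly like this (dataset paths as quoted above, for module 001 only; the helper name is an assumption):

```python
import h5py
import numpy as np

MODULE = "/entry/instrument/detector/detectorSpecific/detectorModule_001"

def load_module_lut(path):
    """Load one module's correction LUT and its cutoff from an Eiger file.

    Sketch only: assumes the dataset layout quoted above; a real
    implementation would iterate over all detectorModule_* groups.
    """
    with h5py.File(path, "r") as f:
        lut = f[MODULE + "/countrate_correction_lookup_table"][()]
        cutoff = f[MODULE + "/countrate_correction_count_cutoff"][()]
    return np.asarray(lut), int(cutoff)
```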

I can also ask Dectris to verify, they're usually quite responsive.


garryod commented Oct 12, 2022

The metadata here is amazing, I will have to look into how we can retrieve it using our data acquisition stack.

I had expected a sparser 3D representation with pixel-wise corrections every N counts. With the data available in the countrate_correction_lookup_table, this method should be more performant than the current implementation, which involves solving the Lambert W function iteratively.
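For context on the current approach: in the paralyzable deadtime model the measured rate m relates to the true rate n by m = n * exp(-n * tau), which inverts via the Lambert W function; a sketch with scipy (not the package's actual implementation, and tau values are detector-specific):

```python
import numpy as np
from scipy.special import lambertw

def paralyzable_true_rate(measured, tau):
    """Invert m = n * exp(-n * tau) for the true rate n.

    Uses the principal branch of the Lambert W function, valid for
    n * tau < 1. Sketch of the approach mentioned above, not the
    package's real implementation.
    """
    return np.real(-lambertw(-np.asarray(measured) * tau, 0) / tau)
```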

ROIs are an interesting one: I don't see how we could support module-specific LUTs without first supporting ROIs. I will look at building out some utility functions for this when I'm next free (likely Monday). I previously had plans to support a syntax for operations on partial frames, to allow us to distribute compute across arbitrary axes in our cluster jobs, which would share some boilerplate with ROI operations. As an aside, do you happen to know where in the metadata (if anywhere) we can get the module bounds?

As far as extrapolation goes, it appears that the LUT pins all values above the cutoff to the maximum value for the type. This seems like a good default, but it might be nice to utilise the countrate_correction_count_cutoff to do something more application-appropriate.


toqduj commented Oct 12, 2022

The metadata's good no?
I suspect the four modules of our Eiger R 1M are placed by the three elements:
/entry/instrument/detector/detectorSpecific/detectorModule_000/data_origin
/entry/instrument/detector/detectorSpecific/detectorModule_000/data_rotation
/entry/instrument/detector/detectorSpecific/detectorModule_000/data_size

Module 0: origin: [1030, 257] rotation: 180 size: [1030, 257]
Module 1: origin: [0, 257] rotation: 0 size: [1030, 257]
Module 3: origin: [1030, 808] rotation: 180 size: [1030, 257]
Module 4: origin: [0, 808] rotation: 0 size: [1030, 257]

The rotations are annoying, but not completely unworkable. I think being able to define an ROI for an operation would be helpful for cases like this.

I'll have to find a feature-full detector image and try to reconstruct the full image from the modules using this information, just so I can double-check my understanding.
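Reassembly from those three fields could be sketched like this (assuming data_origin gives the [x, y] corner where the rotated module's bounding box starts and that rotation is 0 or 180 degrees, as in the metadata above; this interpretation is untested against a real frame, exactly the double-check described here):

```python
import numpy as np

def place_module(frame, module, origin_xy, rotation_deg):
    """Place one module's data into the full detector frame.

    Untested interpretation of the Eiger metadata: origin_xy is [x, y]
    of the placed module's bounding-box corner, and a 180 degree
    rotation flips the module along both axes.
    """
    data = np.asarray(module)
    if rotation_deg == 180:
        # 180 degrees: reverse both axes.
        data = data[::-1, ::-1]
    x, y = origin_xy
    h, w = data.shape
    frame[y:y + h, x:x + w] = data
    return frame
```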

@garryod garryod added the enhancement New feature or request label Oct 12, 2022

garryod commented Oct 12, 2022

For some unknown reason I didn't interpret these fields as the physical location of the module, despite it being rather obvious. The rotation is a curious one; I will have to play about with the values and see what comes of it. I expect we will leave handling of this to the user and accept only a rectangular bounding box.

If you do find a feature-full image you are happy sharing, please send it my way, as it would be quite useful for my testing.


toqduj commented Oct 13, 2022

No problem, here's one a quick search of our scicat database turned up. Should be asymmetric enough for reconstruction testing:
[attached image: example detector frame]
20220204_97.zip
