Verification #133
Conversation
This nyaml entry still gives an error
while the script here adds this key:
Which one of these should it be:
@sherjeelshabih @sanbrock What do you think of this approach? It is roughly working already (except for the error in the previous comment, but I'm actually not sure whether that is a general bug in the template generation, @sherjeelshabih?), and IMO it would be helpful to have a verification tool for NeXus files. If you think this is a good path, I would continue adding more verification features in other parts of the code as well (i.e., the points addressed above) and tests.
I think SOLUTE[solute2] might work. The parts outside the brackets are used to look for the NXDL node, so the function get_node_at_nxdl_path receives a concatenation of what is outside the brackets. You can check if that function finds this NXDL path correctly: /ENTRY/SAMPLE/SOLUTE/solvent. I keep forgetting the exact caps/no-caps format that goes in there. But as far as the converter sees this, the outside is used to get the NXDL node for the NeXus info it needs. Then it just writes whatever you have in the [brackets] to the HDF5 file.
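To make that concrete, here is a minimal, hypothetical sketch (not the converter's actual code; only `get_node_at_nxdl_path` above is real) of splitting a template key into the NXDL lookup path and the HDF5 path:

```python
import re

def split_template_key(key: str) -> tuple[str, str]:
    """Split a key like /ENTRY[entry]/SAMPLE[sample]/SOLUTE[solute2]/solvent
    into the NXDL lookup path (names outside the brackets) and the HDF5 path
    (names inside the brackets, falling back to the outside name)."""
    nxdl_parts, hdf5_parts = [], []
    for part in key.strip("/").split("/"):
        match = re.fullmatch(r"(?P<outer>[^\[\]]+)(?:\[(?P<inner>[^\[\]]+)\])?", part)
        outer = match.group("outer")
        nxdl_parts.append(outer)
        hdf5_parts.append(match.group("inner") or outer)
    return "/" + "/".join(nxdl_parts), "/" + "/".join(hdf5_parts)

print(split_template_key("/ENTRY[entry]/SAMPLE[sample]/SOLUTE[solute2]/solvent"))
# -> ('/ENTRY/SAMPLE/SOLUTE/solvent', '/entry/sample/solute2/solvent')
```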
I really like this approach. I already see that you are using the same validation routine used while converting, so this will help improve verification there too. That's awesome. I think the not-loading of large datasets should have a nice wrapper so that readers, from the other side, can also tell this verification routine not to load large sets until it actually writes the HDF5. Then for writing we can have a streamIO object or whatever one calls it. For the verification of HDF5 files this will be easier, as we can rely on h5py's lazy-loading functionality. Maybe for this PR we can focus on the simple HDF5 case and just keep this in mind. I think shape and dim aren't checked, but types were being checked. Or do you mean the types don't deal with what comes out of h5py.dtype?
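As a rough sketch of the lazy-loading idea for the plain HDF5 case (assuming h5py; the size threshold and function name are made up here): h5py only reads values when a dataset is sliced, so shape and dtype can be checked without loading large arrays:

```python
import h5py

def collect_dataset_info(filename: str, small_limit: int = 1_000_000) -> dict:
    """Walk an HDF5 file; read small datasets eagerly, but keep only
    shape/dtype for large ones so verification never loads big arrays."""
    info = {}

    def visitor(name, obj):
        if isinstance(obj, h5py.Dataset):
            if obj.size <= small_limit:
                info[name] = obj[()]  # small dataset: read the values
            else:
                # large dataset: h5py has not read any data at this point
                info[name] = {"shape": obj.shape, "dtype": obj.dtype}

    with h5py.File(filename, "r") as f:
        f.visititems(visitor)
    return info
```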
Ok, I will just check what works. SOLUTE[solute2] will be a bit trickier to implement in the reverse way, as I'm currently just reading the …
Yes, this is why I started this: I felt most of the parts for validation are already there, I just did not have an easy way to use them from the command line.
Yes, sounds good 👍️ This would also help with building this for verification, as we could just build a small class to pass instead of the actual data and let the verification routine check its contents against the NXDL.
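A hypothetical stand-in class for this could look roughly like the following (names and fields are illustrative, not an agreed interface): the reader puts it into the template instead of the real array, verification checks shape/dtype against the NXDL, and the data is only materialised when the HDF5 file is actually written:

```python
from dataclasses import dataclass
from typing import Callable, Optional
import numpy as np

@dataclass
class LazyDataHandle:
    """Placeholder carrying only what verification needs; the actual
    values are produced by load() when the HDF5 file is written."""
    shape: tuple
    dtype: np.dtype
    load: Optional[Callable[[], np.ndarray]] = None

# e.g. (read_counts is a hypothetical loader supplied by the reader):
# template["/ENTRY[entry]/DATA[data]/counts"] = LazyDataHandle(
#     shape=(4096, 4096), dtype=np.dtype("uint32"), load=read_counts)
```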
I did not actually check the details of what is already being checked by the code and what is not (the points above are also a somewhat chaotic collection of things I did not want to forget to address here 😅)
Pull Request Test Coverage Report for Build 8845625971
💛 - Coveralls
This is superseded by #333, which includes everything from this PR but uses the new validation algorithm.
This is a simple verification. I just load everything from a provided file and write it into a dict. There are still several issues:
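For reference, a minimal sketch of that basic loading step (everything from the file into a flat, path-keyed dict), assuming h5py; the function name and the `@attribute` key convention are illustrative, not necessarily what this PR actually uses:

```python
import h5py

def hdf5_to_dict(filename: str) -> dict:
    """Read all datasets and attributes from an HDF5 file into a flat
    dict keyed by path ('/path/@name' for attributes)."""
    data = {}

    def visitor(name, obj):
        for attr, value in obj.attrs.items():
            data[f"/{name}/@{attr}"] = value
        if isinstance(obj, h5py.Dataset):
            data[f"/{name}"] = obj[()]

    with h5py.File(filename, "r") as f:
        for attr, value in f.attrs.items():
            data[f"/@{attr}"] = value
        f.visititems(visitor)
    return data
```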