Identify channel and correlation-like dimensions in non-standard MS columns #329
Conversation
I also want to try to address #268 in this PR.
I'm going to rerun a convert command that used to fail to confirm that the fix works, but it looks pretty uncontroversial to me. I'll get back to you as soon as the convert is done.
Looks good! Am I correct in saying that this doesn't store this information, and simply presumes that any non-standard column will probably be visibility-shaped? This will work in 99.99% of cases, but if the sizes of chan and corr are the same it may fail/assign (row, chan, chan).
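To make the ambiguity concrete, here is a minimal, self-contained sketch of the kind of size-matching heuristic being discussed. None of the names here (infer_dims and friends) are dask-ms API; this is purely an illustration of why equal chan and corr sizes are the degenerate case.

```python
def infer_dims(cell_shape, nchan, ncorr):
    """Guess (row, chan, corr)-style dimension names for a column cell
    by matching each axis size against the known chan/corr sizes.
    Purely illustrative; not the actual dask-ms implementation."""
    names = []
    for i, size in enumerate(cell_shape):
        if size == nchan:
            names.append("chan")          # axis size matches channel count
        elif size == ncorr:
            names.append("corr")          # axis size matches correlation count
        else:
            names.append("dim-%d" % i)    # fall back to a generic name
    return ("row",) + tuple(names)

# Unambiguous case: chan and corr sizes differ
print(infer_dims((64, 4), nchan=64, ncorr=4))  # ('row', 'chan', 'corr')

# The degenerate case raised above: nchan == ncorr, so both axes match
# "chan" first and the guess collapses to ('row', 'chan', 'chan')
print(infer_dims((4, 4), nchan=4, ncorr=4))    # ('row', 'chan', 'chan')
```

This is why a pure size-matching approach needs extra signals (e.g. the dimension name) to break ties when the channel and correlation counts coincide.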
It's slightly more intelligent than that: it checks whether the dimension name is something like …
Yep, I think this should try to handle the …
One extra note: this doesn't work if you only select the column with unknown dimensions. print(xds_from_ms("msdir/C147_minimal.MS", columns=("TEST_DATA",))[0]) prints …
Yes. Unfortunately, because dask-ms was designed to handle arbitrary tables produced by the CTDS, it doesn't really reason about the MS structure very deeply, so the returned datasets will only have …

Is this a critical blocker to your current workflows, or could this be left to a future PR? I'd imagine that dask-ms convert should work with this change, as it bulk-exports MS columns.
Not really, as the schema option still exists (QC always used it for extra model/weight columns). As you say, this should be good enough for convert. Just thought it worth mentioning, as it was the first thing I stumbled over when trying out the branch.
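For reference, the schema-override workaround mentioned above looks roughly like the sketch below. The exact shape of the table_schema argument is an assumption from memory and should be checked against the dask-ms documentation; the column name TEST_DATA comes from the example earlier in this thread.

```python
# Hedged sketch: explicitly telling dask-ms the cell dimensions of a
# non-standard column instead of relying on the auto-generated schema.
# The mapping structure below is an assumption; verify against dask-ms docs.
schema_override = {
    "TEST_DATA": {"dims": ("chan", "corr")},  # per-row cell dims for the column
}

# Usage would then be something like (requires a real MS on disk):
# from daskms import xds_from_ms
# ds = xds_from_ms("msdir/C147_minimal.MS",
#                  columns=("TEST_DATA",),
#                  table_schema=["MS", schema_override])[0]

print(schema_override["TEST_DATA"]["dims"])
```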
This will work perfectly almost every time. Thanks @sjperkins |
Closes Non-standard MS columns have an auto-generated schema which is not chunked according to logic for standard dimensions #241
Closes Silently fails when providing incorrect schema for WEIGHT column #268
Tests added / passed
If the pep8 tests fail, the quickest way to correct this is to run autopep8, and then flake8 and pycodestyle to fix the remaining issues.
Fully documented, including HISTORY.rst for all changes and one of the docs/*-api.rst files for new API.
To build the docs locally: