Cannot run on custom data #78
@MarlinSchaefer Thanks for raising the issue. I'll see if I get time today to take a look at this. Will send another message when I've figured out what's going on.
@MarlinSchaefer Right, so it's probably a shape issue you're dealing with. I checked my code and I think you may have the detector and sample rate switched around. Your x_data should end up having a shape of (number of training samples, number of parameters to infer), and your y_data_noisy/y_data_noisefree arrays should have the shape (number of training samples, sample rate, number of detectors). Also, I'm not sure if this will break the code, but it's probably safe to also make sure that your test data has an extra dimension at the beginning for the number of test samples (even if you're using just 1 test sample), i.e. (number of test samples, number of parameters to infer) for x_data_test and (number of test samples, sample rate, number of detectors) for y_data_test_noisy/y_data_test_noisefree.
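For reference, a minimal sketch of the shapes described above, using hypothetical sizes (1000 training samples, 9 inferred parameters, sample rate 256, 3 detectors):

```python
# Hedged sketch of the expected array shapes; all sizes are hypothetical.
import numpy as np

n_train, n_pars, sample_rate, n_det = 1000, 9, 256, 3

x_data = np.zeros((n_train, n_pars))                    # (training samples, parameters to infer)
y_data_noisy = np.zeros((n_train, sample_rate, n_det))  # (training samples, sample rate, detectors)
y_data_noisefree = np.zeros((n_train, sample_rate, n_det))

# Test data keeps a leading sample axis, even for a single test sample:
x_data_test = np.zeros((1, n_pars))
y_data_test_noisy = np.zeros((1, sample_rate, n_det))
```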
@hagabbar I've tested switching the channels, so for 1000 signals I now have the shape (1000, 256, 3). I've also tested using an additional dimension for the test data. Doing so causes the code to crash when reshaping the y-data (line 722 in run_vitamin). It seems that load_data assembles the data assuming that it is a single sample.
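A minimal sketch of that axis swap, assuming the original layout was (samples, detectors, samples per timeseries):

```python
# Swap the detector and sample-rate axes: (1000, 3, 256) -> (1000, 256, 3).
import numpy as np

y_data_noisy = np.zeros((1000, 3, 256))              # hypothetical original layout
y_data_noisy = np.transpose(y_data_noisy, (0, 2, 1)) # move detectors to the last axis
assert y_data_noisy.shape == (1000, 256, 3)
```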
I've looked a bit more into this, but I'm having a hard time understanding everything your code does.
@hagabbar I've had a closer look at the errors/code again. I found one problem and was able to resolve it; however, that caused further problems. In line 618 of CVAE_model.py there is the line: Either of them passes the reshape but crashes on line 626.
Marlin, I think this is because the sky parameters output from the decoder are designed to be 3D, in the sense that they are modelled using the von Mises-Fisher distribution, which describes a Gaussian-like blob of probability on the 2-sphere (the sky). The TensorFlow Probability functions model this with a single variance parameter (so a single blob-width on the 2D sky), but use 3 location parameters to define a 3D unit vector pointing towards the centre of the blob. I think we have the decoder output 3 numbers for the location, and then we either normalise it to be a unit vector ourselves or it is normalised inside the von Mises-Fisher function itself. Does this make sense?
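A hedged sketch of that parameterisation, assuming TensorFlow Probability's VonMisesFisher distribution (the raw decoder outputs here are hypothetical, not taken from the VItamin code):

```python
# Sketch: turning three raw decoder outputs into a von Mises-Fisher
# distribution on the 2-sphere.
import tensorflow as tf
import tensorflow_probability as tfp

raw_loc = tf.constant([[0.3, -1.2, 0.8]])  # hypothetical decoder output, shape (batch, 3)
raw_con = tf.constant([2.0])               # hypothetical raw concentration output, shape (batch,)

mean_direction = tf.math.l2_normalize(raw_loc, axis=-1)  # unit vector towards the blob centre
concentration = tf.nn.softplus(raw_con)                  # single positive blob-width parameter

vmf = tfp.distributions.VonMisesFisher(
    mean_direction=mean_direction,  # must be unit-norm
    concentration=concentration)
sample = vmf.sample()               # a point on the unit 2-sphere, shape (1, 3)
```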
When trying to train the CVAE on data I created on my own, I get the error posted below.
Note that I'm trying to run the code in Python 3.7, which is officially not supported. However, I got the same error when trying to run the code locally in Python 3.6. On that local machine, on the other hand, I do not have access to a graphics card and was running on the CPU, which is not supported for training, if I remember correctly.
As it is a reshaping error, I guess it is due to the data format I'm using and that the training data isn't quite in the correct shape. I deduced that your code expects the training data to be of shape
(number training samples, number detectors, number samples per timeseries).
My custom training data contains only the keys
['rand_pars', 'snrs', 'x_data', 'y_data_noisefree', 'y_data_noisy', 'y_normscale']
with respective shapes
[(9,), (1000, 3), (1000, 1, 9), (1000, 3, 256), (1000, 3, 256), ()].
For the test data I deduced that the code expects single samples, and thus the data to be of shape
(number detectors, number samples per timeseries).
My test set therefore contains the same keys as the training set, which have the respective shapes
[(9,), (3,), (1, 9), (3, 256), (3, 256), ()].
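For concreteness, a hypothetical sketch of assembling a training file with exactly these keys and shapes, assuming the data is stored as HDF5 via h5py (the file name and parameter names are placeholders, and random data stands in for real waveforms):

```python
# Hypothetical sketch, not from the VItamin repo: write the keys/shapes above
# to an HDF5 file.
import numpy as np
import h5py

rand_pars = np.array([b'par%d' % i for i in range(9)])  # placeholder parameter names, shape (9,)
snrs = np.random.rand(1000, 3)
x_data = np.random.rand(1000, 1, 9)
y_data_noisefree = np.random.rand(1000, 3, 256)
y_data_noisy = np.random.rand(1000, 3, 256)

with h5py.File('training_data.h5', 'w') as f:
    f.create_dataset('rand_pars', data=rand_pars)
    f.create_dataset('snrs', data=snrs)
    f.create_dataset('x_data', data=x_data)
    f.create_dataset('y_data_noisefree', data=y_data_noisefree)
    f.create_dataset('y_data_noisy', data=y_data_noisy)
    f.create_dataset('y_normscale', data=np.float64(1.0))  # scalar, shape ()
```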
The full error message: