Hello @SamoraHunter,
I am working with your code and was able to run training on my custom data. The results are not good, though; I am getting an extremely high loss. Could I request the data (CSV) you used in the code? I would just like to check whether I still get the same result with it.
Thank you in advance.
Thanks for trying my code.
Unfortunately the raw data is not available at the moment.
The script is also due to be tidied up soon.
The dataset consisted entirely of the same "class" of object, with subtle but real differences/variance in features and noise. It contained 1,000 real samples, plus 3,000 additional samples created by averaging 2-5 randomly chosen samples from the real data. Each sample had roughly 11,000 dimensions, though only around 6,000 of them contained any variance.
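For illustration, here is a minimal sketch of that averaging-based augmentation (not the original preprocessing code; it assumes the data is a NumPy array of shape (n_samples, n_dims), and the function name and defaults are my own):

```python
import numpy as np

def augment_by_averaging(data, n_new=3000, k_range=(2, 5), seed=0):
    """Create synthetic samples by averaging k randomly chosen real samples.

    data    : array of shape (n_samples, n_dims) -- the real dataset
    n_new   : number of synthetic samples to generate
    k_range : inclusive range for how many real samples to average
    """
    rng = np.random.default_rng(seed)
    synthetic = np.empty((n_new, data.shape[1]), dtype=float)
    for i in range(n_new):
        k = rng.integers(k_range[0], k_range[1] + 1)            # pick 2-5 samples
        idx = rng.choice(data.shape[0], size=k, replace=False)  # without replacement
        synthetic[i] = data[idx].mean(axis=0)                   # average them
    return np.vstack([data, synthetic])                         # real + synthetic

# e.g. real.shape == (1000, 11000) -> augment_by_averaging(real).shape == (4000, 11000)
```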
I also attach a rudimentary parameter/architecture search script that was used to choose the number of layers. I also tweaked the level of compression that is propagated through the layers toward and away from the latent dimension. The size of the latent dimension itself can be tweaked too, and the learning rate has a big impact.
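The attached script is not reproduced here, but the idea looks roughly like the sketch below, assuming a plain fully connected autoencoder trained with MSE; the layer counts, compression factors, latent sizes, and learning rates in the grid are illustrative assumptions, not the values actually used:

```python
import itertools
import torch
import torch.nn as nn
import torch.nn.functional as F

def build_autoencoder(input_dim, n_layers, compression, latent_dim):
    """Encoder: each hidden layer is `compression` times the previous width,
    ending at `latent_dim`. The decoder mirrors the encoder."""
    dims = [input_dim]
    for _ in range(n_layers - 1):
        dims.append(max(latent_dim, int(dims[-1] * compression)))
    dims.append(latent_dim)

    def mlp(sizes):
        layers = []
        for a, b in zip(sizes[:-1], sizes[1:]):
            layers += [nn.Linear(a, b), nn.ReLU()]
        return nn.Sequential(*layers[:-1])  # no activation on the final layer

    return nn.Sequential(mlp(dims), mlp(dims[::-1]))

def grid_search(train_x, val_x, epochs=50):
    """Tiny exhaustive search; returns (best validation loss, best config)."""
    grid = itertools.product([2, 3, 4],      # number of encoder layers
                             [0.25, 0.5],    # per-layer compression factor
                             [8, 32, 128],   # latent dimension
                             [1e-3, 1e-4])   # learning rate
    best = (float("inf"), None)
    for n_layers, comp, latent, lr in grid:
        model = build_autoencoder(train_x.shape[1], n_layers, comp, latent)
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = F.mse_loss(model(train_x), train_x)
            loss.backward()
            opt.step()
        with torch.no_grad():
            val_loss = F.mse_loss(model(val_x), val_x).item()
        best = min(best, (val_loss, (n_layers, comp, latent, lr)))
    return best
```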
There is also code that renders your generated meshes in pptk, so you can produce images and examine the effect of parameter changes.
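Something along these lines, assuming each generated sample is a flat vector of xyz triplets (the reshape and the point size are assumptions; `pptk.viewer` and its `capture` method are the library calls used):

```python
import numpy as np
import pptk

def view_mesh(flat_sample, screenshot=None):
    """Render one flattened xyzxyzxyz... sample as a point cloud in pptk."""
    points = np.asarray(flat_sample, dtype=float).reshape(-1, 3)  # (n_points, 3)
    v = pptk.viewer(points)
    v.set(point_size=0.01)
    if screenshot is not None:
        v.capture(screenshot)  # save an image to compare runs/parameter settings
    return v

# e.g. view_mesh(generated[0], screenshot="latent32_lr1e-3.png")
```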
*I'm not 100% sure about the calculation of the loss function, as I tweaked it from an implementation found elsewhere.
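As a general aside on the "extremely high loss" (this is not a description of the loss in this repo, just something worth checking): with roughly 11,000 dimensions per sample, a reconstruction loss that sums over elements reports very large numbers even when the per-element error is small, whereas a mean reduction stays close to the per-element error:

```python
import torch
import torch.nn.functional as F

x = torch.randn(64, 11000)             # a batch of ~11k-dimensional samples
x_hat = x + 0.1 * torch.randn_like(x)  # a fairly good reconstruction

# Summing scales with batch_size * n_dims (64 * 11000 * 0.01 ~ 7000 here),
# while the mean reduction reports the per-element error (~0.01).
print(F.mse_loss(x_hat, x, reduction="sum"))
print(F.mse_loss(x_hat, x, reduction="mean"))
```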
Here is a small sample of the data to give an idea of how similar each sample is:
N.B. some points across the ~11k dimensions were also identical across most samples.
*These are the xyz coordinates, stored flat as xyzxyzxyz...