What exactly does `x` (or, say, `surface`) represent in the forward function of `Autoencoder`? #8
Comments
There are two kinds of points in the reconstruction task: surface points, which are sampled on the shape and fed to the encoder, and query points, which are sampled in the bounding volume and carry inside/outside labels used to supervise the decoder.
Thanks for your reply. So it means you use the surface points (the `surface` variable) as the encoder input. However, you use all the query points in the decoder; don't these query points offer ground-truth information that makes learning the classification labels easier, since they already contain the occupancy information in 3D space? Maybe I misunderstand something. BTW, if I just want to learn an autoencoder model that reconstructs the input point cloud itself, how can I do that?
The learning of neural fields (a.k.a. neural implicit representations, or coordinate-based networks) means representing a shape with a function (an MLP in this case). The query points are just the coordinates at which that function is evaluated; their inside/outside labels serve only as supervision targets in the loss, so they do not hand the decoder the answer.
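For concreteness, here is a minimal sketch of that setup (my own illustration, not this repository's code; `ToyOccupancyAE`, the layer sizes, and the pooling are all made up). Note that the decoder receives only query coordinates, while the inside/outside labels appear only in the loss:

```python
import torch
import torch.nn as nn

class ToyOccupancyAE(nn.Module):
    """Toy stand-in for an occupancy-style autoencoder (illustration only)."""

    def __init__(self, latent_dim=128):
        super().__init__()
        # Per-point MLP encoder followed by max-pooling over the point set.
        self.encoder = nn.Sequential(
            nn.Linear(3, latent_dim), nn.ReLU(), nn.Linear(latent_dim, latent_dim)
        )
        # Field decoder: (latent, query coordinate) -> occupancy logit.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + 3, latent_dim), nn.ReLU(), nn.Linear(latent_dim, 1)
        )

    def forward(self, surface, queries):
        # surface: (B, N, 3) points sampled on the shape -> one latent per shape.
        latent = self.encoder(surface).max(dim=1).values            # (B, D)
        # queries: (B, M, 3) coordinates only -- no occupancy info attached.
        latent = latent.unsqueeze(1).expand(-1, queries.shape[1], -1)
        return self.decoder(torch.cat([latent, queries], dim=-1)).squeeze(-1)

surface = torch.rand(4, 2048, 3)                  # encoder input
queries = torch.rand(4, 4096, 3)                  # decoder input (coords only)
labels = torch.randint(0, 2, (4, 4096)).float()   # ground truth, used in loss only

model = ToyOccupancyAE()
logits = model(surface, queries)                  # (4, 4096)
loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
loss.backward()
```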
For your case (I assume it's a point cloud autoencoder), it's totally different from our task. However, you can still reuse our point cloud encoder and build a decoder with upsampling on top of it. We will release a subset of the datasets. We use OccNet's repository to convert ShapeNet models to watertight meshes first; then we sample on the surfaces to get point cloud representations of the models, and sample labeled (inside/outside) points in the bounding volume. However, if you are only interested in point clouds, just use any polygonal mesh processing library (e.g., trimesh) to do the surface sampling; see the sketch below.
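A minimal data-prep sketch along those lines, assuming a watertight input mesh (the file name, sample counts, and padding factor are placeholders):

```python
import numpy as np
import trimesh

# Load a single watertight mesh (e.g., produced by OccNet's preprocessing).
mesh = trimesh.load("model_watertight.obj", force="mesh")

# Surface sampling: the point cloud representation of the model.
surface_points, _ = trimesh.sample.sample_surface(mesh, 2048)

# Query sampling: uniform points in a slightly padded bounding volume.
lo, hi = mesh.bounds                      # (2, 3): min/max corners of the box
pad = 0.05 * (hi - lo)
queries = np.random.uniform(lo - pad, hi + pad, size=(4096, 3))

# Inside/outside labels; `contains` only gives reliable results on
# watertight meshes, which is why the conversion step comes first.
labels = mesh.contains(queries).astype(np.float32)
```

If you only need the point cloud autoencoder, the `surface_points` alone are enough; the labeled queries are required only for the occupancy task.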
Hi, when I read this code, I found there are two inputs, `x` and `points`, in `Autoencoder`. I found that `x` is the `surface` variable, so what exactly does it mean? Since the reconstruction task just needs a point cloud input, why do we need the `surface` as one of the inputs? Maybe I misunderstand something, as I cannot download such a large dataset.