Annotations / confidence #3
Thank you very much for your interest in our work, and we apologize for the mistakes in the published data. What is currently released is the first version of the dataset, and it contains some errors. For the sessions with incorrect camera parameters, we roughly checked the data and recently uploaded a new file.
Thanks for the reply; I'm looking forward to later updates. There aren't many large-scale in-the-wild datasets like this available, so it's quite exciting!
@wangjiongw After some more checking, it seems the camera calibrations have errors. In some cases the horizontal and vertical focal lengths are specified as very different values, which does not make sense for these images, where the pixels are very nearly square. It could be an issue with processing the checkerboard, perhaps.
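For anyone else triaging the released parameters, a quick sanity check along these lines may help. This is a rough sketch that assumes the intrinsics are stored as "fx" and "fy" entries in a JSON file per camera; the actual file layout and key names in the dataset may differ, so adjust the parsing accordingly.

```python
import json
from pathlib import Path

def flag_anisotropic_intrinsics(calib_path: str, max_ratio: float = 1.05) -> bool:
    """Return True if fx and fy differ more than expected for square pixels.

    Assumes a JSON calibration file with "fx" and "fy" keys (hypothetical
    layout); with square pixels the two focal lengths should be nearly equal.
    """
    params = json.loads(Path(calib_path).read_text())
    fx, fy = float(params["fx"]), float(params["fy"])
    ratio = max(fx, fy) / min(fx, fy)
    return ratio > max_ratio

# Example: list sessions whose calibration looks implausible.
for calib_file in sorted(Path("calibrations").glob("*.json")):
    if flag_anisotropic_intrinsics(str(calib_file)):
        print(f"suspect calibration: {calib_file}")
```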
Thanks for the dataset again.
When visualizing the data, I noticed that many labels are incorrect, likely because the triangulation or camera calibration failed. Is there any way to obtain confidence values to filter these out? (One possible workaround is sketched after this comment.)
Example on 20220618_1b492fd601_subj22/c02:
Thanks!
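In the absence of released confidence values, one workaround is to treat the reprojection error of each triangulated joint as a confidence proxy and discard frames whose error is large in any view. Below is a minimal sketch using OpenCV; it assumes per-camera intrinsics `K`, distortion coefficients `dist`, world-to-camera extrinsics `R`, `t`, detected 2D keypoints, and triangulated 3D joints are already loaded. The variable names, dictionary layout, and the 20-pixel threshold are illustrative assumptions, not part of the dataset's API.

```python
import cv2
import numpy as np

def reprojection_errors(points_3d, points_2d, K, dist, R, t):
    """Per-joint pixel error between projected 3D joints and observed 2D keypoints.

    points_3d: (J, 3) triangulated joints in world coordinates.
    points_2d: (J, 2) detected keypoints in this camera view.
    K, dist, R, t: intrinsics, distortion, and world-to-camera extrinsics.
    """
    rvec, _ = cv2.Rodrigues(R)
    projected, _ = cv2.projectPoints(points_3d.astype(np.float64), rvec, t, K, dist)
    projected = projected.reshape(-1, 2)
    return np.linalg.norm(projected - points_2d, axis=1)

def keep_frame(points_3d, views, max_px_error=20.0):
    """Accept a frame only if the mean reprojection error is small in every view.

    `views` is a list of dicts with keys points_2d, K, dist, R, t
    (a hypothetical layout; adapt to however the data is actually stored).
    """
    for v in views:
        err = reprojection_errors(points_3d, v["points_2d"],
                                  v["K"], v["dist"], v["R"], v["t"])
        if np.nanmean(err) > max_px_error:
            return False
    return True
```

This also doubles as a calibration check: if a session fails the threshold in almost every frame, the camera parameters themselves (rather than individual triangulations) are probably at fault.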