WebXR Depth Camera #5848
Conversation
This PR is ready for review. Added an example for object placement using depth information. Bear in mind that it is only available when depth sensing uses the CPU path. The GPU path does not provide such functionality, but it could potentially be emulated by reading a pixel from the depth texture in a separate PR. This is a useful technique for more precise object placement when hit-test does not provide reliable information. I've also used both techniques together, picking the closest point, which gave better results for reliable object placement.
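A minimal sketch of that combination, assuming a hit-test result already resolved into `hitTestPosition`, a screen-center UV of (0.5, 0.5), and the per-view `getDepth` access this PR introduces; any name not in the PR description is illustrative:

```javascript
// Pick the closer of the hit-test point and the CPU depth sample.
// `camera` is the XR camera entity; `hitTestPosition` (pc.Vec3) is assumed
// to come from a regular XrHitTest result.
const cameraPos = camera.getPosition();

// distance reported by the hit-test path
const hitDist = hitTestPosition.distance(cameraPos);

// distance from depth sensing at the center of the first view (CPU path only)
const view = app.xr.views.list[0];
const depthDist = view ? view.getDepth(0.5, 0.5) : null;

// prefer whichever estimate is closer to the camera
const dist = (depthDist !== null && depthDist < hitDist) ? depthDist : hitDist;
entity.setPosition(cameraPos.clone().add(camera.forward.clone().mulScalar(dist)));
```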
mipmaps: false,
addressU: ADDRESS_CLAMP_TO_EDGE,
addressV: ADDRESS_CLAMP_TO_EDGE,
minFilter: FILTER_LINEAR,
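For context, a self-contained sketch of creating a texture with these flags via the engine's public `pc.Texture` options; the values surrounding the diff lines (format, magFilter) are assumptions, not taken from the PR:

```javascript
// Hypothetical reconstruction of the texture setup around the diff above.
const texture = new pc.Texture(app.graphicsDevice, {
    format: pc.PIXELFORMAT_L8A8,        // assumption: raw 16-bit depth values
    mipmaps: false,
    addressU: pc.ADDRESS_CLAMP_TO_EDGE,
    addressV: pc.ADDRESS_CLAMP_TO_EDGE,
    minFilter: pc.FILTER_LINEAR,
    magFilter: pc.FILTER_LINEAR         // assumption: matches minFilter
});
```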
does it make sense to use linear interpolation on the depth?
The depth texture always comes at a lower resolution than the framebuffer resolution of each eye, so interpolation helps to smooth out the edges. But now that you mention it: if we don't upload an actual texture, then we are not setting the texParameteri of it, right? So these flags pretty much do nothing?
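If that is the case, the flags could presumably be applied manually on the externally provided handle; a raw WebGL sketch, assuming `gl` and the XR-supplied `externalGlTexture` are in scope:

```javascript
// Apply filtering/wrapping state directly, since the engine's upload path
// (which normally translates these flags into texParameteri calls) never
// runs for an externally provided texture.
gl.bindTexture(gl.TEXTURE_2D, externalGlTexture);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
```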
I think you might be right here.
if (this._textureColor) {
    this._textureColor.destroy();
    this._textureColor = null;
}

if (this._textureDepth) {
    this._textureDepth.destroy();
So we override `this._textureDepth.impl._glTexture` and then destroy it.
- Was _glTexture already allocated before? I suspect not, so we're not leaking it.
- Can we destroy a _glTexture provided by XR?
We indeed don't allocate it, but for safety, when the view is destroyed (at the end of the session) I destroy that texture as well. Subsequent sessions work well, so it seems to be fine.
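If destroying the XR-provided handle ever proved unsafe, one hypothetical alternative would be to detach it before destroying the wrapper, so `destroy()` only releases engine-side state; `impl._glTexture` is the internal field named above, and this sketch is not part of the PR:

```javascript
// Detach the externally owned GL handle before destroying the pc.Texture,
// leaving the XR runtime in charge of that handle's lifetime.
this._textureDepth.impl._glTexture = null;
this._textureDepth.destroy();
this._textureDepth = null;
```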
Very nice PR @Maksims. I added a few minor comments / questions, but this is pretty much ready for merge. Thanks!
Thank you!
This PR depends on #5786 (Camera Access).

With the release of Quest 3, the Depth Sensing API has been added to its browser, which also affected the specs. Previously, only one type of device was capable of providing depth information: Android phones, which assume a single monoscopic view. So XrDepthSensing was designed around the assumption of a single view. Quest 3 now exposes depth data per view, which means multiple depth textures. This requires a redesign of how depth data is accessed, so as part of this API we are deprecating XrDepthSensing in favor of accessing depth information through XrView.
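A minimal sketch of the per-view access pattern described above, assuming an `app.xr.views.list` accessor and a per-view `textureDepth` property; the names follow this PR's direction but may differ from the final API:

```javascript
// Each XR view can now carry its own depth texture (e.g. one per eye on
// Quest 3), so depth is read per view rather than from one global source.
app.xr.on('update', () => {
    for (const view of app.xr.views.list) {
        if (!view.textureDepth) continue;   // depth not available on this view
        material.setParameter('depthMap', view.textureDepth);
    }
});
```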
This API is fairly low-level. It also provides real-time, depth-based hit testing via the getDepth method, which is more precise than XrHitTest and reacts to moving objects in the real world.
In a future PR I will add a material flag for "Depth Occlusion" that will automatically occlude virtual geometry with real-world geometry; the underlying implementation will use Depth Sensing, so for users it is just a matter of ticking a checkbox.
New APIs:
Examples:
TODO:
I confirm I have read the contributing guidelines and signed the Contributor License Agreement.