Eye tracking (for OSC etc) #21
@shinyquagsire23 In Accessibility settings, we can enable eye-based pointer control, and a white pointer appears on screen, driven by the eyes just like the iPad mouse pointer. Could we obtain this pointer's position by placing a virtual screen and capturing its mouse events? https://developer.apple.com/documentation/realitykit/arview/mousemoved(with:)
I wish lol. None of the accessibility stuff even seems to work in immersive mode: keyboard-based pinch events report (0, 0, 0) as their gaze ray, and we only have one accessible event handler because there are no windows.
@shinyquagsire23 I can still see a mouse pointer in immersive mode. I will test whether it delivers correct mouse input.
Yes, it's visible, but there's no flat plane to get the movements from. Frankly, I doubt even flat-plane apps get them.
Could we put a huge cube in the scene to capture it?
Technically maybe, but the flip side is that I don't know whether those movements can be delivered to panels at all, and Apple doesn't allow programmatically moving windows, so the user would have to position them by hand :/
Possibly relevant docs:
https://developer.apple.com/documentation/compositorservices/4082136-cp_drawable_get_rasterization_ra
https://developer.apple.com/documentation/metal/mtlrasterizationratemap
The rasterization rate map might leak some basic eye tracking, since foveated rendering concentrates sample density around the gaze point. Needs investigation, and maybe pestering Apple for an actual permission + API if the above doesn't work.
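If the rate map really does follow foveation, recovering a rough gaze estimate would just be a peak search over the map's sample density. Here is a minimal, hypothetical sketch of that idea in plain Python, with a synthetic Gaussian density grid standing in for a real `MTLRasterizationRateMap` (the grid, the Gaussian model, and the function name are all assumptions for illustration, not the actual Metal API):

```python
import math

def estimate_gaze(density):
    """Return the normalized (x, y) location of the density peak.

    `density` is a 2D list of per-tile sample densities. The assumption
    (unverified) is that foveated rendering puts the densest tiles at
    the eye's fixation point, so the argmax approximates the gaze.
    """
    h, w = len(density), len(density[0])
    _, ix, iy = max(
        (density[y][x], x, y) for y in range(h) for x in range(w)
    )
    return ix / (w - 1), iy / (h - 1)

# Synthetic stand-in for a rasterization rate map: a Gaussian bump of
# density centered on a pretend gaze point at (0.7, 0.3).
h, w = 64, 64
gaze = (0.7, 0.3)
density = [
    [
        math.exp(-(((x / (w - 1)) - gaze[0]) ** 2
                   + ((y / (h - 1)) - gaze[1]) ** 2) / 0.02)
        for x in range(w)
    ]
    for y in range(h)
]

print(estimate_gaze(density))  # recovers approximately (0.7, 0.3)
```

On device, the density grid would have to be derived from the drawable's actual rate map (e.g. by comparing physical and screen coordinates per tile), which is exactly the part that needs investigation.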