Currently, all axis data is normalized either signed (-1.0 to 1.0) or unsigned (0.0 to 1.0), except for IMU data, which has custom scales applied in source/target devices to make them feel somewhat close. Instead of doing this, we should refactor all IMU implementations to normalize/denormalize the values so that any source device feels consistent on any target device.
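A minimal sketch of what normalize/denormalize for IMU angular velocity could look like. The full-scale constants and function names here are illustrative assumptions, not values from the codebase:

```python
# Assumed full-scale ranges for illustration only.
SOURCE_MAX_DPS = 2000.0  # e.g. a source gyro configured for +/-2000 deg/s
TARGET_MAX_DPS = 500.0   # e.g. a target device expecting +/-500 deg/s

def normalize(raw_dps: float, max_dps: float = SOURCE_MAX_DPS) -> float:
    """Map a raw angular velocity (deg/s) into the signed -1.0..1.0 range."""
    value = raw_dps / max_dps
    return max(-1.0, min(1.0, value))  # clamp out-of-range samples

def denormalize(value: float, max_dps: float = TARGET_MAX_DPS) -> float:
    """Map a normalized -1.0..1.0 value back into the target's deg/s scale."""
    return value * max_dps

# A 1000 deg/s reading on the source becomes 250 deg/s on the target,
# i.e. the same fraction of full scale on both devices.
rate = denormalize(normalize(1000.0))
```

With this, the pipeline between source and target only ever sees -1.0..1.0 values, matching how the other axes already behave.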
This normalization is useful for enabling features like a gyro mouse or a gyro joystick.
A big part of the translation from 3D orientation to 2D vector would be choosing the plane onto which you'd project your values, e.g. how gyro aiming worked in Ocarina of Time 3D on the 3DS, or how pointing worked on the Google Daydream controller.
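One way to sketch that projection step: drop the component of the device's pointing vector along a chosen plane normal, then read off 2D coordinates in that plane. This is an illustrative sketch under assumed conventions (screen plane with normal +z), not the project's implementation:

```python
def dot(a, b):
    """Dot product of two 3D vectors given as tuples."""
    return sum(x * y for x, y in zip(a, b))

def project_onto_plane(v, normal):
    """Remove the component of v along the (unit-length) plane normal,
    leaving the part of v that lies in the chosen plane."""
    d = dot(v, normal)
    return tuple(vi - d * ni for vi, ni in zip(v, normal))

def to_screen_xy(v):
    """For a screen plane with normal +z, the 2D vector is just (x, y)
    of the projected vector. Choosing a different plane (e.g. a tilted
    'comfortable wrist angle' plane) changes how aiming feels."""
    px, py, _ = project_onto_plane(v, (0.0, 0.0, 1.0))
    return (px, py)
```

The choice of `normal` is exactly the design decision described above: different planes give different aiming feels for the same physical motion.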
The mechanic I see a lot is having a "reset orientation" trigger which sets your current orientation as "forward". Is there a way to specify an action to toggle the gyro mapping that could work for this, and a place to store the "forward" vector that persists between translations?
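A hedged sketch of how that persistent "forward" state could work, using quaternions: a reset action captures the current orientation, and later readings are expressed relative to it. The `GyroMapper` name and the (w, x, y, z) representation are assumptions for illustration:

```python
def quat_conjugate(q):
    """Conjugate (inverse, for unit quaternions) of (w, x, y, z)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    )

class GyroMapper:
    """Holds a persistent 'forward' orientation between translations;
    a 'reset' action captures the current pose as the new forward."""
    def __init__(self):
        self.forward = (1.0, 0.0, 0.0, 0.0)  # identity: no stored offset

    def reset(self, current):
        self.forward = current  # current pose becomes "forward"

    def relative(self, current):
        # Rotation from the stored forward pose to the current pose;
        # this is what gets projected down to a 2D vector.
        return quat_mul(quat_conjugate(self.forward), current)
```

Immediately after a reset, `relative()` returns (approximately) the identity rotation, so the cursor/stick sits centered until the device moves again.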
Quote from @NeroReflex while discussing in Discord:
There are like 6 algorithms generally used in games, and sadly none of them works really well for every game type
You will need to have multiple translations available. The trivial one of discarding z is not that good
Valve started with one that works well enough and is adding more, because a single one simply cannot make everybody happy
If you search from the perspective of a game dev, you will find multiple results on gyro-to-controls translation, with a different suggested algorithm for each type of game
For example, a popular approach (which I absolutely dislike) is using the derivative of the rotation to do the translation
Or simply changing the sensitivity of the translation
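The rate-based approach mentioned in the quote (using the derivative of rotation, i.e. angular velocity, scaled by a sensitivity factor) could be sketched roughly like this; the function name, units, and default sensitivity are assumptions for illustration:

```python
def gyro_to_pointer_delta(yaw_dps, pitch_dps, dt_s, sensitivity=8.0):
    """Convert yaw/pitch angular velocity (deg/s) into a 2D pointer delta
    for one frame of duration dt_s, scaled by a tunable sensitivity
    (pixels per degree of rotation). This is the 'derivative of the
    rotation' translation: it maps rate of turn to rate of movement."""
    dx = yaw_dps * dt_s * sensitivity
    dy = pitch_dps * dt_s * sensitivity
    return dx, dy
```

Swapping the algorithm then amounts to swapping this translation function, which fits the "multiple translations available" suggestion above.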
Personally, I plan to use this for navigating my screen rather than for FPS games, so maybe we could start with an algorithm that works for that and go from there.