What are the sensor requirements to make smartphones ARCore-ready? #594

Closed
nathgilson opened this issue Oct 4, 2018 · 2 comments
Labels: device support (questions/issues with device enablement), question

Comments

nathgilson commented Oct 4, 2018

In threads #51 and #49 we learn that "Certifying a device for ARCore requires a large amount of calibration work looking at the camera geometry and behavior, IMU behavior, relative position of the two, and most critically the relative timing of the two."

Would it be possible to have full, detailed documentation of the hardware specifications and calibration requirements?

@inio added the question and device support (questions/issues with device enablement) labels on Oct 4, 2018
inio commented Oct 4, 2018

The biggest things a device manufacturer can do to make enabling ARCore as easy as possible are (in decreasing order of how often they have been issues):

  • By far the most common issue: Timestamps of camera images and motion sensor data should be correct and referenced to CLOCK_BOOTTIME. Timestamps should be consistent across all power modes, CPU load, system configuration options, multi-month uptimes, and build types (userdebug vs user). Be careful about which exact point during the integration of a camera frame the timestamp should reference per the Android specifications (I believe beginning of integration of the top-left pixel, but I'm not 100% sure on that) and make sure the timing-related metadata fields (exposure time, rolling shutter skew, etc.) are correctly filled in. A rough consistency check is sketched in the first snippet after this list.
  • When multiple surfaces with different aspect ratios are attached to a camera capture session, the camera driver should maximize the covered sensor area. For example, if VGA (4:3) and 1080p (16:9) surfaces are attached to a camera with a 4:3 sensor, the sensor should be configured to read out the full sensor area with the 16:9 image being cropped out of that. We've gotten good at working around this one but it costs power.
  • The device should support Sustained Performance Mode and be able to sustain performance in this mode even when enclosed in a typical phone case (see the second snippet after this list).
  • The camera should provide consistent focus in fixed focus mode regardless of device orientation and temperature.
  • The device should have a high quality IMU providing uncalibrated accelerometer and gyroscope streams with as close to raw sensor readings as possible (minimizing filtering done in the IMU itself and eliminating anything done CPU side).
  • Mechanical design and assembly procedures should ensure that the camera module and IMU chip are consistently positioned and oriented relative to each other.
  • The camera lens should be designed to minimize changes in distortion when adjusting focus.
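
As a rough illustration of the timestamp point above, here is a minimal sketch (assuming the camera2 API; the class name is illustrative, not part of ARCore) that logs a frame's SENSOR_TIMESTAMP against SystemClock.elapsedRealtimeNanos() (CLOCK_BOOTTIME) together with the timing-related metadata fields:

```java
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.CaptureResult;
import android.hardware.camera2.TotalCaptureResult;
import android.os.SystemClock;
import android.util.Log;

public class TimestampCheckCallback extends CameraCaptureSession.CaptureCallback {
    private final boolean timestampIsRealtime;

    public TimestampCheckCallback(CameraCharacteristics characteristics) {
        // REALTIME means SENSOR_TIMESTAMP is on the same clock as
        // SystemClock.elapsedRealtimeNanos() (CLOCK_BOOTTIME). UNKNOWN means it is
        // not directly comparable to the motion sensor timestamps.
        Integer source = characteristics.get(CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE);
        timestampIsRealtime = source != null
                && source == CameraMetadata.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME;
    }

    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Long frameTimestampNs = result.get(CaptureResult.SENSOR_TIMESTAMP);
        Long exposureTimeNs = result.get(CaptureResult.SENSOR_EXPOSURE_TIME);
        Long rollingShutterSkewNs = result.get(CaptureResult.SENSOR_ROLLING_SHUTTER_SKEW);
        long nowNs = SystemClock.elapsedRealtimeNanos();

        if (!timestampIsRealtime || frameTimestampNs == null) {
            Log.w("TimestampCheck", "Camera timestamps are not referenced to CLOCK_BOOTTIME");
            return;
        }
        // On a well-behaved device the frame timestamp lies slightly in the past
        // (capture plus pipeline latency), never in the future and never off by seconds.
        long latencyMs = (nowNs - frameTimestampNs) / 1_000_000;
        Log.d("TimestampCheck", "frame latency=" + latencyMs + "ms"
                + " exposure=" + exposureTimeNs + "ns"
                + " rollingShutterSkew=" + rollingShutterSkewNs + "ns");
    }
}
```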

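And a minimal sketch of opting into Sustained Performance Mode where the device advertises support (API 24+; the activity name is illustrative):

```java
import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.os.PowerManager;

public class ArActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
        if (pm != null && pm.isSustainedPerformanceModeSupported()) {
            // Asks the platform for a clock configuration it can hold indefinitely,
            // instead of bursting and then thermally throttling mid-session.
            getWindow().setSustainedPerformanceMode(true);
        }
    }
}
```
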
If you are a device manufacturer, additional details may be available by contacting your Android Tech Account Manager. Doing this early can save a lot of time later.

@inio closed this as completed Oct 22, 2018
Galyean commented Dec 7, 2018

@inio In fact, Android provides sensor events such as "TYPE_ACCELEROMETER_UNCALIBRATED" and "TYPE_GYROSCOPE_UNCALIBRATED". But as far as I can see from the documentation, the data from these uncalibrated sensor events may still not be very close to raw sensor readings, since filtering done in the IMU cannot be avoided. So how can I get raw IMU sensor readings on an Android device? Looking forward to your reply.
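
For reference, a minimal sketch of subscribing to those uncalibrated streams (class name illustrative). Note this only selects the least-processed streams the sensor HAL exposes; it does not remove whatever filtering the IMU itself applies:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class RawImuListener implements SensorEventListener {
    public static void register(SensorManager sensorManager, RawImuListener listener) {
        // TYPE_ACCELEROMETER_UNCALIBRATED requires API 26; TYPE_GYROSCOPE_UNCALIBRATED, API 18.
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER_UNCALIBRATED);
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE_UNCALIBRATED);
        // SENSOR_DELAY_FASTEST requests the highest rate the HAL exposes; how close that
        // is to the raw IMU output is ultimately up to the device's sensor HAL.
        if (accel != null) {
            sensorManager.registerListener(listener, accel, SensorManager.SENSOR_DELAY_FASTEST);
        }
        if (gyro != null) {
            sensorManager.registerListener(listener, gyro, SensorManager.SENSOR_DELAY_FASTEST);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.timestamp is in nanoseconds on the same CLOCK_BOOTTIME base the camera
        // timestamps are expected to use. For the uncalibrated types, values[0..2] are
        // the measured readings and values[3..5] carry the estimated bias.
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
}
```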
