How to test your camera records in‐sync
This section describes how to test whether your camera is suitable for WAVE audio testing. For the §8 audio-only tests and the §9 combined audio and video tests, the camera is required to record audio and video jointly, and the recording must be in sync. There should not be a large offset in the joint recording, as this might cause false negative results. Please follow these instructions to test your camera before running the test suite; it is recommended to repeat this test whenever the camera settings or set-up are changed.
- Ensure you have followed the audio set-up instructions here.
- To test that the camera can capture audio and video in-sync you will need:
- A/V sync test media with “flashes” and “beeps” available to download from: test media
- The following tools installed on your PC:
- ffmpeg – to process the video to individual frames.
- MediaInfo – to check the precise frame rate of the recording.
- ffplay – for playing the test media. It is part of the ffmpeg package; installing ffmpeg will install ffplay by default.
- Sonic Visualizer – audio processing tool to detect the time at which the beeps occur.
- Ensure that the PC is not running any other programs while playing the test media.
- Connect the camera to a power supply or ensure it is fully charged (if running on battery).
- Set the audio volume of the device under test (DUT) to a high level to enable clear recording via the DUT’s wired output.
- Play the A/V sync test media on the PC using ffplay and record it with the camera.
- Check the recording using ffplay and MediaInfo to ensure what has been captured is correct (a scripted check is sketched after this list):
- Contains audio recorded jointly with video.
- Video is clearly captured at around 120fps and 1080p Full HD.
- Audio is 48kHz and 16-bit.
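If you prefer a scripted check, the following is a minimal sketch that reads the same properties with ffprobe (installed alongside ffmpeg). The file name recording.MP4 and the accepted frame-rate range are assumptions for illustration; MediaInfo gives the same information interactively.

```python
# Minimal sketch (assumption: ffprobe, installed with ffmpeg, is on PATH and
# the recording is called "recording.MP4" - replace with your own file name).
import json
import subprocess

def probe(path: str) -> dict:
    # Ask ffprobe for a JSON description of all streams in the recording.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_streams", "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)

streams = probe("recording.MP4")["streams"]
video = next(s for s in streams if s["codec_type"] == "video")
audio = next(s for s in streams if s["codec_type"] == "audio")

num, den = (int(x) for x in video["r_frame_rate"].split("/"))
fps = num / den

print(f"video: {video['width']}x{video['height']} @ {fps:.3f} fps")  # expect ~1920x1080 @ ~120 fps
print(f"audio: {audio['sample_rate']} Hz ({audio['sample_fmt']})")   # expect 48000 Hz

# The accepted frame-rate range below is an assumption ("around 120 fps");
# check the audio bit depth (16-bit) in MediaInfo if ffprobe does not report it.
assert video["width"] == 1920 and video["height"] == 1080
assert 115 <= fps <= 125
assert int(audio["sample_rate"]) == 48000
```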
Follow these steps to process the video, check where the flashes are captured, and calculate their time in the recording. Ignore the flash at the beginning of the recording (at 0 seconds in the test media), because some players might not play back the first flash correctly.
- Use ffmpeg to extract the frames in the recording to images (png format) with the following command:
ffmpeg -i .\<file_name>.MP4 %04d.png
- Open the folder that contains the extracted images and locate the images where a "flash" is captured. When the flash is captured, the image changes from black to grey to bright white, then back to grey and black.
- Find the first grey image and note its file name (frame number), e.g. frame 234 in the example below.
- Repeat for at least 3 flashes in consecutive order (ensuring you do not miss any), each time noting the starting "grey" frame number.
- Calculate the "flash" start time in the recording (a scripted version of this calculation is sketched after this list):
- Find the exact frame rate of the recording to be able to make a correct calculation, as some cameras might not capture at exactly 120 fps (e.g. 119 fps, 119.8 fps, …).
- record_frame_duration_in_ms = 1000 / frame_rate
- flash_time = (frame_num - 1) x record_frame_duration_in_ms
- In the example recording, the camera frame rate is 119.880 fps (120000/1001 fps) and the frame number when the flash starts is 234:
- record_frame_duration_in_ms = 1000 / (120000/1001) = 1001 / 120
- flash_time = (234 - 1) x 1001 / 120 ≈ 1944ms
NOTE Perform this calculation for each of the flashes identified above.
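The calculation above can be scripted as follows. This is a minimal sketch: the frame rate is the example value of 120000/1001 fps, and the frame list contains only the documented example frame 234; add your own readings.

```python
# Minimal sketch of the flash-time calculation above.
# frame_rate comes from MediaInfo/ffprobe; the frame numbers are the first
# "grey" frame of each flash. Frame 234 is the documented example - extend
# the list with your own readings (at least 3 consecutive flashes).
frame_rate = 120000 / 1001          # 119.880 fps in the example
frame_numbers = [234]

# The ffmpeg command above numbers the extracted images from 0001, so
# subtracting 1 converts a frame number into elapsed frames.
record_frame_duration_in_ms = 1000 / frame_rate
flash_times = [(n - 1) * record_frame_duration_in_ms for n in frame_numbers]

for n, t in zip(frame_numbers, flash_times):
    print(f"flash starting at frame {n}: {t:.0f} ms")   # frame 234 -> 1944 ms
```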
Follow these steps to process the audio, check where the beeps are captured, and calculate their time in the recording. Ignore the beep at the beginning (at 0 seconds in the test media), because some players might not play back the first beep correctly.
- Open the recording file in Sonic Visualizer.
- Process only the left channel, which is shown in the top half of the Sonic Visualizer window.
- Find the start of the audio wave for the "beep" and note down the time.
NOTE You can zoom in to see the exact time.
- Repeat for at least 3 beeps in consecutive order (ensuring you do not miss any), each time noting the start time (an optional scripted cross-check is sketched after this list).
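If you want to cross-check the manual Sonic Visualizer readings, the following is a rough, optional sketch that finds beep onsets in the left channel by amplitude thresholding. It assumes the audio has first been extracted to a 16-bit PCM WAV file with ffmpeg; the 20% threshold and the 100 ms quiet gap are arbitrary assumptions that may need tuning for your recording.

```python
# Hedged alternative to the manual Sonic Visualizer inspection: locate beep
# onsets in the left channel by simple amplitude thresholding.
# Assumes the audio was first extracted to 16-bit PCM WAV, e.g.:
#   ffmpeg -i recording.MP4 recording.wav
import wave
import numpy as np

with wave.open("recording.wav", "rb") as w:
    assert w.getsampwidth() == 2            # 16-bit PCM expected
    rate = w.getframerate()
    n_channels = w.getnchannels()
    pcm = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)

left = pcm[::n_channels]                    # left channel of interleaved audio
loud = np.abs(left) > 0.2 * np.abs(left).max()   # threshold is an assumption

# A new beep starts where a loud sample follows at least 100 ms of quiet.
beep_starts_ms = []
last_loud = -rate
for i in np.flatnonzero(loud):
    if i - last_loud > rate // 10:
        beep_starts_ms.append(1000 * i / rate)
    last_loud = i

# Remember to ignore the beep at the very beginning of the test media.
print([round(t) for t in beep_starts_ms])   # compare against Sonic Visualizer
```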
- Finally, calculate the offset between the flash time (from the video steps above) and the beep time (from the audio steps above); a scripted version is sketched after this list.
- offset (ms) = flash_time - beep_time
- The offset should be within the tolerance [-120,40]ms.
- Repeat this for at least 2 more flash-and-beep pairs in consecutive order.
- In the example:
- Offset_1 = 1944 – 1971 = -27ms (within tolerance)
- Offset_2 = 2911 – 2961 = -50ms (within tolerance)
- Offset_3 = 3912 – 3951 = -39ms (within tolerance)
- … …
- All offsets should be within the tolerance [-120, 40]ms.
- More consecutive flash-and-beep pairs that meet the requirement will give more confidence in the camera.
- All flash-and-beep pairs should be in sync within the tolerance (this assumes that the player and device are not introducing an offset): one failure means the camera test has failed.
- If there are one or more failures:
- Play the recording back in ffplay to check whether any system sounds (e.g. email notifications) were recorded at the point of failure.
- We suggest repeating the test multiple times before deeming the camera unsuitable for running the WAVE tests. Playing the A/V sync test content on another PC/device (a TV can be used) or with another player will help to rule out the device/player as the cause of the offset failure.
- If the same camera fails multiple tests, then it cannot be used as it will cause false results from the Observation Framework.
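The offset check can likewise be scripted. This minimal sketch uses the example flash and beep times from this page and the flash_time - beep_time convention used in the example offsets; substitute your own measurements.

```python
# Minimal sketch of the offset check above. The flash and beep times are the
# example values from this page; replace them with your own measurements.
flash_times_ms = [1944, 2911, 3912]
beep_times_ms  = [1971, 2961, 3951]

TOLERANCE_MS = (-120, 40)

for i, (flash, beep) in enumerate(zip(flash_times_ms, beep_times_ms), start=1):
    offset = flash - beep                   # offset (ms) = flash_time - beep_time
    ok = TOLERANCE_MS[0] <= offset <= TOLERANCE_MS[1]
    print(f"Offset_{i} = {offset} ms -> {'within tolerance' if ok else 'FAIL'}")

# One failure means the camera test has failed; repeat the test before deeming
# the camera unsuitable (see the notes above).
```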