How to Measure E2E Latency

Asharam Seervi edited this page Apr 10, 2023 · 17 revisions

Attention: We have migrated our documentation to our new platform, Ant Media Resources. Please follow this link for the latest documentation.

Methodology

Here is a way to measure E2E latency between publisher and player.

  1. Draw the timestamp (publish time) onto the stream canvas while broadcasting the stream.
  2. Draw the timestamp (play time) onto the stream canvas while playing the stream.
  3. Extract the publish and play time text using OCR (Optical Character Recognition).
  4. Calculate the E2E latency by subtracting the publish time from the play time.
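The subtraction in step 4 can be sketched as follows. This is a minimal illustration; the function name is hypothetical and not part of the SDK:

```javascript
// Hypothetical sketch: both timestamps are epoch milliseconds that were
// drawn onto the video frames and later extracted via OCR.
function e2eLatencyMs(publishTimeMs, playTimeMs) {
  // E2E latency = time the frame was shown - time the frame was produced
  return playTimeMs - publishTimeMs;
}

// Example: a frame stamped at publish and read back 500 ms later on the player
console.log(e2eLatencyMs(1612775365060, 1612775365560)); // → 500
```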

Guide: Measuring End-to-End Latency

Follow the step-by-step guide below to calculate the E2E latency with Amazon Rekognition or Google's Vision API.

Before starting, here are the required pages' source files; they work with our web SDK:

  • publish_with_timestamp.html: The page that draws the timestamp (publish time) on the stream canvas while broadcasting the stream. This page ships with Ant Media Server v2.3.0+ along with the other samples.
  • player_with_timestamp.html: The page that draws the timestamp (play time) on the stream canvas while playing the stream. It also calls the OCR API and calculates the latency. This page ships with Ant Media Server v2.3.0+ along with the other samples.

1. Sync devices with a time server

Both the publisher and player devices must be time-synchronized for the calculated difference to be meaningful. We use NTP for the tests; it is the default time provider on most widely used operating systems. If a time server can't be used (as is often the case on mobile devices), you can synchronize the devices manually via player_with_timestamp.html. Check the images below.

Manual Sync

  • Find the offset on the publisher and player devices. We've used AtomicClock to find the offset.

(Screenshot: AtomicClock comparing NTP time with the local system clock)

Here the local device is ahead of NTP by 291 milliseconds: the difference between 11:09:25.060 (NTP) and 11:09:25.351 (System Clock, at the bottom of the image). If this device is the publisher, the publisher offset is -291. If the device were instead behind NTP by 291 milliseconds, the offset would be 291, without the negative sign.

  • After you check the time difference manually against a time server, enter the offsets of the publisher and player devices on the player page. If a device's clock is ahead of NTP time, its offset is negative; otherwise it is positive. On the page you will see separate publisher and player offset fields.
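Conceptually, the player page can use these offsets to convert both local timestamps to NTP time before subtracting. A minimal sketch of that correction (variable and function names are illustrative, not the page's actual code):

```javascript
// Offsets are in milliseconds, following the convention above:
// negative when the device clock is ahead of NTP, positive when behind.
function correctedLatencyMs(publishTimeMs, publisherOffsetMs,
                            playTimeMs, playerOffsetMs) {
  // Adding the offset converts each local timestamp to NTP time.
  const publishNtp = publishTimeMs + publisherOffsetMs;
  const playNtp = playTimeMs + playerOffsetMs;
  return playNtp - publishNtp;
}

// Publisher clock is 291 ms ahead of NTP (offset -291), player is in sync:
console.log(correctedLatencyMs(1000, -291, 1200, 0)); // → 491
```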

(Screenshot: publisher and player offset fields on the player page)

2. Set up OCR: choose AWS Rekognition or Google's Vision API

Enter your credentials and region here:

(Screenshot: credentials and region fields on the player page)

After authentication is done, run the following command from the terminal:

gcloud auth application-default print-access-token

The response is a token, which should be entered into the Vision Token box of player_with_timestamp.html.
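Under the hood, the player page sends each captured frame to the Vision API's images:annotate endpoint with that token. A hedged sketch of how such a request could be built (not the page's actual code; helper names are hypothetical):

```javascript
// Builds the JSON body for a Google Vision text-detection request on one
// frame. `base64Frame` would come from canvas.toDataURL() with the
// "data:image/png;base64," prefix stripped.
function buildVisionRequest(base64Frame) {
  return {
    requests: [{
      image: { content: base64Frame },
      features: [{ type: "TEXT_DETECTION" }]
    }]
  };
}

// The token from `gcloud auth application-default print-access-token`
// goes into the Authorization header as a Bearer token.
function visionHeaders(accessToken) {
  return {
    "Authorization": `Bearer ${accessToken}`,
    "Content-Type": "application/json"
  };
}
```

The body and headers would then be POSTed to https://vision.googleapis.com/v1/images:annotate, and the OCR'd text is read from the response's textAnnotations.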

Enter your token here:

(Screenshot: Vision Token box on the player page)

If gcloud isn't recognized in the terminal, download the SDK manually to your home directory and add it to your PATH: https://cloud.google.com/sdk/docs/install

Run the following commands on Ubuntu (replace the path with your own home directory):

    source ~/google-cloud-sdk/path.bash.inc
    source ~/google-cloud-sdk/completion.bash.inc
    gcloud

3. Measure latency

After you provide the required parameters, the latency is measured programmatically every second.

(Screenshot: measured latency displayed on the player page)

Accuracy of the E2E Measurement

There are a few things that affect the accuracy of the measurement:

  1. Canvas rendering: Since we draw the current time onto a canvas together with the stream, JavaScript's canvas rendering time introduces a delay. It adds roughly 10 milliseconds to the calculation, which can be ignored.

  2. Canvas FPS: The canvas frame rate adds about 30 milliseconds to the measured delay.

  3. Time offset: Even if a device is synced with a time server automatically, each device still carries tens of milliseconds of clock error, so the two devices together can double that error in the measured latency. If the devices are synced by hand, the error depends on the user, since it is a manual task.
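Taken together, the sources above can be combined into a rough worst-case error budget. A small sketch using the figures from this section (the per-device clock error is an assumed input, not a measured value):

```javascript
// Rough worst-case measurement error in milliseconds, using the
// figures stated above: ~10 ms canvas rendering, ~30 ms canvas FPS,
// plus the clock error contributed by each of the two devices.
function measurementErrorMs(perDeviceClockErrorMs) {
  const canvasRenderMs = 10;                 // JavaScript canvas rendering
  const canvasFpsMs = 30;                    // canvas frame interval
  const clockMs = 2 * perDeviceClockErrorMs; // publisher + player clocks
  return canvasRenderMs + canvasFpsMs + clockMs;
}

// With an assumed 20 ms clock error per device:
console.log(measurementErrorMs(20)); // → 80
```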
