
How to Measure E2E Latency

Ahmet Oğuz Mermerkaya edited this page Feb 22, 2021 · 17 revisions

Methodology: Measuring End-to-End Latency

Here is a way to measure E2E latency between publisher and player.

  1. Draw the timestamp (publish time) onto the stream canvas while broadcasting the stream.
  2. Draw the timestamp (play time) onto the stream canvas while playing the stream.
  3. Extract the publish and play time text using OCR (Optical Character Recognition).
  4. Calculate the E2E latency by subtracting the publish time from the play time.
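The steps above can be sketched as a small helper, assuming the OCR step has already extracted the two burned-in timestamps as epoch milliseconds (the function name `computeE2ELatencyMs` is ours for illustration, not part of the web SDK):

```javascript
// Sketch of the methodology above. The OCR step (Rekognition or Vision API)
// is assumed to have already turned the drawn timestamps into numbers.
function computeE2ELatencyMs(publishTimeMs, playTimeMs) {
  // Step 4: E2E latency = play time - publish time.
  return playTimeMs - publishTimeMs;
}

// Example with hypothetical OCR results:
const publishTimeMs = 1613765000000; // drawn on the canvas by the publisher page
const playTimeMs = 1613765000450;    // read back by the player page via OCR
console.log(computeE2ELatencyMs(publishTimeMs, playTimeMs)); // 450
```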

Guide: Measuring End-to-End Latency

Follow the step-by-step guide below to calculate E2E latency with Amazon Rekognition or Google's Vision API.

Before starting, here are the source files for the required pages; they work with our web SDK:

For versions 2.2.1 and above, you can download these files into /<ams_installation_folder>/webapps/<app_name> (e.g. /usr/local/antmedia/webapps/LiveApp).

For versions older than 2.2.1, you need to update all of the JavaScript files according to this repo: https://github.com/ant-media/StreamApp/tree/master/src/main/webapp

1) Sync devices with a time server

Both the publisher and player devices must be time-synchronized so that the difference can be calculated. We use an NTP time provider for the tests; it is the default time provider on most widely used operating systems.

If a time server can't be used, manual synchronization can be done via our player_with_timestamp.html.

On that page you will see a publisher offset and a player offset field.


After you check the time difference manually against a time server, you can enter the offset in the player. If the device's time is ahead of the NTP time, the offset value will be negative; otherwise it will be positive.

For example;

(screenshot: a time-server check showing the local clock 290 milliseconds ahead of NTP)

Here the local device is ahead of NTP by 290 milliseconds. If this device is the publisher, the publisher offset should be entered as -290. If the device were behind NTP by 290 milliseconds, the offset would be 290, without a negative sign.
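As a sketch of this sign convention (the helper name `ntpCorrected` is ours, not the SDK's): the offset is the value you add to a device's local clock reading to get NTP time, so a clock that runs ahead gets a negative offset.

```javascript
// Correct a device's local clock reading using the entered offset.
// Convention from the text: clock ahead of NTP => negative offset,
// clock behind NTP => positive offset.
function ntpCorrected(localTimeMs, offsetMs) {
  return localTimeMs + offsetMs;
}

// The publisher's clock is 290 ms ahead of NTP, so its offset is -290:
const publisherOffset = -290;
console.log(ntpCorrected(1613765000290, publisherOffset)); // 1613765000000
```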

2) Choose AWS Rekognition or Google's Vision API

For AWS Rekognition:

To enable the AWS SDK for using Rekognition, you need to get your AWS Access Key ID and AWS Secret Key. Check out the following link for AWS authentication: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys

Enter your credentials and region into the corresponding fields:

(screenshot of the credential and region fields)

For Google Vision API:

To get a token for the Vision API, you should download and enable gcloud from the terminal. Check out Google's documentation:

https://cloud.google.com/vision/docs/setup

After authentication is done, enter the following command in the terminal:

gcloud auth application-default print-access-token

The response is the token, which should be entered into the Vision Token box in player_with_timestamp.html.

Enter your token into the corresponding field:

(screenshot of the Vision Token field)

If gcloud isn't recognized in the terminal, download the SDK manually to your home directory and add it to your PATH: https://cloud.google.com/sdk/docs/install

Run the following commands on Ubuntu:

source ~/google-cloud-sdk/path.bash.inc

source ~/google-cloud-sdk/completion.bash.inc

gcloud

3) Measure latency

After you provide the required parameters, the latency is measured programmatically every second.

(screenshot: measured latency values)

(The latency above was measured from localhost to localhost.)
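Putting the pieces together, here is a hedged sketch of how a single per-second measurement can combine the OCR'd timestamps with the two manual offsets from step 1 (the helper name `measuredLatencyMs` is ours for illustration):

```javascript
// One measurement: correct both timestamps to NTP time using their offsets,
// then subtract. Offsets follow the step-1 convention (clock ahead => negative).
function measuredLatencyMs(publishTimeMs, publisherOffsetMs, playTimeMs, playerOffsetMs) {
  const publishNtp = publishTimeMs + publisherOffsetMs;
  const playNtp = playTimeMs + playerOffsetMs;
  return playNtp - publishNtp;
}

// Publisher clock 290 ms ahead of NTP (-290), player clock in sync (0):
console.log(measuredLatencyMs(1613765000290, -290, 1613765000450, 0)); // 450
```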

4) Accuracy

There are a few things that affect the accuracy of the time measurement.

  1. Canvas rendering: Since we draw the current time on top of the canvas along with the stream, JavaScript's canvas rendering adds a delay. It adds about 10 milliseconds of extra latency to the calculation, which can be ignored.

  2. Canvas FPS: The canvas frame rate adds about 30 milliseconds to the measured delay.

  3. Time offset: Even if a device is synced with a time server automatically, there is a ±10 millisecond error for each device, which makes a total of ±20 milliseconds of error when we measure latency. If we sync devices by hand, the error depends on the user, since it is a manual task.
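As a rough arithmetic check of the error sources above (the numbers come straight from the three items; the variable names are ours):

```javascript
// Systematic delays added to every measurement (ms).
const canvasRenderMs = 10; // item 1: canvas rendering
const canvasFpsMs = 30;    // item 2: canvas FPS

// Symmetric error from automatic time sync, per device (ms).
const perDeviceSyncErrorMs = 10;                   // item 3
const totalSyncErrorMs = 2 * perDeviceSyncErrorMs; // both devices: +-20 ms

// Fixed overhead the measurement consistently overstates latency by:
const fixedOverheadMs = canvasRenderMs + canvasFpsMs;
console.log(fixedOverheadMs);  // 40
console.log(totalSyncErrorMs); // 20
```

So a reported value overstates the true latency by roughly 40 milliseconds of fixed overhead, with an additional ±20 milliseconds of jitter from clock synchronization.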
