The `audio.latency` tracing category is a tool to measure internal Chromium audio latency, without any hardware dependencies. This document outlines how to record and analyse a latency trace, and offers suggestions as to how to construct test pages.
When an input stream detects an audio signal peak (e.g. clapping hands in front of the microphone), we start an `audio.latency` trace event. When the output stream detects an audio peak, we stop the trace event. The duration of the trace event should give us the total internal Chrome latency. This excludes any latency incurred by hardware, such as a microphone's or sound card's internal buffers.
To record a trace:

1. Open a tab with `chrome://tracing`; it should be open before the test page starts playing audio.
2. Open a test page that captures audio with `getUserMedia()` and plays it out. The page should use exactly one audio input (`getUserMedia()`) + one audio output (e.g. one media element OR one WebAudio graph). This can be verified by checking the `Input Controllers` or `Output Streams` entries in the chrome://media-internals audio tab, as pictured below.
3. Start recording a trace with the `audio.latency` category enabled, then produce an audio peak (e.g. a clap) while the page is capturing with `getUserMedia()` and playing out the captured stream.
4. Stop the recording and look for the `AmplitudePeak` trace event: the wall duration corresponds to the measured latency.

The simplest latency test page involves directly routing a `getUserMedia()`
`MediaStream` to an `<audio>` element or WebAudio. Such a web page should allow us to measure the round-trip latency within Chrome, between the audio service and the renderer process. For convenience, a simple test page is checked in under `third_party/blink/manual_tests/audio_latency.html`. The audio signal flow for this test page matches the diagram in the overview.
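As an illustration of the shape such a page can take (this sketch is not the checked-in file; the element ids, the start button, and the disabled-processing constraints are choices of this example, not requirements from the document):

```html
<!-- Minimal latency test page sketch: route a getUserMedia() capture directly
     to an <audio> element. Must be served from a secure context
     (https:// or localhost). Playback is started from a click handler to
     satisfy autoplay policies. -->
<audio id="out" autoplay></audio>
<button id="start">Start</button>
<script>
  document.getElementById('start').onclick = async () => {
    // Disable audio processing that could add latency or suppress the
    // test peak (e.g. a clap being treated as noise).
    const stream = await navigator.mediaDevices.getUserMedia({
      audio: {
        echoCancellation: false,
        noiseSuppression: false,
        autoGainControl: false,
      },
    });
    document.getElementById('out').srcObject = stream;
  };
</script>
```

This keeps exactly one input (`getUserMedia()`) and one output (the media element), matching the single-input/single-output requirement above.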
Adding an encoding and decoding step to a simple baseline test web page should allow us to measure the extra latency incurred by encoding and decoding audio. No such test page is provided, but the signal flow should follow this diagram:
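One way to sketch such a page in Chromium is with WebCodecs (`AudioEncoder`/`AudioDecoder`) plus `MediaStreamTrackProcessor`/`MediaStreamTrackGenerator` for audio (the latter is Chromium-specific). The Opus/48 kHz/mono configuration and the page scaffolding below are assumptions of this sketch, not part of the document:

```html
<!-- Codec latency page sketch: capture -> AudioEncoder -> AudioDecoder ->
     playback. A real page should match the encoder/decoder configuration to
     the actual capture format (sample rate, channel count). -->
<audio id="out" autoplay></audio>
<button id="start">Start</button>
<script>
  document.getElementById('start').onclick = async () => {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
    const [track] = stream.getAudioTracks();

    // Output side: decoded AudioData is written into a generated track.
    const generator = new MediaStreamTrackGenerator({ kind: 'audio' });
    const writer = generator.writable.getWriter();
    document.getElementById('out').srcObject = new MediaStream([generator]);

    const decoder = new AudioDecoder({
      output: (data) => writer.write(data),     // decoded AudioData -> output
      error: console.error,
    });
    decoder.configure({ codec: 'opus', sampleRate: 48000, numberOfChannels: 1 });

    const encoder = new AudioEncoder({
      output: (chunk) => decoder.decode(chunk), // EncodedAudioChunk -> decoder
      error: console.error,
    });
    encoder.configure({ codec: 'opus', sampleRate: 48000, numberOfChannels: 1 });

    // Input side: pull raw AudioData off the microphone track and encode it.
    const reader = new MediaStreamTrackProcessor({ track }).readable.getReader();
    for (;;) {
      const { value, done } = await reader.read();
      if (done) break;
      encoder.encode(value);
      value.close();
    }
  };
</script>
```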
Adding a network round trip to a codec latency test page should simulate the expected end-to-end latency in real-world scenarios. The sender and receiver test pages can live in two tabs, as long as only one page calls `getUserMedia()` and only one page plays out audio. Again, no example test page is provided, but the signal flow should follow this diagram:
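As a sketch of that signal flow, the round trip can be wired up with a pair of loopback `RTCPeerConnection`s. For brevity both ends live in one page here; a two-tab version would replace the direct candidate/description exchange with real signaling between the tabs. The scaffolding is this example's own, not from the document:

```html
<!-- Network round-trip page sketch: microphone -> sender RTCPeerConnection ->
     receiver RTCPeerConnection -> <audio> element. -->
<audio id="out" autoplay></audio>
<button id="start">Start</button>
<script>
  document.getElementById('start').onclick = async () => {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

    const sender = new RTCPeerConnection();
    const receiver = new RTCPeerConnection();

    // Loopback "signaling": forward ICE candidates directly to the peer.
    sender.onicecandidate = (e) => e.candidate && receiver.addIceCandidate(e.candidate);
    receiver.onicecandidate = (e) => e.candidate && sender.addIceCandidate(e.candidate);

    // Play out the received (encoded, sent, and decoded) stream.
    receiver.ontrack = (e) => {
      document.getElementById('out').srcObject = e.streams[0];
    };

    for (const track of stream.getTracks()) sender.addTrack(track, stream);

    // Standard offer/answer exchange, done in-page instead of over a server.
    const offer = await sender.createOffer();
    await sender.setLocalDescription(offer);
    await receiver.setRemoteDescription(offer);
    const answer = await receiver.createAnswer();
    await receiver.setLocalDescription(answer);
    await sender.setRemoteDescription(answer);
  };
</script>
```

Note that this single-page variant keeps one `getUserMedia()` input and one audio output, so the single-input/single-output requirement for tracing still holds.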