Tuesday, December 26, 2023


Confused by latency?

Latency in a DAW can ruin the creation of a song by throwing off your timing; how bad things get depends on the amount of latency in the system you're using. Let's examine a topic that all too often gets swept under the rug or buried in a myriad of less important specifications.


Back in the '70s, when multitrack tape recording was in its infancy, there basically was no latency. With a proper headphone system, when you sang into a mic the sound was heard immediately, with no delay in the musicians' headphones.


In 2024 there are many paths to increased latency. It comes creeping into your production in little 7-15 ms (or more) increments, depending on how much memory you devote to your buffers and how well your interface operates at minimal buffer settings. We'll talk more about buffers later.

What Does Latency Sound Like?

Each millisecond of latency is equivalent to about one foot of physical distance. This means extremely low latency, something like two milliseconds total roundtrip from input to output, is equal to the delay your ears would experience if you were playing with other musicians who were about two feet away from you. It is about the same time delay a drummer experiences playing the snare drum in a typical drum set. That is what extremely low latency sounds like; we call it “real time” because there is virtually zero delay, it feels natural to the performer, and it sounds good in their headphones. Not many current systems deliver this level of performance, but a few do.
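This rule of thumb is easy to check with a quick calculation. Here's a minimal sketch, assuming the speed of sound is roughly 1,125 feet per second (343 m/s) at room temperature, which works out to just over one foot per millisecond:

```python
# Rough acoustic-distance equivalent of a given latency.
# Assumption: speed of sound ~1125 ft/s (343 m/s) at room temperature,
# i.e. about 1.125 feet per millisecond.
SPEED_OF_SOUND_FT_PER_MS = 1.125

def latency_to_feet(latency_ms: float) -> float:
    """Distance sound travels in `latency_ms` milliseconds."""
    return latency_ms * SPEED_OF_SOUND_FT_PER_MS

for ms in (2, 8, 15):
    print(f"{ms} ms of latency sounds like a player about "
          f"{latency_to_feet(ms):.1f} ft away")
```

So 2 ms feels like a bandmate standing next to you, while 15 ms is like someone across the stage.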

At relatively low levels of latency (8-15 milliseconds total, varying per track), sounds blending in your studio headphones become slightly blurred or cloudy. In trying to understand latency and its effect on your studio recordings, it is handy to visualize the recorded tracks as musicians on a stage and to think about how far away the other performers (the tracks) are from you. More latency means they sound like they are farther away, because when they are far away it takes more time for the sound from their instruments to reach your ears.

The best sounding DAW interface is the one with the lowest latency, because it delivers performance that feels like “real time,” which allows the recorded tracks to deliver great feel. Musicians call this "tight" because it sounds like everyone is playing very near the beat and the performance sounds good together.


There are two basic ways to get there:

  1. Use a DAW interface that provides fast (two milliseconds or better) round trip latency performance

  2. Use a DAW that provides a separate monitor layer in software

A separate monitor layer in your DAW software means another mixing panel to keep track of; it's like running another whole separate mix, or several of them, depending on the hardware and software setup. Many manufacturers act like having another application to control your monitor mix is a great feature, but I like things simple: if your DAW interface has extremely low latency you won't need an external app for monitor mixing.

MORE LATENCY: A Good Reason NOT To Use Wireless Systems In The Studio

Modern digital wireless systems can inject latency too: under five milliseconds on good ones, 12-15 milliseconds on the extremely low priced models. Let's say your budget DAW system has eight milliseconds of latency and you use a low priced wireless guitar system too; now it's like the band you're playing with is 20 feet away from your ears. If you've ever played with a band on a large stage, you know things get weird when the band members spread out too far (without sophisticated monitor systems).

A long guitar cord can also affect your tone in the studio but a cord will NEVER increase latency. 


If you're a solo artist creating original compositions (that is, one person recording all the musician tracks yourself), you want the lowest latency system you can get. With all the challenges a recording artist faces, the last thing you need is built-in, blurry sound courtesy of latency. Tweaking out latency after the fact is an editing nightmare; it is so much easier to record the tracks with minimal latency in the first place.


If you've ever played a software synthesizer from a keyboard controller, then you know what high latency feels like: you press a key down but there is a slight delay before the tone sounds. It's not much of a delay, but it's there, and it spoils the artistry of your performance. Latency gnaws on your timing and tightness as a musician. The only way to get rid of this problem is to buy hardware for your DAW that provides the lowest level of roundtrip latency.

In a system with very low latency, the time from when you press a key on the MIDI controller to when the sound comes out of the DAW interface is about one millisecond. This level of performance feels natural to a keyboard player, very close to the experience of playing a real acoustic piano.


Manufacturers don't really want you to know about latency; they don't want you to give it a thought. They can brag about extreme headroom or the A-to-D and D-to-A chips they're using as a smoke screen of technical jargon to draw attention away from their lackluster latency performance. Don't be confused: latency matters because it dramatically affects how your performance is recorded in the first place, and that affects how your song sounds when it's done.

Driverless hardware (which relies on the operating system's built-in class-compliant audio support, Core Audio on the Mac or the Windows equivalent) sounds like a good idea until you realize it can't achieve ultra low levels of latency on its own, even over Thunderbolt. As it turns out, installing a special piece of driver software is the best way to achieve ultra low latency with the latest computer technology. Custom software drivers that exploit Thunderbolt to the max deliver realtime performance without an additional monitor layer of software.


This latency discussion should be very important to folks who use their DAW as a replacement for a multitrack tape recorder. In a “hybrid” out-of-the-box studio, one that uses virtual instruments, real instrument recordings, and realtime processing, all mixed through an analog console with effects sends and returns, low latency is EXTREMELY important. If you're recording a lot of individual instruments acoustically and mixing them with virtual synthesizers and sample-based instruments, then you definitely want to focus on latency and do whatever you can to improve your recorded performance.


Depending on your DAW software, you may also run into "Delay Compensation" settings; this is a closely related feature that shifts tracks in time to make up for latency added by plugins.

Somewhere in your DAW settings is something called “Buffers” or "Buffer adjustments"; it might go by a different name, but it's in there, and every DAW has it. When you adjust this setting you allocate more memory to each track's playback. Terminology gets confusing here: buffer sizes are given in "samples," while latency is measured in milliseconds.

Buffer adjustments in Apple Logic DAW software, Mac Studio 20-core, 128GB memory, 4TB SSD

On this system, with the I/O Buffer Size set to the 32 sample minimum, the Resulting Latency is only 2.0 ms, which is virtually realtime performance.
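Converting between the two units is simple: divide the buffer size in samples by the sample rate. The sketch below assumes a 48 kHz sample rate (the rate on the system above isn't stated) and doubles the one-way figure to approximate the round trip; real interfaces add converter and driver overhead on top, which is why a DAW's readout is a bit higher than the bare buffer math:

```python
# Converting an I/O buffer size (samples) to latency (milliseconds):
#   latency_ms = buffer_samples / sample_rate_hz * 1000
# The round trip passes through an input and an output buffer, so it is
# roughly double the one-way figure, plus converter/driver overhead.

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One buffer's worth of delay, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000

for size in (32, 128, 256):
    one_way = buffer_latency_ms(size, 48_000)
    print(f"{size:>3} samples @ 48 kHz: {one_way:.2f} ms one-way, "
          f"~{2 * one_way:.2f} ms round trip before converter overhead")
```

At 32 samples the buffers contribute well under a millisecond each way; at 256 samples they alone contribute over ten milliseconds round trip.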

The Quantum 4848 audio interface software driver reduces latency to this low level and you can leave it there all the time.

For big mixing projects on DAW systems with worse latency, you can increase the I/O Buffer Size to 128 or even 256 samples to allow a lot more plugins to be used.

With the I/O Buffer at 256 samples, the resulting latency makes accurate realtime recording virtually impossible, but it allows you to use more plugin processing on lower powered computer systems.

From a creative mixing standpoint, a big buffer setting can become intoxicating, with seemingly unlimited plugin processing, until you discover you want to record a new vocal or guitar part. In that unfortunate situation you would need to go back to low buffer settings to eliminate the adverse delay effects of latency, and changing your buffer size midstream in a song production with a lot of virtual instruments and tracks can lead to a crashing computer (and wasted time).

If you increase your buffer size, you also increase the monitoring latency. With large buffer settings it can be like trying to play along with a band that is literally 150 to 250 feet away from you.


When you're recording with a digital audio system and listening simultaneously to the audio output from the interface, you are hearing the audio signal after it is: 

analog-to-digital conversion in the interface > sent to the DAW computer > returned from the DAW > digital-to-analog conversion > headphone amplifier. (When you see "roundtrip" mentioned, this is the trip they're talking about: one track going through the whole process from input, to recording, to playback.)

All these steps take tiny amounts of time. They happen very quickly, but the bits of time add up to milliseconds (ms). Another name for all this is "processing time," because digital audio is not yet truly real time (though it is getting very close).
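As a back-of-the-envelope illustration of how those steps add up, the per-stage figures below are assumptions for a fast system, not measurements of any particular interface:

```python
# Sketch of how roundtrip stages accumulate into total latency.
# Per-stage times are illustrative assumptions, not real measurements.
stages_ms = {
    "A/D conversion in the interface": 0.3,
    "input buffer (32 samples @ 48 kHz)": 0.67,
    "driver/DAW processing": 0.2,
    "output buffer (32 samples @ 48 kHz)": 0.67,
    "D/A conversion to headphone amp": 0.3,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:<36} {ms:.2f} ms")
print(f"{'roundtrip total':<36} {total:.2f} ms")
```

No single stage looks alarming on its own; it's the sum that you hear, which is why every link in the chain matters.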

The fastest (and lowest latency) DAW systems deliver "roundtrip" delay of 2 ms when you're recording with a mic or analog input. With virtual synthesizers they are even faster, with latency of 1.1 ms between when you press a key and when you hear the synthesizer play; that is fast enough to please a classical pianist playing a virtual grand piano live.


If you can find a DAW hardware interface with custom driver software that takes maximum advantage of the Thunderbolt bus and reduces latency to less than three milliseconds total, you'll be on your way to tighter sounding recorded performances.

BEWARE of interfaces that use “Thunderbolt” as a marketing virtue but don't actually deliver faster roundtrip latency than USB 3 without yet another software application for monitoring.

Dig a little bit deeper into the real specifications of whatever system you're looking at (before you buy) and invest in systems that deliver the lowest latency. 


Older personal DAW interfaces had an extra hardware knob or button to turn on “hardware-thru monitoring,” which lets you hear the microphone or guitar/instrument input directly, with zero latency. When hardware-thru is enabled you can blend recorded tracks playing back from the DAW with a live microphone or instrument input feeding into it. While these interfaces don't have the greatest technical specs, and you don't get to listen with reverb or effects, the “real time” monitoring this simple feature provides is hard to beat.

I keep an older Focusrite two-channel interface in my kit of devices because it works "driverless" with modern operating systems and it's dirt simple for realtime monitoring. 

GOOD MUSIC TO YOU! Thanks for reading High on Technology

©2023 Mark King  It's not ok to copy or quote without written permission from the author.