Thread: TV and Hi-Fi
Posted to uk.d-i-y

Dave Plowman (News) wrote:
> In article ,
> Max Demian wrote:
>> Motorola smart phone FM radio function has a delay of about 1/3 second.
>> So there must be some digital processing, even with FM.
>
> Think with all modern computers there's no analogue side chain. So all
> analogue is digitised, then converted back to analogue for the headphones.
>
> And all digital has latency.


The ADC can do a conversion in one sample time. If
sampling at 48 kHz, the delay to make digital is
one sample period, around 21 microseconds.
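
Just to check that arithmetic (a throwaway snippet of my own,
nothing more):

#include <stdio.h>

int main(void)
{
    double rate_hz = 48000.0;
    /* one sample period in microseconds: 1e6 / 48000 = 20.8 us */
    printf("one sample period at %.0f Hz = %.1f us\n",
           rate_hz, 1e6 / rate_hz);
    return 0;
}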

That's not where significant computer audio delay comes from.
At that level, we're talking a few tens of microseconds here
and there (say 20 us to 60 us or so). You need time for
the sharp-cutoff analog filter, time for the sample
and hold, then the conversion cycle itself.

Computer audio delay comes from interrupt handlers
and, in effect, "packet buffers". The driver sends
a few hundred bytes to the sound card. When the buffer
is half empty, the scheme requests another packet.
The idea is to have a FIFO buffer where there aren't
any overruns or underruns. The size of the buffer
determines how many interrupts per second will result.
And across all OSes, the buffer has been set to a rather
large value at times. I don't have a collection
of numbers for this (in Linux you can adjust it
through ALSA).
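
On Linux, the FIFO depth and the refill granularity are the ALSA
"buffer size" and "period size". A minimal sketch of requesting
them (my own illustration, error checking omitted; the values are
assumptions, not anyone's defaults):

#include <stdio.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_pcm_t *pcm;
    snd_pcm_hw_params_t *hw;
    unsigned int rate = 48000;
    snd_pcm_uframes_t buffer = 960;  /* whole FIFO: 20 ms at 48 kHz */
    snd_pcm_uframes_t period = 480;  /* refill chunk: interrupt every 10 ms */

    snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0);
    snd_pcm_hw_params_alloca(&hw);
    snd_pcm_hw_params_any(pcm, hw);
    snd_pcm_hw_params_set_access(pcm, hw, SND_PCM_ACCESS_RW_INTERLEAVED);
    snd_pcm_hw_params_set_format(pcm, hw, SND_PCM_FORMAT_S16_LE);
    snd_pcm_hw_params_set_channels(pcm, hw, 2);
    snd_pcm_hw_params_set_rate_near(pcm, hw, &rate, 0);
    snd_pcm_hw_params_set_buffer_size_near(pcm, hw, &buffer);
    snd_pcm_hw_params_set_period_size_near(pcm, hw, &period, 0);
    snd_pcm_hw_params(pcm, hw);  /* the driver interrupts once per period */

    printf("got buffer %lu frames, period %lu frames\n",
           (unsigned long)buffer, (unsigned long)period);
    snd_pcm_close(pcm);
    return 0;
}

With two periods per buffer, the interrupt fires when the buffer is
half empty, which is exactly the refill scheme described above.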

Packet buffering reduces the overhead. If you had to raise
an interrupt for every sample, there would be
48000 interrupts per second, which is roughly four times
too many. Around 10000 to 15000 interrupts per second
is considered "enough", before interrupt-overload
mitigation has to cut in. It's not practical to do it a
sample at a time.
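
To put numbers on the trade-off (again, a throwaway calculation
of my own):

#include <stdio.h>

int main(void)
{
    double rate = 48000.0;                    /* frames per second */
    double chunk[] = { 1, 4, 48, 480, 960 };  /* frames per interrupt */

    for (int i = 0; i < 5; i++)
        printf("%4.0f frames/interrupt -> %7.0f interrupts/s, %6.2f ms buffered\n",
               chunk[i], rate / chunk[i], 1000.0 * chunk[i] / rate);
    return 0;
}

One frame per interrupt gives the 48000/s figure; about four frames
per interrupt already lands in the 10000-15000/s zone; and a 480-frame
chunk costs only 100 interrupts per second, at the price of 10 ms of
buffered (audible) latency.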

All computer audio has latency (a latency you can hear
or experience). Not all hardware devices by themselves
have latency, though. If you had an ADC tacked in line
with a DAC, the latency through there would be quite
small and would not appear to affect lip sync on a
TV set.

The video latency through your LCD monitor is around
four frame times or so, about 67 ms at a 60 Hz refresh rate.

The next time a computer misbehaves on you, well,
there's a lot of crazy stuff going on, and not
very good odds of the computer doing a nice job.
Sorta like those LED lightbulbs we bought that
take three seconds to turn on. Great idea,
but "heavily taxed". A latency tax.

The computer isn't always available when you want it
to be. People building "audio workstations" verify
their build with DPCLAT. On the computer this was
collected on, when the desktop is idle, only the
shortest of the green bars is present in DPCLAT.
This means the computer could be used as an
audio workstation. However, you can't record audio
and play a 3D video game without the possibility
of a click or pop in your recording effort. Measuring
the Deferred Procedure Call (DPC) latency indicates whether
the computer is responsive enough for real-time work.

https://i.postimg.cc/FzMkN7pV/dpclat-composite.gif
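
As a rough user-space analogue of that sort of check (a sketch only;
DPCLAT itself reads Windows kernel DPC timing, which this does not),
you can time how badly short sleeps overshoot while the machine is busy:

#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <time.h>

int main(void)
{
    struct timespec req = { 0, 1000000 };  /* ask for a 1 ms sleep */
    long worst_us = 0;

    for (int i = 0; i < 1000; i++) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        nanosleep(&req, NULL);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        long elapsed_us = (t1.tv_sec - t0.tv_sec) * 1000000L
                        + (t1.tv_nsec - t0.tv_nsec) / 1000L;
        if (elapsed_us - 1000 > worst_us)
            worst_us = elapsed_us - 1000;
    }
    printf("worst oversleep in 1000 tries: %ld us\n", worst_us);
    return 0;
}

A machine that oversleeps by milliseconds under load is going to miss
audio refill deadlines of the same order.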

Using that DPCLAT picture, I could set my buffer depth to
1 millisecond and tolerate most behaviors on the
machine. If I wanted to cover absolutely everything,
the buffer becomes 20 milliseconds, and then I can
be playing a CD while a game switches to 3D mode,
and the audio won't click, rasp, or buzz.
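
A toy model of why the bigger buffer rides that out (the 16 ms stall
is my own illustrative worst case, not a number from the picture):

#include <stdio.h>

/* Playback FIFO drains at 48 frames per millisecond (48 kHz).
   If the CPU stalls for stall_ms before refilling, does it run dry? */
static void check(double buffer_ms, double stall_ms)
{
    double frames_buffered = buffer_ms * 48.0;
    double frames_drained  = stall_ms  * 48.0;
    printf("%5.1f ms buffer, %5.1f ms stall: %s\n",
           buffer_ms, stall_ms,
           frames_drained > frames_buffered ? "UNDERRUN (click/pop)" : "ok");
}

int main(void)
{
    check(1.0,  0.1);   /* idle desktop: short green bars    */
    check(1.0, 16.0);   /* 3D mode switch: tall red bars     */
    check(20.0, 16.0);  /* the 20 ms buffer covers the stall */
    return 0;
}

That's why the tall red bars matter: any single stall longer than the
buffer depth is an audible defect.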

And just for the record, the video card should
not be allowed to do that. Those tall red bars.
Somebody should give NVidia a smack.

Paul