Archived from groups: rec.audio.high-end
Everyone,
I'm curious about the real-world ADCs used for professional-level audio
digitization. Specifically, do such ADCs determine the amplitude of the
analog signal at a specific instant in time (the sample), or do they
simply "average" the amplitude of the analog signal over a small, finite
time span ("delta t") around the sample time?
I ask because, on another forum, someone is arguing that current ADCs
for high-fidelity audio essentially determine the average amplitude
between samples. Since the analog signal itself is inherently non-linear
(while averaging assumes linearity), he says, this introduces a small
sampling error (distinct from quantization error) that leads to audible
-- and, to some audiophiles, harsh-sounding -- artifacts at 44.1k
sampling. He then argues that only by going to very high sampling rates,
such as 192k, will this effect become inaudible.
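To put rough numbers on his claim (again assuming, for the sake of
argument, that the converter averages over one full sample period), the
sinc(f * delta_t) factor amounts to a gentle high-frequency droop:

import numpy as np

# Hypothetical worst-case aperture of one full sample period at each
# rate; the droop at frequency f is 20*log10(sinc(f * delta_t)) dB.
for fs in (44100.0, 192000.0):
    delta_t = 1.0 / fs
    for f in (1000.0, 10000.0, 20000.0):
        droop_db = 20 * np.log10(np.sinc(f * delta_t))
        print(f"fs = {fs:6.0f} Hz   f = {f:5.0f} Hz   droop = {droop_db:6.2f} dB")

If I have the math right, that comes to roughly -3.2 dB at 20 kHz for
the 44.1k case and about -0.16 dB for the 192k case: a treble roll-off,
in other words, rather than anything obviously "harsh".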
Obviously, I'm dubious of this claim, since I would assume the engineers
designing real-world audio ADCs are able to sample effectively
"instantaneously" rather than averaging over a comparatively long
interval such as one full sample period (1 divided by the sample rate).
But I'll leave it to the ADC experts here to clarify the gentleman's
particular argument.
Thanks.
Kurt