Heeeeeeeeee's baaaaaaaaack!
(I always wanted to use that line someday.)
Actually, from reading through the previous posts, and the articles they link to, I already see most of the important information....
(Geeqner summed it all up pretty nicely in his post).
So all I'm going to add is a little more detail here and there - and my own take on the
priorities of what's involved.
I'm going to start by talking about an assumption that a lot of audiophiles seem to accept without thinking about it too much.
All analog systems have significant and serious losses in terms of "capturing all the information".
Microphones have limited bandwidth, frequency responses that are far from flat, uneven acoustic pickup patterns, and significant amounts of distortion and noise.
Microphone preamps also introduce noise, distortion, and frequency response anomalies.
Therefore, before we even start, we do not have "an accurate representation of the original".
The original is long gone; the horse has escaped, the barn has fallen down, and we're working from a snapshot, some bones, and a DNA sample.
The analog recording and playback process has a
LOT of steps in it.... and each and every one introduces losses and errors.
- the microphone itself
- the microphone preamp
- the mixing console
- the record mastering lathe
- the mastering, mothering, and pressing process
- the turntable
- the stylus tracking the groove in the vinyl
- the motor mechanism in the cartridge that converts motion into an electrical signal
- the phono preamp
- the regular preamp
- your amplifier
- your speakers
Part of the real problem with analog systems is that all of those errors add up... and all are pretty much unavoidable.
I also left out the optional steps.... like when the vinyl gets scratched, or the analog master tape stretches, or gets some magnetic bleed-through.
In a digital reproduction signal chain, you have pretty much the same steps
UP TO THE MICROPHONE PREAMP.
And you also have pretty much the same steps
FROM THE REGULAR PREAMP ON OUT.
The difference is that, in between those two, you have the equivalent of one of those wormholes they have on Stargate Atlantis.
You have a single step where the analog audio is converted into digital format (by some sort of ADC - an analog-to-digital converter).
And you have another single step where the digital signal is converted back into analog (by some sort of DAC - a digital-to-analog converter).
Both of those steps are
REALLY critical, and either can be done well or badly, and errors or flaws introduced at either of those critical steps are essentially irreparable.
HOWEVER, BETWEEN THOSE STEPS, IT IS QUITE POSSIBLE TO GUARANTEE ACTUAL PERFECTION OF SIGNAL STORAGE AND TRANSFER.
Note that I said
POSSIBLE.
You absolutely can do all sorts of things to the signal while it's in digital form that will alter it or mess it up.
However, nothing forces you to do any of that - if you simply store and copy the data, it stays exactly as it was.
Ten years ago, I ripped the music tracks from my favorite CD.
I stored them on a disc as digital data files.... and I also created a checksum of the contents.
(A checksum is, for all practical purposes, a unique digital fingerprint of the contents.... if I compute a new checksum of those files today, and it matches the original, then I
KNOW that the files haven't changed).
I can make a copy, of a copy, of a copy, of a copy of those files..... bury one in the back yard, put one on a spaceship to Mars, and send 10,000 of them to my closest friends.
And, unless I screw up, those copies will be
EXACTLY THE SAME as the original... not "close"... not "audible the same"... they will *BE* *THE* *SAME*.
And, if I send a copy of that checksum along with each copy, the people who receive them can test them and know, with absolute certainty, that each is a
PERFECT copy of the original.
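For anyone who wants to try this at home, here's a rough sketch of the idea in Python (the file names are made up for illustration; any modern hash like SHA-256 will do the job):

```python
# Sketch of the checksum idea: hash the ripped file once, and any copy
# that produces the same hash is bit-for-bit identical to the original.
# The "track01.wav" below is a stand-in, not a real rip.
import hashlib
import os
import shutil
import tempfile

def checksum(path: str) -> str:
    """Return the SHA-256 digest of a file's contents as a hex string."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Make a fake "rip" and a copy of it, then verify the copy.
tmpdir = tempfile.mkdtemp()
original = os.path.join(tmpdir, "track01.wav")
with open(original, "wb") as f:
    f.write(os.urandom(1_000_000))   # stand-in for real audio data

copy = os.path.join(tmpdir, "track01_copy.wav")
shutil.copyfile(original, copy)

# Matching digests mean the files are THE SAME, bit for bit.
print(checksum(original) == checksum(copy))
```

A copy of a copy of a copy will still match, as long as nothing along the way altered the bits.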
In other words, I have gone from an analog process, where errors and deterioration could creep in at every step of the process, to a much more clearly defined process.....
The recording studio did their part, mastered the album, and got it just the way they wanted it....
And, at my end, I have a playback system that sounds exactly the way I like it, according to my priorities....
And, between those two ends, my music is "locked in suspended animation".
I don't have to worry if my digital file will get scratched (it won't).
And I don't have to worry if the copy I have is better or worse than the one my friend bought (they are the same).
(If they're copies of the same master, then they
ARE the same, and I can do a checksum on each and know beyond any doubt that this is true.)
And, yes, copyright issues aside, I can make more perfect copies whenever I want to (and I don't need a $500k mastering lathe to do it).
The other big difference between analog and digital technology is that the quality level of digital audio files is "open".
For all practical purposes, vinyl has reached its limits.
A typical good quality vinyl album may have a signal-to-noise ratio of 70 dB (with luck).
It can actually have a frequency response that extends as high as about 50 kHz (with significant drawbacks if you choose to go above 20 kHz or so).
It requires several steps that involve equalization, and various sorts of transducers, each of which introduces frequency response errors and distortions.
We will NEVER see a vinyl album with a S/N ratio of 135 dB...
We will never see a vinyl album with a frequency response that is flat from 20 - 20 kHz +/- 0.01 dB
We will never see a vinyl album with a THD of 0.003%
It is absolutely true that both the A/D and D/A conversion processes introduce errors.
And it is also true that, for a given sample rate, there are limits on both the frequencies you can store and the timing accuracy you can achieve.
HOWEVER, the limits that are there with vinyl are
NOT there with digital files... because you can
ALWAYS "up the ante".
A CD, with a sample rate of 44.1 kHz, has a frequency response limited to a little over 20 kHz.
But, if you really want a frequency response that's better, go up to 96k, which has a frequency response that extends to about 45 kHz.
Want more? Go to 192k, and your frequency response will extend to about 90 kHz.
Want even more? Go to 384k, and you'll get frequency response to about 175 kHz.
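Those numbers all come from the Nyquist limit - the highest frequency you can store is half the sample rate (a bit less in practice, because the anti-aliasing filter needs some room to roll off). A quick sketch:

```python
# The Nyquist limit: a digital format can represent frequencies up to
# half its sample rate. Real systems land a little below this because
# the anti-aliasing filter needs a transition band.
def nyquist_khz(sample_rate_hz: int) -> float:
    """Theoretical upper frequency limit, in kHz, for a given sample rate."""
    return sample_rate_hz / 2 / 1000.0

for rate in (44_100, 96_000, 192_000, 384_000):
    print(f"{rate // 1000} kHz sampling -> up to ~{nyquist_khz(rate)} kHz")
```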
Likewise, a CD has a dynamic range of a bit over 90 dB (after accounting for the few dB you lose to dithering).
Not enough? Go up to 24 bits and your dynamic range is around 130 dB.
There's
ALWAYS another step.... with no end in sight.
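The bit-depth side of this follows the usual rule of thumb of about 6 dB per bit (6.02 x N + 1.76 dB for an ideal converter - dithering and real-world converter noise shave a few dB off these theoretical numbers):

```python
# Rule-of-thumb dynamic range for an ideal N-bit converter.
# Real converters and dither land a few dB below these figures
# (which is why 24-bit gear is usually quoted around 120-130 dB).
def ideal_dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal converter, in dB."""
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{ideal_dynamic_range_db(bits):.0f} dB theoretical")
```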
However, to go back to the basic and original question.....
Yes, the conversion process introduces significant errors.....
And, yes, many CDs sound pretty darned bad.
(But, face it, we've all heard some pretty awful sounding vinyl too.)
However, based on the sound quality of the CDs and vinyl albums I've heard....
The
BEST QUALITY CD I've ever heard sounded at least as good as the
BEST QUALITY vinyl album I've ever heard.
I wouldn't go so far as to claim that either was perfect..... but, yes, both were very, very good.
Now, to go back to that question again.....
I personally think that a well-produced CD can at least match the quality of a well-produced vinyl album.
(And I notice those quantization errors much less than I notice the surface noise and distortion on a vinyl album.)
And, on top of that, it adds all the benefits of digital media that I outlined above.
However, I'm still prepared to concede that some people might find the particular flaws of CDs more annoying than those on vinyl.
So, just to be safe, I think I'll go up to 24/96k digital.
Now I have a frequency response that's flatter than the best vinyl, a dynamic range that's about 60 dB better, and distortion that's orders of magnitude lower.
And, in case you were wondering, all those other errors introduced by the conversion have been reduced by about the same amount.
And, if I were totally obsessed, I could kick it up another equivalent notch by going to 24/192k.
And, on many of this year's products, I can bump it up to 32/384k.
Now, to be brutally honest, the quality of the mastering,
AND THE CHOICES MADE DURING MASTERING, are more important than any of this.
There is clearly no issue with the dynamic range available on a CD, or on a 24 bit audio file....
But that isn't really going to help us at all if the guy who mastered it
DECIDED to limit the dynamic range because he
WANTS it to sound that way.
I should also take this opportunity to point out that the so-called "loudness wars" are not at all what many audiophiles have been led to believe.
(And most of the so-called analyses of dynamic range are in fact technically deeply flawed.)
In fact, the dynamic range of virtually every CD ever recorded is pretty much the same;
the quietest sound on them is the noise floor; and the loudest sound can't possibly go above 0 dB (many CDs maintain a safety margin of about 2 dB).
What has happened is
NOT that the dynamic range has been gradually getting smaller...
What has happened is that the
AVERAGE VARIATION IN SOUND LEVEL has been getting smaller...
This is not some sort of error... it is a deliberate choice... and it is an aesthetic choice.
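If you want to see that distinction in numbers, here's a toy sketch (the "signals" below are made-up sample values, not real music): two masters can share the same peak level and the same noise floor, while having very different average levels. What the loudness war actually shrinks is the crest factor - the peak-to-average ratio - not the format's dynamic range.

```python
# Toy illustration of the loudness-war point: compression raises the
# AVERAGE (RMS) level toward the peak, without touching the format's
# dynamic range at all. These sample lists are made up for illustration.
import math

def rms(samples):
    """Root-mean-square (average) level of a list of sample values."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def crest_factor_db(samples):
    """Peak-to-average ratio in dB: large for dynamic music, tiny for 'loud' masters."""
    peak = max(abs(s) for s in samples)
    return 20 * math.log10(peak / rms(samples))

# A "dynamic" master: mostly quiet, with occasional full-scale peaks.
dynamic = [1.0 if i % 100 == 0 else 0.05 for i in range(10_000)]
# A "loud" master: sitting near full scale almost all the time.
loud = [0.9] * 10_000

print(f"dynamic master crest factor: {crest_factor_db(dynamic):.1f} dB")
print(f"loud master crest factor:    {crest_factor_db(loud):.1f} dB")
```

Both lists peak at (or near) full scale - only the average level differs, which is exactly the "average variation in sound level" distinction above.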
Here's an article that explains this really well... for those who are interested...
I STRONGLY recommend reading this article... because it explains why modern remasters so often sound so different from the original versions.
www.soundonsound.com/sound-advice/dynamic-range-loudness-war