There seems to be some confusion about my views here.... so let me clarify them.
First off, you're exactly correct: I want to hear my music reproduced as accurately as possible.
I don't want anything to alter it in an attempt to "improve" it somehow.
As far as I'm concerned the goal with high fidelity is simply perfect accuracy.
If something were to measure absolutely perfectly in all ways then it would both sound perfect and be impossible to improve.
(Perfect is perfect - if it's perfect you can't make it work better or sound better.)
However, nothing is perfect, and that includes both our audio equipment and our test equipment.
And, beyond even that, when we measure audio equipment, even though we are capable of measuring many different parameters, we typically just measure a few of the most important ones.
Your second statement is a bit of an oversimplification.
To quote someone from long ago: "There is only one way something can be right but an infinite number of ways in which it can be wrong".
It's easiest to explain with an example....
The perfect amplifier would have zero distortion.
So, even though nothing is absolutely perfect, we might hope for an amplifier that has "no audible distortion of any kind".
Unfortunately, that isn't always achieved, and some types of distortion are more annoying than others.
Because of this, it is possible for an amplifier with "0.5% THD" to sound better than another amplifier with "0.1% THD".
The reason is that we haven't looked closely enough - "THD" is a general term that lumps all the harmonics together into a single number.
So, for example, an amplifier with 0.5% second harmonic distortion might sound better than one with 0.1% third harmonic distortion.
However, in point of fact, an amplifier with no audible amount of either would sound better yet.
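To make the arithmetic concrete, here is a minimal Python sketch of the standard THD calculation (the RMS sum of the harmonic amplitudes relative to the fundamental). The two hypothetical amplifiers below mirror the example above: note that the single THD number says nothing about *which* harmonic is responsible.

```python
import math

def thd(fundamental, harmonics):
    """THD: RMS sum of harmonic amplitudes, relative to the fundamental."""
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# Hypothetical amplifier A: 0.5% second harmonic, nothing else.
thd_a = thd(1.0, [0.005])

# Hypothetical amplifier B: 0.1% third harmonic, nothing else.
thd_b = thd(1.0, [0.001])

print(f"A: {thd_a:.3%}   B: {thd_b:.3%}")  # A measures five times worse on paper
```

On paper A looks five times worse than B, yet - because low-order even harmonics tend to be less objectionable than odd ones - A could still sound better. The number is correct; it just isn't the whole story.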
In short, while the goal is perfect accuracy, it is possible that a large error of one type may sound less annoying than a smaller error of a different type....
(And, since there are many, many ways in which something can be wrong.... there are an awful lot of "apples to oranges" comparisons.)
However, in the end, if we had a device that had "no audible errors of any sort" then it would be "audibly perfect" and could not be improved upon.
There is, however, an extremely important distinction between analog and digital.
An analog signal cannot be perfect.... every analog signal includes some amount of noise and some amount of distortion.
Therefore, with an analog audio signal, no matter where you set the line for defining "audibly perfect", absolute perfection is impossible to achieve.
One of the main reasons for the existence of digital data in general is that, WITHIN SPECIFIC CONTEXTS, digital data CAN IN FACT BE PERFECT.
This is true because, with digital data, the context itself is very well defined.
For example, I can state that 1 + 1 = 2.
This will be equally true whether I write those numbers on a laser printer, with a pencil or crayon, or scratch them on a wall with a rock.
It's true because we have defined a specific context.
We have defined the fact that we don't care what the number looks like, what color it is, or whether it's smooth or jagged - as long as we can make out the numbers that's all that counts.
And the same exact thing is true for a digital audio file.
It is simply a list of numbers.
As long as we can read all the numbers, with no errors and no omissions, then the data is identical.
(And, conversely, the ONLY thing that could cause a problem would be something that caused numbers to get lost or be read incorrectly.)
So, for example, if I have two digital audio files, and they compare the same, then THEY ARE THE SAME.
(What happened to those numbers along the way does not and cannot matter in the least.)
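This is exactly how we verify it in practice: hash the bytes of each file and compare the digests. A minimal Python sketch (the file paths are hypothetical):

```python
import hashlib

def file_digest(path):
    """SHA-256 over the file's bytes - the 'list of numbers', nothing else."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def same_audio_data(path_a, path_b):
    # If the digests match, both files contain the same numbers -
    # regardless of which drive, cable, or network they passed through.
    return file_digest(path_a) == file_digest(path_b)
```

The digest depends only on the bytes themselves, which is the point: the route the numbers took to get there is invisible to the comparison.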
Now, let's pick a slightly different CONTEXT..... we are transporting those numbers to a DAC which is going to convert them into analog audio.
- anything that alters the numbers along the way, or that causes the DAC to be unable to read them, will cause "a problem"
- anything that occurs which affects the DAC in such a way as to alter its ability to convert the numbers correctly and accurately will be "a problem"
- NOTHING ELSE MATTERS (if the numbers themselves aren't altered, and the DAC's ability to convert them isn't altered, then they have NOT been made "better or worse")
- conversely, IN ORDER FOR ANYTHING TO MAKE THOSE NUMBERS BETTER OR WORSE, it would have to either change the numbers, or interfere with or improve the DAC's ability to convert them
There are simply no more options.
Note that there are many ways in which something might "interfere with or improve the DAC's ability to convert those numbers to analog audio".
However, they are all measurable, and more or less well known.
And, in order to alter how the output sounds, you MUST alter one of them.
My point is that there are no real unknowns here..... about the only factors that haven't already been thoroughly excluded are jitter and noise.
I should note that both jitter and noise, like distortion, are somewhat complex.
So it is possible, for example, that a larger amount of one sort of jitter could be more innocuous than a smaller amount of a different sort of jitter.
However, as before, if we really have INAUDIBLE amounts of jitter and noise, then we're done, and there really is nothing left to improve.
(Or, equivalently, if we have passed our signal through something that removes all of one or both of them so none remains.)
Per Keith:
"(I guess there is a parallel to audio there; I don't want something that introduces some sort of alteration or distortion that 'sounds good'.)"
Why do I get the uneasy feeling that Keith would favor improved measured performance regardless of how it sounded because it's "more accurate"?