Here's the way it works....
Output impedance BY ITSELF has nothing whatsoever to do with sound.
However, it can affect how other things may affect the sound.
In the old days you used to hear about things like "matching output and input impedance" because that would give you "maximum energy transfer".
That's actually true, and used to matter back when audio components had signal-to-noise ratios that were barely high enough.
(Matching output impedance and input impedance will give you the absolute best S/N ratio - but the difference it makes is actually really tiny.
Today it's still important when you talk about radio transmitters and antennas, and cable TV, because a mismatch will result in energy being reflected back through the system (it messes up the picture).
However, unless you're running interconnects that are MILES long, that won't happen at audio frequencies, and most of the other considerations don't apply with modern audio equipment.)
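If you want to see the "maximum energy transfer" idea in numbers, here's a quick Python sketch; it assumes a purely resistive source and load, and the 600 Ohm and 2 V figures are just classic illustrative values, not specs from any real gear.

# Quick sketch (purely resistive source and load assumed): power delivered
# to the load peaks when the load resistance equals the source resistance.
def power_into_load(v_source, r_source, r_load):
    """Power (watts) dissipated in r_load for a source with open-circuit
    voltage v_source and output resistance r_source."""
    i = v_source / (r_source + r_load)   # series current
    return i ** 2 * r_load

r_source = 600.0  # Ohms - the classic "600 Ohm" line output (illustrative)
for r_load in (150.0, 300.0, 600.0, 1200.0, 10_000.0, 100_000.0):
    p = power_into_load(2.0, r_source, r_load)
    print(f"{r_load:>9.0f} Ohm load -> {p * 1000:.3f} mW")
# The matched 600 Ohm case delivers the most power; higher-impedance
# "bridging" loads draw less power but see nearly the full source voltage.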
Here's how it works with modern audio equipment.
First of all, you need to understand that "output impedance" is not at all the same as "minimum recommended load".
Output impedance describes a characteristic of the output circuit itself, while minimum recommended load is talking about the load that you connect to it.
This is conceptually important because a lot of modern equipment may have a VERY low output impedance (1 Ohm or less), but may still specify a much higher "minimum recommended load".
(An excellent example is a typical power amp, which is rated to drive a minimum 2 Ohm or 4 Ohm load, even though its output impedance is probably below 0.01 Ohms.)
OK.... IGNORING THINGS LIKE INTERCONNECTS, the general guideline is that, to avoid various issues, you want to make sure that the minimum recommended load on your source device is a lot lower than the input impedance of your destination device (10x is usually considered to be a reasonable safety margin; more is better).
NOTE that, for the most part, this is a threshold type situation; it DOES NOT suggest that a system with a 20x safety margin is likely to sound better than one with a 10x safety margin.
(What actually happens is that, if you exceed or approach the limit, you may get a very low output level, or you may even cause the source device to distort... but it will be fine until then.)
Exactly what happens when you exceed the safety margin depends on the circuitry in both pieces of equipment.
With some combinations, you'll get a frequency response interaction; with some you'll get excessive distortion; and, with some, you'll just get a very low output level, but the signal will remain clean.
(With preamps and sources, it's pretty unlikely that you'd actually damage anything - but not impossible.)
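Here's a minimal Python sketch of that rule of thumb, just to make it concrete; the pairings and impedance figures are made up for illustration and aren't taken from any particular product.

# A minimal sketch of the guideline above: the destination's input impedance
# should exceed the source's minimum recommended load by a comfortable margin
# (10x here, purely as an illustrative threshold).
def bridging_margin(min_recommended_load_ohms, input_impedance_ohms):
    """Ratio of destination input impedance to the source's minimum
    recommended load; >= 10 is the safety margin used above."""
    return input_impedance_ohms / min_recommended_load_ohms

# Hypothetical pairings - the numbers are invented for illustration.
pairs = [
    ("SS preamp (600 Ohm min load) -> SS amp (20k input)", 600, 20_000),
    ("Vintage tube preamp (47k min load) -> SS amp (20k input)", 47_000, 20_000),
    ("Vintage tube preamp (47k min load) -> tube amp (470k input)", 47_000, 470_000),
]
for name, min_load, z_in in pairs:
    m = bridging_margin(min_load, z_in)
    verdict = "fine" if m >= 10 else "marginal or worse"
    print(f"{name}: margin {m:.1f}x -> {verdict}")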
This is specifically an issue with vintage tube equipment.......
"In the old days" most tube amps had an input impedance between 100k and 1 megOhm, and many vintage preamps were designed to drive this sort of load.
They literally were not designed to drive a load lower than 47k or thereabouts; so some vintage preamps will have serious difficulties driving the 10k or 20k input impedance of many solid state power amps.
(Note that, in specific designs, using a lower input impedance will help you get better noise performance; but that isn't to say that designs with lower impedance are necessarily better.... and you can't "improve the design" just by lowering the input impedance; it's part of an overall design consideration.)
IN GENERAL, a tube preamp with a cathode follower output will both have a low output impedance, and be willing to drive relatively low impedance loads (like most solid state amps).
IN GENERAL, a simple tube preamp, using a PLATE COUPLED output from a tube like a 12AX7, will NOT be happy driving a load below 100 kOhms or so.
NOW, we need to revisit that line about "ignoring things like interconnects"... specifically as relates to tube equipment.
So far, we've been sort of assuming that you're connecting both pieces of equipment directly together (no wire).
However, this obviously is rarely the case in real life... so here's the deal.
The input on most pieces of equipment is pretty much a resistive load (this is even mostly true at audio frequencies if the input is capacitor coupled).
So, when the output impedance of your source interacts with the input impedance of the destination (both resistive), it tends to affect level, distortion, and maybe noise - but not the frequency response.
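To put a number on that resistive interaction, here's a tiny Python sketch of the voltage divider formed by the two impedances; the 30 kOhm figure is just a stand-in for a high plate-coupled output impedance, not a spec for any particular preamp.

import math

# Sketch of the purely resistive interaction described above: the source's
# output impedance and the destination's input impedance form a voltage
# divider, which changes level but not frequency response.
def level_loss_db(z_out_ohms, z_in_ohms):
    """Level drop (dB) from the divider formed by output and input impedance."""
    return 20 * math.log10(z_in_ohms / (z_out_ohms + z_in_ohms))

print(level_loss_db(100, 20_000))     # solid state into a 20k input: about -0.04 dB
print(level_loss_db(30_000, 20_000))  # high plate-coupled output into 20k: about -8 dB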
HOWEVER, most interconnect cables are mostly capacitive (this is true for all normal shielded coaxial cable - and for most, but not all, weird audiophile cables).
What happens then is that the output impedance of the source interacts with the capacitance of the interconnect cable to produce a high-cut filter.
The result is that the treble is rolled off - perhaps quite audibly.
Whether this happens, and how much, is determined by the output impedance of your source (not the minimum load), and by the capacitance of your cable (and it's also affected to a degree by the input impedance of your destination device).
Capacitance is also DIRECTLY related to cable length; for a given type of wire, if you double the length, you double the capacitance.
Without getting into numbers (which vary a lot), with MOST solid state equipment, the output impedance is low enough that the capacitance of MOST interconnects shouldn't make an audible difference.
(We might typically be talking about being -0.1 dB at 10 kHz and -0.2 dB at 20 kHz....)
However, the longer the wires you use, and the higher the output impedance on your source device, the more likely the interaction will be at least slightly audible.
The situation is very different, though, if you have a tube preamp with a 100 kOhm output impedance.
In that situation, even a relatively short interconnect may make an audible difference, and the difference with long cables will be noticeable.
(Again, the difference is that the high frequencies will be rolled off, and they will be rolled off more with a longer cable.)
This is why, since cables themselves vary considerably, it's a good idea with tube equipment to use cables with relatively low capacitance, kept as short as possible.
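If you want to play with the numbers yourself, here's a rough Python sketch of that high-cut filter; the output impedances, the 100 pF-per-metre cable figure, and the lengths are all assumptions for illustration, and the destination's input impedance (which shifts things a little) is ignored to keep it simple.

import math

# Rough sketch of the high-cut filter described above: the source's output
# impedance (R) and the cable's capacitance (C) form a first-order low-pass.
# All of the figures below are illustrative assumptions, not measured specs.
def rolloff_db(freq_hz, r_out_ohms, cable_pf):
    """Attenuation in dB at freq_hz for a simple R-C low-pass."""
    c_farads = cable_pf * 1e-12
    return -10 * math.log10(1 + (2 * math.pi * freq_hz * r_out_ohms * c_farads) ** 2)

pf_per_metre = 100  # ballpark for ordinary coax interconnect; doubles when length doubles

for label, r_out, metres in [("solid state, 100 Ohm out", 100, 2),
                             ("solid state, 100 Ohm out", 100, 10),
                             ("plate-coupled tube, 30 kOhm out", 30_000, 2),
                             ("plate-coupled tube, 30 kOhm out", 30_000, 10)]:
    c_pf = pf_per_metre * metres
    print(f"{label}, {metres} m cable: "
          f"{rolloff_db(10_000, r_out, c_pf):.2f} dB at 10 kHz, "
          f"{rolloff_db(20_000, r_out, c_pf):.2f} dB at 20 kHz")

With those assumed numbers, the solid state cases stay within a few hundredths of a dB, while the high-impedance tube output loses a couple of dB at 20 kHz with a short cable and much more with a long one.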
The output impedance of the source device also relates to the likelihood of picking up noise in the interconnect cable.
Again, without getting into numbers, the lower the output impedance on your source device, the more immune the connection will be to picking up noise from outside the cable.
This generally isn't an issue with solid state equipment, which generally has an output impedance below 1 kOhm, but can be an issue for those tube preamps with a very high output impedance.
(Which is why, with them, you want to use a short cable, with low capacitance, and the best shielding you can manage....
It's also why tube devices with low impedance cathode follower outputs are considered better by most people.)
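And a very rough sketch of the noise side of it, treating whatever couples into the cable as a tiny current source and looking at the voltage it develops across different source impedances; the 1 nA figure is completely arbitrary, just to show the scaling.

# Rough sketch: noise coupled into the cable behaves (very roughly) like a
# small current source, and the voltage it develops scales with the
# impedance it sees. The 1 nA value is an arbitrary stand-in.
noise_current_amps = 1e-9
for label, z_source in [("solid state, 100 Ohm", 100),
                        ("cathode follower, 1 kOhm", 1_000),
                        ("plate-coupled tube, 30 kOhm", 30_000)]:
    v_noise = noise_current_amps * z_source
    print(f"{label}: roughly {v_noise * 1e6:.2f} uV of induced noise")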
However, to come back to the original question: as long as you don't arrive at an especially bad combination, impedance in and of itself has nothing whatsoever to do with how something sounds.
A few replies here...
I don't think ARC meant for only tube amps to be used with the REF 5SE. I've heard it with Ayre and Bryston amps and it sounded fantastic too. The XPA-1s sound great too, just... different...
The hard part here is trying to fully understand how these impedances play together and what that means for perceived sound. Should have studied more electronics in college!