KeithL
Administrator
Posts: 10,274
Post by KeithL on May 9, 2023 9:18:08 GMT -5
The simple reality is that these days simplicity is not considered to be an important virtue. Processors have HDMI outputs because TVs and projectors have HDMI inputs; and TVs and projectors have HDMI inputs because processors have HDMI outputs. Likewise, preamps and pre/pros have analog audio outputs because almost all amps have analog inputs; and amps have analog inputs because most preamps and processors have analog outputs. So, at the moment, pro amps use XLR inputs, and TVs use HDMI inputs, because both are the standard. As more and more people stream directly to their TVs, over WiFi or an Ethernet cable, we may wonder how long it will be before HDMI inputs on TVs and projectors become "optional".

In the context of our discussion... I own several pairs of small computer speakers that have USB inputs... so, at least externally, they have an "all-digital signal path". I have no idea whether they are that way internally or not - because the folks who sell them don't consider that detail to even be worth mentioning. And the reason most computer speakers have a USB input is that most people agree it is the best overall audio output on most computers.

And, at least for now, many people still have a preamp, or a DAC with analog inputs, because they still have one or two analog sources - but some of us DON'T have any analog sources at all. (Of course we still need an analog output on our DAC or preamp to drive our analog amplifier; and we need that analog amplifier to drive the input on our analog speakers; for now.) And, at least for now, most DACs still have optical and coax inputs - because, while computers have USB DAC outputs, most CD players and streamers still do not. But, as DACs with USB inputs, and DACs with ONLY USB inputs, become more common, we can expect to see USB DAC outputs start showing up on more streamers and other sources. (I can connect a simple USB DAC like our Ego+ to my computer, or a Raspberry Pi, or even my phone; it seems foolish that, at least for the moment, I can only do so with one or two CD players and streamers.)

However, we ARE already seeing systems like Dante, which is gaining popularity in studio audio systems, where EVERYTHING is done via Ethernet. A Dante controller sends audio directly to the input of a Dante-powered speaker over ordinary Ethernet cabling and switches. (So, externally, Dante is "purely digital"... and does it really matter to the end user if there are any analog stretches left in the internal signal paths of those speakers?)

...Imagine if your source is a CD, or a digital audio file, or a digital audio stream... (which is often the case these days). And it has a digital audio output... In that case, if you have a Class-D amplifier, then the fewest conversions would be achieved by never converting the signal into analog at all... (There's no reason to add a D/A at the source, and then an A/D at the amp.) (But you could put a DSP processor between them without having to perform ANY conversions.) (And, if you need a volume control, or even tone controls, those can all be implemented "perfectly" as DSP functions.)

... The word of note above is "could." Yes, a totally digital signal path is possible, but offhand I know of no pro amplifier manufacturer who offers one - I'm not saying it doesn't exist, just that I haven't stumbled across any yet. All the pro amps I've seen use analog XLR inputs. But having an all-digital signal path would be cleaner, by far!
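Just to make the "volume and tone controls as DSP functions" point concrete, here is a minimal sketch (my own illustration, assuming 24-bit PCM and NumPy; it isn't anyone's actual firmware) of a purely digital volume control - nothing more than a multiply, with a little dither before requantizing, applied before the signal ever reaches the DAC or Class-D output stage:

```python
# Minimal sketch of a digital-domain volume control for 24-bit PCM samples.
import numpy as np

def digital_volume(samples_24bit: np.ndarray, gain_db: float) -> np.ndarray:
    """Attenuate PCM samples by gain_db (<= 0 dB) entirely in the digital domain."""
    gain = 10.0 ** (gain_db / 20.0)                 # convert dB to a linear multiplier
    scaled = samples_24bit.astype(np.float64) * gain
    # Roughly 1 LSB of TPDF dither before truncating back to integers,
    # so low-level detail is not simply chopped off by requantization.
    dither = (np.random.uniform(-0.5, 0.5, scaled.shape)
              + np.random.uniform(-0.5, 0.5, scaled.shape))
    return np.round(scaled + dither).astype(np.int32)

# Example: a full-scale 1 kHz test tone at 48 kHz, attenuated by 20 dB.
t = np.arange(48000) / 48000.0
tone = np.round((2**23 - 1) * np.sin(2 * np.pi * 1000 * t)).astype(np.int32)
quieter = digital_volume(tone, -20.0)
```

A digital shelving filter for tone controls works the same way: a few multiplies and adds per sample, with no extra conversions anywhere in the chain.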
Post by leonski on May 9, 2023 11:57:22 GMT -5
HDMI was an update on an older and non-compatible standard. I've seen DVI, which had NO audio contained in the signal. It worked, but was obviously flawed.
KeithL
Administrator
Posts: 10,274
Post by KeithL on May 9, 2023 13:57:29 GMT -5
DVI was actually the basis for HDMI... and some variations of DVI can be "converted to HDMI" with a simple passive adapter cable. The main difference, other than not carrying audio, was that DVI did NOT support "strong copy protection", which rendered DVI essentially incompatible with commercial copy-protected HDMI content. (And, unfortunately, the HDCP copy protection that is a mandatory part of HDMI is responsible for a large percentage of the complexity and problems associated with HDMI.)
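For anyone curious why the HDCP handshake adds so much complexity (and so many failure points), here is a toy model - heavily simplified, and definitely not the real spec - of the Blom-style key agreement HDCP 1.x is usually described as using: each device carries a public 40-bit KSV and 40 secret 56-bit keys issued from one secret symmetric matrix, and both ends arrive at the same session secret by summing the keys selected by the other side's KSV:

```python
# Toy model of an HDCP-1.x-style key agreement (illustrative only, not the real protocol).
import random

MOD = 2 ** 56
random.seed(1)

# Licensing authority's secret: a symmetric 40x40 matrix of 56-bit values.
master = [[0] * 40 for _ in range(40)]
for i in range(40):
    for j in range(i, 40):
        master[i][j] = master[j][i] = random.randrange(MOD)

def make_device():
    """Issue a device: a public KSV (exactly 20 of 40 bits set) and 40 private keys."""
    ksv = [0] * 40
    for bit in random.sample(range(40), 20):
        ksv[bit] = 1
    keys = [sum(master[i][j] * ksv[j] for j in range(40)) % MOD for i in range(40)]
    return ksv, keys

def shared_secret(my_keys, their_ksv):
    """Sum (mod 2^56) of my private keys selected by the other device's KSV bits."""
    return sum(k for k, bit in zip(my_keys, their_ksv) if bit) % MOD

tx_ksv, tx_keys = make_device()   # e.g. a source / set-top box
rx_ksv, rx_keys = make_device()   # e.g. a TV or monitor

# Both sides derive the same value without ever exchanging their private keys.
assert shared_secret(tx_keys, rx_ksv) == shared_secret(rx_keys, tx_ksv)
```

A display that was never issued valid keys (or whose keys have been revoked) simply can't complete this step - which is exactly the kind of silent failure people ran into with early "HDMI ready" sets.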
cawgijoe
Emo VIPs
"When you come to a fork in the road, take it." - Yogi Berra
Posts: 5,035
Post by cawgijoe on May 9, 2023 14:08:12 GMT -5
I still have a Sony XBR tube TV (34" widescreen) in the basement that has a DVI input for video. My Roku has an HDMI-to-DVI cable attached to it, and it works. My wife uses it when she's riding the exercise bike, mostly in the winter. It will likely be given away, or head to the dump, when we move next year.
KeithL
Administrator
Posts: 10,274
Post by KeithL on May 9, 2023 15:07:13 GMT -5
To be quite blunt... you were lucky. DVI actually worked fine for computer monitors (although it has now been replaced by HDMI). Most of the problems between DVI and HDMI occurred right at the transition between the two. Right at that point there were quite a few TVs and monitors that had DVI inputs but shipped with a passive HDMI-to-DVI adapter cable... And, right before HDMI officially took over, many TV sets had HDMI connectors, but were not fully compliant with the standard, and were sold as "HDMI ready"... (And, in some cases, this meant that they could handle HDMI video just fine, but failed to comply with the associated HDCP copy protection standard.)

And, to make matters even more muddy, in the early days some cable boxes, and even some specific premium cable channels, had not initially enabled HDCP. (Note that HDCP copy protection is mandatory for things like Blu-ray discs... but for other content the requirement is determined by the content creator.) So, for example, when certain premium cable channels were first offered on certain cable networks, the HDCP copy protection was for some reason NOT enabled... And that allowed those channels to play on older TVs that were not fully HDCP compliant... Then, one day, the network enabled HDCP on those channels... And "suddenly" those premium channels would no longer play on some older TVs... (Except, since the change wasn't officially announced, until someone figured it out, all those viewers knew was that those channels "mysteriously no longer worked" on their TVs.)

And, to make things more annoying, while video piracy is certainly common, it generally occurs at points in the signal chain that aren't protected by HDCP anyway. All of this leads to a fairly widely stated consensus that: "The copy protection on HDMI causes a lot of problems that annoy legitimate customers while not really doing much to cut down on illegal video piracy."
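To picture what viewers experienced when a network flipped that switch, here is a purely conceptual sketch (not a real HDMI or HDCP API - just my own illustration of the logic) of how a source gates its output once a content flag starts requiring HDCP:

```python
# Conceptual sketch only: how a source's behavior changes when content starts requiring HDCP.
from dataclasses import dataclass

@dataclass
class Sink:
    name: str
    hdcp_capable: bool    # an early "HDMI ready" TV might be False

def can_output(content_requires_hdcp: bool, sink: Sink) -> bool:
    """Return True if the source is allowed to send this content to this sink."""
    if not content_requires_hdcp:
        return True              # flag off: plays on anything with an HDMI/DVI input
    return sink.hdcp_capable     # flag on: HDCP authentication must succeed first

old_tv = Sink("early 'HDMI ready' set", hdcp_capable=False)
new_tv = Sink("fully compliant set", hdcp_capable=True)

print(can_output(False, old_tv))   # True  - before the network enabled HDCP
print(can_output(True, old_tv))    # False - after the flag was switched on
print(can_output(True, new_tv))    # True
```

Nothing about the TV changed; the content flag did - which is why, from the couch, the channel just "mysteriously" went dark.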
Post by leonski on May 9, 2023 21:14:35 GMT -5
I had a client who wanted me to hook up his (DVI) TV as part of a video display at his shop. It was a case of 'mission creep', since I would have had to come up with RCA-connected audio and make that work. And sync was not assured!
I had a TV with DVI, and now realize I might have been able to get THAT to work, since the TV presumably had some provision for bringing the audio in alongside the DVI video signal. But that's a long time ago.....
cawgijoe
Emo VIPs
"When you come to a fork in the road, take it." - Yogi Berra
Posts: 5,035
Post by cawgijoe on May 9, 2023 21:42:10 GMT -5
From the 34XBR800 manual:

"Video 7 DVI/HDTV Input: This input consists of stereo RCA jacks and a DVI HDTV terminal. The DVI HDTV terminal can accommodate a copy-protected digital connection (HDCP) to other devices (such as digital set-top boxes) that have compatible interfaces. The DVI-HDTV input terminal is compliant with the EIA-861 standard and is not intended for use with personal computers."
Post by Boomzilla on May 10, 2023 8:01:17 GMT -5
...I'd love to audition a TOTALLY and REALLY digital amp from input to speaker outs..... NAD?
Post by leonski on May 10, 2023 13:01:04 GMT -5
You'll need to look at the 'Masters Series', which is a little $$$ but maybe worthwhile?