|
Post by Casey Leedom on Jan 3, 2019 12:43:04 GMT -5
This is purely a curiosity question; I'm very uneducated in this area.
If we have an HDMI Switch with multiple outputs but no Video Scaling/Processing capability, how does such a device cope with Output Devices that have differing Resolutions, Color Spaces, etc.? I assume there will be cases where the same Input Device is routed to both Output Devices for "Mirroring" (Monitoring).
I think HDMI is not an End-to-End protocol in terms of Datagram Framing; that is, it's a Link-by-Link protocol with respect to Data Transport. But it appears to be End-to-End with respect to exchanging Capability Information like Resolution, Color Space, etc.? (Is this the job of HDCP and those "EDIDs" (Extended Display Identification Data) I see bandied about?) That is, if a single Input Device is "switched" to be Mirrored on two differing Output Devices, does the HDMI Switch have to pass the lowest-common-denominator Output Device Capabilities back to the Input Device?
As I said, I really don't know anything about HDMI.
Casey
|
|
|
Post by thrillcat on Jan 3, 2019 14:28:51 GMT -5
Think of the basic switch as a splitter, and it will make more sense. It sends the same lowest-common-denominator signal to both places; it can't generate a second, different signal for the second display.
|
|
bootman
Emo VIPs
Typing useless posts on internet forums....
Posts: 9,358
|
Post by bootman on Jan 3, 2019 16:19:50 GMT -5
It is easiest to think of HDMI as a chain that is only as good as its weakest link. So on a passive switch with dual outputs feeding a 4K display and a 1080p display, both outputs will be 1080p.
On a video processor with dual outputs, you can send different signals out, since the processor will convert them. (Note: the XMC is NOT a video processor.)
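If it helps, here's the idea as a rough Python sketch (a toy model only, with made-up resolution lists; real EDID handling is far more involved than this):

def combined_modes(sink_a, sink_b):
    # A passive splitter can only advertise modes that BOTH sinks support.
    return sink_a & sink_b

uhd_tv = {2160, 1080, 720, 480}   # vertical resolutions the 4K set accepts
hd_tv = {1080, 720, 480}          # the 1080p set

common = combined_modes(uhd_tv, hd_tv)
print(max(common))   # 1080 -> both outputs end up at 1080p

The 4K set never sees 2160p, because the splitter can only offer the source what every attached display can handle.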
|
|
|
Post by Gary Cook on Jan 3, 2019 16:24:42 GMT -5
"How does a switch handle 2 different outputs?" It doesn't, it only has one output, that being the lowest common denominator. Which is one of the basic issues with HDMI and HDCP, the licensing requirements for end to end compliance (and everything in between). We, as consumers, pay a huge price for copyright protection that is easily broken if someone really wants too.
Cheers Gary
|
|
|
Post by dreamwarrior on Jan 15, 2019 17:01:57 GMT -5
^^ Not lowest common denominator; it's the greatest common supported format, no? Plus, how is this any different from analog? I mean, if I put a 1080p output through a component video splitter and hooked one end to an older 720p set, I'm not going to see anything but static on that set as it fails to sync to the signal. The "value add" of HDMI's EDID scheme is that, ideally, both units receive a compatible signal. When it works, I'd call that a plus.
IMO, the biggest PITA in HDMI is HDCP; specifically, the requirement that all sinks in the chain negotiate keys before transmission can start, which causes delays. The fact that this negotiation occurs any time the audio/video feed changes (instead of only when units drop in or out of the chain) makes the delays even more pervasive. I believe the fast-switching feature in the 2.1 spec aims to avoid this.
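As a toy illustration of why that stings (a drastically simplified Python model with an invented handshake time, nothing like the real HDCP state machine):

import time

HANDSHAKE_SECONDS = 2.0   # invented figure; real delays vary widely by device

class ToyHdcpLink:
    """Drastically simplified: any change to the A/V format re-runs the key
    exchange with every sink, blanking the picture in the meantime."""

    def __init__(self, sink_count):
        self.sink_count = sink_count
        self.authenticated = False

    def negotiate(self):
        time.sleep(HANDSHAKE_SECONDS * self.sink_count)  # one handshake per sink
        self.authenticated = True

    def change_format(self, new_mode):
        self.authenticated = False   # picture drops...
        self.negotiate()             # ...until every sink re-authenticates
        print("now showing", new_mode)

link = ToyHdcpLink(sink_count=2)
link.change_format("1080p60")   # even a simple mode change costs ~4 seconds here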
|
|
bootman
Emo VIPs
Typing useless posts on internet forums....
Posts: 9,358
|
Post by bootman on Jan 15, 2019 17:05:07 GMT -5
Because HDMI is a DIGITAL signal, not an ANALOG one, the LOWEST resolution display wins (for unprotected content). HDCP won't let any output come out of a protected file unless all displays are 4K and the correct version of HDCP is supported by both.
www.lifewire.com/hdmi-switchers-basics-4158344
|
|
|
Post by dreamwarrior on Jan 16, 2019 18:34:50 GMT -5
Uhhh, I don't know what I posted to make you think I didn't know HDMI was digital, but... the "lowest" resolution doesn't win. Also, I think I made it clear that, despite being digital, HDMI negotiation ideally results in the same best working solution that analog could have delivered. Example: if device A supports (1080p, 720p, and 480p) and set B supports (4K, 1080p, 720p, and 480p), then the lowest resolution would be 480p, but the greatest common resolution would be 1080p, which is what HDMI would select. Do you disagree? This also doesn't take into account support for different HDCP versions, which is another factor in successfully negotiating 4K on "compliant" devices that require newer HDCP versions for 4K output. (HD Fury works around this, though AFAIK they got pestered over converting from one HDCP version to another, and I believe were even sued.)
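In code terms, the selection I'm describing is just "intersect the capability sets, then take the best"; a Python sketch using my made-up mode lists from above:

device_a = {1080, 720, 480}        # source's supported vertical resolutions
set_b = {2160, 1080, 720, 480}     # display's supported resolutions

common = device_a & set_b
print(min(common))   # 480: the literal "lowest" resolution, which nobody picks
print(max(common))   # 1080: the greatest common format, which negotiation targets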
|
|
bootman
Emo VIPs
Typing useless posts on internet forums....
Posts: 9,358
|
Post by bootman on Jan 16, 2019 18:41:50 GMT -5
Thanks for clarifying. Yes, I do agree!
|
|