Post by 2muchht on Oct 4, 2017 16:01:26 GMT -5
Just remember that the forthcoming Xbox One X WILL have HDMI 2.1 either at launch or shortly thereafter. No real need for it, but sometimes you want to be first.
In reality, this is a reasonable future-proofing step, but FOR NOW there is nothing that requires it. The video processing power of this console could clearly render out 12-bit/4:2:2 or even higher, which would "break the bank" for HDMI 2.0. Since they are not commercially available yet I haven't seen a teardown, so any notion of what the HDMI Tx is would be speculation.

Remember, the original PS3 "Fat" model was the first product with HDMI 1.3, and they were uniquely able to pull that trick off due to some VERY special and even MORE unique architecture. Xbox and PlayStation can do things that A/V manufacturers, even the "big guys," can't; they are stuck with the two current major chip providers. There are one or two others now, but you won't see them in high-quality products. There is at least one more coming, but it is not here yet. TV sets often have the Rx capability built into their SoC, but others use Lattice or Panasonic merchant, off-the-shelf parts.
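Just to put rough numbers on that "break the bank" point, here's a quick back-of-the-envelope sketch (not spec math: it counts active pixels only, ignores blanking, and uses the commonly cited effective payload figures after link-coding overhead, so real requirements run somewhat higher):

```python
# Back-of-the-envelope raw video data rates versus HDMI payload capacity.
# Active pixels only, no blanking; treat the results as rough orders of
# magnitude rather than spec numbers.

CHROMA_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def raw_gbps(width, height, fps, bit_depth, chroma):
    """Uncompressed active-pixel data rate in Gb/s."""
    return width * height * fps * bit_depth * CHROMA_FACTOR[chroma] / 1e9

# Commonly cited effective payload capacities after link-coding overhead.
links = {"HDMI 2.0 (~14.4 Gb/s payload)": 14.4,
         "HDMI 2.1 (~42.7 Gb/s payload)": 42.7}

formats = [
    ("4K60  8-bit 4:2:0", 3840, 2160, 60,  8, "4:2:0"),
    ("4K60 12-bit 4:2:2", 3840, 2160, 60, 12, "4:2:2"),
    ("4K60 12-bit 4:4:4", 3840, 2160, 60, 12, "4:4:4"),
    ("8K60 10-bit 4:2:0", 7680, 4320, 60, 10, "4:2:0"),
    ("8K60 12-bit 4:2:2", 7680, 4320, 60, 12, "4:2:2"),
]

for name, w, h, fps, depth, chroma in formats:
    rate = raw_gbps(w, h, fps, depth, chroma)
    fits = [k for k, cap in links.items() if rate <= cap]
    print(f"{name}: ~{rate:5.1f} Gb/s raw -> fits: {', '.join(fits) or 'neither'}")
```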
Trust me (or don't), this isn't easy for anyone.
Oh, and one more for Casey: In the HDMI world the test sequence is bundled together in what is called the "CTS", or Compliance Test Specification. Not only does that have to be completed...and for HDMI 2.1 it is apparently not done yet...you also need to have companies such as Teledyne LeCroy/Quantum Data update their widely used test and measurement gear to perform and report out on the CTS regimen. That obviously can't be done until the CTS is done. Same for Astro and the few others who make the specialized test generators and measurement/test devices one needs.
Again, with apologies to any dentists or medical professionals here, this makes a trip to the dentist or the proctologist seem like a day at the beach.
Post by Casey Leedom on Oct 4, 2017 16:44:23 GMT -5
So funny 2muchht ... I was recently at the University of New Hampshire's InterOperability Laboratory[1] with a bunch of other manufacturers of 25Gb/s and 100Gb/s Ethernet equipment for ... well, interoperability testing ... :-) And, in addition to the various Ethernet equipment manufacturers, there were of course representatives of Ethernet test equipment manufacturers such as LeCroy, Viavi, etc.

During the Social Hour near the end of the four-day testing effort I ended up asking "who needs [the next generation of] 200Gb/s and 400Gb/s Ethernet?" Pretty much everyone there was very dubious regarding the market for this. Each time we turn the crank, it takes the Ethernet and test equipment manufacturers about the same amount of time and money to get to the new Ethernet generation. But each time the size of the addressable market shrinks ... without the ability to charge enough more to cover the costs of development. So the Ethernet industry is headed for a very nasty wall where no one can make enough money on the high end and no one except the big FABs can make money on the low end (and maybe not even the big FABs). I joked that we need a new Killer Application which everyone wants and which will demand this increased bandwidth. I sort of wonder if HDMI is heading in the same direction ...

Casey

[1] The University of New Hampshire's InterOperability Laboratory is a fabulous resource by the way, and they could take on HDMI Interoperability Testing as well under their mandate.
Post by 2muchht on Oct 4, 2017 18:06:42 GMT -5
...The University of New Hampshire's InterOperability Laboratory is a fabulous resource by the way and they could take on HDMI Interoperability Testing as well under their mandate. Thanks, Casey. Wanted to thank you for giving me the opportunity to post my 500th contribution here. Yes, yes, I know that is a mere fraction of others' totals, but I choose my topics carefully, and despite the relatively small number of posts I've been a Lounge "member" almost since the beginning of this, over ten years ago.

That aside, oh how I wish there were another US-based HDMI Certified Test Lab. Unfortunately, there are way too few over the whole world as it is. The issue is more than the test gear and base knowledge needed. The whole facility has to be certified by HDMI Licensing and/or the Forum. If they are interested, have them talk to the HDMI LLC people and see if it makes sense for both sides. Hope that they can do this and make a profit to support other UNH activities.
Post by cwt on Oct 6, 2017 11:49:38 GMT -5
Post by Casey Leedom on Oct 6, 2017 12:31:27 GMT -5
The HDMI 2.1 transmission bandwidth is 48Gb/s, which means that the signaling rate is 24Gb/s/lane. We do 100Gb/s Ethernet here, which is 25Gb/s/lane, so what they're talking about isn't completely insane. However, there are several very big differences:
- The HDMI Connector is a complete joke. I want to find the people who specified it and shoot them.
- In 100Gb/s Ethernet, we count 5m cables as long. We use optical for anything longer.
I think that this will be the big problem for HDMI 2.1. They should just get rid of the old connector and specify a much better, shielded, locking connector. See the Ethernet QSFP28 connector for a much better design. They should also establish an Optical Transceiver Module as an option for very long runs. And to be honest, they should consider piggy-backing on the already extant 100Gb/s Ethernet infrastructure. In the next Ethernet generation we'll be doing 50Gb/s/lane, which will open up the possibility of 400Gb/s, which would completely future-proof HDMI.
Casey
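For what it's worth, here's a rough sketch of the per-lane arithmetic being compared here. The Ethernet entries use their standard lane configurations; for HDMI 2.1 only the 48Gb/s aggregate is taken as given, so two possible lane splits are shown purely as assumptions:

```python
# Per-lane rates implied by the aggregate link speeds discussed above.
# Ethernet lane counts are the standard configurations; the HDMI 2.1 lane
# splits are assumptions, since only the 48 Gb/s aggregate is taken as given.
links = [
    # (name, aggregate Gb/s, high-speed lanes)
    ("HDMI 2.0 (TMDS)",            18,  3),
    ("HDMI 2.1, assuming 2 lanes", 48,  2),
    ("HDMI 2.1, assuming 4 lanes", 48,  4),
    ("100GbE (QSFP28)",           100,  4),
    ("400GbE (50G serdes)",       400,  8),
]

for name, aggregate, lanes in links:
    print(f"{name:28s} {aggregate:3d} Gb/s = {lanes} x {aggregate / lanes:.0f} Gb/s/lane")
```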
Post by AudioHTIT on Oct 11, 2017 13:23:27 GMT -5
Post by garbulky on Oct 11, 2017 14:08:26 GMT -5
It's tough to say. I remember when HDMI 2.0 and 4K were introduced. I immediately pointed out that, with the XMC-1 on HDMI 1.4, this would make it obsolete if it didn't release with HDMI 2.0. Not many people were even slightly concerned, as there weren't many HDMI 2.0 sources. Ultra HD Blu-ray wasn't out. 4K was being looked at derisively. "No difference" was mentioned many times.
I remember telling my friends the importance of HDMI 2.0 when they were spending money on the latest 4K TVs - a lot of which did not have HDMI 2.0 or HDR. Most tuned me out. Now, with HDMI 2.0 starting to become strongly accepted, a device without HDMI 2.0 is seriously handicapped. The confusion around HDMI 2.0 is still very prevalent. Just yesterday I was helping a friend sort out the HDMI 2.0 requirement - turns out the receiver was only HDMI 1.4. I also pointed out the issue of no Atmos. People didn't seem to care. Heck, I'm still on 2.0. But this quickly became such a sticking point for the XMC-1 that Emotiva wisely made the choice to support it.
The point is, things we think of as ridiculous now (8K video?!) quickly become deal breakers in the future when technology moves on. Also, HDMI 2.1 has dynamic HDR, which is supposed to be a huge deal for HDR standards. I do wonder how on earth technology can even catch up with the minimum requirements of HDR. 4K video is already pretty huge in its requirements. 8K video and all the other stuff sounds unworldly to me. But it will come.
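To put a rough number on how fast those requirements grow, here's a quick sketch of raw pixel counts only (compressed streaming and disc bitrates grow more slowly than this thanks to better codecs, but the trend is the same):

```python
# Raw pixel counts per frame for each step up in resolution (a sketch only;
# compressed bitrates don't scale linearly with pixel count, but the trend
# is what makes each jump feel so demanding).
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "4K UHD":          (3840, 2160),
    "8K UHD":          (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:16s} {pixels / 1e6:5.1f} Mpixels/frame "
          f"({pixels // base}x the pixels of 1080p)")
```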
Post by Bonzo on Oct 11, 2017 15:05:42 GMT -5
I find it odd so many people worry about all this in a processor and consider it totally obsolete. All the processor usually does is some switching and OSD (mainly, in everyday life, volume). Considering I've been living without those things for over 10 years now, I really couldn't care less. Even if I (ever) get an 8K display, if my processor doesn't handle it, I'll just wire the TV direct and run the audio to the processor separately, just like I do now. No biggie.
I've said it before and I'll say it again, HDMI is the best and worst A/V connection of all time. A complete double-edged sword.
Post by Casey Leedom on Oct 11, 2017 16:28:54 GMT -5
... Also HDMI 2.1 has got dynamic HDR which is supposed to be a huge deal for HDR standards. ... I'm pretty sure that people have said that Dynamic HDR is going to be "back-ported" to HDMI 2.0b ... Casey
Post by garbulky on Oct 11, 2017 16:48:07 GMT -5
I find it odd so many people worry about all this in a processor and consider it totally obsolete. All the processor usually does is some switching and OSD (mainly, in everyday life, volume). Considering I've been living without those things for over 10 years now, I really couldn't care less. Even if I (ever) get an 8K display, if my processor doesn't handle it, I'll just wire the TV direct and run the audio to the processor separately, just like I do now. No biggie. I've said it before and I'll say it again, HDMI is the best and worst A/V connection of all time. A complete double-edged sword.

I agree. Why does a processor have to be compatible? IMO there's no reason for it to perform any video duties. They should have developed a standard for HDMI receivers so that, no matter how advanced the HDCP protocols for digital rights get, one primitive version is considered sufficient for secondary, non-visual gear, while only primary gear like the display and source needs to be "up to date". There is nothing really needed in most of these HDMI standards that has to be a "feature" in a processor. Also, why are media companies so uptight about their security? I guarantee it'll get cracked very quickly anyway.
Post by Casey Leedom on Oct 11, 2017 17:09:30 GMT -5
Except that they can't do this. Even if they had defined HDMI as a bunch of high-speed signal lines which processors "just passed through" and then some very stable low-speed lines for control, audio, etc., as the speeds get higher, the work needed to maintain signal integrity on "pass through" lines gets harder and harder.
And it would make doing On Screen Displays "interesting", but not impossible. To do this they'd have to have designated some more low-speed video signal lines explicitly for overlay video, which the TV/monitor would need to composite.
Casey
Post by hemster on Oct 11, 2017 18:00:20 GMT -5
...I've said it before and I'll say it again, HDMI is the best and worst A/V connection of all time. A complete double edged sword. Yes, and for now, we have no choice but to fall on that sword!
Post by garbulky on Oct 11, 2017 18:17:29 GMT -5
Except that they can't do this. Even if they had defined HDMI as a bunch of high-speed signal lines which processors "just passed through" and then some very stable low-speed lines for control, audio, etc., as the speeds get higher, the work needed to maintain signal integrity on "pass through" lines gets harder and harder. And it would make doing On Screen Displays "interesting", but not impossible. To do this they'd have to have designated some more low-speed video signal lines explicitly for overlay video, which the TV/monitor would need to composite. Casey

I think it's too late to make the original HDMI products backwards compatible with older devices. But I think, especially with HDMI 2.0, the issue of compatibility problems and confusion has reared its head like no other. It's as bad as when HDMI first appeared and people found out that HDCP created issues with playback of their DVDs. But that was near the beginning of the digital video revolution. Now that HDMI is the de facto standard, the issue of having to replace everything in the chain to keep up with the times may be more than people can bear. People find they have to upgrade their entire systems. At some point people don't want to do that. People can bear upgrading their TV because it brings them new stuff. But updating an audio system that one is perfectly happy with becomes a much tougher sell.

Think about it: most people aren't jumping on board the 4K thing. I don't know anybody who wants to buy 4K Ultra HD players outside of my audio friends. Lots of my friends don't care between Blu-rays and DVDs anyway. The more these things upgrade to finer quality, the less people will be convinced to upgrade, which leads to less uptake of new technology. Maybe it's time for a new feature to allow backward compatibility - at least from now on. To allow people to ease into new technology.
Post by Gary Cook on Oct 11, 2017 22:50:42 GMT -5
I'd say that's slightly misleading, though with the amount of 4K material available (small)

Here are the TV shows that we are currently watching on Netflix in 4K:
- The Blacklist
- Daredevil
- The Defenders
- House of Cards
- Iron Fist
- Jessica Jones
- Longmire (Series 6 starts next month)
- Luke Cage
- Marcella
- Medici: Masters of Florence
- Orange Is the New Black

Currently on Stan in 4K:
- Better Call Saul (yep, we get it in 4K)
- Electric Dreams
- Billions

Seems to me to be enough 4K material to keep us busy, and that's only the TV shows; there are docos and movies as well.
Cheers
Gary
Post by Gary Cook on Oct 11, 2017 23:26:26 GMT -5
I find it odd so many people worry about all this in a processor and consider it totally obsolete. All the processor usually does is some switching and OSD (mainly, in everyday life, volume). Considering I've been living without those things for over 10 years now, I really couldn't care less. Even if I (ever) get an 8K display, if my processor doesn't handle it, I'll just wire the TV direct and run the audio to the processor separately, just like I do now. No biggie.

It's OK for me to have to switch the 3 bits of gear to accommodate the Apple TV 4K (via ARC) and the Oppo 203 (2 x outputs), as the UMC-200 doesn't handle it, which means swapping inputs on the Samsung TV. But the WAF is not at all high. Plus the Samsung only has 3 x HDMI inputs, so if/when the cable box goes to 4K I'm screwed. To have ARC working on the Samsung I have to have CEC on, which causes another handful of problems with gear turning on and off when we don't want it to. Plus I'd like to go Atmos/DTS:X/Auro shortly, so there is a limit to how long I can live with the current compromises.
Cheers
Gary
Post by AudioHTIT on Oct 11, 2017 23:51:44 GMT -5
...Seems to me to be enough 4K material to keep us busy, and that's only the TV shows; there are docos and movies as well.

Yeah, that seems like plenty. I don't have the bandwidth to stream 4K; what does 4K Netflix take? I can watch Netflix on Dish though and have a 4K Hopper, but I don't know what 4K is enabled on it. I didn't finish the new cabinet I'm building and ski season's a month away, so the UHDTV purchase is put off until spring, and there's another model year to consider.
Post by Gary Cook on Oct 12, 2017 0:50:08 GMT -5
...Yeah, that seems like plenty. I don't have the bandwidth to stream 4K; what does 4K Netflix take?

Good question on the bandwidth; they say 25 Mbps. But I really don't know what it actually needs, as I have between 100 and 130 Mbps available from the cable and the ATV4K is on 802.11ac wifi, so we have never suffered from any buffering. It's springtime here of course and the racing season is over (too hot in race cars), but I have some house stuff to do before we get into preparing cars for next season. Plenty to do.
Cheers
Gary
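Some rough arithmetic on those numbers (a sketch that just takes the 25 Mbps recommendation and the 100-130 Mbps connection as givens, ignoring other traffic and overhead):

```python
# The 25 Mbps figure is Netflix's published 4K recommendation quoted above;
# the 100-130 Mbps connection speeds are the ones mentioned in the post.
RECOMMENDED_4K_MBPS = 25
connection_mbps = (100, 130)

# Data consumed per hour of 4K streaming at the recommended rate.
gb_per_hour = RECOMMENDED_4K_MBPS * 3600 / 8 / 1000
print(f"~{gb_per_hour:.1f} GB per hour of 4K at {RECOMMENDED_4K_MBPS} Mbps")

# How many simultaneous 4K streams each connection could carry, in theory.
for mbps in connection_mbps:
    print(f"{mbps} Mbps -> about {mbps // RECOMMENDED_4K_MBPS} concurrent 4K streams")
```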
Post by Bonzo on Oct 12, 2017 8:49:56 GMT -5
...It's OK for me to have to switch the 3 bits of gear to accommodate the Apple TV 4K (via ARC) and the Oppo 203 (2 x outputs), as the UMC-200 doesn't handle it ... so there is a limit to how long I can live with the current compromises. Cheers Gary

I am still using my old 2005 Denon with zero HDMI inputs and get damned near everything everyone else gets. Yeah, the 7.1 Dolby TrueHD and DTS-HD Master decoding has to be done inside my Blu-ray player, which has its downsides, but since I don't use (or really need) room correction in my room, no big loss. Now if broadcast TV actually had Dolby TrueHD or DTS-HD Master, I'd be missing something, but they don't, yet. So up until now, it all still can be done. Which makes my point that older HDMI in a receiver or processor does not make it totally obsolete like many people seem to think (or want to think).

Of course it matters differently for different people with different needs. Some people have 8 HDMI inputs they need. I get that. And it's a whole lot more convenient if those input plugs are on the back of the receiver/processor. (Something I argue also still holds true for RCA analog, which many people have dissed me for here.) But it's mostly a factor of convenience, simply to ease switching and get volume indicators on the TV. Emotiva reps have told me many times, for my RCA analog woes, to get an external switcher. People have considered me difficult for not accepting that. Well, why isn't what's good for the goose good for the gander? For anyone who needs lots of HDMI inputs, why not just get yourself an external switcher?

Of course now there is Atmos and DTS:X. So now, if you want them, THOSE do in fact require a new processor. But again, as HDMI progresses, folks could still run HDMI lines separately, and/or get that external switcher, instead of buying an entire new device just for inputs. That's most certainly what I'm going to do. Yep, analog is old and a waste of time. HDMI rocks. NOT!
Post by Bonzo on Oct 12, 2017 8:56:15 GMT -5
...I've said it before and I'll say it again, HDMI is the best and worst A/V connection of all time. A complete double-edged sword. Yes, and for now, we have no choice but to fall on that sword!

Well, I haven't fallen on it for 10 years now and I haven't missed a thing. I've gotten a new processor (have yet to actually hook it up) to do Atmos/DTS:X, but I don't really care about the new upcoming HDMI versions right now. Until I get a new TV, why bother? And since my TV is a 65" Panny ZT, unless it quits working, I'm not getting anything new anytime soon.
Post by KeithL on Oct 12, 2017 10:04:23 GMT -5
Here's the most balanced commentary on HDMI 2.1 I've seen so far: www.cnet.com/how-to/what-is-hdmi-2-0b/

And here are my personal comments on the subject (with a lot of industry experience behind them).....

1) It isn't going to stop. While you're sitting there watching the 2020 Olympics on your brand new uber-expensive 8k TV (with HDMI 2.1), they'll be interrupting the show to advertise the new 16k TVs (which will all require HDMI 3.0). Therefore, being future proof is a pleasant myth; the reality is that, for a lot of money, you can get a little bit ahead... with luck... and hope they don't veer off in another direction. In contrast, as soon as something becomes current and mainstream, the price drops like a lead sinker (you can buy a passable 50" LED-backlit 4k TV for $399 now). And, of course, we'll do as much as we can to offer upgrades on our high-end gear for as long as we can.

2) As of last year, according to Netflix, about half of their customers had connections that were still too slow to stream HD well. 4k takes up at least twice that much bandwidth (it would be 4x if it weren't for some very aggressive compression that is somewhat more effective than the previous versions). 8k is going to require between 2x and 4x more bandwidth than 4k; probably closer to 4x unless they figure out another big jump in compression technology as well. Remember that, if your connection is too slow to stream HD now, it is NOT going to do well with 4k either, and you can forget about 8k. Some people feel that, because of the better compression and wider color gamut, 4k can "look better than HD" when sent over the same limited bandwidth - but you are NOT getting the full benefit of 4k. (Note that streaming 4k on Netflix is already being jammed into about 1/4 of the bandwidth you get with a 4k UHD disc - and 4k discs are themselves significantly compressed.) It has already reached a point where opinions about whether it's worth upgrading from HD to 4k are distinctly mixed..... (More than a few broadcast pros have suggested that HD with HDR would have been a better upgrade path, with more obvious benefits, than 4k.)

3) Dynamic HDR is already supported in HDMI 2.0b. Dolby Vision is dynamic HDR... and the new HDR10+ is dynamic HDR... The HDMI 2.1 standard includes a new version of dynamic HDR - but dynamic HDR is NOT new.

4) We are being promised that HDMI 2.1 will be possible using current HDMI cables and connectors - or similar ones - BUT SOME OF THE NEW CAPABILITIES AND FEATURES REQUIRE ALL NEW HARDWARE. This is a nice way of saying that "soon you'll be able to buy hardware that's HDMI 2.1 compatible - but it won't actually do most of the new stuff you wanted it for". (Who here remembers when you could buy devices with DVI inputs that were "HDMI ready"? Remember how well THAT worked out?)

5) A lot of the current limitations are in the displays themselves. As far as I know, there is NO currently available display that can deliver the full gamut and brightness range that HDR supports. (LCDs don't get dark enough, OLEDs don't get bright enough, and projectors never seem to be quite as sharp as either.) This means TWO important things:
a) If the display is already the limit, you won't see most other improvements until displays are improved.
b) Displays WILL be improved; so, no matter how good the electronics are in the TV you buy today, in a few years you'll be able to buy one that looks better because panel technology has improved.

6) And, finally, always, ALWAYS remember that we're talking about an entire system here.... and the end result is always going to be limited by the weakest link. The current 4k discs, which look pretty darn good, are 4:2:0 chroma. (I've never seen streamed 4k content that looked quite as good as a well-produced 4k disc - at least not yet.) Current streaming services use higher compression, and so have more compression artifacts, regardless of their color gamut (so they just don't usually look as good as discs). To put it bluntly.... you have to differentiate between "more bits" or "better color gamut" and "actually looks better".
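To put some rough numbers behind points 2) and 6), here's a quick sketch: how much data chroma subsampling discards before compression even starts, and ballpark compression ratios for disc versus streaming (the 80 Mb/s and 20 Mb/s bitrates are illustrative assumptions, not measured figures):

```python
# How much data each chroma scheme keeps, relative to full 4:4:4 color,
# and ballpark compression ratios for 4K24 10-bit 4:2:0 material.  The
# 80 Mb/s disc and 20 Mb/s streaming bitrates are illustrative assumptions.
chroma = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}   # samples per pixel
for scheme, samples in chroma.items():
    print(f"{scheme}: {samples / chroma['4:4:4']:.0%} of the full-color data")

raw_mbps = 3840 * 2160 * 24 * 10 * chroma["4:2:0"] / 1e6   # ~2,986 Mb/s raw
for source, mbps in [("UHD disc  (assumed ~80 Mb/s)", 80),
                     ("4K stream (assumed ~20 Mb/s)", 20)]:
    print(f"{source}: roughly {raw_mbps / mbps:.0f}:1 compression")
```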