|
Post by Casey Leedom on Oct 3, 2017 11:00:08 GMT -5
So when the HDMI 2.1 "standard" was announced at CES 2017 this winter, I mostly just shrugged it off as a technology in search of an application ... and one which was going to take significantly longer to see in the field than promised (I know quite a bit about high-speed digital signaling since that's the business I'm in currently). But recently Rob Sabin and Steve Guttenberg of Sound & Vision posted a video blog (see the 4:18 mark) where Rob said that HDMI 2.1 was a market disruption we would see in 2018, and that it would make it very difficult to buy processors with only HDMI 2.0b. (For Emotiva I presume it would be less of a problem because of their current upgradable HDMI sub-board assembly approach.)
But A. does anyone else think that we'll really see [affordable] HDMI 2.1 products in 2018, or B. even if it does make it in 2018, that there will be any use for the extra bandwidth and capability any time soon? If you look at the HDMI 2.0b and 2.1 specifications, 2.0b supports "4K" x 60Hz and Dynamic HDR formats. 2.1 extends that to "4K" & "8K" x 120Hz and in other fairly extreme ways. When are we likely to see any content or displays capable of that?
It seems to me that Rob Sabin's "worries" about buying a processor in 2018 are a bit over-blown ... Casey
|
|
|
Post by pknaz on Oct 3, 2017 11:46:34 GMT -5
What is most interesting to me about this conversation is the marketing hype. Even with a 135" screen, you need to be within about 8 feet of the screen to be able to see all the available detail at 4K resolution. My viewing distance is currently 15 ft from my screen location - it doesn't make much sense for me to invest in 4K, except for HDR.
|
|
|
Post by Casey Leedom on Oct 3, 2017 12:30:10 GMT -5
There is some justification for having resolution higher than the human eye's ability to discern, in terms of the arc angle subtended by a single pixel. Even when a single pixel is too small to see, the macroscopic effects of aliasing (not quite being able to put an image feature exactly where it geometrically belongs) across large fields of pixels can produce visible artifacts. See for instance this interesting article. Casey
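P.S. For anyone who wants to play with the numbers, here's a minimal sketch of the arc-angle calculation. The 135" screen, 15 ft seat, and the ~1 arcminute figure commonly quoted for 20/20 acuity are just illustrative assumptions:

```python
import math

def arcmin_per_pixel(diagonal_in, h_pixels, v_pixels, distance_ft):
    """Angle subtended by a single pixel, in arcminutes, for a 16:9 screen."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width from the diagonal
    pixel_in = width_in / h_pixels                           # width of one pixel
    distance_in = distance_ft * 12
    angle_rad = 2 * math.atan(pixel_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60                      # degrees -> arcminutes

# Rough 20/20 acuity limit: a feature ~1 arcminute across is at the edge of visibility.
for h, v, label in [(1920, 1080, "1080p"), (3840, 2160, "4K")]:
    print(f'135" {label} at 15 ft: {arcmin_per_pixel(135, h, v, 15):.2f} arcmin/pixel')
```

On those assumptions, a 1080p pixel on a 135" screen at 15 ft is right around the 1 arcminute threshold and a 4K pixel is well under it, which lines up with pknaz's point. My argument is only that aliasing artifacts can still show up even when the individual pixels themselves are below that threshold.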
|
|
|
Post by AudioHTIT on Oct 3, 2017 14:33:50 GMT -5
What is most interesting to me about this conversation is the marketing hype. Even with a 135" screen, you need to be within about 8 feet of the screen to be able to see all the available detail at 4K resolution. My viewing distance is currently 15 ft from my screen location - it doesn't make much sense for me to invest in 4K, except for HDR. I'd say that's slightly misleading, though with the amount of 4K material available (small), if you don't need a new TV then you're right, it might not be the right time to 'invest' in UHD. But if you need (or want) a new TV – your current one has problems or is broken, is too small, doesn't have HDMI, makes funny noises, or just doesn't have a very good picture – then buying a UHD TV makes perfect sense; the best technology for any type of TV viewing is in the current UHD TVs (and good deals can be found on models a year or so old). I'm looking at 80"+, and currently the Samsung 82" interests me. I normally sit 10 – 12' away, which puts me pretty close to the maximum noticeable distance, but I still expect I will see a variety of improvements including HDR, WCG, fewer motion artifacts, AND better resolution. Here are a few different size/distance calculators; they tend to agree and paint a different 'picture' than you're presenting:
i.rtings.com/images/optimal-viewing-distance-television-graph-size.png
i.i.cbsi.com/cnwk.1d/i/tim/2013/01/27/resolution_chart.jpg
rhtpull-fm0pvtsb.netdna-ssl.com/wp-content/uploads/2013/03/Ideal-Distances-Chart1.jpg?x37803
|
|
|
Post by AudioHTIT on Oct 3, 2017 14:49:49 GMT -5
To the OP, I think HDMI 2.1 will be available on a few devices at CES 2018 (probably fewer will actually be shipping). Also, your bandwidth question really hits the nail on the head: are you going for higher resolutions than 4K? 4K/120 will likely become more available in 2018, but I'm comfortable at this time buying a UHD TV or processor with 2.0b. I currently don't even have the bandwidth to stream any form of UHD, nor can I buy it, so UHD Blu-ray is the only way I'll get to see it anyway.
|
|
|
Post by Casey Leedom on Oct 3, 2017 17:59:10 GMT -5
I do have to admit buying the recent Oppo UDP-203 despite only having an older 60" Pioneer KRP-600M (last generation Kuro technology), which only does HD (1,920 x 1,080). At ~some point~ I'll want to upgrade to a UHD (3,840 x 2,160) set in the range of 80-90" ... but not till the prices drop significantly. But at least now I can buy UHD discs with HDR/Dolby Vision for the future ... :-)
And, more importantly, HDMI 2.0b is good enough for the above ...
Casey
|
|
|
Post by gus4emo on Oct 3, 2017 18:26:26 GMT -5
Aren't we getting sick of the industry coming up with BS? Why not put a chip in any piece of equipment that can be upgraded for at least the next five years or so? As long as they see people are upgrading they will keep doing this sh**, just saying. ...
|
|
|
Post by Casey Leedom on Oct 3, 2017 18:50:45 GMT -5
Ah, that one is an easy question to answer. The high-speed Serializer/Deserializer logic (AKA the "PHY", for Physical Layer) is almost always either at the very limit of what's achievable, for state-of-the-art products, or the cheapest answer that fits the specified needs, for commodity products. At no point do we deploy logic/chips that could handle 10x the specified needs of the product, because that would add unnecessary cost.
In any case, the chip logic absolutely has to be developed for the higher speeds as they're standardized, so there's no way that an HDMI 2.0b chip could handle HDMI 2.1 (unless some crazy manufacturer threw tons of extra cost into their product on the grounds that it might be needed some day).
So, with the above in mind, Emotiva is actually doing all of us a favor by having developed a decoupled system architecture with a separately upgradable HDMI sub-assembly. I'm not sure that any other manufacturer is doing this. Of course for some of the really big commodity manufacturers it may make more sense for both the manufacturer and consumer to simply buy a whole new processor with upgraded parts because they have economies of scale in their manufacturing and sales that Emotiva doesn't have ...
Casey
|
|
|
Post by gus4emo on Oct 3, 2017 20:18:53 GMT -5
Ah, that one is an easy question to answer. The high-speed Serializer/Deserializer ("PHY") logic is almost always either at the very limit of what's achievable for state-of-the-art products or the cheapest answer that fits the specified needs for commodity products. ... Casey At least make it so the upgrade doesn't mean we have to spend $2,500 or $3,500+ on new equipment just because... Believe me, they know what they're doing. ...
|
|
|
Post by 2muchht on Oct 3, 2017 20:56:57 GMT -5
A few observations:
- Casey Leedom is correct, and there is more to it than even that. It isn't just the need for new chips to handle the higher speed and bandwidth. As all of this goes up, it also requires changes to the PCB to which the chip is mounted. Thus, even if the chip itself were upgradeable (which, as noted, isn't possible), the signal path on the PCB will likely not work at the higher speeds. This is more than just "a new chip".
- Remember that it isn't just the 2160p resolution that is important. MORE important is frame rate as we move (later rather than sooner) from 60fps to 120fps, color sampling as we move from 4:2:0 to 4:2:2 and eventually to 4:4:4, and finally the move from 8-bit to 10-bit, 12-bit, and ultimately 16-bit. Those three things, particularly in combination, increase the need for higher bandwidth. As you total these up, one can easily move beyond the capability of HDMI 2.0b (there's a rough sketch of the arithmetic at the end of this post).
- Look at the above and remember that there is a reason why the industry prefers "UHD" rather than "4K" as they need to convey that there are other benefits to all of this that you ABSOLUTELY can see even if you are too far away from the screen to notice the difference in the resolution. The things mentioned above are NOT "BS designed to sell better sets". If you've seen 4:4:4/16-bit color and even 12-bit you will be able to instantly see the difference it makes.
- HDR is a wild card. HDMI 2.0a/b is fine for Dolby Vision, HDR-10, HLG and Advanced HDR by Technicolor. At this point we do not have a hard answer on whether HDR-10+ needs HDMI 2.1 or can work with the existing formats. HDMI 2.0b should be able to handle ST-2094-40 for dynamic metadata, but there is some thought that it may just have too heavy a headroom requirement. Those who know aren't telling and those who are telling just don't know. Perhaps more will be clear at the end of the month at the SMPTE Technical Conference in LA. There are a number of papers that will be delivered on various aspects of HDR. REMEMBER, HDR-10+ and Dolby Vision are backward compatible.
- This isn't going to happen all at once, or as quickly as it might be reported. Yes, we'll see SOME things SHOWN at CES, but don't count on any delivery other than for the Xbox One X. We'll see.
Bottom line: This is NOT something ginned up to sell TVs. It delivers demonstrable benefits. On the other hand, like everything else, it won't be either simple or easy. Such is life.
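If you want a feel for how fast those three knobs (frame rate, chroma sampling, bit depth) eat up link capacity, here is a rough back-of-the-envelope sketch. It counts video payload only (no blanking intervals or audio, so real link requirements are somewhat higher), and it uses the published 18 Gbps / 48 Gbps link rates with their 8b/10b and 16b/18b encoding overheads; the example formats are just illustrative:

```python
# Rough uncompressed video payload rates vs. HDMI link capacity.
# NOTE: blanking intervals and audio are ignored, so actual link
# requirements are somewhat higher than the payload numbers below.

CHROMA_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # samples per pixel

def payload_gbps(width, height, fps, bit_depth, chroma):
    return width * height * fps * bit_depth * CHROMA_FACTOR[chroma] / 1e9

HDMI_20_GBPS = 18.0 * 8 / 10   # 18 Gbps TMDS, 8b/10b encoding  -> ~14.4 Gbps of data
HDMI_21_GBPS = 48.0 * 16 / 18  # 48 Gbps FRL, 16b/18b encoding  -> ~42.7 Gbps of data

formats = [
    (3840, 2160,  60,  8, "4:2:0"),   # a baseline "4K" signal
    (3840, 2160,  60, 10, "4:2:2"),
    (3840, 2160, 120, 10, "4:4:4"),   # the kind of thing HDMI 2.1 targets
    (7680, 4320,  60, 10, "4:2:0"),   # "8K"
]

for w, h, fps, bits, chroma in formats:
    g = payload_gbps(w, h, fps, bits, chroma)
    print(f"{w}x{h}@{fps} {bits}-bit {chroma}: {g:5.1f} Gbps  "
          f"| fits 2.0b: {'yes' if g <= HDMI_20_GBPS else 'no'}"
          f"  fits 2.1: {'yes' if g <= HDMI_21_GBPS else 'no'}")
```

The exact formats don't matter much; the pattern is the point. Double the frame rate, go to 4:4:4, and add bit depth, and you blow past HDMI 2.0b very quickly.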
|
|
|
Post by Casey Leedom on Oct 3, 2017 21:10:22 GMT -5
Hhmmm, thanks for the insight into the HDMI standards process 2muchht ... which makes me wonder again about the original question of when to realistically expect widespread HDMI 2.1 deployment. When I first read the announcement coming out of CES 2017 in January of this year, it was described as being "in development" at that time, and my own knowledge of high-speed signaling told me that we probably wouldn't see anything for two years. And yes, you're also right about the board PCB Signal Integrity issues. We deal with that all the time in my company, where we build {10, 25, 40, 100}Gb/s Ethernet adapters. And we're headed towards 200 and 400Gb/s using 50Gb/s lanes ... Casey
|
|
|
Post by klinemj on Oct 3, 2017 21:29:10 GMT -5
So, I got stung in the eyes by bees when I was young. My eyeballs changed shape and my eyes were swollen shut for 2 weeks. That left me 20/600 and 20/400 in my eyes from a young age. With age, not only did my close vision drop off, but now all vision varies from day to day. Some days the lenses that correct me work; some days I see double. Lucky for me, my vision from 10+ feet away is good... for now... no double vision there.
Until HDMI x.y fixes the issues I have...I'm good.
Mark
|
|
|
Post by Loop 7 on Oct 3, 2017 21:44:17 GMT -5
Once resolutions cross the theoretical limits of human eye capability, the advances won't be as breathtaking as 480p to 1080p and 1080p to 4K (actually 3,840 x 2,160) but will be marketable nevertheless.
I'm still a stick in the mud and like to remind people how a great deal of cable, satellite and IP TV are 720p/1080i. Of course, the upscaling chips are just amazing and will only improve.
Personally, I'm much more excited about next generation video compression standards that require lower bandwidth for higher resolutions.
|
|
|
Post by 2muchht on Oct 3, 2017 23:17:21 GMT -5
A plea to all, FWIW: Don't fixate on whether or not the resolution aspect of UHD is worth it to you. Keep in mind all the other benefits of the UHD ecosystem that make a BIG difference. In fact, I was at a meeting this week about ATSC 3.0 and they spent quite a bit of time on the benefits of HDR, HFR and WCG apart from 4K. A comment I've heard all year at various SMPTE, HPA, NAB and other events is that, given that the upscaling in "4K" sets has improved dramatically, there is a VERY GOOD case to be made that you may see some content in HDR at 1080p and then let the set do the upscaling. We'll see if that happens, but from a financial and production workflow standpoint it makes a great deal of sense.
Casey: "I could tell you but I'd have to kill you." Chips are coming but it is slow. THEN we need the CTS (Compliance Test Specification), which is also not done. More is still going on behind the scenes. Fill in the blanks...
|
|
|
Post by Casey Leedom on Oct 3, 2017 23:37:06 GMT -5
Ah, so you're actually part of the HDMI 2.1 working group then, 2muchht? Very cool! I understand that you're not allowed to disclose non-published information, but if there are public statements about the standard, please do point us at them. Thanks! And I agree with you regarding the important aspects, but also think that "4K" will be important. Personally I can't wait to see more 60Hz material and I'm glad that we seem to have walked away from the 3D distraction. Casey
|
|
|
Post by 2muchht on Oct 4, 2017 0:53:42 GMT -5
Ah, so you're actually part of the HDMI 2.1 working group then, 2muchht? Very cool!... For clarity, I am NOT "actually part of the HDMI 2.1 working group..." There is nothing that I've said that isn't out there, particularly if you are in the "SMPTE-centric side of the world", which I am. What I try to do is counteract and correct reports that are out there (NOT yours) on various other outlets. Occupational hazard. As always, look for the raw data and info and draw your own conclusions. That's what I try to do.
|
|
|
Post by cwt on Oct 4, 2017 0:57:48 GMT -5
What is most interesting to me about this conversation is the marketing hype. Even with a 135" screen, you need to be within about 8 feet of the screen to be able to see all the available detail at 4K resolution. My viewing distance is currently 15 ft from my screen location - it doesn't make much sense for me to invest in 4K, except for HDR. Adding to this, I would contend that the real image improvements UHD gives us are WCG and HDR, both of which we have now with 4K. Until projection and LED/OLED technology improves its contrast ratio substantially, that is the current limit of perceived picture quality. Even my JVC 7000e doesn't have anywhere near the peak brightness of an OLED, let alone an LED panel. And we have two differing HDR standards for OLED/LED panels because black level/brightness isn't up to scratch yet. That to me is an incentive to wait until our display technology catches up. And I don't relish sending an RMC1 to the US for new boards. Hey 2muchht, your posts always seem to have that particular insight; good to have in this environment.
|
|
|
Post by Casey Leedom on Oct 4, 2017 11:29:54 GMT -5
Okay, thanks for the clarifications 2muchht. I have a friend who works at Dolby Laboratories here in San Francisco who also needs to be careful about what he says in this domain since so much of it is either proprietary or under development. And of course, I'm used to it in my own industry in the computer field. Casey
|
|
|
Post by KeithL on Oct 4, 2017 13:00:02 GMT -5
A few observations: - Casey Leedom is correct, and there is more to it than even that. It isn't just the need for new chips to handle the higher speed and bandwidth. ... Bottom line: This is NOT something ginned up to sell TVs. It delivers demonstrable benefits. ... From everything I've heard about it (which isn't much)... HDR10+ is an "open" (free license) equivalent to the proprietary Dolby Vision (both are dynamic HDR), and is intended to work with HDMI 2.0b (or maybe even HDMI 2.0a). Most experts seem to agree that HDMI 2.1 hardware is probably at least a few years down the road - except as a show headline. Also, as many people have noticed, there isn't an awful lot of really high-quality 4K content out there... Note that, if you're talking about bandwidth, 4K STREAMING content is already limited to about 1/4 of the bandwidth of 4K discs, which kind of makes one wonder where the bandwidth for 8K programming is going to come from. (Over-compressed 4K already looks worse than good quality HD; trying to squeeze 8K into a similar amount of space doesn't seem destined to end well.)
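Just to put some ballpark numbers on that bandwidth point, here is a quick sketch. The delivery bitrates below (~80 Mbps for a UHD disc, ~20 Mbps for a 4K stream) are rough assumptions for illustration, not measured figures:

```python
# Ballpark compression ratios for UHD delivery.
# The delivery bitrates are rough assumptions, not measured values.

def uncompressed_mbps(width, height, fps, bit_depth, chroma_factor):
    return width * height * fps * bit_depth * chroma_factor / 1e6

src_4k = uncompressed_mbps(3840, 2160, 60, 10, 1.5)  # 4K60, 10-bit, 4:2:0
src_8k = uncompressed_mbps(7680, 4320, 60, 10, 1.5)  # 8K60, 10-bit, 4:2:0

for name, mbps, src in [
    ("UHD Blu-ray video (~80 Mbps)", 80, src_4k),
    ("4K streaming (~20 Mbps)",      20, src_4k),
    ("8K streaming (~20 Mbps)",      20, src_8k),
]:
    print(f"{name}: compression ratio ~ {src / mbps:.0f}:1")
```

Even granting those assumptions, the streaming ratios are several times more aggressive than the disc ratio, and 8K at a similar delivery bitrate roughly quadruples them again - which is why squeezing 8K into streaming-sized pipes doesn't look promising.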
|
|
|
Post by Casey Leedom on Oct 4, 2017 13:34:42 GMT -5
... Most experts seem to agree that HDMI 2.1 hardware is probably at least a few years down the road - except as a show headline. ... Ah, that sort of lines up with my initial thought on when we could expect to see HDMI 2.1 hardware in consumer products. It basically takes a year from final netlist/tapeout to when you can expect to see the first products. And you can't get to netlist/tapeout until all of the features are decided, the Verilog and Design Verification Tests are written, and lots and lots of testing is done. There's a reason why product cycles are as long as they are ... and the only reason they seem as short as we perceive them to be is that big companies like Apple, Intel, Nvidia, etc. have multiple generations of their products running simultaneously, with multiple teams working on them. Casey
|
|