|
Post by Axis on Nov 29, 2017 19:22:40 GMT -5
That table lumped together all the versions of HDMI 2 in one category, when in reality there are significant differences. Lumped together or not, many other articles on HDMI 1.4b/2.0/2.0a/2.0b have stated that these versions are very much capable of passing dynamic metadata, namely Dolby Vision. LC, I had to tell a friend what the THX sound standards are. He just designed a 1 TB SSD and a switching power supply with nine layers of circuitry, folded into a two-inch length. Dan would love to have Chris work for him, and he has never had a surround sound system.
|
|
KeithL
Administrator
Posts: 9,958
|
Post by KeithL on Nov 29, 2017 19:23:07 GMT -5
In reply to: "The 'Feature Support Table' (shown below) in the above link shows that 'Dynamic HDR (HDR dynamic metadata)' is not supported by HDMI 2.0a/b. How is this true if the new Apple TV claims to be able to pass Dolby Vision and only has HDMI 2.0a support?"

Well, considering that Dolby says Dolby Vision COULD be implemented over HDMI 1.4 (if they wanted to)... and HDR10+ (the competing dynamic HDR standard, backed by Samsung) most certainly is designed to work over HDMI 2.0b... maybe that table was accurate the day it was published. If you think about it, it's pretty obvious that Dolby Vision works over HDMI 2.0b: there are currently no HDMI 2.1 devices available, so any device that supports Dolby Vision today must be HDMI 2.0b, right? Of course, right now you only get Dolby Vision if you purchase a license from Dolby... perhaps they're suggesting it will be included in HDMI 2.1 for free (I doubt it). Perhaps some of you could post the wonderful (or not) experiences you've had viewing Dolby Vision content that was clearly superior to the regular flavor...
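For what it's worth, the reason dynamic metadata can work without a new HDMI feature is that the metadata can be carried inside the video signal itself, so any link that passes the video also passes the metadata. Here's a toy sketch of the general idea in Python (the LSB packing scheme below is invented purely for illustration; it is NOT Dolby's actual format):

```python
# Toy illustration: hide per-frame metadata in the least significant
# bits of the first pixels of a frame row. Any transport that carries
# the pixels carries the metadata; no dedicated HDMI metadata packet
# (as in HDMI 2.1's Dynamic HDR feature) is required.
# This scheme is hypothetical, not Dolby Vision's real encoding.

def embed_metadata(frame_row, metadata: bytes):
    """Pack metadata bits, LSB-first, into the pixels of one row."""
    row = list(frame_row)
    for i, byte in enumerate(metadata):
        for bit in range(8):
            idx = i * 8 + bit
            row[idx] = (row[idx] & ~1) | ((byte >> bit) & 1)
    return row

def extract_metadata(frame_row, n_bytes: int) -> bytes:
    """Recover the packed bytes from the pixel LSBs."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for bit in range(8):
            byte |= (frame_row[i * 8 + bit] & 1) << bit
        out.append(byte)
    return bytes(out)

row = [128] * 1920                         # one row of 8-bit pixel values
tagged = embed_metadata(row, b"\x0a\x42")  # e.g. a max-luminance hint
recovered = extract_metadata(tagged, 2)
```

The point of the sketch: the display extracts the metadata from the picture data it already received, which is why the HDMI version in between doesn't have to know anything about it.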
|
|
|
Post by garbulky on Nov 29, 2017 19:25:37 GMT -5
The much more interesting thing, other than the massive resolution improvements, is the variable refresh rate (VRR). For computer gaming this is fantastic: it reduces screen tearing and V-sync stutter.
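A toy model of why that matters (simplified timing arithmetic, not any real graphics API): with a fixed 60 Hz display and V-sync on, a frame that finishes just after a refresh tick has to wait for the next one, while a VRR display simply refreshes when the frame is ready.

```python
# Simplified model of frame presentation time, in milliseconds.
# Assumptions (invented for illustration): a 60 Hz fixed-refresh panel
# with V-sync, vs. a VRR panel with a 120 Hz maximum refresh rate.

import math

def fixed_vsync_display_time(render_done_ms, refresh_ms=1000 / 60):
    """Frame appears at the next fixed refresh tick after it finishes."""
    return math.ceil(render_done_ms / refresh_ms) * refresh_ms

def vrr_display_time(render_done_ms, min_interval_ms=1000 / 120):
    """Frame appears as soon as it's done (within the panel's limits)."""
    return max(render_done_ms, min_interval_ms)

# A frame that takes 17 ms, just missing the ~16.7 ms 60 Hz deadline:
fixed = fixed_vsync_display_time(17.0)  # waits for the second tick
vrr = vrr_display_time(17.0)            # shown the moment it's ready
```

In this model the V-synced frame slips a whole refresh interval (the visible "judder"), while the VRR frame is delayed only by its own render time; disabling V-sync instead shows the frame mid-scan, which is the tearing garbulky mentions.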
|
|
|
Post by Axis on Nov 29, 2017 19:27:36 GMT -5
Unless you are buying an 8K display very, very soon, what do you need HDMI 2.1 for again? Remember, HDMI is a chain, and don't always assume the AVR/AVP is the weakest link. HDMI this-or-that is great only if EVERYTHING in the chain supports it. Not everyone has a display with HDMI 2.0b yet. Heck, I have an OLED display that does HDR and DV, and it isn't even HDMI 2.0b, so why should I even think about 2.1? Bootman, you are the man. Thanks for all the knowledge you gave me.
|
|
|
Post by Axis on Nov 29, 2017 19:28:52 GMT -5
(quoting KeithL's reply above in full) Thank you, Keith. You are the smartest person I have ever met. Thanks.
|
|
|
Post by overtheair on Dec 2, 2017 19:28:04 GMT -5
"It will be interesting to see how many people are willing to upgrade their current high-end 4K screen to a new 8K screen THE SAME SIZE just because it has a few extra features. A lot of the people I talk to are already experiencing 'upgrade burnout' pretty badly... they aren't especially eager to buy more new equipment in a few years... so I guess we'll see..."

Keith, a lot of valid points in your post, but the assumption that people will be upgrading from a 4K display and a recent AVR to a new HDMI 2.1 based system is a weakness in the argument/position, IMHO. Partly for the "upgrade burnout" reason you outline, I suspect there are many consumers who haven't updated their equipment recently and are still running 1080p displays (especially projectors), older AVRs, and Blu-ray players. I fit this category. When I upgrade for 4K and Atmos, I will be looking closely at whether I should wait for HDMI 2.1 or just buy equipment with HDMI 2.0b. If I upgrade next year, then unless it's late in the year it will probably have to be HDMI 2.0b. If it's 2019, I would expect the system to support some of the non-resolution/refresh-rate HDMI 2.1 features even if it doesn't support the full HDMI 2.1 bandwidth. The HDMI spec won't be the only factor in the decision, but it will be a consideration. Note that I am not a gamer; if I were, HDMI 2.1 would seem a much more compelling solution, at least in what it has the potential to support. Just my 2c.
|
|
|
Post by Gary Cook on Dec 3, 2017 15:12:37 GMT -5
In my case the 4K TV came first, as it was well and truly time to upgrade from the 40" to the largest screen that I could fit in the space allocated. That was originally 50", which was not enough of an upgrade, so I held off for some time. Then around a year ago Samsung introduced a mid-spec/price range with very narrow bezels, and I could just squeeze a 55" into the space.

A short time afterwards my cousin asked me to get an Oppo 203 for him, which I could use while he was away sailing around the Pacific for six months or so. As soon as it was released I bought an Apple TV 4K. My cousin arrived back a couple of weeks ago, so I needed a replacement for the 203; I did a side-by-side comparison with a few players and found the Sony X800 was indistinguishable in picture quality. That was added to the system last weekend. The processor to handle all of the 4K is next; I just have to decide which one.

With the above in mind, the biggest overall improvement in the experience was the larger screen size, which at our viewing distance of ~2.5 metres is well worth it. Some of the 4K HDR material makes almost as much difference again, but it's not smack-you-in-the-face better. Some is next to indistinguishable from the 2K version.

So to KeithL's point, my view is that a lot of people upgrade to 4K not for the 4K itself but for the larger screen sizes it supports. As in my case, without a complete room redesign there is a limit in many homes to the screen size, and as a result, even if 8K supports larger sizes, many won't be able to take advantage of it. Plus there is a limit to how often people are willing (and able) to change their TV, and TVs are pretty reliable, so the replacement rate due to failures isn't high. As a result, maybe 4K is the sweet spot in many instances, and without a compelling need I'm not sure 8K will have as fast an adoption rate.

Cheers
Gary
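The viewing-distance point above checks out numerically. Using the common rule of thumb that 20/20 vision resolves about one arc-minute, a quick Python calculation (a back-of-the-envelope model, not a vision-science result) shows a 4K pixel on a 55" screen at ~2.5 m already subtends well under that threshold, so 8K at the same size and distance would add nothing visible:

```python
# Back-of-the-envelope angular size of one pixel for a 16:9 screen.
# Assumption: 20/20 acuity resolves roughly 1 arc-minute per detail.

import math

def pixel_arcmin(diagonal_inches, horizontal_pixels, distance_m,
                 aspect=16 / 9):
    """Angular width of one pixel, in arc-minutes, at a given distance."""
    diag_m = diagonal_inches * 0.0254
    width_m = diag_m * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_m = width_m / horizontal_pixels
    return math.degrees(math.atan(pixel_m / distance_m)) * 60

# 55" screen viewed from ~2.5 metres:
arc_4k = pixel_arcmin(55, 3840, 2.5)  # 4K pixel, in arc-minutes
arc_8k = pixel_arcmin(55, 7680, 2.5)  # 8K pixel, in arc-minutes
```

Both values come out well below one arc-minute, which supports the conclusion that at typical living-room sizes and distances the bigger screen, not the extra pixels, is what you actually notice.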
|
|