dsl1
Seeker Of Truth
Posts: 4
Post by dsl1 on Feb 1, 2017 12:18:19 GMT -5
I'd like to purchase an XMC-1 and would also like the HDMI 2.0a multiport board when it's available.
Dan, could we get an estimated price range for the new board (e.g., will it be about the same cost as the current single-port HDMI 2.0 upgrade, or are we looking at a fully different pricing level)? If it's about the same, I'll jump on board and buy the XMC-1 now.
Thanks!
Post by duh1nonly on Feb 7, 2017 16:33:05 GMT -5
I have 5(!) devices that I want to pass HDR/4K content through, but it is my $2000 XMC-1 holding everything back! lol
One HDMI 2.0 port on the current upgrade board (non-HDMI 2.0a) wouldn't be worth it, as I want to pass HDR data. I have a PS4 Pro, Xbox One S, Roku Ultra, OPPO UDP-203 4K Blu-ray player, and a PC with a Titan X Pascal GPU. Only the OPPO can be hooked straight up to my TV, since it has two HDMI outputs. Everything else can't pass HDR through the XMC-1. 4K games do get to the TV, but the computer will only pass 30 fps at 4K. I got the PC to pass 60 fps once at the desktop, but launching any game forced it back to 30 fps (not very smooth to play at).
I really want this new multiport HDMI 2.0a (b?) upgrade board ASAP, as the XMC-1 is the most expensive piece of audio gear I've ever bought, and it is the weak link in my setup.
cawgijoe
Emo VIPs
"We made too many of the wrong mistakes." - Yogi Berra
Posts: 4,914
Post by cawgijoe on Feb 7, 2017 20:30:59 GMT -5
It would be nice to have a ballpark figure so as to be prepared when it goes up for sale.
bootman
Emo VIPs
Typing useless posts on internet forums....
Posts: 9,358
Post by bootman on Feb 7, 2017 22:44:05 GMT -5
Is it going to be more than $499?
If so there are so many other options out there after I sell.
Post by jmilton on Feb 8, 2017 8:49:01 GMT -5
KeithL
Administrator
Posts: 9,964
Post by KeithL on Feb 8, 2017 9:59:51 GMT -5
Yes, it will be really important - as soon as you buy that new 8k TV. You didn't really think the train was going to stop - now did you? And, yeah, when 8k comes along, we'll probably have an update board for that too....
Post by bolle on Feb 8, 2017 10:04:38 GMT -5
Yes, it will be really important - as soon as you buy that new 8k TV. You didn't really think the train was going to stop - now did you? And, yeah, when 8k comes along, we'll probably have an update board for that too.... Sounds good Keith! But back to the HDMI 2.0a board - any date and price? Budget receivers with HDMI 2.0a have been on the market for quite a while - when will we be able to upgrade our XMC-1 to the "a"?
Post by jmilton on Feb 8, 2017 10:44:24 GMT -5
Well, uncompressed color (4:4:4) and Dolby Atmos via ARC would be useful long before 8K hits the street.
dsl1
Seeker Of Truth
Posts: 4
Post by dsl1 on Feb 8, 2017 11:42:32 GMT -5
Yes, it will be really important - as soon as you buy that new 8k TV. You didn't really think the train was going to stop - now did you? And, yeah, when 8k comes along, we'll probably have an update board for that too.... Sounds good Keith! But back to the HDMI 2.0a board - any date and price? Budget receivers with HDMI 2.0a are on the market for quite a while - when will we be able to upgrade our XMC-1 to the "a"? I would really like an answer as well. I want to hop on the XMC-1 train too, but I'm in the same situation as poster #2, with multiple devices I want to pass HDR from. I really hope it's still $350. If it gets near the $500 range, it certainly makes you want to consider other options.
Post by 2muchht on Feb 8, 2017 15:34:23 GMT -5
...HDMI 2.1 will be here soon. Why should you care?... While HDMI 2.1 was announced during CES, the reality is that it won't be available until the end of 2017, at the earliest. Why? The "CTS", or compliance test spec, isn't done yet. That means a manufacturer cannot finalize its HDMI stack, and it can't be tested and approved. Then again, word is that the chips won't be available in quantity until late summer, at the earliest. Add to that the fact that support would probably go to "the big guys" first. My guess is that companies such as Emotiva won't be able to do 2.1 until Q1 '18. Not their fault, just the way the world works. Let's see HDMI 2.0b now, and then we'll all be set for at least a year. There really isn't any use for 2.1 at this point, anyway.
Post by hosko on Feb 8, 2017 18:35:48 GMT -5
I believe it's the same board as what will be used in the RMC1, so when you see that release date, that should be your answer. They probably don't want to say, because that would show their hand as to when other products will ship; plus, they could still be working on it.
dsl1
Seeker Of Truth
Posts: 4
Post by dsl1 on Feb 8, 2017 19:13:06 GMT -5
Some info here maybe: www.technologyintegrator.net/article/emotiva-showcase-new-models-ces-2017/
"Emotiva spared no expense in designing this upgrade to make the XMC-1 the best AV processor at its price point. Eight HDMI inputs and dual HDMI outputs deliver a bit-perfect video stream, with full support for HDMI 2.0b and HDCP 2.2, and all current 4K HDR10 video formats."
"HDMI update: $349"
Post by garbulky on Feb 8, 2017 19:31:05 GMT -5
Well, uncompressed color (4:4:4) and Dolby Atmos via ARC would be useful long before 8K hits the street. Maybe you can help me here. Since Windows 98 I've been using "32 bit color". From what I gather, this is very different from 10 bit color.
Post by qdtjni on Feb 8, 2017 19:35:31 GMT -5
Post by cwt on Feb 9, 2017 0:57:32 GMT -5
Really hope it's still $350. If it gets near the $500 range certainly makes you want to consider other options. Maybe consider this, dsl1: it's the very latest, being 2.0b, not cutting edge (cutting edge costs); and the current 2.0 board is ubiquitous, being used in many other components. Rest assured, the eventual HDMI 2.1 board will have much more R&D money incorporated in it if you want all its features ;) Interested at all? If I was a PC gamer or wanted better HDR, I would pay the price rather than lose a good pre/pro, if only because 2.1 is much more robust with comms than what we have now: www.hdmi.org/manufacturer/hdmi_2_1/index.aspx
There's always the Krell example of what a basic HDMI upgrade costs, for some perspective 8-) It needs another one for 4K/60 as well: www.monacoav.com/audio-video-news/whats-new/krell-foundation-upgrade-the-best-just-getting-better/
Post by jmilton on Feb 9, 2017 10:05:30 GMT -5
Well, uncompressed color (4:4:4) and Dolby Atmos via ARC would be useful long before 8K hits the street. Maybe you can help me here. Since Windows 98 I've been using "32 bit color". From what I gather, this is very different from 10 bit color. 8 bits each for RGB: red 8, green 8, blue 8... so it becomes 24 "bits". Computers use a faux color scheme that flashes two different shades of a color to create a "perceived" color shade in between.
KeithL
Administrator
Posts: 9,964
Post by KeithL on Feb 9, 2017 12:48:26 GMT -5
I'm not sure what you're talking about with "flashing two different shades of a color to produce one in between". I believe some panels do this internally, to simulate a wider color gamut, but I wouldn't expect any current video source to do it. Dithering is often used by alternating PIXELS of multiple colors (but at the same time) to create "in between" colors, and can be used for printed or screen images. Dithering is still used today with printing, and with some low-res screen images, but is generally avoided because it trades resolution for colors. (But I haven't heard of any actual video source alternating colors to produce in-between ones lately - although it's certainly possible.)
Generally, when computer graphics systems talk about "32 bit color", what they're talking about is 24 bit color (RGB 8:8:8) plus 8 bits of transparency. This is important when doing graphic rendering and compositing... but isn't really a characteristic of the video going out to the monitor. This would be a setting on your video card, because the video card does graphics processing. (This feature is used in video games, for example, to merge a moving character on top of a background, or to merge several layers of different objects.)
In modern terminology, "12 bit color" means 12 bits per color, which would be a total of 36 bits in RGB, or 48 bits of RGB plus transparency. (It's written both ways - which makes it confusing. It's supposed to be obvious which they mean from the overall number and whether it's divisible by 3 or 4.)
Chroma 4:4:4 is a whole different issue. Normal UHD Blu-ray discs use what's known as Chroma 4:2:0. What this MEANS is that the GREEN image is stored at full resolution, while the RED and BLUE images are stored at half resolution. While this sounds nasty, since that aligns perfectly with how sensitive our eyes are to the various colors, it IS the optimum usage of the available bandwidth. Your eyes are more sensitive to sharpness in the green, so that's the most important, and you don't notice a slightly lower resolution in the other colors. (Which is another way of saying that 4:2:0 gives you the best picture that will fit into the space that's available for it.)
HOWEVER, when you send computer TEXT (small white characters with narrow lines and dots), it doesn't work so well. You end up with those annoying little color fringes where the slightly blurrier red and blue color planes "slop over" the green image. (Instead of clean white letters, you end up with white letters with little rainbow fringes around them.) This may be made better or worse if you enable some of the various "sharpening" and "processing" options on your TV. With Chroma 4:4:4, all three colors are sent at the same resolution - so there are no color fringes.
It is important to note, however, that having Chroma 4:4:4 capability doesn't make any difference if you're playing a 4K UHD disc - because the disc itself is limited to Chroma 4:2:0. However, it matters quite a bit if you plan to use your TV as a computer monitor - especially if you plan to display small text. (Not at all important for playing videos; may be somewhat important for some video games; VERY important for big spreadsheets or text documents with small print.)
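[Editor's note: a minimal sketch of the arithmetic behind the explanation above - my own illustration, not anything from Emotiva or the HDMI spec. The function name `frame_bytes` is made up for this example, and note that 4:2:0 formally subsamples the Cb/Cr chroma planes while the luma plane stays at full resolution.]

```python
# "32-bit color" on a PC: 8 bits each of alpha (transparency), red, green, blue
# packed into one 32-bit word. Only 24 of the bits describe the color itself.
argb_pixel = (0xFF << 24) | (0x20 << 16) | (0x40 << 8) | 0x60  # opaque RGB(0x20, 0x40, 0x60)

def frame_bytes(width, height, scheme="4:4:4", bits=8):
    """Approximate bytes per frame for a Y'CbCr image at the given subsampling."""
    luma = width * height                          # luma plane: always full resolution
    if scheme == "4:4:4":
        chroma = 2 * width * height                # Cb and Cr at full resolution
    elif scheme == "4:2:2":
        chroma = 2 * (width // 2) * height         # Cb/Cr halved horizontally
    elif scheme == "4:2:0":
        chroma = 2 * (width // 2) * (height // 2)  # Cb/Cr halved both ways
    else:
        raise ValueError(scheme)
    return (luma + chroma) * bits // 8

print(frame_bytes(3840, 2160, "4:4:4"))  # 24883200
print(frame_bytes(3840, 2160, "4:2:0"))  # 12441600
```

The 4:2:0 frame is exactly half the size of the 4:4:4 frame, which is why discs and broadcast use it: the saved bandwidth goes to resolution and bit depth where the eye actually notices.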
Post by jmilton on Feb 9, 2017 13:24:42 GMT -5
...which brings us full circle back to HDMI 2.1 and its potential!
Then again, I am just a trichromat.
Post by ÈlTwo on Feb 9, 2017 14:23:14 GMT -5
Then again, I am just a trichromat. Is that a three-colored, three-cornered hat?
Post by duh1nonly on Feb 10, 2017 21:32:14 GMT -5
KeithL - that was the simplest and most beautiful explanation of Chroma 4:4:4 I've ever read. When I try to explain this to people, I bumble around like an idiot trying to put it into words.