KeithL
Administrator
Posts: 10,274
Post by KeithL on Jan 15, 2016 14:39:44 GMT -5
Yes - assuming it does as it claims. It would even allow you to use those sources with an older "4k ready" TV or projector that supports the 4k video modes but not HDMI 2.0 and HDCP 2.2. HDCP itself shouldn't cause problems; the problems arise when something goes wrong. Likewise, you can send 4k content between two devices that don't support HDCP 2.2 (and the XMC-1 can pass 4k/60 content just fine). You might, for example, get such 4k content from the output of an upscaling Blu-ray player. However, the standard for the new 4k Blu-ray DISCS specifies that the content must be configured to require a connection with HDCP 2.2 copy protection. Likewise, there is nothing that would prevent a game console from providing an unprotected 4k video output, but the content provided by a particular application - like Netflix 4k - may require it. So using an HDfury with the existing XMC-1 would support two HDCP 2.2 HDMI sources without needing the upgrade card (assuming it really works as they claim - essentially stripping the HDCP 2.2 requirement requested by the source).
KeithL
Administrator
Posts: 10,274
Post by KeithL on Jan 15, 2016 14:49:41 GMT -5
The circuit boards for the XMC-1 are manufactured for us in California (which was in the USA the last time I looked ) The majority of electronic components are made in China. Don't let "manufactured in Tennessee" fool you: the components ship from China (or other overseas locations) to the US, and then Emotiva does some (or all) of the assembly there. The new boards and the existing boards are not that different in this regard. At least the assembly is back in the US.

EDIT: It looks like this has been covered already. We don't actually know where the CCAs (circuit card assemblies) are manufactured; they could be fabricated by a subcontractor in the US or by an offshore subcontractor. Once the pick-and-place SMT (surface mount) machine is programmed (by an engineer) and the reels of SMT parts are loaded (by a factory worker), there is very little labor left in the fab process. That's the whole rationale for designing with SMT parts, so US manufacturing in this case can be competitive. Russ
Post by Percussionista on Jan 15, 2016 15:21:36 GMT -5
The circuit boards for the XMC-1 are manufactured for us in California (which was in the USA the last time I looked ) +1. Just to be sure, I looked out the window to verify, and yep, we are still in the USA (well, at least San Jose, CA) ;-)
Post by sycraft on Jan 15, 2016 15:25:53 GMT -5
A lot of the chips are too, most likely. TI fabs a lot of their stuff in the US, particularly their analog and mixed-signal products under the Burr-Brown name.
Post by teaman on Jan 15, 2016 15:32:21 GMT -5
The circuit boards for the XMC-1 are manufactured for us in California (which was in the USA the last time I looked ) +1. Just to be sure, I looked out the window to verify, and yep, we are still in the USA (well, at least San Jose, CA) ;-) My sister-in-law might have problems with that. She didn't realize CA was part of the US until we told her about five years ago. Gotta love the US educational system!
KeithL
Administrator
Posts: 10,274
Post by KeithL on Jan 15, 2016 16:58:37 GMT -5
The soon-to-be-released HDMI 2.0 upgrade board uses the same switches as the current board, and supports the same data rate (300 MHz), which is the base requirement for HDMI 2.0. The main difference is that it includes HDCP 2.2 support, which is required to pass a lot of 4k content. I also feel obligated to point out that I think everybody is getting ahead of themselves with HDR. There is already a basic version of HDR ("HDR10") - which has been added to the 4k disc spec - and which supports 4:2:0 with 10-bit color. But there is another version being promoted by Dolby - which will only work on some displays (although it is backwards compatible). And, apparently, since the Dolby version of HDR isn't required by the spec, many disc players and displays may not support it. There also seems to be yet another version of the standard being promoted by Philips and friends, and that one will definitely only work on devices which include specific hardware support for it. (In other words, it looks like it will be a while before the dust settles on HDR, and worrying about who supports which version is probably a bit premature at this point.)

...the XMC-1 can pass 4k / 60 content just fine... That's good to know, but unfortunately it does NOT tell the full story. Yes, even the current HDMI 1.4 board can pass 2160/60p, but one presumes that, given the Analog Devices chip used, it can ONLY do so with 8-bit color and 4:2:0 color sampling - that is all you can get with a 300 MHz design. Can someone please resolve this once and for all: is the new HDMI board 300 MHz, or 600 MHz? I'm hoping it is the latter, as that means we will be able to do 10-bit or 12-bit color. At least 10-bit color depth is mandatory for all of the new HDR standards, including both HDR10 and Dolby Vision. The 600 MHz clock speed, along with the 18 Gbps bandwidth that comes with it, also gives you the ability to jump up to 4:2:2/12-bit or 4:4:4/8-bit. Yes, 4:4:4 is still a dream, but 10-bit color is NOT. The 600 MHz capability is really going to be commonplace with the likes of Ultra HD Blu-ray, which you can pre-order NOW for delivery this spring. The same goes for the HDR content coming from a few of the streaming services that some of us will watch with a Roku 4 or the INCREDIBLE 16-tuner Dish DVR that will record 4 simultaneous 4K/UHD streams. Before investing in an XMC-1 it would be nice to know this. It would also be helpful to know what the capability of the HDMI inputs on the ESP-1 will be.
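For anyone trying to follow the 300 MHz vs. 600 MHz back-and-forth, the arithmetic is easy to check yourself. Below is a minimal sketch (Python, purely illustrative) that assumes the standard CTA-861 2160p pixel clocks (297 MHz at 24/30 Hz, 594 MHz at 60 Hz) and HDMI's 8b/10b TMDS coding (3 channels x 10 link bits per clock); note that HDMI carries 4:2:2 in a fixed 12-bit container at 1x the pixel clock, while 4:2:0 halves the clock and deeper color scales it up:

```python
# Rough HDMI TMDS-clock estimator - a sketch, not a spec-complete tool.

PIXEL_CLOCK_MHZ = {            # (mode, refresh rate) -> pixel clock in MHz
    ("2160p", 24): 297.0,
    ("2160p", 30): 297.0,
    ("2160p", 60): 594.0,
}

def tmds_clock_mhz(mode, hz, subsampling, bits):
    """TMDS clock needed for a given video mode."""
    pclk = PIXEL_CLOCK_MHZ[(mode, hz)]
    depth = {8: 1.0, 10: 1.25, 12: 1.5}[bits]
    if subsampling == "4:4:4":
        return pclk * depth
    if subsampling == "4:2:2":
        return pclk                # fixed 12-bit container: depth is "free"
    if subsampling == "4:2:0":
        return pclk * 0.5 * depth  # half the chroma samples, half the clock
    raise ValueError(subsampling)

def verdict(clock_mhz, limit_mhz):
    return "OK" if clock_mhz <= limit_mhz else "too fast"

for sub, bits in [("4:2:0", 8), ("4:2:0", 10), ("4:2:2", 12), ("4:4:4", 8)]:
    clk = tmds_clock_mhz("2160p", 60, sub, bits)
    gbps = clk * 30 / 1000         # 3 channels x 10 link bits per clock
    print(f"2160p60 {sub}/{bits}-bit: {clk:6.2f} MHz ({gbps:5.2f} Gbps) - "
          f"300 MHz board: {verdict(clk, 300)}, 600 MHz: {verdict(clk, 600)}")
```

Running it shows why the question matters: 2160p60 at 4:2:0/8-bit needs only 297 MHz, while 4:2:0/10-bit (HDR) needs 371.25 MHz, and 4:4:4/8-bit or 4:2:2/12-bit need the full 594 MHz - so everything beyond plain 8-bit 4:2:0 requires the 600 MHz class of hardware.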
hemster
Global Moderator
Particle Manufacturer
...still listening... still watching
Posts: 51,952
Post by hemster on Jan 15, 2016 17:06:11 GMT -5
The soon-to-be-released HDMI 2.0 upgrade board uses the same switches as the current board, and supports the same data rate (300 MHz), which is the base requirement for HDMI 2.0. The main difference is that it includes HDCP 2.2 support, which is required to pass 4k content. I also feel obligated to point out that I think everybody is getting ahead of themselves with HDR. There is already a basic version of HDR ("HDR10") - which has been added to the 4k disc spec - and which supports 4:2:0 with 10-bit color. But there is another version being promoted by Dolby - which will only work on some displays (although it is backwards compatible). And, apparently, since the Dolby version of HDR isn't required by the spec, many disc players and displays may not support it. There also seems to be another version of the standard being promoted by Philips and friends, and that one will definitely only work on devices which include specific hardware support for it. (In other words, it looks like it will be a while before the dust settles on HDR, and worrying about who supports which version is probably a bit premature at this point.) That's the great thing about standards... there are plenty to choose from!
Post by copperpipe on Jan 15, 2016 17:20:47 GMT -5
The problem is that "the legal guys" don't look at things like this the same way we do. If you leave your front door unlocked and someone walks in, he's guilty of "trespassing"; if you lock it and he forces or picks the lock, he's guilty of "breaking and entering" - a much more serious crime with much more serious penalties. And, for that matter, at least where I live, if I were to shoot someone who had broken into my house, it would be considered justified; the argument would be somewhat less certain if he or she had just wandered in through my unlocked door (where they could claim it was accidental). In none of these cases does the quality of the door lock, or whether it would really stop a determined ten-year-old, matter. And I don't hear many people suggesting that door locks are silly, or that we should all leave our doors open, simply because they won't even slow down a determined burglar. Likewise, by using encryption, and so forcing you to break that encryption in order to make a copy, they've ensured that they can associate significant penalties with copying. Specifically, the DMCA (Digital Millennium Copyright Act), which is honored in much of the world, explicitly states that removing copy protection is illegal... and the penalties for doing so often far exceed the actual penalties for copyright infringement or illegal copying. By defining the act of making the copy as illegal, they can charge you with a crime for doing so without having to prove that you gave or sold a copy to someone. (They lost the battle over your being legally allowed to make a backup, but they've altered the situation so that you can't make that legal backup without breaking a different law, which leads to the same result.) So, from the point of view of "the other side", there may be a gray area about your making copies for your own backup use, and about proving whether you might give one of those copies to a friend (which would be illegal)... but, by decrypting the content, you have committed a crime for which you can be prosecuted. It's even more subtle than that: by making it illegal for you to remove the encryption, they have prevented you from making a copy as a backup - and then possibly giving that copy to your friend later. (And, yes, this does happen. I've known many people who admitted to buying a CD, ripping it onto their computer, then either selling it or giving it to a friend - and NOT destroying their copy once they no longer owned the original. Note that the law does require that, if you sell or give away the original, you delete your backup copy. This is the sort of "casual copying" that preventing you from making even a legitimate copy reduces. And, regardless of how easy the copy protection is to remove, just as forcing someone to buy a lock pick to get into your house discourages at least some break-ins, forcing someone to deliberately purchase an illegal software program to remove the encryption does in fact discourage some video and software piracy.) Just to be clear, I agree with you that the annoyance of HDCP far exceeds its value, and I think it's a waste of everybody's time and effort... although I will admit that it does provide some value in terms of copy protection.

It's silly because it doesn't stop piracy, but it does screw with legit uses. The number of times I've encountered flickering, dropouts, and coloured static in classroom/conference-room presentation systems that traced back to HDCP handshake issues is far too high. Likewise, my computer won't play Blu-rays, despite everything being HDCP protected, because of the way I have to have the GPU split off the video to run my audio on HDMI and my monitor on DisplayPort. Never mind just wanting to record something for later viewing (something the courts have ruled to be explicitly legal). If they want some simple technology to prevent something "casual", then fine - I can live with something like SCMS for S/PDIF, I guess. But going nuts with the encryption is silly. It doesn't help, and it cannot help, prevent piracy. The issue is simply that the device has to have the decryption key for it to work at all, and people can and will attack that and get the key. Look at Blu-ray: two different systems, AACS and BD+, both very complex and supposedly "secure for more than 20 years", and both thoroughly bypassed. They haven't been cracked in the cryptographic sense, but it doesn't matter, because they can be bypassed in various ways and there's software out there that does it no problem. There is no need for HDCP 2.2; it just makes compatibility more difficult. The original HDCP already keeps anyone casual from doing anything, and the pirates have no problem getting around it in other ways.

This makes me wonder: if you are legally allowed to make backups of your discs, but by doing so you break an encryption law, wouldn't a judge say "something is wrong with this picture" and toss out the case? Where does "intent" fit into all this? (I'm obviously not a lawyer.) After all, breaking encryption to steal company secrets is a far different crime (in my eyes) than breaking encryption so that you can exercise your right to a backup.
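The structural point above - that the sink has to hold the decryption key for the link to work at all - is worth spelling out. Here is a deliberately toy sketch (Python, with XOR as a stand-in cipher; real HDCP uses authenticated key exchange and per-session keys, so this illustrates only the trust model, not the actual protocol):

```python
from itertools import cycle

# Toy stand-in for a link cipher. The crypto here is worthless on purpose;
# the point is structural: every licensed sink must contain the key,
# so extracting the key from any one device defeats the whole scheme
# (as happened when the HDCP 1.x master key leaked in 2010).

DEVICE_KEY = b"key-baked-into-every-sink"   # hypothetical shared secret

def link_cipher(data: bytes, key: bytes) -> bytes:
    """XOR stream 'cipher' - encryption and decryption are the same op."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

frame = b"one frame of 4k video"
on_the_wire = link_cipher(frame, DEVICE_KEY)   # what crosses the HDMI cable

# A legitimate TV recovers the picture...
assert link_cipher(on_the_wire, DEVICE_KEY) == frame
# ...and so does anyone who has pulled DEVICE_KEY out of a licensed device.
print(link_cipher(on_the_wire, DEVICE_KEY))
```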
Post by Percussionista on Jan 15, 2016 18:50:42 GMT -5
The soon-to-be-released HDMI 2.0 upgrade board uses the same switches as the current board, and supports the same data rate (300 MHz), which is the base requirement for HDMI 2.0. The main difference is that it includes HDCP 2.2 support, which is required to pass a lot of 4k content. I also feel obligated to point out that I think everybody is getting ahead of themselves with HDR. There is already a basic version of HDR ("HDR10") - which has been added to the 4k disc spec - and which supports 4:2:0 with 10-bit color. But there is another version being promoted by Dolby .... .... Can someone please resolve this once and for all: is the new HDMI board 300 MHz, or 600 MHz? I'm hoping it is the latter, as that means we will be able to do 10-bit or 12-bit color. At least 10-bit color depth is mandatory for all of the new HDR standards, including both HDR10 and Dolby Vision. The 600 MHz clock speed, along with the 18 Gbps bandwidth, also gives you the ability to jump up to 4:2:2/12-bit or 4:4:4/8-bit. Yes, 4:4:4 is still a dream, but 10-bit color is NOT. The 600 MHz capability is really going to be commonplace with the likes of Ultra HD Blu-ray,.... So... the new XMC-1 board will do 300 MHz, which should be OK for TV/movies at 24 frames/sec with 10-bit color at 4:2:0 (i.e., the upcoming UHD Blu-ray discs)... but if anyone was planning on gaming to a 4K display and wanted 60 Hz frame rates, they wouldn't be able to do that unless there was a further upgrade to 600 MHz / 18 Gbps bandwidth? I realize the latter is not prime territory for Emo, but I wanted to have this clarified. So many numbers, so many acronyms... still just a bit confused, sorry. I do aim to get a 4K TV with "full" HDR support, as well as the HDMI 2.0 / HDCP 2.2 that is already pretty standard on the new sets. Still waffling on 2016/2017, but it's not likely to happen until Oppo comes out with a new box, there's pricing on the TVs, etc. And HDR will hopefully get further sorted out too ;-)
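Plugging those two cases into the same back-of-the-envelope formula (a sketch, assuming the standard 2160p pixel clocks of 297 MHz at 24 Hz and 594 MHz at 60 Hz):

```python
# TMDS clock = pixel clock x chroma factor x bit-depth factor (rough model).
def tmds_mhz(pclk_mhz, chroma_factor, depth_factor):
    return pclk_mhz * chroma_factor * depth_factor

movies = tmds_mhz(297.0, 0.5, 1.25)   # 2160p24, 4:2:0, 10-bit
gaming = tmds_mhz(594.0, 1.0, 1.0)    # 2160p60, 4:4:4, 8-bit

print(f"UHD disc movies: ~{movies:.0f} MHz -> fine on a 300 MHz board")
print(f"2160p60 4:4:4 gaming: {gaming:.0f} MHz -> needs a 600 MHz board")
```

So yes: 24p movies with 10-bit 4:2:0 come in around 186 MHz and fit a 300 MHz board comfortably, while 60 Hz gaming with full chroma is a 594 MHz mode and needs the 600 MHz / 18 Gbps class of hardware.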
Post by sycraft on Jan 16, 2016 3:05:43 GMT -5
The HDR thing is likely to remain a mess for a long time. Not only are there format considerations, but there are hardware issues. LCDs can do HDR; basically they just need brighter backlights, which is more expensive but not really an issue - commercial-grade LCDs already have such backlights (some are bright enough to operate in direct sunlight). But then there's OLED. Being an emissive technology, high brightness is not something you'd want to push it to, as it would burn out the display faster. While that isn't necessarily a problem that can't be overcome, it is a real issue.
Post by rogersch on Jan 16, 2016 8:33:31 GMT -5
(In other words, it looks like it will be a while before the dust settles on HDR, and worrying about who supports which version is probably a bit premature at this point.) And here again I think the industry will shoot itself in the foot (as the Dutch saying goes...). I'll hold off on buying equipment (players and video panels) until it is clear which HDR standard will prevail, and I think a lot of people will do the same.
Post by mgbpuff on Jan 16, 2016 9:03:55 GMT -5
But HDR10 will be on every HDR UHD disc. If Dolby Vision is also on an HDR UHD disc, then it is in addition to the HDR10 base. So any HDR-capable display will work with the UHD disc using the HDR10 standard; Dolby Vision will only work on displays with specific Dolby Vision firmware. This sort of thing will go on until the cows come home, so waiting is simply a self-denial version of whipping oneself.
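That layering rule - HDR10 always there as the base, Dolby Vision only as an optional extra - amounts to a simple fallback in player logic. A hypothetical sketch (the function and capability names are made up for illustration, not any real player's API):

```python
def pick_hdr_output(disc_streams: set, display_caps: set) -> str:
    """Prefer Dolby Vision when both disc and display support it,
    fall back to the mandatory HDR10 base layer, else plain SDR."""
    if "dolby_vision" in disc_streams and "dolby_vision" in display_caps:
        return "dolby_vision"
    if "hdr10" in disc_streams and "hdr10" in display_caps:
        return "hdr10"   # guaranteed present on every HDR UHD disc
    return "sdr"

# A Dolby Vision disc on a plain HDR10 set still gets HDR:
print(pick_hdr_output({"hdr10", "dolby_vision"}, {"hdr10"}))  # -> hdr10
```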
cawgijoe
Emo VIPs
"When you come to a fork in the road, take it." - Yogi Berra
Posts: 5,035
Post by cawgijoe on Jan 16, 2016 9:07:48 GMT -5
(In other words, it looks like it will be a while before the dust settles on HDR, and worrying about who supports which version is probably a bit premature at this point.) And here again I think the industry will shoot itself in the foot (as the Dutch saying goes...). I'll hold off on buying equipment (players and video panels) until it is clear which HDR standard will prevail, and I think a lot of people will do the same That's great unless you need to buy a new TV, like I just did because mine died. Sometimes you just can't wait. I do agree that they are much better off agreeing on a standard ahead of implementation, though.
Post by bradford on Jan 16, 2016 14:44:15 GMT -5
The HDR thing is likely to remain a mess for a long time. Not only are there format considerations, but there are hardware issues. LCDs can do HDR; basically they just need brighter backlights, which is more expensive but not really an issue - commercial-grade LCDs already have such backlights (some are bright enough to operate in direct sunlight). But then there's OLED. Being an emissive technology, high brightness is not something you'd want to push it to, as it would burn out the display faster. While that isn't necessarily a problem that can't be overcome, it is a real issue. HDR is about contrast, not just brightness. Both OLED and LED can achieve the necessary levels, but they come at it from opposite directions. OLED has absolute blacks, since it can be completely off, and so needs less brightness to achieve the HDR spec. LEDs are very bright, and with local dimming they can lower the black level to hit the spec. There is a good article in the latest Widescreen Review that covers this in detail. There was some back and forth between the OLED and LCD camps on the spec, and it was agreed that it could be achieved from either direction.
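As a concrete illustration of "coming at it from opposite directions": the UHD Alliance's "Ultra HD Premium" certification announced at CES 2016 defines two paths - roughly 1000+ nits peak with 0.05 nits black for LCD, and 540+ nits peak with 0.0005 nits black for OLED (figures from memory; treat as approximate). A quick check of the contrast ratios those thresholds imply:

```python
# Contrast ratio = peak luminance / black level (both in nits).
# Threshold values are the approximate UHD Alliance Premium figures.
lcd_peak, lcd_black = 1000.0, 0.05
oled_peak, oled_black = 540.0, 0.0005

print(f"LCD path:  {lcd_peak / lcd_black:>12,.0f}:1")   #      20,000:1
print(f"OLED path: {oled_peak / oled_black:>12,.0f}:1") #   1,080,000:1
```

The OLED path trades roughly half the peak brightness for a black level two orders of magnitude lower, which is exactly the trade described above.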
Post by cwt on Jan 18, 2016 1:05:02 GMT -5
But HDR10 will be on every HDR UHD disc. If Dolby Vision is also on an HDR UHD disc, then it is in addition to the HDR10 base. So any HDR-capable display will work with the UHD disc using the HDR10 standard; Dolby Vision will only work on displays with specific Dolby Vision firmware. This sort of thing will go on until the cows come home, so waiting is simply a self-denial version of whipping oneself. It's just as well the UBD-K8500 has 2 HDMI outs, as it will be a pain getting rival CE makers to agree on a comprehensive standard (apart from HDR10). Reminds me of all the other proprietary systems there have ever been - i.LINK (a standard, but not good enough for a lot of CE makers), hence no Denon Link, no Sony H.A.T.S. Hopefully the upcoming full HDMI card for the XMC-1 will have that little 2.0a addendum, when and if it gets sorted out.
Post by ejn1111 on Feb 14, 2016 8:55:04 GMT -5
I was planning on getting the new board upgrade, and I currently have a 40 ft Blue Jeans Cable HDMI run (BJC Belden Series-1)... I think it's rated at 10.2 Gbps up to 50 feet or so. Does this mean I will need a new 18 Gbps HDMI cable to go with the HDMI board upgrade and the new JVC 4k/60 HDR projector I'm looking at getting? I get confused about the bandwidth "truly" needed for HDR etc. to a projector - since it doesn't need the audio streams, does that free up some bandwidth? Not sure if this is the right way to look at it or not.
ps Thanks Emotiva for making the upgrade available at a reasonable cost! Much appreciated.
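On the audio question: HDMI carries audio in the blanking intervals of the video signal, so dropping the audio streams doesn't buy back any video bandwidth - the link still has to run at the video mode's full TMDS rate. A quick sanity check of a 10.2 Gbps-rated cable against the likely projector modes (a sketch; assumes the standard 2160p timings and the rough link-rate formula of TMDS clock x 3 channels x 10 link bits):

```python
CABLE_GBPS = 10.2   # the Series-1's high-speed rating over long runs

def link_gbps(tmds_clock_mhz):
    # 3 TMDS channels x 10 link bits per clock, converted to Gbps
    return tmds_clock_mhz * 30 / 1000

modes_mhz = {
    "2160p24 4:2:0 10-bit (UHD disc movies)": 185.625,
    "2160p60 4:2:0  8-bit":                   297.0,
    "2160p60 4:2:0 10-bit (60 Hz HDR)":       371.25,
    "2160p60 4:4:4  8-bit":                   594.0,
}
for name, clk in modes_mhz.items():
    need = link_gbps(clk)
    print(f"{name}: {need:5.2f} Gbps -> "
          f"{'within' if need <= CABLE_GBPS else 'over'} the 10.2 Gbps rating")
```

By that arithmetic, 24p HDR movies sit well under 10.2 Gbps, while 60 Hz HDR and 4:4:4 modes exceed it on paper - consistent with the Blue Jeans advice quoted below, though ratings on a good cable are often conservative, so trying the existing run first costs nothing.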
Post by millst on Feb 14, 2016 11:31:45 GMT -5
No, existing high speed HDMI cables should work (even at the higher speed).
-tm
Post by ejn1111 on Feb 14, 2016 18:09:48 GMT -5
No, existing high speed HDMI cables should work (even at the higher speed). -tm Just got an email from the Blue Jeans Cable guy saying no passive HDMI cable is likely to work at full 4k settings beyond 25 ft. I will try it first, but it has me thinking I might have to rethink my connections.
Post by junchoon on Feb 15, 2016 1:51:12 GMT -5
Just curious - which 4k JVC projector are you referring to?
Post by millst on Feb 15, 2016 10:40:28 GMT -5
No, existing high speed HDMI cables should work (even at the higher speed). -tm Just got an email from the Blue Jeans Cable guy saying no passive HDMI cable is likely to work at full 4k settings beyond 25 ft. I will try it first, but it has me thinking I might have to rethink my connections. I was wondering if that distance would make a difference. They're the experts, so they're probably right (unless they're trying to sell you a new cable, haha). I have a cable that long that isn't rated for high speed, but it pushed 3D fine, so you might as well give it a shot. -tm