|
Post by doc1963 on Apr 6, 2024 9:01:07 GMT -5
You can get any of the current processors now at reduced prices. Why wait if you’re so anxious? Russ Because I need HDMI 2.1, have an Xbox Series X and PS5, along with the 4K BR Player, Roku, Switch, Atari Retro Unit, Quad Tablo with HDMI out. Since you have these needs and didn’t get the response you were looking for, I think you have your answer. Even if someone in the know were to respond that they were “on track”, you’d still be looking at months before you had a product in your hands. Personally, if I needed a product today, I’d buy what’s available today and, most importantly, one that already has a proven track record. With Emotiva’s new processor, even if you are willing to wait a couple of months to get one, how many more months are you willing to wait for potential bug fixes that most likely will not come as quickly as you’d like…? The Anthem AVM 90 that you mentioned isn’t cheap, but it is very highly regarded, seems to work well, and is available today. Good luck with whatever you decide to do…
|
|
|
Post by lavocat on Apr 6, 2024 12:05:38 GMT -5
You can get any of the current processors now at reduced prices. Why wait if you’re so anxious? Russ Because I need HDMI 2.1, have an Xbox Series X and PS5, along with the 4K BR Player, Roku, Switch, Atari Retro Unit, Quad Tablo with HDMI out. Why not a Marantz AV10? Best competitor… imho
|
|
geebo
Emo VIPs
"Too bad that all the people who know how to run the country are driving taxicabs and cutting hair"
Posts: 24,211
|
Post by geebo on Apr 6, 2024 12:32:16 GMT -5
Because I need HDMI 2.1, have an Xbox Series X and PS5, along with the 4K BR Player, Roku, Switch, Atari Retro Unit, Quad Tablo with HDMI out. Why not a Marantz AV10? Best competitor… imho He said he does not like the Marantz units.
|
|
boo
Minor Hero
Posts: 13
|
Post by boo on Apr 7, 2024 8:47:36 GMT -5
Because I need HDMI 2.1, have an Xbox Series X and PS5, along with the 4K BR Player, Roku, Switch, Atari Retro Unit, Quad Tablo with HDMI out. Since you have these needs and didn’t get the response you were looking for, I think you have your answer. Even if someone in the know were to respond that they were “on track”, you’d still be looking at months before you had a product in your hands. Unfortunately, you seem to be correct. Another sign of bad customer service: they are the ones who keep putting these dates out, not me, who was willing to give them thousands of dollars for their product(s). So tomorrow (was going Saturday, but life got in the way), I’ll go to a somewhat local dealer who offered me a great discount on an AVM 90.
|
|
|
Post by doc1963 on Apr 7, 2024 15:02:36 GMT -5
Since you have these needs and didn’t get the response you were looking for, I think you have your answer. Even if someone in the know were to respond that they were “on track”, you’d still be looking at months before you had a product in your hands. Unfortunately, you seem to be correct. Another sign of bad customer service: they are the ones who keep putting these dates out, not me, who was willing to give them thousands of dollars for their product(s). So tomorrow (was going Saturday, but life got in the way), I’ll go to a somewhat local dealer who offered me a great discount on an AVM 90. Please don't take my original response to you as some sort of knock against Emotiva's credibility. My point was simply that if you need something today, buy what's available today and, maybe more importantly, something that already has a stable track record. The Anthem product you mentioned fits those criteria, but don't overlook the fact that Anthem was also very much behind their targeted release schedule. It happens. And after they finally launched, they too had a lot of bugs to work out. Unfortunately, that happens too. But since their two processors have now been on the market for several years, they're pretty solid. You wouldn't be able to say the same if you waited for a new-generation processor from almost anyone, with the lone exception of Marantz. D&M has a massive amount of engineering resources and most often seems to get it "right" before they release.
|
|
|
Post by sebna on Apr 7, 2024 23:12:46 GMT -5
I think you are forgetting Yamaha. They easily dwarf Demon in both stability and scale.
|
|
|
Post by AudioHTIT on Apr 7, 2024 23:43:47 GMT -5
I think you are forgetting Yamaha. They easily dwarf Demon in both stability and scale. Do they make a processor? I’ve only seen AVRs.
|
|
|
Post by doc1963 on Apr 7, 2024 23:53:36 GMT -5
I think you are forgetting Yamaha. They easily dwarf Demon in both stability and scale. Actually, I haven't, but since Yamaha doesn't have a current AV processor on the market, I didn't mention them. The CX-A5200 was discontinued some time ago and there's no sign of an eventual replacement. The OP says he requires HDMI 2.1 compatibility and doesn't want a receiver. I would agree that, historically, Yamaha has made decent products that were operationally stable, but I would disagree that Yamaha "dwarfs" D&M in any way.
|
|
|
Post by doc1963 on Apr 7, 2024 23:57:25 GMT -5
I think you are forgetting Yamaha. They easily dwarf Demon in both stability and scale. Do they make a processor? I’ve only seen AVRs. Nope... The CX-A5200 was their last AVP and was discontinued some time ago. Since there's no sign of a replacement, I believe they're sticking with just marketing AVRs.
|
|
|
Post by idontknow on Apr 8, 2024 14:32:51 GMT -5
So, help me understand the upgrade vs buying the Plus new. Do you really get exactly the same thing as a new one? From the last podcast that showed the guts laid out, it seems that every single component/board would have to be replaced, no? Seems cheap to do that for $1500-ish.
|
|
|
Post by ttocs on Apr 8, 2024 15:01:26 GMT -5
So, help me understand the upgrade vs buying the Plus new. Do you really get exactly the same thing as a new one? From the last podcast that showed the guts laid out, it seems that every single component/board would have to be replaced, no? Seems cheap to do that for $1500-ish. I'm trying to recall some details, but one that I do recall is that the upgrades are done with recycled cases. We send in our unit and get one that's already been configured so we don't have to wait, or we can request that our case gets used and wait longer for the new boards to be installed. The XMC-2+ model will get the former RMC-1 audio board, but all else is new +Series hardware. The warranty on upgrades is 1 year plus some of the remaining warranty left on the unit you send in. But yeah, I think the upgrade pricing is great!
|
|
|
Post by aswiss on Apr 8, 2024 16:56:52 GMT -5
So, help me understand the upgrade vs buying the Plus new. Do you really get exactly the same thing as a new one? From the last podcast that showed the guts laid out, it seems that every single component/board would have to be replaced, no? Seems cheap to do that for $1500-ish. I'm trying to recall some details, but one that I do recall is that the upgrades are done with recycled cases. We send in our unit and get one that's already been configured so we don't have to wait, or we can request that our case gets used and wait longer for the new boards to be installed. The XMC-2+ model will get the former RMC-1 audio board, but all else is new +Series hardware. The warranty on upgrades is 1 year plus some of the remaining warranty left on the unit you send in. But yeah, I think the upgrade pricing is great! And the page is still here ;-) with the same prices.
|
|
Lsc
Emo VIPs
Posts: 3,353
|
Post by Lsc on Apr 8, 2024 18:02:00 GMT -5
So, help me understand the upgrade vs buying the Plus new. Do you really get exactly the same thing as a new one? From the last podcast that showed the guts laid out, it seems that every single component/board would have to be replaced, no? Seems cheap to do that for $1500-ish. You get a new one, and Emotiva will most likely sell the trade-in as a refurb unit. The biggest thing about the trade-in is that it will probably be a little while before the trade-in window opens - so if you want the unit right away (whenever it’s released)… you don’t have to wait.
|
|
|
Post by jasonf on Apr 9, 2024 0:52:41 GMT -5
Why not a Marantz AV10? Best competitor… imho He said he does not like the Marantz units. I've been comparing both for a couple of months (review incoming, I swear...) and the Marantz MAY not be right for him. I will say that musically, the RMC has a bigger, brighter stage than the AV10 for sure. I think the AV10 is a little more accurate and "intimate", but the RMC sings. Right now I think the AV10 dominates for home theater and all-around stability/experience.
|
|
|
Post by lavocat on Apr 9, 2024 3:27:41 GMT -5
He said he does not like the Marantz units. I've been comparing both for a couple of months (review incoming, I swear...) and the Marantz MAY not be right for him. I will say that musically, the RMC has a bigger, brighter stage than the AV10 for sure. I think the AV10 is a little more accurate and "intimate", but the RMC sings. Right now I think the AV10 dominates for home theater and all-around stability/experience. Did you compare the Marantz with a Dirac calibration (done & activated)?
|
|
|
Post by sebna on Apr 9, 2024 6:11:51 GMT -5
He said he does not like the Marantz units. I've been comparing both for a couple of months (review incoming, I swear...) and the Marantz MAY not be right for him. I will say that musically, the RMC has a bigger, brighter stage than the AV10 for sure. I think the AV10 is a little more accurate and "intimate", but the RMC sings. Right now I think the AV10 dominates for home theater and all-around stability/experience. Are you sure you are not tempted to add the AVM 90 to the mix?
And on a more serious note - in terms of object placement and transition between speakers - does either of the two do a better job at it? Would one cause the speakers to disappear more than the other? A more seamless handover of objects between speakers, etc.?
I am asking as I know one very vocal AVM 90 owner who also had the RMC and claims that there is no comparison, in favour of the AVM 90, when it comes to what's mentioned above. However, that guy is so hyperbolic that it is difficult to know what is what with him, and what is his imagination and bias and what is real.
|
|
|
Post by AudioHTIT on Apr 9, 2024 14:10:38 GMT -5
… And on a more serious note - in terms of object placement and transition between speakers - does either of the two do a better job at it? Would one cause the speakers to disappear more than the other? A more seamless handover of objects between speakers, etc.? … Not something I’ve considered from a processor perspective, but some things come to mind as definitely audible in this regard. First (like most other topics), speaker model, type, placement, and toe-in/aiming would have a very large impact on object rendering and creating a ‘seamless bubble’. Also speaker setup: levels, distance, EQ (Dirac, REW, PEQ, general ARC). And of course, even with those things, the room itself has a huge impact, but those are all (somewhat) processor independent. Within the processor, the version or revision of the Dolby and DTS code would seem to be the biggest influence, but also (possibly), if the processor is left in ‘Auto’ mode, its ability to detect and select the proper codec would be a factor. Beyond that, it would seem to get into the more ‘aesthetic’ qualities that are debated with the more loosely defined terms we hear in the audiophile world. Maybe there’s more? Edit: Recently, while watching “Masters of the Air” on TV, I was struck by how seamless the Atmos bubble sounded. It felt like objects were in places I’ve never heard or noticed before, and they moved very naturally. As the scenes were often shot from a pilot or crew perspective, there was truly a complete sphere of locations the sound could be coming from. The production values, and other aspects of this series, are all excellent.
|
|
NicS
Sensei
Will the G4 upgrade help quell my RMC1-L frustrations...?
Posts: 213
|
Post by NicS on Apr 9, 2024 14:26:33 GMT -5
A cinematographer's perspective on whether 8K is better than 4K, or indeed whether it's even preferable? Buckle up.
For me, this depends on a number of criteria. Primarily it's the type of content. So I'll handle these separately.
Movie & TV drama. The prevailing capture frame rate for movies/TV is 23.98fps (24p, nominally; 25p for Europe). While most capture is around the 4K mark, at 24fps the image blur in anything other than graphic content and completely still wide shots drops the effective resolution to around the 1.5K mark. In fact, as someone with 30 years of experience behind the camera, the job of the cinematographer is to do the opposite of what might be called "resolution maximization." Typically, we'll take a 4K digital camera, put on a "detuned" lens (the latest fad in filmmaking), add diffusion filters to the lens, then fill the set with atmospheric mist, shooting "wide open" on the lens, which renders 85% of the frame out of focus, with the in-focus area being subject to spherical aberration (amongst other artifacts), which helps make the image look "beautiful". In post, artificial film grain is added, further deteriorating resolution, until the picture is considered graded and matches shot for shot. It's important to understand that these production criteria are limited to drama, where storytelling is linked to aesthetic, ethereal concerns. It's also important to note the recent resurgence of the use of film as the originating medium. This is mostly an affectation of a certain class of filmmaker; those wishing the film industry was still in their exclusive control (Spielberg, Scorsese, Tarantino, Nolan, and Wes Anderson all still insist on shooting film).
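The "effective resolution" claim above can be sanity-checked with a quick back-of-envelope calculation. This is just a sketch; the 180-degree shutter and the 5%-of-frame-per-second motion speed are my own illustrative assumptions, not figures from the post:

```python
# Rough check of how 24 fps motion blur erodes effective resolution.
# Illustrative assumptions (not from the post): a 3840-px-wide frame,
# a 180-degree shutter (exposure = 1/48 s per frame), and image motion
# that crosses 5% of the frame width per second.
frame_width_px = 3840
fps = 24
shutter_deg = 180
exposure_s = (shutter_deg / 360) / fps    # = 1/48 s per frame

motion_px_per_s = 0.05 * frame_width_px   # 192 px/s of image motion
blur_px = motion_px_per_s * exposure_s    # smear during one exposure: 4 px

# Detail finer than the smear is lost, so a crude effective horizontal
# resolution is the frame width divided by the blur length:
effective_k = frame_width_px / blur_px / 1000
print(blur_px, effective_k)
```

With these assumed numbers the smear is about 4 px, landing the effective resolution near 1K; slightly slower motion puts it in the ~1.5K range mentioned above, which is why a 4K sensor doesn't imply 4K of delivered detail at 24fps.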
Movie & TV Documentary. This changes the criteria a bit. For those documentaries with reenactments & interviews, 24p capture is most prevalent, following the dominant aesthetic. The big pivot comes with nature documentaries. This is most starkly revealed in watching Planet Earth III in 4K Ultra HD from Blu-ray, in 4:4:4, 12-bit output at 60fps. This visceral, window-like realism is genuinely jaw-dropping. It freaks my dog out completely. But it's not the sort of detail you want when looking at Meryl Streep's face. Shooting Meryl this way would result in immediate dismissal as the cinematographer and the sudden end of your career. However, as anyone who has any of the Planet Earth discs will know, this is how you can tell if your projector/monitor is performing correctly. Any changes in setup are immediately apparent with such exacting content.
Reality TV. While 4K capture is typical, it is still not preferable to deliver an image that is maximized for 4K resolution. This would compromise the aesthetic concerns too much. Effort is made to reduce resolution to fit inside the range of what is currently commercially acceptable image quality. However, graphics are distinctly better and more legible compared to 1080p delivery. I doubt 8K would improve anything in this genre.
I used to shoot quite a bit for Victoria's Secret. I did the fashion shows, some advertising, and the ill-fated Swim Special (oh boy). My relationship started with me being brought in to shoot "beauty B-roll" for use in bumpers in the production of the TV-broadcast fashion show. Back in 2005 the best option for "beauty" was to shoot on 35mm negative. The models would exit the stage, then pass through my little studio on the way to the changing room, often getting undressed at the same time. I would then shoot them doing into-the-lens beauty porn. Over the years, I introduced various early digital cameras including the ARRI D-20 (the first really usable digital motion picture camera), Weisscam, and Thomson Viper... but never RED: terrible on skin tone and generally unreliable. My entire role was to make these women look as gorgeous as possible (not hugely difficult) and to hover in this neo-reality of objectification for profit. I was never concerned with the camera resolution; I was only concerned with actual resolution, which I kept under the 1.5K target.
Sport. I'm not much of a sport fan, admittedly. But what I've noticed in terms of resolution isn't so much in the moving image as in the graphic content. In 4K, the graphics can be made to be more legible in a smaller font. And I'm sure this will be true of 8K production. My only exposure to 8K sport is in the production of prototype lenses for capture for dome-style use. This is becoming more of a "thing", but it isn't something that will translate to domestic use for years to come, and certainly not in the realm of panel display. What might be advantageous with 8K origination would be user-designated "picture-in-picture" control. I've been involved in experimentation in this area. For me it has limited appeal, but that is for the market to decide. In this scenario, more pixels would be highly advantageous, and the feature could be limited to 8K deliverables at a "premium" cost, something that excites the bean counters at the streaming services.
Gaming. I'm an avid player of Borderlands. I'm level 72, with a full cache of Legendary 800+ weapons acquired in Mayhem 11 mode. If that means nothing to you, it's OK. But it reflects a prodigious amount of time and a chunk of skill that, in the real world, is utterly, utterly pointless. I can see how 8K display would be great for this; 8K at 120Hz more so. Latency can go *bleep* itself in this realm. The chances of me playing Borderlands on my projector are slim. The screen is in the living room, which forces us into a night-time-only viewing routine. Which I'm fine with. I don't need any daytime distraction. Plus, I think the days of screen-based gaming are soon to be over, pivoting 100% to headset-based display in the next 5 years.
Conclusions. Would I buy an 8K projector? Sure. It's most likely coming, and most likely we won't have a choice. But realistically, projectors are on their way out. Just as the traditional TV has its days numbered.
In Glendale, in an innocuous business park, Sony has a facility that helps filmmakers make the choice of using Sony technology in their productions. Well, "choice" is an odd term. If it's a Sony-produced project, "choice" is limited to their product line. It's the same facility that Claudio Miranda visited when deciding to shoot Top Gun 2 on the Sony Venice 2 camera. In this facility is a viewing theatre in which is installed the Sony Crystal LED system. This scalable, panel-based, modular system is widely used in the corporate market but is currently very expensive. But, as with all things in tech, the price is plummeting. These systems are finding their way into large-scale viewing (the Sphere in Las Vegas) and into the homes of the ultra rich. I know someone who spent $500,000 on a home theatre setup with this technology. His priorities are somewhat distorted by his bottomless wealth.
We are within a decade of this tech being something we buy at Ikea and line walls with, cheaper than the cost of wood panelling. This isn't hyperbole. This is being actively developed. When this technology comes online, 8K, even 16K, will be essential. In fact, the camera developed for generating content for the Sphere in Vegas is 16K. Which is soon to be 24K, to help with artifact reduction. So that's the future. The not-too-distant future at that.
I'll end this textual vomiting with one thought: resolution is a factor to be considered in the creation of content. Storytelling, the type we love in movies and TV, relies on images that tend to be subjective. We all have a different personal interpretation of a movie, less so with documentaries. What is more important, to me at least, is dynamic range and color rendition. To me, a decent 2K projector (like a Christie L2K1500) with a very wide color gamut, high dynamic range, and 15,000 lumens is visually superior to the 4K Sony I have. It's also 12 times the weight and 10 times the cost. My Sony might be better during the credits, but the Christie, when watching movies, is far, far superior. I borrowed one to try out. There was NO COMPARISON. But my wife would never let me bolt a 100 lb lump like that to the ceiling. To me, color rendition and dynamic range are far, far more significant in terms of overall image quality than resolution.
One final thing. Marvel uses tools I developed as a cinematographer in their pre-viz. I have an ongoing relationship with their visual effects supervisors, who live 100% in the world of what screen resolution means. In the past few years, Marvel has taken 4K original material, down-resed it to 2K for most work, only going back to 4K when needed; then, in the final pass, they up-res back to 4K using an AI-driven system. This saves millions of dollars in storage and render time. Nobody, and I mean not a single person, preferred the 4K production path when they tested this at Marvel. It looks better and is cheaper to up-res from 2K.
Thought you might find that interesting. If you didn't, sorry for all the words.
|
|
|
Post by PaulBe on Apr 9, 2024 14:43:52 GMT -5
A cinematographer's perspective on whether 8K is better than 4K, or indeed whether it's even preferable? Buckle up. [snip] My sense is the 8K upgrade is going to be a big nothing for most of us end users who use the processor just for movies. Thanks for the cinematography seminar. I enjoyed it. Edit: I have an 85 lb, 75" diagonal 'lump' bolted to the wall...
|
|
|
Post by marcl on Apr 9, 2024 15:21:58 GMT -5
A cinematographers perspective on whether 8K is better than 4K, or indeed is it preferable? Buckle up. For me, this depends on a number of criteria. Primarily it's the type of content. So I'll handle these separately. Movie & TV drama. The prevailing capture frame rate for movies/TV is 23.98fps (24p, nominally/ 25p for Europe). While most capture is around the 4K mark, as 24fps the image blur in anything other than graphic content and wide shots that are completely still drop to an effective resolution around the 1.5K mark. In fact, as someone with 30 years of experience behind the camera, the job of the cinematographer is to do the opposite of what might be called "resolution maximization." Typically, we'll take a 4K digital camera, put on a "detuned" lens (the latest fad in filmmaking), add diffusion filters to the lens, then fill the set with atmosphere mist, shooting at "wide-open" on the lens which renders 85% of the frame out of focus and that area in focus bing subject to spherical aberration (amongst other artifacts), which helps make the image look "beautiful". In post, artificial film gain is added, further deteriorating resolution until the picture is considered graded and matches shot for shot. It's important to understand that this production criteria is limited to drama, where storytelling is linked to aesthetic, etherial concerns. It's also important to note the recent resurgence of the use of film as the originating medium. This is mostly an affectation of a certain class of filmmaker; those wishing the film industry was still in their exclusive control (Spielberg, Scorsese, Tarantino, Nolan, Wes Anderson all still insist on shooting film) Movie & TV Documentary. This changes the criteria a bit. For those documentaries with reenactments & interviews, 24p capture is most prevalent, following the dominant aesthetic. The big pivot comes with nature documentaries. 
This is most starkly revealed in watching Planet Earth III in 4K Ultra from BluRay, in a 4:4:4, 12bit output at 60fps. This visceral, window-like realism is genuinely jaw dropping. It freaks my dog out completely. But its not the sort of detail you want when looking at Meryl Streeps face. Shooting Meryl this way would result in immediate dismissal as the cinematographer and the sudden end of your career. However, as anyone who has any of the Planet Earth discs will know, this is how you can tell if your projector/monitor is performing correctly. Any changes in setup are immediately apparent with such exacting content. Reality TV. While 4K capture is typical, it is still not preferable to deliver an image that is maximized for 4K resolution. This would compromise the aesthetic concerns too much. Effort is made to reduce resolution to fit inside the acceptable range of what is currently commercially acceptable image quality. However, graphics are distinctly better and more legible compared to 1080p delivery. I doubt 8K would improve anything in this genre. I used to shoot quite a bit for Victoria's Secret. I did the fashion shows, some advertising and the ill-fated Swim Special (oh boy). My relationship started with me being brought in to shoot "beauty B-Roll", for use in bumpers in the production of the TV broadcast fashion show. Back in 2005 the best option for "beauty" was to shoot on 32mm negative. The models would exit the stage, then pass though my little studio on the way to the changing room, often getting undressed at the same time. I would then shoot then doing into-the lens beauty porn. Over the years, I introduced various early digital cameras including the ARRI D-20 (the first really useable digital motion picture camera), Weisscam, Thompson Viper....but never RED: terrible on skin tone and generally unreliable. 
My entire role was to make these women look as gorgeous as possible (not hugely difficult) and to hover in this neo-reality of objectification for profit. I was never concerned with the camera's resolution, only with the actual resolution, which I kept under the 1.5K target.

Sport. I'm not much of a sports fan, admittedly. But what I've noticed in terms of resolution isn't so much in the moving image as in the graphic content: in 4K, the graphics can be made legible in a smaller font, and I'm sure the same will be true of 8K production. My only exposure to 8K sport is in the production of prototype lenses for dome-style capture. This is becoming more of a "thing", but it isn't something that will translate to domestic use for years to come, and certainly not in the realm of panel displays. What might be advantageous with 8K origination is user-designated "picture-in-picture" control. I've been involved in experimentation in this area. For me it has limited appeal, but that is for the market to decide. In that scenario more pixels would be highly advantageous, and the feature could be limited to 8K deliverables at a "premium" cost, something that excites the bean counters at the streaming services.

Gaming. I'm an avid player of Borderlands. I'm level 72, with a full cache of Legendary 800+ weapons acquired in Mayhem 11 mode. If that means nothing to you, it's OK. But it reflects a prodigious amount of time, and a chunk of skill, that in the real world is utterly, utterly pointless. I can see how an 8K display would be great for this; 8K at 120Hz even more so. Latency can go *bleep* itself in this realm. The chances of me playing Borderlands on my projector are slim. The screen is in the living room, which forces us into a night-time-only viewing routine. Which I'm fine with; I don't need any daytime distraction. Plus, I think the days of screen-based gaming are nearly over, pivoting 100% to headset-based display in the next 5 years.
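On the 8K picture-in-picture idea, the pixel budget is easy to see: an 8K frame tiles exactly into sixteen full-resolution 1080p windows. A quick check of my own (not from the post above), using the standard UHD frame sizes:

```python
# Quick pixel-budget check (illustrative, not from the original post):
# how many full-resolution 1080p picture-in-picture windows fit inside
# one 8K UHD frame, with no scaling?
W_8K, H_8K = 7680, 4320    # 8K UHD frame
W_HD, H_HD = 1920, 1080    # 1080p window

tiles = (W_8K // W_HD) * (H_8K // H_HD)
print(tiles)  # 16 -- each PiP window shown at native HD resolution
```

So an 8K deliverable could, in principle, carry a 4x4 grid of viewer-selectable HD feeds with no resolution penalty, which is why the extra pixels matter for this feature in a way they don't for ordinary viewing.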
Conclusions. Would I buy an 8K projector? Sure. It's most likely coming, and most likely we won't have a choice. But realistically, projectors are on their way out, just as the traditional TV has its days numbered.

In Glendale, in an innocuous business park, Sony has a facility that helps filmmakers make the choice of using Sony technology in their productions. Well, "choice" is an odd term: if it's a Sony-produced project, choice is limited to their product line. It's the same facility that Claudio Miranda visited when deciding to shoot Top Gun 2 on the Sony Venice 2 camera. In this facility is a viewing theatre fitted with Sony's Crystal LED system. This scalable, modular, panel-based system is widely used in the corporate market but is currently very expensive. As with all things in tech, though, the price is plummeting. These systems are finding their way into large-scale venues (the Sphere in Las Vegas) and into the homes of the ultra-rich. I know someone who spent $500,000 on a home theatre setup with this technology; his priorities are somewhat distorted by his bottomless wealth. We are within a decade of this tech being something we buy at Ikea and line walls with, cheaper than the cost of wood panelling. This isn't hyperbole; it is being actively developed. When this technology comes online, 8K, even 16K, will be essential. In fact, the camera developed for generating content for the Sphere in Vegas is 16K, soon to be 24K to help with artifact reduction. So that's the future, and the not-too-distant future at that.

I'll end this textual vomiting with one thought: resolution is a factor to be considered in the creation of content. Storytelling, the type we love in movies and TV, relies on images that tend to be subjective; we all have a different personal interpretation of a movie, less so with a documentary. What is more important, to me at least, is dynamic range and color rendition.
To me, a decent 2K projector (like a Christie L2K1500) with a very wide color gamut, high dynamic range, and 15,000 lumens is visually superior to the 4K Sony I have. It's also 12 times the weight and 10 times the cost. My Sony might be better during the credits, but the Christie is far, far superior when watching movies. I borrowed one to try out, and there was NO COMPARISON. But my wife would never let me bolt a 100lb lump like that to the ceiling. To me, color rendition and dynamic range matter far, far more to overall image quality than resolution.

One final thing. Marvel uses tools I developed as a cinematographer in their pre-viz, and I have an ongoing relationship with their visual effects supervisors, who live 100% in the world of what screen resolution means. In the past few years, Marvel has taken 4K original material, down-resed it to 2K for most work, gone back to 4K only when needed, then in the final pass up-resed back to 4K using an AI-driven system. This saves millions of dollars in storage and render time. Nobody, and I mean not a single person, preferred the full-4K production path when Marvel tested this: up-resing from 2K looks better and is cheaper. Thought you might find that interesting. If you didn't, sorry for all the words.

It is very interesting. So if I understand the gist of it... no matter how advanced technology gets at increasing the fidelity, resolution, clarity and precision of image capture (and audio too), the trend will always be to degrade fidelity, resolution, clarity and precision, either during capture or in post, for aesthetic or "paleo-nostalgic" reasons... and to do so using the highest-quality, highest-precision tools. Then the consumer market will be full of very expensive equipment to reproduce this content at resolutions higher than it is humanly possible to comprehend. Pretty much?
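Two of the numbers in this thread are easy to sanity-check with back-of-envelope arithmetic. This sketch is my own addition, not the poster's, and it uses the common broadcast/UHD frame sizes (3840x2160 and 1920x1080) rather than the DCI cinema variants:

```python
# Back-of-envelope checks (illustrative, not from the posts above).

# 1) The Planet Earth III output mentioned earlier: 4K, 4:4:4, 12-bit, 60fps.
#    Uncompressed data rate, ignoring blanking and link overhead:
w, h, fps, bits, channels = 3840, 2160, 60, 12, 3
gbit_per_s = w * h * channels * bits * fps / 1e9
print(f"{gbit_per_s:.1f} Gbit/s")  # 17.9 Gbit/s -- beyond HDMI 2.0's
                                   # effective throughput, hence HDMI 2.1

# 2) The Marvel 2K intermediate pipeline: a 4K frame holds 4x the pixels
#    of a 2K frame, so working in 2K cuts intermediate storage and render
#    cost roughly fourfold (before compression).
ratio = (3840 * 2160) / (1920 * 1080)
print(ratio)  # 4.0
```

The first figure is part of why HDMI 2.1 keeps coming up in this thread for demanding 4K sources; the second is why the 2K-with-AI-upres pipeline "saves millions" at production scale.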
|
|