LCSeminole
Global Moderator
Res firma mitescere nescit.
Posts: 20,848
Post by LCSeminole on Apr 20, 2024 19:32:08 GMT -5
Post by 405x5 on Apr 21, 2024 6:17:19 GMT -5
This hearkens back to the earliest days (DTS, oddly enough) of higher-level audio coupled with a most serious lack of source material. I have nothing against old Queen… but look what they put out there as the first thing to drag you in?
Post by PaulBe on Apr 21, 2024 7:04:01 GMT -5
From the article - "We’re excited to see a streaming alternative to Dolby Atmos"
Why? What will be improved for the consumer?
Also from the article - "The change is coming thanks to Disney’s collection of IMAX Enhanced titles. In the past, viewers of IMAX Enhanced movies like Avengers: Infinity War were able to enjoy the visual side of IMAX’s presentation format: select scenes that are viewable in a 1.90:1 ratio that almost totally eliminate horizontal black bars when viewed on a standard 16:9 ratio TV."
If you play back 16:9 content on a 16:9 TV, the horizontal black bars ARE TOTALLY ELIMINATED. 16:9 = 1.78:1 - not much different from 1.90:1.
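The arithmetic behind these black-bar claims is quick to check; a minimal sketch in Python (the ratios are the ones quoted in the article and this thread, nothing more):

```python
# Fraction of a 16:9 screen lost to letterbox bars when a wider source
# is shown at full screen width with no cropping.
def bar_fraction(source_ratio: float, screen_ratio: float = 16 / 9) -> float:
    """Return the fraction of the screen's height (and area) covered
    by the top and bottom black bars."""
    if source_ratio <= screen_ratio:
        return 0.0  # narrower sources pillarbox instead; not modeled here
    return 1 - screen_ratio / source_ratio

for label, ratio in [("16:9", 16 / 9), ("1.90:1", 1.90), ("2.35:1", 2.35)]:
    print(f"{label} source -> {bar_fraction(ratio):.1%} black bars")
```

With these numbers, a 1.90:1 presentation leaves only around 6% of a 16:9 screen as bars (the article's "almost totally eliminated"), versus roughly 24% for a 2.35:1 scope presentation.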
There are changes coming to Disney.... May Disney reap what it sows.
Post by AudioHTIT on Apr 21, 2024 11:23:03 GMT -5
I only have limited experience with DTS:X, namely the Harry Potter series on UHD disc, but it sounds very good. However, Apple TV doesn’t support it, and if it will require bitstream for pass-through, Apple has been particularly resistant to adding that back to the feature set. I’ll have to watch the Queen concert to see how Apple TV handles the DTS:X audio. IMAX will probably be a while longer for me.
Post by LCSeminole on Apr 21, 2024 12:14:29 GMT -5
I doubt the ATV4K will support it (maybe in the future if there is enough interest?), but I’m thinking the on-board Disney+ app on my Sony OLED just might, since the specs state it supports DTS digital surround. I’ll update this thread if indeed it does.
Post by davidl81 on Apr 21, 2024 16:05:21 GMT -5
Queen Rock Montreal is being released in May on Disney+ in IMAX Enhanced. It shows it will be in DTS:X. I’m very curious to see if my Apple TV will support it. I have the 4K disc on order though, just in case, lol.
Post by markc on Apr 22, 2024 1:03:03 GMT -5
1) Video: In jumping in with that apparently indignant reply, you fail to demonstrate an understanding that 16:9 is a TV standard and very different from how theatrical movies are almost always natively formatted. Watching a movie in 16:9 on a 16:9 TV may well have zero black bars, but a lot of content must be cropped from the sides to allow this.

Avengers: Infinity War was filmed, presented and screened both in 2.35:1 for standard theatres and in IMAX 1.90:1, which had the same image width but increased picture height (more viewable image information top and bottom that was not visible in the 2.35:1 version).

Standard practice with theatrical aspect ratios on a TV is usually a compromise: crop some off the sides and add black bars top and bottom, turning a 2.35:1-2.4:1 native aspect ratio into something around a 2.0-2.2:1 image on a TV, depending on the original ratio. IMAX Enhanced titles such as Avengers: Infinity War, presented and viewed at 1.90:1 on a 16:9 TV, will have all of the image width common to both 1.90:1 and 2.35:1, plus the extra visible image top and bottom to fill the screen. There is your visual benefit to consumers!
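The extra visible picture described above can be put in numbers; a rough sketch, assuming both framings are shown at full screen width on a 16:9 TV (my own back-of-envelope calculation, not an official IMAX Enhanced figure):

```python
SCREEN = 16 / 9  # standard HD/UHD TV aspect ratio, about 1.78:1

def screen_fill(source_ratio: float) -> float:
    """Fraction of a 16:9 screen's area filled by a wider-than-16:9
    source displayed at full screen width (letterboxed, no crop)."""
    return SCREEN / source_ratio

scope = screen_fill(2.35)  # standard theatrical 'scope' framing
imax = screen_fill(1.90)   # IMAX Enhanced framing of the same scenes
print(f"2.35:1 fills {scope:.1%} of the screen")
print(f"1.90:1 fills {imax:.1%} of the screen")
print(f"The 1.90:1 framing shows {imax / scope - 1:.0%} more picture")
```

Same image width, but roughly a quarter more visible picture area top and bottom.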
2) Audio: Regarding the streaming alternative to Dolby Atmos, IMAX Enhanced uses DTS:X and Neural:X decoding to map audio to the available speakers in what may be a more adaptable way. This may present advantages to the consumer that you have disregarded in your post. Atmos decoding mandates overhead speakers; if they are not there, the positional object-based audio is not decoded and presented - only the bed channels. This is a potential weakness of Dolby's approach. With the DTS:X used in IMAX Enhanced audio (and DTS:X Pro), overhead speakers are not mandatory, meaning that all audio is mapped to whatever speakers one has using Neural:X. There is your audio benefit to consumers.

Your interpretation of "may they reap what they sow" is negative, implying that Disney suffers as a result of their choices. I am not a Disney hater, and the "reap what they sow" phrase applies equally in a positive light, with them getting benefit from the work that they put in.

All of this may be irrelevant to me, because whatever audio stream is used, it needs to be presented properly by the hardware (getting back to Emotiva topics). Last night I had to hard reboot my XMC-2 twice before it would decode a Dolby TrueHD 5.1 bitstream so that the audio came from the LCR speakers up front. Persistently, the movie audio was mostly coming from the rear left, rear right and front centre channels instead of the front left, centre and front right. I default to using the Auto setting so that with a Dolby bitstream, the Dolby Surround mode is used to upmix to my 7.2.1. If only it worked reliably! I've had this before with my XMC-2, and no amount of flipping surround modes or HDMI inputs ever fixes it, meaning a cold restart is needed, but this is the first time I have had to do two hard reboots to get it fixed. I wish my hardware were invisible and seamless and let me focus on the content, but 5 years in, these processors have no fixes that do just that.
Emotiva's solution is to release a new processor (the G4P line), and I have every reasonable expectation that 5 years from the release of the G4P, we will be anticipating the G5P as a solution for its predecessor's shortcomings.
Post by PaulBe on Apr 22, 2024 4:24:08 GMT -5
1. Concerning apparent indignance and failure to demonstrate understanding - All home TVs are 16:9. All films shot in a different native format will be fully seen on my 16:9 screen without any user input. I 'just push play'. The picture can be cropped by choice if a player has this feature, or there will be black bars horizontally or vertically, or the picture will fill the 16:9 screen because the native format of the video is 16:9. There is no requirement for a video to be produced in 16:9 - it's the artist's choice.

My 'indignance', as you call it, concerns the industry choice to make formats a consumer concern when it is an artist concern. My TV frame is not going to magically change aspect ratio to suit the video. The video will fit within the aspect ratio of my TV, with no additional manipulation needed. Obviously, the video aspect ratio and the TV's aspect ratio can be different while we experience the full video picture within ANY physical TV aspect ratio. I'm glad we have TVs that can accommodate different video aspect ratios. The potential horizontal and vertical 'black bars' don't concern me. I have films shot in many aspect ratios, including some that use multiple aspect ratios in the same movie. It doesn't concern me. Whatever it IS, it was the artist's choice. I follow along and watch. My TV, physically, remains at the 16:9 TV standard.

I 'jumped in' with comments about the article and silly marketing in the industry. If this is not OK with you, adjust. Did you or a family member write the article? Thanks for the seminar on aspect ratios. What would I do without you? Now, I will go back to watching Casablanca in 4:3, on my 16:9 screen, with vertical left and right black bars and none of the picture cropped… Great film. The 4K version of Casablanca is obviously NOT using all of the 16:9 4K (3840) horizontal pixels, nor is it making sound that needs to be 'adapted' to the latest and greatest fix-the-mix sound solutions. For the board's edification, this Wiki article on 4K seems to be pretty good - en.wikipedia.org/wiki/4K_resolution

2. Concerning disregarded advantages - My speaker setup, and processor speaker and decoding setup, mandates where the sound will go. I make processor choices as I desire within my speaker setup. As long as the industry keeps needing to fix-the-mix in new and exciting ways (so they tell me), I will disregard their solutions. I have adapted very well to my solutions, which don't require the industry's ever-evolving fix-the-mix solutions. My speaker layout is as optimized as I can make it for 7.2.2 Atmos and DTS:X. AURO isn't even an afterthought for me. Adding 4 more holes in the ceiling, at great expense, for .4 heights doesn't hold my interest either.

The industry can't seem to standardize on a common optimum speaker layout between formats. Optimum speaker layouts are different within formats. 5.1 mixes - especially music mixes - don't translate well to recommended 7.1 layouts. Very few people will have multiple speaker layouts in the same room. Therefore, the fix-the-mix solutions will always be a cluster. Add in consumer room layout limitations and the cluster gets bigger. Adaptations are malleable and variable opinions, with moving goalposts. The only constant or consistency in the industry is money. More movement = more money.

3. Concerning Disney sowing and reaping - Disney IS suffering from their choices. May their suffering continue if they keep making the same choices.

4. Concerning off-topic processor concerns which get away from the thread topic - No comment on your processor issues. There is another forum for that, where I have clearly stated my processor concerns.
KeithL
Administrator
Posts: 10,256
Post by KeithL on Apr 22, 2024 10:13:12 GMT -5
Hi guys... I didn't actually read the article... but I just can't resist jumping in with a few "editorial comments"...

Let me start by saying that I agree with Paul here... I have no problem watching a movie with black bars on the sides, or the top and bottom, if my screen happens to have a different aspect ratio than the film. However, I do have to concede that I can see some of the points "from the other side". All modern TV screens have a 16:9 aspect ratio... but that wasn't true back before HD... BUT, back in those days, there were several different aspect ratios used in theaters as well... So, if you were watching a movie filmed in "CinemaScope" at your little local theater, you probably STILL had black bars... Or, maybe, they scaled it up so it fit vertically, and the edges were cut off... Or maybe they did "pan and scan" (where someone actually manually decided which part of the picture to show)... (And that worked out well... unless you had two people shouting at each other, or shooting at each other, from way off on opposite sides of the screen.)

You also may not remember some of the really awful options that some of the early widescreen TVs had... like horizontally stretching only the left and right edges of the screen... sort of like a fun-house mirror. (Watching a scrolling news ticker on one of those was almost as much fun as taking drugs.) And, of course, for some people, the issue isn't the black bars, but the size of the resulting picture. If you play a video that was filmed at an aspect ratio of 2.35:1 on a 16:9 screen, those black bars take up about a quarter of the screen area. (And that was a huge problem on a 4:3 25" CRT screen.) Also remember that, when Casablanca was filmed, it was intended to be seen in a theater... (and not on a smaller home screen). But, at this point, we can't exactly go back and tell all of the guys who produced all of that great old content that "they did it wrong". (Well, we can say it, but it won't help anything.)

With sound we have the same situation... The original artist is always going to do the best they can, to get the effect they want, with what's available. Today, at least in theory, Dolby Atmos enables the sound mixer to "put any individual sound anywhere they want". So, again in theory, this should mark the end of compromise. But, in practice, any good sound designer is going to do their best to make sure that their content works for most of what they consider their important audience. This means that, even if they can have a lot of action going on overhead, they're also going to take into account the knowledge that some of their audience doesn't have those height speakers.

And today we have another little issue... Twenty years ago we could usually assume that the artist was primarily concerned with the theatrical experience... and the home experience was secondary. But, today, some movies aren't shown in theaters at all, and we're placing more emphasis on the home experience. So... as a content creator... do you design your content to look and sound better in a theater... or in a living room? (Or do you compromise so you can have a pretty good experience in both? Or do you make two different versions... and then ask people to choose which one they want to watch?) These are all interesting - and valid - questions.

So... are "DTS:X and IMAX Enhanced" on Disney+ a great new step forward in the technology of home movie streaming? Or just "the latest largely meaningless gimmick to get people to watch that movie on Disney+ instead of Netflix"? (I remember when IMAX was a big deal... but, to be honest, my current local IMAX theater is OK... but not at all awesome.)
Post by marcl on Apr 22, 2024 10:25:40 GMT -5
We're probably in better shape with video aspect ratios today, given that the studios know most people have a relatively large TV compared to years ago, and maybe they are more likely to letterbox the whole enchilada. With Atmos audio they are very likely to compromise the capability of Atmos in favor of playback on earbuds, and also not to offend the paleo-nostalgic memories of older folks who want the Atmos version to still sound like the stereo record they remember from their youth.

I attend whatever AES video programs I can find on the topic of "famous mix engineer/producer talks about Atmos"... and EVERY ONE so far compromised the Atmos for earbuds and the stereo sound... and one guy even admitted he doesn't use objects and just mixes to the bed and tops and "turns knobs until it sounds good". Most clearly don't really understand Atmos but are getting paid good money to make new versions of old records. But I'm most interested in knowing how you determined that watching a stretched ticker is ALMOST as good as taking drugs.
Post by PaulBe on Apr 22, 2024 12:32:29 GMT -5
Hi guys... I didn't actually read the article... but I just can't resist jumping in with a few "editorial comments"... Let me start by saying that I agree with Paul here... I have no problem watching a movie with black bars on the sides, or the top and bottom, if my screen happens to have a different aspect ratio than the film. However I do have to concede that I can see some of the points "from the other side". All modern TV screens have a 16:9 aspect ratio... but that wasn't true back before HD... BUT, back in those days, there were several different aspect ratios used in theaters as well... So, if you were watching a movie filmed in "Cinemascope" at your little local theater, you probably STILL had black bars... Or, maybe, they scaled it up so it fit vertically, and the edges were cut off... Or maybe they did "pan and scan" (where someone actually manually decided which part of the screen to show)... (And that worked out well... unless you had two people, shouting at each other, or shooting at each other, from way off on opposite sides of the screen.) You also may not remember some of the really awful options that some of the early wide screen TVs had... Like horizontally stretching only the left and right edges of the screen... sort of like a fun-house mirror. (Watching a scrolling news ticker on one of those was almost as much fun as taking drugs.) And, of course, for some people, the issue isn't the black bars, but the size of the resulting picture. If you play an actual video that was filmed at an aspect ratio of 2.35:1 on a 16:9 screen those black bars take up a third of the screen area. (And that was a huge problem on a 3:4 25" CRT screen.) Also remember that, when Casablanca was filmed, it was intended to be seen in a theater... (And not on a smaller home screen.) But, at this point, we can't exactly go back and tell all of the guys who produced all of that great old content that "they did it wrong". (Well, we can say it, but it won't help anything.) 
With sound we have the same situation... The original artist is always going to do the best they can, to get the effect they want, with what's available. Today, at least in theory, Dolby Atmos will enable the sound mixer to "put any individual sound anywhere they want". So, again in theory, this should mark the end of compromise. But, in practice, any good sound designer is going to do their best to make sure that their content works for most of what they consider their important audience. This means that, even if they can have a lot of action going on overhead, they're also going to take into account the knowledge that some of their audience doesn't have those height speakers.

And today we have another little issue... Twenty years ago we could usually assume that the artist was primarily concerned with the theatrical experience... and the home experience was secondary. But, today, some movies aren't shown in theaters at all, and we're placing more emphasis on the home experience. So... as a content creator... do you design your content to look and sound better in a theater... or in a living room? (Or do you compromise so you can have a pretty good experience in both?) (Or do you make two different versions... and then ask people to choose which one they want to watch?) These are all interesting - and valid - questions.

So... Are "DTS:X and IMAX Enhanced" on Disney+ a great new step forward in the technology of home movie streaming? Or is it just "the latest largely meaningless gimmick to get people to watch that movie on Disney+ instead of Netflix"? (I remember when IMAX was a big deal... but, to be honest, my current local IMAX theater is OK... but not at all awesome.) 

1. Concerning... 4. Concerning... For discussion, I made the assumption that TVs are modern, large, and 4K. No more small pictures and no stretched pictures. I use a 75" set, which is fairly normal these days. Last week, I saw a 4K 75" set on sale for $700. The current OLED and mini-LED 75" sets are about $2K. I suppose there are still a few working 25" CRTs in homes. Also, I thought all surround sound on discs was remixed for home use. I made that assumption too.
|
|
|
Post by AudioHTIT on Apr 22, 2024 12:46:01 GMT -5
... With sound we have the same situation... The original artist is always going to do the best they can, to get the effect they want, with what's available. Today, at least in theory, Dolby Atmos will enable the sound mixer to "put any individual sound anywhere they want". So, again in theory, this should mark the end of compromise. But, in practice, any good sound designer is going to do their best to make sure that their content works for most of what they consider their important audience. This means that, even if they can have a lot of action going on overhead, they're also going to take into account the knowledge that some of their audience doesn't have those height speakers. And today we have another little issue... Twenty years ago we could usually assume that the artist was primarily concerned with the theatrical experience... and the home experience was secondary. But, today, some movies aren't shown in theaters at all, and we're placing more emphasis on the home experience. So... as a content creator... do you design your content to look and sound better in a theater... or in a living room? (Or do you compromise so you can have a pretty good experience in both?) (Or do you make two different versions... and then ask people to choose which one they want to watch?) These are all interesting - and valid - questions. ...

Along these lines, but with streaming DTS:X content in mind, what choice will be made -- whether by the source, streaming device, or 'system' -- when DTS:X content is to be streamed and the device reports that only Atmos is available? Will the source have an alternative Atmos or surround mix? Will the system fall back to DTS-HD MA and then let Neural:X fill in the empty speakers? Is there (or could there be) a DTS:X to Atmos 'remapper' (as opposed to an upmixer)? I guess we'll find out soon enough.
|
|
KeithL
Administrator
Posts: 10,256
|
Post by KeithL on Apr 22, 2024 14:27:54 GMT -5
You've got the right idea... but you didn't follow it back far enough...

Dolby Digital, Dolby Digital Plus, Dolby TrueHD, and Dolby Atmos are all digitally encoded formats - which require a Dolby decoder to play. DTS, DTS-HD MA, DTS:X, DTS:X Pro, and the new IMAX format, which is based on DTS:X Pro, are all digitally encoded formats - which require a DTS decoder to play. There is no way to "convert between them" without decoding and then re-encoding the content.

If the source device can ONLY encode DTS, but the target device can ONLY decode Dolby, then there is no digitally encoded format that both support... So the source will have to fall back to a format that is NOT encoded... like PCM 2.0... or maybe PCM 5.1. You could then use either the Dolby Surround Upmixer or DTS Neural:X to synthesize more channels from that (unencoded) source. (But the result will not be nearly as pleasing as if your playback device included the proper decoder.)

The reality is that a source could indeed contain or offer alternate Dolby Atmos and DTS:X track sets... However, because both individually take up a lot of space or bandwidth, it's doubtful that a source is going to want to support both. The producer would also have to provide copies of both. In the old days many DVDs had both Dolby Digital and DTS tracks, but that's pretty rare these days.

However most modern PLAYBACK devices support at least some version of both Dolby and DTS... And you would have the option of decoding the content in something like an Apple TV box... which could then output the result as PCM... or even re-encode it. In the end it's just one more thing to complicate the question of: "Who plays what and how well?" (Also remember that, at least in theory, the new DTS IMAX format would also require video metadata to be passed on to the TV.) 
Along these lines, but with streaming DTS:X content in mind, what choice will be made -- whether by the source, streaming device, or 'system' -- when DTS:X content is to be streamed and the device reports that only Atmos is available? Will the source have an alternative Atmos or surround mix? Will the system fall back to DTS-HD MA and then let Neural:X fill in the empty speakers? Is there (or could there be) a DTS:X to Atmos 'remapper' (as opposed to an upmixer)? I guess we'll find out soon enough.
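The fallback chain described above can be modeled as a toy sketch. The track list, format names, and preference order here are illustrative assumptions, not taken from any DTS, Dolby, or HDMI specification:

```python
# What a (hypothetical) streaming title offers, best track first. PCM is
# uncompressed, so the sink needs no decoder to accept it.
SOURCE_TRACKS = ["DTS:X", "DTS-HD MA", "PCM 5.1", "PCM 2.0"]

def pick_output(sink_decoders):
    """Return the best track the sink can actually handle: pass through
    an encoded bitstream only when the sink reports a matching decoder."""
    for fmt in SOURCE_TRACKS:
        if fmt.startswith("PCM") or fmt in sink_decoders:
            return fmt
    return "PCM 2.0"  # last-resort fallback

# A sink with DTS decoders gets the native bitstream passed through...
print(pick_output({"DTS:X", "DTS-HD MA"}))           # -> DTS:X
# ...while a Dolby-only sink forces the source to decode locally and send
# plain PCM, which an upmixer (Dolby Surround or Neural:X) can then
# spread across the remaining speakers.
print(pick_output({"Dolby Atmos", "Dolby TrueHD"}))  # -> PCM 5.1
```

Note that nothing in this sketch converts DTS:X into Atmos; as Keith explains, that would require a full decode and re-encode, which is exactly the step streaming sources tend to avoid.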
|
|
KeithL
Administrator
Posts: 10,256
|
Post by KeithL on Apr 22, 2024 14:34:54 GMT -5
But what percentage of "all the classic content on the entire planet" was made back in the days of 4:3 and CinemaScope? And are we accepting that "in terms of aspect ratio... artistic license is no longer a thing"? (I personally have no problem with that last one... but some artists might.)

And, yes, technically speaking "all surround sound on discs is remixed for home use"... But exactly what that means varies considerably. Did they actually create a specific different custom mix for home viewing? Did they have a human being, or maybe an AI, "create a custom mix from the original master"? Or did they just "mix everything down into fewer tracks"? (Did they actually make a custom mix, optimized to sound good in smaller rooms, or did they just dump the master disc into the "Handy Dandy Stream-o-Matic Mark IV"?)

For discussion, I made the assumption that TVs are modern, large, and 4K. No more small pictures and no stretched pictures. I use a 75" set, which is fairly normal these days. Last week, I saw a 4K 75" set on sale for $700. The current OLED and mini-LED 75" sets are about $2K. I suppose there are still a few working 25" CRTs in homes. Also, I thought all surround sound on discs was remixed for home use. I made that assumption too.
|
|
|
Post by PaulBe on Apr 22, 2024 15:16:11 GMT -5
But what percentage of "all the classic content on the entire planet" was made back in the days of 4:3 and CinemaScope? And are we accepting that "in terms of aspect ratio... artistic license is no longer a thing"? (I personally have no problem with that last one... but some artists might.) And, yes, technically speaking "all surround sound on discs is remixed for home use"... But exactly what that means varies considerably. Did they actually create a specific different custom mix for home viewing? Did they have a human being, or maybe an AI, "create a custom mix from the original master"? Or did they just "mix everything down into fewer tracks"? (Did they actually make a custom mix, optimized to sound good in smaller rooms, or did they just dump the master disc into the "Handy Dandy Stream-o-Matic Mark IV"?)

I see the 16:9 TV as an appliance desktop. What is displayed is determined by the art. I don't care when the art was made, nor in what format. Make the display big and versatile.

I have no control over the quality of content creation. Video choices and audio remix choices are made. The consumer makes purchase choices. I expect the Industry will lead consumers to make purchase choices that suit the Industry. Once in a while, the consumer gets a gem.

I do have a problem with the slow disappearance of worthy art. IMO, art is supposed to lift the human spirit. Much of what is created today denigrates the human spirit. It is what it is.

I am not as fascinated by AI as many other people are. AI does not create itself. AI's most startling feature is how it distills and displays the flaws of the humans behind it.
|
|