|
Post by BillBauman on Mar 6, 2010 16:07:30 GMT -5
I understand that bitstream contains the LPCM information, but in a compressed form, like a zipped computer file. Can someone confirm whether this is true? If that's the case, it seems to me it doesn't matter whether the unzipping is done in the DVD player or in the pre/pro. In computers there are a number of zip utilities that all seem to produce the same result.
Or is this conversion from bitstream to LPCM the type of process that is subject to the addition of jitter, a time-based error where all the ones and zeros are transmitted but their timing is off?
If the description in the first paragraph is correct, there should be no difference where the decoding is done. If the description in the second paragraph is correct, then the device with the better decoder would be the preferable place to do the decoding.
Your first paragraph is correct, as I tried to explain in my post above. I may not have done a very good job.
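As a minimal sketch of the zip analogy (using Python's zlib as a stand-in for a lossless audio packer; purely illustrative, not anything from a real player's firmware): lossless decompression is deterministic, so the same compressed bytes come out bit-identical no matter which box does the unpacking.

```
import zlib

# Pretend these bytes are a few PCM samples carried inside a lossless bitstream.
pcm_original = bytes(range(256)) * 4

# "Author" the disc: pack the PCM losslessly.
bitstream = zlib.compress(pcm_original)

# Decode once in the "player" and once in the "pre/pro".
pcm_from_player = zlib.decompress(bitstream)
pcm_from_prepro = zlib.decompress(bitstream)

# Both decodes return exactly the original samples, bit for bit.
assert pcm_from_player == pcm_original
assert pcm_from_prepro == pcm_original
print("identical:", pcm_from_player == pcm_from_prepro)  # identical: True
```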
|
|
markd
Emo VIPs
Posts: 182
|
Post by markd on Mar 6, 2010 17:08:00 GMT -5
Yup, the zip analogy is a very good one; most computer users can follow that one.
Things do start to diverge right after the un-zip, though.
If you do it in the PLAYER (which I believe is a required capability for meeting the Blu-ray spec), you can mix in secondary audio from the disc (commentary, etc.).
If you do it in the PROCESSOR, you may have more post-processing options available (PLII, Neo, etc.); some processors seem to have limits if the incoming data is already LPCM.
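For a rough picture of what that in-player secondary-audio mix amounts to (a toy sketch assuming 16-bit samples, a fixed commentary gain, and simple clipping; no real player is this simple):

```
# Toy sketch: mix a decoded secondary-audio track (commentary) into the main PCM.
CLIP_MIN, CLIP_MAX = -32768, 32767

def mix_samples(main, secondary, secondary_gain=0.5):
    """Add the secondary track to the main track sample by sample, then clip."""
    mixed = []
    for m, s in zip(main, secondary):
        value = int(m + secondary_gain * s)
        mixed.append(max(CLIP_MIN, min(CLIP_MAX, value)))
    return mixed

main_pcm = [1000, -2000, 3000, 32000]         # decoded primary audio
commentary_pcm = [500, 500, -500, 4000]       # decoded secondary audio
print(mix_samples(main_pcm, commentary_pcm))  # [1250, -1750, 2750, 32767]
```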
|
|
|
Post by BillBauman on Mar 6, 2010 18:43:14 GMT -5
markd, good points. How the data is interpreted and/or modified once it's been unpacked is up to the device that unpacked it, and to anything downstream from there.
|
|
|
Post by Steve_vai_rules on Mar 13, 2010 7:14:19 GMT -5
There is a bit of a misunderstanding when it comes to this whole idea that it's just converting 1s and 0s into other 1s and 0s and that the result is essentially the same.
Working with large server farms and HPC clusters, I know this is not always true, and I have seen it often. Regardless of how simple the conversion from one set of 1s and 0s to another is, something can go wrong. The less processing and conversion done, the better for the original data.
And unless there is some very expensive ECC code and circuitry involved, I would doubt that the bitstream and LPCM signals would be the same all the time. Does that result in an audible difference? Well, that's open to debate. But to say the signals will always be the same at the end of the day is a flawed argument; you would be describing a perfect system, which isn't possible in our current world of electronics. There are all kinds of forces at play that can cause random changes, and the more being done to a packet or signal, the more chances of those errors occurring. You also can't create a perfect implementation of the conversion technology itself, so there will be differences present regardless.
And as an aside, there are the issues of jitter and post-processing noise.
Also, unlike a learned system, these devices are making interpretations based on mathematical algorithms. There are a variety of implementations of those algorithms, and that too can introduce artifacts, errors, or signal differences.
There is more than just 1s and 0s here. You can only simplify something so much; any further and the argument becomes flawed.
By the way, I have seen hundreds and thousands of compute hours ruined by a few bits of system-induced error out of terabit-sized data sets (and that's with an excessive amount spent on ECC implementations).
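Just to make the failure mode concrete (a toy illustration in Python, with the error injected deliberately; real transmission errors are rare and real ECC lives in hardware, not in code like this): a single flipped bit changes the data, and a simple checksum is one way to notice it.

```
import random
import zlib

payload = bytes(range(256)) * 64          # stand-in for a chunk of audio data
crc_before = zlib.crc32(payload)          # checksum computed before "transmission"

# Inject a single random bit flip, standing in for a rare hardware/link error.
corrupted = bytearray(payload)
pos = random.randrange(len(corrupted))
corrupted[pos] ^= 1 << random.randrange(8)

crc_after = zlib.crc32(bytes(corrupted))
print("data changed:", bytes(corrupted) != payload)    # True
print("checksum caught it:", crc_after != crc_before)  # True for any single-bit flip
```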
Matt
|
|
markd
Emo VIPs
Posts: 182
|
Post by markd on Mar 13, 2010 16:36:41 GMT -5
Hi Matt-
Sorry, I have to respectfully disagree.
There are some pretty large differences in the kinds of systems and errors we are talking about. Sure, will a cosmic ray hit cause an error in my audio/video? Yes, but it is a transient error that only affects a single sample; not a significant error. The reason you need the protections you are talking about in a database/processing system is that errors accumulate and increase as the bad data is re-used. The very nature of the data being worked with in the AV world is transitory: single-hit errors flush out.
Now, if you have a crappy system and you get a ton of errors, then you have a different problem: a crappy system. ;-)
There are no "interpretations" going from bitstream to LPCM; the zip-file metaphor is accurate. How many times can you unzip a file and get the same result? Sure, once in a (looong) while you will get a cosmic ray hit and it will barf, but it will work millions of times before that. While there are multiple implementations, there are also required specs, with test datasets an implementation has to pass before it is certified.
As far as perfect systems go: no, I don't think we can make perfect systems, but the errors are vanishingly small. If they weren't, the internet plain wouldn't work. The number of conversions/packets required for me to read your words is in the millions, yet they come across just fine.
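The transient-versus-accumulating distinction is easy to see in a toy model (my own illustration, nothing to do with a real decoder): a flipped bit in a streamed buffer damages exactly one sample and is gone, while the same flip in a value that gets re-used poisons everything computed after it.

```
# Toy model: one bit error in streamed audio vs. one bit error in re-used data.
samples = [100] * 10

# Streaming case: each output sample depends only on its own input sample.
streamed = list(samples)
streamed[3] ^= 0x40                                # transient single-sample error
damaged = sum(1 for a, b in zip(samples, streamed) if a != b)
print("damaged samples (streaming):", damaged)     # 1, gone with the next buffer

# Accumulating case: each output depends on the running result (re-used data).
clean, tainted_run, running = [], [], 0
for i, s in enumerate(samples):
    clean.append(sum(samples[:i + 1]))
    if i == 3:
        s ^= 0x40                                  # the same single error...
    running += s
    tainted_run.append(running)
tainted = sum(1 for a, b in zip(clean, tainted_run) if a != b)
print("damaged results (accumulating):", tainted)  # 7: every result after the hit
```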
Markd
|
|
|
Post by Steve_vai_rules on Mar 13, 2010 18:19:04 GMT -5
Hey Mark,
That's why I added the stipulation of 'debatable audibility'.
It's not just cosmic-ray errors that can cause those problems; there are also things like jitter and post-processing noise. And yes, although the data in the AV world is transitory and is just being moved from container to container, there is still algorithmic interpretation being done. Yes, a zip file has a standard algorithm behind it, which gives you a spec-based data container that is universal regardless of the code used to interact with it. There are lots of apps that can use zip files, but all they have done is take the algorithm for interpreting the zip format and add it to their own code. Depending on how that code is structured, all kinds of weird and stupid things can happen to that zip file even though it has been implemented 'to spec'. The same applies to every file type: mp3, aac, mpeg4, .xls, jpg, gif, tiff, you name it. It doesn't have to be poorly written or buggy code that shows these limitations; they crop up all the time.
Now, yes, I understand that a pre/pro is a whooooole lot simpler than even some computer applications, but it's still very much like a computer, so the rules still apply. Are the errors that crop up insignificant? Probably, most of the time. But as these devices get more complex and more reliant on applied software models rather than absolute hardware standards, you will likely see those errors become more significant. A perfect example of complexity defeating an otherwise robust system and creating errors would be the whole Toyota fiasco.
Again, are these audible? Probably not. But I wanted to show that this is a more complex issue than people are making it out to be; it's not just moving 1s and 0s from A to B.
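One tiny, concrete example of how two 'to spec' implementations can still differ (illustration only, not code from any real pre/pro): floating-point addition is not associative, so a post-processing stage that sums the same contributions in a different order does not produce bit-identical output.

```
# Both orderings are mathematically "the same", yet the floats disagree.
left_to_right = (0.1 + 0.2) + 0.3
right_to_left = 0.1 + (0.2 + 0.3)

print(left_to_right)                    # 0.6000000000000001
print(right_to_left)                    # 0.6
print(left_to_right == right_to_left)   # False: same data, same spec, different bits
```

Whether a last-bit difference like that is audible is, as you say, another question entirely.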
Matt
|
|
NorthStar
Seeker Of Truth
"And it stoned me to my soul" - Van Morrison
Posts: 0
|
Post by NorthStar on Mar 13, 2010 20:44:12 GMT -5
Linear Pulse Code Modulation is the only true quality game in town.
Bitstream is jittery, juddery, and just plainly & simply buggy.
That is my plain & simple opinion.
|
|