Lonnie
Emo Staff
admin
Pay no attention to the man behind the curtain
Posts: 6,999
|
Post by Lonnie on Sept 7, 2021 16:59:27 GMT -5
Hello one and all,

The thread was temporarily closed while we took a minute to put an XMC-2 on the bench. First, let me say that I can't explain why he got the results he got, where he got the unit he tested, or what the conditions of the unit and the test were. That being said, I can say it was seriously not happy for whatever reason. The fact that he couldn't get video, the length of time it took to move through the menu, and so on are telltale signs that something was not right, and we really should take a look at that unit.

Now, as many have stated, the XMC-2 is based on the exact same architecture as the RMC-1, and as such they should measure reasonably close to each other, and they do. We set up one of our APs to run what we believe to be the exact same test and configured the output to display the same way to minimize confusion. This, my friends, is how they all perform when they leave here.

Lonnie
|
|
|
Post by derwin on Sept 7, 2021 17:04:44 GMT -5
Is that 2nd harmonic at -85 dB legit?
The RMC measures closer to -100 to -105 dB.
Glad to see the LF noise isn’t on all the units, but why is the XMC performing 15 dB worse on THD?
|
|
Lsc
Emo VIPs
Posts: 3,434
|
Post by Lsc on Sept 7, 2021 17:17:31 GMT -5
Yes, why is the SINAD so much worse than the RMC-1's?
AND the XMC-1's?
Is it firmware related? What is the RMC-1's SINAD on the latest firmware?
I’m a little confused but I’m sure there is a good explanation for all of this.
|
|
Lonnie
Emo Staff
admin
Pay no attention to the man behind the curtain
Posts: 6,999
|
Post by Lonnie on Sept 7, 2021 17:18:13 GMT -5
Is that 2nd harmonic at -85 dB legit? The RMC measures closer to -100 to -105 dB. Glad to see the LF noise isn't on all the units, but why is the XMC performing 15 dB worse on THD?

It depends on the reference level used for the test. At -20 dB for the test, all the harmonics go away (which, by the way, is where the unit runs for movies and music). But Amir was testing at 0 dB, so we did as well.
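For readers trying to follow the reference-level argument, here is a minimal sketch (with purely hypothetical component levels - a 2nd harmonic 85 dB down plus an assumed noise contribution, not measurements of any unit) of how a SINAD/THD+N number falls out of what you can read off an FFT plot, and why dropping the harmonics below the noise changes it:

```python
# Minimal sketch: SINAD from components given in dB relative to the fundamental (dBc).
# All component levels below are hypothetical placeholders, not measured values.
import math

def sinad_db(components_dbc):
    """SINAD in dB for distortion+noise components given in dBc (power sum)."""
    total = sum(10 ** (level / 10.0) for level in components_dbc)
    return -10.0 * math.log10(total)

# Case 1: full-scale (0 dB) test tone with a visible 2nd harmonic at -85 dBc
# plus an assumed integrated noise contribution at -110 dBc.
print(sinad_db([-85.0, -110.0]))   # ~= 85 dB -> the harmonic dominates the number

# Case 2: -20 dB test tone where the harmonics drop below the noise,
# leaving only the assumed noise contribution.
print(sinad_db([-110.0]))          # ~= 110 dB -> noise-limited
```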
|
|
|
Post by derwin on Sept 7, 2021 17:19:46 GMT -5
Is that 2nd harmonic at -85 dB legit? The RMC measures closer to -100 to -105 dB. Glad to see the LF noise isn't on all the units, but why is the XMC performing 15 dB worse on THD? It depends on the reference level used for the test. At -20 dB for the test, all the harmonics go away (which, by the way, is where the unit runs for movies and music). But Amir was testing at 0 dB, so we did as well.

The RMC and XMC-1 were also tested at 0 dB and produced much lower THD 😢 Credit where it is due, though - that noise floor looks really good, and yeah, that may have a bigger impact on actual SINAD at listening levels. If that's a design choice, kudos, but I'm just curious why the harmonics got worse. The harmonics are worrisome from an IMD perspective, though - complex music may suffer more from that.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Sept 7, 2021 17:56:10 GMT -5
Hello one and all, The thread was temporarily closed while we took a minute to put an XMC-2 on the bench. First, let me say that I can't explain why he got the results he got, where he got the unit he tested, or what the conditions of the unit and the test were. That being said, I can say it was seriously not happy for whatever reason. The fact that he couldn't get video, the length of time it took to move through the menu, and so on are telltale signs that something was not right, and we really should take a look at that unit. Now, as many have stated, the XMC-2 is based on the exact same architecture as the RMC-1, and as such they should measure reasonably close to each other, and they do. We set up one of our APs to run what we believe to be the exact same test and configured the output to display the same way to minimize confusion. This, my friends, is how they all perform when they leave here. Lonnie

Awesome! Can't wait to observe the dialogue that results from this type of review testing: www.audiosciencereview.com/forum/index.php?threads/emotiva-xmc-2-review-av-processor.26378/page-9#post-903808
|
|
|
Post by louron on Sept 7, 2021 19:11:13 GMT -5
Hello one and all, The thread was temporarily closed while we took a minute to put an XMC-2 on the bench. First, let me say that I can't explain why he got the results he got, where he got the unit he tested, or what the conditions of the unit and the test were. That being said, I can say it was seriously not happy for whatever reason. The fact that he couldn't get video, the length of time it took to move through the menu, and so on are telltale signs that something was not right, and we really should take a look at that unit. Now, as many have stated, the XMC-2 is based on the exact same architecture as the RMC-1, and as such they should measure reasonably close to each other, and they do. We set up one of our APs to run what we believe to be the exact same test and configured the output to display the same way to minimize confusion. This, my friends, is how they all perform when they leave here. Lonnie Awesome! Can't wait to observe the dialogue that results from this type of review testing: www.audiosciencereview.com/forum/index.php?threads/emotiva-xmc-2-review-av-processor.26378/page-9#post-903808

We sure need more information and explanations - the sooner the better. A respectful dialogue would help; people calling others a joke are not helping. Some here can call me whatever they want and say I measurbate, I don't care. It just proves they are stuck in the past. I work for a very large company in product development (including electronic components), and every time we design a product we go through extensive tests and measurements. Science is the key. You are safe flying because engineers tested all the components of the airplane; no, engineers don't just listen to the motors! In the end it can all be measured, and there are standards for performing tests. These tests can be, and always are, replicated multiple times.

I am an Emotiva XMC-2 owner (and own other Emotiva products) and I have quite a lot of issues. I've lost count of the factory resets, funny noises, and strange transitions when I change speaker presets. I do feel mine worked better before I updated to the latest firmware. I would go back to the previous version if I weren't worried about bricking the unit! I have enough issues that I am very careful, and honestly afraid, when I update the firmware.

I am happy the thread is reopened, but closing it didn't make me feel good. A post saying Emotiva was testing and addressing the issue would have been more mature; closing it, in my view, was wrong. I will follow the thread closely. I must say that my XMC-2 is listed for sale because I want to listen to music, not keep working on small issues. I just got tired of the constant adventures. The only thing that made me feel good was thinking it would measure close to the RMC-1, and I think many of us were thinking that. I could live with waiting 14 seconds every time I switch inputs before I get audio and video, and even with all the problems every time I want to do a Dirac Live calibration, as long as I am getting the performance that was sold to me…. Now we have a second test by Lonnie, but it still shows some results that aren't close to the RMC-1. These do not match what I was expecting from the Emotiva website and what I read from their people; it kind of raises more questions.

I do believe in measurements. I am not working for ASR or Emotiva and I am not calling anyone a joke. I just want real answers. If the XMC-2 isn't close to the RMC-1, or even the older XMC-1, then I want to know once and for all. It looks like it took the ASR review to get some attention.

Hopefully we get to the bottom of it with extensive testing and then get posted results. I am not for or against anyone, and I truly hope it is only a defective unit, but I still want to understand all the differences seen in both tests. There are still considerable differences, even in Lonnie's test, between the RMC-1 and the XMC-2; I was hoping they'd be closer. If it isn't that good, then there is no reason for me to live with all the other issues. Still, I am hopeful.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Sept 7, 2021 19:16:30 GMT -5
We sure need more information and explanations - the sooner the better. A respectful dialogue would help; people calling others a joke are not helping. Some here can call me whatever they want and say I measurbate, I don't care. It just proves they are stuck in the past. I work for a very large company in product development (including electronic components), and every time we design a product we go through extensive tests and measurements. Science is the key. You are safe flying because engineers tested all the components of the airplane; no, engineers don't just listen to the motors! In the end it can all be measured, and there are standards for performing tests. These tests can be, and always are, replicated multiple times. I am an Emotiva XMC-2 owner (and own other Emotiva products) and I have quite a lot of issues. I've lost count of the factory resets, funny noises, and strange transitions when I change speaker presets. I do feel mine worked better before I updated to the latest firmware. I would go back to the previous version if I weren't worried about bricking the unit! I have enough issues that I am very careful, and honestly afraid, when I update the firmware. I am happy the thread is reopened, but closing it didn't make me feel good. A post saying Emotiva was testing and addressing the issue would have been more mature; closing it, in my view, was wrong. I will follow the thread closely. I must say that my XMC-2 is listed for sale because I want to listen to music, not keep working on small issues. I just got tired of the constant adventures. The only thing that made me feel good was thinking it would measure close to the RMC-1, and I think many of us were thinking that. I could live with waiting 14 seconds every time I switch inputs before I get audio and video, and even with all the problems every time I want to do a Dirac Live calibration, as long as I am getting the performance that was sold to me…. Now we have a second test by Lonnie, but it still shows some results that aren't close to the RMC-1. These do not match what I was expecting from the Emotiva website and what I read from their people; it kind of raises more questions. I do believe in measurements. I am not working for ASR or Emotiva and I am not calling anyone a joke. I just want real answers. If the XMC-2 isn't close to the RMC-1, or even the older XMC-1, then I want to know once and for all. It looks like it took the ASR review to get some attention. Hopefully we get to the bottom of it with extensive testing and then get posted results. I am not for or against anyone, and I truly hope it is only a defective unit, but I still want to understand all the differences seen in both tests. There are still considerable differences, even in Lonnie's test, between the RMC-1 and the XMC-2; I was hoping they'd be closer. If it isn't that good, then there is no reason for me to live with all the other issues. Still, I am hopeful.

Amirm clarified some of the variations in the testing results on ASR; I provided the link above. With two results and two narratives, I was most curious about the margin of error, and I personally couldn't see how anyone could make any conclusive statements. I mean, 3 out of 3, anyone? Enjoy
|
|
|
Post by routlaw on Sept 7, 2021 19:29:33 GMT -5
I have avoided this conversation due to the potential for it to turn into just what it became: a technical pi**ing match. It's worth pointing out that many engineers do not put much value in SINAD measurements. While I don't have a dog in this race regarding the XMC-2, I do have an RMC-1L that hopefully will ship soon since my XMC-1 arrived in TN today, so the subject was of interest.

If I understand correctly how the errant XMC-2 was measured, it was at 0 dB, which would be ear-bleeding, amp-blowing, speaker-blowing levels - the same as if you ran a source device rated at 2 volts RMS directly into an amp without a preamp in the mix. Even -20 dB for the test is way louder than most people could stay in a room with. With our XMC-1 we were typically at -32 dB for most movies and TV series. The point is, why test a piece of gear at levels that have no relation to reality and sensible everyday use? Makes no sense to me.

In the meantime, for those interested in, or more importantly disturbed by, the ASR review of the XMC-2 - especially the SINAD value - I would encourage you to read through the article linked below. I found it to be a very good tutorial on what SINAD is and why it's outdated and not all that important in the 21st century. Granted, a small part of the article is geared toward how this affects headphone users, but the general thrust and the majority of it is good reading for anyone in this hobby. I hope this helps. Given Lonnie's measurements posted above, methinks this (the ASR review) is much ado about nothing. He makes a good point that this device seemed troubled from the get-go. Thanks.

www.headphones.com/community/reviews-learning-and-news/evaluating-sinad-why-its-not-important
|
|
|
Post by JKCashin on Sept 7, 2021 19:37:38 GMT -5
If I understand correctly how the errant XMC-2 was measured, it was at 0 dB, which would be ear-bleeding, amp-blowing, speaker-blowing levels - the same as if you ran a source device rated at 2 volts RMS directly into an amp without a preamp in the mix. Even -20 dB for the test is way louder than most people could stay in a room with. With our XMC-1 we were typically at -32 dB for most movies and TV series.

I routinely listen at -20 dB and even louder... I think the figure displayed on the front panel is not the whole story, though. Last night, for example, we watched a movie off Netflix at about -15 dB... it was perfect. Impactful, but not "too loud". When it was over, we loaded YouTube and darn near blew the couch back up to the wall. OK, not really, but it was "too loud". So... the question is - and the answer may already be provided in the ASR review - how "loud" was the input source?
|
|
|
Post by arthurz on Sept 7, 2021 19:48:04 GMT -5
While I agree with the article you linked regarding SINAD not being a great metric, ASR did post the entire spectrum, and we can easily see the SINAD is dominated by distortion, not noise. Therefore ignoring ASR due to their mistaken focus on SINAD is sort of throwing the baby out with the bathwater.
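As a rough illustration of that distinction (with assumed component levels, not ASR's actual data), splitting THD+N into its two parts shows whether a figure is distortion-limited or noise-limited:

```python
# Sketch only: hypothetical harmonic and noise levels, expressed in dB relative
# to the fundamental (dBc). Not measurements of any unit.
import math

def power_sum_dbc(levels_dbc):
    """Combine components given in dBc into one dBc figure (power sum)."""
    return 10.0 * math.log10(sum(10 ** (l / 10.0) for l in levels_dbc))

harmonics_dbc = [-85.0, -95.0, -100.0]   # assumed 2nd/3rd/4th harmonic levels
noise_dbc = -112.0                        # assumed integrated noise

thd = power_sum_dbc(harmonics_dbc)
thd_plus_n = power_sum_dbc(harmonics_dbc + [noise_dbc])

print(f"THD   = {thd:.1f} dBc")
print(f"THD+N = {thd_plus_n:.1f} dBc")
# When THD+N is essentially equal to THD, the figure is distortion-dominated,
# which is what the visible harmonic spikes in the published spectrum suggest.
```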
|
|
|
Post by routlaw on Sept 7, 2021 19:52:26 GMT -5
JKCashin wow! I literally could not stay in the room at those levels, either -15 or -20 dB. I have to ask, what is your source? We tend to use an Apple TV 4K for movies and TV and occasionally an Oppo 203, but otherwise it's usually an outboard DAC via a Mac Mini, or a CDP, for music. I'm fairly certain that if I ran things anywhere near those volumes, something would blow up besides my head caving in. Clearly there is something different going on at your place than at ours. Amazing results, but thanks for sharing.
|
|
|
Post by derwin on Sept 7, 2021 20:03:31 GMT -5
I have avoided this conversation due to the potential for it to turn into just what it became: a technical pi**ing match. It's worth pointing out that many engineers do not put much value in SINAD measurements. While I don't have a dog in this race regarding the XMC-2, I do have an RMC-1L that hopefully will ship soon since my XMC-1 arrived in TN today, so the subject was of interest. If I understand correctly how the errant XMC-2 was measured, it was at 0 dB, which would be ear-bleeding, amp-blowing, speaker-blowing levels - the same as if you ran a source device rated at 2 volts RMS directly into an amp without a preamp in the mix. Even -20 dB for the test is way louder than most people could stay in a room with. With our XMC-1 we were typically at -32 dB for most movies and TV series. The point is, why test a piece of gear at levels that have no relation to reality and sensible everyday use? Makes no sense to me. In the meantime, for those interested in, or more importantly disturbed by, the ASR review of the XMC-2 - especially the SINAD value - I would encourage you to read through the article linked below. I found it to be a very good tutorial on what SINAD is and why it's outdated and not all that important in the 21st century. Granted, a small part of the article is geared toward how this affects headphone users, but the general thrust and the majority of it is good reading for anyone in this hobby. I hope this helps. Given Lonnie's measurements posted above, methinks this (the ASR review) is much ado about nothing. He makes a good point that this device seemed troubled from the get-go. Thanks. www.headphones.com/community/reviews-learning-and-news/evaluating-sinad-why-its-not-important

The TL;DR of testing at 0 dB (actually they use output voltage - 4 volts - so it controls for lower source levels or processing attenuation) is:

1. It provides a standard reference to compare all tests against. Everyone is tested at the exact same level, so there is no discrepancy between different results and they can be compared directly.
2. It actually makes SINAD look best. Usually noise dominates SINAD, and at lower volumes that noise doesn't go away, so a lower signal against the same noise floor looks much worse.

ASR has been testing the same way for a long time. I do agree this is probably not realistic and puts extra weight on THD, when it is the noise floor that often matters more to the quality of a user's listening experience. With all that said, the concerns here are two: 1) Why does this unit perform so much worse than the RMC, which it was sold to us as being so similar to? 2) The -85 dB harmonic is plausibly audible, and worse, given the many higher harmonics, it's possible that intermodulation distortion could be worse.
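A small sketch of point 2 above, under the assumption of a fixed absolute output-noise level (the 4 V reference and the 10 µV noise figure are illustrative, not measured values): when noise dominates, every dB taken off the test tone comes straight off the number.

```python
# Sketch: SNR (and a noise-dominated SINAD) vs. test-tone level with a fixed
# absolute noise floor. All numbers are assumptions for illustration.
import math

noise_uv = 10.0          # assumed output noise, microvolts RMS
full_scale_v = 4.0       # ASR-style fixed output level for the headline test

for level_db in (0, -10, -20, -30):
    signal_v = full_scale_v * 10 ** (level_db / 20.0)
    snr_db = 20.0 * math.log10(signal_v / (noise_uv * 1e-6))
    print(f"{level_db:>4} dB tone -> SNR ~ {snr_db:.0f} dB")
# The 0 dB tone gives the best-looking number; every 10 dB drop in tone level
# costs 10 dB of SNR when the noise floor stays put.
```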
|
|
|
Post by JKCashin on Sept 7, 2021 20:06:08 GMT -5
JKCashin wow! I literally could not stay in the room at those levels, either -15 or -20 dB. I have to ask, what is your source? We tend to use an Apple TV 4K for movies and TV and occasionally an Oppo 203, but otherwise it's usually an outboard DAC via a Mac Mini, or a CDP, for music. I'm fairly certain that if I ran things anywhere near those volumes, something would blow up besides my head caving in. Clearly there is something different going on at your place than at ours. Amazing results, but thanks for sharing.

Predominantly an Nvidia Shield, but also a Samsung smart TV and a Roku, and even the tuner. The tuner is the loudest of them all. I just checked -20 dB on a YouTube video and it's not loud at all. I checked with an SPL meter, and at -20 dB the average level at my listening position is 85 dB (A-weighted).
|
|
|
Post by JKCashin on Sept 7, 2021 20:08:20 GMT -5
Just checked the tuner. At a displayed volume of -20 dB, the A-weighted SPL is 95 dB. I'd call that too loud.
|
|
|
Post by AudioHTIT on Sept 7, 2021 20:10:12 GMT -5
Given the variables of speaker sensitivity, amp gain and sensitivity, input sensitivity and level, source device output level, with or without Dirac (and its normalization), trim settings, and anything else I've missed, predicting perceived loudness or volume from a processor gain/attenuation number is just about impossible. I personally think 0 dB is a legitimate level to play some systems at, and that any of the G3Ps should perform well there. It also seems likely that many people normally play their systems around -20. Which is to say, both levels are reasonable, and it would probably be informative to know the measurements under both conditions. It's also been pointed out that ASR didn't test an analog input in Reference, which is something not all processors even have and is of interest to G3P owners. We still don't know why the RMC-1 and XMC-2 measured differently. I appreciate Lonnie's willingness to put a unit on the bench and post the numbers, though I don't think all the questions are answered yet.
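To illustrate how many knobs sit between a processor's volume readout and the SPL at the seat, here is a back-of-the-envelope sketch; every value in it is a hypothetical example, not a spec for any particular piece of gear.

```python
# Rough gain-staging sketch. All numbers are hypothetical examples.
import math

source_dbfs      = -20.0   # how hot the program material sits below full scale
processor_vol_db = -20.0   # volume setting shown on the processor display
amp_gain_db      = 29.0    # assumed power-amp voltage gain
sens_db_1w_1m    = 88.0    # assumed speaker sensitivity, dB SPL @ 1 W / 1 m, 8-ohm nominal
distance_m       = 3.0     # listening distance (single speaker, free-field approximation)

# Assume the processor puts out 2 V RMS at 0 dBFS with the volume at 0 dB.
preout_v  = 2.0 * 10 ** ((source_dbfs + processor_vol_db) / 20.0)
amp_out_v = preout_v * 10 ** (amp_gain_db / 20.0)
watts     = amp_out_v ** 2 / 8.0
spl = sens_db_1w_1m + 10.0 * math.log10(watts) - 20.0 * math.log10(distance_m)
print(f"~{spl:.0f} dB SPL")   # shifts substantially if any single assumption changes
```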
|
|
Lonnie
Emo Staff
admin
Pay no attention to the man behind the curtain
Posts: 6,999
|
Post by Lonnie on Sept 7, 2021 20:15:07 GMT -5
OK, I think I need to clarify a little here. As with anything, there are decisions that are made. In this case the performance was optimized for real-world use rather than the bench test. We could offset the input to the DSP so that no harmonics would ever show up under any circumstances, but then we would be throwing away a lot of real-world headroom. We chose to optimize the performance for real-world use, so that with an input signal of -20 dB the noise floor is roughly 145 dB down, and thus you would get a SINAD of somewhere around 110 to 112 dB. By doing this, the harmonics will only show up under huge dynamic peaks, and even then they are 85 dB down from the fundamental. Which means they are completely inaudible, because they are swamped by the fundamental, and even if you were to play it at a normal listening level, it's still 30 dB below room noise - again, inaudible. Basically, the whole mountain-out-of-a-molehill adage.
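A quick arithmetic check of the audibility argument, using the 85 dB figure from the post and assumed (not measured) SPL numbers for the peak level and the room noise floor:

```python
# Sketch only: the -85 dB relative level is from the post above; the peak SPL
# and room-noise figures are illustrative assumptions.
peak_spl_db     = 100.0   # assumed SPL of a big dynamic peak at the listening seat
harmonic_rel_db = -85.0   # harmonic level relative to the fundamental (from the post)
room_noise_db   = 40.0    # assumed quiet-room noise floor

harmonic_spl = peak_spl_db + harmonic_rel_db
print(f"harmonic lands at ~{harmonic_spl:.0f} dB SPL, "
      f"{room_noise_db - harmonic_spl:.0f} dB below the room noise")
```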
|
|
|
Post by AudioHTIT on Sept 7, 2021 20:16:59 GMT -5
Just checked the tuner. At a displayed volume of -20 dB, the A-weighted SPL is 95 dB. I'd call that too loud. On my system, the tuner has a much higher level than anything else - at least 10 dB, maybe more like 15 dB higher.
|
|
Lonnie
Emo Staff
admin
Pay no attention to the man behind the curtain
Posts: 6,999
|
Post by Lonnie on Sept 7, 2021 20:18:45 GMT -5
One last thing: the XMC-2's main channels measure EXACTLY the same as the RMC-1's, and always have. Nothing has changed, gents. If I remember correctly, Amir tested the RMC-1 at -20 dB.
Lonnie
|
|
Lsc
Emo VIPs
Posts: 3,434
|
Post by Lsc on Sept 7, 2021 20:23:10 GMT -5
OK, I think I need to clarify a little here. As with anything, there are decisions that are made. In this case the performance was optimized for real-world use rather than the bench test. We could offset the input to the DSP so that no harmonics would ever show up under any circumstances, but then we would be throwing away a lot of real-world headroom. We chose to optimize the performance for real-world use, so that with an input signal of -20 dB the noise floor is roughly 145 dB down, and thus you would get a SINAD of somewhere around 110 to 112 dB. By doing this, the harmonics will only show up under huge dynamic peaks, and even then they are 85 dB down from the fundamental. Which means they are completely inaudible, because they are swamped by the fundamental, and even if you were to play it at a normal listening level, it's still 30 dB below room noise - again, inaudible. Basically, the whole mountain-out-of-a-molehill adage.

I still don't understand the differences between the RMC-1 and the XMC-2. Maybe it's what ASR said about the test parameters:

"The main difference is that I was only driving two channels and they are driving all 8. Someone commented that the bug in bass management may be the problem here and this would sort of point in that direction."

"Sample rate is also different (mine is 44.1 kHz and theirs is 48 kHz). Also, they have set the bandwidth to 20 kHz whereas mine extends to 22.4 kHz."

The bottom line here is that, regardless of how important one measurement is versus another, the RMC-1 and XMC-1 are two of the highest-measuring processors, while the XMC-2 resides in the "poor" category. I had a chance to buy the RMC-1L with a 40% card someone gifted me at the end of last year, but I didn't feel the need. Had I known the XMC-2 was technically inferior, I probably would have pulled the trigger. It's a bummer of a day for me and other XMC-2 owners. Our 2-channel performance is inferior to the RMC units' - which is contrary to our conversation at Axpona.
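Of the analyzer-setting differences quoted above, the bandwidth one is easy to bound: for roughly white noise, integrated noise power scales with bandwidth, so the wider 22.4 kHz window can only move a noise-dominated number by a fraction of a dB. A sketch, assuming white noise; it says nothing about the channel-count or bass-management differences amirm mentions.

```python
# Sketch: how much a wider measurement bandwidth alone raises integrated noise,
# assuming an approximately white noise floor.
import math

bw_asr_hz     = 22_400.0   # ASR's stated measurement bandwidth
bw_emotiva_hz = 20_000.0   # Emotiva's stated measurement bandwidth

delta_db = 10.0 * math.log10(bw_asr_hz / bw_emotiva_hz)
print(f"wider bandwidth adds only ~{delta_db:.1f} dB of integrated noise")
# ~0.5 dB -- far too small to account for a double-digit dB gap on its own.
```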
|
|