geebo
Emo VIPs
"Too bad that all the people who know how to run the country are driving taxicabs and cutting hair"
Posts: 24,181
|
Post by geebo on Feb 25, 2020 12:25:53 GMT -5
Let me just say that I have no clue how he was testing and under what conditions, but I can say this: everyone raves about the sound quality of our gear. That is the one constant that holds true. If the processor was as he says, would it not be reflected in the sound quality? Think about it for a minute. In the meantime, I have asked Ray to put together test data that will be published in a little bit. I would simply publish our full test except it's over 250 pages long and everyone would just get bogged down in the mire. So Ray is running an abridged version now to show the performance specs using industry-standard tests. When I asked him to do this, he wanted to make a few technical comments as they relate to some of his comparisons and how comparing an HT processor to a desktop DAC is an apples-to-oranges comparison, and this goes for all processors, not just us. Lonnie
Lonnie, SQ is subjective. SNR/THD/etc. are not. Why not use numbers to justify your claim?
I can take an old TV with crappy speakers, tweak the equalization and turn up the volume and make it sound better... but lab measurements would reveal the truth.
As Lonnie said in the message you quoted: "I have asked Ray to put together test data that will be published in a little bit." So your numbers are coming.
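Since the thread keeps contrasting subjective SQ with objective SNR/THD numbers, here is a minimal Python sketch of how such numbers are derived from a single sine capture. This is illustrative only: the "capture" is synthesized, the bin-summing approach is simplified, and this is not how Amir's APX555 or Emotiva's bench actually runs the test.

```python
import numpy as np

def thd_snr_db(signal, fs, f0, n_harmonics=5, width=3):
    """Rough FFT-based THD and SNR estimate (in dB) for a captured sine.

    Simplified sketch only: it assumes coherent sampling (an integer
    number of sine cycles in the capture), which is why no window is
    needed. A real analyzer windows, averages, and notches the
    fundamental far more carefully.
    """
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) ** 2

    def tone_power(f):
        b = int(round(f * n / fs))            # FFT bin of the tone
        return spec[max(b - width, 0):b + width + 1].sum()

    fund = tone_power(f0)
    harm = sum(tone_power(k * f0) for k in range(2, n_harmonics + 1))
    noise = spec[1:].sum() - fund - harm      # everything else, minus DC
    return 10 * np.log10(harm / fund), 10 * np.log10(fund / noise)

# Synthetic "capture": 1 kHz sine with a -60 dB second harmonic plus noise.
fs, f0 = 48000, 1000.0
t = np.arange(fs) / fs                        # 1 s, so tones hit exact bins
x = np.sin(2 * np.pi * f0 * t) + 1e-3 * np.sin(2 * np.pi * 2 * f0 * t)
x += 1e-4 * np.random.default_rng(0).standard_normal(len(t))
thd_db, snr_db = thd_snr_db(x, fs, f0)
```

The point being made here holds in the sketch: anyone who runs this on the same capture with the same method gets the same numbers, which is exactly what separates a measurement from a listening impression.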
|
|
|
Post by SOWK on Feb 25, 2020 12:27:55 GMT -5
With all due respect, measurements are only consistent if run under the same conditions. Change any of the external variables and the end results change. That is why there are industry-standard tests with defined procedures. Lonnie
Lonnie, 1-Will Emotiva ever be posting specs for the current processors? 2-Are the latest results in the ASR review flawed in any way, and if so, how? For number 2:
Yes, he didn't perform the voodoo magic the RMC-1 needs to work correctly first.
My post to him on their forum.
"Do a factory reset under Setup/Advanced Turn off the unit Unplug the unit and all HDMI wait 1 min plug the unit back in, no HDMI yet install the official 1.8 Do a factory reset under Setup/Advanced Turn off the unit Unplug the unit with all HDMI still unplugged wait 1 min plug the unit back in, no HDMI yet Turn the unit on Make the changes you need in the setup menu turn off the unit Unplug the unit with all HDMI still unplugged plug the unit back in, and the HDMI. Retest with working HDMI Audio."
|
|
timg
Minor Hero
Posts: 68
|
Post by timg on Feb 25, 2020 12:31:06 GMT -5
I'm very interested to see the measurements Emotiva posts. It would be good to see another third party test another unit to either confirm these measurements or perhaps bring to light other issues. Assuming the expansion modules ever come (particularly the multi-sub Dirac Bass Management one), this is still the only >16-channel processor for <$10k.
Lonnie,
Were there any hardware or assembly changes from the original 2x rounds of pre-order units vs. the current regular production units which might explain some of what Amir is seeing?
Thanks.
Tim
|
|
klinemj
Emo VIPs
Honorary Emofest Scribe
Posts: 14,740
|
Post by klinemj on Feb 25, 2020 12:33:48 GMT -5
With all due respect, measurements are only consistent if run under the same conditions. Change any of the external variables and the end results change. That is why there are industry-standard tests with defined procedures. Lonnie
Lonnie, 1-Will Emotiva ever be posting specs for the current processors? 2-Are the latest results in the ASR review flawed in any way, and if so, how?
I can't speak to #2, but I have seen a lot from the reviewer in the past that gave cause for pause... sketchy might be appropriate. Mark
|
|
Lonnie
Emo Staff
admin
Pay no attention to the man behind the curtain
Posts: 6,999
|
Post by Lonnie on Feb 25, 2020 12:42:34 GMT -5
Let me just say that I have no clue how he was testing and under what conditions, but I can say this: everyone raves about the sound quality of our gear. That is the one constant that holds true. If the processor was as he says, would it not be reflected in the sound quality? Think about it for a minute. In the meantime, I have asked Ray to put together test data that will be published in a little bit. I would simply publish our full test except it's over 250 pages long and everyone would just get bogged down in the mire. So Ray is running an abridged version now to show the performance specs using industry-standard tests. When I asked him to do this, he wanted to make a few technical comments as they relate to some of his comparisons and how comparing an HT processor to a desktop DAC is an apples-to-oranges comparison, and this goes for all processors, not just us. Lonnie
Lonnie, SQ is subjective. SNR/THD/etc. are not. Why not use numbers to justify your claim?
I can take an old TV with crappy speakers, tweak the equalization and turn up the volume and make it sound better... but lab measurements would reveal the truth.
Please re-read my post; the numbers are coming shortly. Lonnie
|
|
|
Post by alexreusch on Feb 25, 2020 12:44:54 GMT -5
There are a bunch of good reviewers out there: John Darko, Steve Guttenberg, Thomas & Stereo, Zero Fidelity (all available on YouTube)... and many more. I also like "That Home Theater Dude" for home cinema stuff. But in the end, it is you alone who decides what you like and what you don't. Reviewers can only give you hints, and your decision should never be based only on the opinion of reviewers. We all have different ears! We all have different tastes. Nobody tells me what I like! Measurements, on the other hand, don't change depending on who is doing them. They tell the truth and not just what someone else likes. Now, whether measurements directly relate to what I like is a different question. That depends on how those reviews are done. A good reviewer normally does not tell you which equipment you should buy, but instead gives you an idea of how he experienced the reviewed product. That's what I like about the reviewers mentioned. Have a look at the work of John Darko or Thomas & Stereo, for example. I really like the way they do their reviews. I really don't care about measurements. It says nothing to me...
|
|
|
Post by rhale64 on Feb 25, 2020 13:03:40 GMT -5
All I can say is that when I had my RMC1 and a BAT VK40SE at the same time, I listened to both and decided to sell the BAT. The RMC1 sounded just as clean but with more dynamics. The BAT sounded a little dull in comparison. So I am thinking it was a faulty unit behind these measurements, or something with the stinking firmware screwing with things. I don't doubt the measurements, however; I just think something was wrong. I have also now compared my RMC1L to a Parasound Halo P5 and believe that the RMC1L sounds much better: quieter and cleaner sounding. Also, my buddy has a BAT tube pre that I believe doesn't sound any better than the RMC1L. We both have Aerial speakers. He has a Brooklyn DAC, and I don't have my DAC yet; I am still waiting on my custom-ordered Holo Audio Spring DAC2 Kitsune tuned edition. We tried his Mytek and compared it to the USB DAC input on the RMC1L, and the Brooklyn sounded better, with a deeper soundstage. But other than that, I thought the RMC1L sounded a little smoother overall, and this can probably be attributed to the AKM vs. Sabre sound. So the RMC1L is letting everything through, even the differences in DACs.
|
|
|
Post by Thunderduck on Feb 25, 2020 13:06:13 GMT -5
I very well could be wrong, but it is my understanding that the test unit was provided by an owner, not Emotiva. Could this in any way affect the test results?
|
|
timg
Minor Hero
Posts: 68
|
Post by timg on Feb 25, 2020 13:06:43 GMT -5
Subjective review: My S2000 is really fast. Objective review: The new Honda Odyssey has a better power-to-weight ratio and a faster 0-60 than the S2000.
Both are true in their own way. An Odyssey doesn't drive or handle like an S2000, but at a stoplight, the S2000 is getting smoked...
Subjective without objective is useless.
Objective without subjective is questionable too. I use a lot of programs at work that spit out data. It doesn't necessarily make that data correct or mean that I understand it. It's very common to see colleagues with data that is incorrect, but it's what the program gave them so they trust it.
In audio and video, objective reviews with quantifiable results are mandatory. If we only listened to reviewers, everyone would be buying $10k snake oil to add to their cable terminations for better electron transfer, and magical cable bridges that lift cables off the ground for improved transparency... We would also all have TVs set on Vivid 100% of the time.
Most of us are not experts in audio measurement and don't have the appropriate tools and procedures, so we're heavily reliant on the person providing the information to do it properly. Amir used multiple FW revisions, suffered dozens of lockups, and ran a handful of new tests to attempt to understand and better explain the weird results he was getting. A single test or set of data points isn't concrete proof, but I think he's done an excellent job pointing out that further review is necessary and that there appear to be some fundamental design issues that should be addressed. We should all be grateful to him for the time and effort he put into this test.
Tim
|
|
|
Post by SOWK on Feb 25, 2020 13:10:45 GMT -5
I very well could be wrong, but it is my understanding that the test unit was provided by an owner, not Emotiva. Could this in any way affect the test results? No.
Other things could have though.
HDMI was not used, and there may have been software issues because he likely did not unplug and re-plug the HDMI and power to the unit before testing.
|
|
|
Post by rbk123 on Feb 25, 2020 13:11:15 GMT -5
The HTP-1, although it tested the best so far, didn't test that great, and there were similar up-in-arms responses over at AVS. One guy returned it because of the measurements.
I don't know if his measurements are trustworthy, since the RMC didn't seem to behave correctly for him, but let's assume his results are accurate. The only question that matters is: does it sound and perform worth $5k or $4k or whatever you ended up paying for it? Perform is the bugginess factor and features not working, but sound is sound - it doesn't matter what it measures, it matters how it sounds.
Now, still assuming his measurements are trustworthy, anecdotally the next question is: would it sound even better if the design changes were made to improve the results, per his suggestion? Some of the best measuring equipment out there sound like a$$, and some of the worst measuring gear sounds great. So that's a question we'll never know the answer to.
For those freaking out about the measurements who were considering buying one - when the time comes, buy it, try it out. Buy the HTP-1 or whatever and try it out in the same timeframe. Keep the one you like the most, and return the other. It's really not that complicated.
|
|
richb
Sensei
Oppo Beta Group - Audioholics Reviewer
Posts: 858
|
Post by richb on Feb 25, 2020 13:16:06 GMT -5
I very well could be wrong, but it is my understanding that the test unit was provided by an owner, not Emotiva. Could this in any way affect the test results?
You are correct; it was supplied by a user on AVS. Perhaps there is a problem with the unit. I also do not understand the issues he was having, unless there is something with the AES input, which I doubt is getting much use. - Rich
|
|
cawgijoe
Emo VIPs
"We made too many of the wrong mistakes." - Yogi Berra
Posts: 4,895
|
Post by cawgijoe on Feb 25, 2020 13:31:19 GMT -5
I had never heard of this guy, but I haven't frequented AVS in a long time. I did take a look at his test data and of course the charts look impressive. The AVS folks obviously like him because they are bashing Emotiva for the most part in their comments and some are insinuating that the Emotiva test data will be skewed.
Does anyone know what Amir's qualifications are? What is his background? Who does he work for? Is he an expert? This is not to demean him, I just don't know.
I'm skeptical as I find it very odd that a company like Emotiva would put out a product that measures horribly in the first place. Why would you do that?
Very strange.
|
|
|
Post by urwi on Feb 25, 2020 13:31:41 GMT -5
We all have different ears! We all have different tastes. Nobody tells me what I like! Measurements, on the other hand, don't change depending on who is doing them. They tell the truth and not just what someone else likes. Now, whether measurements directly relate to what I like is a different question.
With all due respect, measurements are only consistent if run under the same conditions. Change any of the external variables and the end results change. That is why there are industry-standard tests with defined procedures. Lonnie
So far you haven't posted any specs, so I'm looking forward to your report! And your test conditions. ASR also uses AP gear like you do. AFAIK they documented the test conditions in their review. I can only imagine your results are state of the art, as they should be for a $5000 factory-direct device.
|
|
koeitje
Minor Hero
Warning
50%
Posts: 28
|
Post by koeitje on Feb 25, 2020 13:35:46 GMT -5
Measurements on the other hand don't change depending on who is doing them. They tell the truth and not just what someone else likes. Measurements can be affected by noise from various sources. Using a PC to generate a signal is suspect to me. Cheap probes, a cheap scope (or is the reviewer using a virtual scope?), jumpers attached to probes (who knows?), or any test component not rated for the test can allow noise to be injected into the signal. So measurements are equipment- and personnel-dependent. Measurements "can" be the same between multiple lab technicians if the same equipment is used under the same lab conditions. It's more likely that the measurements would be "close", but not identical, which is perfectly acceptable.
Yes, you make a good point here. That's why you need somebody who knows what they are doing, has a bunch of measurements to compare against, has had their results verified by multiple manufacturers, and has the right tools. That's what makes this review good. Amir is using an APX555 to do his measurements; it is the industry standard in audio measurement. This little piece of kit goes for $28,000 excluding extra modules. He has done a ton of measurements, and many have been confirmed by manufacturers, mainly the smaller ones that are open to improving their products. And just so you know, the PC is not generating the signal; the PC just controls the APX555. Also, nobody is expecting an HT processor to match a stereo DAC, because in a stereo DAC you don't have to deal with all the signal processing and noise in an AVR, and you can stack channels. Disabling all the signal processing should give results similar to this 8-channel DAC: www.audiosciencereview.com/forum/index.php?threads/review-and-measurements-of-okto-dac8-8ch-dac-amp.7064/. It's no surprise Okto publishes their own measurements on their website: they know what they are doing. But I hear that a simple reset should fix all problems, so I'm looking forward to SOTA performance.
|
|
|
Post by urwi on Feb 25, 2020 13:39:38 GMT -5
We all have different ears! We all have different tastes. Nobody tells me what I like! Measurements, on the other hand, don't change depending on who is doing them. They tell the truth and not just what someone else likes. Now, whether measurements directly relate to what I like is a different question.
That depends on how those reviews are done. A good reviewer normally does not tell you which equipment you should buy, but instead gives you an idea of how he experienced the reviewed product. That's what I like about the reviewers mentioned. Have a look at the work of John Darko or Thomas & Stereo, for example. I really like the way they do their reviews. I really don't care about measurements. It says nothing to me...
It's certainly more convenient to follow someone you trust than to learn about engineering.
|
|
|
Post by richter250 on Feb 25, 2020 13:43:23 GMT -5
Lonnie, 1-Will Emotiva ever be posting specs for the current processors? 2-Are the latest results in the ASR review flawed in any way, and if so, how?
I can't speak to #2, but I have seen a lot from the reviewer in the past that gave cause for pause... sketchy might be appropriate. Mark
Sketchy is very appropriate. He gave a terrible "review" with bad measurements to the PS Audio DirectStream DAC. That piece of gear is very well respected and has received glowing reviews from many trusted reviewers. He is a guy in his basement with some fancy testing gear that he may or may not know how to use. That is all.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Feb 25, 2020 13:45:25 GMT -5
RE: firmware 1.8
1. Locking onto sound sources appears to be quicker and more reliable.
2. When pausing/resuming a source on my cable box I get two consistent (and largely acceptable) results. A. When using a PCM 2.0 source - audio begins immediately upon resuming play. B. When using a DD or DD+ source - audio plays - mutes - then plays (all within 2-3 tenths of a second).
3. When streaming Netflix/Prime on my cable box I still have random/infrequent static when resuming from a paused state.
4. It looks like the new firmware default is to camp on the most recently received source format instead of falling back to PCM 2.0 - for most of us, I believe this should result in more consistent and reliable performance. Hopefully, the issue of lost audio and/or lost channels is over (TBD).
5. Network connectivity comes and goes, but "refreshing" the PC and iOS app(s) seems to restore functionality.
|
|
|
Post by mikoz on Feb 25, 2020 13:48:55 GMT -5
We all have different ears! We all have different tastes. Nobody tells me what I like! Measurements, on the other hand, don't change depending on who is doing them. They tell the truth and not just what someone else likes. Now, whether measurements directly relate to what I like is a different question. That depends on how those reviews are done. A good reviewer normally does not tell you which equipment you should buy, but instead gives you an idea of how he experienced the reviewed product. That's what I like about the reviewers mentioned. Have a look at the work of John Darko or Thomas & Stereo, for example. I really like the way they do their reviews. I really don't care about measurements. It says nothing to me...
Yeah, it's like when you go to a doctor: if the doctor doesn't do any blood tests to reveal how things are working... as long as you feel OK... then measurements and tests against baselines aren't important. Just be happy that some guy sitting on his couch did a review and concluded it sounds better than brand XYZ... that's more important. SQ is subjective; real lab measurements are not. You may think it sounds better, and you can just "think" that.
It's interesting that the measurements seem to all point to a crappy power supply. When you pick up the RMC1, it's pretty easy to tell there's not much weight to the PSU. When I first lifted it out of the box, I nearly hit myself in the face because I was expecting it to weigh so much more; it looked like something out of an episode of "The Office".
|
|
|
Post by thxultra on Feb 25, 2020 14:04:08 GMT -5
My biggest question with this guy's reviews is: what is his test bench? I do agree with his gripes about the firmware, though. I haven't tried 1.8 yet, but 1.7 does have its share of issues. The "Please wait" issue is something most people aren't going to want to deal with. I also think there are way more problems than there should be with a processor at this price point. It will be interesting to see how Emotiva's numbers compare, and also to see another review with actual data. One thing I will say is that it is strange this processor has been out so long with no real reviews of it until this guy's... Emotiva has some answering to do.
|
|