DAC blind test: NO audible difference whatsoever

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
In trying to attach meaning to this, I can only assume that "the level of differences being claimed" refers to a claim made by the DAC industry that their DACs all sound different, yet the difference is so small that it cannot be detected in these "amateur" ABX tests. That can only have meaning if the DAC industry knows how different its DACs are (yet it cannot or will not tell us in technical terms, in most cases) and knows how sensitive the "amateur" tests are. So we have two people drawing opposite conclusions from the same test:
1. "Your DACs are all so similar that I cannot tell the difference."
2. "Our DACs are so different that if you cannot hear this then your test is not sufficiently sensitive."
An obvious solution is for the DAC industry to tell us how different their products are, so a suitable test can be defined. If they cannot do this, then they cannot bleat on about tests being insensitive.
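The sensitivity dispute can be made concrete with a simple power calculation. This is only an illustrative sketch: the forced-choice ABX protocol with a guessing probability of 0.5 is standard, but the 60% "true" discrimination rate below is a made-up assumption, not a measured value from any test in this thread.

```python
from math import comb

def p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: probability of scoring at least
    `correct` out of `trials` ABX trials by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

def power(trials: int, threshold: int, p_hear: float) -> float:
    """Probability that a listener who genuinely discriminates with
    probability `p_hear` scores at least `threshold` correct."""
    return sum(comb(trials, k) * p_hear ** k * (1 - p_hear) ** (trials - k)
               for k in range(threshold, trials + 1))

# 12 of 16 correct is conventionally "significant":
print(round(p_value(12, 16), 3))    # 0.038
# ...but a listener with a real, subtle 60% discrimination ability
# reaches that score in only about 1 of 6 such tests:
print(round(power(16, 12, 0.6), 3))  # 0.167
```

So a 16-trial ABX can easily return a null result even when a small real difference exists, which is exactly why both sides above can read the same result in opposite ways until the expected effect size is stated.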

DACs sound different to many people

If those who disagree with this rely on flawed techniques - measurements which are known not to tell us how something sounds, & listening tests with no controls, which therefore never demonstrate their suitability to qualify as a "test" - then yes, that flawed evidence is not sufficient for the claims made about the auditory perception of DAC differences

If you don't understand the need for controls within ABX testing then you don't understand perceptual testing! Your pushback against the use of controls in ABX testing goes against all expert advice in perceptual testing.

Would you use equipment of unknown sensitivity to measure something & claim it as evidence of anything? You are swinging in the wind with an approach like this.
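The controls being argued for can be built directly into the trial schedule. A minimal sketch of the idea, where the function name, the counts, and the 0.3 dB positive-control offset are all illustrative assumptions rather than anything from this thread: hidden positive controls (a known-audible difference) calibrate whether the rig can detect anything at all, and negative controls (identical stimuli) flag guessing or bias.

```python
import random

def build_schedule(n_test=16, n_pos=4, n_neg=4, seed=0):
    """Interleave real ABX trials with hidden control trials.

    positive_control: X differs by a known-audible amount
                      (e.g. a 0.3 dB level offset)
    negative_control: A and B are the identical stimulus
    """
    trials = (["test"] * n_test
              + ["positive_control"] * n_pos
              + ["negative_control"] * n_neg)
    rng = random.Random(seed)
    rng.shuffle(trials)  # listener cannot know which trial is which
    return trials

schedule = build_schedule()
# A listener who misses the positive controls shows the setup is not
# sensitive enough to count as a test; "hits" on negative controls
# reveal response bias. Either way, the test calibrates itself.
```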
 
For over two decades at least, deliberately flawed DACs (Wadia, etc.) have been perceptually favored by many users. So maybe some people can hear the difference, but the measurements are irrelevant, since preferences seem to have no correlation with any of them. The same might hold for SET amps with huge distortions; why bother with measurements at all?

Measurements are always going to be about precision and accuracy against a model of a defined ideal outcome. Talk of inventing some kind of measurement to take into account some as yet unknown "desirable" distortions is a waste of time (IMO). People need to get over the fact that it is not about accuracy, since virtually no listening experience presents an exact representation of the stereo signal to the ears. They also need to stop looking for hidden variables; there are none. All equipment/speaker/room combinations are flawed in some way.
 
When you say "deliberately flawed" - by what criteria are you making this categorization - measurements?

Sean Olive has produced a headphone frequency curve which is not flat (flawed?), which he claims tests have shown to be the preferred curve for listeners - they perceive it as sounding more like speaker listening - more like the real thing.

It brings us back to the question of what the goal of audio playback is. To create a believable illusion (we are mostly talking about 2-channel stereo & its own flaws)? If you agree with this, then there is no escaping that auditory perception is responsible for this illusion, & delivering what it expects is the way to produce a believable illusion. Ignoring this, or ignoring the fact that we don't know all the factors in the soundfield used by our auditory perception mechanism, is really a head-in-the-sand attitude

What is being argued by 00940 & QAMatt is not that, however - they claim measurements & listening tests as evidence for no audible difference between DACs.

Measurements are always going to be about precision and accuracy against a model of a defined ideal outcome. The talk about inventing some kind of measurement to take into account some as yet unknown "desireable" distortions is a waste of time (IMO).
I don't know how you can divorce the goal that audio devices are being used for - creating this illusion - from the functionality of the device itself. If you are just trying to create an engineering device whose only function is to produce a set of standard measurements matching some engineering checklist, then fine - state that, but don't make the leap of faith that the device therefore satisfies its use case!

The issue seems to be that your "ideal outcome" is different from the actual end use for which the device is intended.
 
But surely DACs don't have many nonlinearities? And the job they have to perform is the definition of linearity?

For the purposes of low level distortion discrimination tests, many DACs are nonlinear enough to matter.

For the purposes of mixing and mastering records, many DACs are nonlinear enough to matter.

For casual listening, use whatever you like.

The above are my opinions, based on using various different DACs for various different purposes. If it seems counter to what you might expect, I guess it is non-intuitive. Lots of things are.
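The low-level claim above is easy to illustrate numerically. A minimal sketch, assuming pure 16-bit truncation with no dither, which is a worst-case simplification: real converters add dither and contribute their own analog errors on top.

```python
import math

def quantize(x: float, bits: int) -> float:
    """Round a full-scale-normalized sample onto a `bits`-deep grid."""
    steps = 2 ** (bits - 1)
    return round(x * steps) / steps

# A -90 dBFS sine has a peak amplitude of ~3.16e-5 of full scale,
# which is barely one 16-bit step (1/32768 ~ 3.05e-5). Undithered,
# the quantized waveform collapses to just three discrete levels:
# a grossly nonlinear, square-ish wave rather than a sine.
amp = 10 ** (-90 / 20)
orig = [amp * math.sin(2 * math.pi * k / 64) for k in range(64)]
q16 = [quantize(s, 16) for s in orig]
levels = sorted(set(q16))
print(levels)  # a handful of discrete values near zero
```

This is why low-level distortion discrimination is a much harsher use case than casual listening: at these amplitudes a converter's behavior is nothing like the textbook "definition of linearity".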
 
Interesting discussion. Intuitively, I tend to think that many commercial DACs are "good enough" by now, unless they're specifically made to sound a certain way. After all, Apple is able to make a well-functioning DAC for their EarPods that is smaller than a fingernail.

Since my last post in the thread, many pages back, it has come to my attention that there actually have been published blind tests in which differences between modern DACs have been shown to be audible. These tests were done by the Swedish "audio engineering society", Ljudtekniska sällskapet. They employ a type of blind test which they call before/after testing: a device is inserted into the chain, and the test panel attempts to detect differences without knowing when the device is in the chain and when it is not. The participants are all experienced listeners who are used to the testing format. It may be argued that this kind of testing is far removed from ordinary listening, and that's true. But it is a good test of whether a device is in fact audibly transparent to the human ear or not.

Recently, for example, they could easily point out the Oppo Sonica DAC in the chain. The Sonos Connect actually performed well, with only a very slight, but audible, coloration in the bass. Intriguingly, the WXC-50 from Yamaha - which doubles as a streamer and media player - was deemed completely audibly transparent. The WXC-50 costs 350 USD. Unfortunately, the current iteration seems to be plagued by some glitches.

If one really wants to have transparent electronics in the chain, the WXC-50 seems like a good option for the dac, especially if Yamaha are able to fix the glitches. I really can't see any point in spending more money on a dac at this point. Building dacs can be rewarding for the technological challenge, but I doubt that one will be able to improve on the WXC-50. When a device is transparent, it's difficult to make it even more transparent.
 
WXC-50 Specs attached below. Doubt it is transparent. Maybe the guys need to update their test protocols?
 

Attachments

  • WXC-50 Specs.jpg (75.4 KB)
I have seen the specs. What do you see there which immediately marks it out to you as non-transparent?

With regards to their test protocol, I'm sure they would be happy to receive any constructive feedback: Kontakt - Ljudtekniska Sällskapet

That said, these guys have been doing professional blind testing for 20-30 years. They are not your average audiophile hobbyists. I'm not aware of any other organization which has taken blind testing of commercial audio products as seriously (except some manufacturers, like Harman). It's actually a pity that their material is only available in Swedish. They have, for example, consistently labelled most commercial amplifiers as "non-transparent", with very few exceptions (Bryston and a couple of others). So I wouldn't be so quick to discard their results, TBH.
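When reading a spec sheet like the attached one, it helps to convert distortion percentages into dB below the fundamental, since audibility discussions are usually framed in dB. A small helper for that; the 0.05% input is a made-up example figure, not the WXC-50's actual specification (which is in the attachment, not quoted in the thread):

```python
import math

def thd_percent_to_db(thd_percent: float) -> float:
    """Express a THD figure given in percent as dB relative to the
    fundamental, e.g. 1.0% -> -40 dB, 0.1% -> -60 dB."""
    return 20 * math.log10(thd_percent / 100)

print(round(thd_percent_to_db(0.05), 1))  # -66.0 dB (hypothetical figure)
```

Whether a given dB figure is audible then still depends on level, program material and masking, which is exactly what the listening-test side of this argument is about.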
 
The specs show almost nothing, which is typically the case when there is little to show off.

What concerns me the most is the feature set of the unit combined with its price point. I can't see how it is physically possible to have that feature set, and truly transparent sound quality, all at a retail price of $350. Something doesn't add up; I don't think it can be done.

Also, if they are that great, how come all the mastering rooms, studios, etc., aren't using them? Why isn't Yamaha marketing them to the most discriminating customers? Again, something doesn't add up.

Now, if you say lots of good DACs are transparent, I don't believe you. When I replaced the Benchmark DAC-1 with a DAC-3, the new one sounded a little different. Neither one is completely transparent, I don't think. If one sounds at all different from the other, they can't both be transparent, by definition. At least one has to sound slightly colored, right?

But, somehow there is a secret DAC Yamaha makes in a streaming media player that is totally transparent, and not even Yamaha itself knows it? If they did know it, why wouldn't they write it all over the box, and say so in the advertising?

To me, everything says something is wrong here and it would make more sense to investigate it than accept it as true. Things that seem too good to be true usually are.
 
Ok, I see your point. It is of course possible that the Swedes missed something in their testing. They are open to that possibility themselves. This is what they write in their magazine (my translation):

"It is quite obvious that no participants could deliver answers that come close to statistical significance for hearing any difference. Here is the paradox: we in the listening panel would really like to hear a difference! […] Note that we cannot determine the total absence of distortion. We can only say that we have done as much as we can to detect differences in reproduction, and we have failed. It is therefore not impossible that another testing panel could detect audible differences. But we have not succeeded. That's why I think we simply have to take our hats off to the Yamaha WXC-50! The sound that passes through the device sounds unobtrusive, clean and tidy. There is simply nothing to remark on, and there is really no need to write any additional words on how it sounds. Everything played through this device sounds very good. Or, to be more precise: this device reproduces the sound extremely close to the sound it is fed."

Regarding how Yamaha may pull it off: that's the benefit of being a large and evil capitalist corporation, I think. They have significant resources devoted to R&D, including for higher-priced audio components. I would guess that much of this is downstream technology developed for other components, simply being reused in an entry-level product. I would assume, for example, that Yamaha has roughly 100 more people working on audio R&D than Benchmark does.

 
When they refer to a difference, what difference? What are they comparing? Two DACs?

No, this is not a typical A/B or ABX test. It's more sensitive than most other blind test protocols, I think. They have an audio chain with very high-grade components. Then they insert the device into the chain, with a switching device which allows them to listen with and without the device. Can they detect any difference at all with and without the device in the chain? If they can, the device is not transparent. As mentioned, most of the devices they have tested so far have failed this test.
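The same with/without comparison can also be done electrically, as a null test on two captures of the chain's output. A minimal sketch of the arithmetic, assuming the two captures are already time- and level-aligned (which in practice is the hard part); the 0.1% gain error in the toy data is an invented example, not a measured device:

```python
import math

def residual_db(without_device, with_device):
    """RMS of the difference between two aligned captures, in dB
    relative to the RMS of the reference (device-less) capture."""
    n = len(without_device)
    diff_rms = math.sqrt(sum((a - b) ** 2
                             for a, b in zip(without_device, with_device)) / n)
    ref_rms = math.sqrt(sum(a * a for a in without_device) / n)
    return 20 * math.log10(diff_rms / ref_rms)

# Toy data: the "device" applies nothing but a tiny 0.1% gain error.
ref = [math.sin(2 * math.pi * k / 100) for k in range(1000)]
dut = [s * 1.001 for s in ref]
print(round(residual_db(ref, dut), 1))  # -60.0 dB residual
```

A deep electrical null doesn't prove audible transparency by itself, and a panel hearing nothing doesn't prove a perfect null; the two methods answer complementary questions, which is presumably why the panel hedges its conclusion the way the quoted passage does.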
 