Funniest snake oil theories

I actually had to turn it down to avoid clipping.
I absolutely believe you that a real difference existed. But I can't see how that could be causing clipping with the same amp. It would mean a gross and easily measurable difference, which is hard to imagine when switching digital sources.
Cheap DVD players are crap; a Sony one that sells for ~$100 does not even have a separate D/A chip.
One of the things on my to-do list is to try a vintage R2R-based CD player again, now that my system has reached what I believe to be a "decent" level.
I've heard way too many reports of huge differences from people I tend to trust to think it's just a myth. Back when I did such a comparison myself, my system was likely not revealing enough.

and I try to understand what we hear, either conceptually or by measurement, of any new device that comes on the market.
That's what I want too, even if only for the academic exercise: something which is intriguingly hard to understand, especially for a forum that's "geek"-oriented (and I don't mean that in an offensive way).
 
I absolutely believe you that a real difference existed. But I can't see how that could be causing clipping with the same amp.

Well, that was my subjective observation. Maybe I turned it up louder because it sounded cleaner? I did not do objective measurements.

It seemed to me that the dynamic range was modestly improved. To my ears the most objectionable aspect of recorded music is excessive dynamic compression. Like when you hear a song on the radio that you listen to at home from a CD (which I know is compressed too), and the crescendo is coming, and it's not there. ;) That annoys me.

Would there be a difference in dynamic range between players? There was certainly a difference in the sound between the two players. And I know the Sony isn't all that great either.

Cheap DVD players are crap; a Sony one that sells for ~$100 does not even have a separate D/A chip.

My cheap DVD player did produce decent sound at first, but it did become fatiguing after a while. My nephews said it produced "perfect" sound, but what do they know; it's all they've ever heard. They claim that MP3 played back through a ghastly class D headphone amp is "perfect" too. The Sony player is much better in this respect. I know there's way better out there too.
 
It has been demonstrated.

What is the theory behind it? As I understand it, a typical digital output has noise and distortion of -100 dB, say. The vinyl is way, way above this in noise and distortion terms.
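
For reference, that -100 dB ballpark is close to the textbook figure for 16 bits; a quick sanity check of the ideal-quantizer arithmetic (my own numbers, not anything from the demonstrations):

```python
# Theoretical SNR of an ideal N-bit quantizer with a full-scale sine input:
# SNR = 6.02 * N + 1.76 dB (quantization noise treated as uniform white noise).
def quantizer_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

print(f"16-bit: {quantizer_snr_db(16):.1f} dB")  # ~98.1 dB, hence "-100 dB, say"
print(f"24-bit: {quantizer_snr_db(24):.1f} dB")  # ~146.2 dB in theory; real converters do worse
```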

My favourite new term is "diameter loss".
The turntable speed remains constant so groove speed decreases as the playback stylus approaches the inner diameter to approximately 8.3 inches per second at its minimum closing diameter. “Cutting Losses” and “Tracing Losses” become worse as this happens. Cutting Losses occur due to the width of the burnishing facets of the cutting stylus. The combination of high frequencies and reduced groove speed at the inner diameter results in some self erasure. Tracing Losses occur due to the failure of the playback stylus to accurately trace every groove undulation. The stylus will often take shortcuts and miss very small sections of the groove.
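
(The 8.3 in/s figure drops straight out of the geometry, incidentally; a quick sketch, assuming 33⅓ rpm and the roughly 4.75-inch minimum inner groove diameter:)

```python
import math

# Linear groove speed for a constant-angular-velocity LP:
# v = pi * d * (rpm / 60), with d in inches giving inches per second.
def groove_speed_ips(diameter_in: float, rpm: float = 100.0 / 3.0) -> float:
    return math.pi * diameter_in * rpm / 60.0

print(f"outer groove (~11.5 in): {groove_speed_ips(11.5):.1f} in/s")  # ~20.1 in/s
print(f"inner groove (~4.75 in): {groove_speed_ips(4.75):.1f} in/s")  # ~8.3 in/s
```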

It doesn't sound good, does it? Were the tests you referred to using the outside of the LP, or the inside?

In the demonstrations you refer to, are people saying they can just hear a difference, or that the digital version sounds worse? Are we sure there were no differences in the master other than the AD/DA? I hear people often referring to RF lurking on some D/A outputs. If the D/A implementation was poor and suffers from this, then do we know the vinyl cutting system didn't respond badly to it?

Assuming all's fair, it sounds like an amazing feat. If the digital distortion previously measured at -100 dB is audible in amongst the vinyl mush, can it still be measured?
 
The Bybee purifiers have received enough bashing; I don't believe there's anything new to say, and it gets repetitive. It's obviously one of those tweaks that are too weird to be backed up by typical measurements, so I see no point in going over it again. It's on my "I don't know" list.
There are way too many other things that are worth being "investigated".
 
Assuming all's fair, it sounds like an amazing feat. If the digital distortion previously measured at -100 dB is audible in amongst the vinyl mush, can it still be measured?

It is an amazing feat. My personal experience is that when I hear a CD of something that I listened to on vinyl many decades ago, it's like hearing it for the first time. There is way more information than there was on the vinyl.

I've heard some CDs that sounded pretty terrible, but it wasn't artefacts 100 dB below the noise floor that made them sound that way. ;) It was the same things that make vinyl sound terrible- bad mix, or too much compression.

The primary issue underlying digital vs analog is the nature of the distortion, like tubes vs transistors. It's part psychoacoustics and part woo. I try to remain agnostic on the issue.

If there are subsonic or ultrasonic artefacts on CDs, they should be filtered out. Don't most CD players do a good job of that? It seems like it would be easy to do.
 
The primary issue underlying digital vs analog is the nature of the distortion, like tubes vs transistors. It's part psychoacoustics and part woo. I try to remain agnostic on the issue.

Yes, I'm quite prepared to believe that digital distortion is a different, nastier beast from analogue distortion, but it's a question of level. Measurements of real-world dithered digital systems do seem to suggest they are staggeringly pure. If the artefacts really are 90-odd dB down from peak regardless of the type of signal, it's very difficult to imagine them being a problem. Is subtracting the original signal from the A/D-D/A'd one and getting (to all intents and purposes) silence not a valid way of looking at it?
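
(That subtraction is essentially a null test, and the 16-bit leg is easy to fake in software. A minimal sketch with NumPy, assuming TPDF dither; with these settings the residual lands around -87 dB relative to a -6 dBFS tone:)

```python
import numpy as np

fs = 44100
t = np.arange(fs) / fs
x = 0.5 * np.sin(2 * np.pi * 1000 * t)            # 1 kHz test tone at -6 dBFS

# Simulate the 16-bit leg: add TPDF dither, then round to the 16-bit grid.
lsb = 1.0 / 32768.0
dither = (np.random.uniform(-0.5, 0.5, fs) +
          np.random.uniform(-0.5, 0.5, fs)) * lsb
y = np.round((x + dither) / lsb) * lsb

residual = y - x                                   # the "null"
rel_db = 20 * np.log10(np.sqrt(np.mean(residual**2)) /
                       np.sqrt(np.mean(x**2)))
print(f"residual relative to signal: {rel_db:.1f} dB")   # roughly -87 dB here
```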

I can see there's an issue with filtering the input for anti-aliasing, but with oversampling etc. are these problems not pretty well solved?

All things considered, I suspect there may be a little bit of folk superstition involved in the very common assumption that digital audio is evil. When you read about the crude mutilation that non-girl-and-guitar recordings have to undergo to get onto vinyl, you have to laugh.
 
I use neither CD nor turntable these days; all of my listening is done from the computer. But that's not why I mention this. I do so because some of my favorite versions of any given album, when they are available, are the "needledrops". It is amazing to me what a difference it makes sometimes. A needledrop of Dark Side of the Moon is staggering in its clarity compared to the CD...
Food for thought.
 
It's trivially easy to insert a 16/44 A-D-D-A into a signal chain. I've done it myself.

So far, everyone who has tried it and taken care to match levels has not been able to tell the difference (except at pathological volume levels when there's no signal and you can hear the noise floor).
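
The software half of that loop really is trivial; here's a rough sketch of a 16/44.1 "bottleneck" using scipy's polyphase resampler (assuming 96 kHz source material; for listening tests you would still level-match the two paths by RMS):

```python
import numpy as np
from scipy.signal import resample_poly

def bottleneck_16_44(x: np.ndarray) -> np.ndarray:
    """Simulate a 16/44.1 A-D-D-A loop on 96 kHz material:
    resample down, dither and quantize to 16 bits, resample back up."""
    y = resample_poly(x, 147, 320)                 # 96000 -> 44100 (ratio 147/320)
    lsb = 1.0 / 32768.0
    dither = (np.random.uniform(-0.5, 0.5, y.shape) +
              np.random.uniform(-0.5, 0.5, y.shape)) * lsb   # TPDF dither
    y = np.clip(np.round((y + dither) / lsb) * lsb, -1.0, 1.0)
    return resample_poly(y, 320, 147)              # back up to 96 kHz
```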

I tend to doubt that, unless it was with simple signal material (only a few voices, limited dynamic range, probably only listened to for a few minutes). Linear PCM is plagued with rising quantization distortion, exceeding 3% over the lowest 40 dB of its dynamic range; low-rate sampling creates significant dynamic envelope errors as the Nyquist limit is approached (5% at 40% of the sampling rate, for instance, decreasing more or less linearly as the frequency decreases). And, of course, there is no response to speak of beyond 20 kHz, just pre-distorted Gibbs-phenomenon ringing. Not what I'd call high-quality, 'problem-free' reproduction.
 
Dither deals with quantisation distortion.

Gibbs phenomenon is not distortion but the time-domain picture of what is missing in the frequency domain, i.e. it is evidence that the anti-alias filter is working. I have never come across a musical instrument which produces perfect square waves, so the 'problem' is more theoretical than real. There is a real issue with filters, and that is to do with phase, but digital methods such as oversampling ease this considerably.
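
To make the dither point concrete, here's a rough toy demo (my own, not from any of the tests discussed): quantize a -60 dBFS tone to 16 bits with and without TPDF dither and look for harmonics of the tone in the result. Without dither the error is correlated with the signal and shows up as discrete harmonics; with dither it becomes a flat, signal-independent noise floor.

```python
import numpy as np

fs, f0 = 44100, 1000                     # 1 s of a 1 kHz tone: integer cycles, 1 Hz bins
t = np.arange(fs) / fs
x = 10**(-60 / 20) * np.sin(2 * np.pi * f0 * t)   # -60 dBFS, deep in the "problem" zone
lsb = 1.0 / 32768.0                      # 16-bit step size

def quantize16(sig, dithered):
    d = 0.0
    if dithered:                         # TPDF dither, +/-1 LSB peak-to-peak
        d = (np.random.uniform(-0.5, 0.5, sig.size) +
             np.random.uniform(-0.5, 0.5, sig.size)) * lsb
    return np.round((sig + d) / lsb) * lsb

for dithered in (False, True):
    spec = np.abs(np.fft.rfft(quantize16(x, dithered))) / (fs / 2)
    h3 = 20 * np.log10(spec[3 * f0] + 1e-12)       # bin index equals frequency in Hz
    print(f"dither={dithered}: 3rd harmonic at {h3:.0f} dBFS")
# Expect a distinct harmonic spike without dither, and only the
# benign noise floor (well over 100 dB down) with it.
```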
 
Can you cut my master flat?

Sure, but you may not be happy with the results. Mistracking can result from excessive levels. For example bad sibilance or bright cymbals may result in groove modulations too complex to track on even the best playback systems. Excessive bass can result in skipping. Disc cutting engineers take this into account and use their judgement for the best playback results for different systems. Besides, you will hear a difference as the cartridge approaches the inner diameter of the disc. This is called “Diameter Loss”.

Vinyl Mastering FAQs

It's simply funny.
 
One 'funny' as in 'weird' effect of claiming Red Book audio is 'undetectable' is that it has propagated an industry-wide fallacy for a couple of decades that this standard is robust enough for professional work. I suspect that these experiments have been done with pre-compressed audio, not live musical material, which would mandate an extra 10-20 dB of headroom.

The damage done to high-quality audio by this, however unintentional, has been vast, and only partially overcome during the last 15 years or so by the adoption by audio professionals of 24-bit audio with sampling rates of 96 kS/s or higher.
 
One 'funny' as in 'weird' effect of claiming Red Book audio is 'undetectable' is that it has propagated an industry-wide fallacy for a couple of decades that this standard is robust enough for professional work.

The damage done to high-quality audio by this, however unintentional, has been vast, and only partially overcome during the last 15 years or so by the adoption by audio professionals of 24-bit audio with sampling rates of 96 kS/s or higher.

Claims both published and anecdotal are regularly made for audibly superior sound quality for two-channel audio encoded with longer word lengths and/or at higher sampling rates than the 16-bit/44.1-kHz CD standard. The authors report on a series of double-blind tests comparing the analog output of high-resolution players playing high-resolution recordings with the same signal passed through a 16-bit/44.1-kHz “bottleneck.” The tests were conducted for over a year using different systems and a variety of subjects. The systems included expensive professional monitors and one high-end system with electrostatic loudspeakers and expensive components and cables. The subjects included professional recording engineers, students in a university recording program, and dedicated audiophiles. The test results show that the CD-quality A/D/A loop was undetectable at normal-to-loud listening levels, by any of the subjects, on any of the playback systems. The noise of the CD-quality loop was audible only at very elevated levels.
https://secure.aes.org/forum/pubs/journal/?ID=2

Don't blame me. I'm just reporting some research published by the AES.

I can understand that you may as well use higher resolution in the recording studio and when manipulating audio - I myself use 32-bit calculations.
 
I've had people propose to me that they can't hear the difference with MP3 audio encoded at 128 kb/s, and that digital radio at 64 and 32 kb/s per channel is 'professional quality'. These are people with nothing obviously wrong with their hearing.

How far are we willing to be led down this path?

On the other hand, I can readily differentiate playback SQ on my home theatre system between Dolby AC-3, PCM, Dolby TrueHD and DTS-HD Master Audio, and all I have to do is hit a few switches on my remote. Something has definitely gone wrong when native 24-bit audio isn't even available in an audio-only format any more.
 