Funniest snake oil theories

Well, for a start, at -60 dB the sound level is such that I have to have the volume setting at maximum, and my ear right next to the speaker, to hear any detail in what's going on - the volume for me is then about that of headphones playing at background-music level.

Instrumental character was good, sound sources didn't blur into each other, and I was impressed with the effective quality of the presentation - the major defect was that the background was quite noisy whenever it wasn't masked by the music content. This I experienced back in 1986, and thus I knew that the rubbish running around the traps about digital not being "good enough" was exactly that ...
 
One answer is that you have to be ten times fussier with digital than with analogue - if you think of a resonance peak as a curve representing how quality alters as you adjust system parameters and environment, the Q of analogue is quite low, but that of digital is very high. In other words, until you're so close to the optimum that a pin couldn't slip into the crack, digital can sound pathetic - but a tiny touch more and everything snaps into place, effortlessly ...
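To put some toy numbers on that resonance analogy (entirely my own sketch - nothing measured, and the Q values are just picked to make the shape obvious):

```python
# Toy numbers for the "Q of the sweet spot" analogy above (made up, purely
# illustrative): a low-Q resonance stays near its peak over a wide range of
# mistuning, a high-Q one only within a sliver around the optimum.
import math

def response(r, q):
    # Magnitude of a simple second-order resonance; r = setting / optimum setting
    return 1.0 / math.sqrt((1 - r**2) ** 2 + (r / q) ** 2)

for q, label in [(2, "analogue-like (low Q)"), (50, "digital-like (high Q)")]:
    peak = response(1.0, q)
    r = 1.0
    while response(r, q) > 0.5 * peak:   # walk away from the optimum
        r += 0.001
    print(f"{label}: still above half of peak until ~{100 * (r - 1):.1f}% off optimum")
```

With those made-up Qs, the low-Q curve holds up until you're roughly a third off the optimum, while the high-Q one collapses within a couple of percent - which is the "snaps into place" behaviour described above.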
 
I now think that the sophistication of digital has arrived at the point where it can compete with the best analog. However, the ANALOG part of the digital playback system is pretty marginal and is, I suspect, the 'weak link' of really high quality digital playback. That, along with stable oscillators and really good power supplies.
I am almost excited, these days, that digital sound quality might get up to speed, so to speak.
 
Pro Tools HD didn't come out until 2002. Until then, 16 bit digital recording was standard in the professional industry.

And as even Sy would admit, 16 bit sucks for professional use.

But I forgot Sy's 24-bit pro recording system, which he's had since the late '80s :)

wow haha, I would stop now if I were you ... do you do even basic fact-finding before spouting this straight-up nonsense? Pro Tools was just the first to become part of the popular vernacular

consumer/prosumer level cards (M-Audio, MOTU, RME, E-mu, to name a few off the top of my head) had 24- and even 32-bit in the '90s, and software such as Cubase (Steinberg), Logic (Emagic) and Digital Performer (MOTU) was right there along with them. you could run such things on an Amiga ... hell, even Pro Tools, back when it was Digidesign, was around in 1991, and Akai had 20-24-bit samplers in the late '80s, from memory

and before that it was dedicated hardware, either standalone or controlled via MIDI.
 
I now think that the sophistication of digital has arrived at the point where it can compete with the best analog. However, the ANALOG part of the digital playback system is pretty marginal and is, I suspect, the 'weak link' of really high quality digital playback. That, along with stable oscillators and really good power supplies.
I am almost excited, these days, that digital sound quality might get up to speed, so to speak.

John, both analogue stages and clocking are pretty well sorted - or do you insist that better than 200-500 fs jitter at 1-10 ppm long-term stability is not stable enough for audio, when the wow and flutter (the analogue equivalent of close-in phase noise) of even the best turntables is orders of magnitude higher?
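To put rough numbers on it (my own back-of-envelope, using the 500 fs and 1-10 ppm figures above plus an assumed 0.05% wow-and-flutter spec for a decent turntable):

```python
# Back-of-envelope check, using the jitter/stability figures quoted above and
# an assumed 0.05% wow-and-flutter spec for a good turntable. Worst-case error
# from sampling-clock jitter on a full-scale sine is bounded by the peak slew
# rate times the timing error: 2*pi*f*dt.
import math

f = 20_000            # worst-case audio frequency, Hz
jitter = 500e-15      # 500 fs, per the post above
lsb_16 = 1 / 2**15    # one 16-bit LSB, relative to full scale

err = 2 * math.pi * f * jitter                       # fraction of full scale
print(f"worst-case jitter error: {20 * math.log10(err):6.1f} dBFS")     # about -144
print(f"one 16-bit LSB         : {20 * math.log10(lsb_16):6.1f} dBFS")  # about -90
print(f"0.05% wow/flutter      = {0.0005 * 1e6:.0f} ppm vs. 1-10 ppm crystal drift")
```

So even the worst-case jitter error sits tens of dB below a single 16-bit LSB, and the speed-stability comparison is similarly lopsided.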
 
perhaps you found references to the first realtime hi-def DSP DAW mix-cores via Google and left it at that? that was probably Pro Tools, after they were acquired by Avid. these were the first to use external hardware connected to a Mac that allowed realtime non-linear control of effects in hi-res multichannel, with animated/motorized faders on a hardware mixer that interfaced with the DAW with realtime feedback (well, there was some latency)

prior to that, hi-res meant either dedicated hardware consoles like Fairlight (I think Tubular Bells was done on a Fairlight, but don't quote me on that), or mixdown and effects rendered offline - later with a lower-res preview for mixing in realtime, then rendered offline for the final edit.
 
In addition to the examples provided by Sy and Jcx, it appears no-one has mentioned Monty's excellent tutorial on the effects of dither (and other things):
Xiph.Org Video Presentations: Digital Show & Tell
That's an excellent video! I like his use of analogue test gear to 'prove' the point, and his indefatigability in demonstrating that the digital system is the exact equivalent of the analogue system, just with much better specs. It's genuinely counter-intuitive stuff that even some slightly technical people can't get their heads around. The demo of the sine waves creeping up to 20 kHz unchanged is genius.
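If anyone wants to poke at the dither part themselves, here's a rough sketch of the idea (my own toy code, not taken from the video): a tone smaller than half a 16-bit LSB vanishes completely under plain rounding, but survives - buried in noise - once TPDF dither is added.

```python
# Rough sketch (mine, not from the video): a 1 kHz sine whose amplitude is
# below half a 16-bit LSB. Plain rounding erases it entirely; TPDF dither
# keeps it, just buried in noise. The recovered level is measured by
# correlating the quantised output against the original sine (a crude lock-in).
import math, random

fs, f = 44100, 1000
n = fs * 4                      # 4 seconds, an exact number of 1 kHz cycles
q = 1 / 2**15                   # 16-bit step size (full scale = 1.0)
amp = 0.3 * q                   # tone smaller than half an LSB

def quantise(x, dither):
    d = (random.random() - random.random()) * q if dither else 0.0  # TPDF, +/-1 LSB
    return round((x + d) / q) * q

for dither in (False, True):
    random.seed(1)
    corr = 0.0
    for i in range(n):
        s = math.sin(2 * math.pi * f * i / fs)
        corr += quantise(amp * s, dither) * s
    recovered = 2 * corr / n    # estimated amplitude of the 1 kHz component
    print(f"dither={dither}: recovered tone ~{recovered / q:.2f} LSB "
          f"(actual {amp / q:.2f} LSB)")
```

Without dither the recovered amplitude is exactly zero; with dither it comes back at about 0.3 LSB - the "quantisation error becomes benign noise" point the video makes.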
 
I'm with you, T. I commend you for stating a reality that we share with a number of others.
Take me to your universe...
That, along with stable oscillators and really good power supplies.
I would have a look around the digital world - communication systems, the internet, CERN, even professional audio, never mind normal audio. It gets a bit boring, this Luddite attitude to digital and all things related.
Oh, we have excellent, very efficient power supplies these days - they are called SMPSs.
 
Take me to your universe...

I've got a bit more sympathy about John's position. If a technology is unfamiliar and not in your comfort zone, it's easy to just reject it out of hand. I'm not quite as old as John, but I'm not a youngster, and I have to admit that the only reason that I can be comfortable and intuitive with the notions of digital technology was that I was fortunate enough to be deeply involved with it while I was in my 20s (in the high precision instrumentation world). If I hadn't been, it would be very easy for me to spend a lot of effort rationalizing why it "can't" possibly be good, and waving away decades of data which contradict my prejudices.
 
I've seen reports that many CDs released between 1997 and 2003 were mixed down on 16-bit DAWs, with repeated truncation of signals to 16 bits and no dithering. Truly a recipe for sonic nastiness.

This post had my eyes popping out of their sockets.
I was working in professional recording studios from the age of 16, which would put us at 1999. The main recording system I worked with at that time was a reel-to-reel digital multi-track recorder, 24-bit. This was a middle-of-the-road recording studio, by no means high end, and the equipment was at least a few years old. The studio didn't switch to digital until they felt it offered an advantage over the existing analogue equipment. Given that the equipment costs were well into the hundreds of thousands, I think they were pretty sure of their decision.

When I began recording at home with cheap 'domestic level' equipment, it was at 24-bit, with professional interface cards running 24-bit DACs; that would have been in 2001.

Back then my equipment was old, second-hand gear... I can't imagine a single commercial CD being mixed on a 16-bit DAW.

Even back in the '80s, when 16-bit 1" tape was being used, the output went through a 20-bit DAC into an analogue desk, where the recording would be mixed down and bounced back to tape. I just can't see any mixing occurring in the digital domain until we were well into the 24-bit/96 kHz era.
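For context on the bit depths being thrown around here, the usual rule of thumb is that an ideal N-bit converter gives roughly 6.02 × N + 1.76 dB of dynamic range - a quick check (my own arithmetic):

```python
# Theoretical SNR of an ideal N-bit converter with a full-scale sine:
# roughly 6.02*N + 1.76 dB.
for bits in (16, 20, 24):
    print(f"{bits}-bit: {6.02 * bits + 1.76:.1f} dB theoretical dynamic range")
```

Real converters fall well short of the 24-bit figure, but the extra headroom is exactly what you want when material is processed and bounced repeatedly during a mix.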
 
True :)
A reverse analogy, something for the anti-digital crew to try: watch your high-definition cable or Sky TV through an old CRT next to your high-def LCD screen (if you have them, or do you abhor those as well :D )

yes, but we all know that elevating the clock off the board, connecting only with untwisted shiny stuff and towering parallel caps makes for much blacker blacks, because digital is really analogue ...
 
If a technology is unfamiliar and not in your comfort zone, it's easy to just reject it out of hand.

I agree. I worked in the PC industry starting in '87, and I got to see the inception of several new ideas that are fairly standard now. Even then, we realized that things such as sound and video via PC were "acceptable" to the base consumer, and exciting, but we looked forward to the future: the constant barrage of better, faster parts, the improvements they would bring, and going after markets such as movie studios and recording studios.

I think now it wasn't Roland who came calling, but Ensoniq. I don't know why so many people think certain PC technologies didn't appear until the year 2000, because we certainly knew what could be coming over the horizon and understood it was "in the works", with prototypes in the lab, even in '88.

I got to see full-length movies on the computer screen well before they were available to consumers. :cool: But yeah, I knew an awful lot of engineers who put in tons of weekends just trying to prove it could be done on a PC, and done well.
 
Now that Intel/Microsoft PCs are everywhere and can do (almost) anything, it may come as a shock to our younger viewers that:
1. Personal computers were not invented by Intel/Microsoft - other architectures preceded theirs; the Intel/Microsoft model succeeded because of the marketing might of IBM.
2. PCs arrived quite late in the history of computing, and in some ways were a huge step backwards from which we have still not fully recovered.
3. Computers were widely used for decades before PCs, including high quality graphics.
4. The Internet existed before PCs and for many years PCs were completely useless at networking. Proper computers far exceeded the capability of PCs.
5. Other electronic systems were used very successfully to do jobs which nowadays we do on a PC.
6. During this era we did not all live in caves and grunt - instead we put men on the moon, discovered quantum mechanics, invented radio, the LP, negative feedback, etc.
 
I would have a look around the digital world - communication systems, the internet, CERN, even professional audio, never mind normal audio. It gets a bit boring, this Luddite attitude to digital and all things related.

The Luddites are those with the attitude you express here: forever apologizing for real-world digital's deficiencies by cherry-picking the very best of digital and pretending it is the average. Discussion of digital's "0.001% distortion" is a basic example of this.

Attitudes such as you express here are a major reason that MP3 and even 64 kbit/s/channel digital audio are commonly thought of as all but professional grade today, why 2 dB dynamic-range recordings are so common, and why consumer audio has not progressed past CD except for *video* applications - ironic, because 'traditionally' audio quality was supposed to 'not matter' for video. Yet 24-bit, high-sample-rate multichannel digital audio, not compressed down to near-zero dynamic range, is the norm for Blu-ray discs. Those manufacturers must not have been listening to the Luddites.
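For the record, the arithmetic behind that 64 kbit/s comparison (my own numbers, taking plain 16-bit/44.1 kHz PCM as the reference):

```python
# My own arithmetic: raw CD-quality PCM per channel versus the
# 64 kbit/s/channel figure mentioned above.
cd_rate = 44_100 * 16                     # bits per second, one channel of CD PCM
print(f"CD PCM per channel: {cd_rate / 1000:.1f} kbit/s")               # 705.6
print(f"64 kbit/s implies ~{cd_rate / 64_000:.0f}:1 lossy compression")  # ~11:1
```

That is roughly an 11:1 reduction before you even consider more than 16 bits or higher sample rates.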
 