Funniest snake oil theories

Status
Not open for further replies.
This is a common misunderstanding, or at least a misleading characterization: dither doesn't "obscure" quantization artifacts in the sense of masking them or being louder than them - it destroys the correlation between the quantization error and the signal.
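
The decorrelation claim can be sketched numerically (a minimal illustration, not from the thread; the 997 Hz tone, its level, and the 16 bit step size are all assumptions for the demo). Without dither, the quantization error is a deterministic function of the signal, so quantizing the same signal twice yields identical, perfectly correlated error; with independent TPDF dither on each pass, the error behaves like uncorrelated noise:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 48_000
t = np.arange(n) / 48_000
q = 1 / 2**15                            # one 16-bit LSB for a [-1, 1) range
x = 4 * q * np.sin(2 * np.pi * 997 * t)  # a sine only a few LSBs in amplitude

def quantize(sig):
    return np.round(sig / q) * q

def tpdf(size):
    # Standard TPDF dither: sum of two uniform draws, +/-1 LSB peak
    return (rng.uniform(-0.5, 0.5, size) + rng.uniform(-0.5, 0.5, size)) * q

# Undithered: the error is a deterministic function of the signal, so two
# passes over the same signal give identical, perfectly correlated error.
e_plain_1 = quantize(x) - x
e_plain_2 = quantize(x) - x

# Dithered: each pass gets independent dither, and the total error behaves
# like independent noise instead of a signal-locked artifact.
e_dith_1 = quantize(x + tpdf(n)) - x
e_dith_2 = quantize(x + tpdf(n)) - x

print(np.corrcoef(e_plain_1, e_plain_2)[0, 1])     # 1.0: error repeats exactly
print(abs(np.corrcoef(e_dith_1, e_dith_2)[0, 1]))  # near 0: decorrelated
```

The point is not that the dithered error is quieter - it's that it no longer tracks the signal.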

I know the level and spectrum of an optimal amount of jitter depend quite a bit on the exact performance to be realized, so not all jitter works uniformly well at 'destroying quantization effect correlation' for all signals of a broad-spectrum recording.
 
EVERYONE uses dither, at least on any music CD I've seen since the late '80s. It's a standard part of all production software. Don't know about test CDs...

16 bit professional DAWs and multitrack recorders proliferated until at least the early 2000s (thanks to the 'perfect sound forever' crowd), so there wasn't much to work with regarding dither and noise shaping under such an unforgiving standard.
 
I know the level and spectrum of an optimal amount of jitter depend quite a bit on the exact performance to be realized, so not all jitter works uniformly well at 'destroying quantization effect correlation' for all signals of a broad-spectrum recording.

Jitter? We were discussing dither. Or is this a further rapid movement of the goalposts?
 
Pro Tools HD didn't come out until 2002. Until then, 16 bit digital recording was standard in the professional industry.

And as even Sy would admit, 16 bit sucks for professional use.

But I forgot Sy's 24 bit pro recording system, which he's had since the late '80s. :)
 
This may come as a shock, but there's more than ProTools out there.:D

16 bit recording CAN be done well, it's just much more finicky about level setting. 24 bit gives you more room to maneuver before mastering to 16 bits.

edit: Late '90s, you could get 24 bit soundcards for HOME use. Studios had gotten there long before.
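
The "room to maneuver" point is simple arithmetic: each bit of word length is worth about 6.02 dB of ideal dynamic range, so peaks recorded below full scale spend resolution. A minimal sketch (the 18 dB tracking-headroom figure is an illustrative assumption, not from the post):

```python
def snr_db(bits, headroom_db=0.0):
    # Ideal quantization SNR for a full-scale sine: 6.02*N + 1.76 dB,
    # reduced by however far below full scale the peaks are allowed to sit.
    return 6.02 * bits + 1.76 - headroom_db

print(round(snr_db(16), 1))        # 98.1 dB: peaks right at full scale
print(round(snr_db(16, 18.0), 1))  # 80.1 dB: with 18 dB of tracking headroom
print(round(snr_db(24, 18.0), 1))  # 128.2 dB: the same caution at 24 bits
```

In other words, conservative 16 bit tracking levels can eat nearly three bits of resolution, while the same caution at 24 bits leaves enormous margin.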
 
Studios had gotten there long before.

Only a minority of them - most were still buying your '16 bit is perfect sound forever' line, hook, line, and sinker. Probably some of that 16 bit equipment is *still* being used to make recordings.

I'm trying to imagine a pro studio in 1997 running on PCs with 'high end' sound cards in that notoriously noisy electrical environment. Ain't happening.

16 bit 2 channel recording may be 'ok' if done well. 16 bit multitrack, though, produces results I just don't want to listen to.
 
My naive take on this is that in the days when PCs (however 'high end') were insufficient for professional sound they just didn't use PCs! Instead, they used professional digital sound equipment. Fancy that: professional people in a professional workplace using professional equipment - who would have thought it?

The world did not start when PCs were invented. If anything, they set things back by encouraging people to use inadequate technology simply because it was cheap, 'new' and well-marketed.
 
I'm trying to imagine a pro studio in 1997 running on PCs with 'high end' sound cards in that notoriously noisy electrical environment. Ain't happening.

That's because they didn't. Sound cards were the inevitable trickle down from pro equipment.

Again, if you're careful about peak levels, 16 bit is MORE than adequate. That gives an S/N exceeding nearly all analog recorders. 20 bit (and later 24) just makes things easier.
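
A rough numerical check of that S/N claim (my sketch; the tone level and dither choice are assumptions). Note that a TPDF-dithered 16 bit channel measures near 93 dB rather than the textbook 6.02×16 + 1.76 ≈ 98 dB for an undithered full-scale sine, since the dither itself adds a few dB of noise - either way, well clear of typical analog tape:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
q = 1 / 2**15  # one 16-bit LSB for a [-1, 1) range
x = 0.999 * np.sin(2 * np.pi * np.arange(n) * 997 / 48_000)  # near full scale

# TPDF dither (+/-1 LSB peak), then 16-bit rounding
dither = (rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)) * q
y = np.round((x + dither) / q) * q

err = y - x
snr_db = 10 * np.log10(np.mean(x**2) / np.mean(err**2))
print(round(snr_db, 1))  # ~93 dB measured, dither noise included
```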
 
That's because they didn't. Sound cards were the inevitable trickle down from pro equipment.

SY, I bought certain PC components as one of six purchasers for CompuAdd Corp. here in Austin. That was up until '96. You just helped trigger a memory because I was actually the one person who purchased sound cards for all the PCs we manufactured.

I had one pro company come in and pitch a new sound card for PCs. It came with a much higher price tag than Creative's, and since at the time we didn't really have customers who were musicians or wanted to set up a home recording studio on a PC, we turned the nice fellow down. I'm not sure, but it may have been Roland. I just recall it had much more capability than other cards, and they boasted improved sound and recording. If I had to guess, I'd say this meeting with the rep was around '95.
 
Certainly not recent ones. It would explain percent-level distortion at -60 dB, though.
Yes: with maximum amplification, and one's ears jammed hard against the speaker driver, one can hear digital artifacts in non-dithered material. The Denon CD has that, and for some reason a Kiri Te Kanawa (opera) CD, an early-'80s effort, was mastered at a very low level; at the start you can hear little "glurging" noises which weren't dithered away ...
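
That "glurging" can be illustrated with a sketch (the -60 dBFS tone and the FFT-based tonality measure are my assumptions, not from the post): quantized without dither, the error energy sits at harmonics of the tone, locked to the signal; with TPDF dither it spreads into a smooth noise floor:

```python
import numpy as np

fs, n = 48_000, 1 << 16
q = 1 / 2**15                      # one 16-bit LSB
f0 = fs * 256 / n                  # bin-centered tone to avoid spectral leakage
x = 10 ** (-60 / 20) * np.sin(2 * np.pi * f0 * np.arange(n) / fs)  # -60 dBFS

rng = np.random.default_rng(2)
dither = (rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)) * q

err_plain = np.round(x / q) * q - x            # undithered quantization error
err_dith = np.round((x + dither) / q) * q - x  # total error with TPDF dither

def crest(e):
    # Ratio of the strongest error-spectrum bin to the average bin:
    # high when error energy is concentrated at discrete (harmonic) frequencies.
    mag = np.abs(np.fft.rfft(e))[1:]
    return mag.max() / mag.mean()

print(crest(err_plain))  # large: tonal, signal-locked error components
print(crest(err_dith))   # small: flat, noise-like error
```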
 
And "generic information at the lower levels" would mean what?


No clearly defined imaging, particularly for details. Little identifiable instrumental character. Sound sources with intermittent qualities that mix with or overwhelm each other depending on relative level. Few or no consistent, perceptible individual soundfields for the different sources. At best, background-quality sound that is tremendously fatiguing and irritating at live levels. Unstable variability and perceptible attenuation of sonic character depending on the moment of reproduction, possibly to the point of being unidentifiable at times without prior knowledge. An overall presentation that is washed out and inconsistent, likely with annoying characteristics due to information loss - frequency extremes lacking character and detail, probably unidentifiable much of the time. And so on and so forth - everything many steps below optimum, if identifiably there at all.

I don't experience any thrill listening to a recording that is missing many of the vital parts of a live performance because of foolish lapses in recording quality. Quite the opposite, in fact.
 