John Curl's Blowtorch preamplifier part II

Status
Not open for further replies.
And, as long as the response was 20-20,000 Hz at ±0.5 dB, the specifications were met.
The market was more sensitive to features: the noise of the mic preamps, the number and flexibility of the equalizers, automation, the quality of the faders, looks, reliability, etc.
Don't believe that most of the mixing desks (monitors, amplifiers, etc.) were top-tier audiophile gear. It was a fashion market.
Another good example of how the audio industry stagnates when it comes to fully understanding all the factors that matter to subjective quality. Early on, it was established that a certain level of technical competence, measured in a very basic way, was "good enough" to get the job done, so that became the standard method for assessing quality. Of course, that meant the people on the floor who actually had to work with the gear on a day-to-day basis quite often discovered inadequacies, problems, and limitations, and had to work around those flaws to do their jobs properly, if they felt strongly about the integrity of what they were doing.

This continuing lack of desire to delve deeper into what is actually going on when things sound "better" and then don't, for "mysterious reasons", is puzzling ... well, nature abhors a vacuum, so the quagmire of "magical" solutions, etc., will continue to spawn indefinitely, until enough people eventually do the right research and get some decent answers that can be used to properly engineer the equipment, first time round ...
 
Last edited:
Administrator
Joined 2004
Paid Member
Back in the early 70's, those in the know would use the LM301 for higher performance than the 741. If you used the 741 at low enough signal levels, it would actually top 20 kHz without slewing itself to death.

I think I still have some of those LM301 chips in a TO99 case around here somewhere. For sure I have some in the 8 pin DIP. The 741s are parked close by. :)

-Chris
 
Disabled Account
Joined 2012
Back in the early 70's, those in the know would use the LM301 for higher performance than the 741. If you used the 741 at low enough signal levels, it would actually top 20 kHz without slewing itself to death.

I think I still have some of those LM301 chips in a TO99 case around here somewhere. For sure I have some in the 8 pin DIP. The 741s are parked close by. :)

-Chris

Used in the Crown IC-150 line stage, I believe.



THx-RNMarsh
 
Administrator
Joined 2004
Paid Member
Hi Richard,
Bit perfect audio data retrieval is a myth.
Could you expand on the BOLD comment above?
Well, it comes down to the error-correction flags in the DSP used to decode the EFM signal; this is where the digital bits are reconstituted. There are three states you can expect to see when watching the error flags. No flags = perfect recovery, no detected errors. C1 active = errors were detected but were recoverable via the Reed-Solomon error-correction algorithm. The dreaded C2 flag = unrecoverable read error; that means permanent errors that cannot match the master file except by pure accident. That flag goes active more often than you want to think about.

The question is, just what exactly does your transport do with a C2 error? Cheap machines don't care and simply pass the error along to the DAC. A nicer machine will mute that data, so the D/A reproduces a zero (0) level for those values. Still higher up the food chain, another machine will replace the corrupt data with the previous data value. Now for the best transport and DSP sets: these DSP chips look at the previous values and the ones coming up next, then interpolate a value in between so as to make the transition across the samples seamless. It is a best guess, and normally a pretty good one.
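Those four strategies can be sketched as a tiny per-sample routine. A minimal illustration only, not any real player's firmware; the function and strategy names are mine:

```python
# Sketch of the C2-error handling strategies described above,
# applied to one flagged 16-bit sample. "prev" and "nxt" are the
# neighbouring good samples; "bad" is the unrecoverable value.

def conceal(prev, bad, nxt, strategy):
    """Return a replacement for a sample flagged C2 (unrecoverable)."""
    if strategy == "pass":          # cheap transport: pass the garbage along
        return bad
    if strategy == "mute":          # mute the flagged sample to zero
        return 0
    if strategy == "hold":          # repeat the previous good sample
        return prev
    if strategy == "interpolate":   # best guess: average of the neighbours
        return (prev + nxt) // 2
    raise ValueError(f"unknown strategy: {strategy}")

# A corrupt full-scale sample sitting between two good ones:
print(conceal(1000, 32767, 1200, "interpolate"))  # 1100
```

The "interpolate" case is why the best machines sound seamless through minor defects: the guessed value lands between its neighbours instead of producing a click.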

CDs also have a maximum allowed BLER (BLock Error Rate). This is specified for a disc without defects and a CD transport operating flawlessly; the Red Book allows up to 220 block errors per second, averaged. The point is, that figure is for an undamaged CD with flawless stamping and silvering. Now look at your CDs; they are not like that at all. Many are eccentric, and maybe even slightly warped. Pressing defects further distress the situation. Then there is that silvering ... sometimes it is translucent, meaning not all of the beam is sent back into the laser assembly, so low signal levels can also be a problem. These are common problems, even on our "bogey" test discs. The end result is C1 and C2 activity and departures from "bit perfect". Also please consider that read defects do not occur singly, so you will often see the C1 flags come up and then the C2 flags become active as well.

This should also illustrate why you can't get a cheap transport and plug it into the latest high end DAC and get top performance. Not even close, because garbage in = garbage out. The fact that cheap transports don't recover data well is an inescapable fact of life.

Now, about our programs on the even cheaper computer CD-ROMs. Well, the data world does not stop at the audio CIRC code; data CDs and DVDs add further layers of error detection and correction on top of it, which are far more robust than what music CDs rely on. If errors are detected, the read head is returned to the track and the drive tries again, often at a reduced speed. You have all heard a computer CD-ROM grinding away at times; it has just run into a bad area and is re-reading to recover the data. The abysmal quality of the signal coming off a cheap data CD-ROM merely shows just how robust the data error correction is. If they used a good audio CD transport, the error correction wouldn't be exercised very much. Remember, audio CD players don't go back and re-read a section unless they skip back, and that action is unintended, folks!
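The re-read behaviour can be sketched as a simple retry loop. This is a hypothetical sketch, not a real drive API; `read_sector` stands in for the drive's low-level read and is assumed to return the data plus a flag saying whether the error checks passed:

```python
# Minimal sketch of data-CD retry behaviour: unlike an audio player,
# a CD-ROM drive keeps re-reading a failing sector instead of
# concealing the error and moving on.

def read_with_retries(read_sector, lba, max_retries=5):
    """Return (data, attempts), or raise after max_retries failures."""
    for attempt in range(1, max_retries + 1):
        data, ok = read_sector(lba)     # ok = error checks passed
        if ok:
            return data, attempt        # recovered, possibly after retries
    raise IOError(f"sector {lba} unreadable after {max_retries} tries")
```

That retry loop is exactly the "grinding away" you hear: the drive trades time for certainty, which an audio player streaming in real time cannot do.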

Computer data discs and audio CDs are two completely different animals. Where being "bit perfect" isn't required to play music, it is required to load programs or data. The different systems reflect that basic difference.

Now some of you will state that the data coming from the music CD is all fine, no errors. That is because defective data is removed and data that does not violate the data frame is inserted. So the data stream from your audio transport may not contain invalid data, but that has no bearing on whether it is bit perfect or not. All it means is that what leaves the DSP section is valid data, that's all.

Now to really depress some of you. The Nakamichi OMS-5 / 7 uses dual 14-bit D/A converters. That's right, not even 16 bits. Yet those machines sound great! Look up the TDA1540 if you don't believe me. I own and use one of these machines. One day it will be upgraded to a (hopefully) 20-bit decoder.

If any of you want "bit perfect" playback, you will have to gain access to the digital file for the music and transmit that. A data DVD or CD-ROM will then be able to recover all your ones and zeros to recreate that file. From experience I can tell you that "bit perfect" playback isn't necessary for wonderful sound. However, you will need a really good transport / DSP section to restore order among those little ones and zeros. The recreation will then be close, but not "bit perfect", and you will not be able to hear the difference. Chasing "bit perfect" reproduction is pretty much a fool's errand in my view, because it does not matter to the playback experience.

-Chris
 
Last edited:
Disabled Account
Joined 2012
Hi Chris,

From your practical experience, would you say it is possible to detect a difference (delta) between a master file and a CD made from it, if that CD had the 'typical' errors and fill-in going on? Not that the CD wouldn't still be enjoyable... but a detectable difference. [Yes, I know it's a broad question.]



THx-RNMarsh
 
Count on the peanut gallery to distort the issue. Dick is asking about distortion not harmonic distortion.

Yes I have listened to and looked at the Waslo demo.

SY

If you PM me an address I will send you a calendar. It is now past 1978 and there is quite a bit of newer research.

George

As perhaps the best researcher here, why not try to find a web version of Vernon B. Mountcastle's 1978 piece "The Mindful Brain". It was a major leap in changing much of neuroscience.

ES

PS. You might want to read Ian Hegglund's piece in Linear Audio. I think it was in #4.
 
Administrator
Joined 2004
Paid Member
Hi Richard,
I think you could detect a lot of errors that were consecutive, but at the normal rate for a good machine and media, I don't think so. At worst you might have enough errors to give you a sense that something is off, or to become irritated. I don't know that anyone could honestly put their finger on what "it" was.

One worst-case example might be a full-scale bit error in a non-oversampling D/A in a cheap DSP-type player. The 7th-order low-pass would mow it under. Now, if some happy soul removed that 7th-order low-pass filter, the error might make it through to the amplifier. Given the amount of 44,100 Hz signal that would also make it through, there just might be a dead tweeter and a disappearing Zobel, making the entire point moot.

You really would need a long string of errors, bad enough to be flagged as bad but not severe enough to mistrack. The longer the string of errors gets, the more certain it is that you would notice something. Keep in mind that we are treating each sample as one event and that there are 44,100 of them per second (oversampling merely stretches an error string out).
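The arithmetic behind "each sample is one event" is simple enough to sketch; the numbers are just the standard CD sample rate, not from any measurement:

```python
# How long a run of bad samples lasts at the CD rate of
# 44,100 samples per second per channel.

def burst_ms(n_samples, rate=44100):
    """Duration of an n_samples-long error burst, in milliseconds."""
    return 1000.0 * n_samples / rate

print(burst_ms(441))    # 10.0 -> a 441-sample burst lasts 10 ms
print(burst_ms(4))      # a handful of concealed samples lasts well under 0.1 ms
```

Which is the point above: a handful of interpolated samples is over in a fraction of a millisecond, while a burst long enough to last tens of milliseconds is where you would start to sense something is off.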

To be honest, Richard, the answer might really depend on what the DSP does about bad values. A really nice machine might "duck" the sound a bit, while a cheap CD thing might screech away. If the subcode is affected, the machine might either get stuck there or attempt to skip ahead. That, you would notice.

-Chris
 
Why are we discussing CD players? They are going the way of the Dodo bird, for all the reasons mentioned and many others.

And why shouldn't they? Do we want to be stuck with a format that uses 16 bits and a nonstandard (for the rest of the visual world) sample rate? The sooner we shift over to 48/96 kHz sample rates the better. And I think everyone here would agree that 24 bits would be better; whether it is needed is a different discussion.

Alan
 
Last edited:
AX tech editor
Joined 2002
Paid Member
Hi Richard,
The dreaded C2 error flag = unrecoverable read error. That means permanent errors that cannot match the master file except by pure accident. This flag goes active more than you want to think about.
-Chris

Hi Chris,

Years ago I had a special test unit (IIRC it was an Elektor project) that would indicate and count the various errors. Lots of recoverable errors, of course, often up to 10,000 on a single CD. But I never, never saw any unrecoverable errors, except on one single CD that had a wide scratch across the surface. All the others (and I tried dozens) never showed unrecoverable errors. So in my world, bit-perfect copies are the norm of the day.

Jan
 
So in my world, bit perfect copies are the norm of the day.
Happy man. I remember the Sony DAT I still have in my home rack. We had equipped it with a red LED that lit up on errors. When the cassettes were not absolutely new, it could have served as a power LED with some of them.
Since we used it to store our sound effects library in our post-production facility, that was the reason we decided to move to SCSI hard disks.
I said previously that I had some CDs which were stored in a wet basement for some years. It seems fungus loves the metallization. Some of them are absolutely silent now.
 
Last edited: