John Curl's Blowtorch preamplifier part II

Status
Not open for further replies.
Keen observation, Marce, but I am an old man (74) and I started playing the guitar at 15, close to 60 years ago. I had almost 10 years of collecting guitars before I got into serious audio reproduction. I have been designing audio since 1963, starting with a loudspeaker enclosure 52 years ago, so my audio design career spans more than 50 years.
 
I do read and digest most of the stuff, even the bits that are beyond my understanding; it always leads to some interesting reading.
That's a lot of years. As an impressionable 14-year-old (and budding bass player at that time, which I thought was cooler than the bass trombone I played in't brass band, and more likely to get me a girl....) I had a poster of the Grateful Dead's Wall of Sound on my bedroom wall. It blew my mind... I never thought then that I'd be fencing verbally with one of the people involved with it.
:)
 
Disabled Account
Joined 2012
Regarding the present situation with digital clipping: it came about slowly, by trial and error, as part of the loudness wars. It seems that people can't hear occasional clips on a good CD player, so mastering engineers did what their clients wanted: pushed the levels up a little more to see if it helped record sales. It did, at least for a lot of pop music.

I think you give recording people too much credit, especially if their background, or even some of their present-day recording, is with analog recorders. They just plain clip it and don't notice or care. It isn't the mastering engineer; it happened in the studio or garage or bedroom where the multi-channel recording was made.

Most good engineers, if they want it to sound louder, will run things through a limiter/compressor: no clipping, but louder.


THx-RNMarsh
 
I started learning to play the guitar at 15, and at 16, I got into a rock and roll band. By this time I had gone through 2 augmented acoustic and 2 solid body guitars. I loved playing rock and roll, but the band kicked me out, anyway. '-(
I switched over to folk guitar when I fell in love with the sound of a Swedish-made guitar at a music store, and I HAD to have it. I played regularly for several years until that guitar was stolen. '-( I was heartbroken. I worked for years trying to replace my classical guitar, picking up the 12-string in the process. I tried several beautifully made guitars, one Swedish (Goya) and one Mexican, but I just did not like them, and sold them. By this time I was more involved in audio reproduction and my audio career had started. By the way, the 12-string (that I might have killed for) was probably owned by Phil Lesh, the bass player of the Grateful Dead. I can't be sure; it was at someone else's house (the Grateful Dead's manager's).
 
I think you give recording people too much credit, especially if their background, or even some of their present-day recording, is with analog recorders. They just plain clip it and don't notice or care. It isn't the mastering engineer; it happened in the studio or garage or bedroom where the multi-channel recording was made.

Most good engineers, if they want it to sound louder, will run things through a limiter/compressor: no clipping, but louder.


THx-RNMarsh

Actually, if you talk to a lot of recording engineers I think you will find that a lot of them distorted analog tape intentionally, because they liked the sound of tape distortion. Some carefully studied exactly how much overdrive, bias level, tape type, and tape speed affected the sound of the distortion and adjusted their machines to optimize them for the particular type of music being recorded.

If you want something much louder with minimal distortion, then you use a look ahead limiter, such as Waves L1. If you want louder with a compressed sound, you can use a compressor. If you want loud distortion you can also choose clipping. They all sound different. Some engineers use a bit of each, like adding spice to food, not too much of any one flavor may be their preference.
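To make the distinction concrete, here is a minimal, hypothetical sketch in plain Python (not any real DSP library, and with arbitrary illustration values for thresholds and window length) of the three "make it louder" options named above: hard clipping, a static compressor curve, and a look-ahead limiter whose gain is computed from upcoming samples so peaks are tamed without flat-topping.

```python
import math

def hard_clip(x, ceiling=0.5):
    # Clipping: flat-tops the waveform, generating strong odd harmonics.
    return max(-ceiling, min(ceiling, x))

def compress(x, threshold=0.5, ratio=4.0):
    # Static compressor curve: above the threshold, the level grows
    # only 1/ratio as fast, squeezing dynamics without flat-topping.
    mag = abs(x)
    if mag <= threshold:
        return x
    return math.copysign(threshold + (mag - threshold) / ratio, x)

def lookahead_limit(samples, ceiling=0.5, lookahead=8):
    # Look-ahead limiter sketch: the gain for each sample is computed from
    # the peak of the next `lookahead` samples, so the gain can come down
    # *before* a peak arrives and nothing is ever flat-topped.
    out = []
    for i in range(len(samples)):
        window_peak = max(abs(s) for s in samples[i:i + lookahead]) or 1e-12
        gain = min(1.0, ceiling / window_peak)
        out.append(samples[i] * gain)
    return out

# A 0.9-amplitude sine against a 0.5 ceiling: three different "louder" results.
sine = [0.9 * math.sin(2 * math.pi * k / 64) for k in range(64)]
clipped  = [hard_clip(s) for s in sine]
squeezed = [compress(s) for s in sine]
limited  = lookahead_limit(sine)
```

Each of the three produces a different waveform from the same input, which is the point: they are different flavors, not interchangeable ways of getting the same loudness.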

To say that people are just lazy and careless is an attribution on your part as to the nature of their character. It looks very much like you may be committing the fundamental attribution error. http://www.jiss.org/documents/volume_5/issue_1/JISS 2015 5(1) 44-57 FAE.pdf
 
www.hifisonix.com
Joined 2003
Paid Member
Mainly that a number of designs posted here in recent years reach that level; members doing stuff for fun to see how low they can get the distortion. Attached is the multitone plot for the amplifiers I am (slowly) building to run my active speakers. I personally think this is a very good test, but it isn't used much as it needs a very good test set. This waveform is closer to 'real music' than most tests, and although it is steady state, it gives a good idea of the IMD performance.

Based on this single data point, and assuming the amp is not driven into clipping or current limiting, I cannot see anything to suggest this is anything other than transparent to the source. That is of course my personal view, as grot 10 dB below the best noise floor in my room is (to me) undetectable. I am happy in my cloth-earedness.

That is a very mean test - if you can get through it without generating loads of HD I think you are good to go.

Another interesting one would be a nearly full output swing LF signal (say 100 Hz) on which you superimpose 10 or 20 equally spaced low-level signals between 200 Hz and 20 kHz at, say, -30 dBV (as measured at the amplifier output). I would expect that in a less-than-ideal amplifier you will get a varying THD signature as the amp cycles through the LF signal, i.e. this would not be a steady-state distortion test! Analogous to the Bowles stability test, but looking for THD.
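A quick sketch of how such a test stimulus might be generated (pure Python; the sample rate, duration, tone count, and amplitudes are illustrative assumptions, and the -30 dB is taken relative to normalized full scale rather than an absolute dBV level):

```python
import math

FS = 192_000        # sample rate, assumed high enough for 20 kHz probe tones
DUR = 0.1           # seconds of signal
LF_FREQ = 100.0     # the near-full-swing low-frequency tone
LF_AMP = 0.9        # "nearly full output swing" (normalized full scale = 1.0)
N_TONES = 20
HF_AMP = 10 ** (-30 / 20)   # probe tones roughly 30 dB down

# N_TONES equally spaced probe tones from 200 Hz to 20 kHz.
step = (20_000 - 200) / (N_TONES - 1)
probe_freqs = [200 + i * step for i in range(N_TONES)]

def test_signal(n_samples=int(FS * DUR)):
    # Sum of the LF carrier and all probe tones.  Note the combined peak can
    # exceed 1.0 in the worst case, so in practice you would scale the result
    # to the amplifier's actual swing before use.
    sig = []
    for k in range(n_samples):
        t = k / FS
        s = LF_AMP * math.sin(2 * math.pi * LF_FREQ * t)
        for f in probe_freqs:
            s += HF_AMP * math.sin(2 * math.pi * f * t)
        sig.append(s)
    return sig
```

The analysis side would then track the distortion products around each probe tone as a function of position within the 100 Hz cycle, rather than averaging over the whole record.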
 
www.hifisonix.com
Joined 2003
Paid Member
In the Bowles test, the LF signal forces shifts in the device parameters that can provide insights into amplifier stability.

At LF you usually have more loop gain than at HF. I am postulating that the parametric changes introduced by the LF signal could introduce non-linearities in the HF signals that you cannot easily pick up on the LF signal itself; the 7th harmonic of 100 Hz is only 700 Hz, still well down in the band. I think it's easier to just look at the distortion of the higher-frequency tones in the proposed test.
 
Member
Joined 2004
Paid Member
Mainly that a number of designs posted here in recent years reach that level; members doing stuff for fun to see how low they can get the distortion. Attached is the multitone plot for the amplifiers I am (slowly) building to run my active speakers. I personally think this is a very good test, but it isn't used much as it needs a very good test set. This waveform is closer to 'real music' than most tests, and although it is steady state, it gives a good idea of the IMD performance.

Based on this single data point, and assuming the amp is not driven into clipping or current limiting, I cannot see anything to suggest this is anything other than transparent to the source. That is of course my personal view, as grot 10 dB below the best noise floor in my room is (to me) undetectable. I am happy in my cloth-earedness.

The multitone test can be very sensitive, but if the tones are all harmonically related they will hide a lot. "Spectral contamination" testing is the broad term, and there has been some good work on finding lists of tones whose harmonic and IM products do not land on top of other tones. Here is more: http://www.tmr-audio.com/pdf/jon_risch_biwiring.pdf I use this for driver/speaker testing. For line-level electronics I rarely see issues, but YMMV; amplifiers may have issues. For speakers, synchronous averaging helps a lot to get the noise out of the way.
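One well-known way to build such a "non-contaminating" tone list (a general technique, not necessarily the one used in the linked paper) is to place tones on odd prime-numbered FFT bins: any harmonic of a prime bin lands on a composite bin, and the sum or difference of two odd primes is even, so no harmonic or second-order IM product lands on another test tone. A sketch:

```python
def prime_bins(limit):
    # Sieve of Eratosthenes over candidate FFT bin numbers.
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            for m in range(p * p, limit + 1, p):
                sieve[m] = False
    return [i for i in range(3, limit + 1) if sieve[i]]  # skip 2: keep bins odd

bins = prime_bins(500)[:10]   # e.g. ten test tones on odd prime bins

# Verify the claimed property: no 2nd/3rd harmonic and no 2nd-order IM
# product (f1+f2 or |f1-f2|) of any tone pair lands on another tone's bin.
# Note: odd 3rd-order products like 2*f1 - f2 are NOT guaranteed clear by
# this scheme and would need an extra screening pass.
tone_set = set(bins)
dirty = set()
for f1 in bins:
    dirty.update({2 * f1, 3 * f1})
    for f2 in bins:
        if f2 != f1:
            dirty.update({f1 + f2, abs(f1 - f2)})
assert not (dirty & tone_set)
```

With the tones on clean bins, anything that shows up on a non-tone bin is unambiguously a distortion or noise product, which is what makes the test so sensitive.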
 
Administrator
Joined 2004
Paid Member
Actually, if you talk to a lot of recording engineers I think you will find that a lot of them distorted analog tape intentionally, because they liked the sound of tape distortion.
We certainly did back in the 70s-80s. "Pack it on" was what we called it. The idea was that the VU levels had been set for older, lower-permeability tape. It also got the signal up away from the noise, and the tape could take it. I'm sure it added some compression.

Or maybe we just liked the distortion. It's possible.
 
We certainly did back in the 70s-80s. "Pack it on" was what we called it. The idea was that the VU levels had been set for older, lower-permeability tape. It also got the signal up away from the noise, and the tape could take it. I'm sure it added some compression.

Or maybe we just liked the distortion. It's possible.

Tape compression is a type of soft distortion: the harder you push it, the more obvious and distorted it gets.

It seems that among hi-fi amp designers distortion is a bad thing, which makes sense considering what they are trying to do. But for musicians distortion is not always a bad thing. Some people go to great effort to perfect musical distortion, which may be hard for hi-fi amp designers to fathom.
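The "harder you push it, the more distorted" behavior is easy to demonstrate numerically. Here is a toy sketch using tanh as a stand-in for tape's soft-saturation curve (an assumption: real tape transfer functions are more complex and depend on level, bias, and speed), measuring a crude THD of a saturated sine at two drive levels:

```python
import math

def tape_sat(x, drive):
    # tanh as a stand-in for tape's soft saturation; normalized so that
    # full-scale input maps to full-scale output.
    return math.tanh(drive * x) / math.tanh(drive)

def thd(drive, n=1024, harmonics=9):
    # Crude THD of one cycle of a saturated sine: a DFT evaluated only at
    # the harmonic bins, with the harmonic power summed against the
    # fundamental.
    wave = [tape_sat(math.sin(2 * math.pi * k / n), drive) for k in range(n)]
    def mag(h):
        re = sum(w * math.cos(2 * math.pi * h * k / n) for k, w in enumerate(wave))
        im = sum(w * math.sin(2 * math.pi * h * k / n) for k, w in enumerate(wave))
        return math.hypot(re, im)
    return math.sqrt(sum(mag(h) ** 2 for h in range(2, harmonics + 1))) / mag(1)

gentle  = thd(0.5)   # lightly driven: distortion is subtle
slammed = thd(4.0)   # hard driven: distortion is much more obvious
```

Lightly driven, the curve behaves like gentle compression with THD of roughly a few percent; driven hard, it approaches a squared-off wave with THD in the tens of percent, which matches the "push it and it gets obvious" description above.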
 
Disabled Account
Joined 2012
To say that people are just lazy and careless is an attribution on your part as to the nature their character. It looks very much like you may be committing the fundamental attribution error. http://www.jiss.org/documents/volume_5/issue_1/JISS 2015 5(1) 44-57 FAE.pdf

Not really; I am here as a perfectionist, a high level of perfection being the better part of an audiophile. I am in the camp of HD mastering and direct-to-disc, etc., NOT those recordists who play producer with the sound effects of tape saturation, overload, and added distortion. And there are many purists in analog (vinyl) and digital (CD) recording. They avoid compression and clipping as much as possible to produce the most realistic music possible. They leave the production of sound to the musicians and producers.

And, yes, quite a few recordings are just sloppy work.


THx-RNMarsh



 