How to calculate dB into distortion?

Status
Not open for further replies.
How does one determine how many db into distortion a tube is being driven? A brief blurb on a website says to drive each successive stage only 6-9db into distortion to get smoother distortion than just slamming one stage.

The concept makes sense, but how does one determine how many dB a tube is being overdriven?

From http://www.geofex.com/tubeampfaq/TUBEFAQ.htm#BluesDistortion

A recently voiced although intuitively applied idea in distortion is that tube distortion sounds best when each successive distortion stage is overdriven by less than about 12 dB. This has the effect of keeping the tubes inside the area where the signal is more compression-distorted than clipped. That is what those resistive divider chains between distortion stages are for inside those distortion preamp schematics. Mesa's distortion preamps are another good example.

Overdriving a tube stage too much gives you harsher clipping, not the singing, sweet distortion we want. To really get sweet, crunchy distortion, keep each stage that goes into distortion no more than 6-9 dB into distortion.
 
6 dB is 2:1 in voltage. 9 dB is 2.8:1. 12 dB is 4:1.

If your amplifier just-clips at, say, 1 V input, then this rule of thumb says to push it to 2 V or 3 V, but not 4 V.
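The arithmetic above is just the standard 20·log10 voltage-ratio conversion. A quick sketch (the 1 V clip level is the hypothetical example from the previous sentence, not any particular amp):

```python
import math

def db_to_ratio(db):
    """Convert a voltage gain in dB to a linear voltage ratio."""
    return 10 ** (db / 20)

# Hypothetical stage that just-clips at 1 V peak input:
clip_level = 1.0

for db in (6, 9, 12):
    drive = clip_level * db_to_ratio(db)
    print(f"{db} dB into distortion -> drive to about {drive:.2f} V peak")
```

Conversely, if you know the drive level and the clip level, 20·log10(drive/clip) tells you how many dB into distortion you are.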

I would not take it as a "Hard Rule", but as a starting point for a prototype, to be adjusted by ear.

(It may be mildly amusing to note that the same rule works for simple "log amps": a long chain of amplifiers, with a rectifier on each stage. Each amplifier clips. Each amplifier has the same gain, commonly 2 (6 dB!). All the rectifiers are averaged together. This gives a DC output similar to the log of the input, V/dB. If the per-stage gain is 10 dB the output is lumpy; if under 6 dB it's just more stages for no good reason.)
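The log-amp aside above can be sketched numerically. A toy model (hypothetical stage gain and clip levels, not any real circuit): each stage has a gain of 2 and hard-limits at ±1, and the rectified outputs are summed. The sum grows roughly linearly with the input level in dB:

```python
def clipped_stage(v, gain=2.0, clip=1.0):
    """One amplifier stage: gain of 2 (6 dB), hard-limited at +/- clip."""
    return max(-clip, min(clip, gain * v))

def log_amp(v_in, n_stages=8):
    """Chain of clipping stages; sum the rectified output of every stage.
    Each 6 dB of extra input saturates roughly one more stage, so the
    sum rises about one unit per 6 dB -- a crude log-of-input detector."""
    total = 0.0
    v = v_in
    for _ in range(n_stages):
        v = clipped_stage(v)
        total += abs(v)  # ideal full-wave rectifier on each stage
    return total

for level_db in (-40, -30, -20, -10, 0):
    v = 10 ** (level_db / 20)
    print(f"{level_db:4d} dB in -> detector sum {log_amp(v):.2f}")
```

With hard clipping and 6 dB per stage the steps are a bit lumpy, as PRR notes; smaller per-stage gain just adds stages without smoothing much more.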
 
Hmm, so if I pick a DC operating bias point of -2 V, then it will start distorting at 2 V peak input signal to that stage, right? The grid starts conducting at essentially 0 V. So isn't it hard clipping if the input signal is 4 V peak?
 
Hmm, so if I pick a DC operating bias point of -2 V, then it will start distorting at 2 V peak input signal to that stage, right?
In my experience, it's rarely quite as crisp and simple as that. As PRR said, best to listen and monitor with a 'scope at the same time, so you can use your ears and eyes to tell you when it "starts" distorting.

This isn't woo-woo and black magic; there are many real reasons for the inexact nature of this. For one, grid current flow starts well before the grid reaches 0 V, and may or may not cause visible changes to the waveform depending on the source impedance of the signal. For another, it takes several percent distortion before your ear notices it's there (especially with guitar as the source), so the moment at which a guitar stage "starts distorting" is a subjective thing.
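That "several percent before you hear it" point can be illustrated with a crude stand-in for a compressing tube stage. Here tanh() is used as a hypothetical soft-clipping transfer curve (not a real triode characteristic), and the THD of a sine pushed through it is estimated from its harmonics:

```python
import math

def thd_percent(drive, n=2048):
    """THD (%) of a sine of peak amplitude `drive` through a tanh
    soft-clipper, estimated from harmonics 2..9. A crude one-knob
    model of a compressing stage, not a real tube curve."""
    samples = [math.tanh(drive * math.sin(2 * math.pi * k / n))
               for k in range(n)]

    def mag(h):
        # Magnitude of harmonic h via a single-bin DFT.
        re = sum(s * math.cos(2 * math.pi * h * k / n)
                 for k, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * h * k / n)
                 for k, s in enumerate(samples))
        return math.hypot(re, im) * 2 / n

    fund = mag(1)
    return 100 * math.sqrt(sum(mag(h) ** 2 for h in range(2, 10))) / fund

for drive in (0.25, 0.5, 1.0, 2.0):
    print(f"drive {drive:>4}: THD ~ {thd_percent(drive):.1f}%")
```

The distortion rises gradually with drive rather than switching on at a threshold, which is one reason the "starts distorting" point is a judgment call by ear and 'scope.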

-Gnobuddy
 