Audibility of distortion in horns!

Status
Not open for further replies.
that's a little asinine and does not offer any clarification.

Well, IMO what I wrote is actually a good, short, simple explanation. I have no idea of your knowledge base, so if you want to understand it better you could help me understand what you do and don't follow: ask a question and I can take it from there.

Is it the spectrum, and its meaning, that you don't understand? The time domain? I have no idea what stopped you from making sense of my post.

Also, as I wrote, gedlee's later post dives a bit further into the subject.

And I'm not an assassin.. ;-)
 
That, and the too-often-ignored reality that we're talking about mechanical systems that do not, in fact, behave like our oversimplified theoretical or electronic models. Cone breakup is commonly modeled as a simple resonance and we look at it with a single-tone sweep, ignoring what "breakup" actually means: that different parts of the cone are moving in different directions, or with different velocities, at the same time, and are not tracking the voice coil at all. The question becomes not just what that "sounds like" at the frequency at which it is occurring, but what the "rest of the music" sounds like while that is happening.

Yes, one of many sources of intermodulation and (nonlinear) distortion in moving coil speakers.

I argued with S. Linkwitz about this on another board 10-15 years ago; he did not believe it back then.
 
gedlee said:
The results of my test of distortion did not indicate any difference in symmetric versus non-symmetric nonlinearities. There is no scientific data that says that this is the case and no reason to believe that it would be the case based on the hearing mechanism.

And I did not say so either. If you read posts #17 and #18 again you'll see.

What I was saying was basically that this idea of a "pleasing" 2nd-order HD is flawed thinking. As you know, any nonlinearity will result in modulation distortion.

But what you write is actually incorrect, even though I did not address it specifically and even though you did not find this in your studies.

The envelope and the AM distortion are clearly different when a high-frequency tone is modulated by a fixed low-frequency tone, depending on whether the nonlinearities are dominantly symmetric or asymmetric. Remember, our ears are spectrum, phase and envelope detectors in varying degrees depending on the frequency range.
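The symmetric-versus-asymmetric point can be illustrated numerically. Below is a minimal Python/NumPy sketch; the tone frequencies and polynomial coefficients are arbitrary choices of mine, not anything from this thread. A two-tone signal is passed through a dominantly asymmetric (even-order) and a dominantly symmetric (odd-order) nonlinearity: the even-order one treats positive and negative half-waves differently, which is what alters the envelope differently, and intermodulation sidebands appear around the high tone.

```python
import numpy as np

# Two-tone test signal: a low tone "modulating" a high tone, 1 s at 48 kHz.
fs = 48000
t = np.arange(fs) / fs
x = 0.5*np.sin(2*np.pi*50*t) + 0.5*np.sin(2*np.pi*3000*t)

def asym(x):   # dominantly asymmetric (even-order) nonlinearity
    return x + 0.2*x**2

def sym(x):    # dominantly symmetric (odd-order) nonlinearity
    return x + 0.2*x**3

# A symmetric nonlinearity treats positive and negative half-waves alike;
# an asymmetric one does not, which is what shifts the envelope differently.
assert np.allclose(sym(-x), -sym(x))
assert not np.allclose(asym(-x), -asym(x))

# Either way, intermodulation sidebands appear around the 3000 Hz tone
# (at 3000 +/- 50 Hz here, from the 2nd-order term):
spec = np.abs(np.fft.rfft(asym(x))) / len(x)
assert spec[2950] > 0.001 and spec[3050] > 0.001
```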
 
AllenB, good point; I was going to make that myself. Even though we call it "breakup", it is in fact simple resonances in the body of the diaphragm, and it is, for the most part, linear. This means that it cannot generate NLD; the vibration would have to be excessive for that to happen. The oil-can example is the only one that I see here, or have seen, that exhibits a nonlinear vibration of the cone. The harmonics that we see are all created by the cone's natural motion in its nonlinear BL field, or inductance, compliance, etc.
 
The envelope and the AM distortion are clearly different when a high-frequency tone is modulated by a fixed low-frequency tone, depending on whether the nonlinearities are dominantly symmetric or asymmetric. Remember, our ears are spectrum, phase and envelope detectors in varying degrees depending on the frequency range.

Which does not prove that it is audibly different. The ear is not a highly sensitive linear detector. It has all kinds of limitations; to wit, perceptual coding.
 
I think there are a host of nonlinearities in compression drivers (CDs) and horns, as shown in the paper below.

http://doc.utwente.nl/58981/1/Schurer94modeling.pdf

Also, this is a very nice summary of the distortions generated by compression drivers, from Grunberg, EP 0340265 A1 (1989):
The elasticity resulting from the resistance of a gas to compression and rarefaction was first investigated scientifically by Robert Boyle in 1662. He showed that under isothermal conditions the pressure, p, of a given mass of gas varies inversely with its volume, v, i.e. pv = c, where c is a constant. This is the relationship for so-called 'ideal' gases.

The behaviour of air as a medium for sound propagation does not comply with isothermal conditions. The rapidity of successive compressions and rarefactions determines adiabatic conditions. S.D. Poisson showed that the relationship between the pressure and volume of a gas under adiabatic compression or expansion is of the form pv^γ = c, where γ is the ratio of the specific heat of the gas at constant pressure to the specific heat at constant volume. For air γ = 1.41 and c = 0.726.

Poisson's relationship shows that equal positive and negative increments of pressure imposed upon equal masses of air cause unequal changes in volume, the volume change for positive pressure being less than that for negative.

The propagation of waves of finite amplitude was treated mathematically by Poisson in 1808. As a consequence of the above relationship, he showed that in general acoustic waves cannot be propagated without a change in form, and as a result additional harmonic frequencies are parasitically generated from the energy present in the fundamental. The theoretical magnitudes of the energies of the harmonics can be obtained from the solution to the exact differential equation of wave propagation in air. With respect to loudspeakers, the problem was first treated in 1933 by M.Y. Rochard, who investigated the production of harmonics in horns due to the non-linearity of air.

In the theory and design of electro-acoustic compression drivers, the linearity of the volume displacement of the diaphragm for applied AC voltages has been a neglected factor. It has been assumed that for a sinusoidal electrical input the displacement of the diaphragm is sinusoidal, resulting in a sinusoidal change in the pressure of the air between itself and the phasing plug. (L. Beranek, Acoustics, p. 272.)
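The pressure/volume asymmetry the patent quote describes is easy to check numerically. A minimal Python sketch follows; the ambient pressure and the 5 kPa excursion are illustrative values of my own, chosen large enough to make the asymmetry obvious.

```python
# Poisson's adiabatic relation p * v**gamma = c. Equal positive and negative
# pressure steps give unequal volume changes, the seed of harmonic generation
# in air.
gamma = 1.41
p0, v0 = 101325.0, 1.0            # ~1 atm in Pa, unit volume
c = p0 * v0**gamma

def volume(p):                    # invert p * v**gamma = c for v
    return (c / p)**(1.0 / gamma)

dp = 5000.0                       # a (large) acoustic pressure excursion, Pa
shrink = v0 - volume(p0 + dp)     # |dv| under compression
grow = volume(p0 - dp) - v0       # |dv| under rarefaction
assert shrink < grow              # positive pressure changes volume less
```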
 
I freely admit my understanding of distortion is not complete.
Much of my confusion arises because there are so many types and causes of distortion that creating "sub-classes" only confounds me more. It's an oversimplification, but for me anything that wasn't part of, or is different from, the "original signal" I consider distortion.
I guess my view is that if a "device" distorts or changes the signal it is "non-linear", which may be a semantic point.
To help me sort some of this out and understand the linear and non-linear definitions/classes of distortion: is clipping considered linear or non-linear?

I can think of many instances where a waveform's shape is modified (but not clipped), so what class does that belong to?

The term distortion is often used to describe differences in frequency response (which to me is not distortion at all, merely differences in level at specific frequencies); which class heading does that go under?
 
"Clipping" in the sense used by PMA will include voltage clipping, current clipping and slew-rate clipping/limiting.
All of these are non-linear distortions. They are sudden changes.

Whereas changing the proportions of the harmonics in a signal due to non-linear amplification is a linear distortion. These change slowly as the signal level varies.

Keeping the language simple works for me. That is why I could not understand the earlier stuff about linear and non-linear.
 
A single-ended amplifier develops a bit of 2nd harmonic. At low levels the 2nd-harmonic distortion is very low.
As the level increases, the distortion increases.
As the output signal approaches the supply rails, one can clearly see the waveshape changing: one half becomes pointed and the other half becomes rounded. This is the classic 2nd harmonic of a single-ended stage. It is what the single-ended VAS stage does.
This changed waveshape is caused by the non-linear amplification as the signal level changes. It is a gradual change. It is linear. Simulators can predict it. Scopes can display it. Lots of diagrams show why it happens.

The earlier example of rarefaction and compression of air is the same 2nd-harmonic distortion mode. It is a linear effect and can be modelled as a gradually changing effect as the level is changed.
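For what it's worth, the waveshape change described above is easy to reproduce with a toy 2nd-order transfer function. The sketch below is mine (the 0.2 coefficient and 100 Hz test tone are arbitrary choices, not from any amplifier model in the thread); it shows the 2nd harmonic growing with drive level and one half-wave ending up compressed while the other stretches.

```python
import numpy as np

fs = 8000
t = np.arange(fs) / fs

def stage(x):                  # toy single-ended stage: gain plus a 2nd-order term
    return x - 0.2*x**2

def hd2(level):                # 2nd-harmonic level relative to the fundamental
    y = stage(level*np.sin(2*np.pi*100*t))
    s = np.abs(np.fft.rfft(y))
    return s[200] / s[100]

# Distortion rises with drive level: low drive is clean, high drive is not.
assert hd2(0.01) < 0.005 < hd2(0.9)

# Near full drive the waveshape is visibly asymmetric: one half-wave is
# compressed (rounded) and the other stretched (pointed).
y = stage(0.9*np.sin(2*np.pi*100*t))
assert y.max() < -y.min()
```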
 
Back to its audibility in horns.
I guess statements like "distortion in horns is irrelevant" just get my goat, because I'd like to feel that the years of replacing diaphragms and spending exorbitant amounts of money on new horns, in a seemingly endless quest to improve the fidelity of my PA, weren't an exercise in futility.
 
Thanks for mentioning that, X!
Crossover distortion in an amp is what I've thought of as non-linear, and the sort of thing that I can readily hear in a horn. Personal preference for high-end horns or tweeters: class A all the way. I can live with worsening specs as frequency decreases.
 
If the transistor models can accurately portray the behaviour at very low Ic/Id values, then the simulator can use its mathematical algorithm to calculate the output signals, and in particular the difference between the upper- and lower-half output signals just either side of the crossover.
This seems to me to be a gradually occurring change as levels change.

Does that make crossover distortion a linear distortion?
 
I know I'll be told I'm wrong, but I consider "crossover" a non-linear distortion. Then again, I was wrong about "clipping" too.
The type of amplifier distortion doesn't change its audibility in a horn, tweeter or speaker. It does make me wonder whether the type of distortion, plus its level, added to all the other non-linear attributes of a horn, makes things worse, or whether it can have "masking" qualities that put its audibility/detection in question.
 
It's really much simpler than you think, and the established terms do make sense.

Linear = changes the signal by the same proportion no matter the signal level. Your EQ circuit does the same job whether you put 1 mV or 1 V into it.

Nonlinear = changes the signal by different amounts depending on the input signal level. A low level means almost no distortion, a medium level means more HD and IMD, and a high level may result in clipping.

Linear = changes the signal linearly vs. level.

Nonlinear = changes the signal nonlinearly vs. level.

The simple rule of thumb >>> nonlinear distortion means you will see new spectral components, new tones, in the signal.
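These two definitions can be turned into a concrete test. In the Python sketch below (the toy two-tap filter and the 0.7 clip threshold are my own choices), the linear "EQ" treats a 1000x-smaller input identically up to scale, while the clipper fails that test and adds a new tone, matching the rule of thumb above.

```python
import numpy as np

fs = 8000
t = np.arange(fs) / fs
x = np.sin(2*np.pi*100*t)       # a 100 Hz test tone, exactly periodic in 1 s

def eq(x):                      # toy linear "EQ": a two-tap averaging filter
    return 0.5*x + 0.5*np.roll(x, 1)

def clip(x, limit=0.7):         # nonlinear: hard clipping at a fixed threshold
    return np.clip(x, -limit, limit)

# Linear: 1 mV in and 1 V in get exactly the same treatment (homogeneity).
assert np.allclose(eq(0.001*x)*1000, eq(x))
assert not np.allclose(clip(0.001*x)*1000, clip(x))

# Nonlinear distortion means new spectral components: clipping the 100 Hz
# tone creates a 3rd harmonic at 300 Hz; the linear filter creates nothing.
assert np.abs(np.fft.rfft(clip(x)))[300] > 1.0
assert np.abs(np.fft.rfft(eq(x)))[300] < 1e-3
```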
 
OK, I think I know where I'm going wrong in my thinking.
I always viewed clipping as linear because it occurs at a specific point, but the result/by-product (the new component) is non-linear.

I don't think it makes sense to say that a distortion product is linear. When we talk about linear and nonlinear we talk about the transfer function as such.

Distortion is the effect; a nonlinear transfer function is the cause.

I also don't think it makes sense to call something linear only because it happens at a known/specific point. The signal level still affects the result, hence the mechanism is nonlinear.
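That last point can be shown in a few lines. This is a sketch of mine (the threshold value is arbitrary): the clip point is fixed, yet whether the signal is altered depends entirely on its level, which is exactly what makes the operation nonlinear.

```python
import numpy as np

def clip(x, limit=1.0):        # the clip point itself is fixed...
    return np.clip(x, -limit, limit)

x = np.linspace(-2.0, 2.0, 9)

# ...and below it the clipper is transparent:
assert np.allclose(clip(0.1*x), 0.1*x)

# ...but the operation still fails the linearity (homogeneity) test,
# because what it does to the signal depends on the signal's level:
assert not np.allclose(clip(2*x), 2*clip(x))
```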
 