Are 1W/1m sensitivity plots comparable?

Hello,

I am struggling to understand the comparability of sensitivity measurements.

Example:

A driver which is called a 2 Ohm driver has Re = 1.3 Ohm and a minimum impedance of 2.1 Ohm.
A driver which is called an 8 Ohm driver has Re = 4.5 Ohm and a minimum impedance of 4.5 Ohm.


One question is:

The impedance value (2/8 Ohm) is a mean value, as I understand it. But how is it calculated?

Is an 8 Ohm driver with Re = 4.5 Ohm still an 8 Ohm driver, or more like a 7 Ohm driver?

I am struggling with how much voltage I should set to get a 1W/1m sensitivity measurement. For an 8 Ohm driver I should apply 2.83 V to equal 1 W, but 8 Ohm drivers differ: the Re range goes from 4.2 Ohm to 6.2 Ohm.

Does that make a difference for which voltage I have to set? Or is it correct to apply 1 kHz and set 2.83 V as long as the driver is labeled 8 Ohm?
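To make that concrete in numbers, here is the arithmetic as a small Python sketch, using U = sqrt(P·R) with the Re values mentioned above (whether Re or the nominal 8 Ohm is the right R to plug in is exactly what I am unsure about; the helper name is just for illustration):

```python
import math

# Voltage needed to put a given power into a given resistance: U = sqrt(P * R).
def drive_voltage(power_w, resistance_ohm):
    return math.sqrt(power_w * resistance_ohm)

# Re spread of real-world "8 ohm" drivers vs. the nominal value.
for r in (4.2, 6.2, 8.0):
    print(f"{r:.1f} ohm -> {drive_voltage(1.0, r):.2f} V for 1 W")
# 4.2 ohm -> 2.05 V, 6.2 ohm -> 2.49 V, 8.0 ohm -> 2.83 V
```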

Thank you!
 
Nominal impedance found in the datasheets of speakers is indeed rarely exactly one of the simpler values like 2, 4, 8 or 16 ohms. The values are mostly rounded down or up to one of these commonly used numbers.

If you want to get close enough, take the manufacturer number. If you want to be really exact, use the minimum of a measured impedance curve of the speaker (chassis in enclosure).
 
Thanks for the input,

The reason I ask is, for example: a 2 Ohm driver built into a cabinet can show an impedance minimum of 0.8 Ohm. Therefore, if we calculate the voltage input for 1 W, it needs to be 0.89 V (U = sqrt(P·R), from P = U^2/R) instead of the 1.41 V we get from the 2 Ohm value.

20·log(1.41 V / 0.89 V) ≈ 4 dB difference. That is quite a lot.

So my conclusion is that it is not possible to determine a 1W/1m sensitivity measurement without taking an impedance plot. Is that correct? Thank you!
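The same comparison as a quick Python check (0.8 Ohm minimum vs. the 2 Ohm label, numbers as above):

```python
import math

u_nominal = math.sqrt(1.0 * 2.0)  # 1.41 V for 1 W into the 2 ohm label
u_minimum = math.sqrt(1.0 * 0.8)  # 0.89 V for 1 W into the 0.8 ohm minimum
print(20 * math.log10(u_nominal / u_minimum))  # ~4.0 dB offset
```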
 
The way typical speakers are used for home audio today, voltage sensitivity is more relevant. When connected to an amp with low output impedance, the loudness is proportional to voltage, and the amp will supply whatever current is required into the varying impedance (within reason). That's why it's more common to see drivers rated at 2.83 V these days, instead of at 1 watt. Do you have a specific reason for being concerned with 1-watt ratings?

https://www.audiosciencereview.com/...bout-nominal-vs-real-speaker-impedance.27994/
"IEC 60268-5 specifies that the lowest value of the impedance magnitude within the rated frequency range shall not be less than 80% of the rated impedance. The standard also requires that if the impedance at any frequency outside the rated frequency range (including DC) is less than 80% of nominal impedance, this should be stated in the specifications."
 
Sensitivity measurements should always be done at a specified VOLTAGE, the standard being 2.83 V. Sensitivity should not be quoted in terms of W/m.

EFFICIENCY measurements are the ones that should be done at a specified WATTAGE (typically 1W), but I believe those are usually derived from the sensitivity measurements, using the system's "nominal impedance".
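A minimal sketch of that derivation, assuming the usual convention that the rating is referenced to 2.83 V (i.e. 1 W into 8 Ohm):

```python
import math

# Convert a 2.83 V/1 m sensitivity into a 1 W/1 m figure via nominal impedance.
def spl_1w_from_2v83(spl_2v83_db, nominal_ohm):
    power_at_2v83 = 2.83**2 / nominal_ohm            # watts drawn at 2.83 V
    return spl_2v83_db - 10 * math.log10(power_at_2v83)

print(spl_1w_from_2v83(90.0, 8.0))  # ~90.0 dB: 2.83 V is 1 W into 8 ohm
print(spl_1w_from_2v83(90.0, 4.0))  # ~87.0 dB: 2.83 V is 2 W into 4 ohm
```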
 
Hi all,

That's why it's more common to see drivers rated at 2.83 V these days, instead of at 1 watt. Do you have a specific reason for being concerned with 1-watt ratings?

Mattstat, the reason I am asking: suppose one wants to compare, for example, two subwoofers, both loaded with 8 Ohm speakers. One draws far more than one watt when driven with 2.83 V, the reason being a lower impedance curve. BUT why does one need a 1 W or 2.83 V measurement if one cannot calculate the maximum SPL of the speaker in the end, since the current drawn is unknown when the minimum impedance and the wattage are unknown?

Of course, maximum SPL ultimately depends on nonlinearities and power compression, but a simple logarithmic calculation of, let's say, 100 dB 1W/1m plus amp gain must be possible for comparison with other subs.

The issue is, the sensitivity of a 2 Ohm speaker driven with 2.83 V will be 6 dB higher than that of the 8 Ohm driver (assuming equal efficiency), but the sub will not be louder in the end, as the 2 Ohm driver already draws 4x the wattage at 2.83 V compared to the 8 Ohm driver.

The same applies among 8 Ohm drivers: there are 7 Ohm and 9 Ohm drivers labeled as 8 Ohm, so is there any value in applying 2.83 V to both of them?

I cannot compare them in the end if I apply identical voltage! I have no way to calculate the correct maximum SPL.
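The power arithmetic behind the 6 dB figure, as a short Python sketch:

```python
import math

# Power drawn at 2.83 V into each load: P = U^2 / R.
p_2ohm = 2.83**2 / 2.0   # ~4.0 W
p_8ohm = 2.83**2 / 8.0   # ~1.0 W
print(10 * math.log10(p_2ohm / p_8ohm))  # ~6.0 dB more power into 2 ohm
```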
 
The layman can be fooled by watt numbers and by calculated, often more or less fictitious, manufacturer max SPL claims.

The (semi-)advanced user can, for example, enter the TSP of his speaker and the enclosure data into software like WinISD and receive calculated power draw and membrane excursion figures for a selected frequency and amplifier voltage. These provide a closer estimate of real max SPL, though they still neglect some factors like power compression and resonator/port compression.

The advanced or professional user will trust nothing but cleanly measured distortion-limited SPLs, the first choice being CEA-2010, with alternatives such as multitone distortion or SPL @ THD.

The end.
 
One draws far more than one watt when driven with 2.83 V, the reason being a lower impedance curve. BUT why does one need a 1 W or 2.83 V measurement if one cannot calculate the maximum SPL of the speaker in the end, since the current drawn is unknown when the minimum impedance and the wattage are unknown?
Normal, modern transistor amplifiers behave mostly like voltage sources. It seems like some of your thinking is along the lines of current sources.

In typical use, it doesn't matter how many watts it draws from the amplifier, as long as the amp can handle the load. If you're running a flea-powered tube amp it's a different deal, which is why I asked if you had a particular reason for being concerned with it. Some low-power/battery-powered scenarios or other particular cases might also make something like this important.

There's more than one reason to look at voltage sensitivity. If you are designing a speaker and want to look at two drivers you plan to combine with a crossover, voltage sensitivity is what tells you whether they are going to play at the same loudness when connected to one typical transistor amplifier.

A little math will tell you what voltage is available from an amp rated at X watts per channel: E^2 = P·R. With experience, it gets pretty easy to bounce back and forth between power, voltage, sensitivity plus X dB from X watts, etc. No, it's not a precise wattage-consumed calculation, but that's rarely needed for normal home audio scenarios (kind of like never).
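For example, a hypothetical amp rated 100 W into 8 Ohm (numbers invented for illustration):

```python
import math

# E^2 = P * R -> voltage available from a 100 W / 8 ohm rating.
e = math.sqrt(100.0 * 8.0)           # ~28.3 V
print(e, 20 * math.log10(e / 2.83))  # ~20 dB above the 2.83 V reference
```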

In practice, volume displacement/Xmax is often the limiting factor for output. Excursion is highly frequency and enclosure dependent. How you are using the driver is typically far more of a factor than the power handling rating, unless you are doing prosound or other high output designs.

Small full-range or other tweaky drivers with limited power handling are a different deal also. Which again gets us back to whether you had a specific reason you were chasing output at 1 watt, instead of the typically more useful 2.83 V.
 