I am struggling to understand the comparability of sensitivity measurements.

Example:

A driver which is sold as a 2 Ohm driver has Re = 1.3 Ohm and a minimum impedance of 2.1 Ohm.

A driver which is sold as an 8 Ohm driver has Re = 4.5 Ohm and a minimum impedance of 4.5 Ohm.

One question is:

The nominal impedance value (2/8 Ohm) is some kind of mean value, as I understand it. But how is it calculated?

Is an 8 Ohm driver with Re = 4.5 Ohm still an 8 Ohm driver, or more like a 7 Ohm driver?

I am also struggling with how much voltage I should set to get a 1 W / 1 m sensitivity measurement. For an 8 Ohm driver I should apply 2.83 V to equal 1 W, but 8 Ohm drivers differ: their Re ranges from 4.2 Ohm to 6.2 Ohm.
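To make my confusion concrete, here is a small sketch of the arithmetic I am using (assuming the load behaves as a plain resistance, which a real driver of course does not), with a hypothetical helper `drive_voltage`:

```python
import math

def drive_voltage(power_w, impedance_ohm):
    # P = V^2 / Z, so V = sqrt(P * Z) for a purely resistive load
    return math.sqrt(power_w * impedance_ohm)

# Nominal 8 Ohm gives the classic 2.83 V figure
v_nominal = drive_voltage(1.0, 8.0)  # ~2.83 V
# Using the actual Re = 4.5 Ohm instead gives a noticeably lower voltage
v_re = drive_voltage(1.0, 4.5)       # ~2.12 V
```

So depending on whether I plug in the nominal impedance or the measured Re, I get quite different drive voltages for the "same" 1 W.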

Does that make a difference for which voltage I have to set? Or is it correct to apply a 1 kHz signal at 2.83 V as long as the driver is labeled 8 Ohm?

Thank you !