I don't see what the problem is. Both ways of expressing sensitivity are relevant and easy to convert between. You only need to pay attention to the driver's impedance when sensitivity is quoted at 2.83V/1m, so that you don't wrongly conclude a driver is more sensitive just because its impedance is lower (2.83V pushes more power into a lower-impedance load).
Expressing it by wattage is not sensitivity but efficiency, which is properly given as a percentage. Efficiency goes with wattage, sensitivity with voltage. When the spec is quoted at a voltage, you don't need the impedance to use it; when it's quoted at 1W or 2W, you do, because the power delivered varies with the load. At a fixed voltage you get the output at that voltage directly, no conversion necessary, and xover design is best done at voltage, where no conversion needs to be done.
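If you want to compare the two conventions directly, the conversion is just the power that 2.83V actually delivers into the rated load. A minimal sketch in Python (the driver figures are made up for illustration):

    import math

    def sens_2v83_from_1w(sens_1w_db, nominal_z_ohms):
        # 2.83 V into Z ohms delivers 8/Z watts, so the 2.83V/1m figure
        # equals the 1W/1m figure plus 10*log10(8/Z) dB.
        return sens_1w_db + 10 * math.log10(8.0 / nominal_z_ohms)

    # A hypothetical 8-ohm driver rated 88dB @ 1W/1m stays 88dB @ 2.83V/1m,
    # while a hypothetical 4-ohm driver rated 88dB @ 1W/1m shows ~91dB @ 2.83V/1m,
    # because 2.83 V pushes 2 W into 4 ohms.
    print(round(sens_2v83_from_1w(88.0, 8.0), 1))  # 88.0
    print(round(sens_2v83_from_1w(88.0, 4.0), 1))  # 91.0

That 3dB is exactly the trap above: the 4-ohm driver isn't more efficient, it's just being handed twice the power at 2.83V.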
Efficiency does not take the power doubling into account, so you only get +3dB, as opposed to the +6dB you get at the same voltage, when using 2 drivers over one. If you mic-measure at one amplifier setting and don't move the knob, the wattage changes as the load changes while the voltage stays consistent. You don't want to be adjusting for wattage on the fly during design, because it keeps changing.
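The doubling numbers fall out of the same arithmetic, assuming the idealized case of two identical drivers in parallel summing coherently in the same passband (ignoring baffle and spacing effects):

    import math

    def gain_constant_voltage(n):
        # n identical drivers in parallel at the same voltage: each one's
        # pressure adds coherently, so pressure scales with n -> 20*log10(n).
        return 20 * math.log10(n)

    def gain_constant_power(n):
        # same drivers, but total input power held fixed (the efficiency view):
        # each driver gets 1/n of the power, so pressure per driver drops by
        # sqrt(n); the coherent sum of n of them nets sqrt(n) -> 10*log10(n).
        return 10 * math.log10(n)

    print(round(gain_constant_voltage(2)))  # 6 dB gain at the same voltage
    print(round(gain_constant_power(2)))    # 3 dB gain at the same total power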
If you have 2 drivers specced at 2.83V, and one is 89dB and the other is 91.5dB, you have a 2.5dB delta. Plain and simple.
You're not the first person I've seen reference wattage over voltage, but it's just not as practical in use or during xover design, given the way drivers, amplifiers and microphones actually function.