Thanks for the tips.
I revisited the cables and they have been made correctly - twin screened audio cable with one wire from signal to signal, one from gnd to gnd, and the shield connected to gnd only on the interface side.
I am not actually convinced that this is correct. You should have the following input connector pinout:
(XLR | TRS - at cable - at DUT output)
pin 1 | sleeve - cable shield - n/c
pin 2 | tip - cable hot - signal
pin 3 | ring - cable cold - gnd
Here's a matching attenuator topology as well:
This is useful for both BTL and SE outputs.
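If you end up needing to pick resistor values for a pad, here's a rough Python sketch for a plain two-resistor divider (series R feeding a shunt R across the interface input) - just a sanity-check tool; the 10 kohm total and 20 dB figures in the example are my own assumptions, not values from the schematic above:

```python
# Rough resistor picker for a two-resistor divider pad
# (series resistor feeding a shunt resistor across the interface input).
# The example values (20 dB, 10 kohm total) are assumptions, not from the schematic.

def divider_pad(atten_db, r_total):
    """Return (series R, shunt R) for a given attenuation and total resistance
    seen by the source (series + shunt), ignoring the interface's own input Z."""
    k = 10 ** (-atten_db / 20)      # output/input voltage ratio
    r_shunt = k * r_total           # Vout/Vin = r_shunt / r_total
    r_series = r_total - r_shunt
    return r_series, r_shunt

if __name__ == "__main__":
    rs, rp = divider_pad(20.0, 10e3)   # example: 20 dB pad, 10 kohm total
    print(f"series ~{rs:.0f} ohm, shunt ~{rp:.0f} ohm")
```

(The interface's input impedance will load the shunt leg a bit, so round to the nearest standard values and measure the actual attenuation rather than trusting the math.)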
You may not need one if you have cables for both XLR (mic) and TRS (line-level) input, as it looks like you have a wide variety of maximum input levels to choose from:
+3.5 dBu - XLR, no pad (there should be little degradation in input dynamic range when turning the gain up by as much as about 20 dB)
+13.5 dBu - XLR, w/ pad
~+24 dBu - TRS, I'm guessing w/ pad
(Values taken from 2i4 1st gen manual, assuming this is the one you have.)
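For converting between those dBu figures and Vrms (0 dBu = 0.7746 Vrms, i.e. 1 mW into 600 ohms), a quick Python snippet like this works; the three levels are just the ones listed above:

```python
import math

DBU_REF = math.sqrt(0.6)  # 0 dBu = 0.7746 Vrms (1 mW into 600 ohm)

def dbu_to_vrms(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def vrms_to_dbu(vrms):
    return 20 * math.log10(vrms / DBU_REF)

# Levels from the list above (2i4 1st gen figures)
for label, dbu in [("XLR, no pad", 3.5), ("XLR, w/ pad", 13.5), ("TRS, w/ pad", 24.0)]:
    print(f"{label:12s} {dbu:+5.1f} dBu = {dbu_to_vrms(dbu):.2f} Vrms")
```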
Switching to ASIO makes the DUT clip, so I need to look into this more.
You may have had the output turned down digitally (perhaps the software volume control exposed in the OS is driver-level attenuation only, which ASIO bypasses). Turn down the 2i4's MONITOR control to get back to previous levels; alternatively, turning the volume up in Windows should bring the non-ASIO levels up to match.
I've had a look at the absolute level calibration and have a Fluke 179 that is capable. I'll have a play with this to get consistency.
What voltages do you suggest for the testing?
1 Vrms (XLR no pad) and 2 Vrms (XLR w/ pad) would be pretty standard, I suppose.
4 Vrms would be of interest with a 300 ohm load (the TRS input should still handle that w/o pad).
400 mV may be of interest with, e.g., a 32 ohm load in particular (I presume distortion into 300 ohms at these levels would be well described by intercept points, i.e. it should agree well with figures extrapolated from the 1 Vrms results).
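For sanity-checking the dissipation in those loads and the extrapolation idea, here's a rough Python sketch. The (n-1) dB per dB scaling of the harmonic-to-fundamental ratio is the usual weak-nonlinearity assumption behind intercept points, and the -100 dBc figure in the example is made up purely for illustration:

```python
import math

def power_mw(vrms, load_ohm):
    """Power delivered into a resistive load, in milliwatts."""
    return vrms ** 2 / load_ohm * 1e3

def extrapolate_harmonic_db(h_rel_db, order, level_change_db):
    """Shift an nth-harmonic level (in dB relative to the fundamental)
    by (n-1) dB per dB of fundamental level change -- the standard
    weak-nonlinearity / intercept-point assumption."""
    return h_rel_db + (order - 1) * level_change_db

# Test levels suggested above, with the loads mentioned in the post
for vrms, load in [(1.0, 300), (2.0, 300), (4.0, 300), (0.4, 32)]:
    print(f"{vrms:4.1f} Vrms into {load:3d} ohm -> {power_mw(vrms, load):6.1f} mW")

# Made-up example: if H3 measured -100 dBc at 1 Vrms, the same intercept
# predicts roughly this at 4 Vrms (a +12 dB change in fundamental level):
print(f"{extrapolate_harmonic_db(-100.0, 3, 20 * math.log10(4 / 1)):.1f} dBc")
```

If the measured distortion at 4 Vrms lands well above what that kind of extrapolation predicts, something other than the weak nonlinearity is going on (clipping, thermal effects, the load, etc.).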