Software Distortion Compensation for Measurement Setup
Hi,
I am thinking about options for some sort of digital compensation of harmonic distortion within the measurement loop (DAC -> ADC). ESS DAC chips, for example, offer second- and third-harmonic distortion compensation.
I have played with a virtual balanced measurement setup (Virtual balanced in/out from regular soundcard in linux - results) which produces very good results when properly calibrated. The principle is simple: each input channel has a precisely calibrated software gain element in the chain, and the difference of the left/right samples is provided to audio applications through a virtual sound capture device (very simple to achieve in Linux).
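Just to illustrate the principle, the per-channel gain trim and subtraction boils down to something like this (a minimal Python/NumPy sketch; the function name and default gains are mine, and the real processing of course runs inside the ALSA chain, not in Python):

```python
import numpy as np

def balanced_capture(left, right, gain_l=1.0, gain_r=1.0):
    """Differential capture: scale each single-ended input by its
    calibrated software gain and return the difference, so that
    common-mode error cancels.

    gain_l / gain_r stand for the per-channel calibration gains
    described above; their actual values come from the calibration
    procedure, the defaults here are placeholders."""
    left = np.asarray(left, dtype=np.float64)
    right = np.asarray(right, dtype=np.float64)
    return gain_l * left - gain_r * right
```

If e.g. the right input path attenuates the signal to half, calibrating gain_r to 2.0 makes a common-mode signal cancel exactly.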
I am thinking of applying the very same principle to some basic digital harmonic distortion compensation. An initial calibration of the measurement loop would measure the loop performance and generate configuration parameters. The parameters would then drive a non-linear gain component in the input chain (very likely a modified Linux ALSA route plugin reading multiple gain coefficients - I can handle that part).
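A memoryless polynomial gain element of that kind might look like the following (Python/NumPy sketch only; the coefficient values are purely illustrative placeholders, a real set would come out of the calibration run):

```python
import numpy as np

# Illustrative coefficients only - a real set would come from calibration.
# The per-sample correction is y = c0 + c1*x + c2*x^2 + c3*x^3, with c1
# close to 1 and the higher-order terms very small.
CAL_COEFFS = np.array([0.0, 1.0, -2.0e-5, 1.5e-5])

def nonlinear_gain(x, coeffs=CAL_COEFFS):
    """Apply a memoryless polynomial correction to a block of samples.

    Coefficients are ordered low-to-high (constant term first), matching
    numpy.polynomial.polynomial conventions."""
    x = np.asarray(x, dtype=np.float64)
    return np.polynomial.polynomial.polyval(x, coeffs)
```

With unity c1 and tiny c2/c3 the element behaves as an ordinary gain stage plus a small even/odd-order correction, which is exactly what a second/third-harmonic trim needs.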
I understand that a simple non-linear gain compensates only the static (memoryless) non-linearity of the loop, but that would be a good start, IMO.
I found a very interesting and relevant paper http://jrossmacdonald.com/jrm/wp-content/uploads/052DistortionReduction.pdf which deals with calculating the non-linear compensation gain coefficients.
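I have not implemented the paper's actual method, but the basic idea of deriving the coefficients can be sketched with a plain least-squares fit: play a known signal through the loop, capture it, and fit a polynomial that maps the captured samples back onto the sent ones. This is a simplification assuming the loop non-linearity is memoryless and frequency-independent; the function names are mine:

```python
import numpy as np

def fit_inverse_poly(x_sent, y_measured, order=3):
    """Least-squares fit of a polynomial mapping captured samples back
    to the sent samples. A crude simplification of the approach in the
    paper: assumes a memoryless, frequency-independent loop."""
    return np.polynomial.polynomial.polyfit(
        np.asarray(y_measured, dtype=np.float64),
        np.asarray(x_sent, dtype=np.float64),
        order)

def apply_correction(y_measured, coeffs):
    """Run captured samples through the fitted correction polynomial."""
    return np.polynomial.polynomial.polyval(
        np.asarray(y_measured, dtype=np.float64), coeffs)
```

The fitted coefficients are exactly the kind of parameter set the calibration step would write out for the ALSA-side gain element to read.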
Please, does anyone have relevant experience, suggestions, or theoretical background in this area? IMO this feature is possible to implement, and it would lead to better-precision measurements with regular prosumer-level equipment.
I very much appreciate any input.