Audio loopback - THD & Signal/Noise ratio

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I'm looking for some way to corroborate some stuff I am working on.

If I:

- plug a loopback cable between Line out and Line in

- Play a 1kHz tone (at 48kS/s)

- Record that tone (for say a second)

- Run Fourier Transform over that data

- Adjust the playback and capture gains until the optimum point is found (just below where clipping is detected)
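For reference, the capture-and-FFT part of the steps above can be sketched in Python/NumPy. This is a minimal illustration using a synthesized tone in place of the actual loopback recording; the sample rate, tone frequency, and 0.9 full-scale amplitude are taken from the procedure described, and the Hann window is an assumption (it limits spectral leakage so the tone shows up as a clean peak).

```python
import numpy as np

fs = 48000          # sample rate, 48 kS/s as in the procedure
f0 = 1000.0         # 1 kHz test tone
n = fs              # one second of samples

# Synthesize the tone we would otherwise capture through the loopback cable,
# at a level just below clipping.
t = np.arange(n) / fs
x = 0.9 * np.sin(2 * np.pi * f0 * t)

# Window before the FFT to limit spectral leakage, then take the
# magnitude spectrum.
w = np.hanning(n)
spec = np.abs(np.fft.rfft(x * w))
freqs = np.fft.rfftfreq(n, d=1 / fs)

peak_hz = freqs[np.argmax(spec)]
print(peak_hz)   # expect 1000.0 for this ideal signal
```

On a real capture you would see the 1 kHz peak plus harmonics (distortion) and a broadband noise floor; how far those sit below the fundamental is what the THD+N figure summarizes.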

Are there any examples of what these spectra should look like?

And then if:

- I remove the 1 kHz tone from the data.

- Remove the subsonic/ultrasonic components (I assume there is a standard weighting that could be applied here)

then is what is left over a measure of THD+Noise?
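Yes, that is essentially the standard ratio-of-RMS definition. A minimal sketch of the computation, again on a synthetic "capture" (the harmonic and noise levels are made-up stand-ins, and a flat 20 Hz to 20 kHz band is used where a standard weighting such as A-weighting could be substituted):

```python
import numpy as np

fs = 48000
f0 = 1000.0
n = fs
t = np.arange(n) / fs
rng = np.random.default_rng(0)

# Synthetic capture: tone plus a small 2nd harmonic and broadband noise,
# standing in for the real loopback recording.
x = 0.9 * np.sin(2 * np.pi * f0 * t)
x += 1e-4 * np.sin(2 * np.pi * 2 * f0 * t)   # distortion component
x += 1e-5 * rng.standard_normal(n)           # noise component

spec = np.fft.rfft(x)
freqs = np.fft.rfftfreq(n, d=1 / fs)

# Band-limit to the audio band, removing subsonic/ultrasonic bins.
band = (freqs >= 20) & (freqs <= 20000)
total_rms = np.sqrt(np.sum(np.abs(spec[band]) ** 2))

# Notch out the fundamental (a few bins either side to cover leakage);
# what remains is distortion plus noise.
residual = band & (np.abs(freqs - f0) > 5)
residual_rms = np.sqrt(np.sum(np.abs(spec[residual]) ** 2))

thdn_pct = 100 * residual_rms / total_rms
print(f"THD+N = {thdn_pct:.4f} %")
```

The Parseval relation lets the RMS ratio be computed directly from the FFT bins, as done here, rather than by filtering and going back to the time domain.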

What sort of figures would you expect to see for THD+Noise on a nasty audio CODEC and a good audio CODEC?

In my current experiments I'm getting figures of about 0.005% on a laptop and 0.01% on a desktop. Do those numbers sound about right?