Hello,
I gave this some thought this morning and looked at the audioTester V3 user manual. <snip>
Yes, I fully agree. In addition, there are different normalisation levels for dBFS. A sine wave with a "full level amplitude" spanning from +100% to -100% in the digital domain may correspond to either 0 dBFS or -3 dBFS depending on the "standard" that is implemented in the software. Wikipedia has more on this.
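For illustration, here is a quick numeric sketch of the two conventions (plain Python, not tied to any particular software):

import math

v_peak = 1.0                    # full-scale sine, +/-100% in the digital domain
v_rms = v_peak / math.sqrt(2)   # RMS of a sine is peak / sqrt(2)

# Convention A: 0 dBFS referenced to full-scale peak -> the sine reads 0 dBFS
print(20 * math.log10(v_peak))   # 0.00
# Convention B: 0 dBFS referenced to the RMS of a full-scale square wave
# -> the same sine reads about -3 dBFS
print(20 * math.log10(v_rms))    # -3.01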
I personally don't like to mess around with different dB reference values. I tend to forget the correct values, and sometimes it's hard to tell which reference was used (as in the dBFS example). I therefore like to use "direct" units without normalisation. What's wrong with plain Volts? Just use a log scale on the y-axis of your plots and the curve has the same shape as a "dB curve", while the data are in absolute Volts. No normalisation levels involved. Easy.
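To show what I mean, a minimal matplotlib sketch (the filter response is made up, just to have a curve):

import numpy as np
import matplotlib.pyplot as plt

freqs = np.logspace(1, 4, 200)                # 10 Hz .. 10 kHz
v_out = 1.0 / np.sqrt(1 + (freqs / 1e3)**2)   # hypothetical 1 kHz low-pass, in volts

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.semilogx(freqs, 20 * np.log10(v_out))     # the usual dB plot (dBV here, since ref = 1 V)
ax1.set_ylabel("level [dBV]")
ax2.loglog(freqs, v_out)                      # plain volts on a log y-axis: same shape
ax2.set_ylabel("level [V RMS]")
for ax in (ax1, ax2):
    ax.set_xlabel("frequency [Hz]")
plt.show()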
That said, one big advantage of the RTX over other "soundcards" is that the level switches relate to fixed, discrete levels. No variable pots that will never again give the same levels once you've touched them. Also, the calibration coefficients for the RTX levels are straightforward to determine from the labels on the front plate once you realise the numbers refer to sine RMS amplitudes. Or you could use the table given on the wiki page. I hope your software allows easy handling of different calibration factors for the different settings of the level switches and balanced/unbalanced output.
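As a sketch of what "straightforward" means here (assuming, as above, that a range label gives the sine RMS amplitude at digital full scale; the "+10 dBV" range is just an example):

import math

def fullscale_from_label(label_dbv):
    # 0 dBV = 1 V RMS; the label is assumed to be the sine RMS at digital full scale
    v_rms = 10 ** (label_dbv / 20)
    v_peak = v_rms * math.sqrt(2)   # peak of that sine = digital full scale
    return v_rms, v_peak

print(fullscale_from_label(10.0))   # (3.162..., 4.472...) volts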
Don't assume a newer, faster PC will be better. I'm using a new 8th-gen i7 with 32 GB RAM etc. on this desktop, and the lowest latency I get is 1000 µs.
It may also depend a lot on the USB drivers and operating systems. In an extreme example I got about 10 times less latency using Linux on my Mac laptop than with plain vanilla Mac OS running on the same machine.😱
This is typical of dropping a sample. Larger FFTs lead to a higher potential for this. Shared USB ports and internals can also cause problems: even though the ports all look isolated, many of them roll up to a single USB controller. USB 3 can be even more problematic since its driver is usually not native. USB 3 ports used for USB 2 may also be an issue, since the USB 2 side is "legacy" and not always the best implementation. For a desktop, a dedicated USB controller can help.
This is a good tool: DPC Latency Checker
This tool doesn't work correctly with anything newer than Windows 7. It reports very high and incorrect results on most systems running Windows 10. On Windows 8 or higher people should use LatencyMon from Resplendence instead.
If you have a system with a 3rd-generation Intel Core CPU (e.g. i5-3xxx) or newer, its ports are almost guaranteed to be native Intel USB 3.0 xHCI ports. They first appeared with 2nd-gen Core CPUs, so they have been around for a long time now. I agree with avoiding any port connected to a Renesas or ASMedia controller. Typically the newest Intel chipsets don't supply any 2.0 ports; they are all backward-compatible xHCI off the chipset or SoC. Vendors may add 2.0 ports through hubs.
You are right to note that the DPC / interrupt latency does not necessarily get better with newer systems. The best results I have seen are from a pre-UEFI Gen 1 Core i7-990x (Gulftown) system. It shouldn't be significantly worse on anything decent though. If you have DPC latency issues the usual culprit is a poorly written device driver. LatencyMon gives you the info to track down the offending driver. Occasionally UEFI/BIOS updates can help but only if there is something weird going on.
It may also depend a lot on the USB drivers and operating systems. <snip>
It will even depend on which Linux kernel you run. You'll get lower latency with a kernel compiled with voluntary preemption enabled (some distros do this). The lowest would probably be achieved with the PREEMPT_RT patch, but that's overkill for most use cases. All it takes is one poorly written driver to derail things on Windows, Linux, and Mac OS X alike, because drivers run in kernel space.
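If you want to check what your distro kernel was built with, a small sketch (the config file path varies by distro; some expose /proc/config.gz instead):

import platform
from pathlib import Path

cfg = Path(f"/boot/config-{platform.release()}")
if cfg.exists():
    for line in cfg.read_text().splitlines():
        if line.startswith("CONFIG_PREEMPT"):
            print(line)   # e.g. CONFIG_PREEMPT_VOLUNTARY=y
else:
    print(f"{cfg} not found on this distro")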
Yes, I fully agree. In addition, there are different normalisation levels for dBFS. <snip>
Yes, I fully agree too. I'm an old-fashioned man: I set my "ref" level once and then run the applications! That's why I love stand-alone instruments!
(something went wrong in the post)
I do not see the point in this discussion. Everybody has the opportunity to choose the scales he finds most suited for himself and the problem he is investigating.
Even "plain Volts" need to be accompanied by the switch setting of the instrument, so that you can judge what comes from the instrument and what from the DUT.
And with switch settings known it is easy to jump from one representation to the other, if needed.
Even "plain Volts" need to be accompanied by the switch setting of the instrument, so that you can judge what comes from the instrument and what from the DUT.
And with switch settings known it is easy to jump from one representation to the other, if needed.
I think, judging from members' concerns, that some uncertainty is floating around about how the values displayed on the RTX front panel relate to the actual values at the output pins; to be sure, you have to measure them. Maybe a display meter would be a solution.
I agree with avoiding any port connected to a Renesas or ASMedia controller. <snip>
This system has Intel USB 3 plus ASMedia USB 3, something I see on many ATX-size motherboards for some reason. This is a very current Gigabyte Z370 motherboard, running a fresh install of Win 10 Pro. Unfortunately there is no easy way to figure out which controller connects to which port.
I was looking at the wiki tutorial about the RTX6001.
For now it concentrates on loop-back and amplifier measurements.
But what about digital devices, like a DAC?
Most spectrum software, like the ARTA I am familiar with, cannot select a different ASIO driver for the input and the output. For example, if you have a USB-to-I2S card that feeds your DAC, the best arrangement would be the RTX as the input and the USB receiver as the output.
For now, via ARTA I have exported some test files (THD, IMD and jitter at various sample rates) and play them via foobar to the USB receiver that feeds the DAC.
The output of the DAC then goes to the RTX input, and ARTA is set to "external" as the source.
OK, that way THD, IMD, frequency response and jitter can be measured.
But that way I can't measure a THD vs. frequency plot, linearity, etc.
Is there any other software that can be set up with different ASIO drivers for input and output?
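As an aside, a test file like that can be generated with a few lines of Python (a minimal sketch; the file name, level and duration are arbitrary):

import numpy as np
from scipy.io import wavfile

def make_test_tone(path, freq=1000.0, fs=192000, seconds=10.0, amp_dbfs=-3.0):
    # render a sine at amp_dbfs below full scale, stored as 32-bit PCM
    amp = 10 ** (amp_dbfs / 20)
    t = np.arange(int(fs * seconds)) / fs
    x = amp * np.sin(2 * np.pi * freq * t)
    wavfile.write(path, fs, (x * (2**31 - 1)).astype(np.int32))

make_test_tone("thd_1k_192k.wav")   # hypothetical file name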
I do not see the point in this discussion. <snip>
My point is calibration, or more particularly the lack of calibration. By definition dBFS is a floating uncalibrated reference.
The switch settings are important, yes. However, the software and hardware need to be calibrated together. The posted plots need legs of their own; they need a calibrated scale to be meaningful.
If a plot is posted with only a dBFS scale we have no idea what zero dBFS is.
If someone else posts another plot scaled in dBFS for comparison there is no way of contrasting the two.
Volts RMS will do, or any calibrated dB scale: dBV, dBu, dBW or whatever.
I like 0 volts RMS = 0dBV.
DT
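To illustrate DT's point, the same dBFS reading means different absolute levels on different ranges (a sketch assuming an RMS-referenced dBFS convention; the helper is hypothetical):

import math

def dbfs_to_dbv(level_dbfs, fs_sine_rms_volts):
    # translate a dBFS reading into dBV, given the full-scale sine RMS voltage of the range
    return level_dbfs + 20 * math.log10(fs_sine_rms_volts)

# the same -120 dBFS spur on two different input ranges:
print(dbfs_to_dbv(-120, 1.0))    # 0 dBV range   -> -120 dBV
print(dbfs_to_dbv(-120, 10.0))   # +20 dBV range -> -100 dBV, 20 dB worse in absolute terms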
The RTX6001 is calibrated at the factory (that is, you know how many volts full scale is; the values indicated by the switch settings should be assumed to be correct), and there is no reason to assume that it would lose calibration between two measurements.
As with any measurement instrument, you probably need to recalibrate on a regular basis (e.g. yearly).
And...
If a plot is posted with only a dBV scale, we have no idea how far you are from the full scale of the ADC and cannot judge other things related to this.
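And the other way around, a dBV reading only tells you the headroom once the range is known (same assumptions as the sketch above):

import math

def headroom_dbfs(level_dbv, fs_sine_rms_volts):
    # how far a dBV level sits below ADC full scale (negative = below full scale)
    return level_dbv - 20 * math.log10(fs_sine_rms_volts)

# a +8 dBV tone measured on the +10 dBV input range:
print(headroom_dbfs(8.0, 10 ** (10 / 20)))   # -2.0 dBFS, only 2 dB of headroom left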
Is there any other software that can be set up with different ASIO drivers for input and output?
I've found that HPWorks is excellent at handling different ASIO drivers concurrently. It was not without reason that my previous shots of the externally driven Mirand DAC were done with that software.
I have an observation for which I see no explanation so far.
For the settings Out = 20 dBV, In = 10 dBV, sample rate 192 kHz, the wideband noise floor is 6 dB higher for the balanced output than for the unbalanced one.
Pictures: loopback, no signal (it looks alike with a signal), first balanced, second from the BNC output.
If the noise from the two differential output pins were uncorrelated, I would expect a rise of only 3 dB. You can only get 6 dB if the balanced noise is some symmetrisation of the unbalanced noise.
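The 3 dB vs. 6 dB arithmetic is easy to check numerically (a quick sketch with synthetic noise):

import numpy as np

rng = np.random.default_rng(0)
n = rng.normal(size=1_000_000)   # one noise source
m = rng.normal(size=1_000_000)   # an independent one

def rel_db(x):
    # level relative to the single source
    return 20 * np.log10(np.std(x) / np.std(n))

print(rel_db(n + n))   # fully correlated: amplitudes add -> ~ +6.02 dB
print(rel_db(n + m))   # uncorrelated: powers add -> ~ +3.01 dB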
I've found that HPWorks is excellent at handling different ASIO drivers concurrently. <snip>
Joseph, thanks for the info.
Really, HPWorks supports ASIO multi-client operation?
I must download a demo version to try it!
...
By definition dBFS is a floating uncalibrated reference.
...
If a plot is posted with only a dBFS scale we have no idea what zero dBFS is.
If someone else posts another plot scaled in dBFS for comparison there is no way of contrasting the two.
Volts RMS will do, or any calibrated dB scale: dBV, dBu, dBW or whatever.
I like 0 volts RMS = 0dBV.
DT
...
And...
If a plot is posted with only a dBV scale, we have no idea how far you are from the full scale of the ADC and cannot judge other things related to this.
I agree with all the bolded lines, and both statements are true!
For that reason, both dBV and dBFS should be included for the same measurement.
I have an observation for which I see no explanation so far.
For the settings Out = 20 dBV, In = 10 dBV, sample rate 192 kHz, the wideband noise floor is 6 dB higher for the balanced output than for the unbalanced one. <snip>
If the noise from the two differential output pins were uncorrelated, I would expect a rise of only 3 dB. You can only get 6 dB if the balanced noise is some symmetrisation of the unbalanced noise.
The noise from the two balanced output pins is not uncorrelated.
Most of the noise comes from the DAC; the (+) and (-) signals simply have opposite polarities but originate from the same source, the DAC.
A minor part of the noise is added later in the output buffers, and this contribution is uncorrelated. This probably explains why you measure a difference of 5.42 dB and not exactly 6 dB: if all the noise were correlated, I would expect a 6 dB difference, but since some of it is not, you end up at a lower value.
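That explanation can be put into a toy model (my sketch, not the actual circuit: assume the DAC noise appears with opposite polarity on both pins while each buffer adds independent noise, and let r be the buffer-to-DAC noise power ratio):

import math

def balanced_minus_unbalanced_db(r):
    # balanced power: (2*dac)^2 + 2*buf^2 ; unbalanced power: dac^2 + buf^2
    return 10 * math.log10((4 + 2 * r) / (1 + r))

print(balanced_minus_unbalanced_db(0.0))    # all noise from the DAC -> 6.02 dB
print(balanced_minus_unbalanced_db(0.35))   # ~35% uncorrelated power -> ~5.42 dB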
I do not see the point in this discussion.
I guess we are trying to say that one should be careful to report the level switch settings as well as the dB normalisation standard used in the software when showing off their graphs.
Everybody has the opportunity to choose the scales he finds most suited for himself and the problem he is investigating.
Yes!
Even "plain Volts" need to be accompanied by the switch setting of the instrument, so that you can judge what comes from the instrument and what from the DUT.
And with switch settings known it is easy to jump from one representation to the other, if needed.
I just mentioned the "plain Volts" as an example to get rid of the ambiguity involved with different "standards" of dBFS reference values. Apart from this, I am not sure what you are trying to point out here. If the data calibration in the software is right (i.e., it matches the level switch settings), the test results should be the same for the different level settings -- except maybe if the signal gets lost in the noise floor, or if it is clipped due to a mismatch of signal levels with the DUT.
Really, HPWorks supports ASIO multi-client operation?
Yes ☺
I like 0 volts RMS = 0dBV.
So 6 dBV is 2 x 0 Volt, which is also 0 Volt. Following this logic, all dBV values correspond to 0 Volt. This is so simple that even I might start to like decibels! 😀
No, seriously, you probably meant 1 Volt RMS = 0 dBV (see Wikipedia).
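In code form, the standard definition (a trivial sketch):

import math

def dbv(v_rms):
    return 20 * math.log10(v_rms / 1.0)   # 0 dBV is defined as 1 V RMS

print(dbv(1.0))   # 0.0
print(dbv(2.0))   # +6.02, i.e. 6 dBV is 2 V RMS, not "2 x 0 V"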