I'm trying to learn how to use ARTA's spectrum analyser. I'm new to all this and could use some help, both with respect to ARTA and the art of noise analysis itself.
I have an ESI Juli@ sound card and I am running ARTA under Crossover on a Mac Pro. I've done the loop-back testing and line sensitivity calibration to set up Audio Devices.
As an initial project, I am trying to examine the noise of my implementation of a balanced-to-single-ended line input (the "DUT") for an amp build project. I am using a Samuel Groner-designed low-noise pre-amp with a (nominal) gain of 60dB, and have entered a gain of 1000 in ARTA's audio device setup. The DUT's inputs have been shorted.
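For reference, the linear gain figure I entered follows from the usual voltage dB-to-linear conversion, which can be sketched as:

```python
import math

def db_to_linear_gain(gain_db):
    """Convert a voltage gain in dB to a linear ratio (20 dB per decade)."""
    return 10 ** (gain_db / 20)

# 60 dB of pre-amp gain corresponds to a linear gain of 1000
print(db_to_linear_gain(60.0))  # 1000.0
```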
As I am operating ARTA under a commercial implementation of Wine, I don't have access to ASIO. I understand, therefore, that I need to ensure that the sample rate selected in ARTA's control panel is the same as the sample rate used by the Juli@, set via the Juli@ control panel.
My first question relates to how to set the frequency range for a wideband noise analysis, e.g. 22Hz-22kHz. I see that a low-frequency cutoff can be set in Setup -> Spectrum Scaling, and I also see how weightings can be applied. I don't, however, see how the upper frequency is set. I assume also that the RMS figure displayed in red is the computed wideband noise measurement - correct?
I understand that to relate a (singular) wideband measurement figure of, let's say, -120dBV to the noise spectral density I see in the graphic when PSD/dBV is selected in Setup -> Spectrum Scaling, I need to divide the linear RMS value by the square root of the measurement bandwidth (e.g. 21,978Hz). That is, a -120dBV RMS value will produce a chart with values roughly averaging -163dBV/rtHz. Correct? My presumption is that the gain of the pre-amp is already taken into consideration and that the RMS value is "referred to the input". Is that correct, or do I need to factor in the 60dB of gain somehow?
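To check my own arithmetic: in dB terms, dividing by the square root of the bandwidth is a subtraction of 10·log10(bandwidth). A minimal sketch of the conversion I'm assuming (flat noise across the measurement bandwidth):

```python
import math

def rms_dbv_to_psd_dbv(rms_dbv, bandwidth_hz):
    """Convert a wideband RMS level (dBV) to the equivalent average
    noise spectral density (dBV/rtHz), assuming the noise is flat
    across the stated measurement bandwidth."""
    return rms_dbv - 10 * math.log10(bandwidth_hz)

# -120 dBV measured over 22 Hz..22 kHz (21,978 Hz of bandwidth)
print(round(rms_dbv_to_psd_dbv(-120.0, 21978.0), 1))  # -163.4
```

That lands at roughly -163dBV/rtHz, which matches the figure above.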
Any help appreciated. Thanks in advance
Steve