Since 1982 it has been 2V rms full-scale for unbalanced outputs from digital audio gear. Not in all equipment, but in about 90% of it.
Not in tube stuff, these place themselves out of every possible standard.
When you find that standard, I'm sure we'd all like to see it. As Mark pointed out,... until I tried to replace an audio source (the output of an unbranded HDMI audio extractor) with the analog audio output of a SKY HD receiver (available on the SCART socket).
The result was very disappointing: the audio level was about half of that delivered by the extractor.
I can compensate by increasing the volume on the amplifier driving the ceiling speakers (it's a sports bar venue), but that doesn't seem to be the right thing to do, as the punters like it loud during important footy matches.
If I've got to use pre-amplifiers, I'm not going to achieve anything by trying to remove what I thought was superfluous kit....
The audio out from the SKY box is not affected by the "Volume Controls" on the SKY remote. Is there a way to boost it?
TIA
-10dBV is considered a good level to run. When I was working I'd calibrate audio
interface boxes (Henry Matchbox most commonly) at -8 dBV and +4 dBm. Never
had a complaint and things just "worked".
G²
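For anyone wanting to compare those two calibration points in volts: dBV is voltage-referenced to 1 Vrms, while dBm is power-referenced to 1 mW, which in a 600-ohm line works out to about 0.775 Vrms. A minimal sketch of the standard conversions (nothing here is specific to the Matchbox units mentioned above, just the textbook formulas):

```python
import math

def dbv_to_vrms(dbv):
    # dBV is referenced to 1 Vrms, so this is a plain voltage ratio.
    return 10 ** (dbv / 20)

def dbm_to_vrms(dbm, z_ohms=600.0):
    # dBm is referenced to 1 mW; convert to watts, then to volts
    # across the assumed line impedance (600 ohms is the classic value).
    p_watts = 1e-3 * 10 ** (dbm / 10)
    return math.sqrt(p_watts * z_ohms)

print(round(dbv_to_vrms(-8), 3))  # consumer-side calibration point, ~0.398 Vrms
print(round(dbm_to_vrms(4), 3))   # pro-side calibration point, ~1.228 Vrms
```

So the two calibration levels sit roughly 10 dB apart in voltage terms, which matches the usual consumer/pro level gap.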
-10dBV is acceptable for consumer devices. You don't expect hi-fi from that. For professional audio it's a joke. The SNR is a disaster.
A -10dBV nominal level with a 2Vrms (+6dBV) peak gives about 16dB of headroom. I think that's where the 2Vrms standard came from when transitioning from analog to digital... when digital productions still had ample dynamic range...
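The headroom arithmetic above checks out: 2 Vrms is +6 dBV, and the gap down to a -10 dBV nominal is about 16 dB. A quick sketch of that calculation:

```python
import math

def vrms_to_dbv(vrms):
    # dBV is referenced to 1 Vrms.
    return 20 * math.log10(vrms)

nominal_dbv = -10.0            # consumer nominal level
peak_dbv = vrms_to_dbv(2.0)    # 2 Vrms full-scale peak

print(round(peak_dbv, 1))               # ~ +6.0 dBV
print(round(peak_dbv - nominal_dbv, 1)) # ~ 16.0 dB of headroom
```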