I thought that "Analog Line Level" was a standard ...

... until I tried to replace an audio source (an unbranded HDMI Audio Extractor) with the analog audio output of a SKY HD receiver (available on the SCART socket).

The result was very disappointing: the audio level was about half that delivered by the extractor.

I can compensate by increasing the volume on the amplifier driving the ceiling speakers (it's a sports bar venue), but that doesn't seem to be the right thing to do, as the punters like it loud during important footy matches.

If I've got to use pre-amplifiers, I'm not going to achieve anything by trying to remove what I thought was superfluous kit....

The audio out from the SKY box doesn't change with the "Volume Controls" on the SKY remote. Is there a way to boost it?

TIA
When you find that standard, I'm sure we'd all like to see it. As Mark pointed out, -10dBV is considered a good level to run. When I was working I'd calibrate audio interface boxes (Henry Matchbox most commonly) at -8 dBV and +4 dBm. Never had a complaint and things just "worked".
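
For anyone wanting to check what those calibration figures mean in volts, here's a rough sketch of the conversions (Python, my own function names; the dBm figure is assumed to be into the traditional 600 ohm load):

    def dbv_to_vrms(dbv):
        # dBV is referenced to 1 Vrms
        return 10 ** (dbv / 20)

    def dbm_to_vrms(dbm, load_ohms=600):
        # dBm is referenced to 1 mW; the voltage depends on the load impedance
        power_w = 1e-3 * 10 ** (dbm / 10)
        return (power_w * load_ohms) ** 0.5

    print(f"-8 dBV -> {dbv_to_vrms(-8):.3f} Vrms")           # ~0.398 Vrms
    print(f"+4 dBm -> {dbm_to_vrms(4):.3f} Vrms into 600R")  # ~1.228 Vrms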

A -10dBV nominal level with 16dB of headroom peaks at 2Vrms. I think that's where the 2Vrms standard came from in the transition from analog to digital, when digital productions still had ample dynamic range...
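
In case it helps anyone verify that arithmetic, a quick sketch (Python, my own naming) of the dBV-to-volts conversion behind those numbers:

    def dbv_to_vrms(dbv):
        # dBV is referenced to 1 Vrms
        return 10 ** (dbv / 20)

    nominal = dbv_to_vrms(-10)      # -10 dBV nominal -> ~0.316 Vrms
    peak = dbv_to_vrms(-10 + 16)    # +16 dB headroom -> ~1.995 Vrms, i.e. ~2 Vrms

    print(f"nominal {nominal:.3f} Vrms, peak {peak:.3f} Vrms")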