Low or high output impedance to high-end amp

Hi, I have a high-end class-D pre-amp/speaker-amp set, and I just bought a vintage Sony tuner (ST-5066).
The tuner offers both a 'normal' (0.75 V) and a 'high' (1.5 V) impedance output.
As a newbie in audio, I would reason that you should let the best amplifier in the setup do most of the amplifying, which suggests using the normal output and letting the class-D do the heavy lifting.
However, the cables to the pre-amp are 5 m long. I understand that long cables call for a higher signal voltage to reduce quality loss along the way, which suggests the high output.

What would be optimal, or are there more factors that weigh in?
 
"Normal" and "high" on your tuner refers to the output voltage (V), so depending on the input sensitivity (V) of the preamp, choose the appropriate output from the tuner.
The output impedance value (3.3kΩ / 4.5kΩ) of the tuner is not critical because the input impedance of the preamp is high enough (100kΩ).
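To put rough numbers on both points, here is a minimal sketch. The voltage and impedance figures are the ones quoted above; the ~100 pF/m cable capacitance is only a typical assumption for consumer interconnects, not a measurement of your actual cables.

```python
import math

Z_IN = 100e3          # preamp input impedance, ohms (quoted above)
CABLE_LEN_M = 5.0     # interconnect length, metres
C_PER_M = 100e-12     # assumed ~100 pF/m cable capacitance (typical, not measured)

for label, v_out, z_out in [("normal", 0.75, 3.3e3), ("high", 1.5, 4.5e3)]:
    # Resistive divider: tuner output impedance against preamp input impedance
    v_at_preamp = v_out * Z_IN / (z_out + Z_IN)
    loss_db = 20 * math.log10(v_at_preamp / v_out)
    # First-order low-pass formed by output impedance and total cable capacitance
    c_cable = CABLE_LEN_M * C_PER_M
    f_corner_khz = 1 / (2 * math.pi * z_out * c_cable) / 1e3
    print(f"{label}: {v_at_preamp:.3f} V at preamp ({loss_db:.2f} dB), "
          f"-3 dB point ≈ {f_corner_khz:.0f} kHz")
```

Under these assumptions the level loss is well under half a dB on either output, and the cable roll-off sits around 70-100 kHz, far above the audio band. So the deciding factor is simply which output voltage suits your preamp's input sensitivity.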