Supply voltage vs THD+N

As an example I'll use the LM3886, whose datasheet I'm looking at right now. With 28 V supplies it delivers up to 38 watts into 8 ohms before distortion starts rising; with 35 V supplies it delivers up to 61 watts into 8 ohms before distortion starts rising. The distortion is effectively identical at both voltages from 0 to 3 watts, but between 3 and 38 watts it's almost half as much on 28 V as on 35 V supplies.
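
To put numbers on it, those wattages line up with roughly where the output would start clipping against the rails. Here's a quick sanity-check calculation; the split (±) supplies and the ~4 V of headroom to each rail are my assumptions, not figures from the datasheet:

```python
# Rough estimate of maximum sine-wave output power before clipping,
# assuming split (+/-) rails and that the output can swing to within
# about 4 V of each rail (the real dropout depends on load current).

def max_power(v_rail, r_load=8.0, v_dropout=4.0):
    v_peak = v_rail - v_dropout          # peak output voltage swing
    return v_peak ** 2 / (2 * r_load)    # average sine-wave power into the load

for v in (28, 35):
    print(f"+/-{v} V rails -> ~{max_power(v):.0f} W into 8 ohms before clipping")

# +/-28 V rails -> ~36 W  (distortion curve rises near 38 W in the datasheet)
# +/-35 V rails -> ~60 W  (distortion curve rises near 61 W in the datasheet)
```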

Is there an explanation for why this happens? Is the effect exaggerated in the LM3886 compared to other transistor amp chips, or is this how things always are, and the reason variable-rail amplifiers exist? I had always assumed those existed because of Energy Star efficiency requirements or something similar.