Is a DAC's output stage redundant if integrated into a preamp?

I just built one of Miro's AD1862 DACs. His circuit uses an op amp for the I/V stage, and you can adjust the gain by changing resistor values. I also built up the reconstruction filter, which has a gain factor of 1.6. So as it currently sits, my DAC puts out nearly 5 V peak. I'm running that into an F5, which needs a fairly high input voltage to hit its rated 25 W.
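If you want to sanity-check those levels, here's a rough back-of-envelope sketch. The ±1 mA full-scale output, ~3.1 kΩ feedback resistor, and ~15 dB F5 gain are my assumptions, not Miro's published values, so check them against your own build and the datasheets:

```python
# Rough level arithmetic for the AD1862 -> I/V -> filter -> F5 chain.
# Assumed values; verify against the datasheet and your own build.

i_fs = 1e-3          # AD1862 full-scale output current, A (assumed +/-1 mA)
r_fb = 3.1e3         # I/V feedback resistor, ohms (assumed)
g_filter = 1.6       # reconstruction filter gain (from Miro's design)

v_dac_peak = i_fs * r_fb * g_filter
print(f"DAC output: {v_dac_peak:.2f} V peak")        # ~5 V peak, as observed

# Input the F5 needs for 25 W into 8 ohms, assuming ~15 dB voltage gain:
p_rated, r_load = 25.0, 8.0
v_out_rms = (p_rated * r_load) ** 0.5                # ~14.1 V RMS at the speaker
g_f5 = 10 ** (15 / 20)                               # ~5.6x (assumed)
v_in_peak = v_out_rms / g_f5 * 2 ** 0.5
print(f"F5 needs ~{v_in_peak:.1f} V peak for {p_rated:.0f} W")  # ~3.6 V peak
```

By that math, ~5 V peak from the DAC drives the F5 to full power with a little headroom to spare.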

Hope that helps. Feel free to PM.

Here's the DAC thread: https://www.diyaudio.com/community/threads/dac-ad1862-almost-tht-i2s-input-nos-r-2r.354078/
 
Current-output DACs need their I/V converter left unmolested; it's essential to linear operation. You can even damage the DAC by running the current outputs without a suitable load (check the absolute maximum ratings: some chips specify only ±0.5 V at the output pins).
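To illustrate why the load matters so much, here's a toy comparison (all values hypothetical) of an op-amp transimpedance stage, which holds the DAC pin at a virtual ground, versus a bare shunt resistor, which lets the pin swing outside the compliance window:

```python
# Why a current-output DAC needs a proper I/V load (illustrative numbers).
# An op-amp transimpedance stage holds the DAC pin at virtual ground,
# so the pin never leaves its compliance window.

i_fs = 1e-3          # assumed +/-1 mA full-scale output current
r_fb = 3.0e3         # transimpedance feedback resistor (hypothetical)

v_pin_opamp = 0.0                # virtual ground: pin sits at ~0 V
v_out_opamp = -i_fs * r_fb       # signal appears at the op amp output
print(f"op-amp I/V: pin at {v_pin_opamp} V, output {v_out_opamp} V")

# A bare shunt resistor develops the full signal voltage AT the pin:
r_shunt = 3.0e3
v_pin_shunt = i_fs * r_shunt     # 3 V at the DAC pin
compliance = 0.5                 # some chips allow only +/-0.5 V here
print(f"shunt I/V: pin swings to {v_pin_shunt} V, "
      f"exceeds {compliance} V compliance: {v_pin_shunt > compliance}")
```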

Voltage-output DACs are another matter; they're simple to interface because they already output a voltage. Some come with built-in op amps on the outputs; others don't and aren't necessarily able to drive low-impedance loads, so those should be buffered right at the DAC.
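For a rough feel of the loading issue, an unbuffered voltage output behaves like a source in series with its output impedance, forming a divider with whatever it drives (values below are made up for illustration):

```python
import math

# Loading of an unbuffered voltage-output DAC (made-up values).
# Its output impedance and the load form a simple voltage divider.

v_src = 2.0          # open-circuit output voltage
r_out = 1e3          # DAC output impedance, ohms (check the datasheet)
r_load = 10e3        # e.g. a 10 kohm volume pot

v_load = v_src * r_load / (r_out + r_load)
loss_db = 20 * math.log10(v_load / v_src)
print(f"delivered: {v_load:.2f} V ({loss_db:.2f} dB loss)")   # ~ -0.8 dB
```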

However, most high-accuracy DACs are current-steering designs with current outputs, since that topology is inherently easier to coax into linear behaviour: charge is easier to conserve to high accuracy on a chip than voltage. They also tend to be differential, so a differential-to-single-ended converter is used on the DAC outputs as well.
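A toy model of what that differential-to-single-ended stage buys you: the wanted signal is the difference of the two output currents, so any error common to both legs cancels (currents and resistor value below are assumed, not from any particular chip):

```python
# Toy model of differential -> single-ended conversion for a
# current-steering DAC with complementary outputs (assumed values).

def diff_to_se(i_pos, i_neg, r_fb=1.5e3):
    """Ideal balanced I/V plus difference stage: V = (I+ - I-) * R."""
    return (i_pos - i_neg) * r_fb

# Complementary signal currents plus an error common to both legs:
i_cm_error = 50e-6                  # e.g. a coupled supply glitch, A
i_pos = +0.5e-3 + i_cm_error
i_neg = -0.5e-3 + i_cm_error

print(f"{diff_to_se(i_pos, i_neg):.2f} V")   # 1.50 V: common-mode term cancels
```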

Always read the datasheet for the DAC chip in question; it will spell out the interfacing requirements for that particular chip.