Single supply op amp buffer - deriving bias voltage?

I'm currently designing a buffer input stage for an ADC board, and I'm wondering whether the bare minimum of two resistors and a capacitor is sufficient for generating the mid-rail bias voltage and blocking DC. Figure 1 in Analog Devices: Analog Dialogue: Avoiding Op-Amp Instability Problems In Single-Supply Applications is what I'm talking about (minus the feedback resistors, since I'm just making a buffer). Since the gain is unity, is there actually any increased chance of instability from not decoupling the bias divider, or should I decouple it (figure 2 in the same article)? I have some pretty tight space restrictions (a 2x2 inch board has to fit a DSP, a CODEC, and the analog circuitry), so fewer components and smaller capacitances would definitely benefit me.
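For reference, here's a quick back-of-the-envelope sketch of the figure 1 network. All of the component values (5 V rail, 100k/100k divider, 1 uF coupling cap) are just assumptions for illustration, not values from the article:

```python
import math

# Fig. 1 network sanity check: two equal resistors set the mid-rail
# bias, and the input coupling cap forms a high-pass filter against
# the divider's Thevenin resistance. Values are assumed, not from
# the article.
VCC = 5.0     # supply rail, volts (assumed)
R1 = 100e3    # top divider resistor, ohms (assumed)
R2 = 100e3    # bottom divider resistor, ohms (assumed)
C_IN = 1e-6   # input coupling cap, farads (assumed)

v_bias = VCC * R2 / (R1 + R2)             # mid-rail bias voltage
r_thev = R1 * R2 / (R1 + R2)              # Thevenin resistance of the divider
f_hp = 1 / (2 * math.pi * r_thev * C_IN)  # input high-pass corner

print(f"bias voltage : {v_bias:.2f} V")
print(f"Thevenin R   : {r_thev / 1e3:.0f} kohm")
print(f"HP corner    : {f_hp:.2f} Hz")
```

With those values the bias node sits at 2.50 V and the input corner lands around 3.2 Hz, so larger divider resistors let me shrink the coupling cap, which matters given my board space.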
 
Administrator
Fig 1 works OK as long as the supply is clean and stable. There won't be any stability issues as long as the op amp is unity-gain stable. Fig 2 gives more isolation from supply noise and is technically the better solution.
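To put a rough number on that isolation, here's a sketch comparing supply ripple reaching the bias node in the two cases. The divider values, decoupling cap, and 100 Hz ripple frequency are illustrative assumptions:

```python
import math

# Rough supply-ripple comparison: Fig. 1 (bare divider) vs Fig. 2
# (divider decoupled with a cap from the bias node to ground).
# All values are assumptions for illustration.
R1 = 100e3        # top divider resistor, ohms (assumed)
R2 = 100e3        # bottom divider resistor, ohms (assumed)
C_DEC = 10e-6     # Fig. 2 decoupling cap, farads (assumed)
F_RIPPLE = 100.0  # ripple frequency of interest, Hz (assumed)

r_thev = R1 * R2 / (R1 + R2)
divider_gain = R2 / (R1 + R2)  # ripple reaching the bias node in Fig. 1

# In Fig. 2 the ripple also sees a first-order low-pass formed by
# the divider's Thevenin resistance and the decoupling cap.
f_c = 1 / (2 * math.pi * r_thev * C_DEC)
lp_gain = 1 / math.sqrt(1 + (F_RIPPLE / f_c) ** 2)

print(f"Fig 1 ripple at bias node: {20 * math.log10(divider_gain):.1f} dB")
print(f"Fig 2 ripple at bias node: {20 * math.log10(divider_gain * lp_gain):.1f} dB")
print(f"decoupling corner        : {f_c:.2f} Hz")
```

With these values, Fig 1 passes supply ripple at -6 dB while Fig 2 knocks it down to roughly -56 dB at 100 Hz, which is why the decoupled version is the better solution when the supply isn't clean.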

It's more a "digital line level" forum topic really; I think we'll move it :)
 