Oversampled DAC without digital filter vs NOS

WADIA used the R2R BB PCM1702/04 chips at a high oversampling ratio such as 64x. Today's AKM/ESS DACs cannot be fed at such a high rate. Also, with the WADIA spline, some early roll-off at 22kHz was the trade-off.
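
To put a rough number on that roll-off, here is a minimal Python/SciPy sketch comparing a short, spline-like interpolation kernel (a triangle, standing in for a generic spline; these are NOT Wadia's actual proprietary coefficients) against a windowed-sinc lowpass at an illustrative 4x oversampling ratio. Filter lengths and the 4x factor are my own arbitrary choices:

```python
import numpy as np
from scipy import signal

fs_in = 44100          # CD input rate
L = 4                  # illustrative oversampling factor (not Wadia's 64x chain)
fs_out = fs_in * L

# Linear-interpolation kernel (triangle), a stand-in for a short spline kernel
h_lin = np.bartlett(2 * L + 1)          # DC gain of L, compensating zero-stuffing

# Windowed-sinc lowpass (classic oversampling filter), cutoff at fs_in/2
h_sinc = signal.firwin(255, cutoff=fs_in / 2, fs=fs_out,
                       window=('kaiser', 9.0)) * L

for name, h in (("triangle/spline-ish", h_lin), ("windowed sinc", h_sinc)):
    w, H = signal.freqz(h, worN=8192, fs=fs_out)
    mag_db = 20 * np.log10(np.abs(H) / L + 1e-12)
    idx = np.argmin(np.abs(w - 20000))
    # the short kernel is already several dB down near 20 kHz,
    # the long windowed sinc is still essentially flat
    print(f"{name}: {mag_db[idx]:.2f} dB at 20 kHz")
```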

BTW: I did a 16M FFT versus an 82-times-averaged 256k FFT; they look close to identical. IMHO a spectrogram or TFD (time-frequency distribution, as used to analyze birdsong) would show the real beef much better.

Picture note: Lindberg 2L-087_stereo-96kHz_06 as source (nice to see some spurious tones :D)
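
For what it's worth, here's a rough Python/SciPy sketch of what I mean by an averaged FFT versus a time-frequency view; the signal is just placeholder noise (load the actual 2L track instead), and the segment sizes are arbitrary:

```python
import numpy as np
from scipy import signal

fs = 96000                      # sample rate of the 96 kHz source
x = np.random.randn(16 * fs)    # placeholder signal; load the real track here

# Averaged FFT (Welch): many 256k-point FFTs averaged, as in the comparison above
f_w, pxx = signal.welch(x, fs=fs, nperseg=256 * 1024, noverlap=0,
                        window='hann', scaling='density')

# Time-frequency view: a spectrogram keeps the time evolution of the spectrum,
# which a single long or averaged FFT smears away
f_s, t_s, sxx = signal.spectrogram(x, fs=fs, nperseg=8192, noverlap=4096,
                                   window='hann')

print(pxx.shape, sxx.shape)     # (freq bins,) vs (freq bins, time frames)
```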

The question is, what does Wadia’s approach buy you compared to zero stuffing and LPF? It seems like marketing to me.
 
My view of interpolation by splines versus interpolation by LPF is that which is best depends on the kind of data being interpolated. For audio, we are dealing with complex periodic signals having a desired spectral band that needs to be separated from undesired spectral bands. In other words, the desired signal needs to be bandlimited, for which brickwall digital bandpass filtering is ideal. Perfect frequency-domain removal of the undesired image/alias bands via bandpass/lowpass filtering inherently results in perfect time-domain interpolation of the signal within the desired band.
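
As a concrete illustration of the zero-stuffing + LPF approach being discussed, here is a minimal Python/SciPy sketch; the 8x factor, filter length, and cutoff are my own arbitrary choices, not anyone's production design:

```python
import numpy as np
from scipy import signal

fs_in = 44100
L = 8                                   # oversampling factor (arbitrary here)
fs_out = fs_in * L

# Test signal: a 1 kHz tone at the input rate
t = np.arange(4096) / fs_in
x = np.sin(2 * np.pi * 1000 * t)

# 1) Zero-stuffing: insert L-1 zeros between samples, which creates image
#    bands around multiples of fs_in
x_up = np.zeros(len(x) * L)
x_up[::L] = x

# 2) A (near-)brickwall lowpass at fs_in/2 removes the images; in the time
#    domain this is exactly the bandlimited interpolation described above
h = signal.firwin(511, cutoff=fs_in / 2, fs=fs_out, window=('kaiser', 12.0))
y = signal.lfilter(h * L, 1.0, x_up)    # gain of L restores the amplitude

# Equivalent polyphase one-liner
y_poly = signal.resample_poly(x, up=L, down=1)
```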

However, not all data sets are periodic, or carry their important information via frequency content. Interpolation can also be applied to non-periodic data, where no desired information is carried by the spectral content of the desired signal band. Non-periodic data could, for example, be a sequence of result points (samples) from a test, where some independent test variable is stepped across a range of values. In such a case, spline-type interpolation could be utilized to ESTIMATE the test results that would be obtained for values of the independent variable that were not actually tested.
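
For that second, non-audio use case, a cubic spline estimate looks like this in Python/SciPy; the data points and variable names are made up purely for illustration:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Measured results at a few stepped values of an independent test variable
x_tested = np.array([1.0, 2.0, 3.0, 4.0, 5.0])        # e.g. a swept parameter
y_measured = np.array([0.12, 0.45, 0.91, 1.62, 2.55])  # e.g. measured results

spline = CubicSpline(x_tested, y_measured)

# ESTIMATE results at values of the independent variable that were not tested
x_untested = np.array([1.5, 2.7, 4.2])
print(spline(x_untested))
```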

So, I concur with Chris: spline-type interpolation, as applied to audio signals, results in less than optimum interpolation. I see its use in digital audio more as an effort to sell a commercially differentiating (but not better performing) feature to consumers. Just my own assessment.
 
I see its use in digital audio more as an effort to sell a commercially differentiating (but not better performing) feature to consumers. Just my own assessment.

Remember, Wadia is from the days of very little compute power and the dread of "pre-ringing". From the Stereophile review there were plenty of anomalies that might mask things further. The DSP only did 8x oversampling, and they used 4 DACs summed via a shift register as a linear interpolator up to 32x. Your guess is as good as mine as to what comes out.

Oversampling is performed by two AT&T Digital Signal Processing (DSP) chips which, together, have the computing power of 50 IBM PCs or 36 MIPS (Million Instructions Per Second).
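
Here is my guess at what that 4-DAC shift-register scheme does numerically (this is an assumption about the topology, not a verified schematic): summing four time-staggered zero-order-hold outputs is a 4-tap moving average, which ramps in four small steps between consecutive 8x samples, i.e. a staircase approximation of linear interpolation up to 32x. A toy Python sketch:

```python
import numpy as np

# 8x-oversampled input samples (made-up data; a real unit sees the DSP output)
x8 = np.array([0.0, 1.0, 0.5, -0.25, -1.0, 0.0])

# Each DAC acts as a zero-order hold of the 8x data, reclocked at 32x
hold32 = np.repeat(x8, 4)

# Four DACs driven with 0..3 sub-sample offsets of the 32x clock, then summed.
# Zero pre-padding stands in for "silence before the track starts".
padded = np.concatenate((np.zeros(3), hold32))
dacs = [padded[3 - k: 3 - k + hold32.size] for k in range(4)]

# The summed (here normalized) output ramps in four steps between consecutive
# 8x samples: a staircase approximation of linear interpolation to 32x
y32 = np.sum(dacs, axis=0) / 4.0
print(y32)
```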
 
Remember, Wadia is from the days of very little compute power and the dread of "pre-ringing"

Scott, yes, I remember. I was working at AT&T Microelectronics in Allentown, PA at that time (circa 1991). The DSP utilized by Wadia was manufactured at the fab which used to be there. Now it's been converted into a stadium for the Phillies' AAA baseball team, the Iron Pigs.

I wasn't as much into DIY audio back then, but I knew of Wadia, and thought it interesting that a relatively small consumer audio company was a direct customer.
 
Robert W(adia) Moses published the first paper about their "improved decoding computing" allegedly in 1987, and the first production units (2000/1000?) came out in 1988, AFAIR.

I hadn't done any serious listening back then, and as usual one is listening to a complete device, so it's difficult to trace back what the reason for the reportedly better sound quality could have been.
JA was surprised by the somewhat mediocre measured performance while really appreciating the sound quality.
 