Suggestions on nonlinear ADC input

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I'm looking at designing a nonlinear analogue input stage for an ADC, and another nonlinear analogue stage after the output of the DAC to compensate. I intend to eventually design a high-quality digital crossover/room-correction system. My reasoning for using nonlinear IO stages is this:

ADCs and DACs generally produce quantization noise and other artifacts whose effect, relative to the signal, grows as the signal gets quieter. At very high amplitude an ADC might have exceptional performance, but who cares? Most of the time an audio signal sits at moderate levels that might be several orders of magnitude lower, and at these quieter levels the noise and other artifacts have a much bigger effect, especially if an analogue volume control can be used to turn up the volume. With 16-bit precision for example, most of the signal is not represented very accurately: the most important values might span only +-2000 codes or even less, whereas a high-volume climax or transient (where accuracy is least important) can use the full range of 65536 levels and is represented much more accurately. This limits the flexibility of digital volume controls, even with 24-bit ADCs/DACs, because the best performance is generally just below the clipping level.
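To put rough numbers on that, here's a quick sketch (Python, quantizing an ideal sine in software; the bit depth and amplitudes are just illustrative) showing how the SNR of a plain linear quantizer falls with signal level:

```python
import numpy as np

def quantized_snr_db(amplitude, bits=16, n=100_000):
    """SNR of a sine at `amplitude` (1.0 = full scale) after linear quantization."""
    x = amplitude * np.sin(2 * np.pi * 0.001237 * np.arange(n))
    full_scale = 2 ** (bits - 1)
    q = np.round(x * full_scale) / full_scale   # ideal quantizer, no dither
    noise = q - x
    return 10 * np.log10(np.mean(x ** 2) / np.mean(noise ** 2))

print(quantized_snr_db(1.0))    # near full scale: roughly 98 dB
print(quantized_snr_db(0.01))   # 40 dB quieter: roughly 58 dB
```

Every 6 dB drop in level costs about one bit of effective resolution, which is exactly the problem a nonlinear front-end is meant to attack.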

My idea for a solution to this is to basically distort an incoming signal so that relatively quiet changes are represented more accurately than they would be normally. To be effective, the signal would have to be distorted the other way after the DAC, a sort of modern-day digital equivalent to the ol' RIAA filters. Does anyone have any ideas on how to implement the required analogue stages?

I'm also considering other optimizations like:
-Using dual/multiple ADCs and splitting the input frequencies with simple analogue active filters. This would reduce the average centre-offset for quiet signals where a relatively loud signal is also present, and possibly also allow a higher gain.
-Compression, but I don't have much hope for this technique.
-Some ideal filter where quiet signals go one way, and loud signals go the other way regardless of frequency.

CM:cool:
 
CeramicMan said:
My idea for a solution to this is to basically distort an incoming signal so that relatively quiet changes are represented more accurately than they would be normally. To be effective, the signal would have to be distorted the other way after the DAC, a sort of modern-day digital equivalent to the ol' RIAA filters. Does anyone have any ideas on how to implement the required analogue stages?

A RIAA filter varies with frequency but is still linear in amplitude; what you want is a circuit that is non-linear with amplitude, slightly different ;)

Originally posted by CeramicMan
I'm also considering other optimizations like:
-Using dual/multiple ADCs and splitting the input frequencies with simple analogue active filters. This would reduce the average centre-offset for quiet signals where a relatively loud signal is also present, and possibly also allow a higher gain.
-Compression, but I don't have much hope for this technique.
-Some ideal filter where quiet signals go one way, and loud signals go the other way regardless of frequency.

1. I don't think this method would give you any increased resolution. You would still have to account for full amplitude signals in both frequency bands.

2. This is what I had in mind when I first read your post. An analog AGC in front of the ADC is used in many cases. However, you would need to digitize the gain value and apply the same gain to the analog expander after the DAC.
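For what it's worth, the "digitize the gain value" idea can be caricatured per-sample as gain ranging (a real AGC would change gain slowly and the gain word would be updated far less often; the gain steps here are invented):

```python
GAIN_STEPS = [1, 4, 16, 64]  # hypothetical analogue gain settings

def encode(x):
    """Pick the highest gain that doesn't clip (standing in for the AGC)
    and record the gain index alongside the 16-bit sample word."""
    for i in reversed(range(len(GAIN_STEPS))):
        if abs(x) * GAIN_STEPS[i] < 1.0:
            return i, round(x * GAIN_STEPS[i] * 32767)
    return 0, round(max(-1.0, min(1.0, x)) * 32767)  # clip at full scale

def decode(i, sample):
    """Expander after the DAC: divide out the recorded gain."""
    return sample / 32767 / GAIN_STEPS[i]
```

A quiet sample like 0.001 gets digitized at 64x gain, so its quantization step is 64 times finer than it would be in a plain 16-bit system.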

3. Do you mean a comparator driving a cmos-switch? I don't quite see how this would work in practice.

A fourth solution would be to use a log amplifier in front of the ADC and an antilog amplifier after the DAC. This would make it similar to the a-law/u-law coding for ISDN.
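The log/antilog pair is essentially what u-law companding does. Here's a digital sketch of the two transfer curves (the analogue stages would realise the same functions with diode/transistor exponentials; MU=255 is the telephony value, not necessarily right for hi-fi):

```python
import numpy as np

MU = 255.0  # telephony mu-law constant; an audio design might choose differently

def compress(x):
    """Log-law compressor: quiet samples get a much larger share of the codes."""
    return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

def expand(y):
    """Antilog expander applied after the DAC."""
    return np.sign(y) * ((1.0 + MU) ** np.abs(y) - 1.0) / MU

x = 0.01                      # a sample at 1% of full scale
print(compress(x))            # about 0.23: it now uses ~23% of the range
print(expand(compress(x)))    # round trip recovers the original value
```

Note how a signal at 1% of full scale gets mapped to nearly a quarter of the converter's range before quantization.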

The problem with all of the above solutions is that the nonlinear circuits do not have a linear nonlinearity ;) The analogue compressor and expander will never match exactly, so there would be significant distortion and intermodulation problems.

What we all would want is floating-point ADCs and DACs. For more info on this, see here. :D
 
Re: Re: Suggestions on nonlinear ADC input

ojg said:
A RIAA filter varies with frequency but is still linear in amplitude; what you want is a circuit that is non-linear with amplitude, slightly different ;)
I know! The concepts are at least similar if only because they both try to optimise the use of a limited dynamic range.

A log amplifier sounds about right. On the output this would be compensated by an "anti-log" amplifier. There's little point in making the linearisation digital, because that would render the whole exercise useless: there would be no net improvement in the performance of either the ADC or the DAC.

It would still be OK if the whole system were slightly non-linear after the "anti-distortion": I could digitally calibrate the system so that the combined input-to-output transfer function is linear.

As far as I know, the old RIAA system was mainly a fix for insufficient dynamic range, and it made use of the fact that music has more energy in the bass. Assuming that music roughly follows a pink-noise spectrum, I could differentiate the input and integrate the output, or something similar. This would improve high-frequency noise performance and improve low-frequency dynamic range. It would of course sacrifice high-frequency dynamic range, but I'm hoping to make that up with the nonlinear gain. Does anyone have any ideas or links for log/antilog amplifier schematics, or other ideas they could share?
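The differentiate/integrate idea can be sketched digitally as a first-order pre-emphasis/de-emphasis pair (the coefficient A is a guess; in the real system these would be analogue networks around the converters, not DSP):

```python
import numpy as np

A = 0.95  # hypothetical pre-emphasis coefficient; sets the corner frequency

def pre_emphasis(x):
    """Approximate differentiator: boosts treble before the ADC."""
    y = np.empty_like(x)
    y[0] = x[0]
    y[1:] = x[1:] - A * x[:-1]
    return y

def de_emphasis(y):
    """Matching leaky integrator after the DAC undoes the spectral tilt."""
    x = np.empty_like(y)
    x[0] = y[0]
    for n in range(1, len(y)):
        x[n] = y[n] + A * x[n - 1]
    return x

sig = np.sin(np.arange(200) * 0.1)
recovered = de_emphasis(pre_emphasis(sig))   # round trip is transparent
```

Since the de-emphasis filter is the exact inverse of the pre-emphasis filter, the cascade is transparent while the quantizer in between sees a spectrally flattened signal.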

CM
 