The Objective2 (O2) Headphone Amp DIY Project

Yes - two opamps share the one resistor.

Now where did I put my "Duh!" cap.... For some reason I had it in my head that the offset would be determined by whichever of the two op amps had the greater input offset current, but I believe you're correct: the currents sum between the pair. Thanks for that (and you accuse me of not listening ;) ).
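Just to put rough numbers on it (these values are purely illustrative, not the O2's actual parts), the worst-case offset contribution from that shared resistor is simply the two input currents summed and multiplied by the resistance:

```python
# Purely illustrative values -- not the O2's actual components.
# Two op amp sections share one resistor, so their input currents sum in it.
i_b1 = 40e-9       # assumed worst-case input current of op amp A, in amps
i_b2 = 40e-9       # assumed worst-case input current of op amp B, in amps
r_shared = 100e3   # assumed shared resistor value, in ohms

v_offset = (i_b1 + i_b2) * r_shared
print(f"Worst-case DC offset from the shared resistor: {v_offset * 1000:.1f} mV")
```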

EDIT: Looking back through my articles, I could only find where I listed it as being a desirable requirement to have the worst case offset under 20mV, not where I claimed that was the actual worst case offset of the O2. If you're aware of some other reference to 20mV, please let me know.
 
Some have suggested a digitally controlled analog volume control, but the only decent chip I know of that won't totally destroy the DAC's performance is the flagship part from Cirrus, which is really expensive and would also require a microcontroller.

For those that want a fancy volume control, how about this one: The Tentlabs volume (and input) control
 
@Jelle, that's my (and Doug Self's) preferred way of doing a high-end volume control--especially if you use through-hole or really large, high-quality thin film SMT resistors. The only negatives are that some don't like the relay racket when you change volume settings, 64 steps is not enough for some, and the 200 Euro cost far exceeds the price of the rest of the entire amplifier--even in assembled form.

EDIT: And, FWIW, I've read that the eBay/Asian-sourced DACT attenuators often use really small, cheap SMT resistors (regardless of what they may claim, it's impossible to know by looking at the parts except that they're usually tiny 0805 or even 0603 resistors) that have significant voltage coefficients--i.e. their resistance changes with voltage. Such non-linear resistors directly create distortion that can be far greater than the O2's inherent distortion. Doug Self has documented this distortion and it can be significant--especially at higher operating voltages, and the pot in the O2 can have as much as 20 volts of swing on it. I'm fairly confident the Alps pot in the O2 will have lower distortion than the eBay DACTs.
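If anyone wants a feel for how a voltage coefficient turns into distortion, here's a rough simulation sketch. The divider values, drive level, and the 50 ppm/V coefficient are all just assumptions for illustration--not measurements of any real attenuator:

```python
import numpy as np

# Rough sketch: a 10k/10k divider whose top resistor changes with the voltage
# across it (a simple voltage-coefficient model). All values are assumed.
fs = 96000
t = np.arange(fs) / fs
vin = 10.0 * np.sin(2 * np.pi * 1000 * t)     # 10 V peak across the attenuator

r_bottom = 10e3
r_top_nominal = 10e3
vc = 50e-6                                    # assumed 50 ppm/V voltage coefficient

# The top resistor's value depends on its own voltage drop, so iterate to a solution
vout = vin / 2.0
for _ in range(20):
    r_top = r_top_nominal * (1 + vc * np.abs(vin - vout))
    vout = vin * r_bottom / (r_top + r_bottom)

# Crude THD estimate from the harmonic bins (1 Hz bin spacing with this length)
spec = np.abs(np.fft.rfft(vout * np.hanning(len(vout))))
fundamental = spec[1000]
harmonics = np.sqrt(sum(spec[k * 1000] ** 2 for k in range(2, 6)))
print(f"Approximate THD: {100 * harmonics / fundamental:.4f} %")
```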
 
Ah, it's much nicer over here! Admittedly I'm biased - I just copped a 7-day ban at Head-Fi for suggesting (well, stating) that Cheapskate had filled this thread with crap, was wrong about almost everything on every possible level, and was incoherent in his argument. Perhaps slightly OTT in hindsight - oh well.

Back on topic and @NwAvGuy/RocketScientist, I was interested to read the ODAC announcement and in light of that must ask: what exactly do you think about jitter?
From what I've read, the suggestion was that it's a bit of a red herring: jitter problems are generally apparent in standard measurements such as THD+N, and thus measuring it in "isolation" (with J-Tests and the like), whilst OK for product development when you want to narrow down problems, is not really necessary when evaluating complete products. The main article I read on this was http://www.theaudiocritic.com/back_issues/The_Audio_Critic_21_r.pdf , but I'm sure I don't need to point out to you the exact studies on jitter audibility that concluded you could get away with quite a lot of it. The consensus from seemingly reliable sources appears to be that jitter is more something for audiophiles to fuss over than anything approaching a real problem.

That said, seeing that you know considerably more about this than me, I was intrigued to find that your stance on jitter reads as vaguely conservative from an audiophile perspective, at least on your blog.

So what exactly do you think on the matter?
 
If I had my way, I'd have a 128-step autoformer as the volume control, but the feasibility of such a thing is just not in line with the rest of this project in many ways: cost, size, whether it even exists in the first place, etc.
 
@Willakan, I wrote an article on Jitter some time ago that shares many of my thoughts if you haven't read it already. But I should probably update it as I keep learning more about the topic--especially now that I've been involved in DAC development.

Perhaps most significant, I attended an AES presentation on jitter where music tracks were played with varying degrees of jitter. Even without the added sensitivity of a proper ABX test (the demonstrator was coordinating everything and there were gaps between each track) most in the audience could detect the jitter at moderately high levels. And when it was very high, my grandmother in the next room probably could have heard it. If I remember correctly, I think he was playing solo piano.

So there's no question that, at some level, jitter is audible. The problem is correlating what's audible with, say, Julian Dunn's J-Test--arguably the most viable and accepted test for jitter that works with most gear (the Miller Jitter Analyzer, for example, is based on similar principles). I can't make a definitive correlation with any certainty and I'm not aware of anyone else having done so either.

Given the lack of correlation, it seems prudent to minimize jitter when it's reasonable (and ideally inexpensive) to do so. Often that just means proper implementation of even fairly modest parts including paying attention to the PCB layout for cross contamination, digital signal degradation, ground currents, etc. Certainly some take it to arguably unnecessary extremes.

Perhaps ironically, in the DIY and "boutique" DAC arena, many of those taking extra measures (such as async USB, ASRC, excessive isolation, etc.) to reduce jitter rarely seem to properly test the end result to know if they got it right. Jitter is a lot like other forms of distortion: it can have many sources and they're not all obvious or intuitive.

I also don't agree with the argument that jitter shows up properly in simple THD measurements. Dunn's AES paper on the J-Test specifically talks about the requirement of toggling the least significant bit as part of the test. I can attest that that one little difference makes a big difference on some DACs compared to just looking at a plain 11,025 Hz sine wave.
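For anyone curious what that LSB toggling looks like in practice, here's a minimal sketch of a 16 bit, 44.1 kHz J-Test style stimulus as I understand Dunn's description--an Fs/4 tone at about -3 dBFS with the least significant bit toggled at roughly Fs/192. Treat the exact levels and rates as my assumptions; the AES paper is the real reference:

```python
import numpy as np

fs = 44100
n = np.arange(fs * 2)                        # two seconds of samples

# Fs/4 tone (11,025 Hz) at roughly -3 dBFS, quantized to 16 bits
tone = np.round(32767 * 0.5 * np.sqrt(2) * np.sin(2 * np.pi * 0.25 * n))

# Square wave toggling the least significant bit at Fs/192 (about 230 Hz here)
lsb = (n // 96) % 2                          # 0/1, period of 192 samples

jtest = np.clip(tone + lsb, -32768, 32767).astype(np.int16)
```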

Jitter testing can be thought of much like road testing a new car. If you drive it only on perfectly smooth, brand new roads, you may conclude it rides very smoothly, has a nearly ideal suspension, etc. That's the normal sine wave test. But it's not until you take it on a beat-up, potholed road that you can really appreciate how well the car deals with more adverse conditions--that's the J-Test with the bit toggling.

If you feed a relatively low jitter source into a DAC, you're likely going to get relatively good performance. But if you feed it a signal from a long cable, a noisy source, or anything else with high jitter potential, you start to notice much bigger differences between DACs. That's the idea behind the J-Test--it stresses the DAC more to evaluate its performance under more adverse conditions. It's a lot like test driving a car on a rough road.

So while I can't tell you a precise objective measure of when jitter is "low enough", I can still make an argument for trying to reduce it to reasonable levels--especially when that can be done for free or at a very modest cost. The sort of measures I'm talking about, unlike say improperly implemented async USB or ASRC, cannot cause any harm.

I also plan to be doing more blind DAC ABX testing in the future. And if audible differences are detected, it will be very interesting to compare all the measurements of both sources perceived as different.
 
Thanks for the response. I have indeed read the article - I was struck by how it appeared less forthright and allowed for more uncertainty than your other articles, hence the question!

I would agree that THD+N is certainly not adequate by itself: the article I linked, from an engineer at Analog, just said that "Traditional THD+N versus frequency tests and FFT spectrum plots for input signals at various frequencies are enough to cover the effects caused by jitter." Obviously, this is from the perspective of a final evaluation rather than invalidating investigations of jitter during the design process. The general gist appeared to be that whilst jitter was certainly not irrelevant, the additional attention it receives over, say, opamp distortion doesn't seem objectively justified.

I wasn't of the opinion that jitter is inaudible, but where, very roughly (within an order of magnitude if need be), do you reckon the threshold of audibility lies? From what I've seen from Benjamin and Gannon, we're talking tens of nanoseconds of jitter to be audible - whilst another paper found that only much higher amounts of jitter (hundreds of nanoseconds) were audible (EDIT: this one: http://www.jstage.jst.go.jp/article/ast/26/1/50/_pdf although the methodology is in question to some extent - for example, they let the listeners use their own gear).

Can we agree that the sort of gibbering over hundreds of picoseconds of jitter in Stereophile is a total waste of time :D?
 
I think jitter ought to be indicated as a delta of measured variance against fixed/known peak-to-peak/trough-to-trough spacing (i.e. where each lands on the time scale). My preference would be to feed an overlap of dissonant frequencies (26 Hz, 490 Hz, 2.2 kHz, 9.13 kHz, 14.5 kHz) in a waveform (which gives us defined spacing) and then measure against the actual output result. "Ethan's feed" would be a 30-second sample and a single output figure of average delta referencing this test, which would indicate the frequency blend and duration.

Even a simple sine wave would be a good start. That would give us a variance that can be shown as a delta at a given frequency: the higher the frequency, the higher the delta. It would be easy on my mind. Also, the dissonant frequencies would repeatedly produce peak-to-peak spreads in the supersonic range, which would be interesting to see as a brutal test of a device's performance. I could envision output figures like Δj = 0.0023% @ 10 kHz. Then again, I don't think this would be particularly relevant, because I'm under the impression that jitter is a fixed timing error without regard to frequency. Let me see if I can come up with a better system.

We audio people like simple scales referenced to a frequency. Jitter should also be presented in an easy format.

The problem is: where do you get a waveform analysis such as I described, where the time deviation is marked and compared? Anybody know?

I've got it! The delta could also be similar to how the color guys determine delta-E, a distance on the CIELAB color chart. In this case, it would be a cumulative shift (of absolute values) per second. It might also be good to see what the maximum spread is. I can envision output figures like Δj = 13 µs/s, ±0.6 µs. I think multiple frequencies would still be required in the test to really show what's going on, but I may be wrong about that.
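Something like this is the kind of analysis I'm picturing--just a rough sketch for a single sine, with a made-up timing wobble thrown in so there's something to measure:

```python
import numpy as np

def rising_crossing_times(x, fs):
    """Rising zero-crossing times (seconds) via linear interpolation."""
    idx = np.where((x[:-1] < 0) & (x[1:] >= 0))[0]
    frac = -x[idx] / (x[idx + 1] - x[idx])
    return (idx + frac) / fs

# Pretend capture: a 1 kHz sine with a made-up low-rate timing wobble
fs, f0 = 192000, 1000.0
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * f0 * t + 0.01 * np.sin(2 * np.pi * 7 * t))

tc = rising_crossing_times(x, fs)
ideal = tc[0] + np.arange(len(tc)) / f0                   # perfectly spaced crossings
delta = tc - ideal                                        # timing deviation per crossing

dj = np.sum(np.abs(np.diff(delta))) / (tc[-1] - tc[0])    # cumulative shift per second
spread = delta.max() - delta.min()                        # maximum spread
print(f"dj = {dj * 1e6:.3f} us/s, spread = {spread * 1e6:.3f} us")
```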
 
Informative jitter-related posts were made while I was composing mine. Also, the ± in my post needs to be something else because it looks too much like a margin of error.

I would assume that if we had a percentage deviation scale we'd probably want it below the same thresholds we set for distortion & noise, such as below 0.05% and really aiming for below 0.003%. Perhaps this could be added to THD+N and give a result called CD+N (cumulative distortion + noise).

OK, if you haven't figured it out by now, I'm under the impression that there's no standard established and I'm trying to form one. If there is something in place, let me know. I can't find one that's solid enough to publish, but then my search skills aren't the best.
 
Hmm, I actually don't like your idea. Jitter is simply a word (and an overused one) used to describe a number of related effects that conspire to disrupt how the signal frequency and phase are related (that was hard, my apologies). I don't think it's apt to try to reduce it to a percentage, and I don't think it's useful to think of it as fixed.
 
Hmm, I actually don't like your idea. Jitter is simply a word (and an overused one) used to describe a number of related effects that conspire to disrupt phase. I don't think it's apt to try to reduce it to a percentage, and I don't think it's useful to think of it as fixed.

Isn't it really just like the Doppler effect in end result, with unpredictable phasing and repetition?
 
Absolute jitter, in picoseconds (or nanoseconds), of a digital signal is an objective measure. But it's missing the spectral/frequency component. And, perhaps worse, it's only possible on digital domain signals--it's typically only measured on S/PDIF and AES signals. What about USB? What about feeding a jitter-laced S/PDIF signal to different DACs? What about all the playback-only devices like iPods, network media players, AirPlay, etc.? It's essentially impossible to accurately make absolute jitter measurements on all those devices, yet they can all have jitter issues.

So, IMHO, you're best off looking at jitter's effect on the analog audio performance of a device. And in that domain, as ethanolson suggests, there's no good way I'm aware of to distill it down to a single objective number. Miller tried to objectify it a bit more with some success, but it's still a subjective judgement to look at a J-Test (or Miller) result and try to judge whether it could be problematic.

It's important to understand that the distortion produced by a linear amplifier doesn't have a frequency modulation component in it, but jitter does. Jitter is much closer to the old wow and flutter measurements of turntables and analog tape recorders. Because it's a different kind of distortion, it's not necessarily safe to apply the same guidelines we do with, say, THD.

For example, if the sum of THD+N within the audio spectrum is 80 dB or more below whatever you're listening to, odds are it's effectively inaudible. But you can't necessarily say the same thing about jitter sidebands in a J-Test. And what about the "spread" created by very low frequency jitter? Because it's a different kind of distortion, it's perceived differently.
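To illustrate the frequency modulation point, here's a toy simulation with assumed numbers (not a measurement of any real DAC): sinusoidal sampling-clock jitter on a pure tone produces sidebands at the tone frequency plus and minus the jitter frequency, which is a very different animal than harmonic distortion:

```python
import numpy as np

# Toy example: a 10 kHz tone sampled with 2 ns peak sinusoidal clock jitter at 1 kHz
fs, f0 = 48000, 10000.0
fj, tj = 1000.0, 2e-9

n = np.arange(1 << 16)
t_ideal = n / fs
t_actual = t_ideal + tj * np.sin(2 * np.pi * fj * t_ideal)   # jittered sample instants

x = np.sin(2 * np.pi * f0 * t_actual)                        # tone sampled at the wrong times

spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
spec_db = 20 * np.log10(spec / spec.max() + 1e-20)
freqs = np.fft.rfftfreq(len(x), 1 / fs)

for f in (f0 - fj, f0 + fj):                                 # sidebands at f0 +/- fj
    k = np.argmin(np.abs(freqs - f))
    print(f"{f:.0f} Hz sideband: {spec_db[k]:.1f} dB relative to the tone")
```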

@ethanolson, if I'm understanding you correctly, the dScope already does some of those measurements. It has a substantially more accurate and stable time base than most audio gear and it can measure both the digital signal (i.e. S/PDIF) for time variations, and also analog audio signals. The frequency resolution of the dScope is spec'd at 0.005 Hz. It can, over time, track the peak frequency deviation from some center value. But the frequency at which the signal is modulated is also important.
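For what it's worth, here's one crude way to build that kind of deviation-versus-time record from a captured tone--this is just a sketch with made-up numbers and says nothing about how the dScope actually does it internally:

```python
import numpy as np
from scipy.signal import hilbert

# Crude sketch: instantaneous frequency of a captured tone via the analytic signal
fs, f0 = 96000, 11025.0
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * f0 * t + 0.02 * np.sin(2 * np.pi * 4 * t))   # made-up slow wobble

phase = np.unwrap(np.angle(hilbert(x)))
f_inst = np.diff(phase) * fs / (2 * np.pi)        # instantaneous frequency in Hz
f_inst = f_inst[1000:-1000]                       # trim edge artifacts

print(f"Peak deviation from {f0:.0f} Hz center: {np.max(np.abs(f_inst - f0)):.4f} Hz")
```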

Ultimately, when you compare the two J-Test results below, I can tell you which one I'd rather be listening to:

[J-Test jitter spectrum: Benchmark DAC1 Pre, USB input, 44 kHz 16 bit, -3 dBFS]

[J-Test jitter spectrum: Intel IDT 92HD73E onboard audio, G45 motherboard]
 
Hmm, I shall look around for more papers on jitter audibility and see what comes up. Even then, as you say, the varying nature of jitter likely makes any simple judgement of audibility difficult to make.

EDIT: @sofaspud: I suspect that thread will probably come once the details of the design are finalised.
 
@qusp, wouldn't manufacturers always test at more ideal and controlled settings, and we'd all know it? Think about why THD+N is always at 1kHz. If it were at 10kHz (or worse at 20kHz @ 2V), it wouldn't sell much. Also, those characteristics can be thrown off by the environments we introduce our stuff to.

@sofaspud, yeah, though I don't know how much time I'd spend there. Maybe that would seal the deal for some.

@RS, I'm beginning to wonder if jitter needs to be plotted in 3D. I also think it's interesting that the J-Test yields so much more noise and artifact info than just the actual jitter. Is there a way to measure an analog signal's peaks on a timescale? If I had a dataset of the time peaks of a 10kHz sinewave from one channel while J-Test was running on the other channel, then we'd have some pretty awesome information, wouldn't we?!

@Willakan, we'll get it sorted. For now, I believe we have to have a standard that people understand, even if imperfect (as many of our specs already are). What I'd like to avoid is adopting one that turns out to be invalid, with objective proof of that fact. Although even such a stumble would help us along the quest.
 