XMOS-based Asynchronous USB to I2S interface

Two issues...

in other words: it's complicated.

1. GMR isolators on the I2S lines add jitter; this is fact, not speculation. As I recall, the added jitter for the best ones is something like 30-90 ps. Of course this will depend on how one measures, as measuring jitter is not straightforward.
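To put rough numbers on that 30-90 ps figure, the usual back-of-envelope for the jitter-limited SNR of a full-scale sine is SNR ≈ -20·log10(2π·f·σj). A quick Python sketch (my own illustration, not a measurement):

```python
import math

def jitter_snr_db(f_hz, sigma_j_s):
    """Jitter-limited SNR of a full-scale sine at f_hz with RMS clock
    jitter sigma_j_s, per the standard approximation
    SNR = -20*log10(2*pi*f*sigma_j)."""
    return -20.0 * math.log10(2.0 * math.pi * f_hz * sigma_j_s)

for ps in (30, 90):
    print(f"{ps} ps RMS at 20 kHz -> SNR limit ~{jitter_snr_db(20e3, ps * 1e-12):.0f} dB")
```

By that estimate, 30-90 ps corresponds to a jitter-limited SNR of roughly 108 down to 99 dB for a 20 kHz full-scale tone.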

2. Noise from the computer, transmitted over the USB cable, and noise from the XMOS processor itself can increase jitter at the DAC; this is why people want galvanic isolation from the computer. How much jitter can this cause? I have not seen or heard of anyone measuring it, and the amount will depend entirely on the computer setup, how much noise the computer system allows onto the USB bus, and how the USB lines are handled at their termination point on the USB interface (are they filtered?).

So, we see that adding isolation causes additional jitter of its own, but it will also reduce jitter by reducing the computer-borne noise. The answer to the question of whether to use isolation then comes down to whether the jitter removed outweighs the jitter added.

Additionally, no isolation is perfect. Isolation schemes allow some higher-frequency noise products to pass through (and computers have 400 MHz noise), so the question of whether or not to use isolators is not simple. Some commercial DACs with async USB do not use isolation, as their designers feel the isolation does more harm than good. Right now I am using a non-isolated USB interface with a relatively low-noise laptop and am getting great sound. Would the sound be better if it were isolated? I do not know. My understanding is that Lorien's board has both isolated and non-isolated I2S outputs, so one could try both and listen for differences.
 
I also believe that it would be important to determine the source of the noise transmission from the computer. Is it really coming over the data lines only? It seems more likely to me that the significant noise is coming over the USB power and ground lines.

It should be possible to lay out the PCB such that the USB data lines completely avoid vias and connect from the USB jack directly to the processor. If no other traces are near the USB data lines, then noise from that source should be minimized.

Power supply noise (whether coming directly from the USB power and ground lines or by nature of the switching circuitry inside the USB function causing current draw on the filter board supply) should be controllable by changing the quality of the power supply itself and the local filtering.

In other words, you have to know exactly how the noise is getting through if you want to eradicate it without undue tradeoffs (such as increased jitter).
 
Noise from the computer, transmitted over the USB cable, and noise from the XMOS processor itself can increase jitter at the DAC; this is why people want galvanic isolation from the computer. How much jitter can this cause? I have not seen or heard of anyone measuring it

So how do you know it's there? The only way of telling that it's occurring is by comparative measurement.
 
counter culture...

Not sure what you mean here. Please read closely; the following statement is carefully worded, and as such entirely rational:

"Noise, from the computer transmitted by the USB cable, and noise from the XMOS processor itself CAN increase jitter at the DAC-this is why people want galvanic isolation from the computer. How much jitter can this cause? I have not seen or heard of anyone measuring this..."

The word CAN does not equal DOES. I am pointing out that I have not heard of anyone measuring both this noise level and the resulting jitter level at the DAC chip (or on the I2S output from the interface). My point is that we cannot be sure what is going on until someone with serious RF gear and knowledge publishes measurements correlating the noise from the computer with the jitter level at the output of the interface. I do know that the RF-phobic designer Charles Hansen of Ayre uses optocouplers on the I2S lines between his USB receiver and DAC board in the QB-9 USB DAC, despite the fact that the optocouplers themselves will add some jitter.

Some additional thoughts: computer grounds are very noisy; I have this on the good authority of accomplished engineers who have measured them. The USB ground connects to the interface board, and V+ is used by most USB interfaces, at least for handshaking. How much damage does the computer-borne noise cause? That is a question I would like to see answered. In the meantime, I do try to reduce noise from the computer in any reasonable way I can, and I usually note subjective sound-quality improvements with computer tweaks that reduce computer activity.
 
I do know that the RF-phobic designer Charles Hansen of Ayre uses optocouplers on the I2S lines between his USB receiver and DAC board in the QB-9 USB DAC, despite the fact that the optocouplers themselves will add some jitter.

It's possible, therefore, that Charles Hansen, like me, considers common-mode induced noise more of a problem for audio quality than jitter.
 
If it's a 1 it should show itself as a 1 and if it's a null it should show itself as a null. Otherwise it's a bug...
You've only discussed one of the two variables; timing is also a variable. It's probably safe to assume that I2S rarely or never has a bug that flips a bit value. However, moving even a single sample in time due to jitter has exactly the same result in the analog domain as changing its PCM code. These discussions are almost entirely focused on clock timing and the reduction of jitter. Timing is not critical over USB, assuming that asynchronous delivery of the data is accomplished properly. However, I2S is probably the one place where the clock must be perfect, or as close to perfect as possible, for the best DAC performance.
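To make that equivalence concrete, here's a tiny numerical sketch (all values arbitrary, Python just for the arithmetic):

```python
import math

# Toy illustration: the DAC outputs the exactly correct sample value,
# but the conversion edge lands 1 ns late. The analog waveform at the
# ideal instant then carries the value belonging to a neighboring point
# in time -- indistinguishable from having changed the PCM code itself.
fs = 48_000   # sample rate, Hz
f  = 10_000   # test tone frequency, Hz
dt = 1e-9     # 1 ns timing error on this conversion instant
n  = 5        # an arbitrary sample index

on_time = math.sin(2 * math.pi * f * n / fs)
late    = math.sin(2 * math.pi * f * (n / fs - dt))

print(f"equivalent amplitude error: {on_time - late:.2e} of full scale")
# ~6e-5 of full scale -- an 'error' no bit-perfect check can ever see
```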

While we're on the subject (and I hope it's not distracting from the main thread), is it typical for an I2S source to be the master clock, or is it possible for an independent clock on the same board to be the master of both the XMOS FPGA and the I2S slave? In other words, could the DAC be the master clock so that the XMOS FPGA is a slave rather than the source of the timing?
 
While we're on the subject (and I hope it's not distracting from the main thread), is it typical for an I2S source to be the master clock, or is it possible for an independent clock on the same board to be the master of both the XMOS FPGA and the I2S slave? In other words, could the DAC be the master clock so that the XMOS FPGA is a slave rather than the source of the timing?

Yes, I2S is really defined as only 3 wires: BCK, DATA, WS. Oftentimes an MCLK (typically 256fs) is sent as well. In all cases the sender is the clock master.
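For anyone following along, here's a minimal sketch of how one stereo frame maps onto those three wires (simplified: the spec's one-BCK data delay after each WS transition is left out):

```python
def i2s_frame(left, right, bits=16):
    """Yield one (WS, DATA) bit pair per BCK cycle for a stereo frame,
    MSB first. In Philips I2S, WS low selects the left channel; the
    sender drives BCK, WS and DATA."""
    for ws, sample in ((0, left), (1, right)):
        for i in reversed(range(bits)):
            yield ws, (sample >> i) & 1

frame = list(i2s_frame(0x1234, 0xABCD))
print(len(frame), "BCK cycles per frame")   # 32 for 16-bit stereo
```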

There's obviously a need for another protocol where the receiver is the clock master; other than async USB (which is quite complex) I'm unaware of any standard one.
 
rsdio.

If timing alters the values of the stream, it's definitely a bug, or else a deliberate decision that it has no impact on the listener's experience, as the success of lossy formats would suggest...

Now, put it this way: if the sender sent a 1 and the receiver interpreted it as a 0, does the industry (we?) have a serious problem to deal with? Does it sound bad if a bit is misinterpreted here or there every second? Of course, bit perfection through all domains is possible asynchronously; your amp won't play before all the bits have arrived. Now, how would it know? Well, of course it could have an MD5 sum with every byte and so forth to assure true fidelity... Will it sound better?

Of course I share the goal of reducing the multitude of errors in my signal chain.

Brgds
 
If timing alters the values of the stream, it's definitely a bug.
It seems like you don't understand what I am saying. Timing does not alter the digital value. It doesn't even alter the analog value at the point in time at which the DAC converts the digital value to an analog value.

However, if you place the exactly correct analog value at a slightly incorrect point in time, the end result is indistinguishable from changing the analog value, and thus exactly the same as if the digital value had been changed.

I've proven this for myself. Beyond just reading about it in the textbooks, I have designed DAC circuits (operating at sample rates as high as 6 MHz or even 125 MHz), and it is clear that the noise floor increases when the clock timing is altered via jitter. This can be confirmed with an independent ADC to measure the quality of the analog waveform that was produced.
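For anyone who wants to see the effect without building hardware, a quick simulation reproduces the trend (a sketch only, not a substitute for the ADC measurement):

```python
import numpy as np

# Render the same 1 kHz sine on ideal vs. jittered conversion instants
# and compare the error power, which is what raises the noise floor.
fs, f, n = 192_000, 1_000, 1 << 16
rng = np.random.default_rng(0)
t_ideal = np.arange(n) / fs

for sigma in (10e-12, 100e-12, 1e-9):          # RMS jitter, seconds
    t = t_ideal + rng.normal(0.0, sigma, n)    # jittered sample instants
    err = np.sin(2 * np.pi * f * t) - np.sin(2 * np.pi * f * t_ideal)
    print(f"{sigma*1e12:5.0f} ps RMS -> error power {10*np.log10(np.mean(err**2)):6.1f} dBFS")
# Roughly -147, -127 and -107 dBFS: 20 dB worse per decade of jitter.
```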

You are correct that the I2S stream should never be altered, and if it is altered then something serious is wrong. However, even if your criterion is met, and the I2S bit stream is perfect, there is still plenty of room for 'errors' in the resulting continuous analog waveform due to clock timing. That's what people are talking about here.
 
Yes, I2S is really defined as only 3 wires: BCK, DATA, WS. Oftentimes an MCLK (typically 256fs) is sent as well. In all cases the sender is the clock master.
I guess I'm thinking outside the box a bit. I2S is clearly those 3 signals, but who's to say where each of them comes from? The XMOS FPGA is completely programmable, so it could easily be altered to be a slave to BCK. However, I suppose my ponderings are moot, since we're not likely to find a DAC chip that provides BCK as master.

That said, it still seems possible that a standalone DAC board could have a local oscillator that is separate from the DAC chip. This local oscillator would directly drive the BCK pin on the DAC, and also be provided over the I2S connector for the XMOS (or other) FPGA to slave to. Perhaps this is a really dumb idea, because it would not be a standard that interoperates with anything already out there in the I2S world. But it sure would have the potential to work quite well in those combinations where both the data source and DAC boards agreed on the clock source.

There's obviously a need for another protocol where the receiver is the clock master; other than async USB (which is quite complex) I'm unaware of any standard one.
In that realm - a standard bus - there is also FireWire audio. There are quite a few audio interfaces which are self-clocked, and use bidirectional communication with the media source (usually a computer) to pull audio sample data over FireWire with the DAC as the master clock. I'd love to design such a thing, whether it has I2S output or an on-board DAC, but the FireWire specifications are not freely available the way the USB specifications are. It's even rather difficult to find a complete list of which FireWire specifications are necessary to implement audio.
 
I guess I'm thinking outside the box a bit. I2S is clearly those 3 signals, but who's to say where each of them comes from?

Well, I believe I2S does say, but if we venture outside the box as you're doing, then sure, it makes more sense for longer distances if the DAC were the master. The problem then comes if we have multiple DACs, because then it's impossible to sync them without ASRCs. I think about designing digital XOs, so more than two channels is an issue for me :)

The XMOS FPGA is completely programmable, so it could easily be altered to be a slave to BCK. However, I suppose my ponderings are moot, since we're not likely to find a DAC chip that provides BCK as master.

XMOS isn't an FPGA as such, but yeah, I'm sure it can be programmed to be a slave. I've been thinking about programming a much simpler device than an XMOS to handle this kind of functionality, making a DAC into a clock master. The XMOS is a bit power-hungry for my liking. I shelved the idea, though, when I began to consider all the synchronization issues for multiple channels and how to handle all the possible sample rates.

That said, it still seems possible that a standalone DAC board could have a local oscillator that is separate from the DAC chip. This local oscillator would directly drive the BCK pin on the DAC, and also be provided over the I2S connector for the XMOS (or other) FPGA to slave to. Perhaps this is a really dumb idea, because it would not be a standard that interoperates with anything already out there in the I2S world. But it sure would have the potential to work quite well in those combinations where both the data source and DAC boards agreed on the clock source.

So long as one such board is enough, to hell with whether it's a standard or not. Do you think nobody will ever want more than one such board in a system?
 
Continuing on with my crazy ideas (and begging the pardon of anyone who sees this as off-topic), maybe there is a way to do this without completely violating the I2S standard.

Imagine a stand-alone DAC board with I2S input for the DAC and I2S output that is fed by a local, on-board master clock. A newly-designed USB to I2S interface could also have a pair of I2S I/O connectors along with the ability to configure the clock source.

In the ideal setup, the DAC PCB would have its on-board master clock wired directly to the DAC with the lowest possible jitter. This clock would also be buffered and sent back to the USB PCB via an I2S output (the DATA line would be empty: all 0s). Meanwhile, the USB PCB would be designed to slave to the incoming BCK and WS arriving at an I2S input port on the USB PCB (connected to the I2S output port on the DAC PCB). The XMOS FPGA would use the timing information coming from the DAC board to control the asynchronous USB data flow, and would also use it to control the timing of the I2S output that feeds audio sample data to the DAC board.
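The rate-matching half of that scheme might look something like the following sketch. Every name here is invented for illustration; real firmware would live in the XMOS toolchain, not Python:

```python
# Hypothetical control logic for the USB PCB when the DAC PCB owns the clock.

TARGET_FILL = 512      # sample frames we try to stay ahead of the DAC
buffer_fill = TARGET_FILL

def on_ws_edge_from_dac():
    """Called on each WS edge recovered from the DAC board's clock-only
    I2S output: the DAC has consumed one more frame."""
    global buffer_fill
    buffer_fill -= 1

def on_usb_packet(frames_received):
    """Called when the host delivers an isochronous data packet."""
    global buffer_fill
    buffer_fill += frames_received

def feedback_rate(nominal_fs):
    """Asynchronous USB feedback value: ask the host for slightly more or
    fewer frames per interval so the buffer hovers near TARGET_FILL,
    keeping the XMOS a little ahead of the DAC clock."""
    error = TARGET_FILL - buffer_fill
    return nominal_fs * (1.0 + 1e-4 * error)   # small proportional nudge
```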

Admittedly, there are a couple of weak spots in such a design. For one, the XMOS would have to be running slightly ahead of the clock so that propagation delays between boards and chips do not result in errors.

Another problem is that it would be tempting to have configuration switches so that this system would also be compatible with standard I2S links, but any such switches would probably introduce jitter and make the whole exercise moot. In this respect, I think such a design would have to be dedicated to having the DAC board be master clock always.

Personally, I'd much rather have total control over everything, and combine the USB or FW interface on the same board as the DAC chip, so that the DAC clock could be the master for everything. But I realize the advantages of having a generic standard so that one USB board can drive multiple DAC boards, or one DAC board can be driven by multiple data sources. I just wonder whether this DIY community has the wherewithal to come up with a 'standard' for DAC as master clock that could encourage a small community of interchangeable boards.

The only reason I'm mentioning these ideas here is that it seems like a small revision to the XMOS board that this thread refers to could be made to support such a clock design.

Thoughts?
 
I would say that FireWire is dead from the industry's point of view. No interest, no money.

I do understand what you are writing, rsdio, but to be able to hear the timing imperfection you need a constant error in your data chain. It can't show itself as something happening just here or there.

Finally, don't worry about violating the standards, as they cut two ways. The less-considered side is that a standard keeps you boxed in, so you can't perform better than your competitors.

Brgds
 
The problem then comes if we have multiple DACs, because then it's impossible to sync them without ASRCs. I think about designing digital XOs, so more than two channels is an issue for me :)
Thank you for specifically mentioning multichannel. I am also very interested in surround and biamping, or even both at the same time! In my mind, when I consider such things for myself, I envision a single board with everything local, and thus the master clock is still local to all DAC chips.

For a while, I had a 16-channel system in a public space that I was slowly building out. I had 4.2 surround working without biamping, but I was hoping to eventually reach 9.2 surround with as much biamping as I could fit into 16 channels (also limited by my budget for amplifier channels). But this was based on FireWire Audio with 8 locally clocked DACs and 8 externally clocked DACs (the latter for the less-critical channels like maybe the subs).

I've been thinking about programming a much simpler device than an XMOS to handle this kind of functionality, making a DAC into a clock master. The XMOS is a bit power-hungry for my liking.
I only mention the XMOS out of my own (misguided?) sense of politeness to the OP. There are many potential ways to solve this. In fact, if there is a better thread to discuss these topics, please point me in the right direction.

So long as one such board is enough, to hell with whether it's a standard or not. Do you think nobody will ever want more than one such board in a system?
Right. One board is always going to perform better. I am continually reminded of something from the white papers of Dan Lavry of Lavry Engineering: there is no external clock that can out-perform a properly designed internal clock.

I do sometimes hope that a small community can get costs down by designing bus-based interfaces (FW or USB) and standalone DAC boards with 'standard' interoperable interface protocols, but perhaps there are just too many quality tradeoffs in that approach.