Open-source USB interface: Audio Widget

@rsdio, isn't it the USB hardware in the PC that determines the timing of the USB data?
I don't know if that's true for all USB hardware. In other words, it's entirely possible that thread priorities and other software issues in the operating system can introduce minute timing variations into the USB data.

My point is that with asynchronous UAC, you don't have to worry about that. Minute timing variations may exist from one frame to the next, but it's not really going to affect the analog signal coming out of a self-clocked DAC which is its own master.
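As an aside, the mechanism that lets an async DAC be its own master is the feedback endpoint: the device reports its actual consumption rate and the host adjusts packet sizes to match. A minimal sketch of encoding that value, assuming the full-speed 10.14 fixed-point samples-per-frame format from the USB 2.0 spec (illustrative only, not from any particular firmware):

```python
def encode_feedback_10_14(samples_per_frame: float) -> bytes:
    """Encode an async feedback value as 10.14 fixed point (3 bytes,
    little-endian), as used on full-speed USB. The device reports its
    measured consumption rate in samples per 1 ms frame; the host then
    adjusts how many samples it ships per frame so the device's buffer
    neither underruns nor overruns."""
    fixed = round(samples_per_frame * (1 << 14))  # shift into 10.14 format
    return fixed.to_bytes(3, "little")

# 44.1 kHz over 1 ms frames -> a nominal 44.1 samples/frame
print(encode_feedback_10_14(44.1).hex())
```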

With synchronous or adaptive UAC you just might end up tearing your hair out trying to get the USB drivers to perform their best so that the audio sounds a little better.

Granted, I've only designed USB Devices, not USB Hosts, so I can't give examples of specific chips that handle some things in software. I'm just saying that I doubt that 100% of USB Host controllers are entirely clocked in hardware, and even if they are, the operating system can still fall down when expected to get things right every millisecond in real time.
 
@rsdio, the USB timing specs are very tight. I don't believe any PC can use software to manage the extremely tight timing of a 480 Mbps USB port. You're not aware of such a chip because they don't exist. Just like ethernet, it's all done in dedicated hardware. There's no way for the player software, or the driver, to predictably influence the timing. None.

The only way the driver or player can fall down on the job is to let the hardware buffer run empty or overflow it. And that doesn't cause the "air", "depth" or "detail" in the music to change. It causes a clip, pop, glitch, or blatant dropout. It also shows up on a USB line analyzer. It's black and white. Either you maintain the buffer or you don't. Most things in a PC happen at 16 ms intervals (the classic timer tick).
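To illustrate the black-and-white nature of it, here's a toy simulation (the numbers are made up, not from any real driver): the hardware drains the buffer steadily while software refills it on a coarse timer tick. Either the refill keeps up or it doesn't:

```python
def simulate(buffer_frames, refill_period_ms, frames_per_refill,
             consume_per_ms, duration_ms):
    """Toy model: hardware drains the buffer every millisecond while
    software refills it on a coarser timer tick. Returns the millisecond
    of the first underrun, or None if the buffer was always maintained."""
    level = buffer_frames  # start with a full buffer
    for ms in range(duration_ms):
        if ms % refill_period_ms == 0:
            level = min(buffer_frames, level + frames_per_refill)
        level -= consume_per_ms
        if level < 0:
            return ms  # underrun: audible glitch, visible on an analyzer
    return None

# Refilling 16 frames every 16 ms keeps up with a 1 frame/ms drain:
print(simulate(64, 16, 16, 1, 1000))
# Refilling only 15 frames every 16 ms eventually underruns:
print(simulate(64, 16, 15, 1, 1000))
```

There's no middle ground where the buffer is "sort of" maintained and the sound gets subtly worse; it either glitches or it doesn't.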

I agree with your comments about async removing the worry. My Benchmark DAC1 is an async device and that's one of the reasons I bought it. But everyone should be careful about inventing (or promoting) mythical reasons why async is better.
 
My Benchmark DAC1 is an async device and that's one of the reasons I bought it.

Well I've heard they claim it's async but I'm not so sure. After all, it has an ASRC in there and I can't fathom why that would be needed if it were true UAC2 async. Yeah I know that's used to upsample to 110 kHz to optimize the DAC chip, but is that the only reason? So call me a skeptic. Do you have any real engineering info available for its implementation (as opposed to marketing fluff about jitter immunity and such)?
 
You seem to just like to stir up trouble abraxalito. The DAC1 has been reviewed by a dozen or so reviewers, there are white papers on the design, full Audio Precision test results in the manuals, you can look up how the chips work, and it's not UAC2 but UAC1 using technology licensed from Centrance. I've talked with the guy who designed the DAC1 John Siau and an engineer at Centrance. John's a great guy and really knows what he's doing. The DAC1 is probably one of the best documented audiophile DACs on the planet with remarkably little fake hype.
 
You seem to just like to stir up trouble abraxalito.

Well I enjoy stirring, that's for sure - whether it's 'trouble' or not is up to you :D

The DAC1 has been reviewed by a dozen or so reviewers, there are white papers on the design, full Audio Precision test results in the manuals, you can look up how the chips work, and it's not UAC2 but UAC1 using technology licensed from Centrance.

OK, that's convincing enough. Still, I'm curious as to why they'd use an ASRC just for upsampling. Did John Siau tell you? Does he disagree with Bruno Putzeys on the use of ASRCs? I believe Bruno says it's a bad idea to use an async device for something purely synchronous because the frequency estimator is one of the weakest links in an ASRC.

The DAC1 is probably the best documented audiophile DAC on the planet with remarkably little fake hype.

Odd that you consider it 'audiophile' - more than one audiophile I've spoken to doesn't rate its sound at all. It's a professional product, designed (and marketed) by the numbers, and for that market it's wonderful. It's also quite fairly priced, quite unlike a whole lot of 'audiophile' stuff.

<edit> Oh, I just remembered something else I read about the ASRC in the DAC1. On a forum somewhere, someone (either Elias or John) cautioned against substituting the ASRC, saying its performance was crucial to the jitter rejection. Why would that be if it's only doing an essentially synchronous form of upsampling?
 
@abraxalito, why don't you direct your questions about the DAC1 to Benchmark? I'm sure they'll be happy to answer them, and that way we don't take this thread further OT. But it is a proprietary design, not an open-source design. The Centrance technology, for example, is only available under non-disclosure. And I don't blame them for making it harder for China Inc to clone their hard work.
 
@abraxalito, why don't you direct your questions about the DAC1 to Benchmark? I'm sure they'll be happy to answer them

I'm not - why would they? I'm never likely to buy one and I'd only use what they told me to stir things up some more. :p

But it is a proprietary design, not an open source design. The Centrance technology, for example, is only available under non-disclosure.

Indeed - more reasons for them not to talk to me.

And I don't blame them for making it harder for China Inc to clone their hard work.

I saw a Head-Fi thread where there was a review of a clone. Looked the same externally, no idea what they did with the insides. :)
 
Technically yes, but I prefer to sell complete kits. It's easier to design in the module when you have a known working system for it. Plus I don't have loose modules in stock, so selling one on its own currently means throwing away a casing and an analog board.

But for new batches I will definitely start making loose USB-I2S modules.

Børge

Is a kit available to buy for the USB-I2S module?
 
I saw the $168 price tag for the module alone. As you see from the link in my signature, the complete AB-1.1 kit is $138 with shipping to Italy.

I can sell you a stand-alone module for $90+$18 shipping. They will become cheaper in new batches. Since you won't have an AB-1.1 to put it into, you'll have to figure out the IO functionality yourself. (That's still a doable task; all IO is documented on my site.)

Please note that the clocks are on the analog board, not on the USB-I2S module. I believe it is the opposite way in the link you sent.

Børge
 
I don't believe any PC can use software to manage the extremely tight timing of a 480 Mbps USB port. You're not aware of such a chip because they don't exist.
You "believe" or you "know"? Have you implemented primary operating system USB Host drivers? Do you have detailed hardware specifications for any chip?

My comments are based on my experience developing low-level firmware for actual USB Device chips in multiple shipping commercial products. Every USB chip is different in the minute details of where it gets the bits and how and when. There are many ways for minor variations to occur. Some chips have practically no hardware support. Most chips have at least a reasonable amount of hardware support. But no chip has hardware support so complete that timing variations are impossible.

The whole point of the USB specification is to isolate each type of communication packet so that the timing requirements are as lax as possible and still allow reliable transfers.

There's no way for the player software, or the driver, to predictably influence the timing. None.
What about unpredictably? Not all harm is intentional or predictable.
 
You "believe" or you "know"? Have you implemented primary operating system USB Host drivers? Do you have detailed hardware specifications for any chip? <snip>

What about unpredictably? Not all harm is intentional or predictable.

I'm an EE and have designed hardware my entire career, including high-speed digital hardware, PCI cards, etc. I have been involved with USB driver development, but not specifically USB audio drivers. But that's hardly a requirement for understanding this issue.

I know you cannot control the timing of a 0.5 GHz (480 Mbps) data communication clock from software. That's a clock period of about 2 nanoseconds. Are you honestly trying to suggest the CPU in the PC is running out to the USB port every 2 ns to send another bit of data?

USB Host Controllers have their own crystals or other dedicated oscillators. That's what determines the timing of the data sent over the USB interface--not PC software.

USB uses only three fixed bit rates for the data: 1.5 Mbps, 12 Mbps, and 480 Mbps. It's not some variable, dynamic clock controlled by software. It's one of those three speeds, and the specification requires that at a given speed (especially 480 Mbps) the clock be extremely accurate.

The USB 2.0 spec is 480 Mbps +/- 500 ppm (parts per million) including ALL sources of error. You're trying to argue a PC can achieve that accuracy under software control? I'm sorry, but that's absurd.
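Just to make those numbers concrete, here's the spec arithmetic spelled out:

```python
bit_rate = 480e6         # USB 2.0 high speed, bits per second
tolerance_ppm = 500      # spec limit, including ALL error sources

max_deviation_hz = bit_rate * tolerance_ppm / 1e6   # allowed rate error
bit_period_ns = 1e9 / bit_rate                      # one bit time
period_tolerance_ps = bit_period_ns * 1000 * tolerance_ppm / 1e6

print(f"allowed rate deviation: {max_deviation_hz:.0f} bits/s")
print(f"bit period: {bit_period_ns:.4f} ns, "
      f"period tolerance: ~{period_tolerance_ps:.2f} ps")
```

A period tolerance on the order of a picosecond is simply not something a general-purpose CPU and OS can hit; it has to come from a crystal.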

This document defines the requirements for the crystal that feeds the PHY (the part of the USB host controller that controls the bit timing over the USB bus--the timing the USB DAC sees):

www.ti.com/lit/an/slla122/slla122.pdf

For a block diagram of a typical PC USB Host Controller see:

VIA Vectro VT6212 - 4-port USB 2.0 Host Controller - VIA Technologies, Inc.

If you can find a company that doesn't require registration and/or NDAs for their full USB controller user manuals, you'll find what I've already described. The software sends the data to a buffer and issues a command for the USB Host Controller to send it out. Beyond selecting 1.5, 12 or 480 Mbit, the hardware PHY controls the timing.

As for your other point, while someone can try to fabricate an argument that other "activity" in the PC may affect the jitter of the USB PHY hardware (a questionable argument), it's impossible to do so intentionally in any controlled way. The very nature of PCs is that they're heavily multi-tasking and, even if your music player software is the only user application running, there are dozens of system applications still doing things behind the scenes in unpredictable ways. All of the system processes would have similar effects on the USB PHY timing jitter and cannot be controlled by the player software.

To put this another way, even if by some miracle PurePlayer induced less PHY jitter on a given PC under specific circumstances, Windows would only have to run a different background task and everything would change.
 
I should add that the USB frame rate is also determined by the same self-contained, self-clocked USB host controller hardware. The "host controller" function in these chips is a fairly complex application-specific microprocessor that relieves the PC from having to deal with most of the hassles of USB. Otherwise it would be extremely difficult for the PC to maintain USB protocol and timing standards--especially for USB 2.0.

USB DACs that run in adaptive mode (I'm not aware of many synchronous DACs still around) use the USB timing, generated by the hardware host controller in the PC, to adjust a PLL that generates the digital audio clock. As such, the "quality" of that clock can be at the mercy of the USB hardware in the PC. That's the main argument for asynchronous operation. But the player software has no way to reliably influence the quality of the USB timing.
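To sketch what adaptive mode looks like from the device side: 44.1 kHz doesn't divide evenly into 1 ms frames, so the host alternates packet sizes and the DAC's PLL has to average them back into a clean conversion clock. A toy model (integer math, illustrative only):

```python
def adaptive_packet_sizes(sample_rate_hz, n_frames):
    """Whole samples the host sends in each 1 ms frame, using an
    integer accumulator so no samples are lost over time. The DAC's
    PLL smooths this uneven stream back into a steady audio clock."""
    sizes, acc = [], 0
    for _ in range(n_frames):
        acc += sample_rate_hz   # accumulate samples*1000
        n = acc // 1000         # whole samples for this frame
        sizes.append(n)
        acc -= n * 1000         # carry the fraction forward
    return sizes

# 44.1 kHz: nine 44-sample packets, then one 45-sample packet
print(adaptive_packet_sizes(44100, 10))
```

Since the PLL tracks the arrival rate of these frames, any jitter in the host's frame timing can leak into the recovered audio clock, which is exactly the adaptive-mode weakness described above.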
 
Hi Rsdio,

with experience like this you're very welcome to join the project.

We've been toying with plans for an open source Windows UAC2 driver.
Thanks, but "I don't do Windows."

Frankly, it would be a serious gamble of resources to develop an open-source Windows UAC2 driver when Microsoft might turn around and release one the next day. Of course, if you really need one and Microsoft is not delivering, then I guess it's better to take that risk despite the potential for wasting your time. I'd hate to be stuck with this particular dilemma.

At first, I thought you were talking about a UAC2 firmware, and I was about to point out the SDR Widget because it seems to implement that. But, now I realize that you're talking about the host side.
 
Most things in a PC happen at 16 ms intervals (the classic timer tick).
I think you're overlooking some very important implementation details. Most things do not happen at 16 ms intervals. The only thing that actually happens is that a piece of memory holding the count is updated once every 16 ms, and that interrupt routine returns immediately after executing an increment. "Most things" refer to that count after the fact, well divorced from the precisely-timed interrupt. In other words, with all that is going on in the OS, you can't force anything to happen at the precise moment the count advances to some specific value. The best you can do is write your code to do something an unpredictable amount of time later. Thankfully, you can check the count to see when your code actually gets to run, but that's not as precise as the timer tick interrupt itself.

I believe that the thread scheduling is triggered by the 16 ms tick, but that happens at the end of the interrupt rather than the beginning, and thus there is more jitter.
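A toy illustration of the point: the counter only advances in 16 ms steps, so code that wakes up late still reads the same tick value and can't even tell how late it was. (The scheduling delays below are made-up numbers.)

```python
TICK_MS = 16  # the classic timer tick

def tick_count(now_ms):
    """What the OS tick counter reads at wall-clock time now_ms:
    it only advances in 16 ms steps, so every reading is quantized."""
    return now_ms // TICK_MS

# Code that wanted to act "at tick 3" only notices the change whenever
# the scheduler lets it run; the delays here are hypothetical.
for sched_delay_ms in (0, 5, 12):
    wakeup_ms = 3 * TICK_MS + sched_delay_ms
    print(f"runs at {wakeup_ms} ms, reads tick {tick_count(wakeup_ms)}")
```

All three wakeups read the same tick value even though they run up to 12 ms apart, which is why "things happen at 16 ms intervals" overstates the precision available to ordinary code.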

I know you cannot control the timing of a 0.5 GHz (480 Mbps) data communication clock from software. That's a clock period of about 2 nanoseconds. Are you honestly trying to suggest the CPU in the PC is running out to the USB port every 2 ns to send another bit of data?

USB Host Controllers have their own crystals or other dedicated oscillators. That's what determines the timing of the data sent over the USB interface--not PC software.

USB uses only three fixed bit rates for the data: 1.5 Mbps, 12 Mbps, and 480 Mbps. It's not some variable, dynamic clock controlled by software. It's one of those three speeds, and the specification requires that at a given speed (especially 480 Mbps) the clock be extremely accurate.

The USB 2.0 spec is 480 Mbps +/- 500 ppm (parts per million) including ALL sources of error. You're trying to argue a PC can achieve that accuracy under software control? I'm sorry, but that's absurd.
Again, we're talking about different details, and you seem to be missing important implementation details.

USB does have a steady master clock, but the various packets do not have a specific clock pulse number at which they are supposed to start and stop. Within an individual packet, the timing between the start and end is very precise and handled by hardware. But between packets there can be a great deal of variation. Each packet must start on a precise clock edge, but nothing in the USB specification says which clock cycle it should be.

Just because the master clock is locked to a specific crystal does not mean that the timing of the data is always the same. All UAC packets are isochronous, but they can legally be sent by the host at any point within a frame. In fact, if you have several USB devices on a hub then the packet timing will have to change in order to fit everything in. Most types of USB transfers are not even guaranteed to happen within a particular frame, and thus the timing is not entirely predictable. Although isochronous transfers must occur every frame if they are marked with a 1 ms update rate, that doesn't mean that they must always occur at the same clock pulse number within every frame. Other USB data on the same bus can occur before or after the isochronous packet, thus shifting its timing one way or the other. Granted, a synchronous UAC interface should only look at the SOF packet for timing, and should ignore any jitter in the timing of the isochronous packet itself. But it's at least feasible that certain USB host controllers would vary the timing of the USB data depending upon activity with various USB devices.
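To put rough numbers on how much room a packet has to float within a full-speed frame (ignoring bit stuffing and most protocol overhead; the 9-byte overhead figure is just an estimate):

```python
frame_bits = 12_000_000 // 1000   # bits in one 1 ms full-speed frame
audio_bytes = 44 * 2 * 2          # 44 stereo 16-bit samples (44.1 kHz)
iso_overhead_bytes = 9            # token, PID, CRC, etc. -- an estimate
packet_bits = (audio_bytes + iso_overhead_bytes) * 8

print(f"frame budget: {frame_bits} bits")
print(f"audio iso packet: ~{packet_bits} bits "
      f"({100 * packet_bits / frame_bits:.1f}% of the frame)")
print(f"room for the packet to float: ~{frame_bits - packet_bits} bit times")
```

In other words, the audio packet occupies only a small fraction of each frame, leaving thousands of bit times over which the host is free to place it.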

My point is not that people are right when they claim that the sound quality improves by playing a file from memory instead of from disk, and using only one program at a time. I'm merely pointing out that system load and activity can affect the timing of USB. There are many USB devices on the market that fail to function when the USB host is overloaded. If every aspect of the USB data flow were under 100% hardware control, then this wouldn't be the issue that it is.


By the way, I agree with your point that you can't control anything by pretending you're just running one program. Every modern operating system these days handles multiple tasks and threads.