Adding S/PDIF output to PCM56P?

Status
Not open for further replies.
Yes, the reason is pretty clear.

The ADAU1442 bug list is one item long:

Background
The S/PDIF transmitter outputs two channels of audio data directly from the DSP core at the core rate. It does not preserve or output any additional nonaudio information encoded in the S/PDIF input stream, such as the validity bit, user data, and channel status. The encoded nonaudio data bits in the S/PDIF output stream are hardwired internally to logic low values, except for the validity bit, which is set as logic high.
Issue
In the S/PDIF specification, a high validity bit indicates invalid data. For this reason, if the output from the ADAU1442/ADAU1445/ADAU1446 SPDIFO pin is connected directly to an S/PDIF receiver IC, the receiver may ignore or discard the transmitted audio data because the high validity bit indicates an error. Alternatively, some S/PDIF receivers (including the ADAU1442/ADAU1445/ADAU1446 SPDIFI pin) may ignore the validity bit and pass the data through.
Workaround
This issue cannot be avoided because the validity bit value is internally hardcoded. If an S/PDIF output is required in the system, audio data can instead be routed to the ADAU1442/ADAU1445/ADAU1446 serial ports and then to an I2S-to-S/PDIF transceiver IC.
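For concreteness, here is a toy sketch of how the validity bit sits in an IEC 60958 (S/PDIF) subframe and why a receiver that honours it throws the audio away. This is illustrative only: the preamble slots and biphase-mark coding are omitted, and `build_subframe`/`receiver_accepts` are names I made up, not anything from the ADAU datasheet.

```python
# Illustrative IEC 60958 subframe layout (preamble/line coding omitted):
# slots 4-27 carry up to 24 bits of audio, slot 28 is the validity bit (V),
# slot 29 user data (U), slot 30 channel status (C), slot 31 even parity (P)
# over slots 4-31. V = 1 means "not suitable for conversion", which is what
# the ADAU1442 errata says its SPDIFO pin hardwires.

def build_subframe(sample_24bit: int, validity: int,
                   user: int = 0, cstat: int = 0) -> int:
    word = (sample_24bit & 0xFFFFFF) << 4      # audio in slots 4-27
    word |= (validity & 1) << 28               # V
    word |= (user & 1) << 29                   # U
    word |= (cstat & 1) << 30                  # C
    parity = bin(word >> 4).count("1") & 1     # make slots 4-31 even parity
    word |= parity << 31                       # P
    return word

def receiver_accepts(word: int, ignore_validity: bool = False) -> bool:
    if bin(word >> 4).count("1") & 1:          # parity over slots 4-31 must be even
        return False
    v = (word >> 28) & 1
    return ignore_validity or v == 0           # strict receivers drop V = 1 frames

# The errata situation: V stuck at 1, so a strict receiver (like the Creative
# card) rejects every frame, while a lenient one passes the audio through.
frame = build_subframe(0x123456, validity=1)
```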

The Creative external sound card sees the high validity bit, concludes the data is invalid, and throws it away.

It took me ages (and ages!) to finally conclude there was a bug in the ADAU, and go looking. Imagine how pissed I was at myself after all that effort - just to convince myself to go looking for a one-page, one-error document!

Hence the motivation to throw together an I2S-to-SPDIF board. To be honest, I have only really ever used this to run digital - DSP - digital tests, allowing me to test the DSP performance against A/D and D/A performance. That is also, in no small part, the reason I smashed the PCB out without any real "polish" to the layout etc - it was an "angry" and fast design - if such a thing exists.
 
With regards to syncing SPDIF streams, you have two choices:
- Lock everything together to a single master clock. Not so easy with consumer gear. I have not looked, but I would be stunned if professional gear does not allow locking of clocks in large systems.
- Use asynchronous sample rate conversion (ASRC). This is common even on consumer gear these days. Part of the reason I wanted to play with tests from my sound card outputting digital, the offboard DSP receiving this and processing it, then inputting back into the sound card, was to see what the effect of the ASRC was.
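To give a feel for option two, here is a toy ASRC using plain linear interpolation. Real ASRC silicon uses long polyphase filters and continuously tracks the clock ratio recovered from the incoming stream; `asrc_linear` and the fixed ratio are just a sketch of the idea, not how the chips do it.

```python
# Toy asynchronous sample-rate converter: step through the input at a
# fractional rate and linearly interpolate between neighbouring samples.
# A real ASRC replaces the interpolation with a polyphase filter bank
# and estimates `ratio` on the fly from the two clock domains.

def asrc_linear(samples, ratio):
    """Resample `samples` by `ratio` (out_rate / in_rate)."""
    out = []
    pos = 0.0
    step = 1.0 / ratio            # how far to advance in the input per output sample
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append((1 - frac) * samples[i] + frac * samples[i + 1])
        pos += step
    return out

# e.g. bridging a 44.1 kHz source into a 48 kHz clock domain:
resampled = asrc_linear([0.0, 1.0, 0.0, -1.0] * 100, 48000 / 44100)
```

The crude interpolation is exactly what produces distortion spurs; the better the filter, the further down they sit.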


So: If you are just using SPDIF input, the SPDIF receiver recovers the clock(s) from the SPDIF stream - so that is fine, you can lock your D/A to the recovered SPDIF stream.

But: when you want to both output and input from the one system, you wind up with the potential for an external system acting as the clock master (which is kind of implicit in an SPDIF stream, as the clock is encoded in the data). That needs to be reconciled with the output stream: is the output slaved to the input SPDIF stream, or is it clocked by the DSP? There are some good reasons why you might want the DSP to use its own clock, in which case you really need to do a sample rate conversion.
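A back-of-envelope sketch of why the free-running-DSP-clock option forces a sample rate conversion: with the input slaved to the incoming SPDIF clock and the output on the DSP's own crystal, even a tiny rate mismatch makes a FIFO between the two domains drift until it under- or overruns. The `fifo_drift` helper and the 100 ppm figure below are hypothetical, just to put a timescale on it.

```python
# Simulate the fill level of a FIFO bridging two nearly-equal sample rates.
# Returns how long (in seconds) the FIFO survives before an under/overrun,
# or None if it lasts the whole simulation.

def fifo_drift(in_rate, out_rate, seconds, depth=64):
    fill = depth / 2.0                            # start half full
    for t in range(int(seconds * 1000)):          # 1 ms simulation steps
        fill += (in_rate - out_rate) / 1000.0     # net samples gained per ms
        if fill <= 0 or fill >= depth:
            return t / 1000.0                     # under/overrun at this time
    return None

# A 100 ppm mismatch at 48 kHz drifts at 4.8 samples/s, so a 64-deep FIFO
# lasts well under ten seconds - hence the need for an ASRC.
survived = fifo_drift(48000 * 1.0001, 48000, 60)
```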


The ASRC artefacts are clearly visible as a range of spurs. They were a long, long way down (can't recall exactly now, but they were low enough that I concluded they were not relevant in a meaningful way). From memory, around -120 dBc.
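For anyone wanting to repeat that kind of measurement, the basic recipe is: feed a pure tone through the converter, window the output, take a DFT, and compare the largest non-fundamental bin to the tone. A rough stdlib-only sketch (`spur_level_dbc` is my name for it, and the ±3-bin exclusion around the fundamental is an arbitrary choice):

```python
import cmath, math

def spur_level_dbc(signal, tone_bin):
    """Largest non-fundamental DFT bin relative to the tone, in dBc."""
    n = len(signal)
    # Hann window: a tone exactly on a bin leaks only into the adjacent
    # bins, so anything further out really is a spur.
    win = [0.5 - 0.5 * math.cos(2 * math.pi * i / n) for i in range(n)]
    x = [s * w for s, w in zip(signal, win)]
    mags = [abs(sum(x[i] * cmath.exp(-2j * math.pi * k * i / n)
                    for i in range(n)))
            for k in range(n // 2)]               # brute-force DFT, fine for a test
    fund = mags[tone_bin]
    spur = max(m for k, m in enumerate(mags[1:], start=1)
               if abs(k - tone_bin) > 3)          # skip DC and the window skirt
    return 20 * math.log10(max(spur, 1e-300) / fund)
```

Run a tone through the ASRC under test, then through this, and the spur floor pops straight out as a single dBc number.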

I would expect the golden ears brigade would declare these audible, along with something like "The artefacts smeared the sound stage with peanut butter, making vocals sound like they were upside down." or something.

Unless you have some really spiffy gear, an ASRC on SPDIF I/O is ultimately what you will be using. Unless you are super fussy, it won't make all your music sound like Kylie Minogue is hidden in your speakers doing karaoke.
 