I2S 16 vs. 24 bit as in PCM 1792 data sheet?

Status
Not open for further replies.
Any idea why the PCM1792 has a default 24-bit I2S mode but also offers a 16-bit I2S mode? To the best of my knowledge, I2S is MSB-justified (in contrast to the standard = Sony = right-justified format). That means a 16-bit signal fed to a 24-bit receiver simply leaves the 8 LSBs at 0, which makes no difference to the audio.
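The MSB-justified point can be sketched in a few lines (my own illustration, not from the data sheet; the helper name is made up):

```python
# Sketch: in I2S the MSB is sent first, so a 16-bit sample occupying a
# 24-bit slot simply gains 8 zero LSBs -- no information changes.

def to_24bit_slot(sample16: int) -> int:
    """Left-align a 16-bit sample pattern in a 24-bit I2S slot."""
    return (sample16 & 0xFFFF) << 8  # the 8 LSBs become zero

s16 = 0x1234                  # arbitrary 16-bit sample
s24 = to_24bit_slot(s16)
assert s24 == 0x123400        # same value, scaled by 2^8
assert s24 >> 8 == s16        # nothing lost, nothing gained
```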
 
capslock said:
Any idea why the PCM1792 has a default 24-bit I2S mode but also offers a 16-bit I2S mode? To the best of my knowledge, I2S is MSB-justified (in contrast to the standard = Sony = right-justified format). That means a 16-bit signal fed to a 24-bit receiver simply leaves the 8 LSBs at 0, which makes no difference to the audio.

The only requirement I2S places on the serial clock (bit clock) is that it provide enough edges to load all the data into the input register. For 16-bit data that is 16 bits x 2 channels = 32 bits per frame, so the bit clock can be as low as 32 Fs.

ray.
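Ray's minimum-bit-clock rule works out as follows (a minimal arithmetic sketch of my own, assuming a 44.1 kHz sample rate; the function name is made up):

```python
# The lowest usable I2S bit clock: just enough edges to shift every
# data bit of both channels within one sample frame.

def min_bck_hz(bits_per_channel: int, fs_hz: int) -> int:
    """Minimum bit clock = bits/channel x 2 channels x sample rate."""
    return bits_per_channel * 2 * fs_hz

fs = 44_100
print(min_bck_hz(16, fs))  # 1411200 Hz = 32 * Fs
print(min_bck_hz(24, fs))  # 2116800 Hz = 48 * Fs
```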
 
Looking at the timing diagrams on page 10 (I believe) of the PCM1792 data sheet, there seem to be enough bit clock cycles. The option to ignore non-audio information in the bits below the 16th might well be the explanation.
 
Possibly, but not probably. The 16-bit I2S option predates Crystal's signal-level bits by at least a decade. And 16-bit I2S with a 1.4 MHz bit clock will not work with a DAC expecting 24-bit I2S with a 2.1 MHz or 2.8 MHz bit clock (at 44.1 kHz those are 32 Fs, 48 Fs, and 64 Fs respectively).

ray.
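A rough check of those numbers (my own sketch, assuming Fs = 44.1 kHz as the quoted megahertz figures imply):

```python
# A 24-bit receiver needs at least 24 BCK edges per LRCK half-period,
# but a 16-bit transmitter running at 32 * Fs supplies only 16.

fs = 44_100
bck_16bit     = 32 * fs    # 1411200 Hz, ~1.4 MHz
bck_24bit_min = 48 * fs    # 2116800 Hz, ~2.1 MHz
bck_24bit_typ = 64 * fs    # 2822400 Hz, ~2.8 MHz

edges_per_channel = bck_16bit // (2 * fs)
print(edges_per_channel)   # 16: eight short of the 24 the receiver expects
```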
 