Re-clocking a must?

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
I'm inclined to think that the DAC samples the LRCLK input by using the BCK input... Every synchronous interface on every DSP chip I've ever worked with has worked this way, and I've seen so much VHDL/Verilog code that works this way too. It's just "accepted design procedure" to me. ;)

But there's only one way to find out for sure, and it only works on non-OS DACs: set up three debounced SPDT switches (BCK, LRCLK, DATA). Manually clock some data into the chip, then toggle LRCLK. Does the analog output level change right away? If not, try cycling BCK and see if it changes then.
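The switch experiment can be sketched in software. Below is a toy behavioral model (Python, purely illustrative; `SyncSerialDac` and its methods are hypothetical, not any real part's API) of the hypothesis: LRCLK is only ever observed on BCK rising edges, so flipping LRCLK with BCK idle changes nothing until the next BCK edge arrives.

```python
class SyncSerialDac:
    """Toy model of a serial DAC input stage that samples LRCLK
    only on BCK rising edges (the 'accepted design procedure')."""

    def __init__(self, bits=16):
        self.bits = bits
        self.shift = 0        # input shift register
        self.latched = 0      # word presented to the analog stage
        self.prev_lrclk = 0   # LRCLK as sampled on the last BCK edge
        self.lrclk = 0
        self.data = 0

    def set_inputs(self, lrclk=None, data=None):
        # Changing a pin between BCK edges does nothing by itself.
        if lrclk is not None:
            self.lrclk = lrclk
        if data is not None:
            self.data = data

    def bck_rising_edge(self):
        # Everything happens here: shift in DATA, and detect an LRCLK
        # transition by comparing against the previously sampled value.
        self.shift = ((self.shift << 1) | self.data) & ((1 << self.bits) - 1)
        if self.lrclk != self.prev_lrclk:
            self.latched = self.shift   # word boundary: latch the sample
        self.prev_lrclk = self.lrclk


dac = SyncSerialDac()
dac.set_inputs(data=1)
for _ in range(16):
    dac.bck_rising_edge()    # clock in a word of all-ones

dac.set_inputs(lrclk=1)      # flip LRCLK with BCK idle...
out_before = dac.latched     # ...output hasn't moved yet

dac.bck_rising_edge()        # one more BCK edge samples the LRCLK change
out_after = dac.latched
print(out_before, out_after)
```

In this model the output stays at 0 until a BCK edge samples the LRCLK transition, which is exactly the "cycle BCK and see if it changes" outcome of the switch test.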
 
gmarsh said:
I'm inclined to think that the DAC samples the LRCLK input by using the BCK input... Every synchronous interface on every DSP chip I've ever worked with has worked this way, and I've seen so much VHDL/Verilog code that works this way too. It's just "accepted design procedure" to me. ;)

But there's only one way to find out for sure, and it only works on non-OS DACs: set up three debounced SPDT switches (BCK, LRCLK, DATA). Manually clock some data into the chip, then toggle LRCLK. Does the analog output level change right away? If not, try cycling BCK and see if it changes then.


The DF1700 and most of the NPC digital filters must be exceptions to the rule, and the same must go for any DAC connected to them. These filters use BCK to clock the word into the DAC's input serial register and then stop BCK entirely. If you doubt the datasheets, these devices are slow enough to watch on a scope.
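The behavior described here implies a different input stage: if BCK stops after the word is shifted in, the latch pin must act directly, with no further BCK edge required. A toy contrast to the model above (Python, hypothetical names, not modeled on any specific datasheet):

```python
class AsyncLatchDac:
    """Toy model of a DAC whose latch input acts immediately, with no
    BCK edge required - the behavior implied by an NPC-style digital
    filter that stops BCK once the word has been shifted in."""

    def __init__(self, bits=16):
        self.bits = bits
        self.shift = 0     # input shift register
        self.latched = 0   # word presented to the analog stage

    def bck_rising_edge(self, data):
        # BCK is only used to shift data in.
        self.shift = ((self.shift << 1) | data) & ((1 << self.bits) - 1)

    def lrclk_edge(self):
        # The latch fires on its own edge; BCK may already be stopped.
        self.latched = self.shift


dac = AsyncLatchDac()
for _ in range(16):
    dac.bck_rising_edge(1)   # shift in a word of all-ones...
# ...then BCK stops. A latch edge alone updates the output.
dac.lrclk_edge()
print(dac.latched)
```

With this input stage, the switch test from the earlier post would show the analog output changing on the LRCLK flip alone, with BCK held idle.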
 