Do audiophiles want a stand-alone high-end HDD source?

If someone made a bit-perfect, low-jitter HDD media source, would you buy it?


  • Total voters: 248
Even with the latest firmware, and using a good +12V linear PSU, the WD TV Live is still far from perfect in what it does, despite being quite good.
PSU has nothing to do with digital output. It is important on the analog stage, but that sucks anyway.
I am happy that 44.1 and 48kHz are now (on 1.04.17) bit-perfect (the HDCD bit passes through). I am unhappy that higher sample rates are downsampled on the SPDIF output, but that is a general Sigma issue.

I am using a Denon AVR-3805 receiver. Connected directly via optical to the WDTV Live, or via coaxial to the TV (connected via HDMI cable), I cannot tell that there is a difference. Downsampling of anything higher than 48kHz occurs anyway... so I don't think the optical output is worse in this case.
 
PSU has nothing to do with digital output.
Oh really! Bits is bits & perfect sound forever, I guess. Why not try some experiments first, or do some research, before making these statements?
 
Like I said, the 3.3V rail has its own analog regulator. Anyway... any ripple on the digital power line will be less than the digital levels, so it won't change the output. It is called noise immunity.
You can believe whatever/whoever you want... or you can study some digital electronics books yourself.

I'm not believing anything, only my ears & my measurements - not on this particular device, but on other digital devices. Listen to SandyK too - he knows of which he speaks :) I can see this discussion will not go anywhere!
 
Anyway... any ripple on the digital power line will be less than the digital levels, so it won't change the output.

True enough, if by 'won't change the output' you mean the bits represented by the output. But the output is an analog signal like any other, so that might well change - logic has no PSRR on its outputs, so what's on the supply gets onto the output signal. Given that logic families these days derive their switching threshold from the supply, a 5mV ripple may result in a 2.5mV shift in the switching threshold, and that translates into timing changes, as nobody's got infinite slew rate.
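To put a rough number on that, here's a minimal back-of-envelope sketch. The 1 V/ns slew rate is an assumed, illustrative figure, not a measurement of any particular logic family:

```python
# Back-of-envelope: supply ripple -> switching-threshold shift -> edge-timing shift.
# Assumes the threshold sits at roughly half the supply and an illustrative
# output slew rate of 1 V/ns (assumed figures, not measured from any device).

ripple_v = 5e-3                      # 5 mV of ripple on the supply
threshold_shift_v = ripple_v / 2     # threshold tracks ~Vdd/2 -> 2.5 mV shift
slew_rate_v_per_s = 1e9              # 1 V/ns at the logic output (assumption)

timing_shift_s = threshold_shift_v / slew_rate_v_per_s
print(f"edge timing shift ~ {timing_shift_s * 1e12:.1f} ps")  # ~2.5 ps per edge
```

Small numbers, but it shows why 'the bits are the same' and 'the edges land at the same time' are two different claims.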

It is called noise immunity.

And that's called a 'red herring'.
 
PSU has nothing to do with digital output.

Sorry SoNic, but the PSU does make a BIG difference: less PSU noise will be reflected in the stability of the clock, and a happier circuit all round.

If I were forced to listen to my WD mini with the supplied SMPS, I would take a hammer to it and throw it away. It sounds so bad with the SMPS vs. battery power, which is really nice.

You have been around these forums a long time and have no doubt tried these things; I'm super surprised to hear this comment from you. I wonder what is going on with your system if you don't notice any improvement with a better power supply?

Sorry, this is not meant to be insulting, I'm just really, really surprised.
 
I'm reading so much about the "apparent loss due to jitter" on SPDIF interfaces.

Everyone's trusting their ears, can't bear some equipment, etc. Why can't we take a slightly more scientific approach to this? Record the SPDIF source of choice back into a computer with an SPDIF input (yes, I know, this one could be "lossy" too...) and save a WAV.

If the output/input chain is bit-perfect, which is what you want to achieve, parts of that WAV should be identical to parts of the original WAV source being played back. Just take a few hundred bytes from the source and start looking for them in the recorded WAV.

If you can find them, you have just proven your SPDIF source is "perfect" enough to overcome possible bitshifts caused by jitter in the transport. This means your digital part is, for all practical purposes, perfect. It all comes down now to how good your DAC is, but you can rest easy knowing that what comes out of the digital source is accurate.
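If anyone wants to actually run that comparison, here is a minimal sketch of the idea (the file names and the probe offset are placeholders, and a proper test would skip the RIFF/WAV headers and search only within the audio data chunks):

```python
# Sketch of the "find a slice of the source WAV inside the captured WAV" test.
# 'source.wav' and 'captured.wav' are placeholder names; a real test should
# locate the data chunks first and compare audio bytes only.

def read_bytes(path: str) -> bytes:
    with open(path, "rb") as f:
        return f.read()

source = read_bytes("source.wav")
capture = read_bytes("captured.wav")

# Take a few hundred bytes from somewhere in the middle of the source...
probe = source[100_000:100_400]

# ...and look for that exact byte sequence in the recording.
position = capture.find(probe)
if position >= 0:
    print(f"probe found at offset {position}: this stretch came through bit-perfect")
else:
    print("probe not found: level changes, dither or resampling altered the data")
```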

Personally, in the days of async USB connections on DACs (like the Wyred DAC-2), I fail to see why a beefed-up "perfect" source is needed. Once you go async, there *is no loss* on the WAV files until they are brought to the DAC. You can bounce the WAV all over the internet through a chain of proxies, TCP forwarders, caches and what have you. It will still be the same WAV, as the data is buffered & checksummed.

So if you're in doubt, invest in async USB DACs and be done with the worries?
 
I agree that the theory you just mentioned is 100% correct. The strange thing is that you could use two different brands of async USB connection running from the computer to the DAC and they would sound different, even though both have bit-perfect output. So we must assume that something else is causing the difference in sound, and a logical place to look is the power supply, track layout, grounding and shielding of the different devices.

I use I2S anyway, as I think there is a sonic benefit in avoiding the SPDIF conversion process.
 
"sound" different I will only accept if you can pinpoint it out in ABX comparisons. I will also accept it as a truth if the wav's don't compare well in the idea I suggested.

Because if that were true, async transfers sounding different, then the data changed. If that's true, then I'll stop doing my job as an IT consultant, as I wouldn't trust data transfer anymore. Cloning my computer to an external USB disk? The copy wouldn't be 100% perfect. Actually, if I believed that, then even some letters in the text you just posted could have shifted. I'm sorry, but as long as jitter doesn't cause any bitshifts (and when it does, and it's not just the least significant bits, it would sound horrible, like pops, not just some slight change) - then the source is perfect. (For me :) I don't mind others having other opinions, but to me it's terrible to read that people believe in those differences.)
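And checking whether a copy or transfer really was lossless takes one checksum comparison; a minimal sketch (file names are placeholders):

```python
# Confirm two copies of a file are bit-identical by comparing checksums.
# The paths are placeholders; matching digests mean no bits changed in transfer.

import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if sha256_of("original.wav") == sha256_of("copy_from_usb_disk.wav"):
    print("copies are bit-identical")
else:
    print("copies differ: the transfer was not lossless")
```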

Now, on another note, if you believe that Windows processes etc. cause changes in the sound, you can minimize those by going DOS. Sure, you'll still have interrupts running and such, but far fewer than what's going on in a Windows machine.

I suggest looking into MPXPLAY, a DOS MP3/WAV/FLAC/DTS etc. player which is still being developed today, and which has some features of interest to the high-end digital source crowd, like reading the entire file into memory before playing it back.

It's very customizable, able to drive LCD displays on the parallel bus, etc. Many, many options. For checking out the functionality, there is a Win32 build that runs in Windows text mode, before you go DOS entirely:

Welcome to the PDSoft Homepage

The fun part is that you don't need a beefy machine at all to make it run. In 1999 I was building a proof-of-concept car player using this software (the idea at the time was playing MP3s + booting the OS from CD) running on a Pentium 100 downclocked to 90. That's right. This software can decode MP3s on a 486DX4-100. I was running it in 1MB of RAM at the time (that's megabyte, yes).
 
I fail to see why a beefed-up "perfect" source is needed. Once you go async, there *is no loss* on the WAV files until they are brought to the DAC.

My experience with NON-asynchronous USB > SPDIF converters is that the computer itself has an effect on the final sound. This was proven to me when I built 3 different computers with different parts, optimized them all the same way (as described on the CMP website), and got a different sound out of each of them.

Are you saying that with an asynchronous USB > SPDIF converter no optimisation of the computer system or power supply is necessary?
And that you will get the same high-quality sound whether you play from XP or Mac OS? From an iPad, a laptop, or an ATX or ITX desktop computer?

I strongly suspect there will be a sonic difference, but I have not tried an asynchronous USB converter, so please, you tell me. :)
Actually, I challenge you to try this test, and if even one device sounds different, then you must concede that there are other variables at play in determining the final sound.
So anyway, this is one reason for a stand-alone audiophile audio player: to minimize the variables.
 
If you can find them, you have just proven your SPDIF source is "perfect" enough to overcome possible bitshifts caused by jitter in the transport.

Bitshifts? Ha, ha. The problem with SPDIF is not "bitshifts" but reconstructing the clock. And no, it isn't trivial.

So if you're in doubt, invest in async USB DACs and be done with the worries?

Done with the worries? At least one of the leading manufacturers acknowledges that different USB cables still sound different with their async DAC.
 
Theoretically, yes, because the buffering of the data happens close to the DAC, not in the computer anymore.

Basically the computer sends the data to a FIFO buffer in the DAC; the timing doesn't matter as long as the buffer isn't empty (like pouring water into a bucket with a hole in the bottom: you can pour fast or slow, and water comes out at a steady rate as long as the bucket isn't empty).

The DAC's clock is now used to feed the data to the actual DAC chip, not the PC's clock anymore.
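A toy model of that bucket, with completely made-up numbers, just to show that the drain rate is set by the DAC side alone as long as the buffer never runs dry:

```python
# Toy model of async USB buffering: the host fills a buffer in irregular bursts,
# while the DAC drains it at a fixed rate paced by its own clock.
# All numbers are invented purely for illustration.

prefill = 2000                       # the DAC buffers some audio before starting
host_bursts = [480, 0, 960, 0, 0, 1440, 480, 0, 480, 960]   # bursty delivery from the PC
dac_rate = 441                       # samples consumed per tick by the DAC clock

level = prefill
underruns = 0
for burst in host_bursts:
    level += burst                   # irregular fill from the computer
    drained = min(level, dac_rate)
    underruns += dac_rate - drained  # buffer ran dry: that's the audible-glitch case
    level -= drained

print(f"underruns: {underruns}")     # 0 here: output pacing never depended on the host
```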

Most high-end DACs are expensive because of the many, many regulated power supplies in them, so the place where jitter can occur, the actual feeding of data to the DAC (probably over I2S), is of much higher quality than any time-critical component in the PC source.

Now, practically, I have tried a few times listening to music fed by FOOBAR2000 over async USB into my Wyred DAC-2, and comparing it to the same music coming from a stock Squeezebox Classic into the same DAC over coax. I can't hear a difference at all.

Then again, to my ears (and not ABXing here :p) I can only "feel" the slightest difference between Squeezebox > coax > DAC-2 > amplifier and Squeezebox > Denon 4306 > amplifier, so I'm already at the threshold of my hearing with the DAC-2.

Erin, give MPXPLAY a go and tell me what you think. It's not hard to set up a DOS-bootable diskette (or USB key) with MPXPLAY on it and some WAV files. For a final setup you could run a minimal computer (as described before in the thread, Atom processors & such), booting DOS into a RAMdisk and into MPXPLAY, and access your music files over the network from a NAS with FTP capability (or, alternatively, from SD if it is somehow accessible in DOS). This would truly be a minimalist solution, and minimalist is high-end for sure! :)
 
I won't get into "sounding different" anymore - it's a never-ending yes/no game between believers and non-believers anyway; it's useless.

However, if anyone needs help with MPXPLAY I'll be willing to help, as it's a piece of software I still care about, and I want to check it out again to see if it's possible to build a cheap frontend out of it.
 
Erin, give MPXPLAY a go and tell me what you think. It's not hard to set up a DOS-bootable diskette (or USB key) with MPXPLAY on it and some WAV files. For a final setup you could run a minimal computer (as described before in the thread, Atom processors & such), booting DOS into a RAMdisk and into MPXPLAY, and access your music files over the network from a NAS with FTP capability (or, alternatively, from SD if it is somehow accessible in DOS). This would truly be a minimalist solution, and minimalist is high-end for sure! :)

Thank you for the suggestion, Yves, I surely will try it when I find the time and enthusiasm for it. I like helpful suggestions. I used to use cPlay and CMP before moving over to the WD mini running I2S.
 