We are getting into the magical world of audiophile digital where magic things happen... the dark bits as we have described them on many other threads...
There is mention of burst noise, signal integrity issues, etc., but I do not see any information to back anything up, nor for the extraordinary claim that data with the same bit pattern can sound different. If the data is exactly the same, how can it sound different? (Please do not go on about noise unless you have explicit proof that different players create wildly different noise signatures from a PC; we are talking bit patterns in isolation here.) The same bit pattern will always produce the same result. Next there will be claims that jitter can be recorded... Oh, and USB cables sounding different...
The only backup so far has been a brief nod towards the wife in the kitchen.....
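For what it's worth, the "same bit pattern" claim is trivially checkable rather than a matter of opinion: hashing two files compares every byte. A minimal sketch (the file names in the usage comment are placeholders, not anyone's actual files):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 digest of a file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in 64 KiB chunks so large audio files don't need to fit in RAM.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: equal digests mean the two files carry exactly
# the same bit pattern.
# sha256_of("ripped.wav") == sha256_of("copied.wav")
```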
"Should not" is the operative phrase; the reality is that it does. A laptop playing sound through its internal speakers is a highly integrated audio environment, very different from a typical audiophile's rig, yet it shows symptoms similar to those the latter sometimes does. Which is that the normal sound is drab and uninspiring, with nothing about it to recommend it as musically interesting, but careful attention to detail can yield a much higher quality of reproduction, enough to draw one into the performance.
USB wasn't originally built by people with audiophile goals, so packet noise at the USB receiver PHY, when signal integrity is impaired, affects the DAC chip and DAC clock even in asynchronous USB.
Etc...
My EMU 0404 in loop-back measures something like 0.006% THD. So where, exactly, is this problem?
Comparing apples and pears, Frank: an identical bit pattern WILL ALWAYS contain the same information. Even I can tell a difference between my laptop and my home system; the 4 x 15" bass drivers give a bit better low end, for a start...
Vacuphile, all I can suggest is you listen even more intently. On a summer's evening I can hear the Elves discussing world domination at the bottom of my garden; even my wife, who does not dream on Midsummer Night, has heard them, even as she was preparing food in the kitchen...
iTunes can be made to play bit-perfect.
Audirvana as well.
They absolutely do not sound the same on the same file.
Then one or both is not bit-perfect, or the method of comparison is faulty.
Proof's in the pudding, 😀 ... one's techniques evolve over time; you learn to look for, and register, markers in what you're observing. There are many tricks learnt along the way.
Can't be done without feedback, i.e., without detailed information on what you were actually hearing. It is easy to mistake more distortion for better sound.
Most important is that you ignore "goodness" in the sound; you are only watching out for signs of "badness". You do whatever it takes to make these obvious: jab the needle in to make you go "Ouch!", rather than ask whether you are more comfortable now than before, 🙂. When you can't hear any badness, no matter how hard you try, then you're in pretty good shape.
It does not matter whether you are listening for presence or absence. There is no baseline measurement telling you of any presence or absence of distortion; therefore, it is impossible to learn whether sample A has more or less than sample B.
Usual training sessions involve identifying sample A or sample B as more or less distorted, and then being told the right answer. Without that feedback loop, you are very likely deluding yourself. The amount of delusion is generally proportional to the amount of ego invested in the exercise.
As in "*I* can hear it", "*I* have trained myself" or "You are not as experienced as *I*, so *I* can ignore you".
I am just thankful that people like that are not testing medicine.
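The feedback-loop point can be made concrete: in a two-alternative forced-choice (ABX-style) test, the chance of reaching a given score by pure guessing is a simple binomial sum. A sketch (the 16-trial figure is just an illustrative example, not from this thread):

```python
from math import comb

def guess_probability(n_trials: int, n_correct: int) -> float:
    """P(getting at least n_correct out of n_trials right by pure
    coin-flip guessing in a two-alternative forced-choice test)."""
    return sum(comb(n_trials, k)
               for k in range(n_correct, n_trials + 1)) / 2 ** n_trials

# 12 right out of 16 trials happens by luck only about 3.8% of the
# time, which is why scored, feedback-driven trials carry weight that
# uncontrolled "I trained myself" listening does not.
p = guess_probability(16, 12)
```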
We are getting into the magical world of audiophile digital where magic things happen... the dark bits as we have described them on many other threads...
There is mention of burst noise, signal integrity issues, etc., but I do not see any information to back anything up, nor for the extraordinary claim that data with the same bit pattern can sound different. If the data is exactly the same, how can it sound different? (Please do not go on about noise unless you have explicit proof that different players create wildly different noise signatures from a PC; we are talking bit patterns in isolation here.) The same bit pattern will always produce the same result. Next there will be claims that jitter can be recorded... Oh, and USB cables sounding different...
The only backup so far has been a brief nod towards the wife in the kitchen.....
Those who continue to discuss the magic do not understand what a bit stream is. They argue against isolation because the signalling is ultimately analog. What they fail to comprehend is that the bit stream itself is encapsulated in the analog signalling, and that any noise in the analog signalling does not and can not "end up in the bit stream". It is impossible, but they imagine some sort of ethereal magic happening there.
Once again, if two (or more) players (and the OS) are properly configured to perform the 3 core functions of PC audio playback in a bit-perfect manner, then they will all sound identical. If they do not, then one or more of the players is not configured for bit-perfect playback or the listener is fooling him/herself.
Yes, digital signals are practically immune to analog noise. It takes a very, very large amount of noise to make a '0' register as a '1', and vice versa.
But I would argue that there's no such thing as bit-perfect playback.
Computers are not able to do anything instantaneously. They operate sequentially (with a limited amount of parallel processing in a modern multi-core CPU). Everything happens with tiny delays, depending on the number of clock cycles used. No CPU has an opcode for "stream bit-perfect data". Computers are designed to handle packets, not to stream data with perfect timing.
But I haven't done the math on any of this. There's a chance that the sequential speed of a modern CPU is high enough to make the delays of the software inaudible to the human ear (if the code is not sloppy).
I guess it would be possible to build a device that could do (close to) bit-perfect playback, but it would have to be made with discrete logic and with a design that compensated for the propagation delay in every single gate. It would also need a high precision clock signal.
...And it would probably not be worth the effort.
Computers are not able to do anything instantaneously. They operate sequentially (with a limited amount of parallel processing in a modern multi-core CPU). Everything happens with tiny delays, depending on the number of clock cycles used. No CPU has an opcode for "stream bit-perfect data". Computers are designed to handle packets, not to stream data with perfect timing.
Completely irrelevant. The timing of the data delivery is noncritical to the analog output signal, unless it's so grossly bad that there are skips and glitches. The only thing that matters is the correct numbers, i.e., being bit-perfect. Which is trivial.
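The buffering argument behind this can be sketched: if bursty, irregular delivery feeds a FIFO that a fixed-rate "DAC clock" drains, the sample values coming out are identical no matter how the packets arrived. A toy model (the burst sizes and 48-samples-per-tick drain rate are arbitrary assumptions for illustration):

```python
from collections import deque
import random

def play_through_fifo(samples, seed):
    """Toy model: samples arrive in irregular bursts (arrival pattern
    set by `seed`), land in a FIFO, and a fixed 'DAC clock' drains up
    to 48 per tick. The drained stream never depends on arrival timing."""
    rng = random.Random(seed)
    fifo = deque()
    out = []
    i = 0
    while i < len(samples) or fifo:
        if i < len(samples):
            burst = rng.randint(0, 64)       # sometimes nothing arrives
            fifo.extend(samples[i:i + burst])
            i += burst
        for _ in range(48):                  # fixed-rate drain
            if fifo:
                out.append(fifo.popleft())
    return out

data = list(range(10_000))
# Different delivery timing (different seeds), identical output samples:
assert play_through_fifo(data, seed=1) == play_through_fifo(data, seed=2) == data
```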
Digital is digital; audiophile digital follows the same physics as any other digital system. If the data in an audio packet is the same, it will sound the same...
List your assumptions for that to hold true.
Set iTunes to play bit-perfect, then compare to Audirvana in bit-perfect mode. All your assumptions fail because you don't really know what goes on in a digital playback chain.
You could start by reading the paper by Damien Plisson on AMR's website.
Or do the test with iTunes and Audirvana.
Yes, always best to get a technical education from advertising material written by someone selling something.
'Bit perfect' does not mean lack of timing variation.
Dan.
I agree 100%. As stated before, bit-perfection is not the "Alpha and Omega". It is, however, a fundamental building block if one's goal is excellent PC-based digital audio.
Without bit-perfection, the digital audio bit stream is reduced to a mere facsimile of the original and is vastly more subject to various audio degradations (usually called "features" or "enhancements") in the software. These degradations are orders of magnitude more significant than slight digital timing variations.
Furthermore, without bit-perfection, one is introducing more risk of timing variation (via software), not less.
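One concrete way a software "feature" breaks bit-perfection: a digital volume control scales every sample and rounds the result back to integers, so the output words are no longer the source words. A minimal sketch with 16-bit PCM (the function is illustrative, not any particular player's code):

```python
def apply_volume(samples_16bit, gain):
    """Scale 16-bit PCM samples by a software volume setting, rounding
    and clamping back to the 16-bit integer range as a player must."""
    return [max(-32768, min(32767, round(s * gain))) for s in samples_16bit]

src = [-30000, -1, 0, 1, 12345, 32767]
assert apply_volume(src, 1.0) == src   # unity gain: still bit-perfect
assert apply_volume(src, 0.9) != src   # any other gain rewrites the data
```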
We are getting into the magical world of audiophile digital where magic things happen... the dark bits as we have described them on many other threads...
There is mention of burst noise, signal integrity issues, etc., but I do not see any information to back anything up, nor for the extraordinary claim that data with the same bit pattern can sound different. If the data is exactly the same, how can it sound different?
It is an electrical problem. Focusing on just the bits means you aren't examining or understanding what happens in the overall digital chain.
Measure your own data.
Test with two bit-perfect SW, one of which could be XXHighEnd which allows you to change various parameters while playing in bit-perfect mode.
Comparing apples and pears Frank, an identical bit pattern WILL ALWAYS contain the same information....
Again, it isn't about that...
Then one or both is not bit-perfect, or the method of comparison is faulty.
Do your own comparison, verify bit-perfection and listen.
But I would argue that there's no such thing as bit-perfect playback.
And you would be wrong.
What they fail to comprehend is that the bit stream itself is encapsulated in the analog signalling, and that any noise in the analog signalling does not and can not "end up in the bit stream".
It's the other way round: it is the people who focus only on the bitstream being transferred correctly from one end to the other who fail to understand the electrical issues inside the whole digital playback chain, issues that can make a file sound different in two setups even while it is transferred bit-perfectly in both.
...And it would probably not be worth the effort.
Why not? Ease of operation is, IMO, the main benefit to fight for. It has been done commercially, I believe, and from the report it was successful...
I have tried the DIY route at least twice. It is very hard to achieve what can be achieved with a normal CD player, where we can control everything (except what is done by the integrated circuits) easily.
In the process I have experienced situation where I had to troubleshoot the electronics of the computer, tracing the motherboard to find the faulty chip (luckily only an EEPROM).
I will not do it again, I guess; better to purchase a commercial one. But from what I've read, the cPlay project costs only $1000 and is comparable with the best CD players? I think the key is the sound card. I don't really believe jitter is the issue; rather, it is switching noise (interference).
- Have you discovered a digital source, that satisfies you, as much as your Turntable?