Sound Quality Vs. Measurements

Status
Not open for further replies.
Hi,

Yes, really. Dealing with other problems is not baffle step compensation. Valid problems, yes, and sometimes addressable with step filters, but they are not BSC.

Let me make this as simple and clear as I can.

The issue is that at the baffle step frequency what really happens is not a drop in LF output, but a change in directivity from 2-pi (half-space) to 4-pi (full-space) radiation. There is no attenuation of the driver output. However, if you use an anechoic measurement (or the now common pseudo-anechoic measurements) you will measure attenuated bass, because you explicitly exclude the room influence.

If you then apply "theoretical" Baffle Step Correction, your speaker will have a 6dB LF boost in the power response and the in-room response. Real studio monitors generally lack this, as it would give a result that was not "true".

As modern speakers (DIY or commercial) almost all have similar baffle widths (almost no-one makes speakers that make acoustic sense anymore), you may find that the common BSC corner frequencies and boost levels happen to match the corner frequency and level of bass boost needed for listening at around 70...75dB SPL.
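To put numbers on the "similar baffle width" point: a commonly quoted rule of thumb puts the baffle step centre frequency at roughly f3 = 115/W, with W the baffle width in metres. A trivial sketch (the constant 115 is the usual approximation, not an exact diffraction model):

```python
# Rule-of-thumb baffle step frequency: f3 ~ 115 / W, W = baffle width in metres.
# The constant 115 is the commonly quoted approximation, not an exact model.

def baffle_step_frequency(width_m: float) -> float:
    """Approximate baffle step centre frequency for a given baffle width."""
    return 115.0 / width_m

for width_cm in (20, 25, 30, 40):
    f3 = baffle_step_frequency(width_cm / 100.0)
    print(f"{width_cm} cm baffle -> f3 ~ {f3:.0f} Hz")
```

Since most modern boxes are around 20...30cm wide, the step lands in the same few-hundred-Hz region almost regardless of the design, which is why the "standard" BSC recipes appear to work.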

This whole BSC thing is just one more illustration of how dogmatic and moronic Audio has become and how little people in audio nowadays bother to consider real evidence (regardless which "camp" they belong to).

Ciao T
 
Hi,

Don't know what you mean by "extended checksum algorithms found in binary Usenet posts", but if two files have the same md5 checksum they _are_ identical, period.
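For anyone who wants to check this themselves, a minimal sketch of comparing two files by md5 (the file names are hypothetical):

```python
import hashlib

def md5_of(path: str, chunk_size: int = 1 << 20) -> str:
    """MD5 of a file, read in chunks so large files fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical file names, for illustration only:
# if md5_of("rip_normal.wav") == md5_of("rip_safemode.wav"):
#     print("same MD5")
```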

A source file was repaired from rar and par/par2 files and checked against the checksum (or whatever is in it) in an sfv file; it passed. The file was corrupt.

Re-downloading precisely the same posts, repairing with different par files, and comparing to the SAME (not a new) sfv file, the file again passed, but this time was not corrupt.

Note, I stopped using Usenet ages ago (around the time my ISP Blueyonder dropped retention from 180 days to one week), so things may be different now...

Ciao T
 
dvv,
OK, you have two CDs: one stamped, one a copy. Use good software and rip them both to the hard drive in a lossless format. Do a bit comparison to verify they actually are the same. If so, play them both back through your best DAC and listen. It would be hard to come up with a reason consistent with what we know about physics if there were a difference. This removes the transport and media from the equation.

Now, there's an idea.

If I caught your gist, which correlates with mine, it could well be the software. Just because it's Nero, and hence supposed to be good, doesn't mean it actually is.

Either way, I should try different software, just to be on the safe side.
 
Go for it, SY and Scott. Make great products from IC's! Do it well, and I will congratulate you.

Not to put too fine an edge on this, but I believe that you can make good stuff with any technology, assuming you are fully aware of both its good and bad sides.

IC op amps have gained a relatively bad reputation because, in my view, they were in fact misused, with some of the blame lying at the door of their manufacturers and their eager-beaver data sheet authors.

As I see it, the manufacturer's specs regarding available current from op amps should not even be read, because in most cases they are well-nigh unusable. They will say that the op amp can deliver, say, 20 mA of current, when in practice it starts to choke up and go wild below 10 mA. At least, such is my experience.

The cure is to use discrete transistors as current boosters. This of course complicates circuits, and some start wondering why they hadn't used discretes right from the start, but these op amps CAN in fact be made to sound rather good.

And you don't need any special components; the usual suspects among small-signal models will usually work out just fine, although there may be differences. I got rather good results with the classic models, such as MPSA56/06, even BC546/556, but I think (it may be an illusion) I got the best from BC639/640.

If there's a pattern, I'd say the difference is at its greatest with circuits which have a lot of capacitors in the signal path.

I have to run now, but later on in the day, I'll draw a few examples just to illustrate the point.
 
Hi,

Thorsten, if you "repaired" a file you altered it. This is an entirely different story and has nothing to do with the original topic.

Clearly you do not understand how large binary files are distributed via Usenet (or at least how it worked in the early 2000s). They are broken up into 50MB chunks and come with parity files (basically the same way parity works on striped RAID5 disks).

So if you found one of the blocks missing or corrupt (which happened often), you could use a program to re-create the missing information.
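The one-missing-block case reduces to plain XOR parity, exactly as on RAID5. A toy sketch of the principle (real par/par2 files use Reed-Solomon codes and can rebuild several missing blocks; this shows only the simplest case):

```python
# Toy single-block recovery via XOR parity (the RAID5 idea). Real par/par2
# files use Reed-Solomon coding; this shows only the one-missing-block case.

def xor_blocks(blocks: list[bytes]) -> bytes:
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data = [b"chunk-01", b"chunk-02", b"chunk-03"]
parity = xor_blocks(data)                         # stored alongside the data

rebuilt = xor_blocks([data[0], data[2], parity])  # data[1] went missing
assert rebuilt == data[1]
```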

Additionally, the file set always included an sfv file (which basically stores CRC32 checksums) for the original file(s).
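An sfv entry is just a file name followed by its CRC32 as eight hex digits. A sketch of computing that value (the example name and value in the comment are made up):

```python
import zlib

def crc32_of(path: str, chunk_size: int = 1 << 20) -> str:
    """CRC32 of a file, formatted as the 8 hex digits an .sfv file stores."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            crc = zlib.crc32(chunk, crc)
    return f"{crc & 0xFFFFFFFF:08X}"

# A hypothetical .sfv entry then looks like:
#   somefile.part01.rar 3E25960A
```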

Once the complete file was re-created it PASSED the sfv test, which should ensure the file was binary-identical to the original. However, despite passing this test, the actual file was in fact corrupt compared to the original.

In other words, a file that had the same CRC32 checksum was not binary-identical to the original. And not just once.

Hence my point: if the files have the same checksum, there is no guarantee that they are identical.

Ciao T
 
Hi,

It was not about CD encoding. The claim was that two identical (!) files on a harddisk sound different depending on the mode in which they were ripped.

From what I read there was no claim that the files were identical, only that they had the same checksum.

One thing that I have found to affect computer playback was whether the HDD (or more precisely the file) was de-fragmented or not. Here the files clearly are identical. Of course, there are very good reasons why a fragmented file could sound worse on playback.

There are many variables in this story and not all of them have been shown to have been equalised out to make sure "all else being equal".

For fun, at a past CES a dealer of our US distributor brought along a CD copier that he claimed improved the sound quality of the copy over the original (he was also selling this contraption, BTW).

He was crestfallen when, on our system (AMR end to end), the copies sounded markedly worse than the originals. Funnily enough, on the more mainstream "High End" main system we all felt the copies sounded less annoying, less edgy and more relaxed.

I did a binary compare and an analysis of a pink noise copy, which revealed the copy not to be bit-perfect but to have gone through a rather poor quality sample rate converter. The fun part was that this was basically a plain computer copier, not an audio copier; still, it detected "audio" CDs and copied them as audio through an ASRC, rather than making the bit-perfect copies it did with data CDs...

We all learned several interesting lessons on that day. Including not to make too many assumptions.

Ciao T
 
"Clearly you do not unterstand..." I´m working in the computer industry for more than 17 years now and clearly have a better understanding of these things than you.
You might want to look up what an md5sum or sha sum is in comparison to a crc.
The original claim was that the files are identical (which can be checked with a sha256 hash, for example, and not with some unknown crc implementation of an unspecified compression program), yet they "sound" different when played on the same hardware.
I doubt you will encounter a sha256 collision during our short lifetime.
 
Hi,

"Clearly you do not unterstand..." I´m working in the computer industry for more than 17 years now

Yes, I remember when I used to be that young... You are talking to a CNE and MCE holder here who used to hack into computers using acoustic couplers and "portable" PCs via public phone booths...

and clearly have a better understanding of these things than you. You might want to look up what a md5sum or sha sum is in comparison to crc.

These were not available to me at the time. What was available was sfv/crc32... In fact, with the very rare (and of course strictly legal) downloads I make, I still usually find sfv files, and md5 rather more rarely.

The original claim was that the files are identical (which can be checked by sha256 hash for example (and not with some unknown crc implementation of an unspecified compression program)) but yet do "sound" different when played on the same hardware.

That was not the claim in the OP's thread; it was:

"I had already ripped this same CD before, with my PC running in "normal" mode, so I was able to compare the safemode rip, to the normal rip. Both rips were done using EAC. Both files had the same checksum. Both files were the same size. Both files sounded different!

Both files were ripped as .wav files. Both files played back using the same software. Both files played back at the same amplitude (no touching the volume knob)"

I can see two immediate issues:

1) The files are actually different but hash to the same CRC32, and the OP used CRC32.

2) The file ripped in "normal" mode was contiguous and the one ripped in "safe mode" fragmented (or the other way around).

I doubt you will encounter a sha256 collision during our short lifetime.

sha256 is rather different from CRC32, wouldn't you say? Did you confirm that the OP in the "files sound different" thread used this and not CRC32?

My point is that I do not accept that two files were binary-identical solely on the basis of "same checksum", as I have had enough experience with this not sufficing; plus, there are secondary mechanisms at play (incidentally, these also provide for the possibility of truly identical files sounding different when played from flash memory or hard disk...).

So a little more detailed investigation is in order before we make pronouncements, lest we make them from half-knowledge coupled with a double dash of prejudice.
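The detailed investigation is cheap to do: skip checksums entirely and compare byte for byte. A sketch (file names hypothetical):

```python
import filecmp

# A byte-for-byte comparison settles "binary identical" with no hash at all.
def binary_identical(path_a: str, path_b: str) -> bool:
    # shallow=False forces filecmp to read and compare the file contents
    # instead of trusting the os.stat signatures.
    return filecmp.cmp(path_a, path_b, shallow=False)

# Hypothetical names for the OP's two rips:
# print(binary_identical("rip_normal.wav", "rip_safemode.wav"))
```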

Ciao T
 
The OP did not specify how he determined the files are identical; he just said they are. So his claim _is_ that they are identical. This should be proved with a sha256 checksum (which _is_ perfectly reliable).
"possibility of truly identical files sounding different if played from Flash Memory or Hard Disk...)." Thats BS.

"MCE holder" isn´t it called MCSE ? I always knew those windows people live in a different world that we Unix/Linux people do (I have a RHCE).
 
Hi,

The OP did not specify how he determined the files are identical; he just said they are. So his claim _is_ that they are identical.

You must actually READ what other people are writing. He claimed they had identical checksums, with no further qualification. Unless you ask, you do not know what was used.

"possibility of truly identical files sounding different if played from Flash Memory or Hard Disk...)." Thats BS.

Really? Have you ever measured jitter on the output signal for comparison, with the file heavily fragmented and then de-fragmented? Or have you ever looked at the correlation between the jitter at the PC's output and PSU modulation?

If you did so and found nothing, for a number of different configurations and hardware, then you can make informed comments on the topic. Otherwise, I believe the BS is exiting from your keyboard. You really should pipe that sort of output to /dev/null...

"MCE holder" isn´t it called MCSE ? I always knew those windows people live in a different world that we Unix/Linux people do (I have a RHCE).

Mickey$soft has several now; I did mine in the original program, when MS SQL Server was still Sybase stuffed into an M$ box. I cannot remember what that certification was called, too long ago. Before that came the CNE, which is Novell's (if you remember them).

My point is that I have been, and still am, seriously involved with computers pretty much since they were available to individuals (I cut my "computer teeth" on Z80-based stuff and East German/Russian copies of IBM mainframes, replete with punch cards, tapes and teletypes).

Ciao T
 
Jan, when dealing with the crude error correction of a CD, error correction is a best guess, not a true correction like we had with data formats such as the old 7-of-9 code or GCR on the old tape drives, let alone application-layer methods. No ECC or CRC bytes, no retry, no blocking out bad blocks. Dust comes and goes. It is interpolation, as you suggest. There is no true error correction, as there is no redundant data from which to extract a correction. [snip]

Sorry, this is incorrect. Red Book error correction restores the data to 100% correct bits. After error correction, the error is gone, completely. No trace left. Read about it.

jan
 
"You must actually READ what other people are writing. He claimed they had identical checksums with no further qualifications."

It seems _you_ must read, because that's exactly what I said:
"The OP poster did not not specify how he determined the files are identical he just said they are."

As for different jitter values from fragmented vs. non-fragmented disks, where are those measurements? You are aware that file reads are cached? (Well, not on an old Z80 machine (I still have my ZX81 somewhere), but of course on any recent OS.)
 
Hi,

Sorry, this is incorrect. Red Book error correction restores the data to 100% correct bits.

Unless error correction is unsuccessful and error concealment kicks in. It is fun to actually make an MCU display when that happens (it needs very little code) and see how some CDs (even though they do not look very scratched) show very high uncorrectable error rates...
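When correction fails, the player falls back to concealment, essentially guessing the flagged sample from its neighbours. A toy sketch of that fallback only (this illustrates interpolation, not the actual CIRC decoder):

```python
# Toy error *concealment*: replace samples flagged as uncorrectable with the
# average of their neighbours. Real players run CIRC (cross-interleaved
# Reed-Solomon) correction first; interpolation is only the last resort.

def conceal(samples: list[int], bad: set[int]) -> list[int]:
    out = list(samples)
    for i in sorted(bad):
        left = out[i - 1] if i > 0 else 0
        right = out[i + 1] if i + 1 < len(out) else 0
        out[i] = (left + right) // 2
    return out

print(conceal([100, 200, 999, 400], bad={2}))  # -> [100, 200, 300, 400]
```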

Ciao T
 
Hi,

"You must actually READ what other people are writing. He claimed they had identical checksums with no further qualifications."

It seems _you_ must read, because that´s exactly what I said:
"The OP poster did not not specify how he determined the files are identical he just said they are."


In fact, I tried to find where the OP said so and did not find it, only his comment that the checksums matched.

The claim that "the files are identical" is not from the OP; it is from you.

As for different jitter values from fragmented vs. non-fragmented disks, where are those measurements? You are aware that file reads are cached?

First, I have done some, but they are not available for posting. Over at diyhifi.org there are some quite enlightening discussions on this (generally by people who have left, or been made to leave, diyA) and related topics. I rather doubt they would let you in, though.

Yes, I am aware that reads are cached. However, what does this have to do with the price of tea in China?

An HDD that reads a contiguous data stream from a contiguous file, without any other disk activity causing it to seek, will have a very different current draw from one that has to seek for fragments (and maybe write a bit of virtual memory at the same time).

The results can be observed with a 'scope on the supply lines, in the voltages developed across non-zero ground impedances, AND by observing the jitter pattern of any clock powered from the same supply.

In MOST cases in computer audio, the supplies for the clocks that ultimately drive the audio subsystems vary between quite bad and truly awful... A fairly basic 'scope easily suffices to observe this.

Ciao T
 