Difference between Class D and "Digital Amplifiers"

We are now down to the fact that everything in the real world is analog. That is fair. But contrast it with the meaning of digital as in "symbolic".
Yes, printed letters on paper are analog from a normal point of view (I won't argue the quantum-mechanics side of things). Yes, the pixels on my LCD screen are analog, even if fixed to a number of intensities; even this part of the real world is analog. And inside a computer the signals really are analog signals, carrying analog pulses, processed by analog circuits such as transistors.

But is this fundamental "analogness" useful when we discuss analog vs. digital processing? I feel that taking a purely physical approach renders the whole concept of digital information meaningless. Then not only does digital amplification make no sense, even the word digital makes no sense. If a PCM signal is not a transfer of digital information, then I will stop arguing right here. To my mind it makes no sense to apply the "everything is analog" definition in its broadest sense.
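
To make the PCM point concrete, here is a minimal Python sketch (the 4-bit word size, the sample values, and the name quantize are my own illustration, not anything from a real codec): an analog sample is mapped to a code word, and from that moment on only the symbol carries the information.

    def quantize(sample: float, bits: int = 4) -> int:
        """Map an analog sample in [-1.0, 1.0) to a PCM code word."""
        levels = 2 ** bits
        code = int((sample + 1.0) / 2.0 * levels)
        return min(code, levels - 1)  # clamp the top edge of the range

    analog_samples = [0.0, 0.13, 0.5021, -0.77]   # "real world" voltages
    pcm = [quantize(s) for s in analog_samples]
    print(pcm)  # [8, 9, 12, 1] -- the symbols are now the signal

Whether the wire carrying those code words wobbles a little is irrelevant, as long as the receiver can still tell an 8 from a 9.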
 
Your remark is pretty much on the money imho. "Digital" is a very abstract notion. As soon as you're representing digital information in the physical world, you're necessarily back to analogue. Then the only difference between an "analogue" and a "digital" signal becomes whether it's subsequently interpreted as symbolic or not.

If my monitor is misaligned and I'm reading text off it, I lose no information as long as the text is legible. If, on the other hand, I'm looking at the photograph on my Windows desktop, I do lose some information when the monitor is misaligned.

(If I'm a graphic designer who is also looking at the font used in the text, I do lose information, again demonstrating that the content of the text is "digital" while its visual representation is analogue.)
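
That monitor analogy can be put in code. A toy sketch (the noise level, the 0.5 threshold, and the copy count are made-up numbers, and copy_analog/copy_digital are hypothetical names): a raw value copied a hundred times accumulates noise, while the same value reinterpreted as a symbol at every copy stays put.

    import random

    def copy_analog(v: float) -> float:
        return v + random.gauss(0, 0.05)           # every copy adds noise

    def copy_digital(v: float) -> float:
        bit = 1 if v > 0.5 else 0                  # reinterpret as a symbol
        return float(bit) + random.gauss(0, 0.05)  # re-emit with fresh noise

    a = d = 1.0
    for _ in range(100):                           # 100 generations of copies
        a = copy_analog(a)
        d = copy_digital(d)

    print(round(a, 3))  # random walk: typically drifts ~0.5 away from 1.0
    print(round(d, 3))  # still within one noise step of 1.0

The physical medium is identical in both cases; only the symbolic interpretation makes the second one "digital".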

Although the implementation of digital circuits is by necessity an analogue affair, this does not make them analogue. Programmers do a fully digital job. Many people get a degree in computer science without knowing what a bit "looks like". To them, a bit is a symbolic quantity in its fullest abstract sense.
 
Nixie said:

A BSc in computer science must be the only Bachelor of Science degree that can be gotten without knowing the scientific method. They call it computer 'science' why, exactly?
Ehrmm... There is no contradiction between operating in the abstract domain and using perfectly scientific methods. In fact, the mathematical and logical sciences operate almost entirely in the abstract domain.
 
I know at least one MSEE who designed CPUs using Verilog but knows virtually nothing about electrical engineering or electronics.

Boolean algebra is just so elementary that it doesn't lend itself to many other problems. Try solving the question of how many colors it takes to color a map of states such that no two states sharing a common border get the same color. Or the twin-photon paradox.
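
For what it's worth, the map question can at least be stated as code. A hedged sketch (the border graph and the name colorable are made up for illustration): a backtracking search for the smallest number of colors, which is combinatorial search rather than Boolean algebra.

    def colorable(adj: dict, k: int, coloring=None, nodes=None) -> bool:
        """Can the graph be colored with k colors so no neighbors match?"""
        if coloring is None:
            coloring, nodes = {}, list(adj)
        if not nodes:
            return True
        node, rest = nodes[0], nodes[1:]
        for c in range(k):
            if all(coloring.get(nb) != c for nb in adj[node]):
                coloring[node] = c
                if colorable(adj, k, coloring, rest):
                    return True
                del coloring[node]
        return False

    # four hypothetical states whose borders form a complete graph
    borders = {"A": {"B", "C", "D"}, "B": {"A", "C", "D"},
               "C": {"A", "B", "D"}, "D": {"A", "B", "C"}}
    print(min(k for k in range(1, 5) if colorable(borders, k)))  # -> 4

The four-color theorem guarantees four colors always suffice for a planar map, but proving that took far more than bit-level reasoning, which I take to be the point.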
 