Apropos of nothing, Kate Bush's "Cloudbusting" is about Wilhelm Reich, inventor of the Orgone Accumulator, IIRC. Another crank like Tesla IMO! 😀
Earlier, Bonsai said: This will of course lead to a deeper question which is, what exactly is entropy...
I have been working at this. Boltzmann's Entropy formula is: S = k log W.
mcchambin seems content with the thermodynamic definition of Entropy, S, where dS = dQ / T.
That's all Greek to me! My understanding is that Entropy is the LACK of INFORMATION about the State of a system.
It increases with time in an initially ordered system. You can calculate the related Shannon Entropy of a digital system or signal.
Bonsai should be interested as an electronics man.
It's pure statistics and can be based on the Binary Logarithm: https://en.wikipedia.org/wiki/Binary_logarithm
This is ideal for coin toss experiments, or 1 and 0 binary bits, or spin up and spin down electron experiments.
https://math.ucr.edu/home/baez/entropy/
It's magic really. Just the principle that, for a 50:50 coin toss, you take each outcome's probability times the logarithm of that probability, add the two terms together and flip the sign. Maximum Entropy is a State of Ignorance before tossing the coin.
Which is H = -(1/2 × log2(1/2) + 1/2 × log2(1/2)) = 1 bit.
If it always came out heads, that would give H = -(1 × log2(1) + 0 × log2(0)) = 0 bits, taking 0 × log2(0) as 0. That would be a prepared (cheating) coin toss, or a Stern-Gerlach experiment where you have already fixed the silver atom's spin to a particular vertical outcome. It tells you nothing that you didn't know already.
That is a State of Maximum Information. It works for more outcomes too, like 4 or 8 States. This is elegant IMO.
Information Theory is a subject I have always liked.
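A minimal sketch of that sum in Python (my own illustration, not from the post; the probability lists below are just the assumed examples: a fair coin, a fixed coin, and 4 or 8 equally likely states):

```python
import math

def shannon_entropy_bits(probs):
    # H = sum over outcomes of -p * log2(p), with 0 * log2(0) taken as 0
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([0.5, 0.5]))    # fair coin toss: 1.0 bit
print(shannon_entropy_bits([1.0, 0.0]))    # prepared (cheating) coin: 0.0 bits
print(shannon_entropy_bits([0.25] * 4))    # 4 equally likely states: 2.0 bits
print(shannon_entropy_bits([0.125] * 8))   # 8 equally likely states: 3.0 bits
```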
My understanding is that Entropy is the LACK of INFORMATION about the State of a system.
Interestingly, as you look back in time, you have 100% knowledge of the system. So it therefore must be highly ordered, or stated otherwise, you go from the future - infinite possibilities and therefore highly disordered - to the past, highly ordered because you know the exact outcomes.
Points to ponder!
Another crank like Tesla IMO! 😀
I caught some of the Goosebumps 2 movie this afternoon, and was surprised to see that it featured the "Power of Tesla".
My understanding is that Entropy is the LACK of INFORMATION about the State of a system.
Interestingly, as you look back in time, you have 100% knowledge of the system. So it therefore must be highly ordered, or stated otherwise, you go from the future - infinite possibilities and therefore highly disordered - to the past, highly ordered because you know the exact outcomes.
Points to ponder!
Interesting. An amazingly insightful real-time example of putting chaos theory to the test: like tossing an eraser randomly onto blotter paper numerous times, then connecting the landing spots with a pencil and ruler, resulting in a highly ordered fractal design.
...highly disordered...
In statistical physics, entropy is a measure of the disorder of a system.
What 'disorder' refers to is the number of ways that a system can be in a certain state.
The more ways, the higher the entropy.
It's always helpful to define the terms in an equation, like this:
S = k log W, where S = entropy, k = Boltzmann's constant, W = the number of ways, and log is the natural logarithm.
And then attempt to make some sense of the equation, like this:
Take a system of four light bulbs, each of which can be either on or off:
Note: The more ways the light bulb system can be in the same state, the greater the entropy.
Well, I tried! 🤓
That seems to have gone Horribly, Horribly Wrong, Galu... Hope you can fix it!
Disco-Pete, you are showing promise as a Mathematician. I always found "Chaos Theory" interesting.
Even delivered a Lecture on it. Nobody got it except my Buddy.
Let's see if you find this Physics/Maths joke funny... Your final test. Tense, eh?
Q: "What does the B in Benoit B Mandelbrot stand for?
A: Benoit B Mandelbrot! 😀
Well, I thought it was funny. Maybe just me.
Well, the insert table function doesn't appear to have worked. I tried to edit it, but simply lost the data. So I'll try it my own way:
State / Ways, W / Entropy, S (× 10^-23 J/K)
None on / 1 / 0
One on / 4 / 1.91
Two on / 6 / 2.47
Three on / 4 / 1.91
Four on / 1 / 0
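As a rough sketch (my own code, not part of the post), the table can be reproduced in Python, assuming W = "4 choose n" ways of having n bulbs on and S = k ln W, with Boltzmann's constant k ≈ 1.38 × 10^-23 J/K and S quoted in units of 10^-23 J/K to match the table:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

# Four light bulbs, each either on or off; W = number of ways n bulbs can be on.
for n_on in range(5):
    W = math.comb(4, n_on)     # "4 choose n_on"
    S = k_B * math.log(W)      # Boltzmann entropy S = k ln W, in J/K
    print(f"{n_on} on: W = {W}, S = {S / 1e-23:.2f} x 10^-23 J/K")
```

The printed values match the 0, 1.91, 2.47, 1.91, 0 column above.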
Q: "What does the B in Benoit B Mandelbrot stand for?"
A: Benoit B Mandelbrot! 😀
Wow, that's amazing! He merely contemplated his own name and came up with a math equation!
Awe-inspiring indeed.

I’m challenging the orthodoxy here 😊
If there are infinite possibilities (or, let's say, possibilities of many orders of magnitude) ahead of us, then once some of those events 'happen' we know exactly what the outcome is. If we go from 'countless possibilities' to 'knowledge', surely it's more ordered, not less?
OK, some may argue that in acquiring the system's outcomes, entropy must be invoked, but is that the same thing?
Seems to me there is a logical fallacy buried in there somewhere.
(donning 🔥 suit now 🤣🤣🤣)
Not sure that knowledge counts in the entropy accounting, but even if it does, entropy is a long-term trend. Lots of things get more ordered, temporarily. The graph waggles up and down, but the trend line is relentless...
If there are infinite possibilities (or, let's say, possibilities of many orders of magnitude) ahead of us, then once some of those events 'happen' we know exactly what the outcome is. If we go from 'countless possibilities' to 'knowledge', surely it's more ordered, not less?
In Boltzmann's view, a disordered state is one where there are a large number of possible equally probable arrangements.
Entropy doesn't depend on when these arrangements 'happen'; it depends on how many of them there are, i.e. on the thermodynamic probability of the state.
The higher that probability, the higher the entropy.
In fact, the W in S = k log W is inspired by the German word for probability, Wahrscheinlichkeit.
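A quick aside (my own illustration, not from the post above) on why the logarithm is the natural choice: when two independent systems are combined, their numbers of ways multiply, while their entropies simply add, because k ln(W1 × W2) = k ln W1 + k ln W2. A tiny Python check, with W1 and W2 as arbitrary illustrative values:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

W1, W2 = 6, 4  # ways for two independent subsystems (illustrative values only)
S1 = k_B * math.log(W1)
S2 = k_B * math.log(W2)
S_combined = k_B * math.log(W1 * W2)   # combined system has W1 * W2 ways

print(math.isclose(S_combined, S1 + S2))   # True: entropies add when ways multiply
```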
The probability of a particle spinning around the nucleus of an atom was only 1 for atoms all across the universe, before electrons were generated by someone (maybe God, maybe Dog 🙃). What's the entropy then? It makes the probability less of a probability and more of a concrete fact. Is there an underlying force that dictates entropy too?
The entropy equation, does it have any practical use in the world we live on? I mean, did it help us go to the Moon, or make washing machines work...?
I think Entropy is important. Seems one of those fundamental Engineering ideas about which we could learn a thing or two.
But not my department. I am a Mathematician. And I read Euler:
https://en.wikipedia.org/wiki/Leonhard_Euler
Master of us all.
If you want to stretch me, I find the 24-Cell interesting:
https://en.wikipedia.org/wiki/24-cell
It has no analogue in 3D! 😀
The entropy equation, does it have any practical use in the world we live on?
The Boltzmann statistical entropy equation or formula applies to the special case of an ideal gas (a hypothetical gas whose molecules occupy negligible space and have no interactions, and which consequently obeys the gas laws exactly).
Things get more complicated when we deal with more complex systems, but the Boltzmann equation is a step on the way to understanding the world we live on.
The entropy equation...
Then, of course, there is the Boltzmann Equation or Boltzmann Transport Equation (BTE).
Get your chops round this: https://en.wikipedia.org/wiki/Boltzmann_equation
The Boltzmann statistical entropy equation or formula applies to the special case of an ideal gas (a hypothetical gas whose molecules occupy negligible space and have no interactions, and which consequently obeys the gas laws exactly).
Things get more complicated when we deal with more complex systems, but the Boltzmann equation is a step on the way to understanding the world we live on.
How do I understand it better with an entropy insight?
I am a severely Practical Man. If I had listened to my good friend Galu, I would have got up early this morning and sought a piccie of a Lunar Eclipse!
https://www.bbc.co.uk/news/world-61461082
Happily good sense intervened. Waste of my time. I have seen these things before. In Entropy terms, there was no surprise.
I am a Mathematician.
According to my sources, you are a physicist and telecommunications engineer who is "quite good at maths".
Not to mention, crocodile wrestler! 😀