What is the Universe expanding into?

Do you think there was anything before the Big Bang?

  • I don't think there was anything before the Big Bang

    Votes: 56 12.5%
  • I think something existed before the Big Bang

    Votes: 200 44.7%
  • I don't think the Big Bang happened

    Votes: 54 12.1%
  • I think the universe is part of a multiverse

    Votes: 201 45.0%

  • Total voters
    447
Apropos of nothing, Kate Bush's "Cloudbusting" is about Wilhelm Reich, inventor of the Orgone Accumulator, IIRC. Another crank like Tesla IMO! :D

Earlier, Bonsai said: This will of course lead to a deeper question which is, what exactly is entropy...

I have been working at this. Boltzmann's Entropy formula is: S = k log W.

mcchambin seems content with the thermodynamic definition of Entropy: S, where dS = dQ / T.

That's all Greek to me! My understanding is that Entropy is the LACK of INFORMATION about the State of a system.

It increases with time in an initially ordered system. You can calculate the related Shannon Entropy of a digital system or signal.

Bonsai should be interested as an electronics man.

It's pure statistics and can be based on the Binary Logarithm: https://en.wikipedia.org/wiki/Binary_logarithm

This is ideal for coin toss experiments, or 1 and 0 binary bits, or spin up and spin down electron experiments.

https://math.ucr.edu/home/baez/entropy/


It's magic really. Just the principle that, for a 50:50 coin toss, you sum the probability times the logarithm of the probability over the two outcomes (a half each) and flip the sign. Maximum Entropy is a State of Ignorance before tossing the coin.

Which is H = -(1/2 x log₂ 1/2 + 1/2 x log₂ 1/2) = 1 bit.

If it always came out heads, that would give -(1 x log₂ 1 + 0 x log₂ 0) = 0 bits, taking 0 x log₂ 0 as 0. That would be a prepared (cheating) coin toss, or a Stern-Gerlach experiment where you have already fixed the silver atom's spin to a particular vertical outcome. It tells you nothing that you didn't know already.

That is a State of Maximum Information. It works for more outcomes too, like 4 or 8 States. This is elegant IMO.
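If anyone fancies playing with the numbers, here is a minimal sketch in Python (just my own illustration of the sum above, nothing official), using the usual convention that a zero-probability outcome contributes nothing:

```python
# Minimal sketch: Shannon entropy H = -sum(p * log2(p)) in bits,
# with zero-probability outcomes contributing nothing to the sum.
from math import log2

def shannon_entropy_bits(probabilities):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(shannon_entropy_bits([0.5, 0.5]))    # fair coin toss -> 1.0 bit
print(shannon_entropy_bits([1.0, 0.0]))    # always heads -> 0 bits (Python shows -0.0)
print(shannon_entropy_bits([0.25] * 4))    # 4 equally likely States -> 2.0 bits
print(shannon_entropy_bits([0.125] * 8))   # 8 equally likely States -> 3.0 bits
```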

Information Theory is a subject I have always liked.
 
www.hifisonix.com
Joined 2003
Paid Member
My understanding is that Entropy is the LACK of INFORMATION about the State of a system.

Interestingly, as you look back in time, you have 100% knowledge of the system. So it must be highly ordered; or, stated otherwise, you go from the future (infinite possibilities, and therefore highly disordered) to the past (highly ordered, because you know the exact outcomes).

Points to ponder!
 
Member
Joined 2009
Paid Member
My understanding is that Entropy is the LACK of INFORMATION about the State of a system.

Interestingly, as you look back in time, you have 100% knowledge of the system. So it must be highly ordered; or, stated otherwise, you go from the future (infinite possibilities, and therefore highly disordered) to the past (highly ordered, because you know the exact outcomes).

Points to ponder!
Interesting. An amazingly insightful real-time example appears when you put chaos theory to the test: toss an eraser randomly onto blotter paper numerous times, then connect the landing spots with a pencil and ruler, and the result is a highly ordered fractal design.
 
...highly disordered...

In statistical physics, entropy is a measure of the disorder of a system.

What disorder refers to is the number of ways that a system can be in a certain state.

The more ways, the higher the entropy.

It's always helpful to define the terms in an equation, like this:

S = k log W, where S = entropy, k = Boltzmann's constant and W = the number of ways.

And then attempt to make some sense of the equation, like this:

Take a system of four light bulbs, each of which can be either on or off:


Note: The more ways the light bulb system can be in a given state, the greater the entropy of that state.

Well, I tried! :geek:
 
That seems to have gone Horribly, Horribly Wrong, Galu... Hope you can fix it!

Disco-Pete, you are showing promise as a Mathematician. I always found "Chaos Theory" interesting.

Even delivered a Lecture on it. Nobody got it except my Buddy.

Let's see if you find this Physics/Maths joke funny... Your final test. Tense, eh?

Q: "What does the B in Benoit B Mandelbrot stand for?

A: Benoit B Mandelbrot! :D

Well. I thought it was funny. Maybe just me.
 
Well, the insert table function doesn't appear to have worked. I tried to edit it, but simply lost the data. So I'll try it my own way:

State / Ways, W / Entropy, S = k ln W (×10^-23 J/K)
None on / 1 / 0
One on / 4 / 1.91
Two on / 6 / 2.47
Three on / 4 / 1.91
Four on / 1 / 0
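For anyone who wants to check the arithmetic, here's a rough Python sketch (mine, purely for illustration) that reproduces those numbers:

```python
# Rough sketch: number of ways W that n of the 4 bulbs can be on
# (a binomial coefficient), and the corresponding entropy S = k ln W.
from math import comb, log

k = 1.380649e-23  # Boltzmann's constant, J/K

for n in range(5):
    W = comb(4, n)   # ways to choose which n of the 4 bulbs are on
    S = k * log(W)   # natural log, so S = k ln W
    print(f"{n} on: W = {W}, S = {S / 1e-23:.2f} x 10^-23 J/K")
```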
 
Member
Joined 2009
Paid Member
That seems to have gone Horribly, Horribly Wrong, Galu... Hope you can fix it!

Disco-Pete, you are showing promise as a Mathematician. I always found "Chaos Theory" interesting.

Even delivered a Lecture on it. Nobody got it except my Buddy.

Let's see if you find this Physics/Maths joke funny... Your final test. Tense, eh?

Q: "What does the B in Benoit B Mandelbrot stand for?

A: Benoit B Mandelbrot! :D

Well. I thought it was funny. Maybe just me.
Wow, that's amazing! He merely contemplated his own name and came up with a maths equation!
Awe-inspiring indeed. :hypno2:
 
www.hifisonix.com
Joined 2003
Paid Member
I’m challenging the orthodoxy here 😊

If there are infinite possibilities (or, let's say, possibilities spanning many orders of magnitude) ahead of us, once some of those events 'happen' we know exactly what the outcome is. If we go from 'countless possibilities' to 'knowledge', surely it's more ordered, not less?

Ok, some may argue that in acquiring the system's outcomes, entropy must be invoked, but is that the same thing?

Seems to me there is a logical fallacy buried in there somewhere.

(donning 🔥 suit now 🤣🤣🤣)
 
If there are infinite possibilities (or, let's say, possibilities spanning many orders of magnitude) ahead of us, once some of those events 'happen' we know exactly what the outcome is. If we go from 'countless possibilities' to 'knowledge', surely it's more ordered, not less?

In Boltzmann's view, a disordered state is one where there are a large number of possible equally probable arrangements.

Entropy doesn't depend on when these arrangements 'happen'; it depends on how many of them there are.

A state that can be realised in more ways is more probable, so the higher the probability, the higher the entropy.

In fact, the W in S = k log W is inspired by the German word for probability, Wahrscheinlichkeit.
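To tie that back to the light bulb example above: the 'two on' state can be realised in W = 6 of the 16 equally likely arrangements, so it turns up with probability 6/16 and has the largest entropy, k ln 6; 'all off' can happen in only W = 1 way, with probability 1/16 and entropy zero.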
 
If the probability of a particle spinning around the nucleus of an atom was just 1 for atoms all across the universe, before electrons were generated by someone (maybe God, maybe Dog 🙃), what's the entropy then? It makes probability less probable and more like concrete fact. Is there an underlying force that dictates entropy too?
 
The entropy equation, does it have any practical use in the world we live on?

The Boltzmann statistical entropy equation or formula applies to the special case of an ideal gas (a hypothetical gas whose molecules occupy negligible space and have no interactions, and which consequently obeys the gas laws exactly).

Things get more complicated when we deal with more complex systems, but the Boltzmann equation is a step on the way to understanding the world we live on.
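To give one concrete illustration of that practical side (a sketch of my own, with made-up numbers rather than anything from a real problem): for a reversible isothermal expansion of an ideal gas, integrating dS = dQ/T gives ΔS = nR ln(V2/V1), which is easy to check numerically:

```python
# Sketch: entropy change of an ideal gas in a reversible isothermal expansion,
# obtained by integrating dS = dQ/T, which gives delta_S = n * R * ln(V2/V1).
from math import log

R = 8.314  # molar gas constant, J/(mol K)

def entropy_change_isothermal(n_moles, V1, V2):
    """Entropy change in J/K when n moles of ideal gas go from V1 to V2 at constant T."""
    return n_moles * R * log(V2 / V1)

# Example (made-up numbers): one mole doubling its volume gains R ln 2, about 5.76 J/K.
print(entropy_change_isothermal(1.0, 1.0, 2.0))
```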
 

TNT

Member
Joined 2003
Paid Member
The Boltzmann statistical entropy equation or formula applies to the special case of an ideal gas (a hypothetical gas whose molecules occupy negligible space and have no interactions, and which consequently obeys the gas laws exactly).

Things get more complicated when we deal with more complex systems, but the Boltzmann equation is a step on the way to understanding the world we live on.
How do I understand it better with this entropy insight?

//
 
I am a Mathematician.

According to my sources, you are a physicist and telecommunications engineer who is "quite good at maths".

Not to mention, crocodile wrestler! :D
 

Attachments
  • Steve in Africa.jpg