**About Guustaaf Damave**
I am a web developer and got swept up in Matt's grand project of cataloguing human history and civilization in a website and 100 cartoons. 😀 Seriously, check out the 21st century grid, the new experimental games, and the link machine, and let us know what you think!

contact@entropynation.com
### Entropy

Entropy is a most mysterious and enigmatic concept, and it has been growing in public parlance recently, becoming more commonplace all the time. There are now Allstate Insurance commercials featuring a character called “Mayhem” who causes havoc to spread wherever he goes:
https://www.youtube.com/watch?v=pLoHQtkj8pI
In today’s 24-hour-news-cycle world, where disasters and suffering are available at the click of a mouse, entropy seems to have entered our collective unconscious. But the concept of entropy transcends these everyday notions.

Basically, entropy relates to disorder, to chaos and randomness. Entropy also relates to the so-called “arrow of time”: the idea that time flows only forward, from past to present to future, and never backwards. Time never runs backwards even though, theoretically, nothing in the equations of physics forbids backwards time (−t). Why? This is one of the great unanswered questions in physics today, and it has profound implications.

The thermodynamic definition of entropy concerns how a “system” of molecules is ordered. This “statistical” definition is:

S = k_B ln Ω

where S is the total entropy of the system, Ω is the number of states that the molecules can arrange themselves into, ln is the natural logarithm*, and k_B is the Boltzmann constant, named for Ludwig Boltzmann, a brilliant Austrian physicist who has this famous equation engraved on his tombstone.
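As a minimal numeric sketch of Boltzmann's formula (a toy coin-flip system stands in for real molecules; the coin count is an illustrative assumption, not anything from physics):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(omega):
    """Statistical entropy S = k_B * ln(Omega) for Omega accessible microstates."""
    return K_B * math.log(omega)

# Toy system: 4 distinguishable coins, each heads or tails,
# so Omega = 2**4 = 16 microstates.
s = boltzmann_entropy(16)
print(s)  # a tiny number, since k_B is ~1e-23 J/K

# Doubling the number of microstates adds exactly k_B * ln(2) to S:
delta = boltzmann_entropy(32) - boltzmann_entropy(16)
print(math.isclose(delta, K_B * math.log(2)))  # True
```

Because entropy depends on the *logarithm* of the microstate count, multiplying Ω adds to S, which is why entropies of independent systems simply add together.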
The total change of entropy in a defined thermodynamic system is given by the Second Law of Thermodynamics**:

dS ≥ δQ / T

or, in integral form,

ΔS ≥ ∫ δQ / T

Note that, in this universe, the sign of dS is always positive (+). dS is the total change in entropy of a closed system, δQ is the amount of heat added (or subtracted), and T is the temperature of the system. This definition speaks to the fact that not all of the energy in a system is available for work: there is no perfectly efficient system. Any closed system will undergo a net INCREASE in entropy, like an auto engine in which some of the internal energy goes into unwanted heating of the air instead of all of it spinning the wheels. Two more examples of increasing entropy:

(1 – thermodynamic) A box of heated gas is divided by a panel separating the gas from the other side; when the panel is removed, the heated gas eventually diffuses evenly throughout the box, and the entropy of this system will have increased.
(2 – statistical) A playing-card tower is knocked down, going from a state of higher order to lower order, hence an increase in entropy.
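The gas-in-a-box example can be made quantitative with Boltzmann's S = k_B ln Ω. A rough sketch, assuming a toy gas of 100 particles where each particle can independently sit in either half of the box (a real gas has ~10²³ particles, so the effect is vastly larger):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega):
    """S = k_B * ln(Omega)."""
    return K_B * math.log(omega)

N = 100  # toy particle count (illustrative assumption)

# Panel in place: every particle is confined to the left half,
# so there is only Omega = 1 arrangement over the two halves.
s_before = entropy(1)  # = 0

# Panel removed: each particle may be in either half, Omega = 2**N.
s_after = entropy(2 ** N)

print(s_after - s_before > 0)  # True: removing the constraint raises S
print(s_after - s_before)      # equals N * k_B * ln(2)
```

The entropy gain, N·k_B·ln 2, grows with every particle added, which is why macroscopic systems essentially never spontaneously "un-mix".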

Here is a general explanation:

https://www.youtube.com/watch?v=uQSoaiubuA0

Understanding entropy statistically (chemistry):

https://www.youtube.com/watch?v=YsP4Jv8NtWY&spfreload=10

A more mathematical, thermodynamic definition:

https://www.youtube.com/watch?v=PFcGiMLwjeY

Brian Greene on the “Arrow of Time”:

https://www.youtube.com/watch?v=LzvkO5ai9YQ

Very poetic:

https://www.youtube.com/watch?v=vLACGFhDOp0

Sean Carroll: Longer, more detailed cosmological presentation:

https://www.youtube.com/watch?v=rEr-t17m2Fo

* The natural logarithm: log_{e}x = ln(x), where e = 2.718281….

** The so-called three laws of thermodynamics (there are actually four; see https://en.wikipedia.org/wiki/Laws_of_thermodynamics):

**Zero:** If two systems are both in thermal equilibrium with a third system then they are in thermal
equilibrium with each other.

**First:** ΔU_system = Q − W (U is the internal energy, Q is the heat added to the system, W is the work done by the system)

**Second:** dS ≥ δQ / T (with equality, δQ = T dS, for reversible processes)

**Third:** The entropy of a perfect crystal of any pure substance approaches zero as the
temperature approaches absolute zero.
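The second law's dS = δQ/T bookkeeping can be illustrated with a simple numeric sketch: heat flowing from a hot reservoir to a cold one (the heat amount and temperatures below are made-up illustrative values). The cold side gains more entropy than the hot side loses, so the total always rises:

```python
Q = 1000.0      # joules of heat transferred (illustrative value)
T_hot = 400.0   # temperature of the hot reservoir, K
T_cold = 300.0  # temperature of the cold reservoir, K

# Each reservoir is assumed large enough that its temperature stays
# constant, so dS = deltaQ / T integrates to simply Q / T.
dS_hot = -Q / T_hot    # hot reservoir loses heat: its entropy decreases
dS_cold = Q / T_cold   # cold reservoir gains heat: its entropy increases
dS_total = dS_hot + dS_cold

print(dS_total)       # about +0.833 J/K
print(dS_total > 0)   # True: total entropy of the combined system rises
```

Because T_cold < T_hot, the gain Q/T_cold always exceeds the loss Q/T_hot; heat flowing the other way would make dS_total negative, which the second law forbids.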

**Entropy Links**

Wikipedia entropy entry