Entropy Explained, with Sheep (2016)
Entropy, a key thermodynamic concept, explains why macroscopic events progress toward higher-entropy states: the larger a system is, the less likely it is ever to be found in a low-entropy configuration.
Entropy is a fundamental concept in thermodynamics that explains the direction of physical processes. The article uses relatable examples, such as melting ice and dropping an egg, to illustrate that while atomic movements can occur in both directions, macroscopic events tend to follow a one-way path due to increasing entropy. Entropy is defined as the number of possible arrangements of particles or energy within a system, and it tends to increase because there are more ways for particles to be spread out than to be concentrated. The article employs a sheep analogy to demonstrate how energy distribution in solids leads to higher entropy states being more probable. When two solids at different temperatures are combined, energy tends to equalize, resulting in increased entropy. However, in small systems, fluctuations can occur, allowing for temporary decreases in entropy. As systems grow larger, the likelihood of finding them in low-entropy states diminishes, reinforcing the idea that entropy increases over time in macroscopic systems. This exploration of entropy not only addresses why it increases but also connects to broader questions about the universe's origins and its eventual fate.
- Entropy is a measure of the number of possible arrangements of particles or energy in a system.
- Macroscopic events tend to follow a one-way path due to the tendency of entropy to increase.
- Higher entropy states are more probable because there are more ways for particles to be spread out than concentrated.
- In small systems, entropy can fluctuate, allowing for temporary decreases.
- As systems increase in size, the likelihood of low-entropy states decreases significantly (see the counting sketch below).
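To make the "more arrangements" point concrete, here is a minimal counting sketch in the spirit of the article's sheep analogy. It assumes an Einstein-solid style count (energy quanta spread over atoms, tallied with the stars-and-bars formula C(q + N - 1, q)); the atom and quanta numbers are illustrative and not taken from the article.

```python
# Minimal counting sketch (not from the article): two small "solids" share
# energy quanta, and we count how many microstates correspond to each split.
# The multiplicity C(q + N - 1, q) is the standard Einstein-solid count
# assumed here.
from math import comb

def multiplicity(n_atoms, quanta):
    """Number of ways to distribute `quanta` indistinguishable energy units
    among `n_atoms` atoms."""
    return comb(quanta + n_atoms - 1, quanta)

N_A, N_B = 50, 50        # two toy solids, 50 atoms each
TOTAL_QUANTA = 100       # total energy shared between them

# Joint microstate count W_A * W_B for every possible split of the energy.
splits = [(q, multiplicity(N_A, q) * multiplicity(N_B, TOTAL_QUANTA - q))
          for q in range(TOTAL_QUANTA + 1)]
total = sum(w for _, w in splits)
best_q, best_w = max(splits, key=lambda s: s[1])

print(f"Most probable split: {best_q} vs {TOTAL_QUANTA - best_q} quanta "
      f"(probability {best_w / total:.3f})")
print(f"Probability that ALL energy sits in solid A: {splits[-1][1] / total:.2e}")
```

Even at this toy size the equal split already dominates and the chance of finding all the energy in one block is vanishingly small; scaling the counts up toward macroscopic numbers is what turns "unlikely" into "never observed".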
Related
Negative Temperature
Certain systems can have negative temperatures, hotter than any positive-temperature system. Predicted by Lars Onsager in 1949, they exhibit emergent ordering at high energies, with decreased entropy as energy increases.
What Is Entropy?
The book draft on Azimuth explores entropy as the amount of unknown information in a situation. It covers topics like information theory, Shannon entropy, Gibbs entropy, and Boltzmann distribution. The author emphasizes clarity and precision, making it a structured entry point into understanding entropy in physics.
What Is Entropy?
The article explores entropy in information theory and physics, introducing a forthcoming book on the topic. It covers various aspects like Shannon entropy, Gibbs entropy, and Boltzmann distribution, emphasizing mathematical precision and quantum mechanics.
Language Entropy
The text discusses how abstractness and entropy in language affect information density and communication efficiency, emphasizing the role of reader knowledge and word embeddings in understanding complexity.
What Is Entropy?
The paper "What is Entropy?" by John C. Baez introduces entropy, covering topics like Shannon and Gibbs entropy, and includes calculations for hydrogen gas, aimed at beginners in statistical mechanics.
Almost all of the posts below have Python code to illustrate the concepts (a minimal coin-toss sketch in the spirit of item 1 follows the list):
1. Entropy of a fair coin toss - https://bytepawn.com/what-is-the-entropy-of-a-fair-coin-toss...
2. Cross entropy, joint entropy, conditional entropy and relative entropy - https://bytepawn.com/cross-entropy-joint-entropy-conditional...
3. Entropy in Data Science - https://bytepawn.com/entropy-in-data-science.html
4. Entropy of a [monoatomic] ideal gas with coarse-graining - https://bytepawn.com/entropy-of-an-ideal-gas-with-coarse-gra...
5. All entropy related posts - https://bytepawn.com/tag/entropy.html
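As a hint of what the first post covers, here is a quick sketch of the Shannon entropy of a coin toss; this is my own illustration rather than code copied from the linked posts.

```python
# Shannon entropy of a coin toss, in bits (illustration only).
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over outcomes with nonzero probability."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin    -> 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin  -> ~0.469 bits
print(shannon_entropy([1.0, 0.0]))   # certain coin -> 0.0 bits
```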
I was extra confused when I discovered that a spread out cloud of hydrogen is lower entropy than the same cloud gravitationally bound together in a star. So entropy isn’t just about “spreading out,” either.
I found that Legos provide a really nice example to illustrate entropy, so I’ll share that here.
Consider a big pile of Legos, the detritus of many past projects. Intuitively, a pile of Legos is high entropy because it is disordered. But if we want to move beyond order/disorder, we need to relate it to microstates and macrostates.
In those terms, a pile of Legos is high entropy because you can randomly swap the positions of the pieces and it will still be the same macrostate: a big pile of Legos. Each piece is nevertheless in a very specific position, and if we could snapshot all those positions, that snapshot would be one specific microstate. So the macrostate of the pile corresponds to an astronomical number of possible microstates; there are many ways to rearrange the pieces that still look like a pile.
On the other hand, consider a freshly built Lego Death Star. This is clearly low entropy. In terms of microstates, that is because very few pieces can be swapped or moved without it no longer really being a Death Star. The entropy is low because very few microstates (specific Lego positions) correspond to the given macrostate (being a Death Star).
This specific case helped me grok Boltzmann entropy. To extend it, consider a box with a small ice crystal in it: this has many fewer possible microstates than the same box filled with steam. In the steam, molecules can be swapped and moved pretty much anywhere and the macrostate is the same. With the crystal, if you start randomly swapping molecules into different positions, it quickly stops being an ice crystal. So an ice crystal is low entropy.
Now, the definition of what counts as a macrostate is very important in this… but this comment is long enough and I still haven’t gotten to the gym…
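The Lego comparison above maps directly onto Boltzmann's formula S = k_B ln W. Below is a toy sketch of that mapping; the microstate counts are invented stand-ins for "a pile" versus "a finished build", not real Lego statistics.

```python
# Toy sketch of Boltzmann's S = k_B * ln(W) for the Lego analogy above.
# The microstate counts are invented for illustration.
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def boltzmann_entropy(microstates):
    """S = k_B * ln(W), where W is the number of microstates in a macrostate."""
    return K_B * math.log(microstates)

n_bricks = 500
w_pile = math.factorial(n_bricks)   # "a pile": essentially any arrangement still reads as a pile
w_built = 10                        # "the Death Star": only a handful of acceptable builds (toy number)

print(f"S(pile)  = {boltzmann_entropy(w_pile):.2e} J/K   (ln W ~ {math.log(w_pile):.0f})")
print(f"S(built) = {boltzmann_entropy(w_built):.2e} J/K   (ln W ~ {math.log(w_built):.1f})")
```

The absolute numbers are meaningless; the point is only that ln W for the pile dwarfs ln W for the finished model, which is exactly what "the pile is high entropy" asserts.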
(Not refuting entropy as the order of time at all, just noting a visual example is not great evidence.)
"The most misunderstood concept in physics", by Veritasium (YouTube, 2023) (https://youtu.be/DxL2HoqLbyA?si=5a_4lCnuv85lRb57)
No one would believe the scientists explaining that, although highly improbable, the uncracked egg does make scientific sense.
I really think education is mostly about providing higher-level intuitions - making correct thought habitual and thus easy.
Part of what's so attractive about this particular article is how it would mesh with related fields (chemistry, statistics, politics, evolution, astrophysics, climate science, etc.).
It isn’t though.
Entropy is a fancy word for potential distribution over negative potential. Negative potential is the “surface area” over which potential may distribute. The “number of possible arrangements” casually fits into this, yet misses some unintuitive possibilities, like the resistive variance or other characteristics not anticipated by whoever constructed the intellectual model.
Idealists insist entropy is a scalar state resolve of delta probability in their model. They are self-deceived. Entropy is the existential tendency for potential to distribute toward equilibrium.
As long as boffins can throw away results that do not correlate, they can insist it is anything they like.