What Is Entropy?
The article explores entropy in information theory and physics, introducing a forthcoming book on the topic. It covers various aspects like Shannon entropy, Gibbs entropy, and Boltzmann distribution, emphasizing mathematical precision and quantum mechanics.
The article discusses the concept of entropy and its relation to information theory and physics. The author presents a draft of a book on entropy, which defines entropy as the amount of information we do not know about a situation but could in principle learn. The book covers topics such as information theory, Shannon entropy, Gibbs entropy, the Boltzmann distribution, and more. The author avoids delving into the second law of thermodynamics and the role of entropy in biology or black hole physics; instead, the focus remains on classical systems and the mathematical aspects of entropy. The author emphasizes understanding entropy through a quantitative and precise lens, incorporating just enough quantum mechanics, such as Planck's constant, to explain the entropy of classical systems like hydrogen gas. The book is described as mathematically rigorous, catering to readers interested in the intricacies of entropy and its applications in physics.
Related
We must seek a widely-applicable Science of Systems
The text discusses the importance of a Science of Systems, focusing on Complex Systems. Emphasizing computer science's role, it explores potential applications in various fields and advocates for scientific progress through unified theories.
Structure and Interpretation of Classical Mechanics
Classical mechanics experiences a revival with a focus on complex behavior like nonlinear resonances and chaos. A book introduces general methods, mathematical notation, and computational algorithms to study system behavior effectively. It emphasizes understanding motion and nonlinear dynamics through exercises and projects.
Why Does Mathematics Describe Reality?
The YouTube video explores quantum mechanics, highlighting the role of mathematics in explaining natural phenomena. It covers imaginary numbers, Richard Feynman's work, how well math describes reality, the limits of science, the difficulty of measuring events on short timescales, and particle tunneling.
The Second Law of Thermodynamics
Basic chemistry principles help explain life's mishaps: understanding reactions, including simple ones like burning and rusting, sheds light on the apparent randomness of everyday events. Frank L. Lambert simplifies entropy and thermodynamics for students.
Schrödinger's cat among biology's pigeons: 75 years of What Is Life?
Physicist Erwin Schrödinger's 1944 book "What Is Life?" explored the connection between physics and biology, proposing a "code-script" for cellular organization and heredity. His interdisciplinary ideas influenced modern genomics and quantum mechanics.
- Several comments discuss different interpretations and definitions of entropy, including Shannon entropy, its subjective nature, and its mathematical formulation.
- Some users share personal anecdotes and teaching methods that helped them understand entropy better, such as thinking in terms of dice rolls or compression algorithms.
- There are references to additional resources and discussions, including links to related articles, videos, and academic papers.
- One comment highlights a user's difficulty in accessing the ebook mentioned in the article.
- Another comment reflects on the philosophical implications of entropy and its role in the universe.
"My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'"
See the answers to this MathOverflow SE question (https://mathoverflow.net/questions/403036/john-von-neumanns-...) for references on the discussion of whether Shannon's entropy is the same as the thermodynamic one.
The entropy of a variable X is the amount of information required to drive the observer's uncertainty about the value of X to zero. As a corollary, your uncertainty and mine about the value of the same variable X could be different. This is trivially true, as we could each have received different information about X. H(X) should really be H_{observer}(X), or even better, H_{observer, time}(X).
As clear as Shannon's work is in other respects, he glosses over this.
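A minimal sketch of that observer dependence, assuming a fair six-sided die and a second observer who has learned that the roll is even (the scenario is illustrative, not from Shannon's paper):

```python
import math

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution given as probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Observer A knows nothing: all six faces equally likely.
H_A = entropy_bits([1/6] * 6)   # log2(6) ≈ 2.585 bits

# Observer B has learned the roll is even: uniform over {2, 4, 6}.
H_B = entropy_bits([1/3] * 3)   # log2(3) ≈ 1.585 bits

print(H_A, H_B)  # same variable X, different H_observer(X)
```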
"If you had a really smart compression algorithm, how many bits would it take to accurately represent this file?"
That is, highly repetitive inputs compress well because they don't have much entropy per bit. Modern compression algorithms do well enough on most data to serve as a reasonable approximation of the true entropy.
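A rough numerical illustration of that idea, using Python's standard zlib as a stand-in for the "really smart" compressor (the inputs and sizes are arbitrary, and a general-purpose compressor only upper-bounds the true entropy):

```python
import os
import zlib

def compressed_bits_per_byte(data: bytes) -> float:
    """Approximate the entropy per byte by how small zlib can make the data."""
    return 8 * len(zlib.compress(data, 9)) / len(data)

low_entropy = b"abab" * 10_000       # highly repetitive input
high_entropy = os.urandom(40_000)    # essentially incompressible input

print(compressed_bits_per_byte(low_entropy))    # far below 8 bits per byte
print(compressed_bits_per_byte(high_entropy))   # close to (or slightly above) 8 bits per byte
```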
Think of the distribution as a histogram over some bins. The entropy then measures how likely it is that, if I throw many, many balls at random into those bins, the distribution of balls over bins ends up looking like that histogram. What you usually expect to see is a uniform distribution of balls over bins, so the entropy measures the probability of other, rarer events (in the language of probability theory, "large deviations" from that typical behavior).
More specifically, if P = (P1, ..., Pk) is some distribution, then the probability that throwing N balls (for N very large) gives a histogram looking like P is about 2^(-N * [log(k) - H(P)]), where H(P) is the entropy. When P is the uniform distribution, H(P) = log(k), the exponent is zero, and the estimate is 1, which says that by far the most likely histogram is the uniform one. That is the largest possible entropy, so any other histogram has probability about 2^(-c*N) of appearing for some c > 0: it is very unlikely, and exponentially more so the more balls we throw, and the entropy measures just how much. "Less uniform" distributions are less likely, so the entropy also measures a certain notion of uniformity. In large deviations theory this specific claim is called "Sanov's theorem" and the role the entropy plays is that of a "rate function."
The counting interpretation of entropy that some people are talking about is related, at least at a high level, because the probability in Sanov's theorem is the number of outcomes that "look like P" divided by the total number, so the numerator there is indeed counting the number of configurations (in this case of balls and bins) having a particular property (in this case looking like P).
There are lots of equivalent definitions and they have different virtues, generalizations, etc, but I find this one especially helpful for dispelling the air of mystery around entropy.
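To make the Sanov-type estimate above concrete, here is a small simulation under arbitrary illustrative parameters (2 bins, 20 balls, target histogram (15, 5)); the two printed numbers agree only to exponential order, since the estimate omits a polynomial-in-N prefactor:

```python
import math
import random
from collections import Counter

def entropy_bits(p):
    """Entropy in bits (log base 2, to match log(k) below)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

k, N = 2, 20                 # 2 bins, 20 balls (small enough that exact matches occur)
target = (15, 5)             # a histogram "looking like" P = (0.75, 0.25)
P = tuple(c / N for c in target)

trials = 200_000
hits = 0
for _ in range(trials):
    counts = Counter(random.randrange(k) for _ in range(N))
    if tuple(counts[i] for i in range(k)) == target:
        hits += 1

empirical = hits / trials
sanov_estimate = 2 ** (-N * (math.log2(k) - entropy_bits(P)))
print(empirical, sanov_estimate)   # ≈ 0.015 vs ≈ 0.073: same exponential scale
```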
Your This Week's Finds were a hugely enjoyable part of my undergraduate education and beyond.
Thank you again.
I can't say I'm overly fond of Baez's definition, but far be it from me to question someone of his stature.
For those interested I am currently reading "Entropy Demystified" by Arieh Ben-Naim which tackles this side of things from much the same direction.
If the file server is down... could anyone upload the ebook for download?
>I have largely avoided the second law of thermodynamics ... Thus, the aspects of entropy most beloved by physics popularizers will not be found here.
But personally, this bit is the most exciting to me.
>I have tried to say as little as possible about quantum mechanics, to keep the physics prerequisites low. However, Planck’s constant shows up in the formulas for the entropy of the three classical systems mentioned above. The reason for this is fascinating: Planck’s constant provides a unit of volume in position-momentum space, which is necessary to define the entropy of these systems. Thus, we need a tiny bit of quantum mechanics to get a good approximate formula for the entropy of hydrogen, even if we are trying our best to treat this gas classically.
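To illustrate how Planck's constant sets that unit of phase-space volume, here is the standard Sackur–Tetrode formula for a monatomic ideal gas, used as a stand-in sketch (the book's actual treatment of hydrogen may differ; atomic hydrogen is treated as a monatomic ideal gas purely for illustration):

```python
import math

# Physical constants in SI units
k_B = 1.380649e-23     # Boltzmann constant, J/K
h = 6.62607015e-34     # Planck constant, J*s
m = 1.6735575e-27      # mass of a hydrogen atom, kg

def sackur_tetrode_entropy_per_particle(T, n):
    """Entropy per particle, in units of k_B, of a monatomic ideal gas.
    T: temperature in kelvin; n: number density in particles per m^3.
    Planck's constant enters through the thermal de Broglie wavelength,
    which supplies the unit of volume in position-momentum space."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)   # thermal de Broglie wavelength, m
    return math.log(1 / (n * lam**3)) + 5 / 2

# Roughly room temperature and atmospheric number density
print(sackur_tetrode_entropy_per_particle(300.0, 2.5e25))   # about 13 k_B per atom
```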
Side note: All reversible energy transfers involve an increase in potential energy. All non-reversible energy transfers involve a decrease in potential energy.
Entropy is just a number you can associate with a probability distribution. If the distribution is discrete, so you have a set p_i, i = 1..n, which are each positive and sum to 1, then the definition is:
S = - sum_i p_i log( p_i )
Mathematically we say that entropy is a real-valued function on the space of probability distributions. (Elementary exercises: show that S >= 0 and it is maximized on the uniform distribution.)
That is it. I think there is little need for all the mystery.
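A direct numerical reading of that definition, checking both exercises on a few arbitrary example distributions:

```python
import math

def entropy(p):
    """S = -sum_i p_i log(p_i), with the convention 0 * log(0) = 0."""
    assert abs(sum(p) - 1) < 1e-9 and all(pi >= 0 for pi in p)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

print(entropy([1.0, 0.0, 0.0]))    # 0: a certain outcome, the minimum
print(entropy([0.7, 0.2, 0.1]))    # ≈ 0.802: in between, and >= 0
print(entropy([1/3, 1/3, 1/3]))    # ≈ 1.099 = log(3): the maximum for three outcomes
```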
This could be put as "the distribution of whatever may be over the surface area of where it may be."
This is erroneously taught in conventional information theory as "the number of configurations in a system" or the available information that has yet to be retrieved. Entropy includes the unforeseen and the out-of-scope.
Entropy is merely the predisposition to flow from high to low pressure (potential). That is it. Information is a form of potential.
Philosophically what are entropy's guarantees?
- That there will always be a super-scope, which may interfere in ways unanticipated;
- that everything decays; the only mystery is when and how.