July 22nd, 2024

What Is Entropy?

The article explores entropy in information theory and physics, introducing a forthcoming book on the topic. It covers various aspects like Shannon entropy, Gibbs entropy, and Boltzmann distribution, emphasizing mathematical precision and quantum mechanics.

The article presents a draft of a book on entropy by the author (John Baez), which defines entropy as the amount of information we don't know about a situation but could in principle learn. The book covers information theory, Shannon entropy, Gibbs entropy, the Boltzmann distribution, and related topics. The author deliberately avoids the second law of thermodynamics and the role of entropy in biology or black hole physics, focusing instead on classical systems and the mathematical side of entropy. He emphasizes treating entropy quantitatively and precisely, and brings in a small amount of quantum mechanics, notably Planck's constant, to explain the entropy of classical systems such as hydrogen gas. The book is described as mathematically rigorous, aimed at readers who want to dig into the details of entropy and its applications in physics.

Related

We must seek a widely-applicable Science of Systems

The text discusses the importance of a Science of Systems, focusing on Complex Systems. Emphasizing computer science's role, it explores potential applications in various fields and advocates for scientific progress through unified theories.

Structure and Interpretation of Classical Mechanics

Classical mechanics experiences a revival with a focus on complex behavior like nonlinear resonances and chaos. A book introduces general methods, mathematical notation, and computational algorithms to study system behavior effectively. It emphasizes understanding motion and nonlinear dynamics through exercises and projects.

Why Does Mathematics Describe Reality?

The YouTube video explores quantum mechanics, highlighting math's role in explaining natural phenomena. It touches on imaginary numbers, Richard Feynman, math's portrayal of reality, the limits of science, the difficulty of measuring events on very short timescales, and particle tunneling.

The Second Law of Thermodynamics

Principles of chemistry can explain life's mishaps: understanding reactions sheds light on the apparent randomness of events, with the emphasis on simple reactions like burning and rusting. Frank L. Lambert simplifies entropy and thermodynamics for students.

Schrödinger's cat among biology's pigeons: 75 years of What Is Life?

Physicist Erwin Schrödinger's 1944 book "What Is Life?" explored the connection between physics and biology, proposing a "code-script" for cellular organization and heredity. His interdisciplinary ideas influenced modern genomics and quantum mechanics.

AI: What people are saying
The comments on the article about entropy in information theory and physics cover a range of perspectives and insights:
  • Several comments discuss different interpretations and definitions of entropy, including Shannon entropy, its subjective nature, and its mathematical formulation.
  • Some users share personal anecdotes and teaching methods that helped them understand entropy better, such as thinking in terms of dice rolls or compression algorithms.
  • There are references to additional resources and discussions, including links to related articles, videos, and academic papers.
  • One comment highlights a user's difficulty in accessing the ebook mentioned in the article.
  • Another comment reflects on the philosophical implications of entropy and its role in the universe.
32 comments
By @Jun8 - 7 months
A well known anecdote reported by Shannon:

"My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'"

See the answers to this MathOverflow question (https://mathoverflow.net/questions/403036/john-von-neumanns-...) for references on the discussion of whether Shannon's entropy is the same as the one from thermodynamics.

By @glial - 7 months
I felt like I finally understood Shannon entropy when I realized that it's a subjective quantity -- a property of the observer, not the observed.

The entropy of a variable X is the amount of information required to drive the observer's uncertainty about the value of X to zero. As a corollary, your uncertainty and mine about the value of the same variable X could be different. This is trivially true, as we could each have received different information about X. H(X) should be H_{observer}(X), or even better, H_{observer, time}(X).

As clear as Shannon's work is in other respects, he glosses over this.
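
A small sketch of that observer-dependence (my own illustration, not from the comment or the book): two observers assign different distributions to the same die roll, so they compute different entropies for the same X.

    import math

    def H(p):
        """Shannon entropy in bits of a discrete distribution p."""
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    # X is a fair six-sided die roll.
    print(H([1/6] * 6))   # observer with no side information: ~2.585 bits
    # An observer who has learned that X is even assigns a different
    # distribution to the same variable, hence a different entropy.
    print(H([1/3] * 3))   # ~1.585 bits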

By @dekhn - 7 months
I really liked the approach my stat mech teacher used. In nearly all situations, entropy just ends up being the log of the number of ways a system can be arranged (https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula), although I found it easiest to think in terms of rolls of pairs of dice.
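
A minimal sketch of that counting view using the dice example (my own illustration, taking ordered pairs of dice as the microstates and the sum as the macrostate):

    import math
    from collections import Counter

    # Number of ways (microstates) each sum of two dice (macrostate) can occur.
    ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))

    # Boltzmann-style entropy S = log W for each macrostate.
    for total, W in sorted(ways.items()):
        print(f"sum={total:2d}  W={W}  S=log(W)={math.log(W):.3f}")
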
By @ooterness - 7 months
For information theory, I've always thought of entropy as follows:

"If you had a really smart compression algorithm, how many bits would it take to accurately represent this file?"

That is, highly repetitive inputs compress well because they don't have much entropy per bit. Modern compression algorithms are good enough on most data to be used as a reasonable approximation of the true entropy.
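
A rough sketch of that idea (my own illustration, with zlib standing in for the "really smart" compressor, so it will overshoot the true entropy somewhat): compare the compressed size of a biased random bit stream against its Shannon entropy.

    import math
    import random
    import zlib

    random.seed(0)
    p, n = 0.1, 100_000                      # biased coin: P(1) = 0.1
    bits = [1 if random.random() < p else 0 for _ in range(n)]

    # Pack the bits into bytes so zlib has a realistic byte stream to compress.
    data = bytes(sum(b << i for i, b in enumerate(bits[j:j + 8]))
                 for j in range(0, n, 8))

    shannon = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    estimate = 8 * len(zlib.compress(data, 9)) / n

    print(f"Shannon entropy: {shannon:.3f} bits/symbol")
    print(f"zlib estimate:   {estimate:.3f} bits/symbol")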

By @tasteslikenoise - 7 months
I've always favored this down-to-earth characterization of the entropy of a discrete probability distribution. (I'm a big fan of John Baez's writing, but I was surprised glancing through the PDF to find that he doesn't seem to mention this viewpoint.)

Think of the distribution as a histogram over some bins. Then, the entropy is a measurement of, if I throw many many balls at random into those bins, the probability that the distribution of balls over bins ends up looking like that histogram. What you usually expect to see is a uniform distribution of balls over bins, so the entropy measures the probability of other rare events (in the language of probability theory, "large deviations" from that typical behavior).

More specifically, if P = (P1, ..., Pk) is some distribution, then the probability that throwing N balls (for N very large) gives a histogram looking like P is about 2^(-N * [log(k) - H(P)]), where H(P) is the entropy. When P is the uniform distribution, then H(P) = log(k), the exponent is zero, and the estimate is 1, which says that by far the most likely histogram is the uniform one. That is the largest possible entropy, so any other histogram has probability 2^(-c*N) of appearing for some c > 0, i.e., is very unlikely and exponentially more so the more balls we throw, but the entropy measures just how much. "Less uniform" distributions are less likely, so the entropy also measures a certain notion of uniformity. In large deviations theory this specific claim is called "Sanov's theorem" and the role the entropy plays is that of a "rate function."

The counting interpretation of entropy that some people are talking about is related, at least at a high level, because the probability in Sanov's theorem is the number of outcomes that "look like P" divided by the total number, so the numerator there is indeed counting the number of configurations (in this case of balls and bins) having a particular property (in this case looking like P).

There are lots of equivalent definitions and they have different virtues, generalizations, etc, but I find this one especially helpful for dispelling the air of mystery around entropy.
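
For what it's worth, that estimate can be checked numerically (a sketch with my own choice of P, N and k; the exact probability of landing exactly on the histogram N*P is a multinomial coefficient times k^(-N), and it agrees with the 2^(-N * [log(k) - H(P)]) estimate to leading order in N):

    import math

    def entropy_bits(p):
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    def log2_multinomial(counts):
        """log2 of the multinomial coefficient N! / (c1! * ... * ck!)."""
        n = sum(counts)
        lg = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
        return lg / math.log(2)

    k, N = 4, 1000
    P = [0.4, 0.3, 0.2, 0.1]
    counts = [round(N * pi) for pi in P]                  # the histogram N*P

    exact = log2_multinomial(counts) - N * math.log2(k)   # log2 of the true probability
    sanov = -N * (math.log2(k) - entropy_bits(P))         # log2 of the estimate

    print(f"exact log2(prob) = {exact:.1f}")
    print(f"Sanov estimate   = {sanov:.1f}")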

By @Tomte - 7 months
By @eointierney - 7 months
Ah JCB, how I love your writing, you are always so very generous.

Your This Week's Finds were a hugely enjoyable part of my undergraduate education and beyond.

Thank you again.

By @yellowcake0 - 7 months
Information entropy is literally the strict lower bound on how efficiently information can be communicated (the expected number of transmitted bits) if the probability distribution which generates this information is known; that's it. Even in contexts such as calculating the information entropy of a bit string, or of the English language, you're just taking the data, constructing an empirical probability distribution from it using the relative frequencies of zeros and ones or letters or n-grams or whatever, and then calculating the entropy of that distribution.
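
A minimal sketch of that empirical construction (my own illustration): build the relative-frequency distribution of a string's letters or bigrams and compute its entropy.

    import math
    from collections import Counter

    def empirical_entropy(symbols):
        """Shannon entropy in bits/symbol of the empirical distribution."""
        counts = Counter(symbols)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    text = "the quick brown fox jumps over the lazy dog"
    print(empirical_entropy(text))                  # per-letter entropy
    print(empirical_entropy(zip(text, text[1:])))   # per-bigram entropy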

I can't say I'm overly fond of Baez's definition, but far be it from me to question someone of his stature.

By @ccosm - 7 months
"I have largely avoided the second law of thermodynamics, which says that entropy always increases. While fascinating, this is so problematic that a good explanation would require another book!"

For those interested I am currently reading "Entropy Demystified" by Arieh Ben-Naim which tackles this side of things from much the same direction.

By @utkarsh858 - 7 months
I sometimes ponder where new entropy/randomness comes from. If we take the earliest state of the universe as an infinitely dense point which then expanded, there must have been some randomness, or say variety, which led it to expand in a non-uniform way and produced the dominance of matter over antimatter, the creation of galaxies, clusters, etc. If we take an isolated system in which certain static particles are present, could it be the case that a small subset of the particles gains motion and thus introduces entropy? Can entropy be induced automatically, at least at the quantum level? If anyone can help explain this, it would be very helpful and could help explain the origin of the universe better.
By @niemandhier - 7 months
My goto source for understanding entropy: http://philsci-archive.pitt.edu/8592/1/EntropyPaperFinal.pdf
By @jsomedon - 7 months
Am I the only one who can't download the PDF, or is the file server down? I can see the blog page, but when I try downloading the ebook it just doesn't work.

If the file server is down, could anyone upload the ebook for download?

By @bdjsiqoocwk - 7 months
Hmmm, I've noticed that the list of things that contribute to entropy omits particles which under "normal circumstances" on Earth exist in bound states; for example, it doesn't mention W bosons or gluons. But in some parts of the universe they're not bound and are in a different state of matter, e.g. quark-gluon plasma. I wonder how, or whether, this was taken into account.
By @suoduandao3 - 7 months
I like the formulation of 'the amount of information we don't know about a system that we could in theory learn'. I'm surprised there's no mention of how the Copenhagen interpretation interacts with this definition; under a lot of QM interpretations, 'unavailable information' is different from available information.
By @vinnyvichy - 7 months
The book might disappoint some..

>I have largely avoided the second law of thermodynamics ... Thus, the aspects of entropy most beloved by physics popularizers will not be found here.

But personally, this bit is the most exciting to me.

>I have tried to say as little as possible about quantum mechanics, to keep the physics prerequisites low. However, Planck’s constant shows up in the formulas for the entropy of the three classical systems mentioned above. The reason for this is fascinating: Planck’s constant provides a unit of volume in position-momentum space, which is necessary to define the entropy of these systems. Thus, we need a tiny bit of quantum mechanics to get a good approximate formula for the entropy of hydrogen, even if we are trying our best to treat this gas classically.

By @GoblinSlayer - 7 months
There is a fundamental nature to entropy, but as usual it's not very enlightening for our poor monkey brains, so to explain it you need to enumerate all of its high-level behavior; but that high-level behavior is accidental and can't be summarized in a concise form.
By @drojas - 7 months
My definition: Entropy is a measure of the accumulation of non-reversible energy transfers.

Side note: All reversible energy transfers involve an increase in potential energy. All non-reversible energy transfers involve a decrease in potential energy.

By @tromp - 7 months
Closely related recent discussion on The Second Law of Thermodynamics (2011) (franklambert.net):

https://news.ycombinator.com/item?id=40972589

By @tsoukase - 7 months
After years of thought I dare to say the second law of thermodynamics is a tautology: "entropy increases" means every system tends toward its most probable state, i.e. the most probable is the most probable.
By @prof-dr-ir - 7 months
If I were to write a book with that title, I would get to the point a bit faster, probably as follows.

Entropy is just a number you can associate with a probability distribution. If the distribution is discrete, so you have a set p_i, i = 1..n, which are each positive and sum to 1, then the definition is:

S = - sum_i p_i log( p_i )

Mathematically we say that entropy is a real-valued function on the space of probability distributions. (Elementary exercises: show that S >= 0 and it is maximized on the uniform distribution.)

That is it. I think there is little need for all the mystery.
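
A few lines of code make the definition and the exercises concrete (a numerical spot-check of my own, not a proof):

    import math
    import random

    def S(p):
        """Entropy of a discrete distribution p (positive entries summing to 1)."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    n = 5
    uniform = [1 / n] * n
    print(S(uniform), math.log(n))   # the uniform distribution attains log(n)

    # Random distributions never beat the uniform one, and S stays non-negative.
    for _ in range(1000):
        w = [random.random() for _ in range(n)]
        p = [wi / sum(w) for wi in w]
        assert 0 <= S(p) <= S(uniform) + 1e-12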

By @ctafur - 7 months
The way I understand it is with an analogy to probability. To me, events are to microscopic states as a random variable is to entropy.
By @dmn322 - 7 months
This seems like a great resource for referencing the various definitions. I've tried my hand at developing an intuitive understanding: https://spacechimplives.substack.com/p/observers-and-entropy. TLDR: it's an artifact of the model we're using. In the thermodynamic definition, the energy accounted for in the terms of our model is information; the energy that's not is entropic energy. That's why it's not "usable" energy and why the process isn't reversible.
By @zoenolan - 7 months
Hawking on the subject

https://youtu.be/wgltMtf1JhY

By @foobarbecue - 7 months
How do you get to the actual book / tweets? The link just takes me back to the foreword...
By @ThrowawayTestr - 7 months
MC Hawking already explained this

https://youtu.be/wgltMtf1JhY

By @arjunlol - 7 months
ΔS = ΔQ/T
By @illuminant - 7 months
Entropy is the distribution of potential over negative potential.

This could be put as "the distribution of whatever may be over the surface area of where it may be."

This is erroneously taught in conventional information theory as "the number of configurations in a system," or the available information that has yet to be retrieved. Entropy includes the unforeseen and the out-of-scope.

Entropy is merely the predisposition to flow from high to low pressure (potential). That is it. Information is a form of potential.

Philosophically what are entropy's guarantees?

- That there will always be a super-scope, which may interfere in ways unanticipated;

- everything decays; the only mystery is when and how.