Language Entropy
The text discusses how abstractness and entropy in language affect information density and communication efficiency, emphasizing the role of reader knowledge and word embeddings in understanding complexity.
The text explores the relationship between language, abstraction, and complexity in written communication, particularly in research papers. It introduces key concepts such as abstractness, entropy, and understanding. Abstractness is defined as the distance of a word's definition from tangible objects, while entropy measures randomness and uncertainty. The author posits that more abstract words lead to lower entropy, allowing for denser information in fewer words. This efficiency is likened to a form of compression, where the reader's prior knowledge fills in gaps left by the writer. The complexity of a text is influenced by its total abstraction and entropy, with a higher number of abstract words indicating greater complexity. The discussion also touches on word embeddings and their role in quantifying these concepts through mathematical functions. The author emphasizes that complexity is relative to the reader's knowledge base, which has its own entropy. Ultimately, the piece reflects on the interplay between the formalism of language and the realism it seeks to convey.
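The compression analogy can be made literal with a small sketch, in which a preset zlib dictionary stands in for the reader's prior knowledge (the strings here are placeholders, and this illustrates the analogy rather than the author's model):

```python
import zlib

text = b"entropy measures uncertainty; abstraction compresses meaning"
shared = b"entropy abstraction uncertainty meaning compresses measures"

# Without shared context, the compressor must spell everything out.
plain = zlib.compress(text)

# A preset dictionary stands in for the reader's prior knowledge:
# the compressor references it instead of repeating the words.
c = zlib.compressobj(zdict=shared)
primed = c.compress(text) + c.flush()

print(len(text), len(plain), len(primed))  # primed is typically smallest
```

The more of the message the shared dictionary already contains, the fewer bytes the writer has to send.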
- The relationship between abstractness and entropy affects the density of information in text.
- More abstract words result in lower entropy, leading to more efficient communication.
- Complexity in text is influenced by the reader's prior knowledge and familiarity with concepts.
- Word embeddings can be used to quantify abstractness and entropy mathematically, as sketched in the code after this list.
- Understanding of language is a cumulative process, enhancing comprehension over time.
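To make those points concrete, here is a minimal sketch: the entropy function is Shannon's standard unigram estimate, while the abstractness score (mean embedding distance from concrete "anchor" words) is a hypothetical illustration of the embedding approach the article gestures at, not the author's actual formula.

```python
import math
from collections import Counter

import numpy as np

def shannon_entropy(tokens):
    """Shannon entropy in bits per token, estimated from the
    empirical unigram distribution of the sequence."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def abstractness(word_vec, concrete_anchors):
    """Hypothetical abstractness score: mean cosine distance from a
    word's embedding to embeddings of concrete anchor words
    (e.g. 'rock', 'chair'). Larger = further from tangible objects."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return float(np.mean([1.0 - cos(word_vec, a) for a in concrete_anchors]))

# Placeholder data; real embeddings would come from a trained model
# such as word2vec or GloVe.
tokens = "the cat sat on the mat while the cat slept".split()
print(f"entropy: {shannon_entropy(tokens):.3f} bits/word")

rng = np.random.default_rng(0)
word_vec = rng.normal(size=50)
anchors = [rng.normal(size=50) for _ in range(3)]
print(f"abstractness: {abstractness(word_vec, anchors):.3f}")
```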
Related
Extraverted People Talk More Abstractly, Introverts Are More Concrete
Introverts use concrete language, while extraverts prefer abstract terms in social situations. This linguistic contrast mirrors their cautious or casual nature, affecting conversation depth and trustworthiness perception.
What Is Entropy?
The book draft on Azimuth explores entropy as the amount of unknown information in a situation. It covers topics like information theory, Shannon entropy, Gibbs entropy, and Boltzmann distribution. The author emphasizes clarity and precision, making it a structured entry point into understanding entropy in physics.
What Is Entropy?
The article explores entropy in information theory and physics, introducing a forthcoming book on the topic. It covers various aspects like Shannon entropy, Gibbs entropy, and Boltzmann distribution, emphasizing mathematical precision and quantum mechanics.
Counting Complexity (2017)
Scroll Notation Complexity (SNC) measures entity complexity through tree structures with a symbolic language. It compares Relative and Total Complexity, suggesting SNC for universal complexity comparisons and visualizing complex systems.
Complex systems emerge from simple rules
The article explores emergent systems, illustrating how complexity arises from simple rules, using examples like the Game of Life and large language models, emphasizing interconnectedness in biology, chemistry, and physics.
Prediction and Entropy of Printed English, Shannon 1951 https://www.princeton.edu/~wbialek/rome/refs/shannon_51.pdf
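In that paper, Shannon's F_N statistic measures how uncertain the next character is given the preceding N-1 characters; as N grows, the estimated entropy of English falls, which is the predictability the paper quantifies. A minimal sketch of that style of estimate follows (the corpus string is a placeholder; Shannon used large text samples and human prediction experiments):

```python
import math
from collections import Counter

def conditional_entropy(text, n):
    """Entropy (bits) of the next character given the preceding n
    characters, estimated from empirical (n+1)-gram counts -- the
    statistic Shannon calls F_{n+1}."""
    nplus1 = Counter(text[i:i + n + 1] for i in range(len(text) - n))
    prefix = Counter()
    for gram, count in nplus1.items():
        prefix[gram[:n]] += count
    total = sum(nplus1.values())
    h = 0.0
    for gram, count in nplus1.items():
        h -= (count / total) * math.log2(count / prefix[gram[:n]])
    return h

# Placeholder corpus; a real estimate needs far more text.
corpus = "the quick brown fox jumps over the lazy dog " * 20
for n in range(4):
    print(f"F_{n + 1} = {conditional_entropy(corpus, n):.3f} bits/char")
```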
The author conflates "abstract words" with obscurity, implying reduced understanding, but what actually matters is knowability: even if a word is abstract, the concept behind it can be discerned (though this may require a dictionary and thesaurus). It is when we move away from diverse, abstract, or obscure words, whose meanings are distinct from those of their closely related synonyms, that we make the conveyed and intended meaning of the sentence much harder to know, and no external reference can then enlighten us.