The Information: A History, a Theory, a Flood – James Gleick
Summary: The Information outlines humanity’s changing relationship with and understanding of information—how messages were encoded and transmitted at various times throughout history, how humanity began to generate and transmit more and more of these messages over time, and how people developed new ways of thinking about and working with the information around them.
Thoughts: It took me a while to get into this book (through Gleick’s account of the growth and proliferation of information in the centuries leading up to the present day), though on the whole I found it quite compelling. It’s the book I wish I had known about when I initially got an inkling, from Steven Pinker’s Enlightenment Now and especially Sean Carroll’s The Big Picture, of the meaning and importance of entropy.
After Gleick outlined how Claude Shannon’s choice to ignore the meanings of messages was an enormous step forward in conceiving of and reasoning about information, I was struck by his point, near the end of the book, that in our modern world, with its enormous glut of information, the problem of ensuring that the information we consume bears a meaningful relationship to reality has grown ever more important.
Gleick’s writing is clear and engaging, and his metaphors and examples insightful and thoughtfully chosen. Lots of exciting and thought-provoking ideas, especially in the book’s second half. Excellent.
(The notes below are not a summary of the book, but rather raw notes: whatever I thought, at the time, might be worth remembering.)
Gleick, James. 2011. The Information: A History, a Theory, a Flood. Pantheon.
1. Drums that Talk
2. The Persistence of the Word
- 30: “Writing, as a technology, requires premeditation and special art. Language is not a technology, no matter how well developed and efficacious. It is not best seen as something separate from the mind; it is what the mind does.” - j: I feel like I’ve gotten this wrong in the past - whether language is a technology vs. writing is a technology
3. Two Wordbooks
- 66: Ambrose Bierce: “dictionary, a malevolent literary device for cramping the growth of a language and making it hard and inelastic.”
4. To Throw the Powers of Thought into Wheel-Work
5. A Nervous System for the Earth
6. New Wires, New Logic
7. Information Theory
8. The Informational Turn
- 244f: of journalists, reporting on scientists talking, ca. 1950, about their computing machines “thinking”: Jean-Pierre Dupuy: “it was, at bottom, a perfectly ordinary situation, in which scientists blamed nonscientists for taking them at their word. Having planted the idea in the public mind that thinking machines were just around the corner, the cyberneticians hastened to dissociate themselves from anyone gullible enough to believe such a thing.”
- 263: mentioned indirectly: Marshall McLuhan’s Understanding Media: The Extensions of Man - “What Marshall McLuhan later called the ‘medium’ was for [Claude] Shannon the channel, and the channel was subject to rigorous mathematical treatment.”
- 265: “In 1910 the Spanish mathematician and tinker Leonardo Torres y Quevedo built a real chess machine, entirely mechanical, called El Ajedrecista, that could play a simple three-piece endgame, king and rook against king.” !!
9. Entropy and its Demons
- 271: Lord Kelvin: “although mechanical energy is indestructible… there is a universal tendency to its dissipation, which produces gradual augmentation and diffusion of heat, cessation of motion, and exhaustion of potential energy through the material universe.”
- 274: Some processes “run in one direction only,” and for this, “probability is the reason. What is remarkable—physicists took a long time to accept it—is that every irreversible process must be explained the same way.” James Clerk Maxwell: “the 2nd law of Thermodynamics has the same degree of truth as the statement that if you throw a tumblerful of water into the sea, you cannot get the same tumblerful of water out again.”
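- j: to make “probability is the reason” concrete: for N independent molecules, the chance that all of them happen to sit in (say) the left half of a box at once is 2^-N. A back-of-the-envelope in Python (my own illustration, not from the book):

```python
# Probability that all N molecules of a gas spontaneously gather in one half
# of a box, assuming each sits in either half independently with probability 1/2.
import math

for n in (10, 100, 6.022e23):  # ten molecules, a hundred, a mole
    log10_p = -n * math.log10(2)  # log10 of 2^-N
    print(f"N = {n:g}: probability = 2^-N ~ 10^{log10_p:.3g}")
# For a mole of gas the exponent is about -1.8e23: "irreversible" in any
# practical sense, exactly Maxwell's tumbler of water.
```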
- 279: Leó Szilárd: “The very existence of the nervous system… is dependent on continual dissipation of energy.”
- Carl Eckart rephrased this as “thinking generates entropy”
- 279-280: “[Maxwell’s] demon performs a conversion between information and energy, one particle at a time. Szilárd… found that, if he accounted exactly for each measurement and memory, then the conversion could be computed exactly…. Each unit of information brings a corresponding increase in entropy…. Every time the demon makes a choice between one particle and another, it costs one bit of information. The payback comes at the end of the cycle, when it has to clear its memory.”
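- j: the demon’s bookkeeping has a price tag. If each cleared bit costs at least kT·ln 2 of dissipated energy (Landauer’s principle, which Gleick reaches in chapter 13), the arithmetic is quick; a minimal sketch (standard constants, my own framing):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) as heat.
energy_per_bit = k_B * T * math.log(2)
print(f"minimum heat to erase one bit at {T:.0f} K: {energy_per_bit:.3e} J")
# ~2.9e-21 J per bit: tiny, but nonzero, and it is the demon's payback.
```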
- 282-283: Erwin Schrödinger: “When is a piece of matter said to be alive? … When it goes on ‘doing something,’ moving, exchanging material with its environment, and so forth, for a much longer period than we would expect an inanimate piece of matter to ‘keep going’ under similar circumstances.”
- organisms feed on negative entropy. Schrödinger: “the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive.”
- Gleick: “In other words, the organism sucks orderliness from its surroundings.”
- 286: “Living creatures confound the usual computation of entropy. More generally, so does information. ‘Take an issue of The New York Times, the book on cybernetics, and an equal weight of scrap paper,’ suggested [Léon] Brillouin. ‘Do they have the same entropy?’ If you are feeding the furnace, yes. But not if you are a reader.”
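- j: Brillouin’s question can be put in Shannon’s terms: per symbol, English text carries far fewer bits than noise, even though the furnace can’t tell the difference. A minimal sketch (the sample text is my own stand-in for the Times):

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

english = b"the quick brown fox jumps over the lazy dog " * 100
noise = os.urandom(len(english))

print(f"English-ish text: {shannon_entropy(english):.2f} bits/byte")
print(f"random bytes:     {shannon_entropy(noise):.2f} bits/byte (near the 8-bit max)")
# An equal weight of "scrap paper" (noise) maximizes entropy per byte; to a
# reader, that is exactly what makes it worthless.
```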
10. Life’s Own Code
- 287 (epigraph): Richard Dawkins: “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life.’ It is information, words, instructions. If you want a metaphor, don’t think of fires and sparks and breath. Think, instead, of a billion discrete, digital characters carved in tablets of crystal.”
- 306: “Genes engage in a grand process of mutual co-evolution—competing with one another, and with their alternative alleles, in nature’s vast gene pool, but [not] competing on their own. Their success or failure comes through interaction. ‘Selection favours those genes which succeed in the presence of other genes,’ says Dawkins, ‘which in turn succeed in the presence of them.’”
- 309: “where… is any particular gene—say, the gene for long legs in humans? This is a little like asking where is Beethoven’s Piano Sonata in E minor. Is it in the original handwritten score? The printed sheet music? In any one performance—or perhaps the sum of all performances, historical and potential, real and imagined? ¶ The quavers and crotchets inked on paper are not the music. Music is not a series of pressure waves sounding through the air; nor grooves etched in vinyl or pits burned in CDs; nor even the neuronal symphonies stirred up in the brain of the listener. The music is the information. Likewise, the base pairs of DNA are not genes. They include genes. Genes themselves are made of bits.”
11. Into the Meme Pool
- 314: “Selfishness is defined by the geneticist as the tendency to increase one’s chance of survival relative to [one’s] competitors.”
12. The Sense of Randomness
- 332: Gregory Chaitin developed an “algorithmic definition of randomness…: the size of the algorithm [i.e. the number of bits required to express this algorithm as a Turing machine] measures how much information a given string contains.” (See the compression sketch after the list of lessons below.)
- 343: “Algorithmic information theory applies the same limitations [as quantum physics and chaos did for the sciences] to the universe of whole numbers—an ideal, mental universe.” “Among its lessons were these:”
- “Most numbers are random. Yet very few of them can be proved random.”
- “A chaotic stream of information may yet hide a simple algorithm. Working backward from the chaos to the algorithm may be impossible.”
- “Kolmogorov-Chaitin… complexity is to mathematics what entropy is to thermodynamics: the antidote to perfection. Just as we can have no perpetual-motion machines, there can be no complete formal axiomatic systems.”
- “Some mathematical facts are true for no reason. They are accidental, lacking a cause or deeper meaning.”
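- j: Kolmogorov-Chaitin complexity is uncomputable, but a compressor gives a crude upper bound on it, which is enough to see the asymmetry in the lessons above. A rough sketch, with zlib standing in for the (unattainable) shortest program:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    # Upper bound only: the true Kolmogorov-Chaitin complexity is uncomputable,
    # so any real compressor can only bound it from above.
    return len(zlib.compress(data, 9))

patterned = b"01" * 5000        # a long stream generated by a tiny rule
random_ish = os.urandom(10000)  # incompressible with overwhelming probability

print(f"patterned:  {len(patterned)} bytes -> {compressed_size(patterned)} compressed")
print(f"random-ish: {len(random_ish)} bytes -> {compressed_size(random_ish)} compressed")
# The patterned stream collapses to a few dozen bytes; the random one doesn't.
# But no such test can *prove* a given string random, which is Chaitin's point.
```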
- 345: mentioned: Noam Chomsky’s “offbeat and original paper ‘Three Models for the Description of Language,’ applying… information-theoretic ideas to the formalization of structure in language.”
- 354: Charles H. Bennett developed “a measure of value, which he called ‘logical depth’…, meant to capture the usefulness of a message, whatever usefulness might mean in any particular domain…. The value of a message lies in ‘what might be called its buried redundancy—parts predictable only with difficulty, things the receiver could in principle have figured out without being told, but only at considerable cost in money, time, or computation’…. ¶ Mathematicians and logicians had developed a tendency to think of information processing as free…. But it embodies work after all, and Bennett suggests that we recognize this work, reckon its expense in understanding complexity.”
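- j: a toy way I picture “buried redundancy” (my own, not Bennett’s): iterated hashing produces a string whose description is tiny but whose reconstruction costs real computation; the work, not the bits, is where the value sits.

```python
import hashlib

def deep_string(seed: bytes, rounds: int = 1_000_000) -> bytes:
    """A (toy) logically deep object: its shortest description is roughly this
    short function plus the seed, but reproducing it costs `rounds` hashes."""
    h = seed
    for _ in range(rounds):
        h = hashlib.sha256(h).digest()
    return h

# The 32-byte result looks random, yet the receiver could "in principle have
# figured it out without being told" -- at considerable cost in computation.
print(deep_string(b"seed").hex())
```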
13. Information is Physical
- 360: “What is the physical cost of logical work? ‘Computers,’ [Charles Bennett] wrote provocatively, ‘may be thought of as engines for transforming free energy into waste heat and mathematical work.’ Entropy surfaced again. A tape [for a Turing machine] full of zeroes, or a tape encoding the works of Shakespeare, or a tape rehearsing the digits of π, has ‘fuel value.’ A random tape has none.”
- 361: Rolf Landauer discovered/reasoned that during a computation, “it seemed that most logical operations have no entropy cost at all. When a bit flips from zero to one, or vice versa, the information is preserved. The process is reversible. Entropy is unchanged; no heat needs to be dissipated. Only an irreversible operation, he argued, increases entropy.”
- 362: Charles Bennett subsequently “confirmed that a great deal of computation can be done with no energy cost at all. In every case, Bennett found, heat dissipation only occurs when information is erased. Erasure is the irreversible logical operation. When the head on a Turing machine erases one square of the tape, or when an electronic computer clears a capacitor, a bit is lost, and then heat must be dissipated. In Szilárd’s thought experiment, the demon does not incur an entropy cost when it observes or chooses a molecule. The payback comes at the moment of clearing the record, when the demon erases one observation to make room for the next. ¶ Forgetting takes work.”
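- j: Landauer and Bennett in miniature: an invertible gate like CNOT preserves its inputs, so nothing is erased and (in principle) no heat is due; AND collapses two input bits into one output bit, so a bit is lost. A toy illustration (mine, not from the book):

```python
def cnot(a: int, b: int) -> tuple[int, int]:
    """Reversible: keeps a, replaces b with a XOR b; its own inverse."""
    return a, a ^ b

def and_gate(a: int, b: int) -> int:
    """Irreversible: two input bits collapse into one output bit."""
    return a & b

# Reversible: applying CNOT twice recovers the original inputs exactly.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)

# Irreversible: an AND output of 0 has three possible preimages, so the
# inputs cannot be recovered -- information was erased, and Landauer's
# principle says that erasure is what must be paid for in heat.
preimages: dict[int, list[tuple[int, int]]] = {}
for a in (0, 1):
    for b in (0, 1):
        preimages.setdefault(and_gate(a, b), []).append((a, b))
print(preimages)  # {0: [(0, 0), (0, 1), (1, 0)], 1: [(1, 1)]}
```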
14. After the Flood
15. New News Every Day
Epilogue
- 419: “language maps a boundless world of objects and sensations and combinations onto a finite space.”
Posted: Jun 25, 2023. Last updated: Aug 31, 2023.