![rw-book-cover](https://images-na.ssl-images-amazon.com/images/I/514ToTCeWNL._SL200_.jpg)

## Metadata

- Author: [[James Gleick]]
- Full Title: The Information
- Category: #books

## Highlights

- “Theories permit consciousness to ‘jump over its own shadow,’ to leave behind the given, to represent the transcendent, yet, as is self-evident, only in symbols.” ([Location 98](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=98))
- For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague—force, mass, motion, and even time—and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature. It was the same with information. A rite of purification became necessary. And then, when it was made simple, distilled, counted in bits, information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. ([Location 116](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=116))
- “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” ([Location 137](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=137))
- Economics is recognizing itself as an information science, now that money itself is completing a developmental arc from matter to bits, stored in computer memory and magnetic strips, world finance coursing through the global nervous system. Even when money seemed to be material treasure, heavy in pockets and ships’ holds and bank vaults, it always was information. Coins and notes, shekels and cowries were all just short-lived technologies for tokenizing information about who owns what. ([Location 146](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=146))
- Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” ([Location 155](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=155))
- “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.
([Location 163](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=163))
- For the Yaunde, the elephant is always “the great awkward one.” The resemblance to Homeric formulas—not merely Zeus, but Zeus the cloud-gatherer; not just the sea, but the wine-dark sea—is no accident. In an oral culture, inspiration has to serve clarity and memory first. The Muses are the daughters of Mnemosyne. Neither Kele nor English yet had words to say, allocate extra bits for disambiguation and error correction. Yet this is what the drum language did. Redundancy—inefficient by definition—serves as the antidote to confusion. ([Location 421](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=421))
- Writing, as a technology, requires premeditation and special art. Language is not a technology, no matter how well developed and efficacious. It is not best seen as something separate from the mind; it is what the mind does. “Language in fact bears the same relationship to the concept of mind that legislation bears to the concept of parliament,” says Jonathan Miller: “it is a competence forever bodying itself in a series of concrete performances.” Much the same might be said of writing—it is concrete performance—but when the word is instantiated in paper or stone, it takes on a separate existence as artifice. It is a product of tools, and it is a tool. And like many technologies that followed, it thereby inspired immediate detractors. One unlikely Luddite was also one of the first long-term beneficiaries. Plato (channeling the nonwriter Socrates) warned that this technology meant impoverishment: For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust in writing, produced by external characters which are no part of themselves, will discourage the use of their own memory within them.
You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom. ([Location 504](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=504))
- “The written symbol extends infinitely, as regards time and space, the range within which one mind can communicate with another; it gives the writer’s mind a life limited by the duration of ink, paper, and readers, as against that of his flesh and blood body.” But the new channel does more than extend the previous channel. It enables reuse and “re-collection”—new modes. It permits whole new architectures of information. Among them are history, law, business, mathematics, and logic. Apart from their content, these categories represent new techniques. The power lies not just in the knowledge, preserved and passed forward, valuable as it is, but in the methodology: encoded visual indications, the act of transference, substituting signs for things. And then, later, signs for signs. ([Location 534](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=534))
- In all the languages of earth there is only one word for alphabet (alfabet, alfabeto, …). The alphabet was invented only once. All known alphabets, used today or found buried on tablets and stone, descend from the same original ancestor, which arose near the eastern littoral of the Mediterranean Sea, sometime not much before 1500 BCE, in a region that became a politically unstable crossroads of culture, covering Palestine, Phoenicia, and Assyria. To the east lay the great civilization of Mesopotamia, with its cuneiform script already a millennium old; down the shoreline to the southwest lay Egypt, where hieroglyphics developed simultaneously and independently. Traders traveled, too, from Cyprus and Crete, bringing their own incompatible systems. With glyphs from Minoan, Hittite, and Anatolian, it made for a symbolic stew.
([Location 561](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=561))
- The paleographer has a unique bootstrap problem. It is only writing that makes its own history possible. The foremost twentieth-century authority on the alphabet, David Diringer, quoted an earlier scholar: “There never was a man who could sit down and say: ‘Now I am going to be the first man to write.’ ” The alphabet spread by contagion. The new technology was both the virus and the vector of transmission. It could not be monopolized, and it could not be suppressed. ([Location 571](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=571))
- “We know that formal logic is the invention of Greek culture after it had interiorized the technology of alphabetic writing,” Walter Ong says—it is true of India and China as well—“and so made a permanent part of its noetic resources the kind of thinking that alphabetic writing made possible.” For evidence Ong turns to fieldwork of the Russian psychologist Aleksandr Romanovich Luria among illiterate peoples in remote Uzbekistan and Kyrgyzstan in Central Asia in the 1930s. Luria found striking differences between illiterate and even slightly literate subjects, not in what they knew, but in how they thought. Logic implicates symbolism directly: things are members of classes; they possess qualities, which are abstracted and generalized. Oral people lacked the categories that become second nature even to illiterate individuals in literate cultures: for example, for geometrical shapes.
Shown drawings of circles and squares, they named them as “plate, sieve, bucket, watch, or moon” and “mirror, door, house, apricot drying board.” ([Location 651](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=651))
- “It was assumed that the Babylonians had had some sort of number mysticism or numerology,” wrote Asger Aaboe in 1963, “but we now know how far short of the truth this assumption was.” The Babylonians computed linear equations, quadratic equations, and Pythagorean numbers long before Pythagoras. In contrast to the Greek mathematics that followed, Babylonian mathematics did not emphasize geometry, except for practical problems; the Babylonians calculated areas and perimeters but did not prove theorems. Yet they could (in effect) reduce elaborate second-degree polynomials. Their mathematics seemed to value computational power above all. That could not be appreciated until computational power began to mean something. By the time modern mathematicians turned their attention to Babylon, many important tablets had already been destroyed or scattered. Fragments retrieved from Uruk before 1914, for example, were dispersed to Berlin, Paris, and Chicago and only fifty years later were discovered to hold the beginning methods of astronomy. To demonstrate this, Otto Neugebauer, the leading twentieth-century historian of ancient mathematics, had to reassemble tablets whose fragments had made their way to opposite sides of the Atlantic Ocean. In 1949, when the number of cuneiform tablets housed in museums reached (at his rough guess) a half million, Neugebauer lamented, “Our task can therefore properly be compared with restoring the history of mathematics from a few torn pages which have accidentally survived the destruction of a great library.” ([Location 748](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=748))
- The book Cawdrey made was the first English dictionary. The word dictionary was not in it.
([Location 941](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=941))
- In the ancient world, alphabetical lists scarcely appeared until around 250 BCE, in papyrus texts from Alexandria. The great library there seems to have used at least some alphabetization in organizing its books. The need for such an artificial ordering scheme arises only with large collections of data, not otherwise ordered. And the possibility of alphabetical order arises only in languages possessing an alphabet: a discrete small symbol set with its own conventional sequence (“abecedarie, the order of the Letters, or hee that useth them”). Even then the system is unnatural. It forces the user to detach information from meaning; to treat words strictly as character strings; to focus abstractly on the configuration of the word. ([Location 964](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=964))
- The dictionary ratifies the persistence of the word. It declares that the meanings of words come from other words. It implies that all words, taken together, form an interlocking structure: interlocking, because all words are defined in terms of other words. This could never have been an issue in an oral culture, where language was barely visible. Only when printing—and the dictionary—put the language into separate relief, as an object to be scrutinized, could anyone develop a sense of word meaning as interdependent and even circular. Words had to be considered as words, representing other words, apart from things. In the twentieth century, when the technologies of logic advanced to high levels, the potential for circularity became a problem. “In giving explanations I already have to use language full blown,” complained Ludwig Wittgenstein. He echoed Newton’s frustration three centuries earlier, but with an extra twist, because where Newton wanted words for nature’s laws, Wittgenstein wanted words for words: “When I talk about language (words, sentences, etc.)
I must speak the language of every day. Is this language somehow too coarse and material for what we want to say?” Yes. And the language was always in flux. ([Location 1117](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1117))
- The founders of the dictionary explicitly meant to find every word, however many that would ultimately be. They planned a complete inventory. Why should they not? The number of books was unknown but not unlimited, and the number of words in those books was countable. The task seemed formidable but finite. It no longer seems finite. Lexicographers are accepting the language’s boundlessness. They know by heart Murray’s famous remark: “The circle of the English language has a well-defined centre but no discernable circumference.” ([Location 1212](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1212))
- The whole word hoard—the lexis—constitutes a symbol set of the language. It is the fundamental symbol set, in one way: words are the first units of meaning any language recognizes. They are recognized universally. But in another way it is far from fundamental: as communication evolves, messages in a language can be broken down and composed and transmitted in much smaller sets of symbols: the alphabet; dots and dashes; drumbeats high and low. These symbol sets are discrete. The lexis is not. It is messier. It keeps on growing. Lexicography turns out to be a science poorly suited to exact measurement. English, the largest and most widely shared language, can be said very roughly to possess a number of units of meaning that approaches a million. Linguists have no special yardsticks of their own; when they try to quantify the pace of neologism, they tend to look to the dictionary for guidance, and even the best dictionary runs from that responsibility. The edges always blur. A clear line cannot be drawn between word and unword.
([Location 1254](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1254))
- The lexis is a measure of shared experience, which comes from interconnectedness. The number of users of the language forms only the first part of the equation: jumping in four centuries from 5 million English speakers to a billion. The driving factor is the number of connections between and among those speakers. A mathematician might say that messaging grows not geometrically, but combinatorially, which is much, much faster. ([Location 1297](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1297))
- Like the printing press, the telegraph, and the telephone before it, the Internet is transforming the language simply by transmitting information differently. What makes cyberspace different from all previous information technologies is its intermixing of scales from the largest to the smallest without prejudice, broadcasting to the millions, narrowcasting to groups, instant messaging one to one. ([Location 1302](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1302))
- “Those who enjoy leisure can scarcely find a more interesting and instructive pursuit than the examination of the workshops of their own country, which contain within them a rich mine of knowledge, too generally neglected by the wealthier classes.” ([Location 1336](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1336))
- “Numbers have many charms, unseen by vulgar eyes, and only discovered to the unwearied and respectful sons of Art. Sweet joy may arise from such contemplations.” ([Location 1384](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1384))
- Beginning in 1767, England’s Board of Longitude ordered published a yearly Nautical Almanac, with position tables for the sun, moon, stars, planets, and moons of Jupiter.
Over the next half century a network of computers did the work—thirty-four men and one woman, Mary Edwards of Ludlow, Shropshire, all working from their homes. Their painstaking labor paid £70 a year. Computing was a cottage industry. Some mathematical sense was required but no particular genius; rules were laid out in steps for each type of calculation. In any case the computers, being human, made errors, so the same work was often farmed out twice for the sake of redundancy. (Unfortunately, being human, computers were sometimes caught saving themselves labor by copying from one another.) ([Location 1406](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1406))
- As for his own engine—the one that would travel nowhere—he had found a fine new metaphor. It would be, he said, “a locomotive that lays down its own railway.” ([Location 1919](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1919))
- She devised a process, a set of rules, a sequence of operations. In another century this would be called an algorithm, later a computer program, but for now the concept demanded painstaking explanation. The trickiest point was that her algorithm was recursive. It ran in a loop. The result of one iteration became food for the next. Babbage had alluded to this approach as “the Engine eating its own tail.” A.A.L.
explained: “We easily perceive that since every successive function is arranged in a series following the same law, there would be a cycle of a cycle of a cycle, &c.… The question is so exceedingly complicated, that perhaps few persons can be expected to follow.… Still it is a very important case as regards the engine, and suggests ideas peculiar to itself, which we should regret to pass wholly without allusion.” ([Location 1973](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=1973))
- “I find that my plans & ideas keep gaining in clearness, & assuming more of the crystalline & less & less of the nebulous form.” ([Location 2005](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2005))
- Meant first to generate number tables, the engine in its modern form instead rendered number tables obsolete. Did Babbage anticipate that? He did wonder how the future would make use of his vision. He guessed that a half century would pass before anyone would try again to create a general-purpose computing machine. In fact, it took most of a century for the necessary substrate of technology to be laid down. ([Location 2080](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2080))
- “Electricity is the poetry of science,” an American historian declared in 1852. ([Location 2148](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2148))
- Before there were electric telegraphs, there were just telegraphs: les télégraphes, invented and named by Claude Chappe in France during the Revolution.* They were optical; a “telegraph” was a tower for sending signals to other towers in line of sight. ([Location 2186](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2186))
- All the would-be inventors of the electrical telegraph—and there were many—worked from the same toolkit. They had their wires, and they had magnetic needles.
They had batteries: galvanic cells, linked together, producing electricity from the reaction of metal strips immersed in acid baths. They did not have lights. They did not have motors. They had whatever mechanisms they could construct from wood and brass: pins, screws, wheels, springs, and levers. In the end they had the shared target at which they all aimed: the letters of the alphabet. ([Location 2352](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2352))
- Information that just two years earlier had taken days to arrive at its destination could now be there—anywhere—in seconds. This was not a doubling or tripling of transmission speed; it was a leap of many orders of magnitude. It was like the bursting of a dam whose presence had not even been known. ([Location 2470](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2470))
- In this time of conceptual change, mental readjustments were needed to understand the telegraph itself. Confusion inspired anecdotes, which often turned on awkward new meanings of familiar terms: innocent words like send, and heavily laden ones, like message. There was the woman who brought a dish of sauerkraut into the telegraph office in Karlsruhe to be “sent” to her son in Rastatt. She had heard of soldiers being “sent” to the front by telegraph. There was the man who brought a “message” into the telegraph office in Bangor, Maine. The operator manipulated the telegraph key and then placed the paper on the hook. The customer complained that the message had not been sent, because he could still see it hanging on the hook.
([Location 2532](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2532))
- “They string an instrument against the sky,” said Robert Frost, “Wherein words whether beaten out or spoken / Will run as hushed as when they were a thought.” ([Location 2551](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2551))
- The wires resembled nothing in architecture and not much in nature. Writers seeking similes thought of spiders and their webs. They thought of labyrinths and mazes. And one more word seemed appropriate: the earth was being covered, people said, with an iron net-work. “A net-work of nerves of iron wire, strung with lightning, will ramify from the brain, New York, to the distant limbs and members,” said the New York Tribune. “The whole net-work of wires,” wrote Harper’s, “all quivering from end to end with signals of human intelligence.” Wynter offered a prediction. “The time is not distant,” he wrote, “when everybody will be able to talk with everybody without going out of the house.” He meant “talk” metaphorically. ([Location 2553](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2553))
- “For in the general we must note, That whatever is capable of a competent Difference, perceptible to any Sense, may be a sufficient Means whereby to express the Cogitations.” A difference could be “two Bells of different Notes”; or “any Object of Sight, whether Flame, Smoak, &c.”; or trumpets, cannons, or drums. Any difference meant a binary choice. Any binary choice began the expressing of cogitations. Here, in this arcane and anonymous treatise of 1641, the essential idea of information theory poked to the surface of human thought, saw its shadow, and disappeared again for four hundred years. ([Location 2730](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=2730))
- One was Berry’s paradox, first suggested to Russell by G. G. Berry, a librarian at the Bodleian.
It has to do with counting the syllables needed to specify each integer. Generally, of course, the larger the number the more syllables are required. In English, the smallest integer requiring two syllables is seven. The smallest requiring three syllables is eleven. The number 121 seems to require six syllables (“one hundred twenty-one”), but actually four will do the job, with some cleverness: “eleven squared.” Still, even with cleverness, there are only a finite number of possible syllables and therefore a finite number of names, and, as Russell put it, “Hence the names of some integers must consist of at least nineteen syllables, and among these there must be a least. Hence the least integer not nameable in fewer than nineteen syllables must denote a definite integer.”* Now comes the paradox. This phrase, the least integer not nameable in fewer than nineteen syllables, contains only eighteen syllables. So the least integer not nameable in fewer than nineteen syllables has just been named in fewer than nineteen syllables. ([Location 3034](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=3034))
- “There is a molecular archeology in the making,” says Werner Loewenstein. The history of life is written in terms of negative entropy. “What actually evolves is information in all its forms or transforms. If there were something like a guidebook for living creatures, I think, the first line would read like a biblical commandment, Make thy information larger.” ([Location 5121](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5121))
- Why? Chaitin asked. He wondered if at some level Gödel’s incompleteness could be connected to that new principle of quantum physics, uncertainty, which smelled similar somehow. Later, the adult Chaitin had a chance to put this question to the oracular John Archibald Wheeler. Was Gödel incompleteness related to Heisenberg uncertainty?
Wheeler answered by saying he had once posed that very question to Gödel himself, in his office at the Institute for Advanced Study—Gödel with his legs wrapped in a blanket, an electric heater glowing warm against the wintry drafts. Gödel refused to answer. In this way, Wheeler refused to answer Chaitin. ([Location 5462](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5462))
- It is a simple word, random, and everyone knows what it means. Everyone, that is, and no one. Philosophers and mathematicians struggled endlessly. Wheeler said this much, at least: “Probability, like time, is a concept invented by humans, and humans have to bear the responsibility for the obscurities that attend it.” ([Location 5472](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5472))
- John Maynard Keynes tackled randomness in terms of its opposites, and he chose three: knowledge, causality, and design. What is known in advance, determined by a cause, or organized according to plan cannot be random. ([Location 5478](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5478))
- Von Neumann realized that a mechanical computer, with its deterministic algorithms and finite storage capacity, could never generate truly random numbers. He would have to settle for pseudorandom numbers: deterministically generated numbers that behaved as if random. They were random enough for practical purposes. “Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin,” said von Neumann. ([Location 5508](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5508))
- Shannon’s first formulation of information theory treated messages statistically, as choices from the ensemble of all possible messages—in the case of A and B, 2⁵⁰ of them. But Shannon also considered redundancy within a message: the pattern, the regularity, the order that makes a message compressible.
The more regularity in a message, the more predictable it is. The more predictable, the more redundant. The more redundant a message is, the less information it contains. ([Location 5545](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5545))
- Because the spirit of frugal telegraph operators kept the lights on at Bell Labs, it was natural for Claude Shannon to explore data compression, both theory and practice. Compression was fundamental to his vision: his war work on cryptography analyzed the disguising of information at one end and the recovery of the information at the other; data compression likewise encodes the information, with a different motivation—the efficient use of bandwidth. ([Location 5798](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5798))
- When humans or computers learn from experience, they are using induction: recognizing regularities amid irregular streams of information. From this point of view, the laws of science represent data compression in action. A theoretical physicist acts like a very clever coding algorithm. “The laws of science that have been discovered can be viewed as summaries of large amounts of empirical data about the universe,” wrote Solomonoff. ([Location 5819](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5819))
- Solomonoff, Kolmogorov, and Chaitin tackled three different problems and came up with the same answer. Solomonoff was interested in inductive inference: given a sequence of observations, how can one make the best predictions about what will come next? Kolmogorov was looking for a mathematical definition of randomness: what does it mean to say that one sequence is more random than another, when they have the same probability of emerging from a series of coin flips?
And Chaitin was trying to find a deep path into Gödel incompleteness by way of Turing and Shannon—as he said later, “putting Shannon’s information theory and Turing’s computability theory into a cocktail shaker and shaking vigorously.” They all arrived at minimal program size. And they all ended up talking about complexity. ([Location 5825](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5825))
- The empty string is as simple as can be; the random string is maximally complex. The zeroes convey no information; coin tosses produce the most information possible. Yet these extremes have something in common. They are dull. They have no value. If either one were a message from another galaxy, we would attribute no intelligence to the sender. If they were music, they would be equally worthless. Everything we care about lies somewhere in the middle, where pattern and randomness interlace. ([Location 5963](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=5963))
- The younger man pursued Landauer’s principle by analyzing every kind of computer he could imagine, real and abstract, from Turing machines and messenger RNA to “ballistic” computers, carrying signals via something like billiard balls. He confirmed that a great deal of computation can be done with no energy cost at all. In every case, Bennett found, heat dissipation occurs only when information is erased. Erasure is the irreversible logical operation. ([Location 6110](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6110))
- In Szilárd’s thought experiment, the demon does not incur an entropy cost when it observes or chooses a molecule. The payback comes at the moment of clearing the record, when the demon erases one observation to make room for the next. Forgetting takes work. ([Location 6115](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6115))
- “A nonrandom whole can have random parts,” says Bennett.
“This is the most counterintuitive part of quantum mechanics, yet it follows from the superposition principle and is the way nature works, as far as we know. ([Location 6169](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6169))
- “Feynman’s insight,” says Bennett, “was that a quantum system is, in a sense, computing its own future all the time. ([Location 6233](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6233))
- Embodying a superposition of states gives the qubit more power than the classical bit, always in only one state or the other, zero or one, “a pretty miserable specimen of a two-dimensional vector,” as David Mermin says. “When we learned to count on our sticky little classical fingers, we were misled,” Rolf Landauer said dryly. “We thought that an integer had to have a particular and unique value.” But no—not in the real world, which is to say the quantum world. ([Location 6237](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6237))
- Quantum information is like a dream—evanescent, never quite existing as firmly as a word on a printed page. “Many people can read a book and get the same message,” Bennett says, “but trying to tell people about your dream changes your memory of it, so that eventually you forget the dream and remember only what you said about it.” ([Location 6313](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6313))
- It was John Wheeler who left behind an agenda for quantum information science—a modest to-do list for the next generation of physicists and computer scientists together: “Translate the quantum versions of string theory and of Einstein’s geometrodynamics from the language of continuum to the language of bit,” he exhorted his heirs.
“Survey one by one with an imaginative eye the powerful tools that mathematics—including mathematical logic—has won … and for each such technique work out the transcription into the world of bits.” And, “From the wheels-upon-wheels-upon-wheels evolution of computer programming dig out, systematize and display every feature that illuminates the level-upon-level-upon-level structure of physics.” ([Location 6327](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6327))
- storehouses. The persistence of information, the difficulty of forgetting, so characteristic of our time, accretes confusion. ([Location 6361](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6361))
- Long before Borges, the imagination of Charles Babbage had conjured another library of Babel. He found it in the very air: a record, scrambled yet permanent, of every human utterance. What a strange chaos is this wide atmosphere we breathe!… The air itself is one vast library, on whose pages are for ever written all that man has ever said or woman whispered. There, in their mutable but unerring characters, mixed with the earliest, as well as the latest sighs of mortality, stand for ever recorded, vows unredeemed, promises unfulfilled, perpetuating in the united movements of each particle, the testimony of man’s changeful will. ([Location 6377](https://readwise.io/to_kindle?action=open&asin=B004DEPHUC&location=6377))
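
The highlight at Location 5508, on von Neumann settling for pseudorandom numbers, can be made concrete with his own early scheme, the middle-square method: square the current value and keep the middle digits as the next "random" number. The sketch below is illustrative (the function name and the choice of four-digit values are mine, not from the text); it shows the key point of the highlight, that a deterministic algorithm always reproduces the same sequence from the same seed.

```python
def middle_square(seed: int, n: int) -> list[int]:
    """Von Neumann's middle-square method: square the current
    four-digit value and keep the middle four digits of the
    (zero-padded) eight-digit result."""
    out, x = [], seed
    for _ in range(n):
        x = (x * x) // 100 % 10000  # middle four digits of x**2
        out.append(x)
    return out

print(middle_square(1234, 5))
# Deterministic: the same seed always yields the same sequence.
print(middle_square(1234, 5) == middle_square(1234, 5))  # True
```

The method also illustrates why pseudorandomness is only "random enough for practical purposes": with at most 10,000 possible states, the sequence must eventually cycle, and in practice it often degenerates quickly.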
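
The highlights on redundancy (Location 5545) and on minimal program size (Location 5963) suggest a simple experiment: a general-purpose compressor gives a rough, computable stand-in for "shortest description length". This sketch (the use of `zlib` and the specific string lengths are my choices, assumptions for illustration) compresses a maximally regular string and a pseudorandom string of equal length; the redundant one collapses to almost nothing, while the random one barely shrinks.

```python
import random
import zlib

length = 10_000

# Maximal regularity: pure repetition, almost no information per byte.
regular = b"AB" * (length // 2)

# Pseudorandom bytes (seeded so the demo is reproducible).
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(length))

# Compressed size approximates the length of the shortest description.
print(len(zlib.compress(regular)))  # tiny: the pattern is its own summary
print(len(zlib.compress(noisy)))    # close to the original length
```

The asymmetry mirrors the Solomonoff-Kolmogorov-Chaitin idea in the highlights: a message is random to the extent that no program much shorter than the message itself can generate it.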