
The Information: A History, a Theory, a Flood

James Gleick




  Copyright © 2011 by James Gleick

  All rights reserved. Published in the United States by Pantheon Books, a division of Random House, Inc., New York, and in Canada by Random House of Canada Limited, Toronto.

  Pantheon Books and colophon are registered trademarks of Random House, Inc.

  Library of Congress Cataloging-in-Publication Data

  Gleick, James.

  The information : a history, a theory, a flood / James Gleick.

  p. cm.

  Includes bibliographical references and index.

  eISBN 978-0-307-37957-3

  1. Information science—History. 2. Information society. I. Title.

  Z665.G547 2011 020.9—dc22 2010023221

  www.around.com

  www.pantheonbooks.com

  Jacket design by Peter Mendelsund

  v3.1

  FOR CYNTHIA

  Anyway, those tickets, the old ones, they didn’t tell you where you were going, much less where you came from. He couldn’t remember seeing any dates on them, either, and there was certainly no mention of time. It was all different now, of course. All this information. Archie wondered why that was.

  —Zadie Smith

  What we call the past is built on bits.

  —John Archibald Wheeler

  Contents

  Prologue

  Chapter 1. Drums That Talk

  Chapter 2. The Persistence of the Word

  Chapter 3. Two Wordbooks

  Chapter 4. To Throw the Powers of Thought into Wheel-Work

  Chapter 5. A Nervous System for the Earth

  Chapter 6. New Wires, New Logic

  Chapter 7. Information Theory

  Chapter 8. The Informational Turn

  Chapter 9. Entropy and Its Demons

  Chapter 10. Life’s Own Code

  Chapter 11. Into the Meme Pool

  Chapter 12. The Sense of Randomness

  Chapter 13. Information Is Physical

  Chapter 14. After the Flood

  Chapter 15. New News Every Day

  Epilogue

  Acknowledgments

  Notes

  Bibliography

  Index

  A Note About the Author

  Illustration Credits

  PROLOGUE

  The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning.

  —Claude Shannon (1948)

  AFTER 1948, which was the crucial year, people thought they could see the clear purpose that inspired Claude Shannon’s work, but that was hindsight. He saw it differently: My mind wanders around, and I conceive of different things day and night. Like a science-fiction writer, I’m thinking, “What if it were like this?”♦

  As it happened, 1948 was when the Bell Telephone Laboratories announced the invention of a tiny electronic semiconductor, “an amazingly simple device” that could do anything a vacuum tube could do and more efficiently. It was a crystalline sliver, so small that a hundred would fit in the palm of a hand. In May, scientists formed a committee to come up with a name, and the committee passed out paper ballots to senior engineers in Murray Hill, New Jersey, listing some choices: semiconductor triode … iotatron … transistor (a hybrid of varistor and transconductance). Transistor won out. “It may have far-reaching significance in electronics and electrical communication,” Bell Labs declared in a press release, and for once the reality surpassed the hype. The transistor sparked the revolution in electronics, setting the technology on its path of miniaturization and ubiquity, and soon won the Nobel Prize for its three chief inventors. For the laboratory it was the jewel in the crown. But it was only the second most significant development of that year. The transistor was only hardware.

  An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand—“A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon.♦ The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.

  But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable, as information.
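  Shannon's measure can be stated in a single line: the entropy of a source of messages, H = −Σ p log₂ p, counts the bits it takes, on average, to convey one symbol. A minimal sketch in Python (the example distributions are invented here, purely for illustration):

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy, H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly one bit per toss.
print(entropy_bits([0.5, 0.5]))    # 1.0

# A biased coin carries less: its outcomes are more predictable.
print(entropy_bits([0.9, 0.1]))    # ~0.47

# Eight equally likely symbols cost three bits apiece.
print(entropy_bits([1/8] * 8))     # 3.0
```

  The less predictable the message, the more bits it takes: surprise, not meaning, is what the unit measures.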

  Shannon supposedly belonged to the Bell Labs mathematical research group, but he mostly kept to himself.♦ When the group left the New York headquarters for shiny new space in the New Jersey suburbs, he stayed behind, haunting a cubbyhole in the old building, a twelve-story sandy brick hulk on West Street, its industrial back to the Hudson River, its front facing the edge of Greenwich Village. He disliked commuting, and he liked the downtown neighborhood, where he could hear jazz clarinetists in late-night clubs. He was flirting shyly with a young woman who worked in Bell Labs’ microwave research group in the two-story former Nabisco factory across the street. People considered him a smart young man. Fresh from MIT he had plunged into the laboratory’s war work, first developing an automatic fire-control director for antiaircraft guns, then focusing on the theoretical underpinnings of secret communication—cryptography—and working out a mathematical proof of the security of the so-called X System, the telephone hotline between Winston Churchill and President Roosevelt. So now his managers were willing to leave him alone, even though they did not understand exactly what he was working on.

  AT&T at midcentury did not demand instant gratification from its research division. It allowed detours into mathematics or astrophysics with no apparent commercial purpose. Anyway so much of modern science bore directly or indirectly on the company’s mission, which was vast, monopolistic, and almost all-encompassing. Still, broad as it was, the telephone company’s core subject matter remained just out of focus. By 1948 more than 125 million conversations passed daily through the Bell System’s 138 million miles of cable and 31 million telephone sets.♦ The Bureau of the Census reported these facts under the rubric of “Communications in the United States,” but they were crude measures of communication. The census also counted several thousand broadcasting stations for radio and a few dozen for television, along with newspapers, books, pamphlets, and the mail. The post office counted its letters and parcels, but what, exactly, did the Bell System carry, counted in what units? Not conversations, surely; nor words, nor certainly characters. Perhaps it was just electricity. The company’s engineers were electrical engineers. Everyone understood that electricity served as a surrogate for sound, the sound of the human voice, waves in the air entering the telephone mouthpiece and converted into electrical waveforms. This conversion was the essence of the telephone’s advance over the telegraph—the predecessor technology, already seeming so quaint. Telegraphy relied on a different sort of conversion: a code of dots and dashes, not based on sounds at all but on the written alphabet, which was, after all, a code in its turn. Indeed, considering the matter closely, one could see a chain of abstraction and conversion: the dots and dashes representing letters of the alphabet; the letters representing sounds, and in combination forming words; the words representing some ultimate substrate of meaning, perhaps best left to philosophers.
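  That chain of codes can be made literal. A toy sketch (the table below holds just four letters of Morse code, enough for one famous message; the function name is invented for illustration):

```python
# Each letter is already a code for sounds; Morse layers a second code on top.
MORSE = {'S': '...', 'O': '---', 'E': '.', 'T': '-'}

def to_morse(word):
    return ' '.join(MORSE[letter] for letter in word)

print(to_morse('SOS'))   # ... --- ...
```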

  The Bell System had none of those, but the company had hired its first mathematician in 1897: George Campbell, a Minnesotan who had studied in Göttingen and Vienna. He immediately confronted a crippling problem of early telephone transmission. Signals were distorted as they passed across the circuits; the greater the distance, the worse the distortion. Campbell’s solution was partly mathematics and partly electrical engineering.♦ His employers learned not to worry much about the distinction. Shannon himself, as a student, had never been quite able to decide whether to become an engineer or a mathematician. For Bell Labs he was both, willy-nilly, practical about circuits and relays but happiest in a realm of symbolic abstraction. Most communications engineers focused their expertise on physical problems, amplification and modulation, phase distortion and signal-to-noise degradation. Shannon liked games and puzzles. Secret codes entranced him, beginning when he was a boy reading Edgar Allan Poe. He gathered threads like a magpie. As a first-year research assistant at MIT, he worked on a hundred-ton proto-computer, Vannevar Bush’s Differential Analyzer, which could solve equations with great rotating gears, shafts, and wheels. At twenty-two he wrote a dissertation that applied a nineteenth-century idea, George Boole’s algebra of logic, to the design of electrical circuits. (Logic and electricity—a peculiar combination.) Later he worked with the mathematician and logician Hermann Weyl, who taught him what a theory was: “Theories permit consciousness to ‘jump over its own shadow,’ to leave behind the given, to represent the transcendent, yet, as is self-evident, only in symbols.”♦
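  The kernel of that dissertation can be suggested in a few lines. In this sketch (the particular circuit is invented for illustration, not drawn from Shannon's thesis), relays wired in series act like Boolean AND, relays wired in parallel like OR, and an algebraic identity becomes a hardware saving:

```python
# Series wiring behaves like AND; parallel wiring behaves like OR.
def AND(a, b): return a and b
def OR(a, b):  return a or b

def redundant_circuit(x, y):
    # A relay x in parallel with (x in series with y): x OR (x AND y).
    return OR(x, AND(x, y))

def simplified_circuit(x, y):
    # Boole's absorption law, x + xy = x, says the second branch is dead weight.
    return x

# The two circuits agree on every input, so the extra relays can be removed.
for x in (False, True):
    for y in (False, True):
        assert redundant_circuit(x, y) == simplified_circuit(x, y)
```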

  In 1943 the English mathematician and code breaker Alan Turing visited Bell Labs on a cryptographic mission and met Shannon sometimes over lunch, where they traded speculation on the future of artificial thinking machines. (“Shannon wants to feed not just data to a Brain, but cultural things!”♦ Turing exclaimed. “He wants to play music to it!”) Shannon also crossed paths with Norbert Wiener, who had taught him at MIT and by 1948 was proposing a new discipline to be called “cybernetics,” the study of communication and control. Meanwhile Shannon began paying special attention to television signals, from a peculiar point of view: wondering whether their content could be somehow compacted or compressed to allow for faster transmission. Logic and circuits crossbred to make a new, hybrid thing; so did codes and genes. In his solitary way, seeking a framework to connect his many threads, Shannon began assembling a theory for information.
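  The intuition behind compacting a signal can be shown with the simplest scheme of all, run-length coding, used here purely as an illustration (it is not what Shannon was proposing): the redundancy in a signal is exactly what makes it compressible.

```python
from itertools import groupby

def run_length_encode(signal):
    """Replace each run of repeated samples with a (value, count) pair."""
    return [(value, len(list(run))) for value, run in groupby(signal)]

# A scan line that is mostly flat compresses well.
scan_line = [0] * 12 + [7, 7, 7] + [0] * 9
print(run_length_encode(scan_line))   # [(0, 12), (7, 3), (0, 9)]
```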

  The raw material lay all around, glistening and buzzing in the landscape of the early twentieth century, letters and messages, sounds and images, news and instructions, figures and facts, signals and signs: a hodgepodge of related species. They were on the move, by post or wire or electromagnetic wave. But no one word denoted all that stuff. “Off and on,” Shannon wrote to Vannevar Bush at MIT in 1939, “I have been working on an analysis of some of the fundamental properties of general systems for the transmission of intelligence.”♦ Intelligence: that was a flexible term, very old. “Nowe used for an elegant worde,” Sir Thomas Elyot wrote in the sixteenth century, “where there is mutuall treaties or appoyntementes, eyther by letters or message.”♦ It had taken on other meanings, though. A few engineers, especially in the telephone labs, began speaking of information. They used the word in a way suggesting something technical: quantity of information, or measure of information. Shannon adopted this usage.

  For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague—force, mass, motion, and even time—and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature.

  It was the same with information. A rite of purification became necessary.

  And then, when it was made simple, distilled, counted in bits, information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age. “Man the food-gatherer reappears incongruously as information-gatherer,”♦ remarked Marshall McLuhan in 1967. He wrote this an instant too soon, in the first dawn of computation and cyberspace.

  We can see now that information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ”♦ declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions.… If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.
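  The arithmetic behind that figure is short, granting two round numbers as assumptions: roughly three billion bases in the genome, drawn from a four-letter alphabet.

```python
import math

bases = 3_000_000_000            # approximate length of the human genome
bits_per_base = math.log2(4)     # four letters (A, C, G, T) -> 2 bits each

print(f"{bases * bits_per_base:.0e} bits")   # 6e+09: the 6 billion bits above
```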

  “The information circle becomes the unit of life,”♦ says Werner Loewenstein after thirty years spent studying intercellular communication. He reminds us that information means something deeper now: “It connotes a cosmic principle of organization and order, and it provides an exact measure of that.” The gene has its cultural analog, too: the meme. In cultural evolution, a meme is a replicator and propagator—an idea, a fashion, a chain letter, or a conspiracy theory. On a bad day, a meme is a virus.

  Economics is recognizing itself as an information science, now that money itself is completing a developmental arc from matter to bits, stored in computer memory and magnetic strips, world finance coursing through the global nervous system. Even when money seemed to be material treasure, heavy in pockets and ships’ holds and bank vaults, it always was information. Coins and notes, shekels and cowries were all just short-lived technologies for tokenizing information about who owns what.

  And atoms? Matter has its own coinage, and the hardest science of all, physics, seemed to have reached maturity. But physics, too, finds itself sideswiped by a new intellectual model. In the years after World War II, the heyday of the physicists, the great news of science appeared to be the splitting of the atom and the control of nuclear energy. Theorists focused their prestige and resources on the search for fundamental particles and the laws governing their interaction, the construction of giant accelerators and the discovery of quarks and gluons. From this exalted enterprise, the business of communications research could not have appeared further removed. At Bell Labs, Claude Shannon was not thinking about physics. Particle physicists did not need bits.

  And then, all at once, they did. Increasingly, the physicists and the information theorists are one and the same. The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence. Bridging the physics of the twentieth and twenty-first centuries, John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, put this manifesto in oracular monosyllables: “It from Bit.” Information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.”♦ This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.

  A key to the enigma is a type of relationship that had no place in classical physics: the phenomenon known as entanglement. When particles or quantum systems are entangled, their properties remain correlated across vast distances and vast times. Light-years apart, they share something that is physical, yet not only physical. Spooky paradoxes arise, unresolvable until one understands how entanglement encodes information, measured in bits or their drolly named quantum counterpart, qubits. When photons and electrons and other particles interact, what are they really doing? Exchanging bits, transmitting quantum states, processing information. The laws of physics are the algorithms. Every burning star, every silent nebula, every particle leaving its ghostly trace in a cloud chamber is an information processor. The universe computes its own destiny.