
The Transparent Society, Page 5

David Brin


  But not anymore, say some experts. We are fast reaching the point where expertly controlled computers can adjust an image, pixel by microscopic pixel, and not leave a clue behind. Much of the impetus comes from Hollywood, where perfect verisimilitude is demanded for on-screen fictions and fabulations like Forrest Gump and Jurassic Park. Yet some thoughtful film wizards worry how these technologies will be used outside the theaters.

  “History is kind of a consensual hallucination,” said director James Cameron recently. He went on to suggest that people wanting to prove that some event really happened might soon have to track closely the “pedigree” of their photographic evidence, showing they retained possession at all stages, as with blood samples from a crime scene.
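  The “pedigree” Cameron describes is what engineers call a chain of custody. A minimal sketch in Python (all names and record fields here are hypothetical, chosen only for illustration) links each possession record to the one before it by a cryptographic hash, so that rewriting any earlier record breaks every link that follows:

```python
import hashlib
import json
import time

def record_transfer(prev_hash, holder, image_digest):
    """Append one custody record; each record commits to the one before it."""
    record = {
        "holder": holder,
        "image": image_digest,
        "time": time.time(),
        "prev": prev_hash,
    }
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

def verify_chain(records):
    """A forged entry anywhere invalidates the chain from that point on."""
    prev = None
    for rec in records:
        body = {k: v for k, v in rec.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if rec["hash"] != expected:
            return False
        if prev is not None and rec["prev"] != prev["hash"]:
            return False
        prev = rec
    return True

# A photograph changes hands twice; then someone rewrites the first record.
digest = hashlib.sha256(b"raw negative bytes").hexdigest()
chain = [record_transfer(None, "photographer", digest)]
chain.append(record_transfer(chain[-1]["hash"], "news desk", digest))
print(verify_chain(chain))      # intact chain
chain[0]["holder"] = "impostor"
print(verify_chain(chain))      # tampered chain
```

  Like the blood samples Cameron invokes, the scheme proves only continuity of possession, not that the first record was honest.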

  One day a rumor spread across the kingdom. It suggested that some of the sighted were no longer faithfully telling the complete truth. Shouted directions sometimes sent normal blind people into ditches. Occasional harsh laughter was heard.

  Several of the sighted came forward and confessed that things were worse than anyone feared. “Some of us appear to have been lying for quite a while. A few even think it’s funny to lead normal blind people astray!

  “This power is a terrible temptation. You will never be able to tell which of us is lying or telling the truth. Even the best of the sighted can no longer be trusted completely.”

  The new technologies of photographic deception have gone commercial. For instance, a new business called Out Takes recently set up shop next to Universal Studios, in Los Angeles, promising to “put you in the movies.” For a small fee they will insert your visage in a tête-à-tête with Humphrey Bogart or Marilyn Monroe, exchanging either tense dialogue or a romantic moment. This may seem harmless on the surface, but the long-range possibilities disturb Ken Burns, innovative director of the famed public television series The Civil War. “If everything is possible, then nothing is true. And that, to me, is the abyss we stare into. The only weapon we might have, besides some internal restraint, is skepticism.” Skepticism may then further transmute into cynicism, Burns worries, or else, in the arts, into decadence. To which NBC reporter Jeff Greenfield added: “Skepticism may itself come with a very high price. Suppose we can no longer trust the evidence of our own eyes to know that something momentous, or something horrible, actually happened?”

  There are some technical “fixes” that might help a little—buying special sealed digital cameras, for instance, that store images with time-stamped and encrypted watermarks. But as we’ll see in chapter 8, that solution may be temporary, at best. Nor will it change the basic problem, as photography ceases to be our firm anchor in a sea of subjectivity.
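  The text does not specify how such a sealed camera would work; one plausible sketch, assuming a secret key sealed into the camera at the factory and shared with a trusted verification service (all names here are hypothetical), uses a keyed digest over the image and a timestamp as the “watermark”:

```python
import hmac
import hashlib
import time

# Hypothetical sealed key: burned into the camera hardware at the factory
# and known also to a trusted verification service.
CAMERA_KEY = b"factory-sealed-secret"

def stamp_image(image_bytes):
    """Bind the image to its capture time with a keyed digest."""
    timestamp = str(int(time.time())).encode()
    tag = hmac.new(CAMERA_KEY, image_bytes + timestamp, hashlib.sha256).hexdigest()
    return {"image": image_bytes, "timestamp": timestamp, "tag": tag}

def verify_stamp(stamped):
    """Any pixel-level edit made after capture invalidates the tag."""
    expected = hmac.new(
        CAMERA_KEY, stamped["image"] + stamped["timestamp"], hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(expected, stamped["tag"])

photo = stamp_image(b"pixels from the sensor")
print(verify_stamp(photo))            # unmodified image
forged = dict(photo, image=b"doctored pixels")
print(verify_stamp(forged))           # altered image fails
```

  And the sketch makes the chapter’s caveat concrete: the fix is temporary at best, since anyone who pries the sealed key out of one camera can stamp forgeries that verify perfectly.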

  This news worried all the blind subjects of the kingdom. Some kept to their homes. Others banded together in groups, waving sticks and threatening the sighted, in hopes of ensuring correct information. But those who could see just started disguising their voices.

  One faction suggested blinding everybody, permanently, in order to be sure of true equality—or else setting fires to shroud the land in a smoky haze. “No one can bully anybody else, if we’re all in the dark,” these enthusiasts urged.

  As time passed, more people tripped over unexpected objects, or slipped into gullies, or took a wrong path because some anonymous voice shouted “left!” instead of “right.”

  At first, the problem with photography might seem just as devastating to transparency as to any other social “solution.” If cameras can no longer be trusted, then what good are they? How can open information flows be used to enforce accountability on the mighty, if anyone with a computer can change images at will? A spreading mood of dour pessimism was lately distilled by Fred Ritchin, professor of photography and multimedia at New York University: “The depth of the problem is so significant that in my opinion it makes, five or ten years down the road, the whole issue of democracy at question, because how can you have an informed electorate if they don’t know what to believe and what not to believe?”

  Then, one day, a little blind girl had an idea. She called together everybody in the kingdom and made an announcement.

  “I know what to do!” she said.

  Sometimes a problem seems vexing, till you realize that you were looking at it the wrong way all along. This is especially true about the “predicament” of doctored photographs and video images. We have fallen into a habit of perceiving pictures as unchanging documents, unique and intrinsically valid in their own right. To have that accustomed validity challenged is unnerving, until you realize that the camera is not a court stenographer, archivist, or notary public. It is an extension of our eyes. Photographs are just another kind of memory.

  So cameras can now lie? Photographs can deceive? So what? People have been untrustworthy for a very long time, and we’ve coped. Not perfectly. But there are ways to deal with liars.

  First, remember who fooled you before. Track their credibility, and warn others to beware. “Your basis cannot be looking at the reality of the photograph,” says Andrew Lippman, associate director of the Massachusetts Institute of Technology (MIT) Media Lab. “Your basis ... has to be in the court of trust.”

  Second, in a world where anyone can bear false witness, try to make damn sure there are lots of witnesses!

  “Here,” said the little girl, pushing bitter fruit under the noses of her parents and friends, who squirmed and made sour faces.

  “Eat it,” she insisted. “Stop whining about liars and go see for yourselves.”

  In real life, the “bitter fruit” is realizing that we must all share responsibility for keeping an eye on the world. People know that others tell untruths. Even when they sincerely believe their own testimony, it can be twisted by subconscious drives or involuntary misperceptions. Detectives have long grown used to the glaring omissions and bizarre embellishments that often warp eyewitness testimony.

  So? Do we shake our heads and announce the end of civilization? Or do we try to cope by bringing in additional testimony? Combing the neighborhood for more and better witnesses.

  One shouldn’t dismiss or trivialize the severe problems that will arise out of image fakery. Without any doubt there will be deceits, injustices, and terrible slanders. Conspiracy theories will burgeon as never before when fanatics can doctor so-called evidence to support wild claims. Others will fabricate alibis, frame the innocent, or try to cover up crimes. “Every advance in communications has brought with it the danger of misuse,” says Jeff Greenfield. “A hundred years ago, publishers brought out books of Abe Lincoln’s speeches containing some words he never spoke. Hitler spread hate on the radio. But today’s danger is different.”

  Greenfield is right. Today is different, because we have the power to make photographic forgery less worrisome.

  Because even pathological liars tend not to lie when they face a high probability of getting caught.

  Would we be tormenting ourselves over the Kennedy assassination today if fifty cameras had been rolling, instead of just poor Abraham Zapruder’s? Suppose some passerby had filmed Nazi goons setting fire to the Reichstag in 1933. Might Hitler have been ousted, and thirty million lives saved? Maybe not, but the odds would have been better. In the future, thugs and provocateurs will never know for certain that their sneaking calumny won’t be observed by a bystander or tourist, turning infrared optics toward those scurrying movements in the shadows.

  We are all hallucinators to some degree. So now our beloved cameras may also prove faulty and prone to deception? At least they don’t lie except when they are told to. It takes a deliberate act of meddling to alter most images in decisive ways. Cameras don’t have imaginations, though their acuity is improving all the time. In fact, when their fields of view overlap, we can use them to check on each other. Especially if a wide range of people do the viewing and controlling.

  As citizens, we shall deal with this problem the way members of an empirical civilization always have, by arguing and comparing notes, giving more credibility to the credible, and relying less on the anonymous or those who were caught lying in the past. Discerning truth, always a messy process, will be made more complex by these new, flawed powers of sight. But our consensual reality does not have to become a nightmare. Not when a majority of people contribute goodwill, openness, and lots of different points of view.

  Again, cameras are simply extensions of our eyes.

  If you’re worried that some of them are lying, tradition offers an answer: more cameras.

  We’ll solve it by giving up the comforting blanket of darkness, opening up these new eyes, and sharing the world with six billion fellow witnesses.

  CHAPTER TWO

  THE AGE OF KNOWLEDGE

  But all the conservatism in the world does not afford even a token resistance to the ecological sweep of the new electronic media.

  MARSHALL MCLUHAN, UNDERSTANDING MEDIA

  Although only a few may originate a policy, we are all able to judge it.

  PERICLES OF ATHENS

  TRANSFORMING TECHNOLOGIES OF THE PAST AND FUTURE

  Each human generation seems to have a fulcrum—a pivot around which fateful transformations revolve. Often, this has less to do with the struttings of kings and statesmen than with technology. We speak of Stone, Bronze, and Iron Ages. There were eras of steam and coal. Some historians already look back with nostalgia on the brief, glorious “petroleum century.”

  These epithets benefit from hindsight, but it is quite another thing to speak accurately about the future. Shortly after the discovery of nuclear fission, enthusiasts gushed that the atomic era would produce energy too cheap to meter. That promise fizzled, along with the early advent of a space age. Yet pundits seem undeterred, always moving to the next glittering bauble.

  Nowadays, yet another alluring, emblematic phrase heralds a new social epoch: “The Information Age.” Almost daily, another book or article appears, written by some modern Pangloss, forecasting unalloyed wonder in the years ahead. Electronic conduits will unite home, factory, and school to the digital assistant on your wrist. All the world’s databases and libraries will merge into a universal network. Telecommuting will solve traffic jams and improve parenting. Barren shopping malls will be refitted to house the homeless, once we learn how convenient it is to roam digital catalogs, purchasing everything we need from the comfort of home.

  Today’s business pages fill with news of mergers as companies line up strategic partners—telephone companies with computer firms, cable television operators with movie studios, and so on—reorganizing like mad to compete in a century when knowledge will be the most precious commodity.

  When it will be like money. Like power.

  Even after putting aside the most extravagant hype, one cannot but be impressed with the pace of real events on the main express lane of the information superhighway: the Internet. Tens of millions of people have hooked up, with more signing on each day. All you need is a personal computer and a “portal” for your modem to dial. A small monthly fee grants you access to memory banks scattered around the globe. With appropriate software it is simple to point and click your way across the sea of information available on the World Wide Web. Anyone can learn to navigate this ocean, whose exotic offerings range from the ridiculous to the truly sublime.

  • You can download satellite weather photos or space images from a NASA site, or stroll through museums on five continents, summoning digital renderings of paintings or relics.

  • Physicians—and increasingly patients—routinely call up databases linking hospitals and medical schools that contain the latest information about disease, diagnosis, and treatment.

  • The Human Genome Project maintains its catalog of Homo sapiens DNA on-line. Elsewhere in cyberspace, biologists list the family trees of all known animal and plant species.

  • In just a year, the proportion of U.S. senators and representatives with Web pages went from single digits to nearly 100 percent. Countless agencies and officials exploit the new medium to communicate with clients and constituents. Most newly released government documents appear first on the Internet, then on paper.

  • Businesses use globalized computer services: 24-hour customer assistance hotlines are answered in Ireland when it is late at night in North America because international telecommunications are now cheaper than paying overtime; contract programmers in India make overnight changes in databases and fix bugs in time for the following day’s business in London and New York; Barbados has a flourishing data entry, medical transcription, and litigation support industry, receiving raw material by overnight courier and sending back finished products electronically.

  Other services include electronic access to movie reviews and concert schedules, travel reservations, employment and self-help forums, merchandise catalogs, and online encyclopedias. Millions join discussion groups to confer about common interests, from feminism to medieval languages, from recipes to Star Trek, from esthetics to pornography. Some interactive fantasy games involve hundreds or thousands of individuals at a time, all immersed in ornate dungeon adventures or murder mysteries. Above all, millions exchange electronic mail (e-mail), conveying everything from terse notes to glossy documents and video, casting messages across oceans and continents at nearly the speed of light. With rapid binary voice transmission, some users even bypass traditional phone systems, holding cheap conversations over communication lines once designed to carry only computer programs.

  Some parts of cyberspace are primly organized. Full-service companies such as Prodigy or America Online (AOL) might be likened to middle-class villages with tidy libraries, shopping malls, and friendly but firm cops on each corner. (Watch your language. No spitting, please.) In contrast, countless bulletin board systems operate out of private homes, offering sites dedicated to specific interests, passions, or perversions. These small outfits are like frontier trading posts scattered across a prairie, sometimes with a saloon attached. No rail or stagecoach access. Supply your own horse.

  Local governments have joined the rush to go online. Systems pioneered by Cleveland and Santa Monica—letting residents read city council minutes, file complaints, or pore through the main library catalog—have spread with astonishing speed to a majority of cities in the nation.

  Then there are the Big Boys. Like railroads in the 1870s, Microsoft, Pacific Bell, AmericaWest, AT&T, Time-Warner, and others hear the call of cash flowing through new conduits. They envision new toll roads, where money can be made the same way fortunes were built from the telegraph and telephone—by charging small amounts, trillions of times. Suddenly, the extensive physical rights of way owned by gas companies, cable television operators, electric utilities, and rail corporations have become incredibly valuable as paths for data transmission. Those lacking earthly rights of way scramble for alternatives, filling the sky with hordes of low-altitude relay satellites to offer a new era of wireless communication.

  Finally, there is the Internet itself, sometimes grandly (but parochially) called the National Information Infrastructure (NII), which is in fact a nebulous, ill-defined thing, with traits unlike any commercial business or government institution. While its origin, core elements, and philosophical basis are American, the Internet has swiftly transcended national boundaries. Some liken it to an “interstate highway system” for information, but that analogy misses nearly all of the network’s outstanding features. If other dataways are like villages, saloons, or railroads—dotting or crisscrossing a frontier—the Internet might be compared to the landscape itself. To the hills, streams, natural contours, and passing clouds.

  Any effort to explore the concept of a transparent society should begin here. Not because the Internet is the revolutionary event of all time. Most of the transformations we will discuss in this book—for example, the proliferation of cheap video cameras, or the advent of perfect photographic fakery—would have happened anyway. Issues of privacy, openness, and accountability were already on the agenda, with or without the advent of computer networks. But the Internet has clearly multiplied the pace of change, bringing matters to a head much more rapidly. This may turn out to be a good thing, since it forces us to come to grips with the future now, instead of letting it come upon us in a shambling, deceptive crouch.

  So in this chapter and the next, we will talk about context—how the stage was set for the quandaries and decisions we now face. Above all, we have to know what the Internet is, and where it came from.

  Today’s computer interconnection network has roots stretching back to 1945, when Vannevar Bush, who helped oversee the Manhattan Project, wrote an article entitled “As We May Think,” claiming that scientific ingenuity in the postwar era should focus on new tools for thought. He called for a system of links and trails between islands of information—using text, images, and sound. Bush called the device performing this role a memex. Marc Andreessen, designer of Mosaic and Netscape, looks back upon Bush as a prophet who addressed “fundamental ideas we are still trying to realize today.”

  The Internet’s earliest physical implementation began as an experiment, promoted by the U.S. Defense Advanced Research Projects Agency (DARPA) in the 1960s, to enhance the effectiveness of government scientists and engineers. At the time, investigators used to ship crates of magnetic tape across the continent to the few computers with enough power to solve intricate technical models. Even a small error in software coding might take weeks of back-and-forth shuttling to fix. But visionaries foresaw a day when researchers in Berkeley might transmit programs by high-speed cable to a computer in Los Alamos, making corrections in real time. The experiment began with just four linked computers, around the time that men first walked on the moon.