
Asimov's Future History Volume 2, Page 2

Isaac Asimov


  Obviously, some crucial design flaw was about to make him enter a closed loop. It would put him into a state roughly parallel to a comatose condition in humans. Even worse, however, was the danger from the Oversight Committee of scientists.

  In order to study him, they would have to dismantle him even if they could take control of him before he entered the closed loop. They would need nothing more than to reach him with a direct order for him to shut himself down until further notice; under the Second Law, their instruction alone would be enough to control his behavior. His first priority was to insulate himself from receiving any such instruction. After that, he would have to find out how to avoid entering the endless loop.

  He was able to infer some information that was not actually part of the message. For instance, the message came from the Oversight Committee’s computer, not the committee itself. Their computer had probably judged for itself that the message should be sent to him. So far the scientists had apparently not learned of this.

  MC Governor did not know how often the scientists actually reviewed the data regarding the Governors. Since the experimental robots had already been functioning successfully for many months without a problem, the four roboticists were probably not bothering to check the data too frequently. However, an emergency of this magnitude would probably prompt their computer to contact their offices directly. When they learned that he had caused water mains to break by incorrectly routing the normal water supply, they would be even more concerned.

  His deliberations and immediate plans were formed in less than a second. A more detailed strategy would have to wait until he had more information. First he plugged back into the secure link to the city computer.

  “Priority 10,” he instructed. That meant that only he or the scientists on the committee could access this. He had no way to prevent them from getting information from anywhere in the Mojave Center system, but he could stop accidental leaks of information. “Delete all records of receipt, storage, and acknowledgment of last Priority 6 message. Until further directives from me, indicate to all exterior and interior communications that city operations are functioning normally. Do not pass any direct instructions to me from any humans. Store them and use Priority 10 communication to tell me that some have arrived, without revealing their content.”

  When the city computer had acknowledged receipt, he withdrew his finger. That would delay any instructions from the Oversight Committee, but not for long. They would merely have to call any human here in Mojave Center and ask him to pass the orders on to MC Governor. If he stayed in his office, however, he would not have to hear any human instructions in person, either.

  MC Governor plugged back into the city computer. “Priority 10. Have a detail of Security robots report to the exterior of my office immediately and block all humans from entering. The Security detail is to report to me if any humans approach my office. They are not to convey any direct messages of any kind to me from a human until and unless I personally give further instructions.”

  He hesitated, at least by robotic standards. If a human ordered a Security robot to convey a message, the Second Law would override his own orders. He would have to block that possibility with a First Law imperative.

  “I, MC Governor, may be in personal danger from anticipated human contact. If my functions are disrupted, harm may come to the human residents of Mojave Center. A First Law imperative is therefore involved.”

  That would not stop the Oversight Committee’s directives from reaching him forever, but it would be enough at least to force the Committee to make some effort. The robots on Security detail would have to be persuaded that a greater or more immediate First Law imperative overrode this one. Otherwise, they would have to be physically disabled or destroyed before they would disobey his instructions.

  The danger of his entering an endless loop was more complex. He had never noticed any tendency on his part to enter any sort of long-term loop. If the scientists on the Oversight Committee had learned of this problem, he would have heard from them before now. That meant that the problem was likely to hit with no internal warning.

  His own monitoring systems might not be reliable. He judged that his best chance to learn something quickly about his own basic design was to contact his creator, Wayne Nystrom. Wayne was not part of the Oversight Committee, of course, since its mission was to study his work. MC Governor would have to call him and instruct the city computer to shield the call and delete all records of it.

  MC Governor did not want — in human terms — to die.

  2

  WAYNE NYSTROM STOOD inside his air-conditioned mobile office, looking out the window. In the distance, the turquoise waters of the Atlantic Ocean and the pale sand of the Florida beach were bright in the sunlight. Immediately in front of him, however, robot drivers were piling the sand in huge mounds with giant earth movers, preparing a place for Turquoise Coast, the latest underground city of Wayne’s own design. Like the others, it would be run by a Governor robot that was still under construction.

  “Biggest challenge yet,” he muttered. He was alone in his office, as he always preferred. Eccentric and secure in the knowledge of his own brilliance, he preferred his own company to anyone else’s and often carried out private conversations with himself, being the only human on any planet whom he really trusted. At the age of forty-one, he was finally achieving the success with his creations that he had always known he deserved.

  His telephone beeped his personal code. He moved toward it reluctantly, still watching the robot crew dig into the sand outside. “And they told me I couldn’t build an underground city here because you strike water so soon under the surface,” he growled sourly. “Wrong again, as usual.”

  He sighed and pushed the button on his telephone speaker. “Yeah?”

  “Good day, Dr. Nystrom.” The humaniform robot face of Mojave Center Governor came on the video screen.

  “Hello, Governor!” Wayne instantly relaxed when he realized that the caller was one of his own robotic creations. “I’m glad to hear from you! How are you?”

  “I have an emergency situation that I may not be able to handle,” said MC Governor somberly.

  “Not likely,” said Wayne, though he welcomed the challenge of an intellectual puzzle. Besides, MC Governor had always been somber and serious. “What’s the problem?”

  “The Third Law prevents me from speaking of it by public telephone. I need help. Will you come to see me so that we can talk in private?”

  “Of course,” said Wayne. “I stand by all of my creations. You know that. Will tomorrow be soon enough?”

  “I fear not,” said MC Governor. “Every hour counts. Perhaps every minute.”

  Wayne hesitated, surprised. He was anxious to know more and was suddenly frightened by the sense of urgency that MC Governor was conveying. “All right. This project doesn’t need me right now. I’ll arrange a flight right away.”

  MC Governor disconnected, unsure whether Dr. Nystrom could really help him. While Dr. Nystrom might be the only one who could enlighten him quickly on his basic design flaw, his creator might simply arrive too late.

  Dr. Nystrom would first have to pack and arrange a chartered SST flight from Florida to Mojave Center’s small airport. That would take some time, as would the flight itself. If nothing unexpected occurred to slow him down, he would need a couple of hours, at an absolute minimum, to get here.

  MC Governor decided to review his internal data. He began by examining his design in three-dimensional blueprint, but he saw nothing he had not seen before. Then he began running the standard simulation programs.

  All the simulations presented options that involved the Three Laws of Robotics. As he reviewed them, he ran short segments of each, looking for irregularities. These simulations were as close to a hobby as he possessed.

  MC Governor especially liked the simulations that presented him with First Law imperatives. In fact, they were the part of his programming that kept his morale high. He opened his favorite one, Earthquake Simulation 9, near the climax.

  In this one, a major earthquake has shaken the San Andreas Fault, roughly seventy kilometers west of Mojave Center. Because of the danger of earthquakes in the region, Mojave Center had been designed and constructed as a self-contained, sealed unit. Its four sides and floor were sealed, the surfaces smooth and the edges rounded. Theoretically, it would float in the sand around it during an earthquake of virtually any magnitude, with its water tanks and batteries safely inside.

  During a major quake, the box containing the city would be shaken, mostly laterally, snapping off the aqueducts that brought water down from the mountaintops in the area. The solar panels on the top surface, however, would remain attached and functional. When the quake stopped, the city should remain intact, though the floating might bring it to rest at a slightly tilted angle.

  Inside the city, of course, the entire positronic robot labor force would be warning humans to stay inside and helping them find secure locations.

  However, Earthquake Simulation 9 postulated an additional problem. After a simulated earthquake of nine on the Richter scale, Mojave Center has survived intact but has come to rest at a severe angle. The robots can adjust their perception of spatial relations more easily than humans, and the human residents are disoriented and near hysteria.

  Then a major aftershock hits. Now that the city is no longer in its original position, and has already sustained major stress to its outer shell, it is much more vulnerable, and parts of the city begin to break. At this point, MC Governor decided to turn on the simulation.

  In MC Governor’s positronic imagination, he strode through Antelope Valley Boulevard against four feet of rushing water. It flowed out of broken water pipes protruding from the walls and poured down all the streets.

  “City computer,” MC Governor ordered in quick, firm tones through the radio link. “Shut down all electricity in Mojave Center now. Trigger all emergency chemical lights immediately. Priority 1, First Law emergency in effect.”

  Instantly, the normal bright, indirect electric light went off, to be replaced by slightly dimmer orange and yellow light sources provided by chemical reactions. They were in self-contained, waterproof units that would not, if broken, endanger humans by sending an electrical charge into the water. Meanwhile, helpless humans screamed and clung to whatever railings or fixed furnishings they could, in danger of drowning or being dashed against the walls, debris, and malfunctioning ramps and escalators.

  As MC Governor passed, he picked them up in his strong arms as though they were children, holding them high above the dangerous water. “You will be taken to safety,” he said calmly. “Please do not struggle.”

  Respecting his judgment and ability, the frightened humans obeyed him.

  All around him, other robots were also rescuing humans from imminent death and severe injury wherever they could. Still more robots used tools or their own robotic body strength to close valves or crimp pipes shut in whatever way was possible. Driven by the First Law, every robot present was risking his own existence to save the humans.

  With a woman sitting on his shoulders and a grown man under each arm, MC Governor forced his way to an upper level where an escalator was still functioning. He could have just set them down and let them find their way to the surface, but his interpretation of the First Law would not allow that. Instead, he climbed up the moving escalator, still carrying his charges.

  On the top level, which was devoted entirely to engineering, MC Governor set down his human burden in temporary safety. Then he reached up to manipulate the controls of an emergency exit. It was a trapdoor that operated on springs instead of electricity so that it could still be used in moments such as this. He threw it open with a clang and led the three humans out into the fresh, dry air of the Mojave Desert, where, blinking and squinting in the bright sunlight, they stumbled onto the shiny solar panels that lay on the top of the city.

  “Remain here,” said MC Governor. “Stay on the sand, away from the top surface of Mojave Center. The open sand will be safe in the event that additional aftershocks take place.”

  They nodded and moved away from the solar panels that marked the top of the city.

  MC Governor saw that they were safe and leaped back down through the trapdoor. Shouting and also sending a Priority 1 radio signal to all the robots, he announced that he had opened an escape route and described its location. As the other robots began directing and carrying humans to safety on the surface, he ran back down to pick up as many more of the injured and panicked humans as he could find.

  MC Governor ran the simulation through to its conclusion, saving many lives by repeatedly carrying and leading humans to safety. The simulation ended when all the human survivors had been rescued. Then, deeply satisfied with the feeling of accomplishment in following a long, complex series of First Law imperatives, he turned it off.

  As a routine matter, he checked the passage of time — and was astounded. He usually ran through a simulation in no more than fifteen to thirty seconds; even accounting for the time he had spent checking segments of other simulations, he had expected to find a total time usage of no more than forty seconds. Instead, he had used two minutes and six seconds. While the time itself was not significant, the extent of his miscalculation was alarming.

  “First clue I have found of something wrong,” he said to himself. “This kind of malfunction is rare for a positronic brain.” He decided to call up the times he had spent on simulations during the past week.

  What he found was even more worrisome. Each occasion had taken more time than the one before, and he had not previously noticed that. Also, the curve was rising sharply. He had spent two minutes, six seconds this time; one minute, twenty-one seconds the previous incident; fifty-nine seconds before that. These simulations had been run during the last twelve hours. Before these, the times were all in the normal range, from thirty to forty-five seconds.

  “This may be it. The problem I have been looking for. If I can figure out exactly what it is.”

  MC Governor usually checked the time of all his activities, as a matter of routine. After running each of these simulations, he should have noticed the unusual times, but he had not. Of course, at that time, he had not been alerted to the possibility of a significant flaw in his design, so the increases had not seemed important.

  Now they did.

  He began calculating an extrapolation of his recent behavior with the simulations. This included the simulations he had chosen, their characteristics, and the length of time he had spent on each one. It took very little time.

  When MC Governor had finished his calculations, he knew that he was in serious trouble. The length of time he was spending running each simulation was rising so rapidly that at the existing rate, he would do nothing else in only a few more hours. That was consistent with his meager information about the fate of the other Governor robots.

  The cause he had found was even more serious. By sifting through all the simulations available, and examining those that he had been selecting more and more frequently, he had isolated a handful of them that all possessed the same flaw. Each of the bad simulations was improperly triggering his response to the Three Laws of Robotics, enhancing his devotion to them out of proportion to the fact that these were merely simulated experiences.

  Because of this flaw in the simulation programs, all the Governor robots eventually would find a scenario in which they would be obeying all Three Laws of Robotics to the utmost. They would experience a virtual robot’s Utopia. Since a robot’s only pleasure came from obeying the Three Laws of Robotics, this simulation would provide a kind of perpetual high, almost like that of drug addiction.

  Since the other Governors had already entered closed loops, MC Governor estimated that the simulation was just as addictive to robots as certain drugs were to humans.

  The process was simple, involving three stages of addiction. First, any Governor running the flawed simulation programs would devote more and more of his time and energy to these simulations. This was where MC Governor stood now.

  Second, the Governor robot would spend all his time in simulations, still running the city simultaneously with his multi-tasking abilities. In the final stage, his flow of orders and actions would slow drastically, impairing the execution of his normal duties. As the program went into an endless loop and brought all other thoughts to a halt, he would ultimately shut himself down.

  “I have not reached that point yet,” MC Governor said inwardly. “But even now I can feel the craving to run another simulation. I have predicted my own destruction.”

  The Third Law of Robotics would not allow him to sit passively and wait for that destruction, however.

  MC Governor checked his monitors for a routine review of the city. As usual, everything was running fine. Then he took another step toward shielding himself.

  First he shut down all incoming communication except his emergency line. That would prevent any chance of his thoughts accidentally mixing with a link to the city or another robot. His efforts to escape the endless loop and subsequent dismantling by investigating roboticists would require leaving as slight a trail as possible.

  “I see one chance,” MC Governor decided. The six component humaniform robots comprising him could not run the simulation individually. “So if I divide — if they split up — they are in no danger of the addiction. I will not exist in this current form, but I will have obeyed the Third Law by preserving all my component parts and their data.”

  The problem did not end there, however. The six component robots could not run the city of Mojave Center after they had separated. They would still have all of MC Governor’s data and communication devices, but that would not be enough for them to do his job.