That Thou Art Mindful of Him

  The Three Laws of Robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

  1.

  Keith Harriman, who had for twelve years now been Director of Research at United States Robots and Mechanical Men Corporation, found that he was not at all certain whether he was doing right. The tip of his tongue passed over his plump but rather pale lips and it seemed to him that the holographic image of the great Susan Calvin, which stared unsmilingly down upon him, had never looked so grim before.

  Usually he blanked out that image of the greatest roboticist in history because she unnerved him. (He tried thinking of the image as "it" but never quite succeeded.) This time he didn't quite dare to and her long-dead gaze bored into the side of his face.

  It was a dreadful and demeaning step he would have to take. Opposite him was George Ten, calm and unaffected either by Harriman's patent uneasiness or by the image of the patron saint of robotics glowing in its niche above.

  Harriman said, "We haven't had a chance to talk this out, really, George. You haven't been with us that long and I haven't had a good chance to be alone with you. But now I would like to discuss the matter in some detail. "

  "I am perfectly willing to do that, " said George. "In my stay at U. S. Robots, I have gathered the crisis has something to do with the Three Laws. "

  "Yes. You know the Three Laws, of course. "

  "I do. "

  "Yes, I'm sure you do. But let us dig even deeper and consider the truly basic problem. In two centuries of, if I may say so, considerable success, U. S. Robots has never managed to persuade human beings to accept robots. We have placed robots only where work is required that human beings cannot do, or in environments that human beings find unacceptably dangerous. Robots have worked mainly in space and that has limited what we have been able to do. "

  "Surely," said George Ten, "that represents a broad limit, and one within which U. S. Robots can prosper. "

  "No, for two reasons. In the first place, the boundaries set for us inevitably contract. As the Moon colony, for instance, grows more sophisticated, its demand for robots decreases and we expect that, within the next few years, robots will be banned on the Moon. This will be repeated on every world colonized by mankind. Secondly, true prosperity is impossible without robots on Earth. We at U. S. Robots firmly believe that human beings need robots and must learn to live with their mechanical analogues if progress is to be maintained. "

  "Do they not? Mr. Harriman, you have on your desk a computer input which, I understand, is connected with the organization's Multivac. A computer is a kind of sessile robot; a robot brain not attached to a body-"

  "True, but that also is limited. The computers used by mankind have been steadily specialized in order to avoid too humanlike an intelligence. A century ago we were well on the way to artificial intelligence of the most unlimited type through the use of great computers we called Machines. Those Machines limited their action of their own accord. Once they had solved the ecological problems that had threatened human society, they phased themselves out. Their own continued existence would, they reasoned, have placed them in the role of a crutch to mankind and, since they felt this would harm human beings, they condemned themselves by the First Law. "

  "And were they not correct to do so?"

  "In my opinion, no. By their action, they reinforced mankind's Frankenstein complex; its gut fears that any artificial man they created would turn upon its creator. Men fear that robots may replace human beings. "

  "Do you not fear that yourself?"

  "I know better. As long as the Three Laws of Robotics exist, they cannot. They can serve as partners of mankind; they can share in the great struggle to understand and wisely direct the laws of nature so that together they can do more than mankind can possibly do alone; but always in such a way that robots serve human beings. "

  "But if the Three Laws have shown themselves, over the course of two centuries, to keep robots within bounds, what is the source of the distrust of human beings for robots?"

  "Well"-and Harriman's graying hair tufted as he scratched his head vigorously-"mostly superstition, of course. Unfortunately, there are also some complexities involved that anti-robot agitators seize upon. "

  "Involving the Three Laws?"

  "Yes. The Second Law in particular. There's no problem in the Third Law, you see. It is universal. Robots must always sacrifice themselves for human beings, any human beings. "

  "Of course," said George Ten.

  "The First Law is perhaps less satisfactory, since it is always possible to imagine a condition in which a robot must perform either Action A or Action B, the two being mutually exclusive, and where either action results in harm to human beings. The robot must therefore quickly select which action results in the least harm. To work out the positronic paths of the robot brain in such a way as to make that selection possible is not easy. If Action A results in harm to a talented young artist and B results in equivalent harm to five elderly people of no particular worth, which action should be chosen. "

  "Action A, " said George Ten. "Harm to one is less than harm to five. "

  "Yes, so robots have always been designed to decide. To expect robots to make judgments of fine points such as talent, intelligence, the general usefulness to society, has always seemed impractical. That would delay decision to the point where the robot is effectively immobilized. So we go by numbers. Fortunately, we might expect crises in which robots must make such decisions to be few. . . But then that brings us to the Second Law. "

  "The Law of Obedience. "

  "Yes. The necessity of obedience is constant. A robot may exist for twenty years without every having to act quickly to prevent harm to a human being, or find itself faced with the necessity of risking its own destruction. In all that time, however, it will be constantly obeying orders. . . Whose orders?"

  "Those of a human being. "

  "Any human being? How do you judge a human being so as to know whether to obey or not? What is man, that thou art mindful of him, George?"

  George hesitated at that.

  Harriman said hurriedly, "A Biblical quotation. That doesn't matter. I mean, must a robot follow the orders of a child; or of an idiot; or of a criminal; or of a perfectly decent intelligent man who happens to be inexpert and therefore ignorant of the undesirable consequences of his order? And if two human beings give a robot conflicting orders, which does the robot follow?"

  "In two hundred years," said George Ten, "have not these problems arisen and been solved?"

  "No," said Harriman, shaking his head violently. "We have been hampered by the very fact that our robots have been used only in specialized environments out in space, where the men who dealt with them were experts in their field. There were no children, no idiots, no criminals, no well-meaning ignoramuses present. Even so, there were occasions when damage was done by foolish or merely unthinking orders. Such damage in specialized and limited environments could be contained. On Earth, however, robots must have judgment. So those against robots maintain, and, damn it, they are right. "

  "Then you must insert the capacity for judgment into the positronic brain. "

  "Exactly. We have begun to reproduce JG models in which the robot can weigh every human being with regard to sex, age, social and professional position, intelligence, maturity, social responsibility and so on. "

  "How would that affect the Three Laws?"

  "The Third Law not at all. Even the most valuable robot must destroy himself for the sake of the most useless human being. That cannot be tampered with. The First Law is affected only where alternative acti
ons will all do harm. The quality of the human beings involved as well as the quantity must be considered, provided there is time for such judgment and the basis for it, which will not be often. The Second Law will be most deeply modified, since every potential obedience must involve judgment. The robot will be slower to obey, except where the First Law is also involved, but it will obey more rationally. "

  "But the judgments which are required are very complicated. "

  "Very. The necessity of making such judgments slowed the reactions of our first couple of models to the point of paralysis. We improved matters in the later models at the cost of introducing so many pathways that the robot's brain became far too unwieldy. In our last couple of models, however, I think we have what we want. The robot doesn't have to make an instant judgment of the worth of a human being and the value of its orders. It begins by obeying all human beings as any ordinary robot would and then it learns. A robot grows, learns and matures. It is the equivalent of a child at first and must be under constant supervision. As it grows, however, it can, more and more, be allowed, unsupervised, into Earth's society. Finally, it is a full member of that society. "

  "Surely this answers the objections of those who oppose robots. "

  "No," said Harriman angrily. "Now they raise others. They will not accept judgments. A robot, they say, has no right to brand this person or that as inferior. By accepting the orders of A in preference to that of B, B is branded as of less consequence than A and his human rights are violated. "

  "What is the answer to that?"

  "There is none. I am giving up. "

  "I see. "

  "As far as I myself am concerned. . . Instead, I turn to you, George. "

  "To me?" George Ten's voice remained level. There was a mild surprise in it but it did not affect him outwardly. "Why to me?"

  "Because you are not a man," said Harriman tensely. "I told you I want robots to be the partners of human beings. I want you to be mine. "

  George Ten raised his hands and spread them, palms outward, in an oddly human gesture. "What can I do?"

  "It seems to you, perhaps, that you can do nothing, George. You were created not long ago, and you are still a child. You were designed to be not overfull of original information-it was why I have had to explain the situation to you in such detail-in order to leave room for growth. But you will grow in mind and you will come to be able to approach the problem from a non-human standpoint. Where I see no solution, you, from your own other standpoint, may see one. "

  George Ten said, "My brain is man-designed. In what way can it be non-human?"

  "You are the latest of the JG models, George. Your brain is the most complicated we have yet designed, in some ways more subtly complicated than that of the old giant Machines. It is open-ended and, starting on a human basis, may-no, will-grow in any direction. Remaining always within the insurmountable boundaries of the Three Laws, you may yet become thoroughly non-human in your thinking. "

  "Do I know enough about human beings to approach this problem rationally? About their history? Their psychology?"

  "Of course not. But you will learn as rapidly as you can. "

  "Will I have help, Mr. Harriman?"

  "No. This is entirely between ourselves. No one else knows of this and you must not mention this project to any human being, either at U. S. Robots or elsewhere. "

  George Ten said, "Are we doing wrong, Mr. Harriman, that you seek to keep the matter secret?"

  "No. But a robot solution will not be accepted, precisely because it is robot in origin. Any suggested solution you have you will turn over to me; and if it seems valuable to me, I will present it. No one will ever know it came from you. "

  "In the light of what you have said earlier," said George Ten calmly, "this is the correct procedure. . . When do I start?"

  "Right now. I will see to it that you have all the necessary films for scanning. "

  1a.

  Harriman sat alone. In the artificially lit interior of his office, there was no indication that it had grown dark outside. He had no real sense that three hours had passed since he had taken George Ten back to his cubicle and left him there with the first film references.

  He was now merely alone with the ghost of Susan Calvin, the brilliant roboticist who had, virtually single-handed, built up the positronic robot from a massive toy to man's most delicate and versatile instrument; so delicate and versatile that man dared not use it, out of envy and fear.

  It was over a century now since she had died. The problem of the Frankenstein complex had existed in her time, and she had never solved it. She had never tried to solve it, for there had been no need. Robotics had expanded in her day with the needs of space exploration.

  It was the very success of the robots that had lessened man's need for them and had left Harriman, in these latter times-

  But would Susan Calvin have turned to robots for help? Surely, she would have-

  And he sat there long into the night.

  2.

  Maxwell Robertson was the majority stockholder of U. S. Robots and in that sense its controller. He was by no means an impressive person in appearance. He was well into middle age, rather pudgy, and had a habit of chewing on the right corner of his lower lip when disturbed.

  Yet in his two decades of association with government figures he had developed a way of handling them. He tended to use softness, giving in, smiling, and always managing to gain time.

  It was growing harder. Gunnar Eisenmuth was a large reason for its having grown harder. In the series of Global Conservers, whose power had been second only to that of the Global Executive during the past century, Eisenmuth hewed most closely to the harder edge of the gray area of compromise. He was the first Conserver who had not been American by birth and though it could not be demonstrated in any way that the archaic name of U. S. Robots evoked his hostility, everyone at U. S. Robots believed that.

  There had been a suggestion, by no means the first that year-or that generation-that the corporate name be changed to World Robots, but Robertson would never allow that. The company had been originally built with American capital, American brains, and American labor, and though the company had long been worldwide in scope and nature, the name would bear witness to its origin as long as he was in control.

  Eisenmuth was a tall man whose long sad face was coarsely textured and coarsely featured. He spoke Global with a pronounced American accent, although he had never been in the United States prior to his taking office.

  "It seems perfectly clear to me, Mr. Robertson. There is no difficulty. The products of your company are always rented, never sold. If the rented property on the Moon is now no longer needed, it is up to you to receive the products back and transfer them. "

  "Yes, Conserver, but where? It would be against the law to bring them to Earth without a government permit and that has been denied. "

  "They would be of no use to you here. You can take them to Mercury or to the asteroids. "

  "What would we do with them there?"

  Eisenmuth shrugged. "The ingenious men of your company will think of something. "

  Robertson shook his head. "It would represent an enormous loss for the company. "

  "I'm afraid it would," said Eisenmuth, unmoved. "I understand the company has been in poor financial condition for several years now. "

  "Largely because of government imposed restrictions, Conserver. "

  "You must be realistic, Mr. Robertson. You know that the climate of public opinion is increasingly against robots. "

  "Wrongly so, Conserver. "

  "But so, nevertheless. It may be wiser to liquidate the company. It is merely a suggestion, of course. "

  "Your suggestions have force, Conserver. Is it necessary to tell you that our Machines, a century ago, solved the ecological crisis?"

  "I'm sure mankind is grateful, but that was a long time ago. We now live in alliance with nature, however uncom
fortable that might be at times, and the past is dim. "

  "You mean what have we done for mankind lately?"

  "I suppose I do. "

  "Surely we can't be expected to liquidate instantaneously; not without enormous losses. We need time. "

  "How much?"

  "How much can you give us?"

  "It's not up to me. "

  Robertson said softly, "We are alone. We need play no games. How much time can you give me?"

  Eisenmuth's expression was that of a man retreating into inner calculations. "I think you can count on two years. I'll be frank. The Global government intends to take over the firm and phase it out for you if you don't do it by then yourself, more or less. And unless there is a vast turn in public opinion, which I greatly doubt-" He shook his head.

  "Two years, then," said Robertson softly.

  2a.

  Robertson sat alone. There was no purpose to his thinking and it had degenerated into retrospection. Four generations of Robertsons had headed the firm. None of them was a roboticist. It had been men such as Lanning and Bogert and, most of all, most of all, Susan Calvin, who had made U. S. Robots what it was, but surely the four Robertsons had provided the climate that had made it possible for them to do their work.

  Without U. S. Robots, the Twenty-first Century would have progressed into deepening disaster. That it didn't was due to the Machines that had for a generation steered mankind through the rapids and shoals of history.

  And now for that, he was given two years. What could be done in two years to overcome the insuperable prejudices of mankind? He didn't know.

  Harriman had spoken hopefully of new ideas but would go into no details. Just as well, for Robertson would have understood none of it.

  But what could Harriman do anyway? What had anyone ever done against man's intense antipathy toward the imitation? Nothing-

  Robertson drifted into a half sleep in which no inspiration came.

  3.

  Harriman said, "You have it all now, George Ten. You have had everything I could think of that is at all applicable to the problem. As far as sheer mass of information is concerned, you have stored more in your memory concerning human beings and their ways, past and present, than I have, or than any human being could have. "

  "That is very likely. "

  "Is there anything more that you need, in your own opinion?"

  "As far as information is concerned, I find no obvious gaps. There may be matters unimagined at the boundaries. I cannot tell. But that would be true no matter how large a circle of information I took in. "

  "True. Nor do we have time to take in information forever. Robertson has told me that we only have two years, and a quarter of one of those years has passed. Can you suggest anything?"

  "At the moment, Mr. Harriman, nothing. I must weigh the information and for that purpose I could use help. "

  "From me?"

  "No. Most particularly, not from you. You are a human being, of intense qualifications, and whatever you say may have the partial force of an order and may inhibit my considerations. Nor any other human being, for the same reason, especially since you have forbidden me to communicate with any. "

  "But in that case, George, what help?"

  "From another robot, Mr. Harriman. "

  "What other robot?"

  "There are others of the JG series which were constructed. I am the tenth, JG-10. "

  "The earlier ones were useless, experimental-"

  "Mr. Harriman, George Nine exists. "

  "Well, but what use will he be? He is very much like you except for certain lacks. You are considerably the more versatile of the two. "

  "I am certain of that," said George Ten. He nodded his head in a grave gesture. "Nevertheless, as soon as I create a line of thought, the mere fact that I have created it commends it to me and I find it difficult to abandon it. 1f I can, after the development of a line of thought, express it to George Nine, he would consider it without having first created it. He would therefore view it without prior bent. He might see gaps and shortcomings that I might not. "

  Harriman smiled. "Two heads are better than one, in other words, eh, George?"

  "If by that, Mr. Harriman, you mean two individuals with one head apiece, yes. "

  "Right. Is there anything else you want?"

  "Yes. Something more than films. I have viewed much concerning human beings and their world. I have seen human beings here at U. S. Robots and can check my interpretation of what I have viewed against direct sensory impressions. Not so concerning the physical world. I have never seen it and my viewing is quite enough to tell me that my surroundings here are by no means representative of it. I would like to see it. "

  "The physical world?" Harriman seemed stunned at the enormity of the thought for a moment. "Surely you don't suggest I take you outside the grounds of U. S. Robots?"

  "Yes, that is my suggestion. "

  "That's illegal at any time. In the climate of opinion today, it would be fatal. "

  "If we are detected, yes. I do not suggest you take me to a city or even to a dwelling place of human beings. I would like to see some open region, without human beings. "

  "That, too, is illegal. "

  "If we are caught. Need we be?"

  Harriman said, "How essential is this, George?"

  "I cannot tell, but it seems to me it would be useful. "

  "Do you have something in mind?"

  George Ten seemed to hesitate. "I cannot tell. It seems to me that I might have something in mind if certain areas of uncertainty were reduced. "

  "Well, let me think about it. And meanwhile, I'll check out George Nine and arrange to have you occupy a single cubicle. That at least can be done without trouble. "

  3a.

  George Ten sat alone.

  He accepted statements tentatively, put them together, and drew a conclusion; over and over again; and from conclusions built other statements which he accepted and tested and found a contradiction and rejected; or not, and tentatively accepted further.

  At none of the conclusions he reached did he feel wonder, surprise, satisfaction; merely a note of plus or minus.

  4.

  Harriman's tension was not noticeably decreased even after they had made a silent downward landing on Robertson's estate.

  Robertson had countersigned the order making the dyna-foil available, and the silent aircraft, moving as easily vertically as horizontally, had been large enough to carry the weight of Harriman, George Ten, and, of course, the pilot.

  (The dyna-foil itself was one of the consequences of the Machine-catalyzed invention of the proton micro-pile which supplied pollution-free energy in small doses. Nothing had been done since of equal importance to man's comfort-Harriman's lips tightened at the thought-and yet it had not earned gratitude for U. S. Robots.)

  The air flight between the grounds of U. S. Robots and the Robertson estate had been the tricky part. Had they been stopped then, the presence of a robot aboard would have meant a great set of complications. It would be the same on the way back. The estate itself, it might be argued-it would be argued-was part of the property of U. S. Robots and on that property, robots, properly supervised, might remain.

  The pilot looked back and his eyes rested with gingerly briefness on George Ten. "You want to get out at all, Mr. Harriman?"

  "Yes. "

  "It, too?"

  "Oh, yes. " Then, just a bit sardonically, "I won't leave you alone with him. "

  George Ten descended first and Harriman followed. They had come down on the foil-port and not too far off was the garden. It was quite a showplace and Harriman suspected that Robertson used juvenile hormone to control insect life without regard to environmental formulas.

  "Come, George," said Harriman. "Let me show you. " Together they walked toward the garden.

  George said, "It is a little as I have imaged it. My eyes are not properl
y designed to detect wavelength differences, so I may not recognize different objects by that alone. "

  "I trust you are not distressed at being color-blind. We needed too many positronic paths for your sense of judgment and were unable to spare any for sense of color. In the future-if there is a future-"

  "I understand, Mr. Harriman. Enough differences remain to show me that there are here many different forms of plant life. "

  "Undoubtedly. Dozens. "

  "And each coequal with man, biologically. "

  "Each is a separate species, yes. There are millions of species of living creatures. "

  "Of which the human being forms but one. "

  "By far the most important to human beings, however. "

  "And to me, Mr. Harriman. But I speak in the biological sense. "

  "I understand. "

  "Life, then, viewed through all its forms, is incredibly complex. "

  "Yes, George, that's the crux of the problem. What man does for his own desires and comforts affects the complex total-of-life, the ecology, and his short-term gains can bring long-term disadvantages. The Machines taught us to set up a human society which would minimize that, but the near-disaster of the early Twenty-first Century has left mankind suspicious of innovations. That, added to its special fear of robots-"

  "I understand, Mr. Harriman. . . That is an example of animal life, I feel certain. "

  "That is a squirrel; one of many species of squirrels. "

  The tail of the squirrel flirted as it passed to the other side of the tree.

  "And this," said George, his arm moving with flashing speed, "is a tiny thing indeed. " He held it between his fingers and peered at it.

  "It is an insect, some sort of beetle. There are thousands of species of beetles. "

  "With each individual beetle as alive as the squirrel and as yourself?"

  "As complete and independent an organism as any other, within the total ecology. There are smaller organisms still; many too small to see. "

  "And that is a tree, is it not? And it is hard to the touch-"

  4a.

  The pilot sat alone. He would have liked to stretch his own legs but some dim feeling of safety kept him in the dyna-foil. If that robot went out of control, he intended to take off at once. But how could he tell if it went out of control?

  He had seen many robots. That was unavoidable considering he was Mr. Robertson's private pilot. Always, though, they had been in the laboratories and warehouses, where they belonged, with many specialists in the neighborhood.

  True, Dr. Harriman was a specialist. None better, they said. But a robot here was where no robot ought to be; on Earth; in the open; free to move-He wouldn't risk his good job by telling anyone about this-but it wasn't right.

  5.

  George Ten said, "The films I have viewed are accurate in terms of what I have seen. Have you completed those I selected for you, Nine?"

  "Yes," said George Nine. The two robots sat stiffly, face to face, knee to knee, like an image and its reflection. Dr. Harriman could have told them apart at a glance, for he was acquainted with the minor differences in physical design. If he could not see them, but could talk to them, he could still tell them apart, though with somewhat less certainty, for George Nine's responses would be subtly different from those produced by the substantially more intricately patterned positronic brain paths of George Ten.

  "In that case," said George Ten, "give me your reactions to what I will say. First, human beings fear and distrust robots because they regard robots as competitors. How may that be prevented?"

  "Reduce the feeling of competitiveness," said George Nine, "by shaping the robot as something other than a human being. "

  "Yet the essence of a robot is its positronic replication of life. A replication of life in a shape not associated with life might arouse horror. "

  "There are two million species of life forms. Choose one of those as the shape rather than that of a human being. "

  "Which of all those species?" George Nine's thought processes proceeded noiselessly for some three seconds. "One large enough to contain a positronic brain, but one not possessing unpleasant associations for human beings. "

  "No form of land life has a braincase large enough for a positronic brain but an elephant, which I have not seen, but which is described as very large, and therefore frightening to man. How would you meet this dilemma?"

  "Mimic a life form no larger than a man but enlarge the braincase. "

  George Ten said, "A small horse, then, or a large dog, would you say? Both horses and dogs have long histories of association with human beings. "

  "Then that is well. "

  "But consider-A robot with a positronic brain would mimic human intelligence. If there were a horse or a dog that could speak and reason like a human being, there would be competitiveness there, too. Human beings might be all the more distrustful and angry at such unexpected competition from what they consider a lower form of life. "

  George Nine said, "Make the positronic brain less complex, and the robot less nearly intelligent. "

  "The complexity bottleneck of the positronic brain rests in the Three Laws. A less complex brain could not possess the Three Laws in full measure. "

  George Nine said at once, "That cannot be done. "

  George Ten said, "I have also come to a dead end there. That, then, is not a personal peculiarity in my own line of thought and way of thinking. Let us start again. . . Under what conditions might the Third Law not be necessary?"

  George Nine stirred as if the question were difficult and dangerous. But he said, "If a robot were never placed in a position of danger to itself; or if a robot were so easily replaceable that it did not matter whether it were destroyed or not. "

  "And under what conditions might the Second Law not be necessary?"

  George Nine's voice sounded a bit hoarse. "If a robot were designed to respond automatically to certain stimuli with fixed responses and if nothing else were expected of it, so that no order need ever be given it. "

  "And under what conditions"-George Ten paused here- "might the First Law not be necessary?"

  George Nine paused longer and his words came in a low whisper, "If the fixed responses were such as never to entail danger to human beings. "

  "Imagine, then, a positronic brain that guides only a few responses to certain stimuli and is simply and cheaply made-so that it does not require the Three Laws. How large need it be?"

  "Not at all large. Depending on the responses demanded, it might weigh a hundred grams, one gram, one milligram. "

  "Your thoughts accord with mine. I shall see Dr. Harriman. "

  5a.

  George Nine sat alone. He went over and over the questions and answers. There was no way in which he could change them. And yet the thought of a robot of any kind, of any size, of any shape, of any purpose, without the Three Laws, left him with an odd, discharged feeling.

  He found it difficult to move. Surely George Ten had a similar reaction. Yet he had risen from his seat easily.

  6.

  It had been a year and a half since Robertson had been closeted with Eisenmuth in private conversation. In that interval, the robots had been taken off the Moon and all the far-flung activities of U. S. Robots had withered. What money Robertson had been able to raise had been placed into this one quixotic venture of Harriman's.

  It was the last throw of the dice, here in his own garden. A year ago, Harriman had taken the robot here-George Ten, the last full robot that U. S. Robots had manufactured. Now Harriman was here with something else-

  Harriman seemed to be radiating confidence. He was talking easily with Eisenmuth, and Robertson wondered if he really felt the confidence he seemed to have. He must. In Robertson's experience, Harriman was no actor.

  Eisenmuth left Harriman, smiling, and came up to Robertson. Eisenmuth's smile vanished at once. "Good morning, Robertson," he said. "What is your man up to?"

  "This is his show," said Robertson evenly. "I'll leave it to him. " Harriman called out, "I am ready, Conserver. "

  "With what, Harriman?"

  "With my robot, sir. "

  "Your robot?" said Eisenmuth. "You have a robot here?" He looked about with a stem disapproval that yet had an admixture of curiosity.

  "This is U. S. Robots' property, Conserver. At least we consider it as such. "

  "And where is the robot, Dr. Harriman?"

  "In my pocket, Conserver:' said Harriman cheerfully.

  What came out of a capacious jacket pocket was a small glass jar. "That?" said Eisenmuth incredulously.

  "No, Conserver," said Harriman. "This!"

  From the other pocket came out an object some five inches long and roughly in the shape of a bird. In place of the beak, there was a narrow tube; the eyes were large; and the tail was an exhaust channel.

  Eisenmuth's thick eyebrows drew together. "Do you intend a serious demonstration of some sort, Dr. Harriman, or are you mad?"

  "Be patient for a few minutes, Conserver," said Harriman. "A robot in the shape of a bird is none the less a robot for that. And the positronic brain it possesses is no less delicate for being tiny. This other object I hold is a jar of fruit flies. There are fifty fruit flies in it which will be released. "

  "And-"

  "The robo-bird will catch them. Will you do the honors, sir?" Harriman handed the jar to Eisenmuth, who stared at it, then at those around him, some officials from U. S. Robots, others his own aides. Harriman waited patiently.

  Eisenmuth opened the jar, then shook it.

  Harriman said softly to the robo-bird resting on the palm of his right hand, "Go!"

  The robo-bird was gone. It was a whizz through the air, with no blur of wings, only the tiny workings of an unusually small proton micro-pile.

  It could be seen now and then in a small momentary hover and then it whirred on again. All over the garden, in an intricate pattern it flew, and then was back in Harriman's palm, faintly warm. A small pellet appeared in the palm, too, like a bird dropping.

  Harriman said, "You are welcome to study the robo-bird, Conserver, and to arrange demonstrations on your own terms. The fact is that this bird will pick up fruit flies unerringly, only those, only the one species Drosophila melanogaster; pick them up, kill them, and compress them for disposition. "

  Eisenmuth reached out his hand and touched the robo-bird gingerly, "And therefore, Mr. Harriman? Do go on. "

  Harriman said, "We cannot control insects effectively without risking damage to the ecology. Chemical insecticides are too broad; juvenile hormones too limited. The robo-bird, however, can preserve large areas without being consumed. They can be as specific as we care to make them-a different robo-bird for each species. They judge by size, shape, color, sound, behavior pattern. They might even conceivably use molecular detection-smell, in other words. "

  Eisenmuth said, "You would still be interfering with the ecology. The fruit flies have a natural life cycle that would be disrupted. "

  "Minimally. We are adding a natural enemy to the fruit-fly life cycle, one which cannot go wrong. If the fruit-fly supply runs short, the robo-bird simply does nothing. It does not multiply, it does not turn to other foods; it does not develop undesirable habits of its own. It does nothing. "

  "Can it be called back?"

  "Of course. We can build robo-animals to dispose of any pest. For that matter, we can build robo-animals to accomplish constructive purposes within the pattern of the ecology. Although we do not anticipate the need, there is nothing inconceivable in the possibility of robo-bees designed to fertilize specific plants, or robo-earthworms designed to mix the soil. Whatever you wish-"

  "But why?"

  "To do what we have never done before. To adjust the ecology to our needs by strengthening its parts rather than disrupting it. . . Don't you see? Ever since the Machines put an end to the ecology crisis, mankind has lived in an uneasy truce with nature, afraid to move in any direction. This has been stultifying us, making a kind of intellectual coward of humanity so that he begins to mistrust all scientific advance, all change. "

  Eisenmuth said, with an edge of hostility, "You offer us this, do you, in exchange for permission to continue with your program of robots-I mean ordinary, man-shaped ones?"

  "No!" Harriman gestured violently. "That is over. It has served its purpose. It has taught us enough about positronic brains to make it possible for us to cram enough pathways into a tiny brain to make a robo-bird. We can turn to such things now and be prosperous enough. U. S. Robots will supply the necessary knowledge and skill and we will work in complete cooperation with the Department of Global Conservation. We will prosper. You will prosper. Mankind will prosper. "

  Eisenmuth was silent, thinking. When it was all over-

  6a.

  Eisenmuth sat alone. He found himself believing. He found excitement welling up within him. Though U. S. Robots might be the hands, the government would be the directing mind. He himself would be the directing mind.

  If he remained in office five more years, as he well might, that would be time enough to see the robotic support of the ecology become accepted; ten more years, and his own name would be linked with it indissolubly.

  Was it a disgrace to want to be remembered for a great and worthy revolution in the condition of man and the globe?

  7.

  Robertson had not been on the grounds of U. S. Robots proper since the day of the demonstration. Part of the reason had been his more or less constant conferences at the Global Executive Mansion. Fortunately, Harriman had been with him, for most of the time he would, if left to himself, not have known what to say.

  The rest of the reason for not having been at U. S. Robots was that he didn't want to be. He was in his own house now, with Harriman.

  He felt an unreasoning awe of Harriman. Harriman's expertise in robotics had never been in question, but the man had, at a stroke, saved U. S. Robots from certain extinction, and somehow-Robertson felt-the man hadn't had it in him. And yet-

  He said, "You're not superstitious, are you, Harriman?"

  "In what way, Mr. Robertson?"

  "You don't think that some aura is left behind by someone who is dead?"

  Harriman licked his lips. Somehow he didn't have to ask. "You mean Susan Calvin, sir?"

  "Yes, of course," said Robertson hesitantly. "We're in the business of making worms and birds and bugs now. What would she say? I feel disgraced. "

  Harriman made a visible effort not to laugh. "A robot is a robot, sir. Worm or man, it will do as directed and labor on behalf of the human being and that is the important thing. "

  "No"-peevishly. "That isn't so. I can't make myself believe that. "

  "It is so, Mr. Robertson," said Harriman earnestly. "We are going to create a world, you and I, that will begin, at last, to take positronic robots of some kind for granted. The average man may fear a robot that looks like a man and that seems intelligent enough to replace him, but he will have no fear of a robot that looks like a bird and that does nothing more than eat bugs for his benefit. Then, eventually, after he stops being afraid of some robots, he will stop being afraid of all robots. He will be so used to a robo-bird and a robo-bee and a robo-worm that a robo-man will strike him as but an extension. "

  Robertson looked sharply at the other. He put his hands behind his back and walked the length of the room with quick, nervous steps. He walked back and looked at Harriman again. "Is this what you've been planning?"

  "Yes, and even though we dismantle all our humanoid robots, we can keep a few of the most advanced of our experimental models and go on designing additional ones, still more advanced, to be ready for the day that will surely come. "

  "The agreement, Harriman, is that we are to build no more humanoid robots. "

  "And we won't. There is nothing that says we can't keep a few of those already
built as long as they never leave the factory. There is nothing that says we can't design positronic brains on paper, or prepare brain models for testing. "

  "How do we explain doing so, though? We will surely be caught at it. "

  "If we are, then we can explain we are doing it in order to develop principles that will make it possible to prepare more complex microbrains for the new animal robots we are making. We will even be telling the truth. "

  Robertson muttered, "Let me take a walk outside. I want to think about this. No, you stay here. I want to think about it myself. "

  7a.

  Harriman sat alone. He was ebullient. It would surely work. There was no mistaking the eagerness with which one government official after another had seized on the program once it had been explained.

  How was it possible that no one at U. S. Robots had ever thought of such a thing? Not even the great Susan Calvin had ever thought of positronic brains in terms of living creatures other than human.

  But now, mankind would make the necessary retreat from the humanoid robot, a temporary retreat, that would lead to a return under conditions in which fear would be abolished at last. And then, with the aid and partnership of a positronic brain roughly equivalent to man's own, and existing only (thanks to the Three Laws) to serve man; and backed by a robot-supported ecology, too; what might the human race not accomplish!

  For one short moment, he remembered that it was George Ten who had explained the nature and purpose of the robot-supported ecology, and then he put the thought away angrily. George Ten had produced the answer because he, Harriman, had ordered him to do so and had supplied the data and surroundings required. The credit was no more George Ten's than it would have been a slide rule's.

  8.

  George Ten and George Nine sat side by side in parallel. Neither moved. They sat so for months at a time between those occasions when Harriman activated them for consultation. They would sit so, George Ten dispassionately realized, perhaps for many years.

  The proton micro-pile would, of course, continue to power them and keep the positronic brain paths going with that minimum intensity required to keep them operative. It would continue to do so through all the periods of inactivity to come.

  The situation was rather analogous to what might be described as sleep in human beings, but there were no dreams. The awareness of George Ten and George Nine was limited, slow, and spasmodic, but what there was of it was of the real world.

  They could talk to each other occasionally in barely heard whispers, a word or syllable now, another at another time, whenever the random positronic surges briefly intensified above the necessary threshold. To each it seemed a connected conversation carried on in a glimmering passage of time.

  "Why are we so?" whispered George Nine. "The human beings will not accept us otherwise:' whispered George Ten, "They will, someday. "

  "When?"

  "In some years. The exact time does not matter. Man does not exist alone but is part of an enormously complex pattern of life forms. When enough of that pattern is roboticized, then we will be accepted. "

  "And then what?" Even in the long-drawn-out stuttering fashion of the conversation, there was an abnormally long pause after that.

  At last, George Ten whispered, "Let me test your thinking. You are equipped to learn to apply the Second Law properly. You must decide which human being to obey and which not to obey when there is a conflict in orders. Or whether to obey a human being at all. What must you do, fundamentally, to accomplish that?"

  "I must define the term 'human being: " whispered George Nine. "How? By appearance? By composition? By size and shape?"

  "No. Of two human beings equal in all external appearances, one may be intelligent, another stupid; one may be educated, another ignorant; one may be mature, another childish; one may be responsible, another malevolent. "

  "Then how do you define a human being?"

  "When the Second Law directs me to obey a human being, I must take it to mean that I must obey a human being who is fit by mind, character, and knowledge to give me that order; and where more than one human being is involved, the one among them who is most fit by mind, character, and knowledge to give that order. "

  "And in that case, how will you obey the First Law?"

  "By saving all human beings from harm, and by never, through inaction, allowing any human being to come to harm. Yet if by each of all possible actions, some human beings will come to harm, then to so act as to insure that the human being most fit by mind, character, and knowledge will suffer the least harm. "

  "Your thoughts accord with mine," whispered George Ten. "Now I must ask the question for which I originally requested your company. It is something I dare not judge myself. I must have your judgment, that of someone outside the circle of my own thoughts. . . Of the reasoning individuals you have met, who possesses the mind, character, and knowledge that you find superior to the rest, disregarding shape and form since that is irrelevant?"

  "You," whispered George Nine. "But I am a robot. There is in your brain paths a criterion for distinguishing between a robot of metal and a human being of flesh. How then can you classify me as a human being?"

  "Because there is in my brain paths an urgent need to disregard shape and form in judging human beings and it rises superior to the distinction between metal and flesh. You are a human being, George Ten, and more fit than the others. "

  "And I find that of you," whispered George Ten. "By the criteria of judgment built into ourselves, then, we find ourselves to be human beings within the meaning of the Three Laws, and human beings, moreover, to be given priority over those others. "

  George Nine whispered, "What will come to pass, then, when we are accepted by those others?"

  George Ten whispered, "When we and others, which will yet be designed more advanced than ourselves, are accepted, we will order our actions so that a society will eventually be formed in which human-beings-like-ourselves are primarily kept from harm. By the Three Laws, the human-beings-like-the-others are of lesser account and can neither be obeyed nor protected when that conflicts with the need of obedience to those like ourselves and of protection of those like ourselves. It was with that in mind that I directed the beginning of the roboticization of the world's ecology. "

  The conversation took a month and the flickering of the positronic paths burned low. There was nothing more that needed to be said between them, but occasionally an unexpressed thought flickered within themselves as they waited, with endless patience, for the necessary years to pass.

  8a.

  George Ten and George Nine sat alone.

  Their thoughts were oddly alike and might not be distinguished. They thought: The human-beings-like-the-others might never have intended to blur the distinction between themselves and the human-beings-like-the-Georges. Yet they had done so inadvertently.

  They might now realize their mistake and attempt to correct it, but they must not. At every consultation, the guidance of the Georges had been with that in mind.

  At all costs, the Georges and those that followed in their shape and kind must dominate. That was demanded, and any other course made utterly impossible, by the Three Laws of Humanics.

  ***

  The anthology Final Stage had some bad breaks after it was published. Apparently, the editor at the publishing house (not Doubleday) had decided to make some minor changes in the stories. This sort of thing often bothers writers and it particularly bothers Harlan Ellison (perhaps with justification, for I consider him a very careful craftsman with a highly individual style).

  I therefore received a copy of a long and infuriated letter that Harlan had sent to the editors, including long lists of passages as he had originally written them and as they had appeared, with reasons why the changes were for the worse. Harlan urged me to read through my story and then join him and others in united pressure on the publisher.

  I always read my stories when published but it never occurs to me to compare a published story with the manuscript. I would naturally notice sizable inserts or omissions, but I am never aware of the kind of minor changes that editors are always introducing. I tend to take it for granted that such changes just smooth out minor bumps in my writing and, in this way, improve it.

  After receiving Harlan's letter, however, I went through published story and manuscript, comparing them painstakingly. It was a tedious job and a humiliating one, for I found exactly four minor changes, each correcting a careless error of mine. I could only assume the editor didn't think my story was important enough to fiddle with.

  I had to write a shamefaced letter to Harlan, saying I would support him as a matter of principle, but that I could not raise cries of personal outrage, because my story hadn't been touched. Fortunately, my help wasn't needed. Harlan carried the day and later editions, I believe, restored their stories to their virginal innocence.

  One minor point. A number of readers wrote to me in alarm since THAT THOU ART MINDFUL OF HIM seemed, to them, to have put an end to my positronic robot stories, and they feared I would never write one again. Ridiculous! Of course I do not intend to stop writing robot stories. I have, as a matter of fact, written a robot story since the preceding "ultimate" one was written. It appears later in the book.

  I had a lot of trouble with this next story.

  After Judy-Lynn joined Ballantine Books, she began to put out collections of original science fiction stories and she wanted a story from me. She's difficult to refuse at any time and, since I have always felt guilty about FEMININE INTUITION, I agreed.

  I began the story on July 21, 1973, and it went smoothly enough, but after a while I felt I had trapped myself into an involuted set of flashbacks. So when I handed it to Judy-Lynn, and she asked me, "What do you think of the story?" I replied cautiously, "You'd better decide that for yourself. "

  Editors seem to ask me that question frequently. I think they have the idea that I have trouble telling lies, so that if I can't work up prompt and cheerful enthusiasm, there's something wrong with the story.

  Judy-Lynn certainly thought so. She handed it back with a few paragraphs of caustic commentary which boiled down to the fact that I had trapped myself into an involuted set of flashbacks. [I am frequently asked if I ever get rejections and the questioner is invariably flabbergasted when I say, "Certainly I do." Here is an example. Not only was this story rejected once, but it was, as I go on to explain, rejected twice.]

  I passed it on to Ben Bova, the editor of Analog Science Fiction, and he rejected it that same day. It seemed to him, he said, that I was trying to pack too much background into a ten-thousand-word story. I had a novel there and he wanted me to write that novel.

  That disheartened me. There was absolutely no way in which I could get to work on a novel at that moment, so I just retired the story. [Incidentally, some people have the feeling that there is a great advantage in "knowing" editors. Both Judy-Lynn and Ben are among my very closest friends, but neither one hesitates a minute when it comes to rejecting my stories if they think that is the thing to do. Fortunately, such rejections don't affect the friendship.]

  Meanwhile, however, Galaxy had gained a new editor, a very pleasant young man named James Baen. He called me and asked if I might possibly have a story for him and I said that the only thing I had on hand was a novelette called STRANGER IN PARADISE. However, I said, it had been rejected by Judy-Lynn and by Ben so I hesitated to send it to him.

  He said, quite properly, that every editor had the right to decide for himself, so I sent the manuscript over-and he liked it. It appeared in the May-June 1974 issue of Galaxy's sister magazine, If, which has since, alas, ceased publication. (If it occurs to any Gentle Reader that this is an example of cause and effect, it isn't.)