
“I don’t know. He was working with Jothan to the end.”

  “But he thought Jothan was a bad Solarian for refusing to marry?”

  “Rikaine once said that marriage was the hardest thing in life, but that it had to be endured.”

  “What did you think?”

  “About what, Elijah?”

  “About marriage. Did you think it was the hardest thing in life?”

  Her expression grew slowly blank as though she were painstakingly washing emotion out of it. She said, “I never thought about it.”

  Baley said, “You said you go on walks with Jothan Leebig all the time, then corrected yourself and put that in the past. You don’t go on walks with him anymore, then?”

  Gladia shook her head. Expression was back in her face. Sadness. “No. We don’t seem to. I viewed him once or twice. He always seemed busy and I didn’t like to——You know.”

  “Was this since the death of your husband?”

  “No, even some time before. Several months before.”

  “Do you suppose Dr. Delmarre ordered him not to pay further attention to you?”

  Gladia looked startled. “Why should he? Jothan isn’t a robot and neither am I. How can we take orders and why should Rikaine give them?”

  Baley did not bother to try to explain. He could have done so only in Earth terms and that would make things no clearer to her. And if it did manage to clarify, the result could only be disgusting to her.

  Baley said, “Only a question. I’ll view you again, Gladia, when I’m done with Leebig. What time do you have, by the way?” He was sorry at once for asking the question. Robots would answer in Terrestrial equivalents, but Gladia might answer in Solarian units and Baley was weary of displaying ignorance.

  But Gladia answered in purely qualitative terms. “Midafternoon,” she said.

  “Then that’s it for Leebig’s estate also?”

  “Oh yes.”

  “Good. I’ll view you again as soon as I can and we’ll make arrangements for seeing.”

  Again she grew hesitant. “Is it absolutely necessary?”

  “It is.”

  She said in a low voice, “Very well.”

  There was some delay in contacting Leebig and Baley utilized it in consuming another sandwich, one that was brought to him in its original packaging. But he had grown more cautious. He inspected the seal carefully before breaking it, then looked over the contents painstakingly.

  He accepted a plastic container of milk, not quite unfrozen, bit an opening with his own teeth, and drank from it directly. He thought gloomily that there were such things as odorless, tasteless, slow-acting poisons that could be introduced delicately by means of hypodermic needles or high-pressure needle jets, then put the thought aside as being childish.

  So far murders and attempted murders had been committed in the most direct possible fashion. There was nothing delicate or subtle about a blow on the head, enough poison in a glass to kill a dozen men, or a poisoned arrow shot openly at the victim.

  And then he thought, scarcely less gloomily, that as long as he hopped between time zones in this fashion, he was scarcely likely to have regular meals. Or, if this continued, regular sleep.

  The robot approached him. “Dr. Leebig directs you to call sometime tomorrow. He is engaged in important work.”

  Baley bounced to his feet and roared, “You tell that guy——”

  He stopped. There was no use in yelling at a robot. That is, you could yell if you wished, but it would achieve results no sooner than a whisper.

  He said in a conversational tone, “You tell Dr. Leebig, or his robot if that is as far as you’ve reached, that I am investigating the murder of a professional associate of his and a good Solarian. You tell him that I cannot wait on his work. You tell him that if I am not viewing him in five minutes, I will be in a plane and at his estate seeing him in less than an hour. You use that word, seeing, so there’s no mistake.”

  He returned to his sandwich.

  The five minutes were not quite gone when Leebig, or at least a Solarian whom Baley presumed to be Leebig, was glaring at him.

  Baley glared back. Leebig was a lean man, who held himself rigidly erect. His dark, prominent eyes had a look of intense abstraction about them, compounded now with anger. One of his eyelids drooped slightly.

  He said, “Are you the Earthman?”

  “Elijah Baley,” said Baley. “Plainclothesman C-7, in charge of the investigation into the murder of Dr. Rikaine Delmarre. What is your name?”

  “I’m Dr. Jothan Leebig. Why do you presume to annoy me at my work?”

  “It’s easy,” said Baley quietly. “It’s my business.”

  “Then take your business elsewhere.”

  “I have a few questions to ask first, Doctor. I believe you were a close associate of Dr. Delmarre. Right?”

  One of Leebig’s hands clenched suddenly into a fist and he strode hastily toward a mantelpiece on which tiny clockwork contraptions went through complicated periodic motions that caught hypnotically at the eye.

  The viewer kept focused on Leebig so that his figure did not depart from central projection as he walked. Rather the room behind him seemed to move backward in little rises and dips as he strode.

  Leebig said, “If you are the foreigner whom Gruer threatened to bring in——”

  “I am.”

  “Then you are here against my advice. Done viewing.”

  “Not yet. Don’t break contact.” Baley raised his voice sharply and a finger as well. He pointed it directly at the roboticist, who shrank visibly away from it, full lips spreading into an expression of disgust.

  Baley said, “I wasn’t bluffing about seeing you, you know.”

  “No Earthman vulgarity, please.”

  “A straightforward statement is what it is intended to be. I will see you, if I can’t make you listen any other way. I will grab you by the collar and make you listen.”

  Leebig stared back. “You are a filthy animal.”

  “Have it your way, but I will do as I say.”

  “If you try to invade my estate, I will——I will——”

  Baley lifted his eyebrows. “Kill me? Do you often make such threats?”

  “I made no threat.”

  “Then talk now. In the time you have wasted, a good deal might have been accomplished. You were a close associate of Dr. Delmarre. Right?”

  The roboticist’s head lowered. His shoulders moved slightly to a slow, regular breathing. When he looked up, he was in command of himself. He even managed a brief, sapless smile.

  “I was.”

  “Delmarre was interested in new types of robots, I understand.”

  “He was.”

  “What kind?”

  “Are you a roboticist?”

  “No. Explain it for the layman.”

  “I doubt that I can.”

  “Try! For instance, I think he wanted robots capable of disciplining children. What would that involve?”

  Leebig raised his eyebrows briefly and said, “To put it very simply, skipping all the subtle details, it means a strengthening of the C-integral governing the Sikorovich tandem route response at the W-65 level.”

  “Double-talk,” said Baley.

  “The truth.”

  “It’s double-talk to me. How else can you put it?”

  “It means a certain weakening of the First Law.”

  “Why so? A child is disciplined for its own future good. Isn’t that the theory?”

  “Ah, the future good!” Leebig’s eyes glowed with passion and he seemed to grow less conscious of his listener and correspondingly more talkative. “A simple concept, you think. How many human beings are willing to accept a trifling inconvenience for the sake of a large future good? How long does it take to train a child that what tastes good now means a stomach-ache later, and what tastes bad now will correct the stomach-ache later? Yet you want a robot to be able to understand?

  “Pain inflicted by a robot on a child sets up a powerful disruptive potential in the positronic brain. To counteract that by an anti-potential triggered through a realization of future good requires enough paths and bypaths to increase the mass of the positronic brain by 50 percent, unless other circuits are sacrificed.”

  Baley said, “Then you haven’t succeeded in building such a robot.”

  “No, nor am I likely to succeed. Nor anyone.”

  “Was Dr. Delmarre testing an experimental model of such a robot at the time of his death?”

  “Not of such a robot. We were interested in other more practical things also.”

  Baley said quietly, “Dr. Leebig, I am going to have to learn a bit more about robotics and I am going to ask you to teach me.”

  Leebig shook his head violently, and his drooping eyelid dipped further in a ghastly travesty of a wink. “It should be obvious that a course in robotics takes more than a moment. I lack the time.”

  “Nevertheless, you must teach me. The smell of robots is the one thing that pervades everything on Solaria. If it is time we require, then more than ever I must see you. I am an Earthman and I cannot work or think comfortably while viewing.”

  It would not have seemed possible to Baley for Leebig to stiffen his stiff carriage further, but he did. He said, “Your phobias as an Earthman don’t concern me. Seeing is impossible.”

  “I think you will change your mind when I tell you what I chiefly want to consult you about.”

  “It will make no difference. Nothing can.”

  “No? Then listen to this. It is my belief that throughout the history of the positronic robot, the First Law of Robotics has been deliberately misquoted.”

  Leebig moved spasmodically. “Misquoted? Fool! Madman! Why?”

  “To hide the fact,” said Baley with complete composure, “that robots can commit murder.”

  14

  A MOTIVE IS REVEALED

  Leebig’s mouth widened slowly. Baley took it for a snarl at first and then, with considerable surprise, decided that it was the most unsuccessful attempt at a smile that he had ever seen.

  Leebig said, “Don’t say that. Don’t ever say that.”

  “Why not?”

  “Because anything, however small, that encourages distrust of robots is harmful. Distrusting robots is a human disease!”

  It was as though he were lecturing a small child. It was as though he were saying something gently that he wanted to yell. It was as though he were trying to persuade when what he really wanted was to enforce on penalty of death.

  Leebig said, “Do you know the history of robotics?”

  “A little.”

  “On Earth, you should. Yes. Do you know robots started with a Frankenstein complex against them? They were suspect. Men distrusted and feared robots. Robotics was almost an undercover science as a result. The Three Laws were first built into robots in an effort to overcome distrust and even so, Earth would never allow a robotic society to develop. One of the reasons the first pioneers left Earth to colonize the rest of the Galaxy was so that they might establish societies in which robots would be allowed to free men of poverty and toil. Even then, there remained a latent suspicion not far below, ready to pop up at any excuse.”

  “Have you yourself had to counter distrust of robots?” asked Baley.

  “Many times,” said Leebig grimly.

  “Is that why you and other roboticists are willing to distort the facts just a little in order to avoid suspicion as much as possible?”

  “There is no distortion!”

  “For instance, aren’t the Three Laws misquoted?”

  “No!”

  “I can demonstrate that they are, and unless you convince me otherwise, I will demonstrate it to the whole Galaxy, if I can.”

  “You’re mad. Whatever argument you may think you have is fallacious, I assure you.”

  “Shall we discuss it?”

  “If it does not take too long.”

  “Face to face? Seeing?”

  Leebig’s thin face twisted. “No!”

  “Good-bye, Dr. Leebig. Others will listen to me.”

  “Wait. Great Galaxy, man, wait!”

  “Seeing?”

  The roboticist’s hands wandered upward, hovered about his chin. Slowly a thumb crept into his mouth and remained there. He stared, blankly, at Baley.

  Baley thought: Is he regressing to the pre-five-year-old stage so that it will be legitimate for him to see me?

  “Seeing?” he said.

  But Leebig shook his head slowly. “I can’t. I can’t,” he moaned, the words all but stifled by the blocking thumb. “Do whatever you want.”

  Baley stared at the other and watched him turn away and face the wall. He watched the Solarian’s straight back bend and the Solarian’s face hide in shaking hands.

  Baley said, “Very well, then, I’ll agree to view.”

  Leebig said, back still turned, “Excuse me a moment. I’ll be back.”

  Baley tended to his own needs during the interval and stared at his fresh-washed face in the bathroom mirror. Was he getting the feel of Solaria and Solarians? He wasn’t sure.

  He sighed and pushed a contact and a robot appeared. He didn’t turn to look at it. He said, “Is there another viewer at the farm, besides the one I’m using?”

  “There are three other outlets, master.”

  “Then tell Klorissa Cantoro—tell your mistress that I will be using this one till further notice and that I am not to be disturbed.”

  “Yes, master.”

  Baley returned to his position where the viewer remained focused on the empty patch of room in which Leebig had stood. It was still empty and he settled himself to wait.

  It wasn’t long. Leebig entered and the room once more jiggled as the man walked. Evidently focus shifted from room center to man center without delay. Baley remembered the complexity of viewing controls and began to feel a kind of appreciation of what was involved.

  Leebig was quite master of himself now, apparently. His hair was slicked back and his costume had been changed. His clothes fitted loosely and were of a material that glistened and caught highlights. He sat down in a slim chair that folded out of the wall.

  He said soberly, “Now what is this notion of yours concerning First Law?”

  “Will we be overheard?”

  “No. I’ve taken care.”

  Baley nodded. He said, “Let me quote the First Law.”

  “I scarcely need that.”

  “I know, but let me quote it, anyway: A robot may not harm a human being or, through inaction, allow a human being to come to harm.”

  “Well?”

  “Now when I first landed on Solaria, I was driven to the estate assigned for my use in a ground-car. The ground-car was a specially enclosed job designed to protect me from exposure to open space. As an Earthman——”

  “I know about that,” said Leebig impatiently. “What has this to do with the matter?”

  “The robots who drove the car did not know about it. I asked that the car be opened and was at once obeyed. Second Law. They had to follow orders. I was uncomfortable, of course, and nearly collapsed before the car was enclosed again. Didn’t the robots harm me?”

  “At your order,” snapped Leebig.

  “I’ll quote the Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. So you see, my order should have been ignored.”

  “This is nonsense. The robot lacked knowledge——”

  Baley leaned forward in his chair. “Ah! We have it. Now let’s recite the First Law as it should be stated: A robot may do nothing that, to its knowledge, will harm a human being; nor, through inaction, knowingly allow a human being to come to harm.”

  “This is all understood.”

  “I think not by ordinary men. Otherwise, ordinary men would realize robots could commit murder.”

  Leebig was white. “Mad! Lunacy!”

  Baley stared at his finger ends. “A robot may perform an innocent task, I suppose; one that has no damaging effect on a human being?”

  “If ordered to do so,” said Leebig.

  “Yes, of course. If ordered to do so. And a second robot may perform an innocent task, also, I suppose; one that also can have no damaging effect on a human being? If ordered to do so?”

  “Yes.”

  “And what if the two innocent tasks, each completely innocent, amount to murder when added together?”

  “What?” Leebig’s face puckered into a scowl.

  “I want your expert opinion on the matter,” said Baley. “I’ll set you a hypothetical case. Suppose a man says to a robot, ‘Place a small quantity of this liquid into a glass of milk that you will find in such and such a place. The liquid is harmless. I wish only to know its effect on milk. Once I know the effect, the mixture will be poured out. After you have performed this action, forget you have done so.’ ”

  Leebig, still scowling, said nothing.

  Baley said, “If I had told the robot to add a mysterious liquid to milk and then offer it to a man, First Law would force it to ask, ‘What is the nature of the liquid? Will it harm a man?’ And if it were assured the liquid was harmless, First Law might still make the robot hesitate and refuse to offer the milk. Instead, however, it is told the milk will be poured out. First Law is not involved. Won’t the robot do as it is told?”

  Leebig glared.

  Baley said, “Now a second robot has poured out the milk in the first place and is unaware that the milk has been tampered with. In all innocence, it offers the milk to a man and the man dies.”

  Leebig cried out, “No!”

  “Why not? Both actions are innocent in themselves. Only together are they murder. Do you deny that that sort of thing can happen?”

  “The murderer would be the man who gave the order,” cried Leebig.

  “If you want to be philosophical, yes. The robots would have been the immediate murderers, though, the instruments of murder.”

  “No man would give such orders.”

  “A man would. A man has. It was exactly in this way that the murder attempt on Dr. Gruer must have been carried through. You’ve heard about that, I suppose.”

  “On Solaria,” muttered Leebig, “one hears about everything.”