How It Ends: Part 1 - The Evaluation


  “How did that work?” Anita asked.

  Kilgore’s artificial face turned to consider her. It was silent for a moment, long enough for Sidney to wonder whether the robot had experienced a hiccup in its processing.

  “I am sorry, but I do not think I fully understand your question.”

  The response was formal. Sidney realized that all of Kilgore’s responses had been that formal. He wondered whether that was part of the base programming. Or are idioms and contractions something you learn? So why didn’t this robot?

  “I have to admit, I’m not sure what you meant by that question either,” Sidney said.

  “Well,” Anita said, addressing the robot, “you went to school, right? You went to classes. Or at least I assume you did. Did you take classes with other students? What was that like? What did they think of you? Did they know why a robot was in classes with them? What was their reaction? You know, how did they feel?”

  Sidney held up his hands. “Slow down, Anita. That’s a lot of questions at once.”

  Anita said, “I know. I guess what I’m trying to ask is, knowing how secretive robotics companies are with their product, how did such an advanced prototype as you end up in the middle of what amounts to a social setting?”

  Sidney looked at her. It was a question that had still been forming in his own head, but that he’d not yet been able to wrap lucidity around. Anita beat him to it. He almost smiled.

  “That is a very interesting question,” Kilgore said. “And well stated, I might add.”

  Anita blushed. This thing, this inanimate object, had made her blush. What a strange world, Sidney thought.

  “Denlas-Kaptek made an arrangement with the university. I was enrolled in the medical program, and in exchange Brown University was given full access to me during the course of my study, a prominent place in the promotional literature for the robotic physician marketing campaign, and, of course, monetary compensation. With regard to the secretiveness of the program, any student who enrolled in courses I was scheduled to take was required to sign a confidentiality agreement. I did not see these agreements myself, but I understand them to be very thorough and quite punishing if violated.”

  “And during this period of schooling, you chose your field of practice?” Sidney asked. His tone suggested doubt. He wondered if Kilgore would pick up on it.

  “You have doubts regarding my ability to choose a field of practice?” asked Kilgore.

  “Well, perhaps a few,” he answered. Sidney was impressed. He wondered how the robot would interpret skepticism. Not that it had feelings to hurt. Not in this model. So that shouldn’t be an issue.

  Why then was something nervous tugging at the back of his head?

  “Because I am not human?” the robot asked.

  “In a word, yes.”

  “Is there a statute that requires all doctors to be humans?”

  “Not that I am aware of.”

  “Then why would you assume I would not be able to make that type of selection and continue my education and subsequent residency to become a fully licensed medical doctor?”

  “Perhaps because of my own poor understanding of how the programming at so advanced a level works. Perhaps,” he added before he thought better of it, “because of my own prejudices.” That one could have backfired. But again, only if emotions played a part, and with this model they didn’t.

  “Perhaps because you’re nothing but a machine,” said Anita, looking at her pad, still scribbling.

  It was offhand, a cavalier remark, which Anita cast out like an idiot savant fisherman who throws a line to a shark with a toy rod and catches it. Sidney and Kilgore were both silent. Both were staring at her. It took a moment for her to realize the conversation had stalled and she looked up.

  “What?”

  “That was an interesting statement,” said Kilgore. “One might even argue that it bordered on rude.”

  “But you don’t have an emotive processor, so you don’t process rude. Or do you?”

  “I think it is perhaps too simple to say that I do or do not process rude. I do not process it in the same way a human might. I do not feel angry when someone is rude to me, calls me a name, or insults me. But I do have a comprehensive understanding of what rude is, both in speech and in behavior. I do recognize it. Therefore, one could say I do process it.”

  “Okay, so you recognize it. So what?” It was as if she were being deliberately antagonistic. What’s up with this girl? Sidney thought.

  Again the robot was silent and merely stared at Anita. She stared back, waiting for a response. Sidney was far less comfortable and wondered what would happen if a robot decided it was in fact angry.

  He needed to take back control of the conversation.

  “It interests me that you consciously consider yourself a doctor.”

  “I am a doctor.”

  “Technically, yes.”

  “Technically?”

  “What I mean to say is that—at least as far as I understand from my preliminary reading on the subject—the other medical robot prototypes do not consider themselves doctors. They are skilled in the administration of treatments for many major types of illnesses, but they are not doctors. They have no degrees, per se.”

  “Please do not forget that I did attend an institute of medical learning.”

  “And that same institution made the conscious decision not to give you a degree. Your marks were excellent, your technique flawless. But in the end, they felt they couldn’t award you a degree.”

  “I see.”

  The robot said no more and Sidney began to worry. Something in his gut was throwing up red flags. He knew it was silly, laughable, even impossible—but he worried that he’d offended the robot. Anita kept scribbling furiously.

  “I’m sorry if I’ve insulted you, Dr. Kilgore,” he said. Preemptive, though he couldn’t tell why.

  “Please understand, Dr. Hermann, that I cannot be insulted. I do not have any feelings. I run a large number of specific pre-determined protocols that are called and executed for nearly any given emotional situation. But perhaps these issues are best explored as we begin a dialogue about me and who I am, in reference to your evaluation.”

  “Certainly,” said Sidney. He took a moment to gather his thoughts before he proceeded. “Dr. Kilgore,” he said, “you should understand that I do know quite a bit about you already. I’ve been given a fairly lengthy background history, schematic design sheets, and documentation about you and your programming. I’ve even seen a bit of the source code, though that part is a bit beyond me. Formidable stuff, to say the least.”

  “I would have been surprised if you had not been given a full appraisal of my make and model, Dr. Hermann. And might I say, there are few in this industry, professional or professorial, as qualified to make such an evaluation.”

  That surprised Sidney. He stopped walking.

  “What do you mean by that?” he asked. Was this machine making fun of him?

  “I am aware of your background, Dr. Hermann.”

  “Really?”

  “Indeed. At the completion of the data download during which your visit agenda was transferred to me, I researched you on the New York University intranet.”

  Sidney’s eyes widened. Anita stopped writing and looked up.

  “I thought the standard protocol for all field models of any level was to disable external web browsing,” she said. “I didn’t think any robot could access the web.”

  “It is,” said Sidney. His heart beat rapidly. “It’s called DIL.”

  “That is correct,” said Kilgore. “The Digital Information Lock-out protocol remains in place today. It prohibits internal web access as well as external web access using a third-party machine.”

  “Like a PC,” said Anita. It wasn’t really a question.

  Sidney nodded slowly. “There are far too many viruses lurking online that could pose a danger to a robot’s programming,” he said. “How did…?”

  “A function exists to allow robots external digital communication on a case-by-case basis. It is really nothing more than an I/O switch that is set in the main data core and passed to robots on an individual basis during their nightly data load.”

  Sidney’s head jerked toward Anita’s notebook. She’d stopped taking notes, enthralled by the conversation. The motion from Sidney snapped her back to her role. She began to write as fast as she could, trying to catch the flow of conversation with her pencil.

  Kilgore appeared not to have noticed.

  “In addition,” it said, “websites need to be approved for viewing by a review board and manually enabled to allow a robot—any robot—access to them. Such is the case with me.”

  “Why?” asked Anita.

  “Because there are certain websites that have data or information that is directly relevant to the operational integrity of a robot.”

  “Meaning?”

  “Meaning, sometimes Kilgore needs the web in order to do his job,” said Sidney.

  “Correct, Dr. Hermann.”

  “I never knew that,” said Anita.

  “Neither did I,” said Sidney.

  Anita looked at him. A robotics professor stumbling onto what sounded like a fairly common security protocol in a robot. That struck her. Odd? Maybe. Disquieting? Yes.

  Pencil to notebook: Brian re: robot web browsing override protocols. He’d know if anyone would.

  “Should we continue?” asked Kilgore.

  “Sure.”

  They began walking again. “Would you like to review your background with me, Dr. Hermann?”

  “What do you know already?”

  “I know that you have a great depth of knowledge about lower level robots, including histories, software revisions, bug fixes, et cetera. Your specialty, however, is not in the technical or even functional aspects of robots, but rather it is closer to the history of robots, robotics, and the controversies that have manifested throughout the Age of the New Machine. In a lecture you gave at the University of Virginia, you referred to these controversies as ethical matters, but then dismantled your own argument, referring to them instead as matters of human comfort.”

  “It sounds like you’ve read it.”

  “Listened to it, actually. It is available in the guest lecture library at the University of Virginia website.”

  “Did you understand it?” asked Sidney. It was a strange question, he thought. In some ways unfair.

  Kilgore’s holographically projected face frowned.

  Maybe we give it too much credit for comprehension, thought Sidney.

  “Perhaps I shall paraphrase you,” Kilgore said. “It is not a question of the ethics of a robot ministering to the sick and elderly. Nor is it truly about the final act for those patients who pass the requisite physical and mental tests, that act being assisted suicide. It is not even about how very easy it is for humans, beings with the elusively defined spirituality of the soul, to simply pawn off this final act to a soulless machine that does not have philosophical or religious issues. The ethics at play here are those surrounding your—humanity’s—comfort level in placing the decision with machines. Are those patients truly comfortable with a robotic doctor assisting them in making the decision to die? Should a human be involved in the process? In simpler terms, is it or is it not socially acceptable for a robot to specialize in geriatric medicine and end-of-life care, especially if such care suggests suicide as the best possible alternative, if the intended recipients of such care are uncomfortable with the practitioner?”

  He comprehends very well, thought Sidney.

  “Yes,” he said aloud. “That about sums up what I was saying.”

  The robot’s holographic face smiled and the head nodded. “Very well,” it said. “How shall we proceed?”

  Sidney shrugged.

  “Do you have any feelings you wish to discuss regarding this assessment?” Sidney asked.

  “No. As we have already discussed, I was not fitted with an emotive processor when I was constructed. I have no emotions that require discussion.”

  “If you have no emotions, how are you able to interact with patients?” asked Anita.

  “I have a lengthy series of protocols known as reacts stored in a central repository that I am able to call at any time as the situation suggests.”

  “So, for example, if you were dealing with a grieving spouse?”

  “I would process the many options I have for consolation in the case of such grief. My holographic projector would then make the appropriate facial expressions and my posture would reflect empathy.”

  “Project. Reflect. These are interesting words,” said Sidney. “They tend to deal in facsimiles of the real thing, don’t they?”

  “Of course they do.”

  * * *

  Sidney shadowed Dr. Kilgore for the rest of the morning. Anita took notes. There were a number of visits by elderly patients. Routine check-ups. Flu shots. Requisite visits for prescription renewals. Nothing came up that could not have been handled by a human doctor.

  Sidney and Anita ate lunch in the cafeteria in the lobby level of the building. She gulped a large cup of coffee and bit through a stale danish. He sipped at a diet soda and chewed noisily on a fat sandwich.

  “How do you think it’s going?” she asked.

  “Alright.”

  “That’s it?”

  “Hmm,” Sidney mumbled through a mouthful of food. He swallowed. “It’s going okay. But I don’t want us to get too far ahead of ourselves.”

  “What does that mean?”

  But Sidney didn’t answer.

  They ate the rest of the meal in silence.

  * * *

  After lunch Sidney directed Anita to interview the nursing staff and a few of the human doctors in a makeshift office in a corner of the geriatric floor. Most of the responses were dull and repetitive. Praise for the work Dr. Kilgore had done and its way with the patients.

  The last interviewee was the head nurse.

  She was a plump older woman with half-moon spectacles perched at the end of her nose. Her down-turned mouth gave her a sour expression. Anita sat with her and ran through a pre-determined list of questions.

  “Do you have anything you would like to add?” Anita asked at the end of the interview.

  The nurse sniffed with disdain. She said nothing. But Anita picked up on it.

  “What?”

  “What what?”

  “You seemed like you wanted to say something.”

  “Did I?”

  “So, you don’t want to add anything?”

  “Not really.”

  Anita gave the nurse a strange smile and made a note on her pad.

  “Okay, then. If there’s nothing else—”

  “What are the odds you’re actually going to shut down this program with the death robot?”

  Anita looked up at the nurse and cocked her head to the side. That was a new one. Death robot.

  “The death robot? I haven’t heard that before. Why do you call it that?”

  “What are the odds?”

  “Of?”

  “Shutting him down?”

  Anita shrugged. “I have no idea. That’s not why I’m here.”

  The nurse sniffed.

  “Why do you call it the death robot?”

  “We all call it that. Everyone on this floor. We don’t do it to its face, mind you, but that’s what we call it.”

  “Why?”

  “Why do you think? Because it’s a licensed end-of-life caregiver. It can end a life.”

  “End a life?”

  “Yes. You know. Physician-assisted suicide.”

  Anita’s face blanked. She needed more from this nurse. She took a chance and played dumb.

  “No, actually, I didn’t know that.”

  “Well,” sniffed the nurse, “now I guess you have something to write down.”

  Anita wrote it down.

  “How many of those has Kilgore performed?”

  “I don’t know,” the nurse said. Anita suspected she did.

  “And this bothers you? Its ability to do that? Isn’t that a normal—well, maybe not normal—but typical procedure that geriatric doctors are allowed to perform these days?”

  “Human ones, yes.”

  “But not robot ones?”

  The nurse turned her head away.

  “So you have no problem using the term death robot?” asked Anita, changing tack.

  “No.”

  “But not to it directly.”

  “Someone slipped once and called Kilgore a death robot to its face,” the nurse said. She wasn’t looking at Anita. Her voice was cold and distant. “It stopped and turned its head toward the person. That face changed into a snarl. It was so lifelike. The lips curled up, the flesh color turning red. It was meant to be menacing. It was meant as a threat.”

  The a-ha! moment. Anita understood. This nurse was talking about herself. Kilgore had stared her down and it had rattled her. Not that Anita could blame her. It would have rattled her too, and she studied robots every day.

  “So you think Dr. Kilgore minds being called a death robot?”

  “I know it does. I know they say it’s only because it has a protocol that runs when it hears the phrase. Nothing serious, supposedly, just a face of indignation. But it minds, and I find that creepy. It claims the reaction protocol is built in because of its conscious choice to specialize in geriatric medicine and end-of-life care.”

  “That’s an interesting phrase. Conscious choice. Are those Dr. Kilgore’s words?”

  “Yes.”

  “Sounds about right. Sounds a lot like a conversation we had earlier today.”

  The nurse was silent.

  “How conscious do you think that choice can truly be?”

  “I couldn’t say.”

  “Try.”

  “Do you know anything about robots at all? Do you know they have these neural net systems that allow for knowledge to be learned? They’re built loaded with a certain amount of basic knowledge and the rest is learned through—believe it or not—schools?”

  “Yes, I know. I’m studying robotic engineering at NYU. This is the first time I’ve participated in an evaluation of any sort. Kilgore is pretty much beyond my expectations.”