“I really couldn’t have done this without your help,” Sidney said. “It would have taken me forever to cull through all the academic research. That sort of thing is not my strong suit.”
He looked at her.
“Thank you. I mean that.”
Anita wasn’t listening. Her brow was pulled into a furrow. A strange look was on her face.
“Is something wrong?” he asked.
“My name isn’t on it as an author.”
“No, it’s not.”
“Uh, shouldn’t it be?”
“Well, no,” Sidney said. Almost sheepish, embarrassed. “You’re listed as the research assistant.”
“I wrote half the material in this thing.”
“I understand that—”
“You wouldn’t have gotten half the quality of research if you’d done it yourself,” she cut him off.
“I realize that—”
“And the best I get is ‘research assistant’?”
Her voice was rising even as her body remained motionless. Sidney could tell her muscles were tightening. Like a snake, coiling.
“That’s kind of a standard credit people in your position get.”
“People in my position?” Fuming. “What ‘position’ is that?”
“Look,” he said. He was beginning to feel agitated himself. “This is how things work in academia,” he said. “RAs do the grunt work and professors get the credit.”
“Well, a fat lot of good that does me.”
She tossed the report across the desk at him. He lifted his arms in a futile attempt to catch it. He had never been mistaken for athletic.
Now he was angry. Angry at her attitude, angry at the way she showed it, angry at her tantrum. And after a bit of thought, he was angry at Brian for not explaining this to her upfront.
“I can take your name off if you’d like,” he snapped.
Silence.
“It would certainly help you get another such opportunity or even get to lead one yourself. But if you want nothing more to do with it, I can take your name off.”
He barely heard her. She muttered as softly as she could.
“No.”
“Okay, then.”
After a moment of silence she said, “This pretty much sucks.”
“Maybe, but you can use this credit to your advantage.”
She was silent. Time to nip this in the bud.
“Stop being an insolent child and say thank you.”
“Thank you.”
“Very good. Now good-bye.”
It was a brusque send-off, far more brusque than Sidney usually was, but that was just too bad. He wasn’t going to sit and be harassed by a hissy fit.
She paused in the doorway, door knob in hand. He could tell she wanted to slam the door. She didn’t. She let go of the door knob and walked away.
* * *
Sidney sat in the wood-paneled waiting area outside Eric’s office. He busied himself by reading through his copy of the report once more. He had sent DKI their copy a week ago. Within a day he’d been asked to come in and review it. He knew his conclusion would not be popular with DKI. He suspected they hadn’t even read the report, just turned straight to the conclusion page.
The intercom beeped at the administrative assistant’s station. A robot sat there. It had a boy’s facial features, with high cheekbones, a narrow nose, and blond synthetic hair. The silicone of its face was a Caucasian skin tone, with appropriate color added in the cheeks and the lips. Its expression was almost natural, yet just artificial enough to make people uncomfortable. It was dressed smartly in a jacket and tie. It pushed a button on the desk. The door to the office buzzed once and opened slightly with a soft click. The robot smiled at Sidney and motioned him inside with a carefully masked mechanical hand.
The transition from the area outside Eric’s office to inside was pronounced: from wood paneling to bright white walls and a shimmering glass window wall. The Hudson was visible beyond, gleaming in the daylight. The furniture was severe, all hard steel and glass and angles. The stark walls were unobscured by pictures of any kind. Eric Breckenridge motioned Sidney into one of the chairs without rising himself.
Eric said, “You’ve reached an interesting conclusion, Sidney.”
“Dr. Hermann, please.”
Eric arched his eyebrows. As if he cared how Sidney wanted to be addressed.
“As I said, you’ve reached an interesting conclusion.”
“You think so?”
“No doubt. Interesting and troubling.”
“No doubt.”
Silence between them.
“I’m sorry the report was not what you were hoping for,” Sidney offered.
“On the contrary. It is exactly what I was hoping for.”
“Really?”
“Absolutely.”
“I’m afraid I don’t understand.”
Eric folded his hands. His fingers were long and looked serpentine when he entwined them.
“Your report is exactly what I was looking for. Raw feedback. Emotional feedback. I—we—need that kind of feedback.”
“Then you agree with my conclusion.”
An inarticulate noise escaped Eric. It might have been amusement; it might have been a laugh. Sidney wasn’t sure.
“Hardly.”
Eric stood and walked to one corner of his office. In the morning he, or perhaps his robotic assistant, made a habit of opening the long vertical blinds along the window wall and pulling them back into the corner. There they sat, bunched and swaying in the artificial breeze of the environmental controls. Sunlight spilled through them, casting long stripes of shadow on the wall.
He stood among the shadows.
“I’m confused,” said Sidney. “My report was what you were hoping for but you’re contesting my conclusions?”
“Absolutely.”
“That’s contradictory.”
“Not at all.”
“Really? How so?”
“We needed to get an unvarnished opinion of how the public perceives the robotic doctor. We needed to get this from a legitimate source. A scholar in the field of robotics is an absolute fit. And some of the reactions you recorded were not unexpected. But your final recommendation is suspension of the program and re-evaluation of its merits. This is completely out of the question.”
Sidney shifted in the chair. It was square and modern in its design. It was uncomfortable. He shifted again.
“I think it is in the best interest of both the company and the hospital to terminate the program and reassign robot 78190736—Kilgore, as he is called—to a different environment,” Sidney said. That would not go over well, but Eric seemed a man of directness who would respect directness in return.
Eric’s face darkened. His brows came together.
“Did it misdiagnose a patient?”
“No.”
“Did it fail to act, medically speaking, when action was needed?”
“No.”
“Did one of its protocols fail?”
“No.”
“Did harm or death come to any patient it shouldn’t have under its care?”
“No.”
“Then I fail to see why you feel the program should be shut down.”
His glare deepened into something dangerous. Sidney shifted again in the uncomfortable chair. He had little choice. Explain the gut feeling. Explain or retract the recommendation.
“It made me, well, I guess you’d call it nervous,” he said.
“Nervous?”
“Yes. Nervous.”
It sounded stupid as he said it. He knew it sounded stupid. He couldn’t help it. Nervous was the best he had. Eric looked at him. He felt his face flush.
“All the observations, all the empirical data, all of the information regarding the construction and programming of the unit, and your entire recommendation boils down to ‘nervous’?”
Eric walked back across the office from his spot in the corner by the window. He pulled out his chair, sat down with a deliberate motion, leaned back, and placed his hands behind his head in a gesture of relaxation. What to do, what to do. The program rested on this evaluation. Not the entire program, perhaps, but a recommended shutdown would mean a significant setback. That would not be particularly pleasant to explain to the board. The fat board member in the white suit would be especially pleased. Eric frowned at the thought, then pulled his mind back to the report in front of him and to the man sitting across from him. A great slob of a man, shirttails sticking out from a shirt too small, exposing a softness of pale flesh.
Gently. Don’t scare him off. Trap him. Wrap him up in his own arguments. Burrow the fangs into his quickly beating heart and drink the logic from his argument. Make his argument a shell. Then crush him.
“Could you explain this a little more clearly?” he asked. Gently.
Sidney described his experience. Talking with the robot. Relating to the robot. Observing the robot. Interviewing those who worked with the robot. He came to the moment when Mr. Carroway took his final breath. He described what seemed to be the act of self-preservation on the part of the robot. These things together made him nervous.
Eric sighed.
“I had hoped, Sidney, that you would be more open-minded.”
“I was very open-minded.”
“It doesn’t sound like it. You were picked for a couple of reasons. You are academically qualified. You’ve done fieldwork. Granted, mostly with labor robots.”
He paused.
“You’ve never done any fieldwork with a robot above a level F, have you?”
“No.”
“I didn’t think so. Let me try to put your fears to rest. The look the robot gave you: purely a response protocol. Happens all the time.”
“You didn’t see the look.”
“I don’t have to. I know for a fact that these kinds of protocols are wired right into the units and they pull them up when the situation requires it. You simply got caught up in one of those moments.”
Sidney shook his head, unconvinced.
“Let me help you with your other concern,” said Eric. “Self-preservation is one of the features we program into these units.”
“Even the ones without emotive processors?”
“Yes.”
“But wouldn’t you think self-preservation is tied to emotional responses to immediate danger?”
“Not at all. Animals have a self-preservation instinct. From the most intelligent dolphin to the lowliest sponge, every animal has the urge, the instinct to save itself if danger emerges.”
“Yes, but instinct is one of those elements that only living beings have. How can you program instinct into a robot?”
“The same way we program emotions. It’s incredibly difficult to do, but it’s doable. We definitely program instinct into these units.”
“But instinct is based on the adaptation of generations of observations and experiences made in our exploration of the world. As we or any species moves through the world, we log the external stimuli until they become an inherent disposition toward specific behaviors. It’s almost like having a gut feeling, if you will.”
“That was an awful lot of words to get to ‘gut feeling’, Sidney.”
Sidney frowned at Eric.
“Gut feeling is exactly the point,” said Eric. “Gut feelings and instincts are essential to basic human survival. Fight or flight. What happens when you hear a sound that scares you? You can run, you can freeze, or you can stay and fight. One of the things we try to do is incorporate these instincts into the programming of robotic brains.”
Sidney shifted in the chair. He felt uncomfortable under Eric’s stare.
“So what you’re saying is that, in addition to intelligence and emotions, we can program instincts and gut feelings into newer, higher-end models.”
“Yes.”
“What if, in the interest of its self-preservation, a robot decides to kill a human?”
So close yet still so far, thought Eric. So much to learn. He needs to be exposed to more advanced models. Exposed to their capabilities.
Eric smiled. He had a sudden and wicked idea.
Exposure. It’s all about exposure.
Time to educate the professor.
He stabbed a button on the intercom with his finger.
“Gammons, come in here please,” Eric said.
The intercom crackled the reply. “Yes, sir.”
“Sidney, you must be familiar with the behavior inhibitors,” said Eric with a smile. He took a pair of sharp scissors out of a drawer and placed them at the edge of the desk. Sidney stared at the scissors, then back at Eric.
“Yes, I am.”
“Theories, applications, history, et cetera?”
“Yes.”
“Have you ever seen it in action?”
“No. Few have.” He was not sure where this was going but he didn’t like the tone of Eric’s questions.
The door to Eric’s office opened and in walked the admin robot. It was built as a protocol unit, and as such it was far more valuable to a man in Eric’s position than a simple administrative assistant: footman, valet, perhaps even occasional bodyguard. Most often, though, it served as an admin, and it was in that capacity that it served him best.
Gammons.
Sidney reflected on the first impressions of Gammons he had formed while waiting for Eric to admit him. He assumed that various pneumatics, fine wires, and motors moved the face through a narrow range of expressions. Compared to the vast assortment of expressions he’d discovered on Kilgore’s holographic face, Gammons was a bit of a letdown.
The robot’s voice was higher in pitch than Kilgore’s, specifically designed to be heard above any din if necessary, and specifically directed toward its master, Eric.
“Yes, sir?”
“Gammons. Excellent. Could you explain to my guest, Dr. Hermann, about your behavioral inhibitor?”
“Certainly.” The robot turned toward Sidney. “Embedded in my programming is a logical constraint that keeps robots such as myself from acting in certain ways. Harming humans is one of the situations that calls up the constraint program. Trying to dismantle the program itself is another. Trying to dismantle another robot’s inhibitor is a third.”
“In other words,” said Eric, “these constraint programs kick in if the robot exhibits some type of specific behavior.”
“Exactly,” Gammons continued. “And should we proceed with the action, the program is designed to shut down all power systems and transmit a radio signal to the manufacturer advising that a robot has violated its inhibitor protocols.”
Sidney waved off the robot. “I know all this. I also know that Kilgore gets overrides for those patients—and those patients only—who have signed the end-of-life form and have had the revised protocol downloaded to his system during the re-power cycle.”
“True,” Eric said. “But it would only work on those patients we allow as an exception in the code. Any attempt to harm anyone else would cause failure.”
“So the theories go.”
“You’ve never really seen it,” said Eric.
“No. Why would I? These kinds of shutdowns are so rare that almost no one has. I presume the programmers and the QA guys have seen it, but why would I? Isn’t that what you test and QA for? So that no one ends up ‘seeing’ it?”
Eric smiled at Sidney. He found Sidney’s—what would be the best way to describe it? Rant? Tirade?—amusing, almost quaint. He nodded. He looked at Gammons. “Would you, please?” Eric asked.
Gammons did not respond. Instead it stared without expression at Eric. Eric’s patience grew short and his eyes narrowed. They looked harder somehow. Gammons needed no further command. With the corners of its silicone mouth pulled down slightly, it nodded. Sidney watched the non-verbal communication between them.
He was still wondering what it meant when the robot attacked.
Gammons lunged forward and grabbed the scissors Eric had placed on the edge of his desk. Once they were in its hand, Gammons swung the scissors outward in a wide arc, the point jutting from its closed fist. In the path of the downward swing was Sidney.
Sidney cried out.
As abruptly as it had attacked, Gammons stopped in midair. All movement in its body froze midway through the swing. All motors and servos and gears and joints locked. Its eyes went dead and it hung motionless in the middle of the room.
“That is how the behavioral inhibitor works,” said Eric.
Sidney struggled to catch his breath.
For Eric, the conversation was over.
“That, I think, will be all for today,” he said.
“Are you kidding me?” Sidney panted. “You try to have me killed and then just dismiss me from your office?”
“You were never in any real danger. It was simply a demonstration of how the inhibitor works.”
“You scared the shit out of me!”
“Like any good horror movie or roller coaster would. Again, no real danger.”
“Not the same thing. Not the same thing, not by a long shot. And you know it.”
Eric sighed. Was this overweight blob of a man really going to make a federal case out of this?
“What do you want, Sidney? Do you want an apology? If so, then accept mine.”
There was nothing apologetic about Eric’s voice at all.
Sidney ignored him for the moment. His eyes were now focused on Gammons.
“What happens to it now?”
“Who, Gammons? It’ll be led downstairs to the Foundry floor to be cleaned up and reset.”
“Cleaned up?”
“Yes. There are a series of protocols for resetting a robot whose shutdown was the result of an inhibitor trip. Checking the moving parts, cleaning out the gears, things like that. Surely you’re familiar with them?”
“The IRC protocols you mean? I didn’t realize you followed them.” His breath was almost normal once more.
“Everyone follows the IRC standards. That’s why they’re the IRC. They set the standards.”
“Yes, they are standards, but they’re not laws. There’s no way to enforce conformity.”
“True,” Eric said, “but since we’re members of the International Robotics Consortium, and we even have a few people who sit on the Consortium’s board, it would be bad form indeed to ignore their recommended practices.”