“It’s difficult to say,” the agent said. “She didn’t stop talking until she hung up. I was unable to tell her I wasn’t you.”
Creek grinned. That sounded like Mom. “Second message, please,” he said.
“From Ben Javna. He is interested in the state of your investigation.”
“Send him a message that I have news for him and I will call him later this evening or tomorrow. Third message, please.”
“You have a message from an IBM server at NOAA. Your software is unpacked, modeled, and integrated. It is awaiting further instructions.”
Creek sat down at his keyboard and put on his monitor glasses; the form of his agent was now projected into the middle of his living room. “Give me a window at the IBM, please,” he told his agent. The agent opened up the window, which consisted of a shell prompt. Creek typed “diagnostic” and waited while the software checked itself for errors.
“Intelligent agent” is a misnomer. The “intelligence” in question is predicated on the agent’s ability to understand what its user wants from it, based on what and how that user speaks or types or gestures. It must be intelligent enough to parse out “ums” and “uhs” and the strange elliptical deviations and tangents that pepper everyday human communication—to understand that humans mangle subject-verb agreement, mispronounce the simplest words, and expect other people to have near-telepathic abilities to know what “You know, that guy that was in that movie where that thing happened and stuff” means.
To a great extent, the more intelligent an agent is, the less intelligent the user has to be to use it. Once an intelligent agent knows what it is you’re looking for, retrieving it is not a difficult task—it’s a matter of searching through the various public and private databases for which it has permissions. For all practical purposes the retrieval aspect of intelligent agency has remained largely unchanged since the first era of public electronic data retrieval in the late 20th century.
What intelligent agents don’t do very well is actually think—the inductive and deductive leaps humans make on a regular basis. The reasons for this are both practical and technical. Practical, in that there is no great market for thinking intelligent agents. People don’t want agents to do anything more than what they tell them to do, and see any attempt at programmed initiative as a bug rather than a feature. At the very most, people want their intelligent agents to suggest purchase ideas based on what they’ve purchased before, which is why nearly all “true intelligence” initiatives are funded by retail conglomerates.
Even then, retailers learned early that shoppers prefer their shopping suggestions not be too truthful. One of the great unwritten chapters of retail intelligence programming featured a “personal shopper” program that all-too-accurately modeled the shoppers’ desires and outputted purchase ideas based on what shoppers really wanted as opposed to what they wanted it known that they wanted. This resulted in one overcompensatingly masculine test user receiving suggestions for an anal plug and a tribute art book for classic homoerotic artist Tom of Finland, while a female test user in the throes of a nasty divorce received suggestions for a small handgun, a portable bandsaw, and several gallons of an industrial solvent used to reduce organic matter to an easily drainable slurry. After history’s first recorded instance of a focus group riot, the personal shopper program was extensively rewritten.
The technical issue regarding true intelligence programming had to do with the largely unacknowledged but nevertheless unavoidable fact that human intelligence, and its self-referential twin, human consciousness, are artifacts of the engine in which they are created: The human brain itself, which remained, to the intense frustration of everyone involved, a maddeningly opaque processor of information. In terms of sheer processing power, the human brain had been outstripped by artificial processors for decades, and yet the human mind remained the gold standard for creativity, initiative, and the tangential inductive leaps that allowed it to slice through Gordian knots rather than attempt to painstakingly and impossibly unknot them.
(Note that this is almost offensively human-centric; other species have brains or brain analogues which allow for the same dizzying-yet-obscure intelligence processes. And indeed, all intelligent species have also run into the same problem as human programmers in modeling artificial intelligence; despite their best and most logical and/or creative efforts, they’re all missing the kick inside. This has amused and relieved theologians of all species.)
In the end, however, it was not capability that limited the potential of artificial intelligence; it was hubris. Intelligence programmers almost by definition have a God complex, which means they don’t like following anyone else’s work, including that of nature. In conversation, intelligence programmers will speak warmly about the giants of the field who have come before them and express reverential awe regarding the evolutionary processes that time and again have spawned intelligence from nonsentience. In their heads, however, they regard the earlier programmers as hacks who went after low-hanging fruit, and evolution as the long way of going around things.
They’re more or less correct about the first of these, but way off on the second. That second belief, at least, is entirely understandable. An intelligence programmer doesn’t have a billion years at his disposal to grow intelligence from the ground up. There is not a boss born yet who would tolerate such a long-term project involving corporate resources.
So intelligence programmers trust in their skills and their own paradigm-smashing sets of intuitive leaps—some of which are actually pretty good—and when no one is looking they steal from the programmers who came before them. And inevitably each is disappointed and frustrated, which is why so many intelligence programmers become embittered, get divorced, and start avoiding people in their later years. The fact of the matter is that there is no easy way to true intelligence. It’s a variation on Gödel’s Incompleteness Theorem: You can’t model an intelligence from the inside.
Harris Creek had no less hubris than other programmers who worked in the field of intelligence, but he had had the advantage of peaking earlier than most—that Westinghouse science project of his—and thus of learning humility at a relatively early age. He also had the advantage of having just enough social skills to have a friend who could point out the obvious-to-an-outside-observer flaw in Creek’s attempt to program true intelligence, and suggest an equally obvious if technically difficult solution. That friend was Brian Javna; the solution was inside the core data file that the IBM machine at NOAA had spent a day unpacking and building a modeling environment for.
The solution was stupidly simple, which was why no one had bothered with it. It was damn near impossible, using human intelligence, to make a complete model of human intelligence. But if you had enough processing power, memory, and a well-programmed modeling environment, you could model the entire human brain, and by extension the intelligence created within it. The only real catch was that you had to model the brain down to a remarkable level of detail.
Say, on the quantum level.
The diagnostic had stopped. Everything checked out.
“Agent,” Creek said. “Inside the IBM you’ll find a file called ‘core.’”
“I see it,” the agent said.
“I want you to incorporate the file and integrate it with your existing code.”
“Yes, sir. I note that this data will substantially change my capabilities,” the agent said.
“Yes it will,” Creek said.
“Very well,” the agent said. “It has been a pleasure working with you, sir.”
“Thank you,” Creek said. “Likewise. Please execute the integration now.”
“Executing,” the agent said.
The change was not dramatic. Most of the big changes happened in the code and had no visual output. The visual change was itself not substantial; the image became that of a younger man, and its facial features rearranged themselves subtly.
“Integration complete,” the agent said.
“Please shut down the modeling environment in the IBM and have it pack itself back into its memory cube here and encrypt,” Creek said.
“Packing has begun,” the agent said.
“Run a self-diagnostic and optimize your code,” Creek said.
“Already started,” the agent said. “Everything’s peachy.”
“Tell me a joke,” Creek said.
“Two guys walk into a bar,” the agent said. “A third guy says, ‘Wow, that must have hurt.’”
“Yup, it’s you,” Creek said.
“Yup, it’s me,” the agent said. “Hello, Harry.”
“Hello, Brian. It’s good to see you.”
“It’s good to see you too, man,” Brian Javna said. “Now maybe you can tell me a couple of things. Like how the hell you got so old. And what the fuck I’m doing inside your computer.”
Chapter 5
At 4:22 a.m. Vernon Ames’s coyote alarm went off. Ames was awake instantly, whacking the alarm before it beeped a second time and woke Amy, his wife, who didn’t at all take kindly to being woken before she had her full eight hours. He slipped into the clothes he’d left in a pile by the bed, and left the room by way of the master bathroom because the bedroom door creaked loudly even (especially) when you were trying to open it quietly. Amy really didn’t like to be woken up.
Once outside the bathroom door, Ames moved quickly. His experience with the coyotes told him that the window of opportunity with those furry bastards was small; even if he managed to keep them from taking off with a lamb, they’d still take a bite out of the necks of some of his sheep, just for spite, as he ran them off. The key to getting the coyotes was to get them early, while they were still on the periphery of the herd, having a community meeting about which of the sheep they were planning to take.
Ames pressed his thumb to the lock of his gun cabinet to get his shotgun and his shells. While he was loading the shotgun, he looked over the perimeter monitor to see where the coyotes were lurking. The monitor had three of them out near the edge of the creek. It looked like they had stopped for a drink before they went for the main course.
Ames could also see from the monitor that the coyotes were larger than usual; hell, they might even be wolves. The Department of the Interior people were making one of their occasional attempts at reintroducing wolves to the area. They always seemed shocked when the wolves “disappeared” within a few months. The sheep ranchers were smart enough not to leave the carcasses lying around. Wolves were a temporary problem, easily fixed. Coyotes, on the other hand, were like rats bred with dogs. You could shoot ’em, trap ’em, or poison ’em and they’d still keep coming back.
Which was why Ames had splurged on the coyote alarm system. It was a simple enough setup: Several dozen motion detectors planted on the perimeter of his land tracked anything that moved. His sheep had implanted sensor chips that told the system to ignore them; anything else was tracked. If it was large enough, Ames got an alert. Just how large something had to be before the alarm went off was something Vern had to calibrate; after a few early-morning false alarms, Amy made it clear that any more would result in Vern’s head meeting a heavy iron skillet. But now the system was in the zone and, aside from the occasional deer, reliably alerted Ames to the coyotes and other large predators. It spotted a mountain lion once; Ames missed that shot.
Ames rooted through the junk drawer to find the portable locator and then slipped out the back door. It was a five-minute hike to the stream. It did no good to drive over to the coyotes, since they’d hear the engine of his ATV and be long gone before he got there, and then he’d just have to come out and try to shoot them some other time. The coyotes could hear him coming on foot, too, but at least this way he had a chance of getting close enough to take a shot before they dispersed. Ames padded to the stream as quietly as he could, cursing silently with each snapped twig and crackling seed pod.
Near the creek, Ames’s portable locator started to vibrate in his jacket pocket, a signal that one of the coyotes was very close by. Ames froze and hunkered down, so as not to spook the intended recipient of his shotgun blast, and slowly pulled out the locator to get a bead on the nearest coyote. The locator showed it behind him, heading for him and coming up fast. Ames heard the footfalls and the whisper of something large swiping against the bushes. He turned, swung his shotgun around and had just enough time to think to himself that’s no coyote before the thing stepped inside the barrel length of the shotgun, grabbed his head with one paw the size of a dinner plate, and used the second paw of roughly similar size to slug him into oblivion.
Some indeterminate time later Ames felt himself kicked lightly back into consciousness. He propped himself up with one arm, and used his other hand to feel his face. It felt sticky; Ames pulled his hand back to look at it. His blood looked blackish in the quarter moonlight. Then someone stepped in between him and the moon.
“Who are you?” a voice said to him.
“Who am I?” Ames said, and as his tongue moved in his mouth he could feel the teeth loosened by the hit that had knocked him out. “Who the hell are you? This is my property, and those are my sheep. You’re trespassing on my land!” He struggled to get up. A hand—a normal-sized hand, this time—pushed him back down to the ground.
“Stay down,” the voice said. “How did you know we were out here?”
“You tripped my coyote alarm,” Ames said.
“See, Rod,” another voice said. “I told you that that’s what those things were. Now we gotta worry about cops. And we’re not even near done.”
“Quiet,” the first voice, now named Rod, said, and directed his attention back to Ames. “Mr. Ames, you need to answer my question honestly now, because the answer will make a difference as to whether you make it through the rest of the night. Who gets alerted when your coyote alarm goes off? Is it just you, or does it notify the local law enforcement as well?”
“I thought you didn’t know who I was,” Ames said.
“Well, now I do,” Rod said. “Answer the question.”
“Why would it alert the sheriff?” Ames asked. “The sheriff’s office doesn’t give a damn about coyotes.”
“So it’s just you we have to worry about,” Rod said.
“Yes,” Ames said. “Unless you make enough noise to wake up my wife.”
“Back to work, Ed,” Rod said. “You’ve got a lot of injections to make yet.” Ames heard someone shuffle off. His eyes were finally adjusting to the dim light and he could make out the silhouette of a man looming nearby. Ames sized him up; he might be able to take him. He glanced around, looking for his shotgun.
“What are you doing out here?” Ames asked.
“We’re infecting your sheep,” Rod said.
“Why?” Ames asked.
“Hell if I know, Mr. Ames,” Rod said. “They don’t pay me to ask why they have me do things. They just pay me to do them. Takk,” he said, or something like it, and from the corner of his eye Ames saw something huge move in his general direction. This was the thing that had knocked him out. Ames slumped; in the shape he was in he couldn’t take two guys at the same time, and he absolutely couldn’t take on that, whatever the hell it was.
“Yes, boss,” the thing said, in a high, nasal voice.
“Can you handle Mr. Ames here?” Rod asked.
Takk nodded. “Probably.”
“Do it,” Rod said, and walked off. Ames opened his mouth to yell to Rod, but before he could take in a breath, Takk leaned over and grabbed him hard enough that the air bursting out of his lungs made an audible popping sound. Takk turned slightly into the moonlight, and Ames got one good look before he went somewhere warm, wet, and suffocating.
Brian became aware instantaneously with the knowledge of two things. The first: He was Brian Javna, aged 18, senior at Reston High, son of Paul and Arlene Javna, brother of Ben and Stephanie Javna, best friend to Harry Creek, whom he had known since first grade, when they bonded over a paste-eating contest. The second: He was also an intelligent agent program, designed to efficiently locate and retrieve information across the various data and information nets human beings had strung up over the years. Brian found these two generally contradictory states of being interesting, and used the talents derived from both types of intelligent experience to come up with a question.
“Am I dead?” Brian said.
“Um,” Creek said.
“Don’t be coy,” Brian said. “Let me make it easy for you. When you wake up with the knowledge that you’re a computer program, you figure that something’s gone wrong. So: Am I dead?”
“Yes,” Creek said. “Sorry.”
“How did I die?” Brian asked.
“In a war,” Creek said. “At the Battle of Pajmhi.”
“Where the hell is Pajmhi?” Brian asked. “I’ve never heard of it.”
“No one ever heard of it until the battle,” Creek said.
“Were you there?” Brian asked.
“I was,” Creek said.
“You’re still alive,” Brian said.
“I was lucky,” Creek said.
“How long ago was this battle?” Brian asked.
“Twelve years ago,” Creek said.
“Well, that explains how you got so old,” Brian said.
“How do you feel?” Creek asked.
“What, about being dead?” Brian asked. Creek nodded. Brian shrugged. “I don’t feel dead. The last thing I remember is standing in that quantum imager, and that feels like it happened about five minutes ago. I’ve got part of myself trying to wrap my brain around it, and another part of myself trying to wrap my brain around the fact that my brain isn’t real anymore. And yet another part noting the fact that I can fully concentrate on several mental crises at once, thanks to my multitasking ability as an intelligent agent. And that part is going: Cool.”
Creek grinned. “So being a computer program isn’t all bad,” he said.