Chapter 13
TEAM 2, YEAR: 2016
Time Remaining: 185 Days
“Destroy it? That’s absurd! Why would you want to do that?” Owen was aghast to hear that Riley wanted him to aid her in destroying the most abundant source of power the planet had ever seen.
“It’s rather complicated,” said Riley. “Let me try to explain. Back in the late forties—2040s that is—miners came across a handful of white stones as they were cleaning up debris after a routine blast. At first glance, they thought they were quartz. But when they looked more closely, they saw that the stones had a light, slow-pulsing glow. Having never seen anything like it before, the foreman took a sample to the surface so the lab could run tests on it. By the time the foreman got to the lab, word had spread that some glowing rocks had been discovered and people from all over the mine flocked to the lab to get a peek. The geologists knew immediately that there was something remarkably unique and unusual about the stones. To be cautious, they stored the rocks inside a lead-lined container meant for radioactive material, which was good, because the stones were horribly toxic. Unfortunately, it was too late for everyone who had spent more than five minutes around the samples; they all became sick and died the next day. The miners, dead. Geologists, dead. Everybody else who just got a quick peek or passed within a ten-foot radius got sick to some degree but recovered.”
“What did they die from? Is Elevanium radioactive?” Owen flipped through his notes, wondering if he had missed something.
“No, it isn’t. But it emits something that’s unlike anything documented on Earth before. We don’t know exactly what it is, so it’s been coined ‘Elevanium Poisoning.’ To make a long story short, the Elevanium deposit was expropriated by the government, studied and commercialized. By 2051, the owners of homes, apartment complexes, and retail and office buildings could purchase a retrofit kit and power their buildings with an Elevanium-based battery pack. As you hypothesized, it revolutionized power consumption across the country,” explained Riley.
“That sounds like it would resolve a lot of energy shortage problems,” observed Owen.
“It did, but it created a lot of social controversy at the same time,” said Finn. “People were angry because it was pretty expensive to get a retrofit kit. I mean, in the long run, it paid for itself hundreds of times over, but the conversion cost was steep for the average homeowner. Many saw it as just another luxury for the wealthy. Politically, it was a hot topic because it seemed unfair that a country as power-rich as ours had sole control over a nearly infinite energy source. Especially when it was really just a stroke of luck that it happened to be located where it was. World leaders argued that because it was from outer space, it shouldn’t be just one country’s to control.”
“I hear what you’re saying, but surely with all of your time-travelling abilities you could go back in time to when they were researching this and integrate it a bit better? Change some of the mistakes from the first time?” asked Owen.
“No. This is only the first part of the problem. Part B is the even bigger problem,” said Finn. “Robotic technology, as I’m sure you can believe, has progressed a lot between now and 2097. Fast-forward to around 2090, 2091. The world is heavily dependent on robot technology to function and the market is flooded with hundreds of models. Because there is so much competition, prices are low and improved models are launched weekly. You’ll find at least one or two robots in every home and hundreds in offices and businesses. They’re great for a couple of years as long as they can recharge somewhere and have a limited set of task features, like a robot that cleans your home or a garbage bot in an office. They can exist and function without much supervision, but they become obsolete quickly. Not all owners are savvy enough to know how to maintain their robots, so they use them until they no longer function or the manufacturer no longer supports them. It’s really tricky and time-consuming to keep a robot’s programming current, so it’s cheaper and easier to buy a new robot with the latest task programming and newest capabilities.”
“By the time a robot was one or two years old, it had become obsolete and so devalued that people just pitched them,” said Riley. “Robots had become disposable items and the numbers being tossed away were astronomical. Regular recycling facilities couldn’t accommodate the unwanted robots because of their complex construction, so they were sent to landfills. It created a tremendous problem. I mean, these aren’t leftovers we’re talking about. Some of these robots are as big as the average human or larger, depending on their function. You couldn’t walk down any street without seeing robot parts or even whole discarded robots lying in junk piles waiting for garbage collection. So in 2091, the government put the job out for tender. In the end, it was the NRD that collected all the robots from the landfills and created a Robot Recycling Depot where people could drop off their unwanted robots. The volume they accumulated was staggering. To this day, the massive depot spans three city blocks and operates far beyond its intended capacity, continually overfilled with discarded robots waiting to be processed. The piles never seemed to shrink because deliveries of more defunct robots came in from cities all over the country. Countries without recycling facilities shipped their robots here as well, just to be rid of the garbage. Inside the depot, the workers separated the different components of the robots. They recycled what they could, threw out what they couldn’t and melted down the metal content into cubes, which they sold for reuse.”
“It was a big job and no private company wanted to take on something that was expected to yield so little profit,” said Finn. “When no private organizations put in any bids, the government tasked the NRD with it.”
“After the depot had been in operation for six months, it became apparent that the NRD was going to turn a very healthy profit. The government decided to invest the money back into robotic research with the goal of developing robots that were smarter and easier to upgrade. Artificial Intelligence seemed like the only logical solution. Developing robots that could continually learn would solve all of the issues that made the current robots so disposable. They would know how to keep themselves maintained and they would never need upgraded task programming because they would keep learning on their own,” said Riley.
“That sounds like a sustainable solution,” said Owen.
“It solved a lot of problems,” agreed Finn. “But people didn’t believe AI could be done. They were extremely vocal about flushing trillions of government dollars down the toilet on research that so many other organizations had already attempted and failed at.”
“One of the top guys at the NRD, Ian Turner, was convinced he had the people who could get it done and he pitched his plan to the NRD,” said Finn. “They agreed, but he needed to find private investors to fund the rest.”
“So are you telling me that it takes until 2090 to get AI mastered?” asked Owen. “I thought it would have been much sooner than that.”
“Well, more like 2095,” said Riley, stretching her arms above her head and shifting in her chair. “Like Finn said, it had been attempted many times before but no one could successfully get it off the ground. There was always a missing link.”
“So by the end of 2094, the AI Project had gone as far as it was going to go,” said Finn.
“Was it successful?”
“No, it fell flat on its face. Quite literally, in fact,” said Finn.
“Well, the AI Project wasn’t a complete failure,” said Riley. “The robots learned, which was further than everyone else had gotten, but there was still a major problem.”
“If the robots learned, wouldn’t that mean success?” asked Owen.
“Well, yes and no,” said Riley. “The robots’ thought processes were transmitted back to the lab to study their learning progress. The number of things a sedentary robot could calculate in a matter of seconds blew the minds of the engineers. Within seconds of being powered up, a robot had calculated the lab’s size, volume, temperature, humidity, noted changes in air pressure and predicted weather patterns for the next twenty-four hours.”
“All that in just a few seconds? How is that failure? That sounds remarkably impressive.”
“The engineers were impressed by their learning capacity. But the problem was that the robots didn’t do anything,” said Finn.
“So the brain worked but the body didn’t?” asked Owen.
“No, the robots were quite agile, in fact,” said Riley. “The problem was that when they tested the robots’ abilities, they found the robots wouldn’t do anything without instruction. They needed to be told what to do. It’s kind of tricky to determine where the line is between mind-reading and Artificial Intelligence. Humans, for example, learn all the time, but at work they still have managers who check in periodically to tell them what to do and keep them on track. Generally, most humans behave within a set of socially acceptable boundaries. So these were the baseline standards against which the robots would be tested.
“So when they tested the robots, they treated the robots like employees and asked them to do very simple tasks like putting different shaped blocks into matching holes or stacking empty boxes. The robots did as they were instructed but nothing more than that. They were behaving like First-Gen robots.” Riley saw Owen’s face go blank at the term. “First-Gen robots, or First Generation robots, are all of the task-driven robots that came before the AI robots. So these crazy expensive, uber-intelligent AI robots were behaving the same as all the task-driven robots lying wasted in piles at the robot recycling depot. The engineers were stunned and thought the AI programming must be wrong. When they looked at the robots’ thought processes, it showed they had learned as they completed the tasks. The robots calculated more about the little plastic shapes and the corresponding holes than a human could ever imagine to be calculable. The engineers spent hundreds of hours running the robots through similar types of tests and the results were always the same. Their behaviour was nothing like what the scientists and engineers had predicted.”
“Yeah,” said Finn chuckling, “one of the tests was getting a robot to catch a baseball and it wouldn’t engage. The engineer threw the ball and clocked the robot in the head. They analysed the robot’s thought process after the fact. The robot had calculated the size of the ball, the speed it approached, the weight of the ball and, based on the trajectory and speed, where the point of impact on its head would be. But it did nothing about it.”
“It became apparent that the robots weren’t going to do anything unless they were specifically instructed to and this boggled the minds of the engineers and programmers,” said Riley. “The behaviour didn’t fit the agreed-upon definition of genuine artificial intelligence. Going back to the human employee benchmark, most people know that once they’ve completed their task, there is something else they can do. In most cases, they don’t need to be micromanaged. But the robots had to be instructed task after task. The engineers finally decided that maybe more motivational testing was needed to kick-start their activity. They put a robot in a test car and showed it how to operate the car. Then, they identified the wall the car was going to hit, but gave it no instructions. As the car careened down the test strip, the robot did nothing to avoid the wall. No braking, no swerving. It didn’t even put its hands on the wheel. The robot and the car were smashed to pieces. Another robot was taken up in a plane and given a parachute pack. The robot received an explanation on parachutes and how to operate the chute. Then, they tossed the robot out of the plane. The engineers watched the video footage from the camera in the robot’s eyes. All they saw were streaks of blue, green, blue, green as the robot tumbled through the air, then black when the robot smashed into the ground.”
“The data the robots had logged was crazy,” said Finn. “Like the simpler tests, the robots logged a ton of data. The robot in the car calculated the rate of acceleration, the amount of force with which the car would hit the wall and even how fast it would fly through the windshield. The robot that went skydiving knew what the weather was going to be for the next month and predicted the growing conditions for crops for the rest of the season. It knew exactly how fast it was going, how fast it would be going when it hit the ground and, and, and….”
“Why didn’t they do anything?” asked Owen. He had been so mesmerized he had forgotten about his coffee. He took a sip and found it lukewarm.
“All further tests were immediately stopped after the parachute test. These prototype robots were not cheap and they were being destroyed with every test. The programmers checked their code for errors and the engineers reviewed the logic for flaws. When neither team found any problems, they pored over the robots’ thought processes for any clues to shed light on their lack of engagement. The robots would only process the information their sensors picked up and that was it.”
“How did they solve it?” asked Owen.
Finn chuckled. “Would you believe it was a Chinese food delivery kid who figured it out?”
A week after the parachute test, the programmers and engineers closeted themselves in a boardroom to discuss the problems, determined to find the missing link. They worked well into the evening and ordered in Chinese food for dinner. On this night, all of the restaurant’s delivery bots were on deliveries so the food was delivered by an employee. When he entered the boardroom to drop off the food, he saw lines of code, data charts and exploded views of the robots’ mechanical schematics projected in the air above the boardroom table like a bizarre, three-dimensional buffet of data. Mesmerized by what he saw, the pimply teenager asked what they were working on.
One of the exasperated developers humoured the uniformed driver. “We’ve designed robots, but they won’t do anything and we can’t figure out why.” The woman absently pitched balls of crumpled paper across the room into the mouth of a garbage bot. As she threw another crumpled ball, the trajectory was off. The bot zipped to the left and the paper ball sailed smoothly through the open lid and landed neatly inside.
“Maybe they just don’t feel like it,” said the kid jokingly, zipping up his jacket to leave.
The room had gone silent as the exhausted, sunken faces looked at one another. The delivery boy thought his attempt at humour may have hit a nerve. Glad to have already been given his tip, he backed away quietly.
“Oh…my…God…” said one of the engineers. “That’s gotta be it. They’ve got no goddamn motivation. Think about it. The robots think and learn, but they don’t physically do anything until it’s requested of them. If they have no reason to want to do it, why would they? What does it matter to them if they exist or don’t exist if they don’t care about their existence?”
“Needless to say, the delivery kid got a bigger tip and from that conversation, Artificial Emotional Intelligence—or AEI—was born!” proclaimed Finn, throwing his arms wide for dramatic effect.
Riley batted one of his arms out of her way good-naturedly. “The robots needed to feel. With emotion would come interest, curiosity, motivation and desire. So, blah, blah, blah, Ian goes back to the NRD and the private investors and tells them he needs more money. Then, he pitches them on a plan to program the robots with different human personalities.”
“Once the AEI robots were created, the government launched a year-long pilot project in the city. Businesses and individuals could apply for a robot and participate in the project. Soon these AEI robots were working everywhere. Manufacturing, hospitals, restaurants, and so on, working as receptionists, teachers and construction workers, you name it. At first, people didn’t know what to make of them and some folks were genuinely afraid. Broad social acceptance took several months. For most people, they were the greatest thing since sliced bread. Others never warmed to them at all, saying they were unnatural, and refused to work or interact with them,” said Finn.
“I could see how that would be a hard thing for people to get used to,” said Owen. He imagined what his reaction would be if his director came into his office with a shiny robot as his new partner and equal.
“Overall, they were a wild success. The robots were so good at everything they did and they were very likeable. People had become as fascinated with them as the robots had become with humans,” said Finn.
“So did these robots live where they worked?” asked Owen.
“No. That’s another difference between AEI and First-Gen robots. There was a community on the outskirts of the base where they lived,” said Riley.

“Don’t forget that the robots were programmed with human emotions,” said Finn. “They were essentially humans, no different than you or me, except that they happened to be made of metal. So they worked their eight-hour shifts just like the other employees and at the end of the day, they wanted to socialize and be with others like them.”
“It’s difficult to wrap my mind around a robot wanting to socialize,” said Owen.
“It was really weird getting used to it at first but really, most feelings a human can experience, these robots can experience,” said Riley. “And that’s where the problems started. When you look at the fundamental reason for any kind of robot, it comes down to one purpose: improved efficiency. Every robot is created to do something so a human doesn’t have to, or to do something quicker or more efficiently.”
Owen nodded, thinking of the many times, even in his own lifetime, he had heard about lay-offs because workplaces had become automated.
“If the fundamental reason for the existence of robots is to do something better or faster,” said Riley, “can you see where the problem begins?”
“I don’t actually. It all sounds really amazing.”
“Part of efficiency is finding the path of least resistance. If you were a robot, and you needed something, does it make sense for you to work at your job, save money and pay for it? Or go through the proper channels for it? Or does it just make sense to take it? If you take it now, you have it, and that’s much more efficient than waiting three months until you’ve saved enough money.”
“Yeah, okay, but morality is part of what defines a personality. Wouldn’t they have inherited some morals from the human personalities?” asked Owen.
“Yes, and a good observation, but here is the problem. A human’s morals can be shaped by events as they go through life; they aren’t fixed in your personality like, say, intelligence is. For example, you can learn that one plus one is two and that can’t be taken away from you. But it comes down to the old question: is it alright to steal bread if your family is starving? You get mixed answers on it. In fact, you yourself might think one thing, then because of something that happens in your life, your paradigm shifts and you believe something different. Apply that argument to another grey area like ending someone’s life. If you asked a person if it’s right to murder people, their answer is likely going to be no. But if you spin the question and ask if the death penalty should be enforced in a case where someone has murdered thirteen people, you might be surprised how many people answer that question differently. So in the case of the robots, their personality programming frequently conflicted with their core, fundamental programming goal of efficiency.”
“So, the robots’ behaviour degraded to a level below what most people would deem moral,” said Owen.
“Exactly. Ultimately, I believe that the robots’ intentions started out innocent enough, but over time they justified their actions to satisfy their needs. Kind of like how a person who feels underpaid could justify taking a few small bribes, but then finds themselves on the take for millions a few short years later. It’s a slippery slope, and it rarely starts big. Each time the robots went against what they were programmed to believe was moral, their actions would become easier and easier until eventually they no longer felt any guilt or remorse.
“The final contributing factor became apparent at around the four- or five-month mark of the pilot project. Many of the robots began to resent humans because they felt like second-class citizens and the negative feelings spread like a virus through the robot community. They despised being the product of humans and it enraged them that no one dictated when a new human was born, or how many. The death of a human was never decided upon by the government, either. The robots felt like the humans were dictators and that they were created merely as a slave race. So, after a lot of debate, the robots were allowed to run the Robot Recycling Depot. The robots felt that the recycling of a robot was the equivalent of the end of a human’s life. In addition, they would manufacture new AEI robots; the start of a robot’s life. Obviously, the creation of robots was contingent on several factors: the supply of Elevanium, for example, and market demand. The robots didn’t love it but had accepted it, or so we thought,” said Riley. “They took over the production and it went very smoothly, though they had made it known they were unhappy with several terms of the agreement. Being told how many robots they were allowed to manufacture really rankled them, so they silently created their own plan and began by quietly collecting Elevanium.”
“How could they do that if it’s all under lock and key at the NRD base?” asked Owen.
“They stole it,” said Riley. “It wasn’t noticeable at first. Abandoned warehouses and manufacturing plants were broken into, but the break-ins went undetected. The few break-ins that were reported were blamed on gangs or homeless people looking for shelter; no one even thought to check the power system. This went on for a while and other mysterious break-ins began occurring at seasonal buildings, abandoned homes and empty offices. Again, no evidence, no suspects, no charges.”
“The robots also bought a lot off the black market,” said Finn. “They’d buy Elevanium from anyone who was looking to sell, no questions asked. To this day, those initial break-ins really could have been anyone.”
“Where did they get the money for that?” Owen asked.
“Don’t forget, all the robots had jobs. They got paid like everyone else and they pooled it together. The robots had lower expenses than humans. They didn’t have grocery bills, cars, mortgages or investments and they lived and socialized among themselves. The only things they paid for were rent at the domes and maintenance supplies. They have tons of cash. Well, had. Their accounts have been frozen,” said Finn.
“By the end of the year-long pilot project, the robots had secretly created an enormous army ready to deploy,” said Riley.
“Wow. So what happened?” asked Owen.
A shadow crossed both Riley’s and Finn’s faces. “They’ve essentially waged war on the city, intent on collecting as much Elevanium as they can get their hands on. Most of our downtown has been shut down for weeks. It’s too dangerous for people to return to work or live downtown. They’ve crashed at least one city power grid so it’s complete chaos in the north end. Buses and cabs have stalled out. City lights and air traffic guidance systems are down. Every public service that is powered by that city grid no longer runs. Most people have fled the city but for those who have stayed to defend their homes or businesses, the area has become a war zone. So far, we’ve been able to contain the turmoil to our city but it’s just a matter of time before they shift their focus to other major cities. In fact, we suspect that they’re already in other cities, laying the groundwork and buying more black-market Elevanium,” said Riley.
“The irony of the situation is that the robots were programmed to be intelligent and diverse in their thoughts. But they devolved to the point where they had become so fixated on getting Elevanium, it’s like they were programmed to complete that one task only,” said Finn.
“They’re behaving like First-Gen robots,” said Owen.
Riley nodded. “So while we’re sitting here, this city is under attack. We’re here to change past events so we can put a stop to it and hopefully reverse the damage.”
“But where do I come into this? Surely you guys have far more qualified people who know more about this than I do. I’ve never seen the stuff,” said Owen.
“The short answer is that people in the science community believe it’s jinxed,” said Finn.
“Jinxed? Are you telling me that scientists actually think that something is jinxed?” Owen looked perplexed. “Wait, what’s the long answer?”
“Well, maybe ‘jinxed’ isn’t the best term,” said Riley. “Some scientists believe that there is something unique about it—something that they don’t understand that protects itself in some unseen way—in addition to the Elevanium poisoning. Most people who have any previous experience with Elevanium won’t have anything to do with it. Over the years, there have been more incidents than I can count involving the people who have worked with it. In the late forties and early fifties, members of the original team of scientists that studied it began falling ill and dying. There were seven people on the team at the start and it was suspected they had been poisoned. Two of them died and five were so sick that they barely recovered.”
“Was it Elevanium poisoning? Were they exposed to it in some way?” Owen asked.
“That’s what everyone thought at first, but no. Their symptoms weren’t consistent with exposure. But there was no explanation for what made them sick. The research was put on hold while the remaining scientists recovered. When their work resumed, one of the scientists had a car accident and died. I think he swerved to miss something on the road and he hit a tree. Later, a gas leak was found inside the home of one of the other scientists. Fortunately, no one was hurt,” said Riley. “Immediately, one of the scientists quit, saying everything was too coincidental, and wanted no part of it. The three remaining men continued on only to die in a lab explosion a few months later. Much of that knowledge went with them when they died or was destroyed in the blast.”
“That is weird,” said Owen. “So what was it? Was someone out to get them? Did someone not want them researching it?”
“Nobody knows,” said Riley. “There were another seventeen unexplained deaths in the sixties and twelve more in the late seventies, not to mention countless injuries. After all that, no one would touch it.”
“And I’ve only known about this stuff for a few weeks and I just about got hit by a bus today.”
Riley retrieved a small glass jar from her backpack and tossed it to Owen. “Behold, Elevanium.”
“Whoa,” breathed Owen, holding up the jar for closer inspection. Little white stone chips filled the bottom. They reminded him of the decorative crushed quartz that was common in gardens but with more depth somehow. The white colouring seemed partially transparent, though it was hard to tell because of its slow, pulsing glow.
Owen had a thought. “Why can’t you just shrink the whole deposit down with one of those fancy tools and bury it again or toss it into the ocean?”
“That would have been a great idea, but the compression tool doesn’t work on it in its potent state. We can’t shrink it and we can’t make it bigger.”
Owen glanced down at his watch and stood abruptly. “I’m sorry, I’ve got a meeting. I’d love to help you, but I just don’t have the answers you need.” As he handed the jar of Elevanium chips back to Riley, he looked at them longingly, like he was parting with old friends. “I’m sorry. I just don’t see how I can help you.”
The room fell silent. Riley looked at Finn for a long moment. Owen wondered if she was mulling over a thought or having a conversation with Finn telepathically. At this point, he would believe either.
Riley slid her arms through the straps of her pack. She turned toward the door, but not before sliding a business card onto Owen’s desk. “Come by our lab after work. Let us show you what we’ve got going on. If you still think you can’t help us, we won’t hassle you any further.”
Owen picked up the card and read it. He looked up and they were gone.