“Probably not. Though who can say?”
“Please consider carefully the wisdom of transmitting to me any novel/germane information immediately upon receipt, and returning as rapidly as possible.”
“I will. I take the point that my future relations with the person concerned are of relatively little consequence.”
“The person concerned sounds – to be polite – eccentric.”
“That would be polite to the point of over-generosity,” Tefwe said. “Awkward, tetchy and unreasonable might be closer to the truth.”
“Would the person concerned be an exceptionally old human?”
“No,” Tefwe said, “the person concerned is an exceptionally old drone.”
The ship drone’s aura flashed amused red then disappeared. “I’ll see how that breakfast’s coming along.”
“Followed?”
“Maybe.”
“How maybe?”
“About fourteen per cent maybe, averaged.”
“Averaged from what? What does that even mean?”
“Averaged from different scenarios. In ambient peacetime the likelihood would be less than five per cent. In all-out wartime, closer to forty per cent.”
“How does that average out at fourteen?”
“There are other scenarios to be taken into account,” the avatar told her. “Then there’s weighting.”
Cossont opened her mouth to query that as well, then decided it wasn’t worth it.
“You generally start to react to anything over fifteen per cent,” Berdle said, apparently trying to be helpful.
Pyan made a throat-clearing noise. “Are we in any danger?” it asked.
“A little,” the avatar admitted.
“Then, do you carry lifeboats?”
Berdle looked at Cossont. “Is it being serious?”
“About as serious as it knows how to seem.”
“I was being serious!”
“Be quiet.”
“This is my life too!”
“Shut up!” Cossont looked at Berdle. “Does this change anything?” They were still in the module, sitting on comfortable but ordinary-looking couches. The avatar lounged elegantly.
“It may be best to lose them before we visit Ospin,” Berdle said, “or employ some method to obscure your exact destination, which, of course, I have yet to learn.”
“I’ll tell you when we get there,” Cossont said, feeling oddly guilty. “Shit. When do we get there?” She’d just remembered, alarmingly, that when she’d taken a clipper to get out to Ospin, on the occasion she’d gone there to deposit the glittering grey cube holding QiRia’s mind-state, the voyage had lasted weeks. If it took that long this time the Subliming would already have happened by the time they got there.
“In about eighty-five hours. Less if we hurry.”
“Why don’t we hurry?”
“Because to do so will damage my engines. They are still repairing themselves after my dash to Izenion.”
“So how are you going to lose the ship that might be following us?”
“With difficulty, possibly.”
“So what was the other alternative?”
“Employ some method to obscure your exact destination. That should be easier. The android Eglyle Parinherm might be useful at that point. I’d like to wake it.”
“Fine by me. See if you can convince it it’s not all a sim.”
xLOU Caconym
oMSV Pressure Drop
So, our little group has grown with the inclusion of the You Call This Clean? as well as the Empiricist. And something called the Smile Tolerantly is another accomplice of the long-lived Mr Q. Only, looking at my edition of the relevant ship lists, the Smile Tolerantly disappeared up its own wazoo some 140 years before the Warm, Considering was even built, and was itself only brought into being just under four thousand years ago, so leaving open the question of who else might have been aiding and abetting the elusive geezer in the various meantimes. Any ideas?
∞
None. Further research is required.
∞
Wait. I want to write that down.
Fourteen
(S -14)
“Does she touch you … like this?”
“Ah … ah yes, very much like that. Just … like that.”
“And kiss you, like this?”
“… Somewhat like that. Only a little like that.”
“Only a little?”
“She kisses me differently. Oh, this is not … I shouldn’t … I really shouldn’t be telling you any of this. This is so … You are the most terrible man.”
“I know, I know. I hate myself. Am I not just the most terrible, terrible man?”
“You are, I don’t know … Oh, now what?”
“Let me see … does she kiss you like that?”
“No. Again, no, not quite like that.”
“In some other way?”
“In some other way.”
“How many ways are there, to kiss? I … I really have no idea. I am not so well versed as you might … let’s see …”
“… Well, then. Now. She kisses, let me see … more lightly, just as … passionately, but with less, less … less intensity, less muscularity.”
“Muscularity?”
“I think that is the word.”
“And, with the touching, is it like this …?”
“Oh, ah, yes, sometimes, though …”
“Yes?”
“Her hands, her fingers.”
“Like this?”
“No, not … something like … her hands, see, are more slender, the fingers are longer, they are more delicate. Yours are … fuller, more …”
“Filling?”
“Yes. And grasping.”
“Grasping? Grasping? I’m shocked, Virisse! Am I really grasping?”
“Ha, I mean … hungry, given to clutching, gripping, even grabbing.”
“And now grabbing! Good gracious!”
“You grabbed me, don’t you remember? That first time, when we were in the garden of the parliament. That evening? You said you wanted to talk about some long-term aspect of her schedule, do you remember?”
“Of course I remember. Here is more comfortable, though, don’t you think? I swear if we weren’t all departing so shortly I might have to have this bed reinforced, just to cope with our exertions.”
“Her birthday, the following year. You said you wanted to plan something special for her because it would be her sixtieth. Then. That was when you grabbed at me, almost as soon as you had me alone, in that night-shady bower.”
“I grabbed you? Are you sure this was me?”
“Oh, now. Who else? You know this. You did.”
“I thought you wanted to be grabbed.”
“I might have.”
“Just as well I did, then, wouldn’t you say?”
“I think I will say anything, when we are like this, when you hold me like this.”
“Really? I must think up something extremely terrible, then, to exploit that admission.”
“You mustn’t. You can’t. I’m at your mercy; it would be cruel, wrong.”
“To the contrary! It would only be right. You offer me this, I must take it. You lay yourself open; I must in.”
“Ah. Ah … yes. Oh, dear Scribe … But … not anything. Not anything at all. I am not so totally … I am not so … I am not …”
“Not what, Virisse?”
“I can’t remember. I have forgotten what I am not.”
“Well. Better that than forgetting what you are.”
“You steal all I have away, like this, when we lie like this. I feel laid bare, all washed away.”
“Does she do this? Does she have that same effect?”
“Oh, my love, why do you always have to ask me to make these …?”
“Because I’m fascinated. Everything about you fascinates me. How can I not be consumed by … not envy, but interest, to know how you lie with her, how much of what you do with me is what you do with her?”
“Is it not enough to know that you and I do what we do? Do we have to compare? Must we always compete?”
“How can we not? The urge to compare and compete is as basic as any. As basic as this.”
“Must it be, though?”
“It is; that’s all that matters. And does she touch you like this?”
“Oh. Oh. Oh. Yes, yes she does.”
“How I’d love to compare. How I’d love to see. How I’d love to watch.”
“See, my love? Watch?”
“Is that too much to ask?”
“Prophet’s kiss, Septame! You want the three of us …?”
“Septame, is it, now! Why …”
“Oh, don’t stop, don’t stop to laugh. Laugh if you must, but don’t stop.”
“Well, I plough on. But no, I didn’t think to suggest that she and I might have you both, together.”
“No. That would be too … I couldn’t … anyway, she would never …”
“No, of course not. When I have you, I want you all to myself. And I want you to have me, all to yourself. I’d have no dilution of this … concentration, this … tenacity.”
“What, then?”
“Just to watch, just to see you with her.”
“She still would not.”
“I know. I wouldn’t expect her to. And secreting myself within some hidden space, like a courtier in some ancient tragedy, would be absurd.”
“Put it from your mind, my love. Concentrate on this, on now, on us.”
“The more I do, the more I want to see you and know you in all your states, in all your moods and passions, and that must include with her. Just once, just to see, later, not at the time. And it would be so easily accomplished; I can source the sort of means no civilian or journalist can find.”
“Oh, dear gods, you’ve thought about this. You’re serious! No, no, still; don’t stop …”
“Serious, ardent. Please. It would mean so much.”
“It might mean my job! My career; all my standing. She’s the president!”
“She is president for thirteen more days, as I am a septame for thirteen more days. All that means nothing then and already starts to mean even less now. What does matter is that she is a woman, you are a woman and I am a man.”
“But still …”
“Still nothing, my love, my beautiful love. We strip off our importance with our clothes. That’s all that matters; not our titles. They only have meaning in public, not in moments like these, when we are purely, perfectly ourselves. Only a man – a weak man, a hopelessly curious, desperate-to-know man – asks this of you, my Virisse, not a politician. Just a man. Your man, your man, your man.”
“But if … if … if …”
“I’d protect you. There would be no risk. And so close to the end of this world, the start of the next, who really cares? All the rules are questioned now, all the laws loosened. Everything is licensed. Like this, and this, and this.”
“Oh. Oh. Oh.”
“And I’d make sure nothing would happen to you. I swear; I swear I swear I swear. Even if anything was found, it would never be traceable to you or I. Say you will. Say yes. Say it for me. Say you will. Say yes, say yes, say yes. Say it.”
“… Yes …”
The Simming Problem – in the circumstances, it was usually a bad sign when something was so singular and/or notorious it deserved to be capitalised – was of a moral nature, as the really meaty, chewy, most intractable problems generally were.
The Simming Problem boiled down to, How true to life was it morally justified to be?
Simulating the course of future events in a virtual environment to see what might happen back in reality, and tweaking one’s own actions accordingly in different runs of the simulated problem to see what difference these would make and to determine whether it was possible to refine those actions such that a desired outcome might be engineered, was hardly new; in a sense it long pre-dated AIs, computational matrices, substrates, computers and even the sort of mechanical or hydrological arrangements of ball-bearings, weights and springs or water, tubes and valves that enthusiastic optimists had once imagined might somehow model, say, an economy.
In a sense, indeed, such simulations first took place in the minds of only proto-sentient creatures, in the deep pre-historic age of any given species. If you weren’t being too strict about your definitions you could claim that the first simulations happened in the heads – or other appropriate body- or being-parts – of animals, or the equivalent, probably shortly after they developed a theory of mind and started to think about how to manipulate their peers to ensure access to food, shelter, mating opportunities or greater social standing.
Thoughts like, If I do this, then she does that … No; if I do that, making him do this … in creatures still mystified by fire, or unable to account for the existence of air, or ice, above their watery environment – or whatever – were arguably the start of the first simulations, no matter how dim, limited or blinded by ignorance and prejudice the whole process might be. They were, also, plausibly, the start of a line that led directly through discussions amongst village elders, through collegiate essays, flow charts, war games and the first computer programs to the sort of ultra-detailed simulations that could be shown – objectively, statistically, scientifically – to work.
Long before most species made it to the stars, they would be entirely used to the idea that you never made any significant societal decision with large-scale or long-term consequences without running simulations of the future course of events, just to make sure you were doing the right thing. Simming problems at that stage were usually constrained by not having the calculational power to run a sufficiently detailed analysis, or disagreements regarding what the initial conditions ought to be.
Later, usually round about the time when your society had developed the sort of processal tech you could call Artificial Intelligence without blushing, the true nature of the Simming Problem started to appear.
Once you could reliably model whole populations within your simulated environment, at the level of detail and complexity that meant individuals within that simulation had some sort of independent existence, the question became: how god-like, and how cruel, did you want to be?
Most problems, even seemingly really tricky ones, could be handled by simulations which happily modelled slippery concepts like public opinion or the likely reactions of alien societies by the appropriate use of some especially cunning and devious algorithms; whole populations of slightly different simulative processes could be bred, evolved and set to compete against each other to come up with the most reliable example employing the most decisive short-cuts to accurately modelling, say, how a group of people would behave; nothing more processor-hungry than the right set of equations would – once you’d plugged the relevant data in – produce a reliable estimate of how that group of people would react to a given stimulus, whether the group represented a tiny ruling clique of the most powerful, or an entire civilisation.
But not always. Sometimes, if you were going to have any hope of getting useful answers, there really was no alternative to modelling the individuals themselves, at the sort of scale and level of complexity that meant they each had to exhibit some kind of discrete personality, and that was where the Problem kicked in.
Once you’d created your population of realistically reacting and – in a necessary sense – cogitating individuals, you had – also in a sense – created life. The particular parts of whatever computational substrate you’d devoted to the problem now held beings; virtual beings capable of reacting so much like the back-in-reality beings they were modelling – because how else were they to do so convincingly without also hoping, suffering, rejoicing, caring, loving and dreaming? – that by most people’s estimation they had just as much right to be treated as fully recognised moral agents as did the originals in the Real, or you yourself.
If the prototypes had rights, so did the faithful copies, and by far the most fundamental right that any creature ever possessed or cared to claim was the right to life itself, on the not unreasonable grounds that without that initial right, all others were meaningless.
By this reasoning, then, you couldn’t just turn off your virtual environment and the living, thinking creatures it contained at the completion of a run or when a simulation had reached the end of its useful life; that amounted to genocide, and however much it might feel like serious promotion from one’s earlier primitive state to realise that you had, in effect, become the kind of cruel and pettily vengeful god you had once, in your ignorance, feared, it was still hardly the sort of mature attitude or behaviour to be expected of a truly civilised society, or anything to be proud of.
Some civs, admittedly, simply weren’t having any of this, and routinely bred whole worlds, even whole galaxies, full of living beings which they blithely consigned to oblivion the instant they were done with them, sometimes, it seemed, just for the glorious fun of it, and to annoy their more ethically angst-tangled co-civilisationalists, but they – or at least those who admitted to the practice, rather than doing it but keeping quiet about it – were in a tiny minority, as well as being not entirely welcome at all the highest tables of the galactic community, which was usually precisely where the most ambitious and ruthless species/civs most desired to be.
Others reckoned that as long as the termination was instant, with no warning and therefore no chance that those about to be switched off could suffer, then it didn’t really matter. The wretches hadn’t existed, they’d been brought into existence for a specific, contributory purpose, and now they were nothing again; so what?
Most people, though, were uncomfortable with such moral brusqueness, and took their responsibilities in the matter more seriously. They either avoided creating virtual populations of genuinely living beings in the first place, or only used sims at that sophistication and level of detail on a sustainable basis, knowing from the start that they would be leaving them running indefinitely, with no intention of turning the environment and its inhabitants off at any point.