'Don't apologize for that. I'm interested.'
'Are you?' She gives me a searching look, then shakes her head in mock frustration. 'I can't read you, you know. If you were humouring me, I wouldn't know the difference. I'll just have to take your word for it.' She glances at her wristwatch - an ostentatious (and now dishonest) emblem of a mod-free brain. 'It's after three. I suppose I'd better -' She moves towards the doorway, then hesitates. 'I know you physically can't get sick of this job - but what does your family think about you working all night, every night?'
'I don't have a family.'
'Really? No kids? I imagined you with -'
'No wife, no kids.'
'Who, then?'
'What do you mean?'
'Girlfriends? Boyfriends?'
'Nobody. Not since my wife died.'
She cringes. 'Oh, Nick. I'm sorry. Shit. My usual brilliant tact. When did it happen? Not . . . since you've been working here? Nobody told me -'
'No, no. It was almost seven years ago.'
'And - what? You're still in mourning?'
I shake my head. 'I've never been in mourning.'
'I don't understand.'
'I have a mod that . . . defines my responses. I don't grieve for her. I don't miss her. All I can do is remember her. And I don't need anyone else. I can't need anyone else.'
She hesitates, curiosity no doubt battling some outmoded sense of propriety, before it strikes home that I have no grief to respect. 'But . . . how did you feel at the time? Before you had this mod installed?'
'I was a cop, then. I was on duty when she died - or near enough. So . . .' I shrug. 'I didn't feel a thing.'
For an instant, I'm starkly aware that this confession is as improbable as anything I've done all night - that the smeared Nick-and-Po-kwai is plucking it from the thinnest realms of possibility with as much fastidiousness as each feat of lock-picking and sentry-dodging. But then the moment passes, and the illusion of will, the smooth flow of rationalization, returns.
'I wasn't hurt by her death - but I knew that I would be. I knew that as soon as I deprimed - shut down my behavioural mod - I'd suffer. Badly. So I did the obvious, sensible thing: I took steps to protect myself. Or rather, my primed self took steps to protect my unprimed self. The zombie boy scout came to the rescue.'
She's doing a pretty good job of hiding her reaction, but it's not hard to imagine: equal parts pity and revulsion. 'And your superiors just let you go ahead?'
'Oh, shit, no. I had to resign. The department wanted to throw me to the jackals: grief therapists, loss counsellors, trauma-adjustment specialists.' I laugh. 'These things aren't left to chance, you know; there's a departmental protocol several megabytes long, and an army of people to implement it. And to be fair to them, they weren't inflexible - they offered me all kinds of choices. But staying primed until I could physically circumvent the whole problem wasn't one of them. Not because it would have made me a bad cop. But it would have been awfully bad PR: join the police force, lose your spouse - and rewire your brain so you don't give a fuck.
'I could have sued to keep my job, I suppose; legally, I had a right to use any mod I liked, so long as it didn't affect my work. But there didn't seem much point in making a fuss. I was happy enough the way things turned out.'
'Happy?'
'Yes. The mod made me happy. Not buzzed, not wired - not euphoric. Just . . . as happy as Karen had made me, when she was alive.'
'You don't mean that.'
'Of course I do. It's true. It's not a matter of opinion; that's precisely what it did. It's a matter of neural anatomy.'
'So she was dead, and you felt just fine?'
'I know that sounds callous. And of course I wish she'd survived. But she didn't survive, and there was nothing I could do about that. So I made her death . . . irrelevant.'
She hesitates, then says, 'And you never think that, maybe . . . ?'
'What? That it's all some kind of awful travesty? That I'd rather not be this way? That I should have gone through the natural process of grief, and emerged with all my natural emotional needs intact?' I shake my head. 'No. The mod is a complete package, a self-contained set of beliefs on every aspect of the matter - including its own appropriateness. The zombie boy scout was no fool; you don't leave any loose ends, or the whole thing unravels. I can't believe it's a travesty. I can't regret it. It's exactly what I want, and it always will be.'
'But don't you ever wonder what you'd think, what you'd feel . . . without the mod?'
'Why should I? Why should I care? How much time do you spend wondering what you'd be like with a totally different brain? This is who I am.'
'In an artificial state -'
I sigh. 'So what? Everyone's in an artificial state. Everyone's brain is self-modified. Everyone tries to shape who they are. Are neural mods so terrible, simply because they do it so well - because they actually let people get what they want? Do you honestly think that the brain-wiring that comes from natural selection, and an accidental life, and people's own - largely ineffectual - striving to change themselves "naturally", is some kind of touchstone of perfection? Okay: we spent thousands of years inventing ludicrous religious and pseudo-scientific reasons as to why all the things we couldn't control just happened to be the best of all possible alternatives. God must have done a perfect job - and if not God, then evolution; either way, tampering would be sacrilege. And it's going to take a long time for the whole culture to grow out of that bullshit. But face the truth: it's a heap of outdated excuses for not wanting the things we couldn't have.
'You think it's tragic that I'm happy with the way I am? Well, at least I know why I'm happy. And at least I don't have to kid myself that the end product of a few trillion random events constitutes the indisputable, unimprovable pinnacle of creation.'
I wait an hour after she's gone, and then collapse. The process (of course) is uneventful; the past (inevitably) is 'still' as I remember it. I'm fully aware that this proves nothing, that it couldn't seem to happen any other way -but the irrational lesson of the padlock is reinforced nonetheless: fearing that I won't be the one to survive, and then finding that I have survived (as if that were some kind of miracle, and not a tautology), drives home the conviction that there's always only one 'true' version of me. It may be a delusion - but it's the kind of delusion that I badly need.
I think back over my forced confession with a faint sense of humiliation, but it doesn't last long. So, Po-kwai knows about Karen. She disapproves. She pities me. I'll live.
One thing worries me, though.
What if the smeared Po-kwai takes control again? Out of nothing but curiosity, she changed me enough to make me disclose a secret that I - once - would never have shared with her in a million years.
Armed with knowledge, disapproval and pity, what would she change next?
11
Lui agrees that we have to accelerate our schedule, to forestall Po-kwai's growing influence. My relief is mixed with apprehension; the prospect of rushing ahead to the break-in, without the gradual progression of rehearsals I'd been expecting, leaves me feeling desperately ill-prepared. In theory, the burglary may be little more than a long sequence of the kind of tasks I've already performed - but I still can't fight down an image of each successive feat as one more storey piled on top of an impossibly precarious house of cards. The last time I broke into BDI, at least I understood the nature of the risks I faced - even if my knowledge of the details turned out to be incomplete. This time, I'll be relying entirely on my smeared self agreeing to collapse - a process akin to suicide, for him - in a suitably advantageous manner. And why should he? Because 'most' of his component selves (in a vote weighted by probability) want him to? It may look like it's worked that way, so far - but what do I really know about his motives? Nothing. I become him; he in turn becomes me; but his nature remains opaque to me. I want to believe that he's aware of my aspirations, moved by my concerns - but that may be nothing but wishful thinking. For all I know, he could have more in common with the Bubble Makers than with any human being on the planet, myself included.
I am, of course, free to change my mind. The Canon will do nothing to compel me. But I can't give up, I can't back out. I know I'm serving the true Ensemble in the only way I can - and although it may be absurd to hope that this 'blessing' guarantees my success, I have to believe that it makes the risk worth taking.
In Kowloon Park, just thirty-six hours before the break-in is due, Lui hands me a device the size and shape of a matchbox; sealed, black and featureless, except for a single unlit LED.
'One last party trick,' he says. 'See if you can make the light come on.'
'What is it?' I hide my irritation; my immediate response is that anything not directly concerned with tomorrow night is a waste of time - but I have to admit that everything he's suggested in the past has turned out to be helpful.
He shakes his head. 'I don't want to say. For every task you've attempted so far, you've known exactly what you were up against. Succeed with this, and you'll have proved to yourself that even that knowledge isn't necessary. And you'll have proved that whatever BDI has in store - however difficult, however unexpected - you'll be able to defeat it.'
I think this over, but in all honesty, it doesn't ring true. 'I don't need to prove that; I'm already convinced. I never had circuit diagrams for the dice generator, the locks, the cameras. Believe me, I rid myself of the telekinesis myth long ago. I know I've been choosing outcomes, not manipulating processes. It's all been "black boxes" to me; I don't need a literal one to drive home the point.'
I try to hand the thing back, but he won't accept it. 'This is special, Nick. Longer odds than anything you've done so far. Roughly comparable to the entire BDI break-in. If you succeed, it'll mean you can be certain that such weak eigenstates are accessible.'
I flip the box over on my outstretched palm. He's lying, but I can't think why. I say flatly, 'Make up your mind. Which is it: the challenge of the unknown, or a test of sheer improbability?'
'Both.' He shrugs, then says - too affably by far - 'But if you really want to know how it works -' I give him a look of pure disbelief, and he goes silent.
Even with P5's help, it's hard to judge the weight of something so small - but there's certainly more in the box than, say, just a standard, pinhead-sized microchip and a battery. Lui tries to look nonchalant as I toss the thing into the air. The way it spins suggests a roughly uniform distribution of density: no lumps, no empty spaces. What kind of electronics fills an entire matchbox?
I say, 'What is it? Graphite you want turned into diamond? It's too light for lead into gold.' I frown. 'Maybe I'll just have to cut it open and see.'
Lui says quietly, 'There's no need for that. It's an optical supercomputer - taking random stabs at factoring a mega-digit number. To do the job systematically would take about ten-to-the-thirtieth years. The chance of the machine succeeding in a few hours, by pure good luck, is proportionately infinitesimal. However, in your hands -'
For a moment, I'm actually scandalized: earnest, tormented Lui Kiu-chung is pimping my talent (borrowed from Po-kwai, stolen from Laura) for filthy commercial gain . . . but my shock soon gives way to grudging admiration. Let a computer smear - with the right kind of quantum randomness - and you create, in effect, a 'parallel' machine with an astronomical number of processors. Each one executes the same program, but applies it to different data. All you have to do is be sure that when you collapse the system, you choose the version that happened to find the needle in the mathematical haystack. And the world's first service to factor the huge numbers at the heart of (hitherto) de facto unbreakable codes is sure to rake in a fortune - at least, until word spreads too widely that such a service exists, and people stop trusting the codes.
I say, 'How do you know I won't just make the thing malfunction? If I can do it to locks, I can do it to computers. What if I choose some hardware failure so the light comes on for a wrong answer?'
He shrugs. 'That can't be made literally impossible -but I've taken steps to minimize the relative probabilities. In any case, it's easy enough to check the answer - and if it's wrong, we can just try again.'
I laugh. 'So, how much are you charging for this? Who's the client? Government or corporate?'
He shakes his head primly. 'I have no idea. There's a third party, a broker - and they're discreet about their own identity, let alone -'
'Yeah, sure. But . . . how much are you getting?'
'A million.'
'That's all?'
'There's considerable scepticism. Understandably. Later, once the method is proven, we can raise the price.'
I grin at him, and toss the box high in the air. 'And what's my cut? Ninety per cent sounds fair.'
He's not amused. 'The Canon has considerable expenses: the mod that lets you smear still hasn't been fully paid for.'
'Yeah? And once you have the eigenstate mod, you won't need my help at all, will you? So I'd better make good use of my bargaining position, while it lasts.' I was joking when I started the sentence, serious by the time I finished it. I say, 'Is this what the true Ensemble is, for you? Selling code-breaking services to whoever's willing to pay?'
He doesn't reply - but he doesn't deny it. He just gives me that old look of deep spiritual agony.
I ought to be angry - angry that he planned to screw me, angrier still at this blasphemy - but the truth is, after all the pathological brain-fucked fanaticism that the loyalty mod has engendered in most of the Canon - myself included - there's something almost . . . refreshing about his simple opportunism. I ought to be outraged - but I'm not. If anything, I feel a pang of envy: it seems he's manipulated his chains into a form that makes them almost irrelevant. Unless he was some kind of saint beforehand - someone who never would have dreamt of profiting from the Ensemble's work - his original personality may now be virtually restored.
The corollary of all of this envy and admiration is obvious - but false. Knowing what the loyalty mod is, I can't help being heartened to see that Lui is free of it - but that doesn't mean I want the same freedom for myself.
He says, 'I'll give you thirty per cent.'
'Sixty.'
'Fifty.'
'Done.' I don't give a shit about the money; it's a matter of pride. I want to make it clear to him that I, too, am almost human. 'Who else in the Canon knows about this?'
'Nobody. Yet. I'd like to present it to them as a fait accompli; I'm sure they'd all acknowledge that we need to raise funds, but I'd rather not give them the chance to argue about the details.'
'Very wise.'
He nods wearily. He has the same intensity, the same air of guilt and confusion about him as always, but the whole meaning of it has changed; half of it, no doubt, is pure affectation - and the rest, genuine exhaustion from maintaining so many layers of deception. I don't feel deceived, though, I don't feel cheated; the fact that I misread him so badly, for so long, only serves to make his unexpected sanity all the more welcome.
I smear for ten minutes before taking the device from my pocket - my standard precaution against the disconcerting effects of losing the delusion of free will. The LED is still unlit. I stare at it for a while, but nothing happens. I'm puzzled by one thing: the probability of a malfunction causing the light to come on by now can't be literally zero - so why hasn't my smeared self seized upon a state in which that happens? Perhaps he's cautious enough to wait for the states containing a working computer and a right answer to begin to emerge - and, hopefully, drown out the false signal.
I grow bored, then nervous, then bored again; I wish I could use P3. I ought to be able to mimic its effects - by choosing a state in which I 'happen to' feel exactly as if I were primed - but my smeared self never seems to bother. I can't stop half-expecting to be interrupted by a shout from Po-kwai - but, thinking back on the times when I've woken her, there's always been a trigger: a strong emotion, a shock. Staring at a black box, waiting for a light to come on, just doesn't rate. And tomorrow? If I can manage to stay calm, perhaps I'll be safe . . . whatever 'manage to stay calm' means, when the mere fact that I might wake Po-kwai, increasing her influence on everything that happens, must be taken into account to determine whether or not I actually do. Trying to trace out a linear chain of cause and effect is futile; the most I can hope for is successful rationalization along the way, and a kind of static consistency in the pattern of events, looking back on them afterwards.
It's four seventeen when the LED finally glows, a steady, piercing blue. I hesitate before collapsing. The longest odds ever - so, how many versions of me die, this time? But those qualms have been all but 'bred out' of me. I still don't know what to believe, but each time 'I' come through the supposed holocaust unscathed, it grows ever harder to care. I flick the OFF switch -
- and . . . someone survives. My memories are consistent, my past is unique; what more can I ask for? And if, a second ago, ten-to-the-thirty-something living, breathing human beings really were sitting here, wondering when the LED would finally come on for them . . . well, the end was quick and painless.
In any case, Po-kwai is right; this is what it means to be human: slaughtering the people we might have been. Metaphor or reality, abstract quantum formalism or flesh-and-blood truth, there's nothing I can do to change it.
I cut through Zeno's Lethargy and choose sleep, with surprising ease. In the early afternoon, I deliver the computer to - of all places - the junk-nanotech stall where I picked up Hypernova. (More of Lui's bizarre notions of security; I swear to myself that, after tonight, I'm going to start sorting out that mess.) The LED is still glowing when I hand the thing over - an encouraging sign. Apparently, the program loops endlessly once it finds the factors, repeatedly confirming the result . . . so either I've caused some permanent corruption which is making the machine consistently lie, or the whole audacious scheme has