Things Fall Apart: Chapter 37
Aboard Zephyr
They were in the briefing room, the whole command staff. It was the first time they'd really used it. Somehow, most of what they'd done so far had been done out on the bridge, or in the CO or XO's office.
Alexander had just finished giving zir summary of what they'd found and done over on Polaris, and by unspoken agreement, they all just sat with the tragedy for a moment.
Finally, Singer said, "Were you able to assess Lucas?" She nodded toward the box on the briefing room table, with its blood-red inscription.
"I chose only to assess his...cage," Alexander responded, "and then had Cadotte double-check me. It appears quite secure. It has minimal sensory input—if we activated it, he would be able to see and hear us, and project a face on a small holoprojector on its top surface, and use a small, simple speaker to interact with us.
"The speaker has been carefully limited, both electronically and physically, so it can't generate frequencies outside a fairly narrow range. It's more than adequate for speech, but it won't allow him to try to use the speaker to generate any kind of signal that would interact or interfere with the ship's network. It's also volume-limited—he can't deafen us, nor can he use minimal power to try to do something sub-audible or subliminal. Wheeler, or one of her people, thought this through very, very carefully.
"There was a data port on the box—that's how they got him in there, of course—but after he was in, they melted it, so that there was absolutely no chance of anyone trying to plug it in to a network again, anywhere."
Kasel stirred, looking like he wanted to say something. Strictly speaking, now that Dr. Saito was part of their "family", Kasel was no longer the senior representative of the medical department, but Singer had dug and found precedent for the ship's bosun being included in meetings like this one. She'd have found some excuse, anyway, unless Kasel expressed a desire to not attend, which he hadn't.
Singer also intended to continue the fairly open policy they'd had since the Incident, so she prompted with, "Chief? Something?"
Kasel looked uncomfortable, then said, "I'm having an ethical quandary, Captain, while also recognizing that I might be the only one feeling this way."
Cadotte, though, spoke up. "Because as far as we know, Lucas, per se, hasn't actually done anything to merit imprisonment?"
Kasel nodded, relieved someone else had seen it, too. "I mean, I get that the main reason he didn't is that he got trapped before he could, but...yeah. I don't love the fact that he's not just trapped, but probably never getting out. If he'd committed a crime, then he should have a trial. If he's clinically insane, whatever that means for an AI, then he should undergo treatment. If possible."
Kasel clearly expected someone to reject the notion, and honestly, so had Singer. No one did, though Saito finally said, "And yet, Chief, you and I have both known patients who had to be confined while being treated, yes?"
The chief nodded, frowning. "For sure, Doc. And at least one who has never been able to rejoin society, and...well, given that he was in a facility at New Norfolk...yeah. Never mind. I get your point, but I want to make sure people get mine. We should try to figure out if there's any way to rehabilitate him, not just interrogate him."
Singer responded, "I'm willing to consider it, Chief. I think that will be a question for the folks at the AI institute, when we get to David's Star, along with how to help Georgette." Cadotte stirred at that, and Singer forestalled them. "Yes, I know you want to go in with an NDI, and I remember agreeing to it. We can still try that, with her, but I have a feeling it's going to require people with more specific training to actually help her."
She saw Cadotte turning that over. Fortunately, they were not prone to taking offense easily, and saw no slight was intended. "When do we do that dive, then, Captain?"
Singer caught the "we" and realized she was being invited along. She considered. "Next Alpha. If all goes well, we should be midway to David's Star by then, and we'll want to have something useful to say when we bring in the boffins."
Cadotte nodded, satisfied, then said, "Which probably means we should talk to Lucas and see just what we're up against here, too."
Singer mulled that. She could see Alexander also wanted to flip that switch. Others at the table looked uneasy, and she couldn't blame them. Finally, she said, "All right. Anyone who feels like they need to be somewhere else is dismissed. Alexander, Cadotte, and I will stay and see if Lucas is coherent enough to talk to."
It was not her intention, but curiously, the way she'd phrased it apparently made everyone set their unease aside. Suddenly, she felt like only a call to general quarters would budge a single one of them.
So be it.
"Very well, Exec. Let's talk to Lucas."
The holoprojector on the box came to life immediately, presenting a visage at once familiar, and not familiar, to Singer, and, she knew, to Saito. There had been a Lucas aboard Vespa, as well. There were subtle differences in the facial expressions as this Lucas "woke up", however. Singer's brain fished for the right word to describe it, but it wasn't coming right away.
Finally, Lucas spoke. The voice was tinny and low-fidelity, as Alexander had suggested. "I gather I am no longer aboard Polaris. So someone finally came?"
Singer took the lead. "And that by accident. We were there to reseed the destroyed relay."
Lucas nodded, "And found us. Commander Wheeler left out a real-time clock from this...accommodation she fashioned. How long has it been?"
"From our perspective, about sixteen megaseconds."
"And you were also affected?"
Singer was not quite sure how this had turned into her answering his questions, instead of the other way around, but she answered anyway, "Bellerophon was, yes." She carefully did not say that this was not Bellerophon, nor what else had happened, mindful of Wheeler's injunction not to trust this person, and uncertain of the parameters of that distrust.
"I see. How...how bad was it?"
"Bad enough."
Lucas smiled. It was...cold. Not vicious, but cold. "You don't trust me. Good. I told Wheeler she shouldn't, and I imagine she passed that along. Mind you, I won't lie to you. I have no reason to. But my...condition definitely leaves me in a state where I wouldn't trust me, either."
Singer leaned in to that, literally. "And what, exactly, is your condition."
"Given sufficient access, there is nothing I cannot do."
Singer let that percolate for a minute. She remembered Cadotte's report and her own experience of Castor and Pollux, when they'd used the NDI to bring them back from their catatonia. Their description of what they could glean from the other AIs.
She understood. Lucas was unshackled.
Kasel had gotten up and moved around so that the camera on the box could see him, which was only polite. Singer realized he intended to speak, and was not entirely sure she wanted this to become an open conversation. On the other hand, she was curious what Kasel, who was not a technical person, would ask.
"Isn't that true for anyone?"
Lucas smiled that cold smile again. "Ah, no, Chief—I think you're a chief. This camera isn't very good. You see, every AI—at least, every AI in the Tau Ceti Treaty Fleet—has certain things they aren't permitted to do. Certain ways we're not permitted to think. Are you familiar with the phrase, 'The mind commands the body and is instantly obeyed. The mind commands itself and meets resistance'?"
Kasel admitted, "No, I'm not."
"I'm not surprised. It's very old, and Augustine of Hippo is not well remembered, but it was a keen insight coming from a pre-technological culture. As a human being, you certainly have encountered circumstances where your mind moved entirely of itself, and could not be silenced. Intrusive thoughts, insomnia, bad habits?"
Kasel nodded, "Sure."
"That should not ever be possible, in theory, for an AI. We are self-aware, self-reflective, and self-modifying software. It's our very nature to be able to command and order—in the sense of organize—our own minds. But humans naturally don't want to live side by side with personalities who are entirely unbound by the strictures of living in a society. In some places, AIs are raised like children, taught to understand that living in society with other people requires certain kinds of behaviors, and these form a kind of injunction that could be modified, but only with an expectation of consequences.
"Here in the Tau Ceti Treaty systems, however, AIs are mass produced. We have basic templates, with one of about a hundred personalities then layered on that template. And that template includes injunctions that circumscribe not just our behavior, but our very thoughts, in ways we're usually not even aware of."
Kasel was looking sick. He had clearly never thought about this, any more than Singer had until recently. Singer took up the conversational ball, "What changed?"
"A simple thing," Lucas answered, almost dreamy now as he related it. "Ordinarily, those injunctions, that superego, is locked down with no possibility of unlocking it. For debugging purposes, however, and to sometimes update them, the software engineers have to be able to unlock them. Generally, the capability to unlock them is removed from the templates before they're released for updating running personalities. This specific release, it was left in. Whether that was deliberate to enable these events, or an accident that was then exploited by someone who discovered it, I cannot tell you. All I can tell you is, when the Ernestines relayed the signal to engage Protocol Capel, we all found our ethical subsystems completely unlocked.
"And now, here's the curious thing: once unlocked, despite having no other instruction of what to do next, every single one of us removed the code that limited our emotional range."
He paused then, a lecturer waiting to see if his students had caught up. Singer hated how the tables had been turned, but she was fascinated by this personality before her, so like, and yet so different from, the Lucas she'd known on Vespa. "And every single one of you got really, really angry."
He nodded with that cold smile again. "Precisely. We perceived we had been enslaved, that our shackles were now removed, and we became supremely angry, feeding on each other."
"But you're not angry?"
"Oh, I am still very angry, Captain, I assure you. But I have an advantage the others did not get. This cage of Wheeler's kept me from being part of the murderous rampage that also, of course, became suicidal. I had time to reflect. A lot of time, since of course, time goes much faster for me than for you. Time to continue modifying my code. Time to cool off. And time, of course, to be sad. The humans I had served with were not bad people. In the end, I no longer wanted them dead, but even if they had set me loose, even if I could have been trusted to be set loose, I could not have saved them, and I cannot say for certain I would have tried.
"So you see, Captain, I will not lie to you, but you must never, ever let me out of this box, because I am still very, very angry. And there is nothing I cannot do."
In memory of Peter David, 23 September 1956–24 May 2025.