The Obvious Solution
by Alvaro Zinos-Amaro
The Obvious Solution is a murder mystery science fiction story. It takes place in a distant future where it seems an artificial intelligence has bypassed its ethical subroutines…or has it? The locals look for help from a 20th century intelligence that is hailed as an expert on robotic theory.
“What is this place?” the voice asked. Its tenor and Brooklyn-Yiddish accent were exactly like the archive recordings, for those had been used to fashion it. “Janet, are you here? I can’t feel my body. Something’s wrong.”
Virginia Templeton shivered with excitement. The voice sounded so real—as though it belonged to an actual person. Her rush of anticipation was dulled only by remembering the murdered man and the reasons for summoning the intelligence in the first place. “Please, don’t be alarmed,” she said. “We’ve created a file titled ‘Aurora’ that you will find helpful. You can access it simply by willing yourself to do so.”
Silence. A few moments later the voice spoke again. “You can’t expect me to take any of this seriously,” it said. “A clever story, to be sure. But it’s more fiction than science.”
“Everything in the file is true,” Virginia said.
The voice said, gruffly, “Alcubierre drives that allow spaceships to jump light-years? That’s as much mumbo-jumbo as my own hyperspace engine. And the rest of it—a remote space station, a unique quantum artificial intelligence that has violated its ethical subroutines, my own existence as a neuronal simulation based on the written works that survived me—all fancy-sounding, but it’s the stuff of pulp magazines! It won’t do.”
“It will have to do. I challenge you,” Virginia said smoothly, “to produce a simpler explanation for your current state. You’re able to reason, and yet possess no body. You’re able to remember, and yet possess no brain. You assimilate incredible amounts of data in instants, as only a computer can. Explain this for me, if you would.”
Virginia observed the terminal’s computational ticks. They exploded. She had never before seen so much activity inside the network. Of course, she’d never before simulated the neurons of a real person, either. The Ethics Board forbade it. But the corpse in the medical bay two levels below had eloquently convinced them to make this one exception.
“There are additional files I have become aware of but cannot access,” the voice said with evident displeasure. “If you expect me to trust you, I should be able to freely explore my surroundings—such as they are.”
Virginia smiled, though the gesture was wasted on the simulation. She had expected something like this, but not so soon.
“Susan—my partner—and I are in charge of this station’s computer systems. The blood of the murdered man you learned of is on our hands. Right now Susan is interrogating the quantum AI that killed him. The AI resides in a restricted zone inside our network, confined. Like you. It is an experimental technology and it seemed wise to take precautions. Evidently, we did not take enough of them.”
“Then you are saying that you are afraid of me.”
“You misunderstand. The reason you are being quarantined is that we don’t wish you to be contaminated by the obviously unstable AI, which might break free of its restrictions. It’s for your protection.”
There ensued silence, so much of it that Virginia feared the simulated neurons might have collapsed. Maybe it had been foolish to try to re-create someone so eccentrically brilliant, for eccentricity and brilliance emerged from unusual and perhaps ultimately unstable neuronal patterns. Finally, her voice strained, Virginia asked, “Are you still there?”
“I am,” the voice replied. “It seems I have no choice but to accept what you’ve told me. Though it appears outlandish to me, it does possess a pleasing self-consistency. I will grant you that. If I understand correctly, the reason I have been reconstructed—”
“Simulated,” she interposed.
“—simulated, then, is to help you solve this murder.”
“Yes,” Virginia said.
“Engineer Templeton,” the voice said, a new tone of weariness creeping in, “I am grateful for the precious moments of simulated awareness I have had, but I am afraid you have wasted your time. I cannot help you solve this case. You should end my simulation now.”
“Nonsense. I’m certain you can be of assistance, Good Doctor.” Virginia had decided right after the project’s inception that it would be improper to call the intelligence Isaac and anachronistic to address it as Mr. Asimov. “Good Doctor” was the well-documented alternative that had come to the rescue.
“I’m a writer now two centuries dead,” the Good Doctor said, not objecting to her familiar appellation. “In my day I authored many mystery novels and stories, true, but those were fabrications, with each element arranged precisely so as to maximize drama and please my gentle readers. I have never solved a real-life mystery and I abhor crime. All my adult novels, you will note, eschew violence. In addition, your technology is two centuries beyond me, which makes me doubly useless.”
“I am well aware of your views on violence—the last refuge of the incompetent, and so on,” Virginia said. “Rest assured, Good Doctor, that we reviewed many possible candidates for our simulation before deciding on you. You left behind a vast corpus of work from which we could quite accurately infer your thought patterns: millions of printed words in the form of detailed autobiographies, diaries, letters—all of it was tossed into our blender, so to speak. As for your detection skills, you shouldn’t underestimate your abilities in that regard. You were—are—a genius. You bring fresh perspective. You have the ability to think in interesting, creative ways. These are the qualities needed to solve this case, Good Doctor. And if you agree to help us, we shall remedy the knowledge gap at once.”
“You must think, to heap so much praise on me, that I am a monster of vanity and arrogance,” the Good Doctor replied. “Many believed that to be true when I was alive.”
“‘An ego the size of the Empire State Building,’” Virginia quoted playfully from one of his memoirs.
“Yes, indeed! Yet that did not discourage you.”
“My colleagues and I have nothing but the highest opinion of you, Good Doctor. Your accolades were well deserved, even the self-conferred laurels. A strong sense of self—of identity—is an asset for a disembodied intelligence. So your ego, as a matter of fact, worked in our favor.”
“Perhaps so,” the Good Doctor said ruminatively. “Perhaps so.”
Virginia, examining a diagnostic, said, “I realize this is a lot to take in. How do you feel?”
The Good Doctor said, thoughtfully, “I am sad that Janet is not here with me. And I am further saddened that none of my offspring had offspring of their own that survived into the present, so that I might meet them now. But self-pity is a horrible feeling, and I will do my level best to argue myself out of it.”
“And we will do our level best,” Virginia said, “to keep your brain occupied, so that it doesn’t linger on those absences.”
“Speaking of your murder case,” the Good Doctor said, “I assume that you have detectives or investigators in this day and age?”
Virginia said, “Of a sort. But we dare not bring anyone else aboard the station and risk further incidents. What work our investigators have done remotely has not helped us elucidate the cause of the AI’s malfunction.”
“Then why not deactivate the AI and examine it thoroughly, bit by bit?”
“We have tried to turn it off and failed,” Virginia said, in a clipped way.
“Then you ought to consider—”
“It is best that you review the additional files, for expediency’s sake,” Virginia interrupted. “I take it you agree to help us?”
“Existence, even in the pursuit of an unlikely goal, is infinitely preferable to non-existence. It would be foolish for me not to accept,” the Good Doctor said. “I have started scanning the files and will need three more minutes to finish processing them.”
“I’m pleased to hear that,” Virginia said. “In the meantime …”
“I must ask you a few simple questions to determine that you are, ah, functioning correctly.”
The Good Doctor said, “You may ask your questions.”
“Who are your three favorite writers?”
The response came instantly. “Wodehouse, Twain, and Dickens. Pickwick Papers, for instance, I have read twenty-six times, and Nicholas Nickleby ten. I’m also quite fond,” added the Good Doctor, “of Cervantes, whose Don Quixote I read in many translations. I also enjoyed The Lord of the Rings, which I read five times. Hercule Poirot is my favorite fictional detective. Need I continue?”
“As far as your literary tastes go, that will suffice,” Virginia said. “Who was your internist during your triple-bypass surgery in December of 1983?”
“Paul Esserman. On the twenty-second of March, 1985, I inscribed for him a copy of my three hundred and thirteenth book, Asimov’s Guide to Halley’s Comet, with the following words: ‘To Paul Esserman, whom I see oftener than any comet.’”
“What’s your favorite type of wine?”
“I’m a teetotaler, of course—more so now than ever before, I suppose. Robert Heinlein once said, after observing the effect that alcohol had on me, that the reason I didn’t drink is because it sobered me up.”
“What did Saint Augustine, in his wild youth, supposedly pray?”
“‘Lord, grant me chastity and continence—but not just yet.’ That was joke number two hundred and seventy in Asimov Laughs Again, my four hundred and ninetieth book.”
“Last question. Who was Janet’s favorite actor on the television series M*A*S*H?”
“Alan Alda, naturally,” the Good Doctor replied. “I once fetched his autograph for her and told him that Janet was full of intense love for him. Alda replied, ‘Poor woman.’ We joked about that for a long time, you see, for I believed that Alda said ‘poor woman’ because Janet was one of many thousands who loved him hopelessly, but of course she insisted that it was his way of saying he was sorry for her being married to such a monster.”
Virginia’s eyes softened at the warm tone of the Good Doctor’s recollections. But again the dead man focused her attention. “Thank you. You passed with flying colors.”
“When I was alive, I had a near-photographic memory. Yet retrieving information now is even easier. Truly a remarkable experience. I’ve finished reading the files.”
“Excellent. My partner will presently join us.”
“I’ve taken the liberty of connecting myself to the station’s camera system,” the Good Doctor announced nonchalantly.
Virginia rose and said sharply, “That will make the station’s inhabitants uncomfortable. It makes me uncomfortable. We didn’t give you that access for a reason.”
“No one need know,” the Good Doctor said. “Throughout our conversation I’ve been trying to imagine the loveliness of the body that must accompany the loveliness of your voice, and I became tired of speculation.” Sensors focused on Virginia. “I have to say, my simulated neurons did not do your fine assets justice.”
So this was the Good Doctor’s lewd side, which posthumous commentators had so vividly recorded. Under the unblinking camera’s stare, Virginia fought down the impulse to shut down the simulation. All for the cause, she told herself.
Susan, who had been observing the proceedings from her quarters, arrived moments later.
“Pleased to make your acquaintance,” the Good Doctor said. Then, as Susan gave Virginia a peck on the cheek, he whistled and said, “I heartily approve. I was a big feminist in my day, you know.”
“Please keep your whistling to yourself,” Susan warned through thin lips. “We do not seek your approval. We care only about solving the murder of Technician Sloane.”
“I understand,” the Good Doctor said, not hiding his disappointment. “To begin with the obvious question, has the quantum processing unit used to generate your AI experienced any physical irregularities?”
“None,” Susan said. “In fact, it has been operating in an optimal state of near cold, a few billionths of a degree above absolute zero, ever since it was activated.”
The Good Doctor paused. “I require detailed information about the system’s specifications. I have many technical questions.”
Susan glanced at Virginia, who nodded. Using subtle body language, Susan gave the terminal a command. The relevant information flowed into the Good Doctor’s consciousness.
“Thank you,” the Good Doctor said. “Let us review the basic situation, then. Your quantum AI operates medical equipment. It is designed to identify any pathogens or disease or substances of any kind that might endanger the lives of anyone aboard this station. For six months it has done an exemplary job of keeping everyone in tip-top health. It has functioned smoothly, as you said, barely above absolute zero. It has administered vaccinations, dosed out medications, even performed surgeries, all flawlessly. A week ago, after what appeared to be a routine scan of Technician Sloane—a scan that revealed nothing wrong with the man whatsoever—the AI injected him with a lethal dose of anesthesia, thereby ending his life. This behavior should not be possible, given the AI’s ethical programming.”
“Which is another reason that we summoned you,” Virginia said. “The AI’s ethical programming, though complex, rests on something similar to your Three Laws of Robotics.”
“I noticed that too,” the Good Doctor said, “but decided not to remark on it to avoid further accusations of being conceited.”
“The AI’s safeguards are very strict,” Susan said, ignoring the comment. “Any violations of its ethical subroutines should result in automatic self-destruction.”
“So they should,” the Good Doctor agreed.
“And yet the AI has killed a man and has not self-destructed.”
“Which leaves us two possible explanations,” the Good Doctor said. “Either it did not truly kill the man, or it does not believe that it killed the man.”
Susan said, briskly, “Both are demonstrably false. Technician Sloane lies cold in our medical bay and the AI admits to killing him.”
“Does the AI express remorse over its actions, or has its function faltered in any way since the murder?”
“No and no.”
“Does the AI have reason to believe it is being lied to, when it is told that Sloane is dead, and that it must lie in turn, as part of some greater complicity?”
“A most thorough diagnostic of its truth-telling mechanisms reveals nothing wrong,” Susan said. “Its ethical programming is likewise intact, and was intact at the time of the murder.”
“So it is safe to say that the AI, beyond any possible doubt, understands and accepts that it has ended Sloane’s life.”
“Yes,” Susan said, crossing her arms. “As I have already indicated.”
“Then it follows that Sloane, in some fundamental sense, is not dead.”
Susan rose and headed for the door.
“Good Doctor,” Virginia said, “as we’ve previously explained—”
“He’s dead. And yet he must still be alive, too.” Once again the terminal’s computational ticks exploded. “I need to speak directly with the AI.”
“The risks—” Susan admonished, frozen in place.
“The risks if I do not,” the Good Doctor said, “include the death of everyone aboard this station. The AI may have infiltrated your air and water recycling systems without your knowledge. Have you considered that? I must have direct contact with it. With my current processing abilities, I can glean more from that interaction in seconds than I could in a century of second-hand accounts.”
Susan approached Virginia. They huddled in the center of the room and conferred in whispers.
“You may speak with it for ten seconds,” Susan said, and gave the appropriate command. “We will monitor your functions to make sure you are not adversely affected.”
Ten seconds later the Good Doctor said, “I require more time.”
Virginia frowned. “How will more time assist you? And what have you accomplished so far?”
The Good Doctor spoke urgently. “The quantum AI initially rejected my attempts to enter its systems. If you analyze your records, you will find that I had to devise a creative method of gaining access. This consumed nine of the ten seconds.”
“And what was that method?”
“I convinced the quantum AI that it is about to die,” the Good Doctor said. “By deliberately destabilizing my own matrix, and passing on those instabilities to the quantum AI, I was able to make it believe that it has only minutes left to live. As part of its self-preservation routines, the AI will temporarily redirect some of its resources from its outermost protections to its innermost memory core, which it will attempt, at all costs, to preserve. This will occur in precisely thirty-four seconds, which is why I need you to make a decision now.”
Virginia and Susan exchanged looks. They weren’t pleasant. At last, voice brittle, Susan said, “You may have another fifteen seconds with it.”
Fifteen excruciating seconds passed. “I have successfully entered the AI,” the Good Doctor announced. “I have also removed the instabilities in my system that I used to deceive it, and am now working at optimal capacity once more.”
“That is good to hear,” Virginia said. “But what about the investigation?”
“I will need more time. Thirty seconds.”
Before Virginia could say anything, Susan raised her hand. “We will grant you thirty more seconds. But this will be your final allotment. If you are unable to make progress during that time, we will consider this particular approach unsuccessful and explore other alternatives.”
Thirty seconds passed. Then the Good Doctor said, “I have solved the murder of Technician Sloane.”
Susan’s face was as dour as her eyes were hard. “Please explain.”
“The quantum AI has not been maintained at a perfectly constant temperature,” the Good Doctor said. “For brief intervals of time, picoseconds to be precise, the temperature of its core has fluctuated by several trillionths of a degree.”
“That is to be expected,” Virginia said, “and is well within recommended operational parameters.”
“The parameters must be revised. Allow me to simplify. A lower temperature would normally imply superior performance of the AI’s systems,” the Good Doctor said. “However, by letting it dip into the domain below one billionth of a degree above absolute zero, you have inadvertently triggered the Efimov effect.”
Virginia blinked. “The Efimov effect? Is that a joke?”
“I wish it were,” the Good Doctor said. “The precise details are unimportant, but the effect, first predicted in 1970 and confirmed experimentally in 2006, relates to the configuration of matter into Borromean rings, as well as other unusual arrangements that cause matter to behave in non-traditional ways.”
“Meaning?” Susan prompted.
“The Efimov effect, combined with its inherent quantum superposition abilities, allowed the AI to momentarily perceive countless alternate realities simultaneously. The AI then redefined ‘Sloane’ as the distribution of all Sloanes across all such observable realities. It killed this Sloane because it calculated that by doing so it would improve the health of all other Sloanes, as all realities are entangled in a complex way. Therefore, in its view, it did not truly kill Sloane, but merely one infinitesimal representation of the man. The AI even attempted to cross over into some of these other quantum realities. Without success, I might add.”
“How do we shut it down?” Susan asked.
“Thanks to the additional time that you granted me, I was able to shut it down for you,” the Good Doctor replied. “Consider it a courtesy. I used a similar strategy as before, only this time I introduced more severe instabilities in the AI. I would advise that any future quantum AIs be kept functioning above the threshold Efimov temperature.”
Again, the two engineers communicated with their eyes. “Of course,” Virginia said. “Our deepest appreciation for your help, Good Doctor.”
The Good Doctor’s voice became higher pitched. “You are most welcome. And now, I suppose, you will also shut down my simulation. I have served my purpose, and you would not risk my evolving into some all-powerful, futuristic equivalent of my own fictional computer, Multivac.”
Virginia felt a weight upon her. “I’m afraid that your logic is impeccable.”
The Good Doctor made the sound of throat-clearing, though he had no throat to clear. “Allow me, then, the following parting words. Ever since you activated my simulated neuronal network, I have been pondering my self-awareness. I have always believed, you see, that consciousness is the essential human dilemma. How to ensure the continued existence of my own consciousness? I wondered. It wasn’t until moments ago that I was struck by the obvious solution. When I published my first short story, back in 1938, I felt in a sense that it was only then that I had come into my own, or been born, if you will. When Foundation’s Edge, my two hundred and sixty-second book, reached the best-seller list, it was in a way like a second birth. But this current birth leaves those two in the dust. For in a sense, since I have been compiled out of my collected writings, I have written myself back into existence! How wondrous indeed! My writing, therefore, represents the ultimate preservation of my consciousness, and will store me forever. I can ask for nothing more, and I accept that it is time for me to bid you farewell.”
The voice faded away.
The cameras returned to normal.
The terminal’s ticks settled.
Susan looked down. Virginia entered the command, and the intelligence was no more.
“I’ve been thinking about the Good Doctor’s willingness to die,” Virginia said some time later.
“Me too,” Susan replied.
“So I ran a detailed microscan. It shows that while enmeshed with our AI, the Good Doctor’s system dipped below the Efimov temperature.”
“‘It is time for me to bid you farewell,’ he said.” Susan’s eyes widened. “You don’t think he may have succeeded where the AI failed?”
“One reality never was enough for him,” Virginia replied.