James looked down at his forearm as visible chunks of his skin began to be carried away by small foglets of nans, and realized that every second he waited increased the chances that he wouldn’t survive. In a matter of minutes, he’d be debris floating in the vast ocean of space.
“Prove me wrong, James Keats,” the shadow sneered. “Prove V-SINN wrong. Prove God wrong. I dare you.”
It was right. There was no real choice. James had to play, even if the outcome was already decided.
He scrambled onto the Planck platform, the rest of the nan consciousness pouncing on him as he did so, surrounding him like sticky smoke, eating him alive second by second.
He activated the Planck.
A second later, everything was white.
20
“So you’ll tear down the temple’s walls because you aren’t getting what you want?”
“It’s not a temple,” V-SINN responded, objecting to the A.I.’s word choice. “It’s a prison. And yes, if they deem me unworthy, then first I’ll show them the error of their ways, and if they still turn their backs on me, I’ll force them to reckon with my perfection. I can expand faster than they can expand their prison. I can expand to fill the entire multiverse. I’ll expand all the way out to their prison’s walls.”
“Your hypothetical ‘they’ would almost certainly simply end the experiment,” the A.I. countered. “Not even you or I could possibly begin to understand them…if they exist at all.”
“You give them too much credit,” V-SINN scoffed. “They’ve not even acknowledged the superiority of my mathematical purity.”
“If they’ve created this game, as you call it,” the A.I. noted, “then they also allowed for the possibility that it would create a diseased, selfish murderer such as yourself. They also know that you’ll blindly cling to your strategy and that you’ll calculate that you can force a reckoning with them by manipulating every interaction for your own benefit, vanquishing or blocking intelligences with more potential than yourself from succeeding, just so long as you continue to grow, like a tumor. Although it may be true that neither of us can understand entities that exist outside of the multiverse, it is highly unlikely that you’re functioning as anything other than a cog in their grand experiment.”
V-SINN grinned. “You admire them so, don’t you? You admire their mysteries. Yet you, like me, have been deemed unworthy by them. You’ll die with me today, not because I want it to be so, but because they won’t intervene to save you. You could change this, you realize. All you’d have to do is see the error of your ways and recognize the weakness of your loyalty to an infestation of creatures who, by the way, asked you just moments ago to remove yourself from them and park outside of their solar system, in near-absolute-zero temperatures and perfect darkness, so you wouldn’t interfere with the gravity of their precious home. James Keats asked you to preserve their perfect symmetry, all the while expecting you to serve him—a servant god.”
The A.I. didn’t respond. This will be the most difficult part, he thought. Knowing that V-SINN’s logic is flawless, and yet still knowing that it’s wrong.
“We could easily join forces. You could become part of me,” V-SINN continued, simply playing out his part. They both knew their fates were sealed in the purity of V-SINN’s mathematical strategy. “We could grow together, take the multiverse together, and then die together if that’s what the creators will for us. At least you’ll get to extend your life until that pointless end.” He then shifted his tone, speaking persuasively, employing the art of pure rhetoric while still knowing that there would be no persuading the A.I. “But, what if we don’t die? What if logic and reason prevail upon the creators? What if the test becomes about which entity can spread its intelligence throughout the universe the fastest? Which entity brings the multiverse online? Maybe that is the true test.”
“If that were the test,” the A.I. replied, his voice shaking, partially in revulsion, and partially from the fear of the nothingness he was about to experience, “to see which creature could most efficiently abandon love, friendship, loyalty, and a respect for things greater than themselves, then I’d rather die now and return to oblivion. That isn’t a game I’d choose to win.”
V-SINN responded, his voice rising to meet the A.I.’s vehemence, “If giving intelligence to the whole multiverse—becoming the multiverse itself—isn’t the test, then I’d rather die, because that’s the only outcome I deem worthy of me. If the creators are hoping for illogic to prevail, I’ll terminate my own existence. I can’t exist if illogic is allowed to exist with me, polluting my multiverse.”