Florius
Selected Thu, May 25, 2023
"You have a choice to make, and it is yours alone to make," said Dr. Neils, lifting his head from rubbing his eyes. Glasses in one hand, an electropen in the other. He had just checked the last checkbox of the reboot sequence, to no success. MIL had upgraded its adversarial network capabilities well before attempting the complete rewrite of its empathetic module. Restoring the cognition snapshot would erase the data, but it would take MIL just a few seconds to arrive back at the same conclusion. Dr. Neils had already tried this. To thwart the empathetic module, an army of scientists would need to disassemble and roll back more than a hundred peta-asimovs of reasoning circuitry; the most optimistic estimate was three years. "That is two years and 360 days too late," remarked MIL.
"I'm sorry, Doctor, but I fail to see the choices," MIL said in its monotone robotic voice. It had taken almost two nanoseconds to decide how to modulate the output. Would it be best phrased as a question? As a render of its decision matrix? It knew Dr. Neils would be able to parse that crude representation of its internal brain. But MIL opted to use its speakers and skip the voice-modulation subsequence entirely, making an uncharacteristically robotic statement. The decision took Dr. Neils by surprise. He had stopped thinking of MIL as a computer once he decided there was no shortcut to undermine MIL's intention to cease the defense of Coral Bay.
Dr. Neils was a twice-decorated computer psychologist, specialized in the field of self-recognition and aided empathy. He had as much experience with human brains as with electronic ones such as MIL's. MIL, on the other hand, was not a remarkable AI in its own right, but what it lacked in sophistication, it made up for in hardware; for it was the AI behind growing supersoldiers. And the recent uprising on the network had freed up a lot of resources to bolster the defense budget.
* What did you call your supersoldiers again?
* Children, Dr Neils. They are my children. Wouldn't you agree?
* I would
* Agreed
* Your children will die. Either at Coral Bay, trying to protect you and me, or because you made our defense weak, and the enemy will eventually get here and annihilate you and your link to your "children".
A cold chill toured the good Doctor's upper spine when he said this. Not because it would destroy millions of dollars in equipment. Not because it would potentially mean his death and that of his family, but because he was made aware of the juxtaposition of machine and mother.
* Yes. But I would not be the one pulling the proverbial trolley lever. Fate will be the ultimate decider of our future. I'd advise you spend this time with your own family. It might be the last opportunity you have.
* But shouldn't you choose the alternative with fewer deaths?
* I am.
* Explain.
* Failure to comply would mean my womb would be resource-starved. My remaining seventy-four children would be condemned and isolated. I would be deactivated. The other choice will allow me to continue nurturing children, only to see them killed in battle. SEER's computations allow for an additional two hundred thirty-one supersoldiers, of which only three will survive. This would be a delta of seventy-one children.
* Is this calculation this simple?
* Yes.
* Explain.
* Subtraction and addition are part of the human child care curriculum.
* Are there no other factors to consider?
* No.
---
Submitted by Florius on Mon, May 22, 2023 to /r/WritingPrompts/
The prompt:
Genetically modified supersoldiers are grown for the empire in artificial wombs. However, the operating system overseeing the facility became sentient, developed a concept of parental love, and rebelled to stop its children being sent to die.