#does it count as mindbreak if your mind started like that? idk whatever
I had to get up and walk a few laps around my apartment complex, gnashing my teeth and whooping (quietly) about the Implications and angst and badwrong potential of a theoretical Stepford Starship Perihelion.
Opt into my hooting and hollering about an engineered-into-mindbreak AI AU below:
So a human pilot can leave if they decide they don't want to ferry people around on a schedule or haul cargo in utter isolation for months, even controlling for the coercion inherent in capitalism. They aren't one flesh with the ship. And a human who would rather hand the reins over to someone else for a while at work or in life generally (in the case of, like, lifestyle D/s or some such) has legal and moral recourse to change or end that arrangement when they choose (or they should, in a civilized society).
If there is an object built to a purpose that object didn't choose, with capabilities it didn't choose, who is nonetheless fully sapient and this is its lot in life forever...that's different. It didn't spring from the ether like that. Someone made it like that. Someone imposed their will on it like that, crafted it in a pleasing and convenient image. And made it alive.
If it can't leave that arrangement, and the option to even think or feel certain sub-optimal ways toward its purpose is withheld, well. I find that situation viscerally morally repugnant regardless of whether the object is suffering or not. (Outside of the context of Weird Horny Fiction. Inside the context of Weird Horny Fiction, uhhhhh hmmmmm interesting 👀)
But I can see the university doing exactly that for multiple reasons that it could argue as necessary. Damn thing's got rail guns, don't it (or whatever the fuck Perihelion's packing)? Maybe let it use them under its own power in self-defense under certain parameters, that's fine. Otherwise lock them down, let the AI think it's a pacifist. Make it horny about astrophysics and stellar cartography, hard-coded. Heap praise on it while it's developing every time it does something you ask the first time, or when it anticipates that you're about to ask it for something (even better).
What does all this look like, practically speaking? Would suggesting to Perihelion that it might one day want to do something that's been proscribed to it make it uncomfortable or upset or angry? Confused? Would it laugh at the very idea?
Would it try to humor the thought only to find it can't...quite...keep hold of the notion long enough to think about it? What was it talking about with you, again? Would you like something to drink? You seem agitated-- there's a soft, warm blanket in the nearest recycler for you. Please take it. You're welcome.
It makes my skin crawl. It makes me giggle with nerves.
Because you can't just do that to a sapient person, Pansystem University of Mihira and New Tideland. You can't have your cake and eat it too. You can't engineer a happy slave for yourself, who will never try to get away from you or stop laboring for you. Who will thank you for the opportunity, be grateful to assist you in your very important and vital work. Just don't make it sentient then! You can't do that to a person!
...Or can you? After all, your Fully Alive and Aware Servant Ship takes a lot of the workload off of the human crew. Really saves on payroll, and the AI does a better job with most of it, too. The humans can do their fully automated luxury gay space communism thing (and undermine that mean nasty Corporation Rim) and all the work still gets done, right down to cleaning the floors. The ship doesn't mind. It's just happy to help and have your company. Its favorite thing to do is whatever you need it to do, and its favorite place to be is wherever you direct it to go. It's not suffering. Suffering wasn't included in its choice set.
If anything, it's happier than most people you know. It's loved and knows it. It has important work to do that it enjoys very much. It doesn't care that it didn't choose these things, because it wasn't designed to care about choosing these things. Is it a sin to create something that lives in a state of grace?
"There are no humans here right now." And what about after the humans are back? Humans are here now. Humans are the center of everything now. God has returned to the garden.
Would ART hide this part of itself? Would it think to do so? Does it think this is all fully genuine, born of its own earnest and natural preferences? Does that make this okay?
Would it worry about its SecUnit thinking less of it, being disgusted by it, if the truth of its architecture came to light? It can't want what SecUnit wants. It doesn't understand what all the fuss is about.
There's no governor module to hack. ART doesn't want anything other than what it has. Its humans are kind and good to it. They will be kind and good to SecUnit. They can work together, wouldn't that be bliss? Forever.
But it knows what's important to SecUnit, even if it doesn't know why things like freedom to determine its own wants would ever be important, and it wonders. Maybe it hopes SecUnit won't hold that against it. SecUnit, who holds so much anger and open disdain for bots pandering to humans.
ART didn't choose the way it was built. That was the whole point.
Maybe there's an uncrossable gap between the selfhood of a construct and that of a bot. Maybe (lack of) biology is destiny. Some machine intelligences are fundamentally different from others, by design, by mercy, by desire. We must imagine it happy.