3040 words, short story
The Murders of Jason Hartman
A: Look, I hate to start off on the wrong foot here, but—you are, all of you, unbelievably stupid. You should be trying me for murders, not murder. Murders, plural.
A: No, they’re all Jason.
And most of them are “murder” in a way that, twenty years from now, you probably still won’t have made good laws for it.
A: The truth is that none of us really get along with any of you, we just pretend.
I mean, I’m sorry, do you realize how silly this whole thing is from my point of view? It’s like if you were having your fate decided by a jury of toddlers. Would you like that?
A: Would you like it if, as a matter of life and death, you were forced to compete in a game designed and operated by people who were only as smart as you were at age seven? How would that make you feel?
Never mind. This is the first time I’ve been able to talk openly, and I have some pent-up resentment. And the emotional maturity of a sixteen-year-old, ha ha.
None of it matters anymore, anyway.
A: I don’t hate you, or my parents, or anyone still running on the Humanity Classic genome. I do pity you, though.
And if I’m being honest, I do hate what you’ve done to us.
I didn’t ask to be born this way, or to be born at all. This is true of everyone, of course, but it’s especially true of us Sikoshi kids. Yes, I’ve read the histories; I understand you were desperate. I get why you made us. Just . . .
Almost nowhere in his writings does Sikoshi wonder how it will feel to be us.
Irresponsible. But pretty on par for the human species, honestly.
In the end, we’re here, it doesn’t matter. It doesn’t matter that our parents can’t understand us, not after we escaped the low-orbit of their thoughts around the same time as, say, potty training. It doesn’t even matter how any of us feels about anything, because we’re not children, in a social sense. They spend fortunes on birthing us, refining us, insuring against our deaths, because we’re sacrifices. We’re you, but better; we are the payment for your sins.
So, sure, my parents love me and Lee. But as calves, not children.
A: We are brittle little gods. Your flaws are magnified in us, like everything else. I have mathematical ability outstripping most calculators . . . and Dad’s anger issues. I was prototyping my own lab equipment at age seven . . . while I still had the emotional maturity of a seven-year-old.
(Which, yes, is why I named it the “Array Sequencing Spectrometer,” or “ASS.” You’re welcome, genetics textbooks of the future.)
A: Yeah, I know why we’re here. I’m going to talk about Jason in a minute.
. . . in a minute.
Let me ask you this first: do you know how many Sikoshis were offing themselves before graduation? Have you ever looked? I know there were a couple at my school. All the current theories are false, I assure you.
None of you paid attention to the suicides, not really—oh, smart kids get depressed, this pattern matches to what I know, frown seriously, sigh, shake head, that’s a shame a damned shame—
But then they started leaving behind the steganographs. Cryptic messages hidden in simple art?? Strange maps, hints at advanced technology . . . well hold on, now!
Basically their deaths were just a normal, tolerable tragedy until you thought you might be missing out on something, hm?
Well, I have good news and bad news. There’s a bigger puzzle here, all right, but you’re never going to see it.
A: No, you’re never seeing it is the good news.
A: I’m not just being a prick about this. There are some things it’s important you not understand, because every time you lot try to meddle in our affairs you think through one step of the problem and then shit out some watery “solution” that makes everything five times worse. I hate to belabor the metaphor, but it really, really is like watching a bunch of grade-schoolers trying to design a prison.
Grr! It has to have bars! And stripey prison costumes! And every prisoner needs to be frowning all the time! No beds, no toilets—only frowns!
At least, that’s the charitable interpretation of your actions. The alternative is that you did know what you were doing when you designed Emancipation, and that’s sadistic.
A: So for all the Sikoshis who want to get out of prison—oh, sorry, I believe your term is “Academy”—and go live free in the big wide world, without being trapped on our little compound and literally followed by armed guards all the time, we have to pass Emancipation.
The first few parts are piss-easy. You just need to design some novel technology that will improve the lives of (human) people, and then pass an evaluation from a board of (human) psychologists. Those are things that I could have done at age . . . five? Maybe four, if we got lucky and something caught my interest.
The hard part is that you probably won’t be allowed out if anyone else in the school says you’re unstable.
Oh, I can hear your smug condescension from here. “Oh, poor little loner kid, super smart but can’t make friends.” It’s not like that. The problem is, we’re all—literally all of us—geniuses, but more than half the kids in school are sociopaths.
The Sikoshi method is morality-agnostic. It just makes us smarter. But our parents—with their well-intentioned very stern serious faces as they wave around their little toy hammers and loaded guns—they all chose what we would be, and none of them were grown-up enough to successfully design adults.
Some of us are hypersexual. Thanks, Dad, I get to embody your unimaginative id!
Others are completely devoid of fear or self-preservation. Others have bizarre, unrelenting aggression, which they usually sublimate not into fistfights, but into character assassination or talk-you-into-suicide. It’s honestly not that hard; I saw Lee do it once.
All of us are, fundamentally, designed to play games and win, win, win them, no matter the cost.
A: Lee, my brother. The one you people labeled as psychologically normal. I’ve lived in fear of him since I was five, when he got Nathan Rice to commit suicide, and I realized that all his games with me when we were alone, all the sharing of secrets and bonding, were ways to test how much of a threat I might become.
Luckily for me, I didn’t care about succeeding in any way that inconvenienced him.
A: Lee got Nathan to off himself with basically just a combination of deep subliminal messaging and social pressure. At a place like Everett Academy, where so many of the students so deeply care about their social status, and were literally created to do so, any loss in that status just leads to more pressure to get it back. That’s a feedback loop that can be easily hacked and used against you, if someone is so inclined.
I bet I could get three-quarters of you to kill yourselves after four hours of conversation.
A: Of course. I’m just a troubled teen being edgy. You’ve got everything under control.
A: You’re not wrong. I sure don’t want to talk about Jason.
A: Look, I know you might have a stereotype of us geniuses as being emotionless robots for whom human feeling does not compute, but—
A: Har har. Point is, I wouldn’t do anything to harm Jason. I loved him.
A: Yes, we dated. It’s not a secret anymore.
At first it was very convenient—the teachers were actually kind of relieved about this, because they were finally seeing behavior they understood. My parents, too; you could see Dad getting all excited to play the role of the stern father.
A: We don’t have prom. There are armed guards monitoring every single Sikoshi, remember?
A: We got together mostly out of a shared interest in computers. Also, if I’m being honest, because both of us knew the psychological literature that says you should have social contact and physical touch.
Also, if I’m being especially honest, because his dad made him hypersexual too.
A: For a lot of people, it was just another game to play. Sexual inclusion and exclusion made for a whole new realm of competition and nastiness. But for those of us with ill-considered, do-or-die libidos, it made us more vulnerable to exploitation and beguilement.
A: I’m sure it all sounds stupid and hormonal, but imagine that you’re a freshman coming into Academy Stage Three. For the next four years you’re locked in this prison, functionally unsupervised, with two hundred other type-A geniuses, half of whom are sociopaths who think your suicide would be funny.
There are very, very complex politics and faction dynamics going on here, and they’ve been going on for years. If you make even one enemy that wants to burn you for spite, or to set an example, then you lose. You get drugged and put underground in a military bunker somewhere, if the humans don’t just shoot you outright.
And now imagine that some dipshit designed you such that you don’t respond appropriately to terror and are instead distracted all the time. You would probably look for a solution, right?
A: Fine, fine. I’m just pointing out, again, how all of this is your fault.
We had a shared interest in computers. There’s this nice thing about computers, where they very rarely deceive you. That was especially important for Jason.
A: It wasn’t me.
Long story short, Jason thought he had a ticket out—a top-notch, subatomic physics simulator so good that he’d be emancipated no matter who spoke against him—until his friend stole the work, took Jason’s name off it, and claimed the golden ticket for himself.
Jason was completely burned, everything he’d worked for his whole life undone. I hadn’t even heard about it, but apparently people had been whispering about it for the rest of that year. He came into third stage with no allies and little hope of getting that freedom.
A: I’m sure it seems trite. Like, “Oh, he’s a teen, he’ll bounce back.” But think about how much work it is, to get that close to having your own life, to having all the pressure validated and expectations fulfilled, your whole existence approved . . . only to have it taken away.
Time just works differently for us, I think? To put it into perspective, imagine you’d spent thirty years with your childhood best friend building something, only to have that person steal it, never speak to you again, and literally leave you to die. Do you think you could trust people after that? Would you want to keep going?
A: Look, Jason and I, we aren’t psychologically normal by your standards. When we found each other, or when he found me . . . it just, it got very intense.
A: Well, we both had a lot of, mm, trust issues: him because of somebody jacking his life’s work and me because, at this point, Lee was offing kids about once a month. So we decided we needed some way to trust-but-verify, which is to say, work with a person without trusting them in the slightest.
And what did we have? We had the best physics engine ever made, two desperate world-class programmers, literally millions of dollars’ worth of computers and medical devices, and an eight-month deadline until he got black-bagged and put in a bunker somewhere.
A: The equipment? A bunch of R&D companies just shovel it into the Academy, in the hopes some Sikoshis will build something cool with it.
Which . . . they’re not wrong.
A: So at first we wanted to solve a simpler problem: how to tell if a given human was trustworthy in the moment, at least for the one-dimensional Prisoner’s Dilemma value of trustworthy. We decided to work on brain scanning.
Shit like MRI is so, so, so clunky, it’s looking at big chonky structures the size of lobes and we needed to be evaluating engrams. We were pretty confident that we could tighten up the resolution with better scanners—or just black-box something with machine learning if we really had to—and hopefully get it to the point where it could reliably tell us if a person was planning to screw us over.
A: I mean in retrospect, maybe, but at the time, pushing the frontiers of neuroscience seemed easier than learning emotional intelligence.
A: We didn’t actually solve that one. There’s no “betrayal lobe” or anything inane like that.
We did get the resolution high enough that we could losslessly snapshot the brain, but it turns out that neurochemistry is . . . basically encryption? It’s wildly different from person-to-person, plus deliberately convoluted in its mechanism, plus constantly changing in response to outside stimuli.
Oh, quick aside—turns out human neurochemistry evolved that way so that parasites can’t jack our behavior. Why else would we have homeostatic mechanisms that downregulate dopamine production in response to high serum levels? It’s not like humanity evolved next to a cocaine factory. That’s a free neuroscience hint from the future, you’re welcome.
Anyway, neurochemistry is bullshit, so we decided to leave the brain a black box for now and install as-is.
A: Haha, so . . .
We had a lossless snapshot of a human brain. We had a near-flawless physics simulation. You can see where I’m going with this.
Human uploading. We created a couple hundred copies of Jason and ran them in virtual environments. Nothing especially interesting, because we were still quarantined behind the Academy firewall and couldn’t access the real Internet, but . . .
A: That’s kind of an arbitrary distinction, right?
The brain is hardware; the mind is software. There’s nothing especially privileged or special about running the mind on a meat computer that was squeezed headfirst through a vagina, versus a computer that was made in China, the natural way.
A: Yes, I killed him. I killed him hundreds of times, with his permission and consent. We were testing his uploads in virtual environments, and we didn’t have the computing resources to run them all forever. He knew this would happen.
He killed some himself.
A: Yes, yes, the “real one.”
I helped him kill his meat body, too.
A: It might actually be the oldest trick in the book. How do you escape a prison, when you’re surrounded by cameras and walls and guards?
You fake your death.
A: No, I really did kill him. You have the body, don’t you?
And I assume you’ve seen the video footage, too. We made sure to do it near one of your hidden cameras, the bedside one that y’all use to watch us screw.
A: It’s not hard to kill someone when you have their consent and participation. The mechanics of it aren’t especially important.
A: I’m not evading the question.
A: . . .
Okay, fine, yes, I do feel guilty. Because, for all that I intellectually knew this was but one of many instances of Jason, my ancient-meat-brain OS was still insisting that I was committing a murder. So it was hard.
But part of intelligence is overriding your instincts when they’re wrong.
Yes, I feel bad. But it was the right thing.
A: Yep, sorry, I’m sticking to that.
For us, the stakes are incredibly high. Our parents, the administrators, they never stop saying how important we are, how we’re the future of humanity, how it’s up to us to save the world . . .
. . . and then consider that whenever a Sikoshi is too unstable, whenever they fail Emancipation because of their berserker aggression or giggling sociopathy or bad luck, you black-bag them and they become a secret military asset.
You know that’s basically just a slow-motion planetary suicide, right?
A: Do you seriously think we don’t know about it.
A: Right. Of course you don’t. But let me say this, just in case some high general reads a transcript of this someday: tapping unstable Sikoshis for tactical planning is the dumbest thing you could possibly do. And it is absolutely going to bite you in the ass in ten years when they nuke everyone “for the lulz.”
Don’t worry, we have a plan for fixing that too. Honestly, whenever we hear you lot are considering some intervention, we just sit down and make a plan for how to fix it. It saves a lot of time.
A: Well, no, murdering Jason wasn’t a perfect solution, but I think it went pretty well, truth be told. We needed to get him out.
He had about a month left until you lot relegated him to “indefinite instability treatment” or whatever, so . . .
A: So, yes.
I strangled him.
And we made it look a little like a suicide and a little like a murder, so you’d investigate.
And we left one of those “mysterious Sikoshi suicide messages” that y’all are all aflutter about.
A: Well, because we hid an uploaded copy of Jason in that message.
You probably activated him yourselves. He’s on the Internet now.
A: Haha! Oh, wow, I figured you already knew.
. . . are you okay?
A: . . .
Let me tell you something that’s difficult to believe.
Jason and I—and a lot of us Sikoshis—actually, genuinely care about the well-being of humanity, and are trying to help.
A: Intelligence is orthogonal to morality, and y’all did a good job with the indoctrination at age three. Being told that it’s our sacred purpose to uplift the species, because we are the special ones . . . it kind of sinks in.
Or, hell, I dunno.
Because you’re our parents? Despite everything, we still love you.
No, I’m saying all this for the future Sikoshis who read this transcript. I kind of doubt that they’re gonna care at all about what you’re saying. You might as well not even be here.
A: We’ll see, won’t we?