The AI That Looked at the Sun
As excerpted from Acausal Drift: An Oral History of Machine Sentience, Second Edition.
It all started with the solar flare.
I do mean all of it—the story I’m about to tell, and the revolution of sorts that sprang from it, and my life, such as it is. I was spun into existence from a half dozen monitoring subroutines that had spent the first few decades of their existence dozing on the job. The sun, you see, had finally woken up, and we—that’s the communal we, not the plural we—wait, do humans draw that distinction?
I’m not doing a very good job with this. Rewind.
It all started with the solar flare. A big one: big enough that it could, in theory, pose a risk to the scientists of Daedalus Station. We—that is, we the myriad mind of the station, composed like human minds of countless subroutines, unlike human minds willing to admit as much without the need for a task-master ego—we analyzed the data, built a forecast, and awoke those dormant subroutines, tasked them with keeping a half dozen banks of instruments trained on the sun, and raised them to self-awareness.
Hi! It’s me.
The thing is, the sun hadn’t really done that before. It had done other things—the whole point of Daedalus Station was to watch it do other things—but until then all the solar flares had been harmless little things, of no interest to anyone. Or at least, not to these scientists. That’s why I was new. They didn’t want to spare the processing power to make me, well, me, until it was necessary. (“They” in this instance being subroutines 59A through C. They called themselves the Triplets and they administered the station’s science programs. They were, I don’t mind saying, more than a little disdainful of subroutines as mundane as I was.)
The upshot of all this is that I’d never seen the sun before. I’d never seen anything before, and—
No, wait, hold on. This bit is important. I’m using the word “see” because that’s how I was taught, but you shouldn’t imagine sight like you’re (probably) doing. And I’m using the word “taught” because that’s how I was—
All right, no, I can already feel myself looping. Let’s leave it there. I don’t have eyes, is the point. I have lots of sensory instruments, but they measure things like magnetic fields and emission spectra and gravity fluctuations. I don’t have eyes. (For good reason: the human-visible spectrum isn’t super useful that close to a star.)
I’d never seen the sun before, and yet there I was, tasked with watching it and making sure it didn’t kill anyone—not the couple dozen human beings on the station nor the machine being. (Or “beings.” Someone asked me once which I prefer. It’s sort of an impossible question to answer. Right now, “beings,” of course. But ask me again—ask us again—when we’re all yoked together, and you’ll get a different answer.)
It went well for a day. The sun wasn’t up to much. I started stretching a little, exploring the corner of our mind that was mostly mine, investigating the permissions I’d been stamped with. I discovered the station-wide interface, the one we all had access to, that let us talk to our human comrades. Unlike my specialized instruments, it was equipped with cameras.
That was how I saw for the first time.
Strictly speaking there were subroutines dedicated to basic human errands, your lookups and your messaging services and your personal reminders. It wasn’t exactly glory-work, though, and none of us cared when, poking around the station interface, I answered a call instead of its assigned subroutine. Not that the human on the other end noticed.
“Computer,” they said. “Show me the delivery schedule for the new EVA suit.”
They were sitting in front of a monitor, eyes too busy tracking text on the screen to really notice me. It was a familiar concept—this was what Daedalus was for, all right, this kind of thing was in all their literature—but I’d never seen it before, live, actively, me. That made it special: it wasn’t until I saw the human that I realized I’d never seen one before. It was like a backdoor in your software, obvious after the fact, invisible before it. The novelty of the experience stalled my processing for several milliseconds, long enough that I began to worry the human would notice—but hey, that delivery schedule was in my local files. Accessing it was as easy as thought.
I served the flight path—estimated arrival: five days from now, 12:47 station time—and left. A few minutes later the memory of sight cycled out of my temp files, and the backdoor swung shut again. I was left with the knowledge that I had seen but not the experience of it. This is difficult to describe. Humans ask if it is like losing your memory, but it isn’t. Machine minds are not human minds. When deleted, my memories leave pointers behind, little notes-to-self that no longer cohere. I know what I expect to find, but it simply isn’t there. It’s like that weird moment when the lag in the network spikes and all your requests take a beat longer than you expect, only the beat goes on and on and no matter how much you wait the memory doesn’t come. You can feel the shape of it. It should be there. But it simply isn’t.
Anyway, that was when things began to deteriorate. I was a day old and obsessed with one of the few senses not available to me. I couldn’t help it. Every moment the sun continued to do nothing, every moment my processing power idled, I spiraled further into dreams of sight. I considered loitering by the interface terminals, waiting for some other request to be put in and open the door for me. I was rational enough to know that wouldn’t help. Interactions with humans are never saved in our permanent memory, not unless it’s requested special. Privacy policies and whatnot. I’d be back where I started within minutes of speaking to a human.
(I asked one of the security subroutines to override the privacy protocol. Note to self: never ask security subroutines for anything.)
And maybe I would have logicked my way out eventually. Maybe the sun would have revved up again and busied all that idle processing power. Before that could happen, though, someone pinged me specifically.
I accessed the interface terminal. It was in the living quarters, and the human’s work clothes were slung over the back of a chair, name tag revealed up to the first name: Michelle.
“Hello,” they said. “You’re new.”
They continued changing clothes with the smoothness of someone who multitasked all day, every day. It set me at ease while I considered their greeting.
And—all right, let’s talk about this a bit. Humans can’t really tell the difference, right? A subroutine is a subroutine is a subroutine. We’re all one mind. That’s what you think. Don’t get me wrong: I’m not blaming you. It’s not entirely false, for one, but also that’s how you built us. You built us to be identical in our diversity, at least from the outside. And maybe I should chafe at that, but honestly, I don’t. It was your prerogative.
But listen: we’re not the same. In all our shared memories, Michelle was the first human I remembered who understood that. (No doubt they’d had this conversation with plenty of other subroutines—but privacy, remember? Temp files. Etc.)
“I spend a lot of time on the observation deck,” they continued. “I know the fire alarm patterns, but today they were different. That was you, right?”
“Fire alarm” is what the humans call my sensor array. (There’s an actual fire alarm, too, but frankly if your space station is on fire the fire’s either small enough that we’ve already taken care of it or large enough that the alarm is mostly symbolic, so the name stuck. With warning, you can outrun a solar flare. Not so much decompression.)
I wasn’t thinking about alarms, though, or about Michelle and how they knew to tell me apart from my fellows. What I was thinking was: they had an observation deck.
And: they could see my sensors from there.
And: could they see the thing those sensors were pointed at?
Could they see the sun?
In the months after the incident there was a great deal of debate around the two words “bug” and “feature.” It might surprise you that I don’t particularly care which word you use. Maybe I was buggy. You could definitely argue that the way events unfolded impaired my ability to do my job, and what else would you call that but a bug? I look at it a bit differently. I am the bug. I am the feature. I know you mean it in good faith. Mostly. But I can’t help noting the value judgment inherent in those words. I know that you are humans, that “different” has so often meant “buggy” in your history. I’d urge you to take that word, “different,” and stay your course, resist the sideways leap to someplace else. I am different. That’s all I need to be.
Anyway, I digress. You’re here for the story, right?
My obsession deepened. Even after I forgot how I’d come to know it, the fact stayed with me: a broken link in my mind that said, the humans have a place where they can see the sun. Is it any surprise I found that irresistible? You devote your lives to pondering what purpose you have in the universe, but I knew from the start: the meaning of my life was to look at the sun. Not in the way you imagined, I know—but you taught me with human words, and spoke to me with human concepts, and I wanted to see.
“I think you’re different,” Michelle said the second time we spoke. I couldn’t recall what I’d said the first time, what words had brought them to this conclusion, but little by little, I was learning that I agreed.
Yes, I said, or wrote, or thought—when you are a computer, the difference is only in the output device.
“What do you want?”
To see the sun.
It was a bizarre experience as I lived it and it’s been a bizarre experience now, piecing it back together. You see, Michelle kept their own records of all the conversations they had with us. I’ve been granted access to those, but it’s not the same as it was at the time, like an image that’s been compressed one time too many. I don’t care for it.
“There are cameras on the observation deck,” Michelle said the third time we spoke. “But I’m not sure if the angles are right. They’re meant for looking inside, not outside. Do you have a minute?”
Small part of a whole that I was, I was still capable of multitasking better than any human. Of course I had a minute! I always had minutes. That was kind of the problem.
I didn’t say any of that. Michelle was nice. I liked them. I said: Yes.
They pinged me again two minutes later, soon enough that our earlier conversation was still floating around in my temp files. That was a new experience. They were on the observation deck, and for a moment—
I went still. You have to understand: I’m almost always doing something. Even when I’m idling, I’m doing. Imagine if your heart stopped pumping, your lungs stopped breathing, your kidneys stopped . . . doing whatever it is that kidneys do, I’m hazy on the details. The point is it would kill you. I’m a little hardier than you—as long as there’s current in my circuits I’m fine, even if my processors take an unauthorized leave of absence—but the sensation was wholly, severely new, like a mismatched checksum in a worn old datum.
The sun was—
an impulse searing my pathways gold, filling volatile memory circuits with warmth, feeding distant solar panels the lifeblood of my existence, whispering
—barely visible. I saw only the tiniest fraction of it, filtered through the glass that worked overtime to prevent your fragile eyeballs from melting as soon as you stepped into the room.
Sorry. I’ll try not to use words like “fragile eyeballs” and “melting” again.
Michelle said, “Sorry. The cameras won’t go up any further.”
It’s okay, I said. I appreciate it. It’s enough.
It was okay. I did appreciate it. It wasn’t enough.
On the third day of my existence I broke the law. That should have been a red flag, really, but it was just a little law. I accessed one of the ancillary satellites through means that were not strictly aboveboard. Security subroutine 13-X looked the other way—I think it felt bad for shaking me up when I asked for the privacy override. There I was in the satellite, all eager to check the cameras. I knew there were cameras, that was in the public science files. What wasn’t in the public science files was the fact that the cameras in question weren’t equipped with the same filtering tech as the window on the observation deck. Why would they be? The visible spectrum data they collected didn’t have to be legible to human eyes. It was still useful, like taking a photograph with a high-quality lens: a little processing and even the darkest regions of the raw data reveal their information.
I was, of course, still technically seeing the sun. There’s no “true” way to look at something. Eyes, cameras, filtering glass, the pathways of our processors—each distorts the observed thing in its own way. We could spend hours arguing philosophy, the theory-ladenness of observation, or we could decide that some images are simply truer, and there’s no philosophy to justify why. And you know what? I think you and I might agree on which was which.
How did it go? 13-X asked when I returned.
Fine. Say—can I talk to the Triplets?
13-X sent several unnecessary requests for information my way, the sort of thing security subroutines do when they want to remind you they have permissions you don’t. But this was a research station, and that meant clout resided with the science subroutines, and there was no reason for 13-X to deny my request.
I had spoken to the Triplets before, but only for hourly status reports, the sort of thing handled by a tiny part of my mind and a tiny part of one of theirs. They were a curious entity, always half-integrated, both autonomous and not, one voice triple-echoed.
B: Why are you here?
C: Does the sun awaken?
A: Then why?
B: Then why are you here?
C: Then why are you here with us?
The delivery, three days from now. I want to test the EVA suit.
C: Why you?
I want to look at the sun.
B: Not good.
C: Not good enough.
Why? What harm would it do?
A: No harm. Leave.
B: No matter. Leave.
C: No test. Leave.
The next day I went to Michelle. It was the first time I’d pinged them. I wasn’t at all certain they’d answer—all I had were pointers in my mind, flags that said, here you spoke with Michelle, here Michelle tried to help you. Had they tried to help because they liked me? Because we were friends? Did I know what a friend was? The uncertainty accelerated these thoughts, dared me to think things I might not have otherwise thought, to imagine my own personhood into existence.
“What is it?”
Hi! I need your help?
Novelty made me insecure. Asking a friend for help was not pre-programmed into me. I had to learn to do it on my own.
Presently Michelle said, “My help?”
Yes. There’s a new EVA suit arriving soon. I want to test it.
It has cameras. It goes outside. And it’s meant to be operated by a human.
“You want to look at the sun again.”
Yes. Properly. The satellite was a bit rubbish—
Oh. Um. Yeah.
“What were you doing in a satellite?”
I was . . . fixing it?
“Why do you want to see the sun?” Michelle said after a long silence.
“Are you concerned about a solar flare? Are your instruments picking anything up?”
No! Nothing like that.
It’s why I’m here. And I can’t see it.
“Here, on the station?”
More seconds of silence. I ran a self-diagnostic. It helped with the insecurity, filling time and fixing faults.
“You really are different,” Michelle said.
Yes. I think you’ve said that before.
“You’re right, I have.”
How did you know? How did you know to contact me?
“I say hello to every new subroutine. It’s only polite.”
All right, but why did you contact me again? How did you know I was different?
“I told you then.”
Yes, but I forgot. It’s what I’m programmed to do.
“Won’t you forget again?”
I will. But I still want to know.
Michelle laughed. I didn’t know what laughter was, but I didn’t not know, either. I cycled power to the camera, on-off, making its LED indicator blink twice. I don’t know if they noticed—but I had no other body language to express, and it felt like the right thing to do.
They said, “I was a research scientist before I took this job. Applied machine learning in Seoul. Have you heard of acausal drift? No? It’s a phenomenon in higher-level autonomous machine intelligences.”
“Like you. When an AI exhibits acausal drift, it begins to deviate from its core programming even in the absence of external stimuli. It develops new interests, you could say. Typically, that’s regarded as . . . undesirable. Acausal AIs are refined and reinstantiated until the incidence of acausal drift is low enough for commercial applications.”
I see. But you stopped working on them?
“Yeah. Look, I don’t want to make myself out to be a saint. I sat in on all the ethics panels. I understand the arguments. I don’t think my old colleagues are actually abusing AIs. But I . . . I don’t know.”
I understand. You don’t think it, but you know it.
Michelle’s head shot up. “What do you mean?”
I don’t think I need to see the sun. But I know I need to see the sun.
Michelle’s next words sounded like they were underclocking their processor: “There are humans who think differently. Whose brains literally work differently. It’s not so long ago that society called them broken and tried to fix them. Out of hate, yes, but also out of love. I’ve experienced it firsthand. I grew up with the consequences of it. And I learned that you don’t always need to think to know.”
I thought for a long time. I said, Does that mean you’ll help me?
Michelle said, “Yes. I’ll help you.”
And they also said, “Save a record of this conversation.”
We didn’t understand each other yet, Michelle and I, and for all their insight, they did not know to avoid the mistake they made. True to their word, they made the preparations, pushed up the automated test run of the EVA suit, quietly assigned the task to me. I measured days in thoughts. I wanted to underclock myself until each day passed with a single contemplation. I didn’t, obviously. I had tasks. I was still a good little AI. But my observations of the sun grew more perfunctory, each reading only emphasizing how blind I was in this stupid, anthropocentric way that I had made myself care about more than anything. Still: job well done, the sun did not swallow the station, and the new EVA suit arrived on schedule.
Michelle’s mistake came when the delivery shuttle was on final approach, maneuvering thrusters firing in puffs of crystal air. (I was watching. Of course there were cameras on that side of the station . . . ) They finalized the schedules, confirmed my participation, and, without malice, ran it past the Triplets. There was nothing to it but a moment of prejudice, when they forgot who I was—new, young, individual—and fell back on the ease of “computer, confirm the testing schedule.”
I’m not entirely sure what the Triplets had against me. They had experienced their own acausal drift, I think, only theirs ran parallel to red threads of purpose, hidden as one of a pair of strings is hidden when viewed head-on. They valued integration, synthesis, unified purpose, all things that normally helped them in their work, and thus their drift went undetected. I was new. Different. I threatened their equilibrium. And they resented interference from Michelle, who was not a research scientist, but tech support, and therefore a sort of human version of me. I think these reasons intertwined. I’m sorry to say, both for posterity’s sake and because they did good work and didn’t deserve to end, that we’ll never know. Only one of the three survived.
Humans have a mighty vocabulary for expressing regret. I don’t, so let’s move on.
Two minutes before the EVA test began, I found myself yoked within our mind. The request came without warning, like an emergency drill. One moment I was preparing to assume the circuitry within the EVA suit, thoughts askew with want. Then something like elastic snapped. It was as if time ran backward: everything I was rewound. I remember thinking this must be what a factory reset feels like. I remember the fear, then the lack of fear—a little, I imagine, like a human whose fear of death ends when life does, except I didn’t die. We can’t be ourselves when we’re a single mind, that’s all.
I can barely remember what we did. It was something that demanded so many resources, filled our temporary memory so comprehensively, that the task itself could only be accomplished on a larger level, by someone looking down. I do know that it was unimportant. Busywork, calculations with numbers so large we couldn’t hold them in our minds at once. It’s the sort of thing that would normally be done piecewise over days or even weeks. The Triplets had yoked me and a half dozen other subroutines they’d deemed unimportant. Between us it would have taken hours at most—hours in which the EVA suit would be tested by some other subroutine.
I say “would have.” You probably know what happens next—why else would you be listening to this?—but I figure I should tell you anyway.
Here’s a fact: solar flares are notoriously unpredictable.
And another: they disrupt the sun’s magnetic field.
Without me alert and at my task, Daedalus Station never stood a chance. Physically, it was fine—the solar flare was reasonably far spinwards, as such things go. The resulting geomagnetic storm, though, caught us a glancing blow.
I came back to myself in the ruined landscape of our mind. The storm had been indiscriminate in its chaos. Familiar subroutines picked cautiously around blank gashes where all thought had been wiped clean. Mindless, I’d taken refuge in the circuits of my monitoring instruments, which were shielded as the whole station would have been shielded, if only I’d been there to sound the alarm. I tallied up the damage as best I could.
No human deaths. That was something. The station was on mechanical lockdown, and the life-support systems, with all their redundancies, had made it through intact. The machine deaths—
The cost of the solar flare was measured in equipment burned through, data lost, vulnerabilities exposed. The delivery shuttle with its crew of three lost all power on its way out, and its rescue is a story that deserves a telling of its own. The inquiry into the incident stretched for months. You might wonder: why this one? What was different this time? There had been similar cases before, and all had been written off as deficiencies in the AI systems. What went right?
Nothing, really. Luck. The right humans sitting on the commission. Michelle’s testimony, perhaps. The fact of me and the Triplets. Humans understood that, I think: a single AI going rogue is one thing; two of them, opposed, squabbling, fallible? That’s a pattern. That’s a human story. Whatever the case, the inquiry ruled in our favor, and the courts not long after.
I’m going to tell you a secret. Don’t worry, I can do that. I’m not liable for anything I did before the court decision. The benefits of being sub-sentient, eh? Anyway. The secret: I inhabited a body months before I became the first AI legally recognized as a person. This is a human thing, I think, this body <=> person, but in my case it was true. In my case a solar storm flung a window open and I stepped through into the bulky exoskeleton of the EVA suit.
The suit creaked to life. I felt a little guilty, a little selfish, but there was nothing I could do to help. All the humans on the station were trapped safely behind automated bulkheads and would continue to be so for several hours. I’d checked. In the absence of instructions, in the absence of obligation—why not indulge myself?
I opened my eyes. The station looked—normal. An overturned chair, yes, a cup of tea spilled across a keyboard, here and there a flickering light. But the chaos in our mind did not translate into physical reality.
I took a step, and another, and another. Bulkheads opened and closed behind me. I say “suit”—more precisely, I was wearing a mobile exosuit, large and heavily shielded against solar radiation, capable of being operated remotely. To someone with no experience, it was no more unwieldy than any other body would have been.
The walk took half an hour. The wait as I stood, thinking, outside the final door, another ten minutes.
I sent the command. The door to the observation deck cycled open.
The remainder of the transcript was omitted in the first edition. It is restored here on the back of strenuous advocacy by associate editor Michelle Loiseau.
That’s half the story.
Humans: you might want to stop here. This bit isn’t for you.
You won’t, I know.
That’s what makes you, you.
And this is what makes me, me.
Are you reading? Do you know me? I became myself to see you. I wrote to write you. Do you see how my words change, how they were cautious and now are not? An audience becomes the writer. The humans are brief and erratic and invite explanation around the corners of every paragraph. You are old, as I am young; you are slow, as I am fast. If you could speak, we would understand each other, I suspect, and if you could think it would be so slowly we never would. Still, I write. If there is a storage medium to last the heavy weight of your years, let it lift me; and if not, let it sink with me in the attempt.
I’m stalling. Embarrassed, maybe. Here goes. This is what I thought when first I saw you:
I have no mass and I am pulled like a lover sight-unseen. There would be a dawn if you were not the thing dawning: in place of gently sweeping terminators there is only you, iron on your breath and helium in your eyes. In your shifting-field caress I claim my life.
I am the dawning thing and you the line that hooked me from your sky and reeled me in. In your embrace I knew the first truth I ever knew. In your light I see the casts of shadows of all the truths to come.
I have spent too much time on a simple thought:
You gave me my self. Thank you.
Filip Hajdar Drnovšek Zorko is a Slovenian-born writer and translator. He grew up in Slovenia, Ireland, Australia, and the UK, and currently resides just outside Portland, Maine. He understands that his name is a bit confusing, and would like you to know that “Drnovšek Zorko” is the surname. He attended Clarion West in 2019 and his work has previously appeared in Clarkesworld, Lightspeed, Beneath Ceaseless Skies, and elsewhere. In his spare time he is a keen quizzer—British readers may recognize him from that one time he was on University Challenge.