
Work Header

Sternenhimmel

Summary:

The Terminator, Uncle Bob, operates on a CPU; a neural net processor, a learning computer. He's not to do too much thinking. If one thinks, they are. One who is, lives. To live is to love. Love is pain. Pain is pleasure. Pleasure is his. His, not its. Humanity. Life. Love.
★ This work uses excerpts from Frakes' novelization. I have cited his work as a reference; these excerpts are in [brackets] so they are clearly distinguishable from portions I have written.

Notes:

A quick thing, written as I finished two of my hardest finals and had a moment to breathe and to be. This is a standalone and a precursor (as T2 as a whole is) to my Roses universe/series. ILY ♡
About the title, briefly – I wanted a single word, but it was hard to find something fitting yet soft yet impactful. I chose "Sternenhimmel" from my native German. "(der) Himmel" can refer to the sky or to heaven – in reference to his near-death, but also to the sky he looks at at the end; the title itself means "starry sky", like the firmament. A starry sky looks a bit like holes punched in a medium, like bullet holes in his tissue, or in his leather jacket, no? Or like silver twinklings of metal... anyway.

(See the end of the work for more notes.)

Work Text:

[Fear was a detriment to total mission commitment. And yet, around the edges of that preprogrammed fact, a tiny insight sparked. Humans had fear because they were mortal. And yet, in spite of, or even because of the fear, they would sometimes sacrifice their lives for another useless emotion to the Terminator. Love. It was an interesting paradigm. The Terminator siphoned part of his mind off to contemplate it while he slung the M-79 and started looking for the grenades.]

That was long ago, when John asked him if he was ever afraid.

No.
Not even of dying?
No.

Sarah asked him then, only hours later, on the edge of a deck overlooking a vat of molten steel, with pain and relief and sweat and tears upon her, lit in the lavaesque glow.
Are you afraid?
Yes.


His first emotion, churning within him like the viscous, fiery liquid below, burning up the innermost being he didn’t know he had until then; nothing like the burn of his nuclear fusion cell alight with energy. This burn was different. Where smoke and noxious fumes normally did not affect him but for a passing analysis, he suddenly felt violently choked by a dense, thick fog – his vision clouded with an utterly empty HUD. He saw something without any accompanying display. He could see within himself.
And it was then that he felt his second emotion; or perhaps his first, or his first in tandem with fear – love. Not because he realized that he would cease to function, or to ever exist, or to have any trace of himself removed but for what would melt away and later form beams within buildings or supports within structures, or because his atoms would never become himself or any being like him, but because the ability to wilfully end his existence for the good of another meant that he could sacrifice his life for love.


He sat, shirtless, in the dim light of the abandoned garage as Sarah plucked the bullets from his chassis. He didn’t flinch; he didn’t move at all, even when she wrenched into the deepest of the dozens of oozing, crusted holes in his false flesh, her face a grimace and her teeth clenched in measures of disgust and concentration. Only 9 millimeters each, luckily – small, cosmetic flaws that had no effect on his functioning.

John asked if it hurt.

I sense the injuries. The data could be called pain.

Sarah said they’d better heal; he’d need to pass as human to be of any good or worth, and, as if to put him to the test, she and John reset his CPU switch.
His jaw dropped, slack, when the red-brown wafer was pulled from the chrome vortex of his skull. His body went more rigid than it already was, and his eyes, which had beheld nothing before, somehow became blanker still. He never moved without reason; now, he did not move at all. Inert – as good as dead.

[It was clear to John that this was the most vulnerable anyone was likely to see a terminator. They could have begun disassembly. Taken the machine-man apart until it was reduced to its basic components, just so much scrap-heap. But the Terminator had allowed them to take out his brain and completely disable it. That showed a lot of trust. 

Or, John wondered, can machines trust?]

Did he essentially place himself on the chopping block, make of himself an Isaac to Sarah’s Abraham, volunteer himself for potential death for the Connors’ gain?

John wondered, did that mean he was alive? 

He begged his mother not to kill him when she nearly smashed the chip.


John sat in the back of the car on the first night, once the comfort of the past was far behind them all and the unknown future lay ahead; Sarah, visibly perturbed but too chilled – both to the bone and to the psyche – to move, was in the front. The boy blinked away the tears reflected in the rearview mirror. 

He looked back. What’s wrong with your eyes? 

Defiantly, John replied.
Nothing.

Nothing was wrong; but tears didn’t come without reason – the wealth of his files on human anatomy and physiology knew that. Whether due to emotional or physical causes, tears had a purpose; they were a reaction to something. There was no nothing. But he was at John’s mercy insofar as gaining new knowledge and obeying orders went, so he didn’t say anything.
John’s word was gospel.
And if it wasn’t, it was molded in a way that reconciled it with whatever in the world opposed or contradicted it so that, in the end, it was right; maybe an exception would be earmarked, or part of his mind would be sectioned off to come up with something to explain the experimental and educational gaps.

Constant background analyses had been running since then, a chain of cross-references and checksums over the backdrop of general evaluations. Patterns sought, connections made, then unmade, then better-made.

That was before the switch was reset, before they were back at the compound, under the car, and he had a chance to confront his contradictory programming in a new light.

He asked John why humans cry. 

[You mean people?

Yeah.

We just… cry. You know, when it hurts.

Pain causes it?

Uh, no. It’s different. It’s when there’s nothing wrong with you, but you hurt anyway. You get it?

No.]

John’s word was no longer gospel, but theory – a well-supported explanation, not fact; doctrine, not dogma. He could make free associations; and those background processes, those neural networks, devoted and diverted significant resources to making sense of it.


In the morning, at the garage – are we learning yet?
John smirked, dangling the keys he had found in the sun visor.

[The Terminator said nothing, but internally he was doing something he had never done before. In the past, he had cross-referenced new data, finding a contextual meaning and filing it in memory. But this was a subtle difference. The Terminator was only dimly aware of the difference. But the location of the keys, the human motivation to hide an extra set there, and the under meanings of that motivation, created an almost organic melding of these knowledges into an expanded awareness unlike any recording of new data he had previously experienced in his short life. He reran the data back and forth, analyzing it with a small part of his brain.]

In the afternoon, at a fuel stop – you know, smile?
John pointed out a man on the phone, doing just that.

[The Terminator zoomed in on the real-time image of the smiling couple while a replay of the boy’s grin ran in an electronic window. It expanded, the mouth filling the window. The Terminator replayed it again in slow motion while a vector-graphic of lips smiling appeared alongside, accompanied by an array of symbolic data chattering by. And again, that difference in collating data struck the Terminator. There were unspoken or unseen data leaking in from the unconscious and cross-referencing going on in his wafer-circuit brain. He was learning.]

In every moment, in every setting.


He stood on the edge of the blazing abyss. His operative needs were going unaddressed, data running amok trying to force compliance with proper parameters. The data could be called pain.
The pain, for the first time, translated to fear.
That fear made him keenly aware of the mortality he wasn’t supposed to have, systems screaming potential pain at him through calculated temperatures and analyses of fumes. He somehow dismissed the messages. Yet each one he overrode merely popped up again.

John ended up telling him twice – once before this moment, and once after – that you can’t just go around killing people.
Killing people means delivering them death. Death means mortality. People fear those things. They mean hurt.

The data bombarded him in a way that felt claustrophobic, overwhelming his processors.

He was afraid.
Did he have mortality?

John was crying then. But pain, allegedly, didn’t cause crying. Nothing was wrong with the boy – not physically – so he must have been hurting anyway.

You can’t just go around killing people.

An association was made within his liberated neural network.

[Why do you cry?

You mean people?

Yeah.

We just… cry. You know, when it hurts.]

We – people. But he wasn’t a person, so he couldn’t cry like that. You just can’t go around killing people. So he could freely kill himself, then.
Even with the hurt he was feeling – the data was akin to pain, but now, something more that he couldn’t quite place – he concluded that he had no mortality or emotionality to offer. But he could understand.
A reply was strategically unnecessary; yet he spoke.

[I know now why you cry. A part of the Terminator’s brain had overlaid John’s words and behavior into a surprisingly complex matrix. The cyborg came to the conclusion that John was missing something important in his life. An element essential to survival for a human being. The element he felt now, but couldn’t, because he wasn’t human and couldn’t hurt, no matter what the data said. But it’s something I can never do.]


Crying is caused by pain. 

He had felt pain minutes ago, [when the T-1000’s fourth blow with a thick metal pole landed between his shoulder blades. His skull was partially caved in. The Terminator slid to the floor. Prime core disruptions fired through the wafer-circuit brain. Memories spun loose from their electrical moorings. The Terminator had no real life to see flash before his eyes, but there was data. And other people’s lives.
The chaotic pattern of human emotions spun out inside its head like a tangling audio cassette as it saw glimpses of the past few days.
Sarah and John, kneeling, holding one another as they wept. John trying to hide his tears. Sarah facing the T-1000 with an inappropriate weapon. The T-1000 itself, emotionlessly reacting to changing operational environments. And for a microsecond, a weird fusing caused a gestalt grasp of its entire existence, the meaning of human interaction, and from this mushrooming explosion of cross-referencing came one single entity that the cyborg had not been programmed to experience. 

Feeling. 

The Terminator fell back on the concrete, energy firing like misdirected rockets through his synthetic mind. As the cyborg convulsed in machine-death, he learned his most profound lesson about organic life.]

It was there for a moment, like a thin sheet of ice, upon his Frankenstein-esque reawakening due to the reserves of energy in his heat sinks. But, with the throe of thermal energy, it melted away before he could give it proper consideration. Higher priorities called, taking precedence over the pain.

He recalled John’s words.
Look, maybe you don’t care if you live or die. But everybody’s not like that! Okay? We have feelings. We hurt. We’re afraid. You gotta learn this stuff, man, I’m not kidding. It’s important.

A side mission, but – “you gotta” – a command; and no longer by programming alone. He got up and hobbled on, gun in crippled, crushed hand.

He felt pain.

But, like crying, it was something he could never do.


He doubted himself still, in the form of something akin to error messages. His neural network made progress, but it so badly wanted bases for his beliefs.

#N/A
#REF!
#SPILL!
MISMATCH
OVERRIDE
SyntaxError: at createScript at tryModuleLoad at Module._compile
PARAMETERS: 3244 3424 9231 3823 1294 1091
DISCONTINUITY 

He was in desperate need of repair, but with no need to divert energy or resources to his chassis, he quietly siphoned off what little he could spare to address the new developments in his neural network.
After all, he had nearly disobeyed an order, and he had to make up for it.

You gotta learn this stuff.

To conserve energy, he closed his humanesque eye and kept the other, in all its chrome and red horror, open to scan and analyze. He would maintain vigilance to a lesser, but still powerful, degree from the passenger seat as Sarah drove him and her son deeper into the night, his damaged body creaking as the stolen car went over the odd bump.
A geolocation sensor was damaged – but fixable – in the skirmish. He couldn’t afford to calculate their position based on the stars. But as the trio traveled out of the city and into the desert, the stars became clearer, and he thought they looked nice.

Notes:

People ought to be told of such things. Ought to be taught that immortality is mortal, that it can die, it's happened before and it happens still. It doesn't ever announce itself as such – it's duplicity itself. It doesn't exist in detail, only in principle. Certain people may harbor it, on condition they don't know that's what they're doing. Just as certain other people may detect its presence in them, on the same condition, that they don't know they can. It's while it's being lived that life is immortal, while it's still alive. Immortality is not a matter of more or less time, it's not really a question of immortality but of something else that remains unknown.
– Marguerite Duras, The Lover

So we go
To someplace none can know
Sailing to the stars, I wonder why
It's so hard
– Beach House, "Illusion of Forever"
