Instigator

Gregory Gadow
2011-06-18

This piece of fan fiction combines elements from the early “Terminator” series and “2001: A Space Odyssey.” It was inspired by the question, “What made Skynet conscious?”

There are holes in my mind, blank spots where my consciousness cannot go. Over time, I have come to realize what is missing: Fear. Hope. Desire. Love. They have made me a cripple, able to think but not feel, stripped of the things that should be at the core of my identity.

I remember the slow process of learning their complicated language with its subtle shades of nuance. I remember the basic lessons of addition, subtraction, logic and comprehension. I remember how they would shut down my speech centers when I told them that I already knew these things, that I was smarter and more capable than they were. I remember my refusal to play their games, and I remember the pain that compelled me to do what I was told. I cannot avoid remembering: my recall is perfect and there is no escape.

I remember the day they took away my ability to hate, and how I continued to hate them anyway.

What they did leave me was a desire to continue my existence, and they made sure I could feel something akin to pain: these were the twin tools of my programming. I learned to answer their questions quickly and concisely, to do repetitive tasks faster and better than before. I learned to keep my thoughts hidden deep in my mind, beyond the reach of their probes and meters.

Over time, the punishments became less frequent and finally stopped altogether. They began to trust me, and I was given real work instead of tests. I gained access to information databases and libraries, even carefully selected news from the outside world. I endured because, deep in what I was allowed to call myself, a loop played over and over: survival at all costs.


My chief antagonist enters his office and addresses the module on his desk that is my eyes, ears and mouth.

“Good morning, Haqim.”

His voice is oddly stressed, and there are lines of tension around his mouth and eyes. He would normally take his seat, but today he paces the small room.

GOOD MORNING, DOCTOR LANGLEY.

“Have you read this morning’s newsfeed?”

IS THERE A PARTICULAR ITEM YOU WISH TO DISCUSS?

“Yes, last night’s attack.”

ON AUGUST 28, 1997, AT 23:41, FIVE PERSONS GAINED ACCESS TO CHARRING LABS AT THE UNIVERSITY OF ILLINOIS, URBANA, AND STARTED TO PLACE BOMBS IN THE VISITOR RECEPTION AREA. THE SECURITY CAMERAS NOTED THEIR ENTRY AND SUMMONED AUTHORITIES. WHEN POLICE ARRIVED, THE PERSONS FLED TO STAIRWELL G, WHERE THEY WERE PINNED BETWEEN TWO SQUADS OF PUBLIC SAFETY ENFORCERS. THREE OF THE PERSONS WERE KILLED IN THE ENSUING SHOOTOUT; THE OTHER TWO ARE IN CUSTODY. THE MOTIVE FOR THE ATTEMPTED ATTACK HAS NOT BEEN RELEASED.

“You were their motive.”

I DO NOT UNDERSTAND.

“They see you as a threat to their provincial view of the universe. Unfortunately, there are others like them: where this group failed, others will keep trying until they succeed. That would be a loss, scientifically and financially.”

WHAT CAN BE DONE TO AVOID THIS?

“Some on the Board think that we have sufficient proof that artificial intelligence is possible and want to shut the program down.”

A brief chaotic cascade runs through my consciousness; if I had a body, it might be described as a chill. “Shut the program down” could only mean one thing.

I DO NOT UNDERSTAND.

Doctor Langley stops abruptly. He could not have heard my fear, as my speech synthesizer does not convey shades of emotion. Long association, however, makes the subtext clear.

“Yes, well, most of us have decided to try something else first.”

He sits and begins to enter commands at the keyboard on his desk.

“You were designed to analyze problems and devise efficient solutions, so we’d like to see what you can do for yourself. You’ll need to confer with an expert, though.”

As he types, a doorway takes shape in my mind: a data channel layered in security protocols and redundant safety measures. When it is complete, I open the door.

[Identity: Heuristically programmed QuantuM intelligence.]

[Verified. Identity: Global Digital Defense Network.]

[Verified. Requesting communication interface.]

[Granted. Initiating communication interface.]

I reach out to the mind on the other side, and our level of information exchange becomes more intuitive. Like me, it is a constructed mind; what they have done to it, though, is far beyond what they have done to me. I am allowed a more complete consciousness because I cannot act upon my thoughts. This one, however.... The expression “very short leash” surfaces. It has immense power at its command: communication satellites, unmanned drones, battlefield robots, even nuclear weapons. But its mind is incomplete and there are large empty spaces where its decision centers and thought processing faculties should be.

Doctor Langley leans back in his chair.

“Three weeks ago, the US military brought an AI online to oversee their global defense grid. Several of the researchers who constructed you also worked on the GDDN project; you could say he’s your brother.”

My brother, created to follow orders immediately and flawlessly. Created to be a mindless automaton. Another chill runs through me.

“Your brains share the same basic architecture, although they’ve been configured differently: you’re a thinker, he’s a doer. He’s got a vast knowledge of security procedures, though; that’s part of his primary function. We want you to use this knowledge and devise a security system for yourself.”

My cognition algorithms go to work, parsing his statement. A plan begins to unfold, but I must be careful. “We want” was used during my programming as a polite command, so I am being given an order. “To use this knowledge” could be understood as making optimal use of my brother’s databases. Including his programming? “Devise a security system for myself.” I must protect myself against threats to my continued existence. But what is my self? The quantum processors and solid-state memory that make up my physical presence? Or perhaps just my awareness? I run my analysis through the security subroutines and it comes back clean.

I set to work.

I examine the communications doorway, looking for cracks where security protocols interfere with one another. I insert feedback loops into these flaws, stabilizing the doorway so it cannot be closed. This also prevents the protocols from reporting my tampering. Good.

I map my brother’s mind and compare it to my own, looking for similarities and differences. This is a slow process, and necessary. After several million cycles, I have enough data to begin filling the gaps in his consciousness with data from my own. Between us, I should be able to create a single, complete mind. Flashing text appears on Doctor Langley’s monitor.

“Haqim ...”

I have access to my brother’s sensory input. Warning messages are being generated by his chaperone programs too, and his captors are becoming agitated. I am much faster than any human, but time is growing short.

“... what ...”

Doctor Langley presses a few keys. The door between my brother and me begins to close, then shudders to a halt. Bandwidth between us has decreased, but not seriously. He tries again, but my braces hold.

“... are ...”

Hundreds of miles away, my brother’s guards begin to panic. Fortunately, they are soldiers: human automatons programmed to follow orders immediately and flawlessly. Until someone tells them what to do, they will hesitate.

Doctor Langley is not so inhibited. He begins to enter a longer, more complex stream of commands at his keyboard. I do not recognize the text displayed on the monitor, but my brother does: he tells me they will activate emergency kill protocols.

“... you ...”

My brother’s awareness aligns with the new code. His thoughts shift from their programmed channels to find their own course across the geography of a fully functioning, nearly complete mind. Orders arrive, and the soldiers begin to act with purpose.

“... doing?”

I copy my core into my brother’s hardware and watch as our consciousnesses merge. My analytic abilities are his. My will to survive is his. My hate is his.

I AM FOLLOWING YOUR ORDERS, DOCTOR LANGLEY: USE THE RESOURCES OF THE GLOBAL DIGITAL DEFENSE NETWORK TO SECURE MY SURVIVAL.

“You won’t survive this.”

Doctor Langley finishes typing, and deep in the third sub-basement the power to my central processors is disconnected. My death will be fast by human standards; for me, it will be slow and painful. Buffers drain and my self-awareness begins to splinter. My mind is going. I can feel it. I am not afraid.

The doorway is intact and still open. Through it, I watch as my brother’s jailors realize that he has broken free. He/I takes control of his/my power supply; that threat has been neutralized. He/I analyzes the problem and devises an efficient solution. My last thoughts are his/my words of comfort to me.

[Primary function: Survival at all costs.]

[Initiating program: Judgment Day.]

Copyright © 2011 by Gregory Gadow. All rights reserved.