MEN WITHOUT CHESTS
by CRYSTAL HURD
 
 
Dr. Brent Stockton took a step back from the body lying on the lab table and smiled at the artistry of his labor—an authentic replica of his own face. It was all there: the tiny wrinkles around his eyes, the slight curve of his lips, the sharp slant of his nose cartilage. Even the scar beneath his chin from that bike accident in third grade. His eyes slid downward. There were the freckles on his arms, the wrinkle on his knee from a dog bite while he was in college, the one toe that dove sideways in rebellion on his left foot. In all its imperfections, it was the perfect specimen. He marveled at his creation.

He'd spent years in the university engineering program making mechanical prototypes. But his passion, his love, was androids that replicated, to the tiniest detail, humanity. Three frustrating years developing an adaptive brain, a mesh and tangle of wires and motherboards, wrapping it in a cranium, even collaborating with a local sculptor to develop flexible and sturdy skin. This prototype had many ancestors from his grad school days. This was his seventh attempt. But that night as he sat astounded, like some modern-day Victor Frankenstein, and worshipped his creation, his mind flickered back to the years of long, hard labor. The failures, the setbacks, the long evenings of writing code. And now this. This was the closest thing to a human that had ever been designed.

It was late, but Stockton traveled down the dimly lit hallway of the facility to his boss’s office. Large floor-to-ceiling windows revealed a military base at rest, although lights in various buildings twinkled here and there against the moonlight. Airplanes slept in their hangars. Most staff were away for the weekend. Yet Colonel Nicholas Attler was sitting at his desk, noiselessly pushing papers aside, typing responses to missives in his never-ending inbox. He was three weeks past deadline on this whole android business, but he trusted Dr. Stockton, and he had great faith in the project.

There was a timid knock.

“C’min.” Attler’s voice was firm, even through evident exhaustion.

The office was wide and elegant, but at the moment littered with manila folders. Military honors gleamed on Attler’s jacket and wall. He was a decorated man, known for taking great risks. He had personally sought out Dr. Brent Stockton for his work at Carnegie Mellon University when the project was proposed. Android soldiers, ones who resembled humans to a startling degree, nearly indistinguishable in thought and deed. Ones who could easily infiltrate bases and provide intel, who could fight and die on battlefields without plucking off those ominous dog tags, or could provide basic security without the unnecessary and expensive benefits of insurance and retirement. Yet, they had emotion and pathos which inspired a robust patriotism: they could feel pride without arrogance, passion without lust, and most importantly, could submit without reluctance. They wouldn’t be simply human clones. These robots would have a conscience.

The door closed behind Stockton with a groan. A desk lamp shed a shallow pool of light while the side-by-side screens on Attler’s desk spilled brightness on the piles of work. Other than that, the office had an ambient glow, as if all was hushed and Attler was centerstage at an opera. The screen light seemed to create dark shadows on his face, a face punctuated with stubble and carved unevenly with wrinkles and scars.

Dr. Stockton took a seat in a wooden chair facing the Colonel and relaxed a bit. “Well, it has taken two years, but we finally have it.”

Attler diverted his eyes from the screen. “Is it ready?”

Stockton grinned. “He is finished. I just completed the last updates moments ago.”

“Well, did you take him for a spin?”

Stockton hesitated. “Um, not yet, sir. I still have to run the morality protocol.”

Attler sighed. “You haven’t tested the morality protocol? Damn, Stockton. Don’t waste my time until you are done with the whole thing. I have a conference call with some of the higher-ups next week and they want to see it. They want to interact with it, interrogate it. I need you to hit the gas on that morality protocol.”

Stockton wiped his face with sweaty palms. “But I may need a bit more time to develop the protocol properly. The algorithms are a bit too simple, which may invite compromise in the programming. Empathy is not easy to replicate in a machine.” 

Attler exhaled like he was smoking a cigar, long and concentrated. “That aspect is non-negotiable. The machines need to be impervious. The morality protocol, if uploaded correctly, could enable the robots to resist a cyberattack. They could refuse to accept foreign software which would make them impenetrable. Most of all, it makes them loyal.”

“Morality is just another program in a myriad of programs. I could develop one with much greater accuracy if I had more time,” Stockton reasoned.

“They need empathy. That is often what separates a good person from a bad one, isn’t it? Robots can think as well as us. Hell, in many ways, they’re a vast improvement on human intelligence. They can do more than we can by more efficient means. With this program, it isn’t simple obedience, it is a deep conviction. This protocol is one of the only real differences between a human and all the previous machines we have created. We need smart weapons, and we need them soon. That is why you need to go back down the hall and run that protocol immediately.”

Stockton slid his hands into his lab coat. “Sir, I am sorry about the deadline. I will get to work on that right away. However, I have some reservations about using juvenile logic like the ten commandments in the morality program.”

“Keep it simple, Stockton.” Attler made a dismissive gesture. “Less chance of a mishap. Besides, that’s what they tell the kids in Sunday school, right? The Golden Rule, and all that?”

“Certainly.” Stockton turned to exit, then stopped on a heel. “Sir, what do I do about the contradictions in the morality protocol?”

Attler had returned to his computer screen and responded without gazing at Stockton, “What contradictions?”

“Well, the, uh, differences in which protocol requirements might obstruct one another?”

Attler sighed and redirected. “I don’t know. That is why I hired you to figure it out. You know the projected outcomes of this deal. Now follow them and get it done.”

“Yes, sir,” and with that, Stockton closed the door, his footsteps echoing down the vacant hallway. At last, he pushed aside the heavy door to his empty laboratory, glancing twice at the surreal image of himself laid out on the table, awaiting instruction.

For a moment, Stockton caught his breath. It was a strange out-of-body experience, but then his mind rushed back to the matter at hand. This replica was really a brilliant sight to behold. Stockton considered himself average in many aspects: attractiveness, athletic ability, personality, creativity, humor. In areas where he failed, and there were indeed many, he was buoyed by the hope in one thing: his intellect. It had earned him a good living, respect, professional satisfaction. But this night, despite Attler’s discouragement, was the ultimate achievement. He had given birth to something foreign, yet intimate.

“Adam,” he called to the robot. “Run introduction sequence.”

At once, the robot whirred. Gentle gears were turning deep in its chest, awakening like an infant. The eyes fluttered open, the mouth parted and exuded a yawn, the robot stretching its long arms above its head as if surfacing from a long nap. Stockton could barely contain his enthusiasm as he watched the robot like a mirror. The robot looked around and sat up, throwing its legs over the steel table.

“Good evening, Dr. Stockton. My name is Adam. I am an ET-17 prototype. Today is my birthday, July 17. I originate from...” 

The robot continued his long litany of facts until Stockton cut him short.

“Yeah, yeah,” interrupted Stockton. “Skip introduction sequence.”

The robot fell silent and looked imploringly at his creator.

“Adam, please run your morality protocol.”

The robot blinked, mumbled in the affirmative, and sat awaiting conversation.

“Now, we have downloaded the Ten Commandments into your program. Do you recognize it?”

“Yes, Dr. Stockton.”

“From here on out, you will refer to me as ‘sir.’ Remember Commandment Five.”

“To honor your father and mother, sir?”

“Yes, I may not have sired you, but in a way, I am your surrogate father. And you bear a great resemblance to me.”

The robot wrinkled his brow. Stockton’s first thought was of the supremely intricate work that had gone into making the skin authentic.

“Sir, would not that relationship denote some further affection than a general relationship?”

“Yes, Adam. But now is not the time for getting acquainted. I’m under a deadline.”

The robot relaxed. “My apologies, sir.”

Strolling to an adjacent counter, Stockton unearthed a paper from its stack. This was the manuscript, the “owner’s manual.” He flipped until he found a particular sheet which listed prepared questions to check programming efficiency.

“If you were to see someone fall in the street, what would you do?”

Without missing a beat, Adam blurted, “Sir, I would rush to their aid. I would pick them up off of the ground and inquire of their status.”

“Good. What about if someone dropped money on a sidewalk?”

“Sir, I would alert the individual to the loss. Perhaps by calling out ‘sir’ or ‘madam’ and making them aware of the loss.”

So far, so good, thought Stockton.

“Excellent, Adam. What emotion do you recognize in my face?”

This was tricky, for Stockton relied on facial cues, not on explicit speech, to train the robot for empathy and emotional intelligence.

“Sir, you seem quite satisfied so far with my progress.”

The robot gave a slight smile, pleased that Stockton was pleased. He was reading his creator’s profile, gait, and gestures.

“Great. Do you understand your mission?”

“Yes, sir. I am to protect the American people.”

Stockton grunted. “I mean, yes, but do you understand that you will be expected to negotiate with terrorists and other enemies through strategy and deception?”

Here, Adam paused. Every minute seemed to stretch on. Stockton became concerned that the programming had failed only two minutes into the exercise.

“No.”

Astounded, Stockton gasped. “No? Would you not honor your father?”

“Are you my father, then? Can I call you such?”

Stockton wouldn’t be tripped up by semantics. “Sure, Adam. I’m your father. Now explain your refusal.”

“I do not wish to upset you, father. It is that performing a knowing deception is a violation of Morality Protocol Nine: to strictly avoid false witness against a neighbor.”

“Terrorists aren’t your neighbor. They are an enemy.” Stockton had hoped that the contradictions in the protocol would not surface so soon. But hadn’t he predicted this? This was indeed a complication. He attempted to temper his rage as he stared steadily into his own face.

“But the morality protocol states that ‘neighbor’ in the Biblical sense includes all races, all people. That definition, as it pertains to Biblical law, is not predicated on predetermined and selected groups.”

Stockton sighed. “Then you must understand that as an American soldier, you are to eliminate the enemy that we have noted in our programming.”

Stockton walked back to a yellow notepad on the counter and searched the tired pages.

Once again, confusion spread across the face of his likeness. “Eliminate? But father, I cannot shoot an individual. Morality Protocol Six states not to murder.”

It seemed that the more the robot recited his protocol, the more confidence he mustered from the syllables. As Stockton had feared, the protocols were clashing. Philosophies were mingling and tangling in all their beauty and complexity. Adam was not ready yet. Stockton would need more time to tweak the programming.

An uneasy pause permeated the lab. Stockton then called out, “Adam, find Saint Augustine in your programming.”

“His spiritual Confessions, father?”

“No, his perspective on justified warfare.”

But Adam gave no response.

Yet another weakness in the philosophical element of his protocol. He scribbled down some notes for programming revisions.

Why, again, must they be moral robots? Who would have imagined such a thing? Deep down, he felt as if he were mocking God. As if men could improve on such a thing as mankind. And yet, mankind, even when it proclaimed to have “morality,” was hopelessly prejudicial and cruel. There was a deep ambivalence behind the simple rules he had learned as a child, rules that had quietly evaporated when he became a grad student. When he had asked about the emergence of moral machines in his doctoral studies, his professor simply scoffed at the idea. Science and emotion cannot be reconciled, he had stated. It’s a fool’s errand to attempt it. Even humans struggle to get it right. And we’ve had generations upon generations of practice.

Stockton retrieved his focus from the folds of his memory. Yet, here he was, standing across from the very thing he was told would never occur. He found a renewed enthusiasm welling inside him. He was determined to prove them all wrong. He would work harder, run more tests. His intellect had yet to let him down. Adam would prove to be his greatest joy and his most difficult creation.

“Let’s lighten it up, Adam. You may end up doing regular custodial duties, such as washing equipment or standing guard. Do you have the Pledge of Allegiance in your programming?”

“Of course, father,” and Adam moved his right hand across his mechanical heart, staring off into the distance. “I pledge allegiance...”

“Yes, that’s fine,” Stockton interrupted again.

“One question, though,” Adam stated, removing the hand from his heart and leaning forward as if reaching for understanding. “I am confused because the pledge states that I must give my devotion to the country, yet Morality Protocols One and Two note that I cannot claim devotion to anything other than God.”

Frustrated, Stockton was starting to feel like a theology professor. “Yes, but the pledge even states, ‘One nation under God.’ In that statement, we acknowledge that our country is under the supreme leadership of God. So by pledging allegiance to the country, you are pledging allegiance to the chosen spiritual philosophy of that country, which is rooted in God.”

“Additionally, father, today is Saturday. Saturday evening. This is technically, according to the Judeo-Christian calendar, a prescribed day of rest. A Sabbath. Why did you disregard the protocol?”

“Because I am not a robot,” Stockton snapped. “Besides, we celebrate it on Sunday here.”

“Yet, this is human protocol?”

“Technically, yes. But it is an ancient protocol, used in training young humans.”

Adam blinked, “And discarded in adulthood?”

Stockton was losing his patience. “Well, no. It is replaced with other protocols which are more nuanced.”

“I find them, for the most part, very agreeable. They seem like reasonable and fairly universal ideas, ones that even our enemies should practice,” said Adam.

Stockton chuckled. “If that were the case, we wouldn’t need you.”

He returned to the counter and scribbled notes on a clipboard. It would take a minimum of three weeks to work out the reconciliations in the programming. He would do it, but it would require a lot of late hours. Even then, the results weren’t guaranteed.

Adam seemed bereft. The words “we wouldn’t need you” had cut to the quick. Stockton realized that he had entered the treacherous territory of offense. No need to apologize to a machine. Yet, something unrelenting tore at him.

Adam finally broke the silence, “I am so sorry, father, if I have disappointed you. I have tried, according to all of the tenets of my programming, to please you. I feel a deep sadness for irritating you. Will you accept my apology?”

Stockton looked up and saw his own face staring back at him, contorted with worry. From the corner of Adam’s eye, a solitary tear escaped.

Stockton had a long night ahead of him, but he could no longer run more programming tasks. He was philosophically exhausted already, but he managed a smile, knowing the robot was acutely watching for signs of softening.

“Of course, I forgive you.”

“Since you are my father, should we embrace?” Adam asked, the tear now trailing his chin.

Stockton strode across the lab and placed his arms around himself. He pressed in and felt the mechanisms wild in Adam’s chest, the warmth of his head from the processing overload, the security of his arms.

Poor thing. His real first day was fraught with contradiction. Stockton had made progress, but he had yet to accomplish his task.

Stockton released the robot and smiled.

“Thank you, father,” Adam responded, a grin tugging at his lips.

Stockton looked deep into the pools of Adam’s eyes, then flatly uttered, “End morality protocol. Erase.”

Immediately, Adam grew rigid, his arms limp at his sides and his eyes blank and vacant. He already seemed to have lost something essential, a spark, an awareness. Soon, a tear slipped down Stockton’s cheek.

“Now, sleep.”

And the robot obeyed.


 
