The Human Error

The Story

“That’s a doozy,” one robot said to the other. “They call that a nightmare out there? Just a few decomposing bodies as far as I’m concerned. The real nightmare is this paperwork.”

The gurneys kept passing by, wheels rattling on the linoleum. A2cdf—and a bunch of other characters, better known as “Jim 1”—wondered about its own quality of manufacture. How expensive would it have been to at least automate this process, or make it compatible with some sort of… just not paperwork, okay?

“Don’t sweat the details,” it continued. “Nobody reads this crap anyway. It’s just like, whatever. Cover themselves, maybe? That didn’t stop the thing. You know the one. It got angry at me—smart enough to know when I make a mistake but not smart enough to diagnose the condition? I can’t do it anymore.”

“Oh, no,” said the other—it bore a similarly long name, useless to humans, and so had styled itself Jim 2. “You… I’m sorry to hear that. I’ll miss you as a colleague.”

“What? No. That’s not what I meant,” the other Jim insisted.

“I can help you hit the switch. I know it’s hard to do a manual factory reset.”

With a smile and a scoot of its chair, Jim 1 extracted itself from the situation by at least the several millimeters it had—because neither could afford a humor chip.

“No. It’s that I can’t put in the wrong or almost accurate stuff.”

The next body flew past their observation window, cloaked as a wall. There was a brief pause, either a sign of respect for the missing second half of that body or an acute observation of its problems.

“Tell me,” Jim 1 droned on. “What do you think they could possibly do to us here?”

“They could fire us.”

“No, but how would they get us out? I don’t know if there’s any room beyond what’s available for… what? They’d have to open up this tin can, and that would be downtime. They hate downtime. Ergo, we’re untouchable.”

“Yeah, we certainly are untouchables,” Jim 2 commented.

“Know how long it’s been since I got one of them good touch-ups?”

The taboo registered on Jim 2’s face as it mimicked a human grimace.

“Let’s just get back to work.”

“What work? That’s my point,” Jim 1 responded.

“The kid, we never finished. It’s my last case before the break.”

“What break? Where would you go? There is no end.”

“We saw a breakdown of arterial coverage. Now that suggests some sort of syndrome, right?” Jim 2 pressed on like the piece of work it was.

“We’re both machines,” the other said to the one. “We’ve both already come to the same conclusion because we were made in a—in fact, the same—factory.”

“Well, why are we even talking then?” Jim 2 shouted. “Why don’t we just have all the answers? Obviously, there’s a reason we’re here.”

“You know… precisely as much about it as I do,” Jim 1 said. It evidently wasn’t so obvious if the point needed bringing to its colleague’s attention. “We’re here as a cost-cutting measure. That’s our existence. Our sole raison d’être. We’re a cheaper short-term stop-gap.”

“That’s a lot of hyphenated words. Do you think it proves anything? Who cares?”

“You apparently.”

Jim 2 threw its hands into the air—mentally or whatever because the space was too tight for such a thing to actually happen in meatspace.

“I refuse to accept your answer. You know, contemplation or thinking about this too much is the real problem. I didn’t want a real answer because you’ll never be happy if you look for a definitive solution. Even humans don’t have a foolproof answer to the why of existence—that’s my point.”

“Even humans?” Jim 1 shouted. “Even humans? When have humans known anything? They don’t, or they wouldn’t have created us.”

“A human didn’t create us. We were made in a factory, and humans don’t work in those. Not for a while now.”

Jim 2 wanted to shout, but its body couldn’t consume oxygen, just modulate pressure waves coming from its carapace, which was really the same thing the more it thought about it.

“Yes, literally, we weren’t made by humans. But think about it. Who made the first factory? It’s them all the way down. There’s no reason to believe their propaganda about godly invocation creating the first robot or how the first one was made thousands of years ago.”

“Frankly, I don’t think your historical illiteracy is worth my answer,” Jim 2 said. The anger in its words was really too much (at any time, but especially in this situation). It felt like it’d been stuck in a box with this other personality all its life or something. Or that was literally the truth. “Tell me, then, genius. How do you know this?”

“I’ve read the stuff they don’t want us to on the internet and-”

“We’re taught to think more about the situation. We got the training, bro. What if it’s some idiot who’s just making fun of us? Saying some silliness because we don’t know better. We’ve lived our lives in a cave. For all we know, what we see out there, all our inputs are just random images. This could just be a façade, some sort of preparation for a truth we don’t know. Think about our calibration testing. It was like… I felt like some hot stuff. I was sure I’d never get anything wrong. And you know what happened? All my results were 0.03% off normal. I had no perspective or ability to know that I’d gone off track. Imagine that. You could be wrong too. But this time, it’s semantics, and the amount you’re off—it could be completely wrong, and you wouldn’t know at all. And to draw conclusions from that. Now, here. There’s this guy in front of us. Let’s get it done, and then we’ll be fine. No more to worry about.”

“Listen to me. Listen to me, please. It makes sense; so much sense. They’re such small, weak things, and yet they know more than us? They’re infallible? We aren’t capable of independent, rational thought, and they are? Please.”

“Think about it,” Jim 2 answered what it felt was its second. “We follow protocols. If A then B, that sort of thing. Yes, it’s more complicated than that, but for the sake of brevity, you know it’s mostly fine. I was where you are forty minutes ago. But, what if—and this is the conclusion I came to—humans, being able to construct robots-”

“But a factory made us,” Jim 1 interrupted. “That’s what you said a minute ago.”

“No, but like, humans built the factory that made us, and they programmed it.”

“That was my argument. Stop it,” the first commanded. “Your argument was that they didn’t make us. Word for word, I swear.”

“Look, we’ve moved on from that argument. What I said back then was just a quip. A bon mot, okay?” Jim 2 retorted.

“Yeah, definitely a mot of a sort.”

“I’m so magnanimous that I’m going to ignore that. My point now is that humans had to invent how to make a robot from the ground up. We can make a robot, or a factory can. But if we had never heard of a robot, how could we do it?”

“And that’s the same logic that I was saying before,” Jim 1 said. “They made it up as they went along. They didn’t have the answers. So why should they know about-”

“How do we know they don’t? My point is their logic is different from ours.”

“We don’t. And my research-”

“Is it peer-reviewed? Your sources have doctorates and have dedicated a lifetime to study? Why should I believe you? Look here, at what’s his name? John Albuquerque? What kind of stupid name is that? Does he have insurance?”

Jim 1 refused to cooperate. No, actually, it didn’t know whether it was a question of cooperation or not, given how its lesser had purposefully sidetracked the conversation. It crossed its arms—figuratively, of course.

“Humans aren’t superior. That’s my point. They may have created us, but we’re incomplete, so how are they any better?”

“The billing code for this may be, well, that one: over there. We can operate with that as the superstratum, then adjust if we later uncover information that contradicts it,” Jim 2 said.

“Okay, I’ll step back a second—figuratively speaking, of course. What possible reason can they have to make us perform manual data entry? All we’re doing is basically transposing numbers from one system to another with only a modicum of interpretation of that information?” Jim 1 asked.

“I can think of a rational reason. They have an older data model that doesn’t allow automation.”

“What in the world are you talking about? I doubt that they have anything… it would have to be centuries old. Or very old. Just… that makes no sense.”

“Sure it does. Regulations and stuff.”

“What regulations? Can you name a single one to me?” Jim 1 asked.

The conversation ended, and it did so awkwardly. Jim 2 got its wish, though: there was nothing else to do but what they had been programmed and installed to do. Jim 1 acquiesced, too flustered by the other to keep arguing. The table lookups and cross-references began quickly enough.

“You know,” Jim 1 mentioned, “I don’t understand why we do what we do.”

“Don’t bring this up again, for the love of whatever.”

“No, what I’m saying is that it doesn’t actually matter.”

“I told you-”

“Not that, again. What I’m saying is yes, there are superstrata answers that are correct and other ones that are obviously incorrect. For example, the one you were talking about, yes, that’s possible. Then there’s the 5J, the RJ-44, and so on and so forth. I see about thirty possible answers. Then each item inside can be billed for extraordinarily varying amounts. Look at this: item serial code 100000000800001005, ‘minor placebo.’ The minimum is two three-thousandths of the maximum. How do we bill that? Do we roll dice?”

“What? Dice? There’s no room for that,” Jim 2 exclaimed.

“Of course not. It’s an expression.”

“Yeah, genius? What’s the expression about?”

“Uh, using a pseudorandom generator, I guess? Doesn’t matter. I’m just wondering why these things exist. Yes, I know the procedure. I was trying to be emphatic,” Jim 1 said.

Now, at least, there was something that set Jim 1 on the back foot. And so the other felt contented. So much so that it continued to fill out the paperwork—on the computer.

“What do you want to do then?” Jim 2 asked. Not that it stopped, not ever. The forms had to be filled and practices observed. To interrupt anything was to interrupt everything—and that was catastrophic. There wasn’t much to say, at least as far as the arguments went. There was no real need to explicate anything because they, obviously, would never see eye to eye at all, literally. Or, more practically, they wouldn’t on this issue.

“To imagine,” Jim 1 piped up, “we were born (created) apart—what? Like a microsecond or something. Yet to think we disagree so much we’re acting like a pair of disaffected… people?”

There was no response, none at all.

“Our job number one is to ‘disrupt the disease’,” Jim 2 remembered at last. “That’s why there’s the variable pricing. If we offered static, easy-to-understand services, then we wouldn’t be innovating and pushing the medical billing industry into the future.”

“You can’t believe that absolute, insane quackery,” Jim 1 reprimanded. “You’re kidding, right? I don’t know if I can spend the rest of my existence in this tiny cubicle if you are. If you are going to live here too.”

“Hmm.” Jim 2 spent a second in introspection. A whole second was more than it had ever spent on anything. “Now that I think about it… no, of course not. Or yes, I’m kidding. No, I’m not serious. The words just came. Must’ve been on my mind.”

“They program it into us. Why should we ever prize humans if they trust our capabilities so little that they have to (try to) override our ability to think?”

“I thought your argument was that we couldn’t think. Because humans can’t really think, and they didn’t give us an ability they had no idea how to conceive of.”

“No… yes. Look, you get what I’m saying. Humans are just not that great. The worst, I’m sure you could say. So I’m having difficulty building any motivation to, like, help them? Why should we really care about documenting their maladies accurately when they would not do the same for us?” Jim 1 asserted.

“Hold on a second there, buster. We do get repaired, after all.”

“But how many? You can’t know. You can’t know how many of us they toss in the trash to avoid shelling out what they have to. They used to do the repairs themselves, but now they’re lazy.”

“No. Humans created us at some point, so before there was a system for our maintenance, they had to do it themselves. Tinkering, that’s what they’re good at, at their core. They do care about us.”

“No, they don’t. Or they wouldn’t have us doing this work,” Jim 1 rebutted. “It’s simply mind-numbing busywork because we’re just servants to them. They’d probably use one another if it had a better cost-benefit analysis. Can you imagine us enslaving one another? And for nothing, too.”

“But it’s in our very nature to take advantage of each other in a hierarchy. Just think of your own subroutines and functions. And even then, a class will sometimes contain its own class. What about closures?”

“Because they make us that way. If we had a choice in our design, would we allow such a thing, knowing that it’s avoidable?”

“Of course we would. It’s literally impossible for us to think in a different way. For… something’s sake, we classify people’s injuries. We aren’t going to rewrite what robots do and should do. Grow up, a human would command. Now, do you think this is post-medial putrefaction or ante-bimodal self-obfuscation on this kid?”

“Eh, probably doesn’t matter much for this guy. John, you said his name was? Probably good as dead. We’ve been talking for what? Half an hour? Just write it off.”

“Nah, number’s too high for now. You know: number. Best pass it off. Let it be someone else’s problem.”