The robot didn’t care what Isaac meant.
It sat on the bench in pieces, wires splayed like exposed veins, the smell of warm insulation still hanging in the air. One of the motor controllers had browned at the edge. Not catastrophically. Just enough to be unmistakable.
They’d been sure the math was right.
The simulation had been flawless. Smooth turns. Clean timing. Perfect response curves. On the screen, the robot had glided through the course like it knew where it was going.
On the floor, it had lurched forward, clipped the barrier, and stalled with a sound like embarrassment.
Isaac crouched beside it, tracing the failure backward. Encoder lag. Power draw spikes. A control loop that assumed the world responded instantly when it never did.
“You need more smoothing,” someone said behind him.
“Or less pretending,” Isaac replied, not looking up.
They swapped the controller, adjusted the duty cycle, rerouted a cable that had been rubbing against the frame. When they powered it back on, the robot behaved better. Not perfect. Better.
That was the lesson, over and over again.
Nothing failed all at once.
Everything failed at the seams.
Sensors lied. Actuators drifted. Timing slipped. And when something broke, it broke locally. A wheel locked. An arm overshot. A joint jittered until it was told, firmly, to stop.
You didn’t retrain the robot.
You didn’t start over.
You isolated the fault, replaced the part, and moved on.
Isaac liked that. The honesty of it.
If the system worked, it was because every piece was doing exactly what it said it would. If it didn’t, the reason was discoverable. Concrete. Usually audible.
Late one evening, long after most of the team had left, Isaac watched the robot idle on the floor, its control loop humming quietly. It wasn’t intelligent. It wasn’t adaptive. But it was reliable in a way that felt earned.
He wondered, not for the first time, why software was allowed to be less accountable.
The thought didn’t go anywhere yet. It didn’t have language. It didn’t need to.
He packed up his tools, powered the system down, and left the lab with the faint, persistent sense that some kinds of failure were being tolerated simply because they didn’t make noise.
First Contact
Isaac, Spring 2021, Age: 16
Isaac didn’t remember exactly when AI stopped being a curiosity and started being everywhere. It crept in sideways, through demos and club meetings and late-night links dropped into group chats with the casual confidence of something already decided.
Someone showed him a language model one afternoon in the computer lab. It wrote code faster than he could think it through. It explained concepts cleanly, confidently, with the unearned certainty of something that never hesitated.
He laughed. Everyone did.
It was impressive. Undeniably so. Better than anything he’d built. Better than anything most people could.
For a while, that was enough.
The unease didn’t come from what it could do. It came from how people talked about it afterward.
When it failed, no one opened anything up. No one asked where it went wrong. Someone shrugged and said they’d retrain it. Another model would be spun up. The old one discarded without ceremony.
Later, he noticed that something it had done well before was gone. Not broken exactly. Just… missing.
When he asked about it, someone waved him off.
“Yeah, that happens sometimes. Tradeoffs.”
Tradeoffs with no accounting.
Losses without names.
That bothered him more than the error.
He started paying closer attention. Not out of suspicion at first, but out of habit. Systems always told you who they were if you listened long enough.
What he found was nothing to listen to.
There were no boundaries. No stable parts. No seams you could point to and say this failed here. Everything was everywhere. Learning meant reshaping the whole thing. Improvement meant erasure somewhere else.
It reminded him of nothing he’d ever been taught to respect.
In robotics, you didn’t fix a sensor glitch by rebuilding the entire machine. In networking, you didn’t recompile the stack because a packet dropped. Even the messiest systems he’d worked on had interfaces, contracts, something you could repair without destroying the rest.
Here, failure was absorbed and forgotten. Success was celebrated. And no one seemed bothered by how thin the explanations were, as long as the outputs looked good.
He didn’t argue. He didn’t object. He just started writing things down.
Not designs.
Not fixes.
Questions.
Why does learning require forgetting?
Why does improvement break unrelated capabilities?
Why is coordination implicit instead of explicit?
Who is responsible when no part can be named?
The questions didn’t feel academic. They felt procedural. Like noticing a missing safety rail after watching someone nearly fall.
The more he watched, the clearer it became that this wasn’t just a technical choice. It was a cultural one. The field had decided that performance mattered more than repairability, that success excused opacity, that intelligence didn’t need structure as long as it dazzled.
That was the moment the shift happened.
Not anger.
Not rebellion.
Obligation.
Someone should be able to take these systems apart. Someone should be able to fix them without destroying what already worked. Someone should be able to say this part failed and mean it.
And if no one else seemed interested in doing that, then at the very least, someone needed to understand why.
He closed his notebook that afternoon with the quiet, unsettling sense that curiosity was no longer enough. That once you noticed a missing boundary, you were responsible for it.
He didn’t know yet what that responsibility would demand.
Only that he wouldn’t be able to unsee it.
By his final year of high school, Isaac had learned the shape of the argument without ever hearing it spoken.
AI demonstrations had become routine. The novelty wore off. The applause stayed. Each new system arrived with the same confidence, the same claims, the same unexamined assumptions carried forward like inherited debt.
He noticed something else too. The people building these systems no longer talked about fixing them. They talked about managing them. Guardrails. Fine-tuning. Prompting. Wrappers around something no one seemed willing to touch directly.
When a model failed, the answer was still retraining. When it regressed, the answer was still scale. When it behaved unpredictably, the explanation was statistical inevitability.
No one asked whether inevitability was a design choice.
He tried, once or twice. Carefully. Quietly.
What he got back wasn’t hostility. It was worse. Polite dismissal. The sense that he was asking the wrong kind of question. That the field had already moved on from worrying about internals.
“Why does it matter how it works if it works?” someone asked him.
He didn’t have a clean answer yet. Only the certainty that the question itself was wrong.
Working systems weren’t finished systems. Systems you couldn’t repair weren’t complete. Intelligence that couldn’t be taken apart safely wasn’t something you built on purpose. It was something you tolerated because it was convenient.
That tolerance felt like a refusal to grow up.
He spent more time alone with textbooks that weren’t assigned. Systems engineering. Networking. Control theory. Anything that treated complexity as something to be managed, not worshipped. He kept noticing the same pattern: mature fields didn’t pretend their systems were indivisible. They built layers. Interfaces. Protocols. Places where failure could be contained.
Why was intelligence exempt?
The question followed him through graduation rehearsals, through final projects, through the strange quiet of knowing you were about to leave a place without having found what you came for.
By spring, the decision wasn’t dramatic. It barely felt like a decision at all.
If the problem was architectural, then he needed a place that treated architecture as real work. If the questions were foundational, then he needed a place that still believed foundations mattered.
New Mexico Tech wasn’t glamorous. That was part of the appeal.
It was a place for systems that had to work because people depended on them. A place where failure wasn’t theoretical. A place that didn’t mistake elegance for completeness.
He didn’t tell anyone he was going there to fix AI. That would have sounded absurd, even to him.
He told himself something smaller. Something truer.
He was going there to learn how complex systems were supposed to be built. To find the rules everyone else seemed willing to skip. To understand what had gone missing before the answers became unrecognizable.
On the day he packed his room, he found the old notebook from a year earlier. The one filled with questions instead of plans.
He didn’t add anything to it.
He didn’t need to.
The questions hadn’t gone away. They had simply stopped being optional.
He closed the cover, slid it into his bag, and left for NMT knowing only this:
If there were answers, they wouldn’t be found by scaling what already existed.
They would be found by rebuilding what had been skipped.
And that, at least, felt like something worth learning how to do.

