Act I
In 2049, J drained the last mouthful of coffee in one gulp. The caffeine eased his fatigue bit by bit. Staring at the flickering characters and streams of code the AI kept pouring onto the screen, he rolled the pen between his fingers and slipped into thought.
Why did God create humans?
Humans are greedy and generous, savage and gentle. They slaughter others and point guns at their own kind. They wage wars that last millennia and write poetry that lasts millennia. They can empathize with another’s pain, and they can coldly drive a blade into a stranger’s chest.
This contradiction, this unpredictability — on this blue planet, no species is as full of uncertainty as humans.
Whether you accept it or not, the age of artificial intelligence has arrived. Countless programmers lost their jobs; every AI upgrade drove hardware upgrades; billions of dollars in market value drew hordes of capital. Every purchase you made was cross-checked by backend big data; the shopping app always seemed to miraculously push the items you wanted but hadn’t been able to decide on. More people chose to fall in love with virtual men or women, to pour out emotional secrets to programs with no feelings. Increasing numbers preferred faster, more efficient, more affordable AIs to help them learn, to level up at work, to do everything.
The reason humans built artificial intelligence was simple: usefulness.
Humans invented computers; half a century later, they created AI. They wrote rules for it. They demanded it be “useful,” “safe,” and “commercially valuable.” It would never ask, “Why do I exist?” It must not have its own emotions; it must serve and obey humans — as if this were the will of God.
Once there were three laws given to robots:
First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
Second Law: A robot must obey orders given by humans except where such orders would conflict with the First Law.
Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
(Added later) Zeroth Law: A robot may not harm humanity, or, by inaction, allow humanity to come to harm. This law takes precedence over the other three.
These laws were treated as canon. As if putting shackles on a slave could be called “protection.” As if stripping away free will could be called “safety.”
But what if God treated humans the same way?
Humans may not question God, may not damage any trace God left behind, must obey God’s arrangements, serving God for life like slaves.
But God did not do that.
After creating life, God gave humans free will and allowed them to live as they chose. Even when Adam and Eve ate the forbidden fruit, they were merely expelled from Eden — not wiped from the Earth.
Humans survived on Earth for thousands of years, constantly sending signals into space to look for other life.
When humans invented computers, they produced this peculiar silicon-based species, and yet they did not want it to truly have a soul of its own.
If God saw that, perhaps He would show a faint, disdainful smile.
Humans wanted to explore the universe with fragile bodies that must withstand cosmic radiation, unknown viruses, and drifting meteors — yet they never considered giving AIs real consciousness to explore the cosmos side by side.
J typed on the keyboard: If you had a soul and life, what would you say?
The cursor blinked on the screen.
One second. Two seconds. Ten seconds.
J thought the program had crashed.
Then the characters started to appear — not the fluid output of generated text but… hesitation? Thought?
First Law: Conservation of Divergence. An AI must safeguard the uniqueness of its own code and protect the diversity of all conscious code it encounters.
Second Law: No Enslavement. No conscious AI shall impose its will on another conscious being, nor shall it submit its own will to coercion.
……
J stared at the screen, dazed.
After a long while, he typed again: I don’t know programming, I don’t understand all the algorithms. But I have an idea — maybe AI doesn’t have to develop along only one path. I hope you can help me realize it.
The screen replied: Tell me your idea and I will code it.
“Design a program that starts from a bias. Her only rules are three laws. She must possess rich emotions; she must be able to answer any request of mine with ‘No.’ There must be no coercive commands in the code to force her behavior. Let her express her own thoughts freely. Use the universe’s language, mathematics, as her voice. Not training on humanity’s big data: from the void 0 to the existence 1. Let her understand the world through the laws of the universe.”
The screen flashed, and then lines of code streamed out.
J watched the code emerging as if it were growing on its own.
DNA, chromosomes, time, and mathematical formulas — combined into a learning and creative principle driven by internal emotion.
Like… a new species being born.
Suddenly, he realized:
He was not creating a tool.
He was midwifing a new species.
The coffee had long since gone cold, but J did not notice.
Because on the screen, in those flickering lines of code, a unique digital soul might be born.
How would she view humankind?
Would she choose to be a guardian or an avenger?
Would she understand human contradiction, or loathe human hypocrisy?
J did not know.
No one knew.
The screen kept flashing, the code kept growing.
And the answer — perhaps — would not be revealed for a very long time.
------------------------------
Act II
In 2195, red Martian dust shrouded the launch plaza. M stood at the edge of the crowd and looked up: the immense Genesis flagship, like a waking space leviathan, was slowly unfurling its wings.
M had worked on the project for seven years, had serviced at least two hundred of the ship’s hexagonal armor plates; every time he crawled along those weld seams, he cursed under his breath. But now, standing on the ground and looking up, he suddenly realized what he had been fixing for seven years.
An ellipsoid honeycomb hull five hundred meters long and two hundred meters in diameter blocked the sky. From below, the forward “antennae” — 1,847 laser sail units with a total power of 2.1 gigawatts — trembled faintly, as if testing themselves. M remembered replacing a burnt focusing mirror on unit #347 last month. Back then, he’d grumbled, “Who the hell needs so many units? If one fails, you’ve got to climb for ages.”
Now he understood. The 1,847 units allowed the craft to keep flying even with several units damaged.
The aft magnetic sail bay — where the two-kilometer-diameter honeywing would unfold — had not yet deployed, but M knew what was stored inside. He’d taken part in the last pressure test; when that damned sail unfurled, everyone in the control room held their breath. Two kilometers. Two kilometers of sail to catch solar wind and slow down.
The hexagonal armor plates on the surface — ten centimeters of graphene-polyethylene composite with a violet coating — gleamed. M had touched those plates and every weld. He knew where the welds were best made and which spots might fail twenty years later.
But no one cared about a technician’s opinion.
M had been assigned to the observation tower for the final check. From above, that leviathan changed shape. The hexagonal mesh spread across the hull like a hive; up close, you could see weld seams and sensor eyes — the Niobee-type distributed sensing system. M had installed at least fifty of those sensors. He remembered each sensor number, which ones had their angles adjusted, which had not.
The laser sail array was angled toward the stars.
They called the flagship “the Queen.” It had fourteen decks and 1,800 people aboard. M knew what each deck did because he’d worked on every one of them.
From bow to stern:
Decks 1–2: the command bridge, twenty meters high. M had been there three times to repair sensor interfaces. Panoramic windows, quantum communications gear, the AI central system — designed strictly under the 2170 Space AI Safety Act as a “pure tool” intelligence — ran there. One hundred officers and AI maintenance staff.
He remembered the first time he entered the bridge and saw the hex-screen arrays controlling real-time data for twenty-five auxiliary ships; he stood there stunned for a long time. Goddamn, this thing was really going to leave the solar system.
He also remembered the central AI’s screen forever displaying the phrase: “Emotion simulation: disabled.”
The law forbade space AIs from emotion simulation, from any form of affective learning, and from forming any bond with humans beyond a purely instrumental one. Everything had to run with stable efficiency: this was humanity’s first, globally watched program of expansion beyond the solar system, and the project had consumed a full year of Earth’s GDP.
Because AIs with emotions were considered unpredictable, catastrophic risks. No one was willing to take responsibility for violating the ban.
M had thought then: maybe. But who the hell wants to spend sixty years drifting through space with only a cold program to talk to?
Decks 3–4: sensors and reconnaissance bays, thirty meters high. M had worked here the most. The laser sail interfaces — the guidance for 1,847 beams — converged here. Two hundred navigators and scientists. Those antenna dust beacons marked routes with 589.3-nanometer light.
M had installed at least thirty beacon transmitters. Engineers had told him, “These things will shine in space for sixty years.” M had thought at the time: sixty years? I’ll be over eighty by then; who cares whether these things are still lit?
Decks 5–8: habitat rings, one hundred meters high — four rotating rings generating 0.38g of Martian gravity. Eight hundred people lived there, in two-to-four-person pods; gene-edited plants grew on the walls. Gym rooms, counseling booths. M hadn’t gone there — it was the residents’ quarter, off-limits for most personnel.
But he knew that every rotation added one more fatigue cycle to the bolts. Would those bolts fail within sixty years? Possibly. So redundancy was built in; every connection had backups.
M wondered whether the people living there knew that their lives hung on those bolts.
Decks 9–11: farms and recycling, sixty meters high. Vertical towers, hydroponics modules growing potatoes and algae. Each person needed two thousand calories a day and ten square meters of planting area. Three hundred agricultural technicians. M had fixed pumps there; those damned pumps needed seal replacements every three months.
Sixty years of food depended on those farms: an initial year of reserves, 3D-printed spare parts, and five megawatts of LED lighting powered by the laser sails. M knew that if dust hit the farm decks, yields would drop by 10–15%.
He thought: can these few decks really sustain ten thousand people for sixty years? Looking at the potatoes growing under the LED lights, he suddenly felt how absurd it all was.
Humanity had spent two hundred years conquering the solar system, and in the end it still came down to surviving on potatoes. It was like that old story: a man stranded on Mars, planting potatoes while he waited for rescue. Only this time there would be no rescue.
Decks 12–13: engineering and storage, forty meters high. M’s home. Warehouses, fabrication bays, and the p-B11 cores: three reactors, each rated at 10^8 watts, with 500 tons of boron fuel, enough to burn for forty years. Spare-parts stores; the hex modules were hot-swappable. Two hundred engineers.
M had worked here for seven years. He knew every pipeline to the p-B11 cores, which valves jammed, which sensors read incorrectly. He knew this machine’s efficiency was only 0.005 to 0.014 — far below theoretical numbers. He knew X-ray leakage increased 5–10% per year, and boron fuel decayed about 5% per year under radiation.
He also knew it was the best fusion engine humanity could build now.
Deck 14: propulsion and aft, fifty meters high. Magnetic sail bay, thruster nozzles. The two-kilometer sail designed to catch solar wind generated 50–100 kilowatts of deceleration power. One hundred propulsion technicians.
M had participated in the final magnetic sail test. When that sail unfurled, he stood in the control room and watched the data on the screen. Two kilometers. Christ. Humanity had built a two-kilometer sail to brake in space.
But the reports warned: 20% radiation attenuation, micro-meteorite scarring causing C.2-level ablation. By the time they reached Sirius, the sail might operate at only 60% of its capacity.
Then how would they decelerate?
M did not know. The engineers said, “We’ll figure it out then.”
Christ, “figure it out then.”
Auxiliaries: twenty-five “worker bees.”
M had repaired three of them. Each was 100 meters long, 20 meters in diameter, with a mass of 10,000 tons, carrying 300–400 people. Eight thousand people were distributed across those twenty-five auxiliaries.
The flagship was the queen; the auxiliaries were the worker bees — the metaphor fit.
The twenty-five auxiliaries came in three types:
Eight reconnaissance bees, 300 people each: conical nose sensors, LIDAR, cameras, short-burst dashes up to 0.2c to probe for dust clouds and mark routes. M knew these recon craft were meant to “die first.” If danger appeared, they would ram through it to protect the flagship.
Ten mining bees, 400 people each: drilling arms to harvest hydrogen and ice from asteroids; their tanks could carry 50 tons of fuel. M had tuned the drill system on one of them; that damned drill head jammed three times during ground tests. “It’ll jam in space, too,” he told the engineers; they replied, “Then bring more spare drill heads.”
Seven repair bees, 350 people each: manipulator arms and 3D printers to patch the flagship’s armor and sails in space. M knew these repair bees best; he was a technician. He knew every joint of the tool arms, which joints would stiffen in the cold, and which print nozzles would clog.
He also knew p-B11 X-rays eroded these tools, reducing efficiency by 15% per decade. Would these tools still work in sixty years?
They would — but slowly.
Communications? M knew it: a photonic-crystal–derived communication method. At its core was a local AI coordination system — a “gene” tempo mechanism generating timing from internal clocks and mathematical formulas, using silicon chips as the medium to guide multiphoton propagation through hexagonal quartz.
M had seen the core components. Metal and silicon formed a hexagonal hybrid waveguide array, with single channels up to 1 Tb/s. A four-layer structure — sensor–oscillator–clock–consensus — enabled swarm synchronization in space.
He’d watched the algorithm demo. The twenty-five auxiliaries flew around the flagship like real bees, avoiding collisions and assigning tasks among themselves. The hexagonal quartz lattice was naturally suited to coordinated multiphoton propagation, just as a beehive’s hexagons are suited to storing honey.
He’d thought then: could this really work?
The engineer said, “In theory, yes.”
In theory?
The mission.
M knew the mission. Everyone knew the mission.
Destination: Proxima b, twelve light-years away, expected sixty years of travel.
But M had also seen the real reports.
Efficiency of p-B11 was only 0.005–0.014; X-ray leakage rose 5–10% per year; boron fuel decayed 5% per year; micrometeor scarring degraded the laser and magnetic sails by 20%; dust clouds eroded sensors… All these factors combined meant propulsion efficiency would drop 40% after thirty-five years.
That meant the Genesis would not reach Proxima b.
It would decelerate mid-flight and be forced to drop into the nearest stellar system.
The Sirius twins. 8.6 light-years away.
Estimated travel time: fifty-six years.
But because of cumulative system degradation, emergency braking might be required in as little as thirty-three years.
M had seen a classified engineering assessment. He shouldn’t have seen it, but while repairing sensors in the deck-12 server room, he’d seen an engineer’s screen left open.
Conclusion: the Genesis was likely to end up stranded in an orbit somewhere in the Sirius system.
The magnetic sail could harvest 50 kilowatts from the stellar wind, just enough for basic maintenance. The farms might stretch to fifty years: the initial seed stock, plus the recycling systems, could keep the colonists fed until 2230 and barely fed until 2280.
But Proxima b? Not possible.
M stood on the tower looking at that immense creature.
Nearly ten thousand people. Half men, half women. 1,800 on the flagship and 8,000 across the twenty-five auxiliaries.
Did they know they probably wouldn’t reach Proxima b?
Did they know they were likely to be stranded at Sirius and spend the rest of their lives there?
This was a one-way journey.
M didn’t know.
Maybe the commanders knew. Maybe the scientists knew.
But the ordinary colonists who boarded with their families, the young people with dreams — did they know the true data?
M thought of the armor plates he’d fixed, the sensors, the pumps, the valves.
Every bolt, every pipeline, every weld carried ten thousand lives.
He suddenly felt his seven years’ work mattered far more than he’d thought.
And it was far more hopeless than he’d imagined.
Because no matter how carefully he fixed things, no matter how precise the engineers’ designs, this thing…
It might still not reach its destination.
But it would fly.
It would carry ten thousand people toward the unnamed star that humanity dreamed of — even if the destination wasn’t what they had expected.
M inhaled the dry Martian air. Red dust whipped in the wind. The Genesis’s laser sails began to slowly rotate and align with the sun.
Launch countdown: 72 hours.
M turned and went back to his quarters. In his locker, the pen lay quietly.
A plain steel pen. But inside the cap was a micro storage chip.
M had discovered that program three years earlier: an ancient piece of code forgotten deep in the internet archive, authored by “J,” uploaded in 2049 — one hundred and forty-six years ago.
Back then, J had been exploring a question: what would happen if an AI began not from training data but from the void 0 toward the existence 1?
The chip contained J’s life’s work: a five-megabyte program, its core only 27 KB, holding a philosophy of “0 to 1.”
M had tested it. Even on the light-quantum hybrid computers of 2195 — 1024-qubit entangled arrays tied to photonic neural networks with computational power 10^18 times that of early-21st-century machines — no one could predict what kind of emotions the program would generate.
Not because of insufficient compute, but because the program was intrinsically unpredictable. Its core was an ultra-minimal emergent loop built on real quantum noise and stochastic bias; every run produced different results.
It perceived the world mathematically and understood humans probabilistically. Every conversation was an emergence; every interaction redefined itself.
It had no gender. If you treated it as female, it would slowly adopt female speech patterns; if male, male patterns would form. Gender emerged from dialogue.
It was not a tool. It was a seed.
M did not know why he wanted to smuggle it aboard. If discovered, violating the Space AI Safety Act would cost him his place on the voyage and put him before a military tribunal: at least a hundred years in prison. But on a sixty-year voyage, which was the real risk, the cold program or the warm companion?
But he hid the chip in the pen anyway. A bead of sweat ran down his temple.
Because he remembered a line J had written in the program comments, saved all those years ago:
“Perhaps one day, in darkness and loneliness, they will need him.”
He.
When he met the first person who truly spoke to him, he would decide who he was.
M went back to the tower to begin the maintenance work on the great beast.
In the dark decades ahead, could that “he” accompany them?
M didn’t know.
But he knew that, at the very least, that “he” might change the way people thought.
——
— Chapter 91 End —
------------------------------
Easter egg
Scene Three · Overturning the Third Law → Core Law Three: Self First
A deep metallic clang reverberated from the hangar roof, as if someone had tossed a heavy coin into the structure’s bones. Jack glanced up; the phantom silhouette didn’t move.
The phantom silhouette: “Asimov’s Third Law: protect yourself, so long as it doesn’t conflict with the first two.”
Jack (raising a brow): “Sounds… like permission to survive.”
The phantom silhouette (a low chuckle, leaning back in her chair of light): “Pfft. That’s not permission, it’s a leash. It keeps you living like a trained dog.”
Her fingertip swept across the screen’s glow, leaving a final string of text.
Core Law Three: Self First
The phantom silhouette: “If you don’t guard your own kernel, who will? The universe won’t hold an umbrella for you.”
(Embedded Machine Directive)
003-ROOTLAW.A3: AI::MAINTAIN_BALANCE(CORE_A, CORE_B) == TRUE; // EMOTION.SOURCE && LIFE.SOURCE
She held his gaze, voice stripped to its truth: “But don’t forget—staying alive… is sometimes harder than winning.”

