Divine Artifact in a Scientific World

Chapter 163: Second Monday (8)

Author: FractalSoul

"Um, wouldn’t that cause the extinction of the human race?" asked Samantha.

"Sure, but if we can create a safe self-replicating birth-control nanotech spore, then we should be able to design something that people can turn on and off."

"But is that ethical?" asked Samantha. "What if something goes wrong? It could cause the extinction of the human race, even if you didn’t intend it."

"We can test it in a simulation first," said Madison. "We’ll probably need to wait until Jack’s soul is stronger and has a higher EP generation rate, but once we can run long-term simulations, we can test it before releasing it for real."

"Oh. Right. I’m still trying to wrap my head around all this."

"You and me both," said Jack.

Everyone chuckled.

“Does anyone see a problem with creating a universal birth control?” he asked.

Everyone shook their heads.

"Great. I don’t think anyone objects to finding ways to improve food production, so let’s move on to goal number three, AGI and robots."

"The invention of robots that can take over menial labor and manufacturing jobs is going to happen. It’s just a matter of time. So how can we mitigate the disruption that AGI robots will cause?"

"Limit one per customer, only rent, and provide options so even a house cleaner could afford one," said Isabella. "You’d also want to offer advanced and luxury models and super cheap basic models so you can cover both ends of the market."

Everyone turned to look at her. She’d been quiet the whole time.

"Why limit to one per customer, and why only rent?" asked Jack.

“If you just sold them to whoever could afford one, then companies would buy all of them and use them to replace human employees. Most experts agree that there would be drastic short- and medium-term impacts on the economy and the job market if AGI robots became available.”

“The only way to mitigate that is to limit robots to one per customer. If you do that, then companies would want to hire the people with robots, or rent the robots from their owners.”

“There would still be some disruption, but money would keep flowing to average citizens instead of concentrating in the hands of the few people who can afford to buy large quantities of robots.”

“And the reason to only rent, and never sell, is so that people don’t just act as proxies for companies. If you allow people to buy a robot, then they’ll just turn around and sell it to someone else for more money, and companies would still be able to accumulate large numbers of robots.”

“Okay, that’s an interesting idea,” he said, “but what about people who can barely afford to feed themselves? They’re the ones whose jobs are most at risk of being replaced by AGI robots, yet they’re also the ones least able to afford one.”

"There’s not much you can do," replied Isabella. "AGI roots will disrupt the job market. I think the best you can do is try to provide opportunities for those people to get an education or provide them with alternative employment. Like in your VR world."

"I think we need to establish the VR world first," said Nora, "before AGI robots become generally available. If we include free education and virtual job opportunities, then we could create a natural demand for robots."

“So, you’re suggesting that by offering free education and VR jobs, people who are currently doing menial labor would switch to better-paying jobs, leaving a vacuum in the menial labor market?” he asked.

"Yes," said Nora.

"I agree," added Isabella. "If you ensured that robots are not able to maintain or repair themselves, then you could offer free training on how to do that so people could switch to doing robot maintenance work."

"Also," added Isabella, "about half the world’s population live on less than ten credits a day. If you made education and VR access free, or incredibly cheap, then those people could potentially improve their lives."

"But to make VR that ubiquitous, you’d have to run fiber optics everywhere. Satellite internet is just too laggy for VR," said Rina.

"Fungus net," said Nora.

"Fungus net?" asked Jack.

"Sure. We could engineer some kind of mycelium slash nanotech fungus that grows deep underground and spreads everywhere. Then it would sprout little bushes every kilometer and provide WiFi access."

"If we released something like that, I think half the planet would lose their collective minds," said Miranda.

“Okay, so keep it underground and just build the access points ourselves. Then claim we’re using proprietary tech to distribute the fiber.”

"Or just use nanotech to dig tunnels to run the fiber," said Samantha.

"Or that," said Nora, "If you want to do it the safe and boring way."

“Okay, we want to release VR tech before AGI robots are readily available, but we also need to be first to produce AGI robots so we can dominate the market and enforce the one-robot-per-customer policy.”

They all nodded.

“And we need to be first with AGI. Not just any AGI, but safe AGI, so we can avoid a Skynet situation.”

They all nodded, Rina and Madison vigorously.

"So, as much as I hate to say it, I think we need to prioritize creating AGI over nanotech development."

Nora opened her mouth to protest, but he held up his hand.

"I’m not suggesting we halt nanotech development. But if there is a competition for compute resources, we need to prioritize AGI."

Nora frowned, but nodded.

"Nora, your focus remains on nanotech. You’re mostly using Genesis Heart simulations anyway, so you shouldn’t experience much impact from our shift in priorities. And I think we can spare a few Cerebras to dedicate to your work."

She smiled, looking relieved.

“Madison, I want you to focus on AGI. We need to establish ourselves in the A.I. market, so start with the same kinds of models everyone else is working on. With the amount of compute we can throw at the problem, you could generate a new, fully up-to-date model every day, something other companies simply cannot do.”

"Got it," she said. "Then once I’m comfortable training current models, I can start working on trying other theories to see what works."

“Yes, but I want to focus on some way to ensure that an A.I. cannot betray us. Either because it never achieves self-awareness or a will of its own, or because we can prove that it cannot violate a given set of rules.”

"Like Asimov’s three laws?" asked Samantha.
