Jay's mother called on a Sunday. She was a retired schoolteacher from Portland who understood the internet in the way that people who read about it understood the internet, which is to say incompletely but with genuine curiosity.
"So if the AI writes the code," she said, "what do you do all day?"
Jay had been expecting this question for weeks. He'd rehearsed answers in his head while walking the dog. None of them were simple enough.
"I write the world," he said.
There was a silence on the line. The kind of silence that meant his mother was writing something on a notepad, probably in the shorthand she'd used for decades.
"What does that mean, specifically?"
"Okay. So. The agents write code, right? But they can't write code from nothing. They need to know what the world looks like. What happens when a user logs in through Okta. What happens when a Jira ticket gets created. What the Slack API does when you send a malformed request. Somebody has to describe all of that. Precisely. Exhaustively. In a way the agents can act on."
"And that's what you do."
"That's part of it. I write scenarios—end-to-end user stories. I work on the Digital Twin Universe, which is basically behavioral clones of every service we integrate with. I write specs in natural language that the agents use as their instructions. And I evaluate the results. Not the code itself, but whether the code satisfies the scenarios."
"So you went from writing code to writing... requirements?"
Jay winced a little. Requirements had a bad reputation. Decades of waterfall projects and business analysts writing five-hundred-page documents that nobody read.
"Not requirements in the old sense," he said. "More like... I describe reality with enough fidelity that an autonomous system can operate in it. Scenarios aren't requirements documents. They're more like test environments that the agents inhabit. I'm building the world the agents work in."
His mother was quiet again. Jay heard the scratch of her pen.
"It sounds like you went from being a builder to being an architect," she said.
"That's actually pretty close."
Navan had a different version of this conversation, with his college friends on a group chat. Their question was more pointed: did you get replaced?
His answer was three paragraphs long, which was restrained for Navan. The core of it: the job didn't disappear. It transformed. He still used everything he'd learned about Swift, about Go, about API design and protocol behavior. But instead of expressing that knowledge as code, he expressed it as scenarios, as twin specifications, as NLSpec documents. The knowledge was the same. The output format changed.
Justin put it more bluntly in a team meeting: "Engineers used to translate human intent into machine instructions. Now they translate human intent into agent instructions. The translation layer moved up one level of abstraction. That's it. That's the whole change."
Jay thought about telling his mother that version. He decided the world-building metaphor was better. She played Minecraft with his nephew. She'd understand.