The pipeline was called "The Interviewer," and the first time Jay ran it end-to-end, he didn't speak for several minutes afterward.
It started with a blank prompt. No NLSpec. No requirements document. No pre-written specification. Just a question, rendered on his terminal in clean monospace: What would you like to build?
Jay typed: A command-line tool that converts Markdown files to HTML, supporting tables, code blocks, and footnotes.
The pipeline moved. The first interviewer node took his answer and generated three follow-up questions. Not boilerplate. Not a form. Three specific, probing questions about his requirements: Should the output include a complete HTML document with head and body tags, or just the converted content? How should code blocks be syntax-highlighted? Should footnotes render as endnotes or as hover tooltips?
Jay answered each one. The pipeline consumed his answers and moved to the design node, where an agent generated an architecture based on Jay's responses. The design appeared on his screen: module structure, data flow, dependency list.
Another interviewer node. The proposed design uses a streaming parser for large files. Is this acceptable, or do you prefer full-file parsing for simplicity?
He chose streaming. The pipeline advanced. A sprint planning node broke the design into tasks. A fan-out node dispatched four codegen agents in parallel: one for the parser, one for the renderer, one for the CLI interface, one for the test suite. Tokens burned. The TUI showed four nodes pulsing green simultaneously.
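The fan-out step reads like an ordinary parallel dispatch. A minimal sketch of that shape, with the four component names taken from the story and everything else (the agent function, its output) purely hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

# The four components the fan-out node dispatched in parallel.
COMPONENTS = ["parser", "renderer", "cli", "test_suite"]

def codegen_agent(component: str) -> str:
    """Hypothetical stand-in for one agent generating one component.

    A real pipeline would call out to a model here; this just
    returns a marker so the dispatch pattern is visible.
    """
    return f"{component}: generated"

# Dispatch all four agents at once, then collect their results
# in order for the integration node that follows.
with ThreadPoolExecutor(max_workers=len(COMPONENTS)) as pool:
    results = list(pool.map(codegen_agent, COMPONENTS))
```

`pool.map` preserves input order, which is why the integration step downstream can rely on knowing which result belongs to which component.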
The agents finished. An integration node merged the components. A validation node ran the test suite. Satisfaction: 0.86. Not enough. The pipeline routed back through another iteration. Second pass: 0.92. Third pass: 0.96. Above threshold.
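The route-back behavior is a classic iterate-until-threshold loop. A sketch under stated assumptions: the threshold value, the function names, and the mocked score sequence (reproducing the 0.86, 0.92, 0.96 arc from the story) are all illustrative, not the pipeline's actual interface:

```python
SATISFACTION_THRESHOLD = 0.95  # assumed; the story only shows that 0.96 cleared it

def refine_until_satisfied(build, validate, max_iterations=5):
    """Build, validate, and route back until the score clears the threshold."""
    artifact = None
    for attempt in range(1, max_iterations + 1):
        artifact = build(previous=artifact)
        score = validate(artifact)
        print(f"Pass {attempt}: satisfaction {score:.2f}")
        if score >= SATISFACTION_THRESHOLD:
            return artifact, score
    raise RuntimeError("satisfaction threshold not reached")

# Mocked nodes reproducing the three passes Jay watched.
scores = iter([0.86, 0.92, 0.96])
artifact, final = refine_until_satisfied(
    build=lambda previous: (previous or 0) + 1,  # stand-in for re-generation
    validate=lambda artifact: next(scores),      # stand-in for the test suite
)
```

The `max_iterations` cap matters: without it, a pipeline that never satisfies its own validator would burn tokens forever.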
The deployment node activated. The tool was built. From first question to working software: twenty-three minutes.
Jay stared at the terminal. He had described what he wanted in four sentences and answered five questions. The pipeline had done the rest. Requirements gathering. Architecture. Sprint planning. Implementation. Testing. Integration. All of it autonomous, with human input limited to answering questions about intent.
"It interviewed me," Jay said. "It figured out what I wanted by asking me questions. And then it built it."
Navan looked at the pipeline graph. Fourteen nodes. The interviewer nodes were scattered throughout, like checkpoints on a highway. Each one was a point where the pipeline paused to confirm direction before accelerating again.
"This is end-to-end autonomous development," Navan said. "Not assisted. Not copilot. Autonomous. The human provides intent. The machine provides everything else."
"Not quite everything," Justin said from the doorway. He'd been watching. "The human provides intent AND judgment. The interviewer nodes aren't formalities. They're the pipeline's way of admitting uncertainty. Asking a question is the most honest thing a machine can do."
Jay opened the generated tool and ran it against a test Markdown file. The HTML output was clean, correct, and formatted exactly the way he'd described.
He hadn't written a single line of code. He'd had a conversation.
"Asking a question is the most honest thing a machine can do." Justin's one-liners are getting increasingly existential and I'm here for it. The Interviewer pipeline is the factory's thesis statement in executable form.