
Podcast Transcript

Two Engineers. One Year. More Output Than Ten.

Executive Deck
February 28, 2026


Two engineers replaced a ten-person hiring plan. Nathan joined as Chief Technology Officer, used artificial intelligence agents instead of headcount, and in twelve months decomposed a monolith, automated deployments, and outshipped the original roadmap.

You know the playbook.

Company hits a growth inflection. Revenue is real. Millions, not projections. The board says we need a proper engineering organization. So they go find a Chief Technology Officer. Someone who has built teams before. Someone who knows how to scale.

Nathan was that hire. Before the startup, he was a consultant doing large scale change management at some of the biggest brands in the world, like Ford and Amway. He knew how big organizations actually change. Then he went deep as the Chief Technology Officer of a startup building machine learning powered brain-computer interface robotics. The deep end of deep tech. Fresh off that exit, he wanted something different. Something where the product was already generating revenue and the challenge was engineering execution, not frontier research and development.

Nathan has been writing code for over two decades. He came up through the craft. Test-driven development, relentless refactoring, the kind of disciplined engineering that would make Feathers, Beck, and Fowler nod in approval. He is not someone who skipped the fundamentals and jumped straight to prompting. He earned his instincts the hard way, one red green refactor cycle at a time. But for the last twenty-four months, he has been letting agents write the code. Not because he forgot how. Because he saw what it meant.

He is now a technical leader at Agent Driven Development. This is the story of how he proved the model before he joined us.

The mandate began when Nathan joined a software as a service company in the five million dollar to fifteen million dollar annual recurring revenue range. Real revenue, real customers, real renewal rates. The product worked. But the technology underneath was brittle. A .NET monolith with a SQL Server backend, deployed manually to on-premise infrastructure. No continuous integration and continuous deployment. No automated testing. Deployments happened when the one senior developer who understood the release process was available and nothing was on fire. That meant roughly once a month, on a good month. There was also an existing vendor relationship handling development work. This is the kind of arrangement that accumulates when a company grows faster than its internal engineering capacity.

Tribal knowledge was the architecture. The kind of codebase where the most dangerous person is the one who knows where the bodies are buried.

The plan from the ownership group was straightforward. Hire a Chief Technology Officer. Let that leader build a team. Eight engineers, maybe ten. Sprints and standups and all the rituals that make investors feel like adults are in the room.

Nathan looked at that plan. Then he threw it out.

The bet nobody expected started with what Nathan did not do. He did not post ten job openings on LinkedIn. He did not hire a recruiting firm. He did not build an organizational chart on a whiteboard with dotted lines and future hires in grey boxes.

He did not hire a single person.

The company had one associate engineer already on staff. Nathan kept him. He also reduced the commitment on the existing development vendor. Not eliminated, reduced. He took direct ownership of the technical direction.

Two people. A Chief Technology Officer and an associate engineer who was already there. That was the entire engineering organization for a company doing millions in revenue with a platform that needed to be dismantled and rebuilt.

The ownership group had questions. Of course they did.

But Nathan had a thesis. He had been building at the frontier with machine learning models, brain-computer interfaces, and robotics. He had seen what artificial intelligence native tooling could do when you stopped treating it as autocomplete and started treating it as a force multiplier. Not in a McKinsey deck or a Gartner Magic Quadrant. In the actual work.

His thesis was simple. The old math is broken.

The equation where headcount equals output, where shipping faster means hiring faster, that equation stopped being true somewhere around twenty twenty-four. Most engineering leaders have not updated their mental models. Nathan had.

To understand what two engineers actually shipped, let me be specific. Vague claims about artificial intelligence productivity are what vendors sell. Specifics are what Chief Technology Officers ship.

In twelve months, Nathan and his associate engineer first decomposed the monolith. The .NET monolith was a classic. A single deployable artifact where the billing logic touched the reporting module which touched the customer portal which touched everything else. Nathan started with the integration layer. Not because it was the easiest, but because it had the cleanest data boundaries and the highest blast radius if it failed separately. They extracted it into its own service, built a compatibility shim so the monolith could still call it during the transition, ran both paths in parallel for three weeks, then cut over. That pattern of extract, shim, parallel run, and cut became the playbook for every subsequent service.
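The extract, shim, parallel run, and cut pattern can be sketched in a few lines of code. This is a hypothetical illustration, not code from the actual system: the function names (legacy_integration, new_integration_service, integration_shim) are invented for the example, and Python stands in for whatever the real stack used.

```python
# Hypothetical sketch of the extract / shim / parallel-run / cutover pattern.
# The legacy path stays authoritative while both paths run in parallel;
# mismatches are logged for review before the team flips to cutover.

import logging

log = logging.getLogger("parallel_run")

def legacy_integration(payload):
    # Stand-in for the code path still living inside the monolith.
    return {"total": sum(payload), "source": "monolith"}

def new_integration_service(payload):
    # Stand-in for the extracted service's API.
    return {"total": sum(payload), "source": "service"}

def integration_shim(payload, mode="parallel"):
    """Compatibility shim the monolith calls during the transition.

    mode="parallel": run both paths, compare results, serve the legacy answer.
    mode="cutover":  serve the new service only.
    """
    if mode == "cutover":
        return new_integration_service(payload)

    old = legacy_integration(payload)
    try:
        new = new_integration_service(payload)
        if new["total"] != old["total"]:
            log.warning("parallel-run mismatch: old=%r new=%r", old, new)
    except Exception:
        log.exception("new service failed; legacy result still served")
    return old  # legacy stays authoritative until cutover
```

In this sketch, three weeks of clean parallel runs means three weeks of no mismatch warnings, at which point the caller flips the mode flag and the legacy path can be deleted.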

Artificial intelligence agents handled the tedious parts. They generated the interface contracts, wrote the integration tests for both old and new paths, and scaffolded the deployment configuration. The humans made the architectural decisions. The agents did the mechanical work that would have consumed a platform team.

Second, they modernized the deployment pipeline. They moved from monthly manual deployments to multiple times per week. Automated. Repeatable. Boring in exactly the way deployments should be. They went from zero automated tests to meaningful coverage on every extracted service, with agents generating the initial test suites and humans reviewing what mattered.

Third, they shipped new features the business had been waiting on for years. This was not a backlog triage exercise where the product team fights over sprint capacity. These were actual features in production generating revenue. The product roadmap that was supposed to take ten engineers eighteen months started shipping in the first quarter.

Finally, they rebuilt the engineering culture. They moved from deploying when a specific person is available to continuous integration and continuous deployment with automated quality gates. They went from tribal knowledge to documented architecture. They moved from fear of change to a deployment pace that makes quarterly planning look like geological time.

One story captures it. The existing vendor quoted a feature as a week of work costing tens of thousands of dollars. Nathan looked at the scope on a Sunday afternoon, set an agent loose on it while he watched a movie with his family, and had it in a pull request by the time the credits rolled. He reviewed it Monday morning and shipped it Monday afternoon. That is not a commentary on the vendor's competence. It is a commentary on what happens when a twenty-year engineer pairs with an agent instead of a project schedule.

Two people. One year. A fraction of the cost.

The reaction from the ownership group should keep you up at night if you are still running the old playbook. When Nathan presented the results, their response was not to hire the other eight people. It was to ask why they would.

They looked at the output. They looked at the burn rate. They looked at the velocity. And they arrived at a conclusion that the rest of the industry will arrive at over the next twenty-four months.

The ten-person team was never the goal. The output was the goal.

When two people with artificial intelligence native workflows match or exceed what ten people produce the old way, the math changes. Not incrementally. Categorically.

The ownership group did not flinch. They leaned in. More agents, better tooling, deeper integration. They saw what compounding velocity looks like and wanted more of it. Because the results were not theoretical. They were in production. Generating revenue. Making customers happy.

That is the difference between artificial intelligence adoption theater and actual transformation. One produces slide decks. The other produces software.

To understand the economics, look at the numbers the ownership group considered.

The original plan for ten engineers at fully loaded cost for their market was north of two million dollars annually. That is salary, benefits, equipment, management overhead, recruiting fees, and six months of ramp time before anyone ships anything meaningful. Standard math. Every Chief Technology Officer has built this spreadsheet.

Nathan's actual spend included two engineers, one of whom was already on payroll, reduced vendor costs, and artificial intelligence tooling that amounts to a rounding error compared to headcount. Total engineering burn was under five hundred thousand dollars for the year. The ownership group did not need a consultant to do the return on investment calculation.

But the real insight is not the cost savings. It is the speed.

Deployment frequency went from roughly monthly to multiple times per week. Features that were second half roadmap items shipped in the first quarter. The monolith decomposition that any traditional plan would have scoped at eighteen months with a dedicated platform team was functionally complete in twelve.

In the traditional model, that monolith decomposition would still be in the discovery phase right now. The deployment pipeline would be a third quarter initiative. The new features would be in a backlog, prioritized behind the infrastructure work that everyone agrees is important but nobody wants to fund.

Nathan shipped all of it. In parallel. Artificial intelligence native workflows do not force you to choose between building the foundation and building the house. You do both. At the same time. With fewer people.

That is not an efficiency story. That is a strategy story.

There are three objections you might be thinking of right now. First, you might think this is an anomaly because Nathan is exceptional. He is good. But the leverage did not come from Nathan being a 10x engineer. It came from the workflow, the agents, and the methodology. A good engineer with the right artificial intelligence native process outperforms a great engineer with the old process every time.

Second, you might think this would not work at your scale. You might think two people cannot handle the size of your organization's problems. That is the point. First the engineering capability changes. Then the software development life cycle evolves, or gets installed from scratch, to be artificial intelligence first. Then the rest of the organization moves to meet the pace. Not the other way around.

Here is why you cannot wait. Five people in five weeks can now build a competitor. They are probably not going to take your market. But they can take a lot of your margin and cause a lot of problems. Now imagine how good those five people will be in a year. The whole idea is to change now. You have already missed the early adopter window. And if you try to go the traditional route, with change management theater, eighteen month roadmaps, and steering committees, you are not going to make it.

Third, you might say you need to see this for yourself. Good. That is the right instinct.

What this actually requires is a direct conversation about what Nathan's story means for your organization. This is not a pilot program. This is not an engineering-only initiative that the rest of the business can ignore while it runs its course. What Nathan proved is that a fundamentally different operating model works, and that model does not stop at the codebase.

Your software development life cycle has to change. Your deployment practices have to change. Your relationship with vendors has to change. The way you scope work, estimate timelines, staff projects, and measure output must all change. Not in eighteen months. Not after a steering committee publishes its findings. Now.

You already know this. You have watched five person teams ship products in weeks that your organization would have taken a year to scope. You have seen what is coming. The question is whether you act on what you know or whether you wait for a change management office with a thousand consultants and a thousand coaches to tell you the same thing slower, more expensively, and too late.

There is no gradual evolution here. The gap between organizations that operate artificial intelligence first and organizations that are exploring adoption is not closing. It is accelerating. Every quarter you spend on readiness assessments and maturity models is a quarter your competitors spend shipping.

Nathan did not wait for the rest of the organization to be ready. He changed the engineering capability first. The results forced the rest of the business to adapt. That is how real change happens. Not from the top of a PowerPoint deck, but from production. From shipped software. From results that are impossible to argue with.

You know what you need to do. We want to help.

We call this Customer Zero because it is not a case study from a client engagement. It is the proof point on which everything else is built. This is work Nathan did before he joined us.

Nathan took a real company with real stakes. A company where failure meant explaining to an ownership group why millions in technology investment produced nothing. And he proved the model works. The monolith is decomposed. The pipeline is automated. The features are shipping. The ownership group is investing more in artificial intelligence, not because a consultant told them to, but because they can see the results in their revenue numbers.

Nathan joined Agent Driven Development because he wanted to help you do what he did. He is not a testimonial on a website. He is a practitioner who did this work, in production, under real business pressure, and he is available to help your team do the same thing.

Not in theory. Not in a workshop. In your codebase. With your team. On your timeline.

The question you need to answer is about the plan you have for your engineering organization. Maybe it involves hiring twenty more people. Maybe it involves a fourteen month initiative with a name that ends in transformation. Maybe it involves an outsourcing contract that promises velocity and delivers overhead.

Have you stress-tested that plan against the Nathan model?

Have you asked what if two people with the right workflow could do what you are planning to hire ten for?

Because if the answer is even maybe, and after what Nathan shipped, the answer is at least maybe, then every month you spend executing the old playbook is a month your competitors are using to build the future.

The ownership group at Nathan's company did not need convincing. They looked at the results and the answer was obvious.

Your board will arrive at the same conclusion. The only question is whether you lead them there or someone else does.
