The engines are purring. The Atlantic is glass. You’re three hours out of Miami, pointed toward Bimini, and everything is perfect until your old academy roommate asks the question.
“So when are you actually going to use AI to build something?”
You’re standing at the helm of your 62-foot weekender. Your boat. Your friends are below deck. You’ve got the conn. You know exactly where you are, where you’re going, and how to get there. You’ve earned the right to these weekend runs to the Bahamas. Coast Guard Academy. Eight years of service. You can navigate by stars if the electronics fail.
Steve, your academy roommate, is sprawled across the fighting chair with a beer, looking entirely too relaxed for someone who just closed a Series C for his AI-powered accounting software startup. He was always like this. Comfortable asking uncomfortable questions. Lucky too. After graduation, you got stationed in the Caribbean doing actual Coast Guard work while Steve somehow pulled Lake Michigan. Spent his summers sailing and winters holed up in the barracks working on his MS in Computer Science. You used to give him endless grief about it. Now he’s giving you grief about AI.
“I’ve been reading about it,” you say. “Talked to our vendor partners. We’re evaluating options.”
“You bought a book on nuclear submarines, didn’t you?”
“What?”
“That’s what you’re doing. You’re trying to read your way into understanding how nuclear submarines work. But you’re a surface vessel guy. Two dimensions. Combustion engines. You can’t just read about going three-dimensional and atomic. It doesn’t work that way.”
You throttle back slightly. He’s got your attention.
“I run a software company, Steve. I don’t have time to play with every new tool that comes along.”
“You toured three datacenters before you picked your hyperscaler. You flew to Seattle, Northern Virginia, and Dublin. You met their executive teams. You smelled the diesel in the tanks. You looked at the log books of the Caterpillar generators. You asked about redundancy and cooling systems. You wanted to see the actual infrastructure.”
“That was different. That was critical infrastructure.”
“And AI isn’t?” Steve sits up now. “Look, my company has six engineers. Two of them are CPAs who learned to code last year. Four are recent graduates. We’re eating market share from competitors with fifty-person engineering teams. You want to know why?”
You do, actually.
“Because my competitors are still building like it’s 2020. They’re running two-week sprints and arguing about story points while my two CPAs are shipping features because they understand the accounting domain and AI handles the parts they don’t know yet.”
The boat settles into the new speed. You’ve got another hour before you need to navigate the approach to Bimini. Plenty of time for this conversation you’ve been avoiding.
“I know how to code, Steve. I still write code.”
“When’s the last time you shipped something?”
Longer than you want to admit.
The bottom line: If you’re a CTO who still writes code, you need to stop reading about AI infrastructure and start building with it. Not because it makes for good leadership theater. Because you literally cannot understand what’s changing without hands-on experience.
The Literacy Problem
Your mental models for estimation, code review, testing, deployment? All calibrated for human developers working at human speeds with human constraints. AI agents operate under completely different physics. You can’t read your way out of this gap.
Think about your first software job after the academy. Sure, you’d watched Office Space. But that movie didn’t teach you what it actually meant to be a developer. What taught you was the late hours debugging production incidents. The peculiar satisfaction of solving interesting problems buried in terrible codebases. The accumulated scar tissue from shipping code that mattered. Those experiences formed your entire point of view on software development. That foundational understanding is what got you to the leadership position you’re in today.
Or think about this boat. You didn’t buy it based on the brochure. You sea-trialed it. Twice. You brought your own captain to check the systems. You ran it hard to see how it handled.
You didn’t read your way into those decisions. You lived them.
AI-assisted development is a larger shift than either of those. And you’re the CTO. You don’t get to delegate understanding the primary AI infrastructure your organization uses to build software.
What Getting Your Hands Dirty Actually Means
“Okay,” you say to Steve. “What would you actually do?”
“Pull a story card. Real work. Something that matters but won’t sink the company if you experiment on it. Work it all the way through to release using AI assistance. Don’t block off Friday afternoon for three hours and declare victory. Own the entire lifecycle.”
You’re probably thinking exactly what Steve just called out. Friday afternoon. Three hours with the tools. Make a decision. Move on.
That’s not enough.
You need to experience what changes when you can generate entire modules from specifications instead of functions from partial lines. How code review shifts when you’re validating agent output instead of human reasoning. What “testing” means when the thing writing the code can also generate comprehensive test suites. Where the new bottlenecks actually are in the workflow.
The knowledge you need exists in the delta between your expectations and reality. You can only get it from the surprise of discovering your assumptions were wrong.
The Competitive Reality
“You know what’s funny?” Steve takes another pull from his beer. “My competitors have bigger teams. Better funded. More mature processes. They’ve got architects and principal engineers and whole departments dedicated to quality assurance. And they’re losing ground to two CPAs and four kids fresh out of college.”
“Because you’re using AI.”
“Because they’re still building like it’s 2020 and we’re not. They’re optimizing for constraints that don’t exist anymore when you’re working with AI agents. They’re running story pointing sessions for work that doesn’t get measured in story points. They’re doing code reviews designed for human cognitive limits when the thing that wrote the code doesn’t have human cognitive limits.”
He leans forward. “They all know about AI. They’re all ‘evaluating’ it. They’ve got some pilot program running in a corner somewhere. But they’re not reorganizing around it. They’re trying to bolt AI onto their 2020 processes and wondering why it’s not transformative.”
People like Steve are exploiting that gap. The guy who spent his Coast Guard winters in a Lake Michigan barracks writing code for his master’s degree now runs circles around established software companies with two CPAs and some recent graduates.
The Authority Problem
“You know what I hear a lot?” you say. “My developers are happy. We’ve got good retention. The team likes our processes. Why rock the boat?”
Steve actually laughs at that. “Happy crew is good crew. I’m not disputing that. But you’re the captain. You don’t get to delegate this decision to your crew based on whether they’re comfortable.”
He gestures at the helm. “When you’re navigating this thing into Bimini and the weather changes, do you take a vote? Do you check if your passengers are happy with the new course? No. You make the call that’s right for the ship.”
“That’s different.”
“It’s exactly the same. Your developers might be perfectly happy building the way they’ve always built. Comfortable workflows. Established patterns. Known constraints. But comfortable isn’t the same as competitive. And you’re not optimizing for developer comfort. You’re optimizing for your organization’s ability to compete and survive.”
He’s right. You know he’s right.
“Developer happiness matters,” Steve continues. “Of course it does. Happy crew is good crew. But it’s still a crew. And you’re still the captain. Your job is to make the right strategic calls even when they’re uncomfortable.”
Bimini is visible on the horizon now. You need to start thinking about the approach.
“You cannot lead transformation in something you don’t understand operationally. Your team knows this. They can tell the difference between a leader who’s done the work and one who’s read about the work. And they definitely know when you’re avoiding a hard decision by hiding behind their comfort level.”
When you make architectural decisions about AI infrastructure adoption without operational experience, you’re making them blind. You don’t know what you’re optimizing for. You can’t distinguish vendor marketing from operational reality. You can’t ask the right questions because you haven’t encountered the right problems.
“Build something this weekend. I’ll help. Heck, I’ll sign an NDA and be your intern!” Steve says. “Not ‘get a demo.’ Not ‘review the team’s work.’ Actually write code using AI assistance and ship it to production.”
You’ll learn more in that single iteration than you will in six months of reading analyst reports.
Steve drains his beer and grins. “You know, with the Series C money, we’re hiring three more teams. If you ever want to actually work again instead of just reading about work, we pay AI-fluent developers very handsomely.” He gestures at the boat around you. “But then again, you own this wreck. You’re probably better off staying where you are.”
“This wreck cost more than your first three funding rounds combined.”
“Yeah, but my burn rate’s better.” He’s still grinning. “Seriously though. Get in the boat. The nuclear submarine. Not this diesel-powered museum piece you’re so proud of.”
The submarine manual doesn’t teach you how to operate a submarine. Operating the submarine teaches you how to operate the submarine.
You throttle down for the approach to Bimini. The conversation is over but the point landed.
Get in the boat.
Engineering leader who still writes code every day. I work with executives across healthcare, finance, retail, and tech to navigate the shift to AI-native software development. After two decades building and leading engineering teams, I focus on the human side of AI transformation: how leaders adapt, how teams evolve, and how companies avoid the common pitfalls of AI adoption. All opinions expressed here are my own.