Podcast Transcript

If You Want to Measure Macro Results, Answer These 3 Questions Before AI Touches Your SDLC

Executive Deck
November 26, 2025


There is a pattern playing out across enterprises right now. A C-suite executive reads the headlines about AI transforming software development. They call their Vice President of Engineering. They say, we need an AI strategy. Three months later. Two piloted tools. Significant budget spent. And nothing to show except a handful of engineers who occasionally use autocomplete.

The problem is not the tools. The problem is starting without answering three fundamental questions.

And here is the uncomfortable part. These are not questions you should spend months studying. If you cannot answer them today, right now, you have a visibility problem that goes far deeper than AI adoption.

First. You have to ask. What is the goal? This sounds obvious. It is not. Improving developer productivity is not a goal. It is a platitude. Productivity toward what end? Shipping faster? Reducing defects? Cutting costs? Enabling a smaller team to do more? These lead to radically different approaches.

Look. Organizations invest heavily in AI coding assistants when their actual constraint is architectural complexity. The AI helps engineers write code faster. Code that still takes weeks to integrate because the system is a tangled mess of dependencies. They optimized the wrong thing.

Your goal needs to connect to business outcomes. Not ten times developer productivity, but reducing time to market for new features by forty percent. Or maintaining current velocity with twenty percent fewer engineers as you scale. Goals you can measure. Goals the board cares about.

If you cannot articulate this in the next five minutes, that is the problem you need to solve. And it has nothing to do with AI.

Second. You must define the constraints. Every organization operates within constraints they rarely examine. Regulatory requirements. Security postures. Skill gaps. Legacy systems that cannot be touched. Political dynamics that determine what is actually possible versus what looks good on a slide.

AI does not eliminate constraints. It shifts them.

OK. Here is the thing. If your constraint is senior engineers spending sixty percent of their time on code review and mentoring, AI might help. If your constraint is that nobody understands the legacy billing system and the one person who did retired three years ago, AI will not magically create that institutional knowledge. Different constraints require different interventions.

But here is the harder question. Are these real constraints or artificial ones? Does the agile coach really need to sign off on every story? Does the manual test team actually catch bugs, or do they just provide a paper trail? Is that approval workflow protecting you from something, or is it just the way things have always been done?

As with all change management, you have to decide whether something truly is a constraint or whether it is something you would rather not change. Those are very different things. One requires working around. The other requires honesty.

Be honest about what is actually slowing you down. Not what you wish was slowing you down. And not what is politically convenient to call a constraint. If you do not already know this, you have been flying blind. And AI will not fix that.

Third. You must look at your current Software Development Life Cycle. Not the life cycle in your process documentation. The actual one. How does work really flow through your organization? Where does it stall? Where do handoffs create friction? What do your engineers actually spend their time doing?

Right. Most organizations do not know. They have theories. They have what Jira says. But they have not traced a feature from idea to production and understood where the hours actually go.

Here is the uncomfortable truth. Typically fifteen to twenty percent of engineering effort goes to work that directly creates customer value. The rest is coordination. Context switching. Waiting. Rework. And organizational overhead. AI can address some of that. But only if you know where the waste actually lives.

If you cannot draw your current development life cycle on a whiteboard with rough percentages of where effort goes, that is a leadership gap. Answering these questions should not delay AI adoption. Not already knowing the answers should concern you.

The organizations succeeding with AI in their development life cycle are not the ones who moved fastest. They are the ones who understood what they were trying to achieve. What was actually constraining them. And how work really flowed through their systems.

These questions are not gatekeepers to getting started. They are diagnostics. If you cannot answer them today, you now know your next steps.

So. If you choose not to answer them, you are not adopting AI. You are buying lottery tickets.
