There’s a pattern playing out across enterprises right now. A CxO reads the headlines about AI transforming software development. They call their VP of Engineering. “We need an AI strategy.” Three months later: two piloted tools, significant budget spent, and nothing to show except a handful of engineers who occasionally use autocomplete.
The problem isn’t the tools. The problem is starting without answering three fundamental questions.
And here’s the uncomfortable part: these aren’t questions you should spend months studying. If you can’t answer them today, right now, you have a visibility problem that goes far deeper than AI adoption.
1. What is the goal?
This sounds obvious. It isn’t.
“Improve developer productivity” is not a goal. It’s a platitude. Productivity toward what end? Shipping faster? Reducing defects? Cutting costs? Enabling a smaller team to do more? These lead to radically different approaches.
Organizations invest heavily in AI coding assistants when their actual constraint is architectural complexity. The AI helps engineers write code faster. That code still takes weeks to integrate because the system is a tangled mess of dependencies. They've optimized the wrong thing.
Your goal needs to connect to business outcomes. Not “10x developer productivity” but “reduce time to market for new features by 40%” or “maintain current velocity with 20% fewer engineers as we scale.” Goals you can measure. Goals the board cares about.
If you can’t articulate this in the next five minutes, that’s the problem you need to solve. And it has nothing to do with AI.
2. What are the constraints?
Every organization operates within constraints they rarely examine. Regulatory requirements. Security postures. Skill gaps. Legacy systems that can’t be touched. Political dynamics that determine what’s actually possible versus what looks good on a slide.
AI doesn’t eliminate constraints. It shifts them.
If your constraint is senior engineers spending 60% of their time on code review and mentoring, AI might help. If your constraint is that nobody understands the legacy billing system and the one person who did retired three years ago, AI won’t magically create that institutional knowledge. Different constraints require different interventions.
But here’s the harder question: are these real constraints or artificial ones?
Does the agile coach really need to sign off on every story? Does the manual test team actually catch bugs, or do they just provide a paper trail? Is that approval workflow protecting you from something, or is it just the way things have always been done?
As with any change management effort, you have to decide whether something is truly a constraint or whether it's something you'd rather not change. Those are very different things. One requires working around. The other requires honesty.
Be honest about what's actually slowing you down. Not what you wish were slowing you down. And not what's politically convenient to call a constraint.
If you don’t already know this, you’ve been flying blind. And AI won’t fix that.
3. What is your current SDLC?
Not the SDLC in your process documentation. The actual one.
How does work really flow through your organization? Where does it stall? Where do handoffs create friction? What do your engineers actually spend their time doing?
Most organizations don’t know. They have theories. They have what Jira says. But they haven’t traced a feature from idea to production and understood where the hours actually go.
Here's the uncomfortable truth: typically only 15 to 20 percent of engineering effort goes to work that directly creates customer value. The rest is coordination, context switching, waiting, rework, and organizational overhead. AI can address some of that. But only if you know where the waste actually lives.
If you can't draw your current SDLC on a whiteboard with rough percentages of where effort goes, that's a leadership gap. Answering these questions shouldn't delay AI adoption. Not already knowing the answers is what should concern you.
The organizations succeeding with AI in their SDLC aren’t the ones who moved fastest. They’re the ones who understood what they were trying to achieve, what was actually constraining them, and how work really flowed through their systems.
These questions aren’t gatekeepers to getting started. They’re diagnostics. If you can’t answer them today, you now know your next steps.
And if you choose not to answer them, you’re not adopting AI.
You’re buying lottery tickets.
Engineering leader who still writes code every day. I work with executives across healthcare, finance, retail, and tech to navigate the shift to AI-native software development. After two decades building and leading engineering teams, I focus on the human side of AI transformation: how leaders adapt, how teams evolve, and how companies avoid the common pitfalls of AI adoption. All opinions expressed here are my own.