ADD Engineering Leadership Deck
CTO + VP Engineering briefing 01 / 07

Slide 01

Thirty Minutes a Week Is Not an Adoption Strategy. It Is a Signal.

CTO + VP Engineering + Board
Core claim

You are asking your engineers to learn the most significant shift in how software gets built since the internet — between sprint commitments — and wondering why adoption is not sticking.

Daniel runs engineering at a company with a couple hundred engineers. He has Copilot licenses, internal hackathons, a Slack channel for AI tips, an LLM evaluation committee, and a training budget. Standard playbook. And thirty minutes a week his teams can dedicate to learning. Between sprint commitments and velocity targets set before any of this existed. That is not an adoption strategy. That is a signal to your team that you do not actually prioritize this.

What engineers hear

"This is not important enough to change anything for." They are rational people. They respond exactly as you would. They nod. They attend the optional training. They do not change how they work.

Slide 02

45 Days to Ship. 8 Days of Actual Work. You Optimized the 8 Days.

Value stream reality
Total cycle time ~45 days

What it takes to ship a feature at a typical mid-market engineering organization. From idea to production. Forty-five days with a tailwind and nothing going wrong.

Actual engineering work ~8 days

Writing code. Building things. Approximately eight days of the forty-five. This is where your AI tooling lands. This is the part you gave engineers thirty minutes a week to optimize.

Waste nobody mapped ~37 days

Waiting, handoffs, approvals, queue time. Nobody traced it. Nobody measured it. Nobody asked whether the release committee that meets every other Thursday is protecting something or just consuming time.

You optimized the eight days. You gave your engineers thirty minutes a week to optimize the eight days. Meanwhile thirty-seven days of waste sit there untouched because nobody mapped it.

That is not an AI problem. That is a leadership problem.
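The arithmetic above is easy to model. A minimal sketch, with entirely hypothetical stage durations standing in for a real value stream map, shows how the split falls out:

```python
# Illustrative model of where a 45-day cycle actually goes.
# Every number below is a placeholder assumption; replace them with
# measurements from your own value stream map.
stages = {
    "backlog wait":       9,  # idea sits before anyone picks it up
    "engineering work":   8,  # code actually being written
    "code review queue":  5,  # PR waiting for a reviewer
    "manual QA queue":    8,  # waiting for and running manual tests
    "release committee":  7,  # waiting for the biweekly approval meeting
    "deploy window wait": 8,  # waiting for the next release train
}

total = sum(stages.values())
work = stages["engineering work"]
waste = total - work

print(f"total cycle time:  {total} days")
print(f"value-adding work: {work} days ({work / total:.0%})")
print(f"wait and handoffs: {waste} days ({waste / total:.0%})")
```

Even with made-up numbers, the structure of the exercise is the point: until each queue has a measured duration, AI tooling can only touch the one stage that is already the smallest.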

Slide 03

If You Cannot Measure the ROI of Manual QA, You Are Making an Emotional Argument

Process reality check
The defense and the gap

Daniel: "Our product is complex. Automated testing doesn't catch everything. We've tried." When asked for the ROI on the current manual QA setup: silence.

The team had always had manual QA. It was load-bearing. It was how things worked. Questioning it felt like questioning whether the building needed walls. Not because there was no ROI. Because nobody had measured it. It was a tradition, not a decision.

The honest frame

If you are defending a process you cannot measure — in an era where AI can generate, execute, and maintain test suites at a scale your manual team will never match — you are not making a technical argument. You are making an emotional one.
What the honest comparison requires

  • What does your manual QA process actually catch that automated tests would miss?
  • What is the cost per defect caught — including the cycle time consumed?
  • What would an AI-generated and AI-maintained test suite cost, and what would it catch?

If you cannot make that comparison, you do not have a QA strategy. You have a tradition. And traditions are expensive when the field has moved past them.

The point

Not "fire QA tomorrow." Measure it. Compare it. Decide from data, not from the way things have always been done.
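One way to run the comparison the questions above demand is a back-of-the-envelope model. All inputs here are hypothetical assumptions, not data; the structure of the calculation is what matters:

```python
# Hypothetical cost-per-defect comparison, manual vs automated QA.
# Every figure is a placeholder assumption to be replaced with your
# own measurements; the shape of the comparison is the point.

def cost_per_defect(monthly_cost, cycle_days_added, defects_caught,
                    cost_per_delay_day=2_000):
    """Monthly direct cost plus delay cost, divided by defects caught."""
    total_cost = monthly_cost + cycle_days_added * cost_per_delay_day
    return total_cost / defects_caught

manual = cost_per_defect(
    monthly_cost=40_000,   # assumed: loaded cost of the manual QA team
    cycle_days_added=8,    # assumed: QA queue plus execution time
    defects_caught=30)     # assumed: defects found per month

automated = cost_per_defect(
    monthly_cost=6_000,    # assumed: tooling plus maintenance time
    cycle_days_added=1,    # assumed: runs in CI overnight
    defects_caught=25)     # assumed: defects found per month

print(f"manual QA:    ${manual:,.0f} per defect caught")
print(f"automated QA: ${automated:,.0f} per defect caught")
```

If the real numbers favor manual QA, keep it. The failure mode the slide describes is not keeping it; it is never running the calculation.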

Slide 04

Your Best Engineers Are Already Using AI. Underground. Because It Is Not Safe to Be Above Ground.

Culture signal

What the underground looks like

  • Engineers using AI tools they are paying for themselves, quietly, because they are terrified someone will find out and fire them for it.
  • The last time someone suggested changing how things work: six weeks in meetings with a transformation team that wanted to turn it into a pilot program with success metrics and a steering committee.
  • They decided it was easier to just do it quietly.
  • They are sitting on techniques that could save everyone else weeks — and they are not sharing them.

What the underground tells you

  • If your best engineers are hiding that they use AI tools, you have a safety problem, not an adoption problem.
  • They have learned that innovation in your organization is more expensive than silence.
  • The adoption you want is already there. It is hiding from you.
  • Make it safe. Make it official. Then get out of the way. The culture is trying to go somewhere. Stop blocking the door.

Slide 05

Nine Years of Transformations. Still Takes Six Weeks to Ship a Button.

Framework reality
The transformation track record

SAFe. Then Spotify squads. Then OKRs. Three transformations in nine years. Seven full-time agile coaches on staff. Still takes six weeks to ship a button — "if nothing goes wrong."

Daniel's teams are "running a hybrid Agile-Kanban approach" and "still tuning it." The tuning has been happening for nine years. The framework is not the thing that needs more tuning. The framework is the thing you installed to manage a problem you have not addressed at the source.

The actual problem

Process frameworks are tools. If the tool has been in use for nine years and the output is six-week ship times on a button, the framework is not what needs examining. The problem underneath it is.
What actually enables AI adoption

Time to experiment. Not thirty minutes. Real time, carved out from delivery commitments, protected from sprint velocity pressure, treated as a first-class activity rather than something you fit in the margins.

Mapped value streams. You cannot apply AI to waste you have not identified. The whiteboard exercise is not complicated. It is just revealing — and leaders tend to avoid revealing things when they are responsible for what gets revealed.

The willingness question

Some of what you built to manage complexity is itself complexity. Are you willing to look at that honestly?

Slide 06

Be Patient With the People Who Built This. Do Not Let Their Comfort Set Your Timeline.

Leadership balance
The empathy

The engineers who built the manual QA process built something that worked. The leaders who installed the release committee were protecting something real. That matters.

Be patient with those people. Invest in those people. Some of them will surprise you. They have domain knowledge that no AI can replicate and no new hire will have for years. Asking them to imagine a fundamentally different way of working is genuinely hard — they are responding rationally to the incentives you built around them.

The accountability

But do not let their comfort set your timeline. The organization's survival is not well-served by protecting processes that no longer serve it. You can be kind about the pace of change. You cannot be indefinite about it.

The leaders succeeding at AI adoption create the conditions for real learning — actual time, actual safety, actual mapping of what is and isn't working — and then hold the organization to an honest accounting of what it sees. Not what it hopes. What it actually sees.

The balance

Empathy for the people. Clarity about the direction. Zero ambiguity about whether the direction is changing.

Slide 07

What Are You Actually Willing to Change?

Decision close
The audit

You have a training budget. You have licenses. You have a Slack channel and a committee. What you do not have is an organization designed to absorb this change.

The engineers have thirty minutes. The value streams are not mapped. The manual QA process is defended on instinct. The release committee meets every other Thursday. The framework has been "tuning" for nine years. These are not technical problems. They are leadership problems. The question is not whether you can see them. The question is whether you are willing to change them.