A VP of Engineering called me in January. Smart person. Running about sixty engineers across four squads. Had rolled out AI coding tools six months earlier. Adoption numbers were good. Mostly.
What was bothering him was not the adoption numbers. It was that he still could not tell who was actually performing. He had velocity metrics. He had PR counts. He had hours-logged-in-the-IDE data from his tooling dashboards. He had all of it.
And he still could not answer the question his CEO had asked him the week before.
“Who are your top people?”
I asked him if he had looked at AI usage by engineer.
Silence. Then: “We have that data?”
You have that data.
And if that sounds familiar, it should. I hear versions of this every week from a couple thousand developers and engineering leaders: we rolled out the tools, but we still cannot clearly see what people are actually doing with them.
Your High Performers Are Probably Obvious
Here is the part that is not complicated.
Every major AI platform you are paying for generates usage telemetry. Request volume. Active days. Session depth. Completion acceptance rates.
Pull it up. Sort by usage descending.
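That first pass is a one-liner once you have an export. A minimal sketch, assuming your platform lets you export per-engineer usage (the field names and numbers here are hypothetical, not any vendor's actual schema):

```python
# Hypothetical per-engineer export from an AI platform's usage dashboard.
# Column names are illustrative only -- map them to whatever your vendor emits.
rows = [
    {"engineer": "a@example.com", "requests": 1240, "active_days": 22},
    {"engineer": "b@example.com", "requests": 310, "active_days": 9},
    {"engineer": "c@example.com", "requests": 0, "active_days": 0},
]

# Sort by request volume, descending -- the "first pass" ranking.
ranked = sorted(rows, key=lambda r: r["requests"], reverse=True)

for r in ranked:
    print(f'{r["engineer"]:<20} {r["requests"]:>6} requests  {r["active_days"]:>2} active days')
```

Request volume is a crude proxy on its own; in practice you would also look at active days and acceptance rates before drawing conclusions about any individual.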
The people with the deepest usage — having real conversations with the model about architecture, iterating on prompts, using the tools as actual tools — those are probably your best people. This should not be a surprise once you see it. These are the engineers who sit down and do the work. The AI is just where that work happens now.
There are exceptions. Some people run up usage translating corporate memos into bullet points, or asking a model to convert Shakespeare into Klingon, or regenerating the same function seventeen ways without ever committing anything. High usage with nothing shipped is still nothing shipped. That has always been true.
But as a first pass? The people at the top of your usage data are probably doing interesting things. You probably already know who they are. The data just confirms it.
That is not the conversation I want to have.
The Interesting Signal Is the Bottom of the List
Here is where it gets harder.
Scroll down. Find the senior engineers — principals, staff, architects, tenured mid-levels — who show almost no usage. Or none at all.
That is the signal that matters. And it requires you to do something most leaders are not great at: go find out why.
There are two explanations. The first is that they are blocked. The second is that they are not engaging. You have to know which one it is before you do anything else.
First Check: Can They Actually Use the Tools?
This one embarrasses a lot of organizations when they dig into it.
In 2026, access friction around AI tooling is not a theory. It is the default state at a remarkable number of orgs. Security reviews that take six weeks. Procurement cycles that approve one vendor and stall the rest for a quarter. IT tickets for API key provisioning that sit unacknowledged for two weeks. Data classification policies written vaguely enough that cautious engineers apply them to everything just to stay out of trouble.
Your best senior engineer might show zero usage because nobody ever unblocked her access request from October.
I know an engineering director — seven years at a company you would recognize, excellent track record — who spent three months waiting for an IAM role that would let her team access an internal AI gateway. She had the use case mapped. She had the architecture drafted. She was ready to go. But the IAM request needed a manager approval, which needed a VP co-sign because the budget code touched production infrastructure, and the VP was not processing non-urgent requests until the next planning cycle.
Three months. Usage: zero. Would have looked damning in a report.
Before you have any conversation with a low-usage engineer, pull the ticket. Trace the access request. Find where it stopped. You may find that the person you were about to question has been waiting longer than you have been watching the dashboard.
If that is what happened — fix it immediately. Treat it like a production incident. Because it is one.
Then: Have the Awkward Conversation
Now the scenario where the access is fine, the tools are available, the training happened — and the usage is still flat.
This is the conversation you need to have with that person. And it is going to be awkward.
Not because they are bad. Not because you are accusing them of anything. Awkward because you are surfacing something neither of you has said out loud yet. You are sitting across from a senior engineer with a decade of good work behind them, and you are essentially asking: why aren’t you using the thing everyone else is using?
Some of them are scared. Some have been told so many times that their expertise is their identity that picking up a new tool feels like admitting the old identity is not enough anymore. Some watched a junior engineer produce something fast and mediocre with AI and decided they would rather be slow and correct — and nobody has shown them yet that this trade-off is no longer available at the price they think it is.
Those are rational responses to real pressures. Treat them that way. Do not come in with a performance frame. Come in with curiosity. What are you working on? How are you getting it done? Have you tried using an agent for this kind of problem? Let me show you what it looks like when someone does.
Pair them with someone who has figured it out. Not in a formal training. Just two engineers working on a real problem together, one of whom reaches for the agent naturally. That is how it spreads.
But do not stop there, and do not wait.
Some Conversations Do Not Have a Happy Version
Here is the one most leaders are avoiding.
If someone has had access, has had the time, has been shown how it works, has watched their peers shipping at two or three times their previous pace — and they are still not engaging — you have a different problem.
That is not a tools problem. That is a judgment problem. And it is yours to address.
The question is not whether to punish them. The question is whether to invest in them — and what timeline you are willing to hold. Because the engineers who have genuinely internalized this way of working are the core of what your engineering organization looks like from here forward. The ones who have not are falling behind. Not in a theoretical way. In a measurable, compounding way that shows up in the usage data, and eventually in the output data, and eventually in the business.
You are not doing them any favors by letting a year go by before you have this conversation.
What You Actually Do with This
Pull the usage data. Today, not next quarter.
Sort it two ways: highest usage and lowest. Glance at the top — you probably see what you expected. Now look at the bottom. Find the names that surprise you. The people you thought were doing well who are not showing up in the data at all.
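The bottom-of-the-list pass is just a join between your usage export and your roster. A sketch, with made-up names, levels, and a threshold you would tune to your own org's distribution:

```python
# Illustrative data only: join your platform's usage export against HR/roster data.
usage = {"dana": 1500, "sam": 40, "lee": 0, "pat": 900}   # requests per engineer
roster = {
    "dana": "staff",
    "sam": "principal",
    "lee": "senior",
    "pat": "mid",
}

SENIOR_LEVELS = {"senior", "staff", "principal", "architect"}
LOW_USAGE_THRESHOLD = 50  # arbitrary cutoff; pick one from your own distribution

# Senior engineers with little or no usage -- the names worth a conversation.
flagged = sorted(
    name
    for name, level in roster.items()
    if level in SENIOR_LEVELS and usage.get(name, 0) < LOW_USAGE_THRESHOLD
)
print(flagged)
```

The output of a script like this is a list of names, not a verdict. Each one still needs the two steps below: trace the access first, then talk.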
For each one, do two things in order.
First, find out if they can actually use the tools. Not in theory — actually trace it. If they are blocked, unblock them. Loudly and fast.
Second, have a conversation. Not a passive check-in. An actual conversation where you are asking: what is getting in the way, and what would it take to remove it?
Some of those conversations will be easy. Some will be uncomfortable. A few will be genuinely hard.
Have them anyway.
The data is there. You have been looking at it as a budget sanity check.
Start looking at it as a map of where people need help — and go help them.