Selected Work

Why Your AI Rollout Is Not Sticking: What 283 Engineers Reveal

Organizations are deploying generative AI tools into engineering workflows. Adoption remains uneven. The conventional explanations — more training, stronger mandates — turn out to be wrong. The evidence points to something more fundamental.

283
Engineers across two study phases
55%
Acknowledged efficiency benefits — yet many still didn’t adopt
#1
Workflow compatibility — primary adoption driver

The problem

Organizations across industries are deploying generative AI tools into their software engineering workflows. The business case seems clear: faster development cycles, reduced manual effort, improved code quality. Yet adoption remains uneven. Some teams integrate AI tools readily; others resist or abandon them within weeks.

The conventional explanation — that people need more training, or that leadership should push harder — turns out to be wrong.

Central finding

Compatibility with existing workflows — not perceived usefulness — is the primary driver of AI adoption. Tools that require engineers to change how they work fail to gain traction, regardless of their technical merit or efficiency gains.

The research

The study used a two-phase design. In the first phase, 100 software engineers completed detailed open-ended surveys about their experience with AI tools — their intentions, concerns, and the organizational context shaping their decisions. Analysis of this material produced a new conceptual framework for understanding how engineers decide whether to adopt AI tools.

In the second phase, that framework was tested through a structured survey of 183 engineers, with findings statistically validated to confirm they hold across different organizations and roles.

What the research found

  • Compatibility is the decisive variable. What determines whether engineers actually use AI tools is how well those tools fit their existing development practices. When tools require process changes, adoption falters — regardless of how useful the tools are in principle. This held consistently across all respondents.
  • Perceived usefulness is necessary but not sufficient. While 55% of engineers acknowledged efficiency benefits, and some reported 10–30% time savings on specific tasks, reviewing AI-generated code consumed time that offset many gains. Proprietary systems and internal APIs created significant compatibility barriers that usefulness alone could not overcome.
  • Social pressure has limited impact. Peer encouragement, organizational mandates, and professional signalling proved surprisingly weak predictors of sustained adoption. Engineers adopt tools when they fit their work — not because colleagues or leadership told them to.
  • Concerns are legitimate signals, not resistance. Job security worries (25%), concern about skill erosion among less experienced developers (16%), data privacy questions (15%), and doubts about code accuracy (13%) all surfaced as barriers. These are rational assessments that organizations ignore at their own cost.
  • The least-disruption principle. At early stages of disruptive tool adoption, engineers prioritize workflow continuity over anticipated benefits. Tools that slot into current practices with minimal friction get used. Tools that require process changes — however beneficial in theory — do not.

What this means for your organization

If your AI rollout is producing disappointing adoption rates, more training and stronger mandates are unlikely to solve the problem. The evidence points to a different set of levers.

  • Assess compatibility before usefulness. Understand how your engineering teams actually work today, in granular detail, and evaluate whether the tools you have chosen fit that reality.
  • Address concerns as design constraints, not resistance to be overcome.
  • Target specific high-friction tasks where engineers already want relief: searching documentation, debugging, generating repetitive code.

Broad “AI transformation” narratives are far less effective than targeted, low-disruption interventions.

Publication: Russo, D. (2024). Generative AI Adoption in Software Engineering. ACM Transactions on Software Engineering and Methodology, 33(5). dl.acm.org/doi/10.1145/3652154

Is AI adoption stalling in your engineering organization?

The problem is rarely the tool. It is almost always the fit between the tool and how your teams actually work. That is a diagnosable, addressable problem.

Get in touch