Is the 17-Year Evidence-to-Practice Gap Holding Us Back?
By Jon Scaccia

The meeting room was packed. Local health leaders, researchers, and frontline practitioners squeezed into chairs as a project team unveiled its new implementation plan. Early on, someone mentioned it—the familiar line many of us in public health have heard for years:

“It takes 17 years for research to make it into practice.”

Heads nodded. People scribbled it into notebooks. The number felt solid, urgent, even motivating. After all, who wants to wait nearly two decades for communities to benefit from good science?

But what if that number—repeated in academic articles, grant proposals, and conference keynotes—isn’t just oversimplified? What if it’s actually distracting us from the real challenges of implementing change in complex systems?

A new commentary argues exactly that: the evidence-to-practice gap is real, but the “17-year” number has outlived its usefulness. And understanding why matters for anyone trying to advance health equity, scale effective interventions, or build learning health systems.

The Evidence-to-Practice Gap: More Complicated Than We Think

For more than two decades, implementation science has been built, in part, on the idea that research takes far too long to reach communities. The “17-year gap” comes from a 2000 article by Balas and Boren, and that paper has been cited more than 2,200 times.

But here’s what the new commentary highlights: the original article never actually claimed a universal 17-year timeline.

The “17-year gap” was an average across a small number of clinical specialties, based on unclear definitions of what “using evidence” actually meant. Some specialties moved faster, others slower. Some counted evidence uptake as reaching textbooks; others counted it as 50% of clinicians using a practice. The reality was—and still is—far messier than a single statistic can capture.

Why Old Evidence No Longer Fits Today’s World

Here’s another problem: the 17-year estimate is based on data collected before the iPhone, Google Scholar, Twitter, or modern learning health systems existed.

In 2000:

  • Social media did not spread health information at the speed of light.
  • Knowledge brokers, embedded researchers, and implementation specialists were rare.
  • Major frameworks like CFIR and EPIS had not yet been developed.
  • Implementation science itself was just beginning to formalize as a field.

Today, we have:

  • Dozens of validated theories, models, and frameworks
  • Health system embedded research programs
  • Decision-support technologies
  • Routinized quality improvement structures
  • Global innovations from Indigenous, LMIC, and community-led implementation science

The infrastructure supporting implementation is fundamentally different. Using a number from the pre-smartphone era is like trying to navigate with a 25-year-old paper map.

What’s the “Right” Timeline? We Don’t Actually Know.

Even if we agree 17 years is outdated, a natural question follows:

So what is the optimal pace of implementation?

The commentary notes: we haven’t figured that out yet. Implementation speed depends on factors such as:

  • Intervention risk
  • Local capacity
  • Trust between partners
  • Political stability
  • Organizational readiness
  • Community infrastructure
  • Equity considerations
  • Need for adaptation

Sometimes faster is better, as during infectious disease outbreaks. But moving too fast can produce bad outcomes, as the spread of unproven COVID-19 treatments showed.

In some settings, especially those under-resourced or historically marginalized, implementation simply takes longer because communities need space for trust-building, adaptation, and co-creation.

The takeaway: speed alone shouldn’t be our north star. Quality, equity, and sustainment matter more.

Implementation Science Has a Better Story to Tell

Implementation is rarely linear. It’s not a tidy staircase from “evidence” to “action.” It’s a complex ecosystem shaped by:

  • competing priorities
  • political dynamics
  • shifting resources
  • community needs
  • interpersonal relationships
  • system history

As one study notes, implementation often feels “torturous, political, and governed by chaos more than straight-line models.” The 17-year trope flattens that complexity. It hides the relational labor, equity work, adaptation, and system constraints that make implementation challenging yet also rewarding.

The field has grown far beyond its early days. The most exciting stories now come from places like:

  • Aotearoa New Zealand, where researchers developed Indigenous-centered implementation models
  • LMICs developing frameworks that reflect decentralized health systems
  • U.S. communities using learning health systems to integrate real-time data into practice
  • Public health departments partnering with communities to build shared governance and implementation support structures

The field’s richness is lost when we rely on an oversimplified number.

What This Means in Practice

For agencies, funders, practitioners, and researchers, the commentary suggests a shift in mindset—and messaging.

1. Stop citing the 17-year number

It’s outdated and doesn’t reflect the current science.

2. Emphasize context-specific implementation timelines

Ask: What does implementation “success” look like in this setting? What is the right pace for this intervention, population, and system?

3. Recognize that relationships and trust take time

Equity-centered implementation requires shared power, community collaboration, and distributed leadership—all of which require meaningful investment.

4. Highlight the global and diverse foundations of modern implementation science

Move beyond Western-centric assumptions about how change unfolds.

5. Reframe the narrative around equity, access, and shared benefit

The authors suggest turning to the Universal Declaration of Human Rights:
Everyone has the right to share in scientific advancement and its benefits.

This reframes implementation not as a race against time, but as a commitment to ensuring that science reaches all people fairly.

Barriers & Open Questions

The commentary leaves us with questions the field still needs to tackle:

  • How do we measure “optimal” implementation speed in a way that respects context?
  • How do we prevent rushed implementation in crises from creating long-term community distrust?
  • What structures are needed so that marginalized communities benefit from scientific advances at the same pace as well-resourced ones?
  • How do we build global implementation frameworks that reflect truly diverse systems and cultures?

These are not technical questions—they’re ethical, relational, and structural.

What’s Next?

The field of implementation science is evolving quickly. As we move forward, the challenge isn’t to replace “17 years” with a different universal timeline—it’s to acknowledge what implementation science has always known:

Complex problems require context-sensitive solutions. And context takes time.

Reflection Questions for Readers

  • How might your organization rethink the way it talks about implementation timelines?
  • What structures or relationships could accelerate—not rush—implementation in your setting?
  • Does this shift challenge any assumptions you’ve held about the evidence-to-practice gap?
