A New Tool for Peer Recovery Programs
By Jon Scaccia
Last year, staff at a small recovery community center in Texas were working late on yet another grant proposal. They knew their peer-led support groups were saving lives—but when funders asked for hard numbers on cost and impact, the answers were fuzzy. How do you put a dollar figure on recovery support or the lives saved by distributing naloxone?

A new study published in Frontiers in Public Health offers a practical way forward. Researchers teamed up with recovery community centers (RCCs) to design and test a free, web-based cost-effectiveness calculator for two widely used peer-driven interventions:

  • Peer Recovery Support Services (PRSS): long-term, peer-led programs that help people sustain recovery after treatment.
  • Bystander Naloxone Distribution (BND): equipping people likely to witness an overdose with naloxone to reverse it.

The project went beyond traditional economic analysis. It adopted a “nothing about us without us” approach, inviting RCC staff to shape the models, test the prototypes, and ensure the outputs were relevant to community practice.

Why This Matters Now

Peer-driven programs have expanded nationwide in response to the opioid crisis, but funding remains tight. Most economic research focuses on medications or clinical care, rather than peer-led services. Without cost-effectiveness data, RCCs risk being overlooked by policymakers and funders, even though they often fill critical gaps in the care system.

The calculator aims to change that. By letting RCCs plug in their own numbers—like peer worker pay, session length, or number of naloxone kits distributed—the tool generates results in plain language. It shows whether programs are cost-saving or cost-effective compared to usual treatment and provides graphics to help make the case in grants and reports.

How the Study Worked

The research team partnered with two RCCs in Austin, TX. Together they:

  • Reviewed the Evidence: A systematic review helped identify which outcomes mattered most, like quality-adjusted life years (QALYs) and the number of people in recovery after three years.
  • Translated the Math: Complex formulas were rewritten into everyday language. For example, instead of “(RetpRpNtDTt),” the team explained it as “the percent of people who return to substance use and need treatment again.”
  • Co-Created Tools: RCC staff provided input on model assumptions, outcomes that mattered to them, and what data they could realistically track.
  • Tested Prototypes: Staff rated the calculator’s ease of use, clarity of results, and usefulness for their work. Most found it easy to use, though finding input data was a bigger challenge.

Key Findings

  • Feasibility: Community involvement in economic evaluation is possible, even with limited funding and staff capacity.
  • Accessibility: Nearly 60% of users found the calculator somewhat easy to use, and almost 40% said the results were very useful for their organizations.
  • Challenges: Many RCCs struggle with data collection. Without consistent tracking systems, gathering the right numbers to feed into the calculator can be difficult.
  • Impact: The calculator is already helping centers make a stronger case in proposals and stakeholder reports. It has been presented at national conferences and used in trainings across the country.

What This Means for Practice

For health departments and funders:

  • These tools make it easier to see the return on investment of peer-driven programs, helping justify sustainable funding.

For recovery community centers:

  • The calculator provides ready-to-use cost and outcomes data tailored to your own program inputs.
  • It can strengthen grant applications, inform strategic planning, and demonstrate value to stakeholders.

For policymakers:

  • Investing in peer-driven interventions is not just ethically right—it’s economically sound. This study provides a concrete way to prove it.

What’s Next?

The authors note that this is just a pilot. The study involved two RCCs and 24 feedback surveys—small numbers for such an urgent issue. Future research should:

  • Expand to more diverse communities, including rural areas.
  • Enhance data systems to enable RCCs to more easily track inputs and outcomes.
  • Explore other peer-led models beyond PRSS and naloxone distribution.

But the bigger message is clear: economic evaluation doesn’t have to be top-down or inaccessible. When communities co-create the tools, the results are more usable and more powerful.

Barriers and Open Questions

  • Data capacity: How can RCCs build stronger data systems without compromising their peer-driven ethos?
  • Equity in evaluation: Will funders embrace tools that value lived experience alongside traditional metrics?
  • Scaling up: Can this model be applied to other grassroots public health interventions beyond substance use?

Join the Conversation

This research opens new doors for community-driven evaluation. Now, the question is how we use it.

  • How could your organization apply a cost-effectiveness tool like this?
  • What barriers might stop your team from using it effectively?
  • Does this research challenge the way you think about funding peer-driven interventions?

The overdose crisis demands urgency, but it also demands innovation. Tools like this calculator remind us that the people closest to the problem can help design the solutions.
