Scaling the implementation cliff: strategies for increasing the effectiveness of evidence-based interventions in community settings

Approximately one quarter of Americans suffer from mental health problems, and more than 20 million receive treatment annually (1). To address this demand for services, funding agencies have pushed for the development and implementation of evidence-based treatments (EBTs): behavioral health interventions that have outperformed existing treatments in highly controlled studies. Researchers have responded by generating hundreds of treatments (many of which are EBTs) to address a broad range of psychological disorders, including anxiety, depression, trauma and disruptive behavior disorders (2). However, recent studies have identified an “implementation cliff”: interventions are almost universally more effective in controlled laboratory experiments than in the “real world” of community mental health settings, which treat the vast majority of individuals seeking behavioral interventions (3).

One explanation for this implementation cliff is that many providers do not maintain treatment fidelity. Treatment fidelity refers to the extent to which the therapist skillfully implements only those practices prescribed by a given EBT. Think of an EBT as a recipe for chocolate chip cookies and its components as the cookie ingredients. If you add too much sugar, the cookies taste too sweet. If you forget to add chocolate chips, you end up with sugar cookies. Introduce an unfamiliar ingredient and the cookies may be unrecognizable. Just as strict adherence to an established recipe often yields the best-tasting chocolate chip cookies, EBTs are most effective when their core components are implemented as intended. Adjusting the dosage or sequencing of components, or integrating outside treatments, often neutralizes the active ingredients prescribed by a given EBT. Whereas treatment fidelity is generally excellent in clinical research trials, the extent to which EBTs are implemented as intended varies greatly in community settings.

Researchers at the University of Texas have explored two training approaches thought to influence treatment fidelity (4). Most providers learn how to implement EBTs through a combination of self-study (reading a treatment manual), didactic workshop training, and follow-up supervision. These pathways emphasize passive training strategies, whereby information is transferred unidirectionally from trainer to trainee. While passive training experiences elicit reliable increases in trainee knowledge, such gains are rarely accompanied by increased use of EBT-specific practices. Active training techniques, by contrast, facilitate experiential learning through bidirectional interaction and feedback between trainer and trainee. Behavioral modeling, in which the therapist acquires skills by observing and imitating supervisor behavior, and role play, in which the therapist practices skills on a hypothetical client whose role is assumed by the supervisor, are two of the most common active training strategies and are frequently used in tandem. Additional active techniques include goal-setting, performance evaluation and feedback, and opportunities to perform newly acquired skills on the job.

The University of Texas group recently conducted a study examining the effects of active and passive learning approaches on treatment fidelity in novice therapists training to practice cognitive restructuring, a common element of many EBTs (5). All therapists in the study attended a training workshop consisting primarily of didactic instruction. Following the workshop, half of the providers received supervision as usual: three supervision sessions that emphasized mostly passive training strategies. The remaining therapists received three sessions of active supervision involving role play, behavioral modeling, and performance evaluation and feedback. Providers conducted mock therapy sessions with a trained confederate before and after the workshop training and following each of the three supervision sessions, and their performance was evaluated by two independent observers. Consistent with prior research, both groups demonstrated increased knowledge and reported more positive attitudes towards EBTs. Therapists in both conditions demonstrated modest gains in treatment fidelity, expertise (mastery of the specific intervention) and overall therapeutic competence following the workshop training. Therapists receiving active supervision showed increasing fidelity, expertise and competency over the course of supervision, eventually exceeding adequacy benchmarks in each domain. By contrast, therapists receiving passive supervision did not maintain their initial improvements in fidelity, expertise and competency, and failed to meet minimum standards for practice at any time during the study. The findings indicate that while therapists in both training groups developed a strong theoretical understanding of the treatment, only those in the active supervision condition delivered the intervention as intended and demonstrated general clinical competence in doing so.

The study discussed above contributes to a growing body of literature suggesting that from a programmatic training perspective, mental health service organizations should augment traditional didactic training approaches emphasizing passive learning with active training strategies that promote experiential learning. Implementation of active training should not be construed as a panacea for mental health organizations, which are beset by a number of challenges including overwhelming demand for services, a transient workforce and limited financial resources. However, these training strategies are easy to implement and have the potential to increase the effectiveness of EBTs through improvement of provider treatment fidelity.


References

1. Demyttenaere, K., Bruffaerts, R., Posada-Villa, J., Gasquet, I., Kovess, V., Lepine, J., ... & Polidori, G. (2004). Prevalence, severity, and unmet need for treatment of mental disorders in the World Health Organization World Mental Health Surveys. JAMA, 291(21), 2581-2590.

2. Kazdin, A. E. (2000). Psychotherapy for children and adolescents: Directions for research and practice. Oxford University Press.

3. Weisz, J. R., Chorpita, B. F., Palinkas, L. A., Schoenwald, S. K., Miranda, J., Bearman, S. K., ... & Gray, J. (2012). Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: A randomized effectiveness trial. Archives of General Psychiatry, 69(3), 274-282.

4. Bearman, S. K., Weisz, J. R., Chorpita, B. F., Hoagwood, K., Ward, A., Ugueto, A. M., ... & Research Network on Youth Mental Health. (2013). More practice, less preach? The role of supervision processes and therapist characteristics in EBP implementation. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 518-529.

5. Bearman, S. K., Schneiderman, R. L., & Zoloth, E. (2017). Building an evidence base for effective supervision practices: An analogue experiment of supervision to increase EBT fidelity. Administration and Policy in Mental Health and Mental Health Services Research, 44(2), 293-307.

Photo by Paul Bence on Unsplash