A new journal, Implementation Research and Practice, makes its official debut this week as a forum for research examining the implementation of approaches to assess, prevent, and treat mental health conditions, substance use, and other addictive behaviors. Cara Lewis, PhD, an associate investigator at Kaiser Permanente Washington Health Research Institute, and Sonja Schoenwald, PhD, a senior scientist at the Oregon Social Learning Center, are the co-founding editors of the new online publication, which was launched by the Society for Implementation Research Collaboration with SAGE Publications. We sent them seven questions about the decision to start the new journal, their hopes for what it can accomplish, and the burgeoning field of implementation science. Here are their responses.
Cara: The Society for Implementation Research Collaboration (SIRC) convened in 2011 with funding from the National Institute of Mental Health with the specific goal of bringing together stakeholders to improve evaluation of implementation efforts for these conditions. After its fourth biennial conference, SIRC recognized the need for a new journal dedicated to this area of research.
Sonja: Approaches targeting these conditions tend to be interpersonal, complex, and psychosocial in nature; they generally require recipients to engage in multiple encounters over time. Implementing these types of approaches demands great attention to key implementation issues, such as fidelity and adaptation, and warrants implementation strategies across multiple levels (e.g., individual, organizational, community, and systems).
Cara: I began my training as an efficacy researcher, participating in the largest randomized clinical trial for adolescent depression. The data from that trial suggested that cognitive behavioral therapy (CBT), particularly when combined with antidepressant medication, was an efficacious treatment. When we looked in the community, however, it was clear that CBT was essentially not available. I found it incredibly frustrating that we, as a field, could develop effective interventions that the people intended to benefit from them could not access. I pivoted my training and research toward implementation science in 2006, which happened to be the year the field’s first academic journal, Implementation Science, was established.
Sonja: My entry points preceded the moniker “implementation science.” Early in my career, as a master’s level therapist in community mental health agencies, I treated youth and families referred by juvenile justice, child welfare, and refugee placement agencies. My prior undergraduate psychology research training did not map well onto those experiences. When I began pursuing a doctorate in clinical psychology, I aimed to marry the two in ways I hoped would change things for people with serious clinical problems who, at the time, were generally placed in institutions, residential care facilities, or prisons. In that period, research aligned with that goal fell more squarely under the rubric of mental health and substance abuse services research than treatment research.
I had the good fortune of working with Dr. Scott Henggeler, who was doing that work in community-based studies of Multisystemic Therapy, an intensive, home- and community-based treatment originally developed and tested with chronic, violent juvenile offenders and their families. I began to chart the intellectual and empirical territory, with community practice organizations and service systems, of “transportability” research, which was subsequently subsumed within implementation research. Later, in collaboration with scholars from other disciplines (e.g., organizational behavior, health economics, education) and working in other contexts (schools, outpatient mental health clinics), the work expanded in scope to implementation questions and the extent to which scientifically derived answers might vary or remain similar across treatments and contexts.
Cara: Research shows that psychotherapy is optimal when combined with tools to assess mental health problems. This approach is known as measurement-based care (MBC) because the results from the assessments inform and improve the care that is delivered. But evidence of an effective practice isn’t enough to change provider behavior. We recently completed an implementation trial in which we found that tailoring implementation strategies to the unique needs, resources, and contexts of community mental health clinics improved the uptake of MBC compared with a standardized package of strategies that included training, consultation, and electronic health record enhancements. This led to more patients receiving psychotherapy optimized with MBC.
Sonja: Although there are a number of factors, a few stand out as particularly relevant. These include the publication in the mid-1990s of meta-analyses that illuminated differences in the effects and contours of mental health treatments in controlled studies versus community-based practice, and a number of reviews documenting a long time lag between establishment of treatment efficacy and larger-scale use. Around the same time, reviews of health care research in the U.K. and the U.S. noted the need for evidence-based implementation of evidence-based medicine. Several monographs offered narrative reviews of efforts to disseminate and implement innovations in health care and other industries and service sectors, and in various nations. Leaders in research, policy, and practice recognized that getting from reviews to testable hypotheses and scientific advances requires a more explicit focus on a science of implementation.
Cara: Now, 14 years after the field’s flagship journal was first published, the field is booming with several international conferences, requests for funding, training forums and degrees, for instance, resulting in increased production of empirical studies, conceptual frameworks, and methodologies. There clearly is a need for more scientific outlets.
Sonja: Indeed it does. We hope that readers will check out an invited commentary by David Chambers, DPhil, a deputy director of implementation science at the National Cancer Institute, on “Considering the Intersection Between Implementation Science and COVID-19.” In addition to the general implementation science issues discussed in his article, the pandemic is having an unprecedented impact on the mental health of the general population, which presents unique implementation challenges for taking evidence-based interventions to scale.
Sonja: Along with Dr. Chambers’s COVID-19 commentary, there’s an editorial welcome by Cara and me that offers a brief history of the relationship between implementation science and behavioral science and the promise of work in this area going forward. These will be followed later in the summer by a collection of systematic reviews of methods to measure implementation constructs, empirical evaluations of the nature and costs of implementation strategies in substance abuse and mental health contexts, and innovative methodologies for assessing the implementability of complex psychosocial interventions. We will also regularly post calls for papers on selected topics.
Cara: We would like to see our journal bring together the science and practice of implementation in one outlet, and rapidly make available to readers worldwide advancements that are co-created by stakeholders.
We hope that researchers and other stakeholders reading this will consider Implementation Research and Practice as an outlet for their work, and a place to find usable knowledge and practical methods to support the implementation of evidence-based practices, programs, and policies.
To read or to contribute to Implementation Research and Practice, please visit the journal’s website at Sage Publications.