Original Research

Implementing adolescent SBIRT: Findings from the FaCES project


Abstract

Background: Screening, Brief Intervention, and Referral to Treatment (SBIRT) is an evidence-based approach to early intervention for substance misuse. Methods: This mixed-methods evaluation assessed the implementation of an adolescent SBIRT change package across 13 primary care clinics. These clinics participated in an 18-month learning collaborative, during which they received training and technical assistance on SBIRT practices. Results: Six major themes emerged around the implementation of the change package: operational readiness of the sites, training of staff members, factors around the screening process, factors around intervention delivery, the referral process, and the adaptation and utilization of the electronic health record (EHR). Conclusions: Through the guidance of the change package and the associated training and technical assistance, the participating primary care clinics were able to implement SBIRT practices within their existing workflows. There was also an observed reduction in reported substance use among the at-risk adolescents served by these clinics.

Introduction

Substance use among adolescents is an ongoing and critical public health issue. Individuals in the United States are most likely to start using drugs during their adolescent years.Citation1 The 2019 Monitoring the Future report collected questionnaires from 42,531 students in 396 secondary schools to determine the prevalence of substance use among American youth. By the end of high school, nearly 6 out of every 10 students (58.5%) have consumed alcohol (more than just a few sips), and about a quarter (24.5%) have done so by 8th grade.Citation2 Overall use of illicit drugs, including marijuana, was reported by 20.4% of 8th-grade students and 47.4% of 12th-grade students.Citation2 Adolescents are at high risk for the adverse effects of drug use and misuse because their brains are still developing, specifically the prefrontal cortex.Citation1 Drug use is also linked to other patterns of risk-taking behavior. Dangers to adolescents include altered brain development, difficulties in school, issues in interpersonal relationships, driving while intoxicated, engaging in unsafe sex, and increased risk of HIV and other sexually transmitted infections (STIs).Citation3–5 Therefore, it is recommended that evidence-based screening tools and interventions be used with adolescents to detect and treat substance use problems early.

Screening, Brief Intervention, and Referral to Treatment (SBIRT) is an evidence-based practice that identifies and responds to substance misuse through three core components: screening, brief intervention, and referral to treatment.Citation6 Implementation of SBIRT, in conjunction with motivational interviewing, in a pediatric practice was shown to increase provider competence and rates of screening for substance use.Citation7 Screening should be conducted with an age-appropriate instrument, such as the S2BI, in order to prevent, reduce the risk of, identify, and treat substance use in adolescents.Citation8 This approach of addressing substance use along a continuum allows for earlier identification and intervention than practices that focus solely on substance use disorders.Citation6 For adolescents whose screens indicate alcohol, tobacco, or other drug use, a brief intervention is delivered and, when necessary, a referral for further assessment or treatment may be made.Citation9 In addition to decreasing substance use, SBIRT used with adults has been associated with healthcare cost savings ranging from $3.81 to $5.60 for each dollar spent.Citation10,Citation11 As of 2015, 16 states had included SBIRT within their Medicaid reimbursement codes.Citation10 SBIRT has been implemented in various settings, including primary health care, integrated care, hospitals, and federally qualified health centers (FQHCs). In its most recent statement, the United States Preventive Services Task Force (USPSTF) recommends SBIRT for alcohol and drug use in adults.Citation12 Similarly, the American Academy of Pediatrics recommends SBIRT as part of routine care for adolescents and has provided guidance on best clinical practices.Citation13

While SBIRT has been implemented in primary care, emergency rooms, schools, and community-based organizations, there is a need for more extensive research, as the current body of literature is sparse, particularly as it relates to SBIRT for adolescents.Citation12,Citation14 In one investigation into adolescent SBIRT, Stanhope et al. evaluated adolescent SBIRT within community mental health centers reaching 2,873 adolescents.Citation15 This investigation found that 52.8% (n = 1,517) had positive results on their alcohol and drug screens; of those, 87.3% received a brief intervention and/or referral to treatment. It has been shown that the implementation of SBIRT with adolescents increases substance misuse recognition, brief intervention, and referral to treatment.Citation16,Citation17 However, these studies have conflicting results on which may be the better model of SBIRT delivery, pediatrician-delivered or behavioral health professional-delivered. Mitchell et al. found that physician-delivered SBIRT practices reported higher rates of brief intervention delivery, while Sterling et al. found that the behavioral health professional model demonstrated higher rates of intervention delivery.Citation16,Citation17

Goplerud and McPherson reviewed the elements of successful SBIRT implementation for adults in primary care settings and found that leaders must be supportive, interventions and questionnaires must be adapted to the individual practice and needs of patients (verbal, written, time spent, etc.), and a standardized questionnaire or form is needed to promote consistency and ease of implementation.Citation18 Implementation is affected by the external political, economic, and social contexts, and in order to decrease barriers it is advised to: standardize EHRs and substance misuse inventories by modeling SBIRT measures after screening measures already recorded, make brief interventions core to patient-centered medical practice, commit greater effort to integrating substance use services within health initiatives, remove barriers to intra-provider communication, remove barriers to reimbursement, and institute payer-driven accountability for services provided.Citation18,Citation19 Training and skills development, including consultation with established SBIRT providers, were also identified as key to SBIRT success.Citation18,Citation19 Workforce plays a large role in the delivery of SBIRT; this can be supported through pre-professional primary care workforce development, education programs for behavioral health personnel, and recognition of those who do an outstanding job implementing SBIRT.Citation18,Citation19 Much of the work on the factors surrounding successful implementation has focused on clinics serving mixed populations of adolescents and adults. Previous studies on the implementation of SBIRT in adolescent-serving clinics have provided a less comprehensive picture of the factors surrounding successful adoption. This study aims to expand this existing knowledge of the key activities and organizational characteristics critical to successful SBIRT implementation specifically for adolescents in primary care settings.

Program description

The Conrad N. Hilton Foundation’s initiative to prevent adolescent substance use through access to early intervention funded a range of projects involving innovative screening and early intervention approaches in order to advance the field in training, implementation, and technical assistance for adolescent SBIRT services. Grantees executed initiatives to support the research and evaluation of adolescent substance use prevention interventions, including the National Council for Mental Wellbeing’s (National Council) Facilitating Change for Excellence in SBIRT (FaCES) project.

The FaCES project consisted of three phases. The first involved the development of an evidence-informed change package, or implementation guide, for adolescent SBIRT in primary care, with input from a national group of SBIRT experts. The FaCES change package includes a series of change concepts that marry clinical guidance with the actual operational changes required for effective practice transformation. The second phase was designed to test the utility of the FaCES change package via a pilot program. Thirteen FQHCs were selected through a request-for-proposal process to participate in an 18-month learning collaborative to implement the FaCES change package. Selected sites varied in their readiness for implementation, geographic location and setting, center size, and patient population demographics (Table 1). Sites received targeted, multi-modal, and responsive training on a range of topics, including SBIRT and behavioral health integration, developing and sustaining community partnerships, navigating confidentiality, and data-driven decision-making. The final phase consisted of refinement of the FaCES change package based on the lessons gleaned from the learning collaborative.

Table 1. Characteristics of the clinical sites involved in the learning collaborative (n = 13).

The FaCES change package recommended that clinicians conduct universal screening of adolescent patients using the Screening to Brief Intervention tool (S2BI), which has demonstrated high sensitivity and specificity in identifying risky substance use.Citation20 The S2BI is also accompanied by a range of recommended interventions based on the screening results, and the change package provides further guidance on how providers should respond to patients based on the results of the screening instrument (Table 2).

Table 2. S2BI scoring algorithm.
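As an illustration only, and not a reproduction of the project’s Table 2, the sketch below shows how a clinic might encode the triage logic described in the published S2BI tool, in which a past-year use-frequency response maps to a risk level and a suggested clinical response.Citation20 The category labels, risk names, and function name here are assumptions made for the example.

```python
# Illustrative sketch only: one possible encoding of S2BI-style triage logic.
# The frequency categories and suggested responses are assumptions drawn from
# the published S2BI literature, not a reproduction of this project's Table 2.

RISK_BY_FREQUENCY = {
    "never": "no reported use",
    "once or twice": "lower risk",
    "monthly": "moderate risk",
    "weekly or more": "severe risk",
}

RESPONSE_BY_RISK = {
    "no reported use": "positive reinforcement",
    "lower risk": "brief advice",
    "moderate risk": "brief intervention",
    "severe risk": "brief intervention and assessment for referral to treatment",
}


def triage(frequency_response: str) -> tuple[str, str]:
    """Map a past-year use-frequency response to (risk level, suggested response)."""
    risk = RISK_BY_FREQUENCY[frequency_response.strip().lower()]
    return risk, RESPONSE_BY_RISK[risk]


if __name__ == "__main__":
    print(triage("Monthly"))  # ('moderate risk', 'brief intervention')
```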

The purpose of this paper is to summarize evaluation findings regarding the experience with, and progression of, SBIRT implementation among the FQHCs utilizing the FaCES change package created by the National Council. This study received appropriate Institutional Review Board approval.

Method

This evaluation assessed learning collaborative participants on organizational readiness, capacity, and implementation outcomes to examine the facilitators, barriers, and successes experienced when implementing SBIRT in their clinics. A key aspect of assessing the success of the FaCES change package was monitoring the organizational changes of the clinics as they implemented SBIRT into their workflows. This monitoring included an organizational self-assessment (OSA), an implementation survey, and the feedback received during site visits.

The National Council designed the OSA as a performance improvement tool for organizations to integrate sustainable and effective SBIRT into their service delivery. The instrument was completed electronically by clinical leadership at participating FaCES sites at the beginning of the learning collaborative, approximately 12 months in, and again at the end of the learning collaborative. The OSA contains multiple scales and items related to SBIRT implementation, including commitment from senior leadership, formation of a multidisciplinary core implementation team, developing procedures and tools for continually tracking and analyzing data, and practicing continual process improvement. Scores on the OSA represent how well the site meets each standard, ranging from 0 (not meeting the standard at all) to 4 (exemplary). The OSA also contains items asking sites to approximate the time it takes to deliver a brief intervention (BI), to identify which staff are responsible for the screening and intervention procedures, and to report how sites were funding SBIRT activities.

An implementation survey developed by Friends Research Institute (FRI) was electronically distributed at the beginning and again at the end of the learning collaborative to develop a broader picture of the breadth and depth of training received, as well as barriers and facilitators to SBIRT implementation. The implementation survey assessed clinic functioning across multiple domains, including organizational readiness, organizational capacity, training of staff, acceptance of SBIRT practices, and adherence to SBIRT practices. The survey assessed these elements of implementation across various clinical positions. The survey was tailored to the type of respondent, based on role within the organization, and only questions pertinent to that role were asked in each branching survey.

Evaluation site visits with each clinic participating in the learning collaborative were conducted by FRI. These visits took place mid-way through the learning collaborative process. During the site visits, formal and informal interviews were conducted with various staff members, including clinical leadership and clinical staff. Overall, the site visits focused on collecting data on screening procedures, intervention processes, referral networks, and general clinic workflows.

EHR data were also collected from the participating FaCES sites on a monthly basis. The National Council provided sites with a standardized set of SBIRT data points to assist them in tracking population health and quality improvement elements. These data fell into several categories: client demographics, screening process, screening results, interventions delivered, and referral information. Throughout the learning collaborative, the EHR data were turned into Tableau dashboards, which were returned to the sites following each data submission. The data were also analyzed to assess fidelity to the SBIRT model.
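As a minimal sketch only, one monthly EHR submission record might be organized along the category lines described above; the field names below are hypothetical and do not reproduce the National Council’s actual data specification.

```python
# Hypothetical sketch of a single SBIRT data-submission record, organized
# around the categories described above. Field names are illustrative only.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class SbirtRecord:
    # Client demographics
    client_id: str
    age: int
    sex: str
    race_ethnicity: str
    # Screening process and results
    screening_date: Optional[date]
    screening_tool: str = "S2BI"
    screening_result: Optional[str] = None   # e.g., "no use", "lower risk"
    # Intervention delivered
    intervention: Optional[str] = None       # e.g., "brief advice", "brief intervention"
    # Referral information
    referral_made: bool = False
    referral_type: Optional[str] = None      # "internal" or "external"
    referral_attended: Optional[bool] = None
```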

Analysis

Quantitative analysis

OSA data were analyzed only from the sites that submitted results at all 3 time points, to give an unbiased picture of how sites changed over time. Means and standard deviations were calculated for each scale across sites at each time point. Effect sizes (Cohen’s d) were calculated to assess the magnitude of the change between baseline and the final time point. Values for d were interpreted as 0.2 representing a small effect, 0.5 representing a medium effect, and 0.8 representing a large effect.Citation21 For the OSA item asking sites to endorse how long their average BI takes, a Spearman’s rho was calculated to determine whether the average BI time changed across the 3 time points. This value was interpreted as 0.1 representing a small effect, 0.3 representing a medium effect, and 0.5 representing a large effect.Citation21
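For reference, Cohen’s d here presumably takes its usual form: the change in a scale mean from baseline to the final time point divided by a pooled standard deviation. The exact pooling used is not specified, so the version below, which averages the variances at the two time points, is only one common variant:

```latex
d = \frac{\bar{x}_{\text{final}} - \bar{x}_{\text{baseline}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{s_{\text{baseline}}^{2} + s_{\text{final}}^{2}}{2}}
```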

Implementation survey results were disaggregated by position within the clinic (e.g., nurse, primary care physician [PCP]) to understand how implementation may have varied across the different clinical roles. Means and standard deviations were calculated for each scale across sites and across administrations of the instrument. Effect sizes (Cohen’s d) were calculated for each clinical position comparing scores on each item from baseline to follow-up. The barriers to screening and implementation were summarized as the percentage of each role endorsing a particular barrier at each time point.

The EHR data were descriptively summarized to assess the characteristics of the patients served through this project. Additionally, the EHR data were used to assess whether sites were able to more consistently record completed screening results over time and whether staff improved in matching screening results to the intervention delivered. Time was entered into binary logistic regression models to determine whether it predicted, for each clinical record, whether a complete screening result was recorded and, separately, whether the indicated intervention matched the risk level of the S2BI screen. The EHR data were also analyzed to assess changes in S2BI scores among patients who screened as having used substances in the past year at their initial appointment and received at least one follow-up screening. For patients with multiple records, each subsequent record was coded for days since their baseline screen. Changes in risk level over time were assessed through multi-level regression modeling to determine whether substance use decreased over time.

All quantitative analyses were conducted through SPSS version 23 software.
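Purely as an illustrative sketch, and not the study’s SPSS syntax, the two EHR models described above could be specified as follows; the column names and synthetic data are hypothetical stand-ins for the project’s actual variables.

```python
# Illustrative sketch of the two EHR models described above. Column names and
# the synthetic data are hypothetical stand-ins for the project's variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
ehr = pd.DataFrame({
    "client_id": rng.integers(0, 100, n),            # repeated records per client
    "days_since_baseline": rng.integers(0, 365, n),
})
# Synthetic outcomes: record completeness improves and risk declines with time.
p_complete = 1 / (1 + np.exp(-(-0.5 + 0.004 * ehr["days_since_baseline"])))
ehr["complete_screen"] = rng.binomial(1, p_complete)
ehr["risk_level"] = np.clip(
    2 - 0.003 * ehr["days_since_baseline"] + rng.normal(0, 0.5, n), 0, 3
)

# 1) Binary logistic regression: does time predict a complete recorded screen?
screen_model = smf.logit("complete_screen ~ days_since_baseline", data=ehr).fit()
print(screen_model.summary())

# 2) Multi-level (mixed) model: does S2BI risk level decline over time,
#    accounting for repeated screens nested within clients?
risk_model = smf.mixedlm(
    "risk_level ~ days_since_baseline", data=ehr, groups=ehr["client_id"]
).fit()
print(risk_model.summary())
```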

Qualitative analysis

The qualitative interview and observational data produced through the FRI site visits were mapped onto core components of SBIRT implementation. The site visit summaries produced for each individual FaCES site were inspected for themes that emerged across sites.

Results

Over the course of the FaCES change package implementation evaluation, 6 key themes emerged as major contributors or barriers to the overall success of the implementation:

Operational readiness

Site visits indicated the importance of team cohesion and a dedicated core implementation team. Communication and cohesion of the team emerged as factors in implementing the new system, presenting in four sub-themes: core implementation team cohesiveness, effective communication, engaged staff, and well-functioning preexisting systems for delivering care. The site noted during its site visit to experience the most resistance from staff in implementing SBIRT was also noted to exhibit a lack of trust, communication, and cohesion between the organizational leadership and the project implementers.

The results of the OSA show a generally strong commitment from senior leadership, starting at baseline (Table 3). Sites rated their commitment from senior leadership relatively high at baseline (3.17). On this outcome, there was a marginal to small increase observed through the course of the learning collaborative (d = 0.17). The core implementation team was rated less strongly at baseline (2.87) but increased with a moderate to large effect over time (d = 0.73). For both these scales, the largest increase was observed between baseline and the midpoint assessment. Overall, sites began the learning collaborative relatively lower in their readiness to track and analyze data, as well as to practice continual quality improvement. However, over the course of the learning collaborative, sites reported large increases in both data (d = 0.90) and quality improvement (d = 1.65) capabilities.

Table 3. Results from the organizational self-assessment, means (SD) and effect size (d) (n = 9).

A barrier indicated by several sites through the OSA was difficulty securing funding due to a lack of billing codes and a reliance on grants. Sites reported embedding SBIRT services within other billable services and/or relying on grant funding.

Training

The need to train staff on SBIRT practices emerged as a theme through the site visits; while execution varied, training was consistently presented as being connected to implementation. Sites viewed adequate training as directly related to successful program implementation. The change package instructs sites to have initial and ongoing training opportunities for staff involved in SBIRT delivery but allows individual sites to determine what mode of instruction aligns most closely with their needs and capacity. Training consisted of external efforts by the National Council and internal efforts by the sites. Throughout the learning collaborative, an array of training and technical assistance options was made available by the National Council to each clinical site. However, the specific training provided to each site was dictated by site needs. Sites varied in how they utilized the provided training opportunities. Some organizations trained all staff, while others trained select staff only. Other sites used already-trained staff to facilitate the training of remaining and future staff. Helping staff understand how to fit SBIRT activities into their existing clinical workflow was another issue raised.

The results of the OSA align with site visit feedback indicating that sites faced a substantial need for training (see Table 3). At baseline, approximately half of screeners and interventionists had received training in SBIRT. By the final time point, both figures were above 85%, constituting large effects (d = 0.78 and d = 0.73, respectively).

The implementation survey assessed training needs across different staff positions. At the beginning of the learning collaborative, behavioral health counselors (BHCs) and PCPs reported high levels of formal training specifically for the screening of substance use, at 87% and 79%, respectively. Nurses had a lower proportion who had received training in screening, at 56%. This finding should be taken in the context of the OSA result that nurses and medical assistants (MAs) are the most commonly used staff positions for administering screens. BHCs also had the highest proportion of formal training in delivering BIs for substance use, at 87%. PCPs had the second-highest proportion of formal training in BIs (72%), followed by nurses (61%), care coordinators (44%), and front desk staff (42%). This should also be taken in the context of the OSA finding that PCPs were the most common staff administering the BIs, but that most sites used a combination of positions, including BHCs and nurses.

Screening process

Through the site visits, several issues around screening arose that the individual clinics had to navigate. Sites required a period of time to refine the screening process, including incorporating the S2BI into their workflows. Sites also had to determine what additional screenings they would conduct. There was also the matter of who was to administer the screen and how it would be administered (electronic or pencil-and-paper). What emerged was that sites were tailoring the screening guidance from the FaCES change package to best fit within their individual clinics.

The implementation survey found there was increased confidence in delivering screenings for some clinic positions, but not for others (Table 4). Moderate increases in screening confidence were observed for nurses and PCPs (d = 0.42 and d = 0.49, respectively). MAs/front desk staff, care coordinators, and BHCs showed negligible increases in confidence, though BHCs had the highest level of confidence to begin with. In addition to this increased confidence, there was an observed increase over time in the proportion of clients with complete screening results recorded within the clinical records (Wald(1) = 65.48, p < 0.001).

Table 4. Results from the implementation survey by clinic position, means (SD) and effect size (d).

The implementation survey also identified barriers, by position, associated with administering screens (Table 5). The 3 most common barriers were insufficient time, lack of trust that the patient would be truthful about their substance use, and lack of privacy to administer the screen. However, the EHR data indicated that despite these barriers, sites were routinely screening patients. Of the total number of clients encountered, 89.6% had substance use screening information in their records.

Table 5. Primary barriers identified through the implementation survey by clinic position.

Intervention delivery

The implementation survey contained multiple items related to intervention delivery (see Table 4). For confidence in counseling patients on substance use, both BHCs and PCPs rated their baseline confidence higher than care coordinators, but only BHCs reported increased confidence at follow-up (d = 0.24). When assessing the belief that BIs are effective, only marginal differences existed among staff positions at baseline. Small to medium increases in this belief were observed among care coordinators (d = 0.42), nurses (d = 0.58), and PCPs (d = 0.28). Interestingly, there was a small decrease in this belief among MAs/front desk staff (d = 0.23). There was no change in this belief among BHCs, though they did have the highest rating at baseline. When assessing familiarity with BI procedures, BHCs reported the highest level of baseline familiarity, followed by PCPs, then care coordinators. Among these three groups, small to medium increases in familiarity were observed among the BHCs (d = 0.34) and care coordinators (d = 0.29). These findings should also be taken in the context of the OSA finding that PCPs were the staff most commonly reported as delivering the BIs.

The implementation survey also assessed barriers to providing BIs (see Table 5). The top three barriers were the same as those identified around screening: insufficient time, lack of trust that the patient would be truthful about their substance use, and lack of privacy. The constraint around time was particularly pronounced among PCPs, with 81% reporting this barrier at baseline. This is again notable in light of the OSA finding that the most common position delivering BIs was PCPs. The OSA assessed the approximate average time being spent on a BI (Figure 1). The results indicate a large effect toward reducing the amount of time spent delivering each BI (ρ = −0.51). In addition to becoming more efficient in delivering BIs, analysis of the EHR data revealed that clinic staff improved in matching screening results with the appropriate intervention over time, per the algorithm contained in Table 2 (Wald(1) = 19.45, p < 0.001). Among the 3,811 clinical records that had both screening and intervention detail, 3,510 (92.1%) received the appropriate intervention. Table 6 outlines the proportions of cases with an appropriately matched intervention through the course of the learning collaborative.

Figure 1. Organizational self-assessment results for number of sites endorsing their average time spent delivering a brief intervention at each time point (n = 9).


Table 6. Proportions of cases with intervention matching screening results over time.

Referral process

The change package provides information on the importance of appropriate referral sources and guidance on establishing partnerships with outside organizations when needed. However, the change package does not dictate whether sites should establish internal capacities to provide substance use services or whether they should prioritize external networks. Site visit summaries showed that sites varied in terms of their capacity to make internal referrals. The organizations that had integrated behavioral health or substance use services would treat appropriate cases with internal referrals. These internal referrals may be to different individuals or programs within the organization based on the needs of the patient and the scope of practice of the integrated clinician(s). Those that did not have this internal capacity would refer patients to appropriate external services.

All but one site indicated, through the OSA, having a network of substance use treatment providers at the outset of the learning collaborative, with most sites indicating the ability to provide both internal and external referrals for substance misuse. Referral capacities and sources remained stable over time. The biggest difference revealed through this evaluation was in the ability to track referrals. Analysis of the EHR data indicated that internal referrals were far more likely than external referrals to have an indication of whether the referral was attended by the patient.

Electronic health record

Site visit summaries uncovered stark differences in sites’ ability to modify their respective EHR systems and in how this impacted their implementation, reporting, and workflows. Across sites, records were stored directly in the EHR, as scanned files, as paper files, or as some combination of the three. The double documentation cited among sites without a modifiable EHR was identified as a substantial barrier to integrating SBIRT practices into their workflow. Several sites identified this as a drawback due to the extra staff time required, as well as the fact that the results could not be run as a report or used to determine follow-up timelines, which had a negative impact on quality improvement efforts.

Though the OSA showed overall large effects for increases in both having procedures and tools for continually tracking data (d = 0.90) and practicing continual process improvement (d = 1.65; see Table 3), the individual scores provided by sites indicated that several sites struggled with these components. Only one site identified itself as exemplary (4.0) at tracking and analyzing data at the final time point. No site had an exemplary rating for continual process improvement at the final time point, and two sites rated themselves as not meeting this standard at all (0.0) at baseline.

Patient outcomes

Among the patients who initially screened as having used substances in the past year and who had at least one follow-up screen, there was a significant trend toward reduced substance use over time (F(1,547) = 64.58, p < 0.001). Table 7 contains the proportions of patients in each risk category at baseline and at 90+ day follow-up, among those patients who screened as at least low risk at baseline.

Table 7. Screening results at baseline and follow-up for patients reporting substance use (n = 245).

Client characteristics

The EHR data submitted by the participating FaCES sites were compiled, and a total of 12,126 unique patients within the eligible age range were identified as having engaged in clinic services. Table 8 contains the demographic and substance use characteristics of these patients.

Table 8. Demographic and substance use characteristics of patients served through the FaCES initiative.

Missing values

Several of the indicators had missing values. For the OSA, 9 of the 13 sites provided data at all 3 time points, meaning a substantial number of clinics were missing data on at least one OSA administration.

Not all sites provided EHR data each month. One site left the learning collaborative near the end of the project and, therefore, did not provide EHR data for the final months of the collaborative. The EHR data also had significant amounts of missingness, particularly related to tracking interventions. Overall, approximately half the client records submitted did not have any indication of what, if any, intervention was delivered. It could not be determined if this was indicative of a lack of interventions being delivered or if it related to the difficulty in tracking this data.

For the implementation survey, the overall response rate was approximately 61%: 55% at baseline and 66% at follow-up. The survey was also anonymous; therefore, it could not be determined whether the individuals providing data at baseline were the same individuals who responded to the follow-up administration.

Discussion

The findings of this evaluation support that sites were able to successfully implement SBIRT through the use of the FaCES change package and the training and technical assistance provided during the learning collaborative. Sites demonstrated meaningful growth during the learning collaborative on most metrics of organizational readiness and practice implementation. The findings also support the effectiveness of SBIRT practices in addressing adolescent substance misuse.

The participating sites began the learning collaborative with varying levels of readiness to implement SBIRT practices. Though variation existed in progress, most gains in operational readiness were observed in the first half of the learning collaborative. This indicates that many of the elements of the change package were able to be adopted within the first 9 months of implementation.

A critical task that sites faced at the outset of the learning collaborative was training key staff in the implementation of SBIRT. This was particularly important for nursing staff, as a large proportion of them indicated training needs for both screening and intervention. Nursing staff were also frequently reported as being involved in the screening and/or intervention process. To meet these training needs, sites were able to utilize training and technical assistance from the National Council in addition to the change package itself.

Several findings emerged around the process of screening patients. Sites had to navigate the process of tailoring their screening workflow to meet clinic and patient needs. Per the change package, each site began administering the S2BI but also had to take a critical look at what else they were currently screening for and how to best streamline workflows for their clinic, as well as train staff on proper screening procedures. Ultimately, sites employed tailored solutions for this process to best meet their patients’ needs and fit the process within their workflow capacity. For instance, some sites reported asking parents to leave the room when the adolescent filled out the screen. Another site was able to administer the S2BI via electronic tablet, which removed the step of needing to manually enter data from a paper form. Overall, sites were able to screen a high proportion of patients through the course of the collaborative.

Delivering the BIs constituted a major shift for the practices, particularly for medical professionals. In general, the mental health professionals (BHCs) began the learning collaborative with higher levels of familiarity and comfort with SBIRT practices than medical professionals (e.g., nurses, PCPs). However, during the learning collaborative, staff reported becoming more confident and competent in delivering the BIs. Over time, staff also improved at matching the correct intervention to a patient’s particular risk level. Staff reported requiring shorter periods of time to deliver a BI, indicating that they became more efficient with the practice and were able to tailor the BI delivery approach to fit within the time constraints that exist in the primary care setting. The majority of sites reported on the OSA at the final assessment that their brief interventions were, on average, taking 5–15 min, which corresponds with the extant literature’s recommendation of how much time a typical BI should require.Citation22–24 There was also an observed increase in the perception that BIs are effective among most providers.

At the beginning of the learning collaborative, almost all the sites reported having a network of substance use providers to refer to. What separated the referral process was whether the organization had the capacity to provide internal referrals. Sites indicated a preference for making internal referrals because they were able to directly “hand off” the patient to the substance use treatment provider. The finding that sites documented follow-up for internal referrals at a higher proportion indicates that an integrated care setting may better facilitate care coordination. Sites that did not have the capacity for internal referrals reported lacking the capacity to manage ongoing care past the intervention phase.

Another major challenge for sites was being able to tailor their EHRs to store relevant SBIRT indicators, including the screening results, intervention, and referral data. The sites varied in their ability to store data, with some sites being able to update their EHRs relatively quickly. Other sites reported being unable to modify their record system, in some instances relying on paper records, which created a burden upon workflow, as well as inhibited the ability to generate reports and conduct quality improvement activities. The largest deficits were observed in intervention delivery data. This suggests that the field could benefit from a standardized set of SBIRT metrics that can be more universally incorporated into primary care medical record systems. Such integration into standard medical recording would remove the often onerous task of manually tracking SBIRT data points. To this end, the National Council developed specific guidance within the change package regarding what metrics are most useful for practices to monitor.

In addition to the factors around implementation, the data showed a reduction in S2BI risk level over time for those patients who indicated substance use on their initial screen. This finding provides further support for the effectiveness of adolescent SBIRT.

There were limitations to this evaluation related to missing data. This missingness came both in the form of complete non-response, as was observed with the OSA and the implementation survey, and in the form of missing values within submitted data, the latter being a larger factor with the EHR data. In either case, this missingness posed limitations in the analysis and interpretation of the data.

Another limitation of the study was the lack of fidelity monitoring of the interventions being delivered. Providers were given guidance and training on delivering the brief interventions for each risk level. However, there was no formal fidelity checking.

Through the course of the learning collaborative, the National Council received feedback on the change package from the participating sites. This feedback was used to refine the content of the change package. For instance, the refined version contains nuanced guidance for delivering shorter brief interventions. For referral to treatment, elements were added to emphasize the long-term management of substance use in primary care. Additional guidance was also added about managing the EHR, including developing data point guidance and providing sample data dashboards.

Future research needs to continue investigating best practices in adolescent SBIRT and what organizational characteristics contribute most strongly to successful implementation. SBIRT sustainability is another issue that needs continued focus and efforts. The next extension of this work by the National Council will focus on broader dissemination of the refined change package. Through continued support from the Conrad N. Hilton Foundation, further evaluation will look at how SBIRT practices are adopted in primary care associations, as well as individual clinical settings.

Author contributions

AS: Lead author of initial and revised draft, lead data analyst, lead in interpretation of results and structuring of manuscript.

LD: Lead content expert, assisted in writing and revising of manuscript, oversaw evaluation design.

PP: Content expert, assisted in writing and revising of manuscript.

MS: Assisted with data analysis, assisted in writing and revising of manuscript.

SL: Content expert, consulted on structure and focus of manuscript, assisted in writing of manuscript.

Additional information

Funding

This work was supported by the Conrad N. Hilton Foundation. The funding organization had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.

References

  • Volkow ND. Principles of adolescent substance use disorder treatment: a research-based guide. PNAS. 2004;101:8174–8179.
  • Johnston LD, Miech RA, O'Malley PM, Bachman JG, Schulenberg JE, Patrick ME. Monitoring the Future National Survey Results on Drug Use, 1975-2018: Overview, Key Findings on Adolescent Drug Use. Ann Arbor, MI: Institute for Social Research, University of Michigan; 2019.
  • Meier MH, Hill ML, Small PJ, Luthar SS. Associations of adolescent cannabis use with academic performance and mental health: a longitudinal study of upper middle class youth. Drug Alcohol Depend. 2015;156:207–212.
  • Ritchwood TD, Ford H, DeCoster J, Sutton M, Lochman JE. Risky sexual behavior and substance use among adolescents: a meta-analysis. Child Youth Serv Rev. 2015;52:74–88.
  • Volkow ND, Koob GF, McLellan AT. Neurobiologic advances from the brain disease model of addiction. N Engl J Med. 2016;374(4):363–371.
  • Substance Abuse and Mental Health Services Administration. SBIRT: Screening, Brief Intervention, and Referral to Treatment. Substance Abuse and Mental Health Services Administration, 2019. [Online]. Available: https://www.integration.samhsa.gov/clinical-practice/sbirt. Accessed December 19, 2019.
  • Alinsky RH, Percy K, Adger H, Jr, Fertsch D, Trent M. Substance use screening, brief intervention, and referral to treatment in pediatric practice: a quality improvement project in the Maryland adolescent and young adult health collaborative improvement and innovation network. Clin Pediatr. 2020;59(4–5):429–435.
  • Levy S, Knight J. Screening, Brief Intervention, and Referral to Treatment for adolescents. J Addict Med. 2008;2(4):215–221.
  • Mitchell SG, Schwartz RP, Kirk AS, et al. SBIRT implementation for adolescents in urban federally qualified health centers. J Subst Abuse Treat. 2016;60:81–90.
  • Substance Abuse and Mental Health Services Administration Center for Integrated Health Solutions. SBIRT: Screening, Brief Intervention, And Referral to Treatment: Opportunities for Implementation and Points for Consideration. Washington, DC: National Council for Community Behavioral Healthcare; 2015.
  • Substance Abuse and Mental Health Services Administration. Why SBIRT? SBIRT Colorado. Improving Health. Saving Lives. Denver, CO: SAMHSA.
  • US Preventive Services Task Force. Final recommendation statement, unhealthy alcohol use in adolescents and adults: screening and behavioral counseling interventions. https://www.uspreventiveservicestaskforce.org/Page/Document/RecommendationStatementFinal/unhealthy-alcohol-use-in-adolescents-and-adults-screening-and-behavioral-counseling-interventions.
  • Levy SJ, Williams JF. Substance use Screening, Brief Intervention, and Referral to Treatment. Pediatrics. 2016;138(1):e20161211.
  • Mitchell SG, Gryczynski J, O'Grady KE, Schwartz RP. SBIRT for adolescent drug and alcohol use: current status and future directions. J Subst Abuse Treat. 2013;44(5):463–472.
  • Stanhope V, Manuel JI, Jessell L, Halliday TM. Implementing SBIRT for adolescents within community mental health organizations: a mixed methods study. J Subst Abuse Treat. 2018;90:38–46.
  • Mitchell SG, Gryczynski J, Schwartz RP, et al. Adolescent SBIRT implementation: generalist vs. specialist models of service delivery in primary care. J Subst Abuse Treat. 2020;111:67–72. Epub 2020 Jan 20.
  • Sterling S, Kline-Simon AH, Satre DD, et al. Implementation of Screening, Brief Intervention, and Referral to Treatment for adolescents in pediatric primary care: a cluster randomized trial. JAMA Pediatr. 2015;169(11):e153145.
  • Goplerud E, McPherson TL. Implementation Barriers to and Facilitators of Screening, Brief Intervention, Referral, and Treatment (SBIRT) in Federally Qualified Health Centers (FQHCs). Chicago, IL: NORC at the University of Chicago, 2015.
  • Hargraves D, White C, Frederick R, et al. Implementing SBIRT (Screening, Brief Intervention and Referral to Treatment) in primary care: lessons learned from a multi-practice evaluation portfolio. Public Health Rev. 2017;38(1):31.
  • Levy S, Weiss R, Sherritt L, et al. An electronic screen for triaging adolescent substance use by risk levels. JAMA Pediatr. 2014;168(9):822–828.
  • Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.
  • Agerwala SM, McCance-Katz EF. Integrating Screening, Brief Intervention, and Referral to Treatment (SBIRT) into clinical practice settings: a brief review. J Psychoactive Drugs. 2012;44(4):307–317.
  • Manuel JK, Satre DD, Tsoh J, et al. Adapting Screening, Brief Intervention, and Referral to Treatment for alcohol and drugs to culturally diverse clinical populations. J Addict Med. 2015;9(5):343–351.
  • Bernstein E, Bernstein J, Feldman J, et al. An evidence based alcohol Screening, Brief Intervention and Referral to Treatment (SBIRT) curriculum for emergency department (ED) providers improves skills and utilization. Substance Abuse. 2007;28(4):79–92.