Abstract
This article presents the psychometric properties of the evidence-based practice (EBP) instrument (School Version), a new interdisciplinary measure of EBP use that can be understood and applied across the three professions that provide the most mental health services in schools. The instrument was developed based on theory, a review of the literature, an expert review (N = 12), a pilot study (N = 20), and a national study (N = 303). While the measure may have applicability for other groups of mental health providers in other settings, this study focused on the perspectives of mental health providers in schools, specifically school psychologists, school counselors, and social workers. Initial psychometric examination resulted in a 13-item, one-factor model and indicated preliminary evidence for strong validity and internal reliability. No significant difference in total score among groups of mental health professionals was found, suggesting similar comprehension and application of EBP regardless of professional discipline. This instrument is the only one of its kind and provides a helpful first step towards common language and common goals when conceptualizing what it means for mental health providers to use best practice. Implications for school professionals and future research are offered.
Providing mental health services in K-12 schools has become a vast and critical undertaking as research has continued to demonstrate an alarming increase in the need for outpatient and emergency treatment for children and adolescents (Plemmons et al., 2018). The current 12-month prevalence rate of mental illness for adolescents is estimated to be 40.3% (Bagalman & Cornell, 2018). According to a 2019 report by the Substance Abuse and Mental Health Services Administration (Substance Abuse and Mental Health Services Administration, 2020), among the 3.8 million adolescents who reported a major depressive episode in the past year, nearly 60% did not receive any treatment. Of the adolescents who do receive mental health treatment, almost two-thirds receive treatment through school only (National Association of School Psychologists, 2021). Addressing mental health needs in schools makes students more likely to receive and continue services without the complications of traditional outpatient services, such as long waiting lists, lack of after-school appointment times, and long drives from home to the provider’s office (Swick & Powers, 2018; Weist, 2005). Having providers on site to deliver mental health services throughout the school day, once a novel solution to student care, has grown over the last two decades to be accepted “virtually everywhere” in the public school system (Shernoff et al., 2017). This expansion of mental health services has been defined as “Expanded School Mental Health (ESMH)” and now has national organizations, multi-state coalitions, annual conferences, and academic journals devoted to its application.
Implementation of school mental health services through ESMH requires participation and cooperation from many parties (e.g., parents, school administration, universities, third-party reimbursement). However, the overwhelming responsibility of introducing such programs and conducting mental health service delivery falls on interdisciplinary teams that include school psychologists, school counselors, and social workers (McCance-Katz, 2019; Teich et al., 2008). These service providers have complex jobs with many, sometimes overlapping, areas of knowledge and skills. School psychologists, school counselors, and social workers may be employees of school districts or may be contracted from a district cooperative, mental health agency, or private practice to provide services. They often work directly with school administrators, community agencies, students, and families to help promote student success (American School Counselor Association, 2018; National Association of School Psychologists, 2017a; National Association of Social Workers, 2017). These mental health providers are often available to schools in a variety of combinations depending on factors such as funding, administration preference, or availability (Locke et al., 2016; Markle et al., 2014; Teich et al., 2008). Some schools may have just one mental health provider, who could be a school psychologist, school counselor, or social worker. Other schools may have one of each provider, as well as additional contracted service providers who offer services not otherwise covered by school employees. Because of these myriad employment variations, much of the literature fails to stratify data by profession, and rather lumps all providers together when researching mental health services in schools (Teich et al., 2008; Weist et al., 2019).
Although ESMH services may include skills and abilities from a variety of practitioners with differences in education and licensure backgrounds, each provider has a framework and guidelines to structure their practice, provided by their graduate training and their leadership organizations (ASCA, NASP, and NASW), all of which encourage and promote the use of evidence-based practice (EBP; Mullen et al., 2008). EBP may be represented in the literature by other names, such as evidence-based interventions (EBIs), evidence-based practice in psychology (EBPP), and empirically supported treatments (ESTs), but for consistency throughout this study, the term EBP will be used. While EBIs and ESTs are specific psychological treatments that have been shown to be efficacious in controlled clinical trials, the complete scope of EBP includes a broader range of clinical activities, such as psychological assessment, case formulation, and building therapeutic relationships (APA, 2005). EBP is part of a national undertaking stemming from the medical, education, and prevention science fields and suggests similar treatment expectations when receiving mental health care, such as when receiving “best practices” treatment during an appointment with a physician (Institute of Medicine, 2001, p. 147). According to the American Psychological Association (APA, 2005), the purpose of EBP is to “promote effective psychological practice and enhance public health by applying empirically supported principles of psychological assessment, case formulation, therapeutic relationship, and intervention” (p. 5).
Meeting all the elements of EBP when working with students in school settings is a challenging and often daunting task. Best practice guidelines come from the US Federal Government and the US Department of Education, however, relevant literature reports significant gaps in services and quality of interventions (Ennett et al., 2003; Fabiano & Pyle, 2019; Hicks, et al., 2014; Lyon & Bruns, 2019; President's New Freedom Commission on Mental Health, 2003; Spiel et al., 2014). Even with guidelines from governing organizations, such as ASCA, NASP, and NASW, most providers are not using empirically supported treatments (McIntyre et al., 2007; Schaeffer et al., 2005) or are using outdated, eclectic, and reactive approaches with poor scientific support for efficacy (Evans & Weist, 2004; Hoover, 2018). Because of the diversity of school-based mental health providers, each person may look to their own licensing board or regulatory agency instead of one authoritative body for direction of what is considered “best practice.” In addition, each mental health profession has its own unique, and sometimes conflicting, history with adaptation to using and mandating EBP.
Several attempts have been made to identify therapist characteristics associated with adoption of EBP and encourage the application of empirical knowledge to real-world practice (Addis & Krasnow, 2000; Baumann et al., 2006; Essock et al., 2003; Rubin & Parrish, 2007). Multiple studies have detailed the many barriers providers face when trying to implement EBP in school settings (Domitrovich et al., 2008; Eiraldi et al., 2015; Jensen & Foster, 2010; Owens et al., 2002; Patalay et al., 2016; Schaeffer et al., 2005). In addition, researchers have developed assessment tools to learn more about mental health provider attitudes towards the adoption of specific EBIs and collaboration between providers and team members (Aarons, 2004; Mellin et al., 2014). Aarons (2004) created and validated the Evidence-Based Practice Attitude Scale (EBPAS) using a sample of 322 public sector clinical service workers from 51 programs providing mental health services to children and adolescents and their families. The EBPAS identified four dimensions of attitudes towards adoption of EBP that can assist with dissemination and implementation into real-world settings: (1) intuitive Appeal of EBP, (2) likelihood of adopting EBP given Requirements to do so, (3) Openness to new practices, and (4) perceived Divergence of usual practice with research-based/academically developed interventions (emphasis original).
Mellin et al. (2014) created a three-scale instrument, the Expanded School Mental Health Collaboration Instrument (School Version) (ESMHCI (SV)), based on findings from focus group interviews and a review of the literature. This instrument was intended to assess ESMH collaboration from the perspective of school-employed professionals. However, even with the development of these supporting tools, a core element is missing: there is no measure of EBP use across disciplines. Measuring EBP use against a consistent understanding of what the term means should be the first step in determining whether a provider is using EBP. Having a common language among mental health providers would be a natural beginning towards improving use of EBP and understanding more about barriers and facilitators to its use. The purpose of this research is to report the development and initial psychometrics of a new instrument that can be used to measure EBP use based on a shared understanding across school psychologists, school counselors, and social workers.
Challenges to Measuring EBP
Historically, measuring EBP has been a challenge. School psychology, school counseling, and social work have formalized EBP conceptualization and training requirements, which impact accreditation standards, clinical competencies, and ethical codes (Bellamy et al., 2006; Drisko, 2014; Drisko & Grady, 2015; Mullen et al., 2019; Reddy et al., 2017). Because these three groups have different experiences and histories with EBP, there has been vacillation on what constitutes EBP implementation, including current ongoing criticism regarding dissemination of evidence from clinical research trials into real-world practice settings (Bellamy et al., 2006). In an updated perspective of applying EBP in social work, Drisko and Grady (2015) write “The definition of EBP is actually very clear; it is just not effectively taught, nor well understood. It is also rarely practiced in full” (p. 275). Reasons for this include general lack of awareness of available best practices, lack of fit to patient population, or suspicion of a rapid push towards change. In addition, mental health providers have identified controversy in establishing a formal EBP definition as it has been adapted and implemented by state, federal, and health care funding entities to mandate and restrict service delivery (Jensen and Foster, 2010).
When conceptualizing how to measure EBP, much of the research has focused on EBIs, including graduate school training, provider willingness to use them, and applicability to real-world settings (Beidas & Kendall, 2010; Karekla et al., 2004; Schaeffer et al., 2005; Weisz et al., 2005). Yet some of the broader clinical activities of EBP, such as the therapeutic relationship, have been shown to be reliable predictors of positive clinical outcomes, regardless of the psychotherapy approach or specific assessment measures used (Ardito & Rabellino, 2011). Too much focus on EBIs can also take away from the whole picture of EBP, with many practitioners conflating the two and believing that using an EBI, such as motivational interviewing (MI) or cognitive-behavioral therapy (CBT), is the only way to apply EBP (Drisko & Grady, 2015).
Despite these difficulties, many providers continue to press on with their own definition of EBP while others await further instruction. Parrish (2018) calls social workers’ attention to the importance of a common definition, stating “a broad sampling of the social work literature continues to reflect confusion with the term” (p. 407). Calling the process and definitions of EBP a “circular debate,” Parrish encourages use of a common definition of EBP in order to “engage in critical and reflective thinking, ethical practice rooted in client empowerment, and practice decisions that have the most promise for helping the clients they serve” (p. 408).
The Current Study
This study was designed to develop an instrument to understand and measure EBP use based on a shared understanding of what behaviors constitute best practice that is applicable to school-based mental health providers regardless of their discipline. Creating a measure of EBP use allows for mental health providers, employers, and professional groups to examine their own proficiency with the construct and establish a common understanding of the term. As part of this instrument development, common elements of EBP across school psychologists, school social workers, and school counselors were identified. While much has been published on the topic of barriers and facilitators to EBP in schools (Domitrovich et al., 2008; Eiraldi et al., 2015), the literature is sparse on the efforts needed to first measure appropriate understanding and agreement of EBP use across providers. It is difficult to understand what is hindering EBP when it remains unclear if mental health providers have the same understanding of what EBP means.
Framework of This Study
The framework for this study stems directly from the Evidence-Based Practice in Psychology (EBPP) definition from APA (2005), which states EBPP is the “integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences” (p. 6). APA offers examples of each aspect of this definition as part of its policy statement (see APA, 2005, updated 2021) that were used to inspire survey questions for this instrument. While there is not one consistent definition of EBP used across all mental health professions, this definition was chosen as a starting point because APA has the largest membership base of mental health researchers, educators, providers, consultants, and students in the USA. Definitions published by APA are frequently referenced by leaders in other disciplines and by supporting health care literature, causing a “trickle down” effect as APA’s decisions are implemented into supporting professions, such as counseling and social work (Bacon, 2008; Drisko & Grady, 2015; Mullen et al., 2008). This definition also fits within the social ecological framework, suggesting a relationship between the actions of people (providers) and their environment (schools). The social ecological model (SEM) is a “theory-based framework for understanding the multifaceted and interactive effects of personal and environmental factors that determine behaviors, and can be used for identifying behavioral and organizational leverage points and intermediaries for health promotion within organizations” (Bronfenbrenner, 1979; Centers for Disease Control and Prevention, 2014, p. 1). Using this model is helpful when conceptualizing how mental health providers in schools have an interdependent relationship with research, administrators, teachers, students, students’ families, the school neighborhood, and more (APA, 2008).
This model is consistent with multiple policy statements released by APA (APA, 2005; APA 2017) and helped to ground this new instrument in theory as there do not exist other similar measures for comparison.
Methods
Phase 1: Literature Review and Development of Items
A questionnaire was created for this study using the recommended steps for instrument development described by DeVellis (2012). Initial items developed for the questionnaire were based on three principal sources: (a) a review of the literature and related instruments; (b) examples of evidence-based practice in psychology (EBPP) as provided by APA’s policy statement (2005, updated 2021); and (c) the social ecological model (SEM). Questions were designed to ask participants about the professional practice elements used in their current school setting. Initially, participants were asked to respond “yes” or “no” to a series of items to indicate their implementation of various forms of EBP. The next set of questions asked to what extent each item contributes to or interferes with the participants’ use of professional practice. A 4-point Likert rating scale was used, as it requires the participant to express an opinion or attitude, thus clearly indicating a definitive response (DeVellis, 2012). At its early stages, the EBP instrument was conceptualized to measure EBP either as a single construct or as three constructs corresponding to the APA definition: (1) best available research, (2) clinical expertise, and (3) patient characteristics, culture, and preferences.
Phase 2: Expert Review
An expert review was conducted to investigate the quality and clarity of the instrument (Ikart, 2019; Kelley et al., 2003). The first version of the instrument was shared with a convenience sample of school psychologists, school counselors, and social workers (N = 12) working in Central Indiana. The group consisted of professors, regional professional organization representatives, and long-term (10+ years) practitioners. The consensus across the expert review was that more emphasis should be placed on finding survey terms that are consistent across professions, allowing accurate contrasts between groups. The original instrument was then deconstructed to identify a total of 13 items that school psychologists, school counselors, and social workers would be expected to do as routine parts of their jobs. The 12 experts agreed, without any differences among school psychologists, school counselors, and social workers, that these 13 items had sufficient applicability to each profession and could be used as a measure of whether someone was using EBP. These 13 items were consistent with the examples of evidence-based professional practice (EBPP) provided by APA (2008) that were used in the first version of the instrument. The 12 experts were also asked to offer feedback on the overall directions, format, and length of the instrument, which was used to update the document prior to the pilot study.
Phase 3: Pilot Study
A pilot study was conducted to establish content validity and investigate the quality and clarity of the revised instrument (Hill, 1998; Isaac & Michael, 1995). A cover letter and questionnaire were distributed using an emailed link to a convenience sample of 56 school psychologists, school counselors, and social workers in Central Indiana (Connelly, 2008). Participants were recruited from alumni databases, social networking groups for mental health providers in schools, and recommendations from participants who had already completed the questionnaire (i.e., snowball sampling). A total of 20 of the 56 potential respondents completed the pilot study (school counselors = 4, school psychologists = 8, and social workers = 8). Respondents were asked to first read the cover letter and informed consent and then complete the questionnaire as if they were participating in the actual study. Participants were given a chance throughout the questionnaire to flag questions that seemed confusing or unclear and were given space on each page to leave comments, feedback, and suggestions. At the end of the questionnaire, participants were asked to evaluate the instrument for content and clarity, consider the relevance of each item, and offer useful information that may have been overlooked. Because of the small sample size of the pilot study, data from participants were examined qualitatively to look for any outliers or unique responses. Feedback from the pilot testing was used to fine-tune the final version of the instrument for this study, which included 13 items with five response options per item (Table 5). Small changes recommended by respondents included word choices, formatting, and minor grammatical edits.
Phase 4: Study
Upon completion of the pilot study, a written request was made to the ASCA, NASP, and NASW Research Committees for access to their membership databases for participants for the current study. Following approval from the ASCA, NASP, and NASW Research Committees, participants were randomly selected by a marketing agency from a current computerized list of members who were listed as currently practicing in a school setting. A total of 1000 member names each from NASP and NASW were requested, as this was the maximum permitted purchase number, and 2000 member names from ASCA were requested (4000 total names requested). Additional names were purchased from ASCA to learn if increasing mailings offered a significant increase in response rate and power, in order to determine an appropriate request number for future studies. Additional names would have been purchased from NASP and NASW if that had been an available option. Each organization provided these participant names via mailing or email addresses; NASP and ASCA only allowed the purchase of physical mailing lists, while NASW allowed the purchase of email addresses. Each potential participant was mailed or emailed a cover letter explaining the purpose of the study and providing a link and quick response (QR) code to access the electronic questionnaire. Participants were able to follow the link or the QR code to the informed consent page and an initial screening question that asked if the participant was an active provider of mental health services in a school setting. Participants who consented could then complete the study. After completing the questionnaire, participants had the option of providing their contact information in a separate document to enter an incentive drawing for one of ten $25 Amazon gift cards. Two weeks after the initial mailing and email, a follow-up postcard and email reminder were sent with the same link and QR code to access the questionnaire. The survey was available for participants to access for one month.
Four mailings were returned to sender as “undeliverable.” Qualtrics online software was used to collect data and was set to automatically make collected data both confidential and anonymous.
Sample
Using a recommendation for sample size adequacy for exploratory factor analysis, a sample size between 300 (good) and 500 (very good) was desired (Comrey & Lee, 1992). In total, 418 (10.5%) participants responded to the survey. The response rate was consistent with that suggested by NASP; ASCA and NASW did not provide estimated response rates. Of the 418 participants who responded to the survey, one participant did not provide consent and closed the survey, 29 provided no responses to any portion, and 85 provided responses but did not complete the demographic information needed for classification or further assessment. Overall, 303 of the 418 participants (86.5% female, 12.9% male, 0.3% preferred not to identify gender) completed the consent form, the instrument, and the demographic information. Please see Table 1 for demographic information; demographics were consistent with the most recent representative sample of NASP members (NASP, 2018), while ASCA and NASW did not have readily available membership data. It should also be noted that cover letters were mailed towards the beginning of the global COVID-19 pandemic. Numerous schools closed throughout the duration of data collection, which likely impacted the response rate for this study.
Results
An exploratory factor analysis (EFA) was completed to determine the structure of the instrument. Responses of “does not apply to my position” were included in analyses; these items were filled with each participant’s average score on the remainder of the instrument so as not to shift the total score in an overall positive or negative direction. The EFA revealed three underlying factors with eigenvalues greater than one, which was confirmed visually by scree plot. A three-factor solution was identified using image factoring extraction and Quartimax rotation with Kaiser normalization, with the three factors accounting for 57% of the variance. The Quartimax rotation was chosen to minimize the number of factors needed to explain each variable, as this study conceptualized EBP as either a single construct or three constructs, as parsed apart by the APA definition. The Kaiser-Meyer-Olkin measure of sampling adequacy was 0.85, above the recommended threshold of 0.6 (Dziuban & Shirkey, 1974), and Bartlett’s test of sphericity was significant (p < 0.01). Because five items cross-loaded (Table 2), additional item analysis was conducted and loading values were further examined. Each of the five cross-loading items showed a difference of more than 0.2 between its loadings, so the higher loading was retained, which meant that the highest loadings for all items fell on one factor. Further item examination using Pearson correlations between the variables demonstrated that the three factors did not correspond with the three factors hypothesized from the APA definition, so the EFA was run a second time with all items forced to load onto one factor (Table 3). No items were deleted from the instrument, as all items loaded higher than the recommended 0.35 cutoff (Tabachnick & Fidell, 2013). A reliability test was conducted using all 13 items as one factor, which revealed a Cronbach’s alpha of 0.85. This suggests that EBP can be conceptualized and measured as a single construct.
No subscales were identified throughout the analysis. See Table 4 for distribution of question responses.
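The mean-imputation and internal-reliability steps described above can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' analysis code; the function names and the NaN coding of "does not apply" responses are assumptions for the example.

```python
import numpy as np

def impute_row_mean(responses):
    """Replace 'does not apply' responses (coded here as NaN) with each
    participant's mean over their answered items, so imputed values do
    not shift the total score up or down."""
    filled = responses.astype(float).copy()
    row_means = np.nanmean(filled, axis=1)      # mean of answered items per person
    rows, cols = np.where(np.isnan(filled))
    filled[rows, cols] = row_means[rows]
    return filled

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_participants x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic 13-item, 4-point Likert data for 303 respondents (illustration only)
rng = np.random.default_rng(0)
data = rng.integers(1, 5, size=(303, 13)).astype(float)
data[rng.random(data.shape) < 0.05] = np.nan    # sprinkle "does not apply"
alpha = cronbach_alpha(impute_row_mean(data))
```

With random, uncorrelated synthetic data the resulting alpha will be near zero; the 0.85 reported above reflects the correlations in the actual responses.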
After the EFA was completed, a one-way ANOVA was conducted to examine any differences in EBP use. Participants were classified into three groups: school psychologists (n = 139), school counselors (n = 98), and social workers (n = 57). Group membership served as the independent variable, and the total score on the EBP instrument served as the dependent variable. There were two outliers in the social work group as assessed by boxplot. The Shapiro–Wilk test demonstrated normally distributed data (p > 0.05), and Levene’s test demonstrated homogeneity of variances (p = 0.67). As a result, the ANOVA was run without any transformations of the data. EBP score was not significantly different among school psychologists (M = 3.45, SD = 0.33), school counselors (M = 3.45, SD = 0.35), and social workers (M = 3.54, SD = 0.34), F(2, 291) = 1.56, p = 0.21, η2 = 0.011. No further post hoc analyses were conducted. Of note, social workers received their recruitment letter by email, while school counselors and school psychologists received their letters by mail. These results suggest that there is no difference based on profession or recruitment method.
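A one-way ANOVA of this form, with eta-squared as the effect size, can be sketched as below. The group scores are simulated to match the reported group sizes, means, and standard deviations for illustration only, and the `eta_squared` helper is a hypothetical name, not part of the study's materials.

```python
import numpy as np
from scipy.stats import f_oneway

def eta_squared(*groups):
    """Effect size: between-group sum of squares / total sum of squares."""
    scores = np.concatenate(groups)
    grand_mean = scores.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_total = ((scores - grand_mean) ** 2).sum()
    return ss_between / ss_total

# Simulated total EBP scores for the three provider groups (illustration only)
rng = np.random.default_rng(1)
psych = rng.normal(3.45, 0.33, 139)   # school psychologists
couns = rng.normal(3.45, 0.35, 98)    # school counselors
social = rng.normal(3.54, 0.34, 57)   # social workers

f_stat, p_value = f_oneway(psych, couns, social)
effect = eta_squared(psych, couns, social)
```

A non-significant p-value combined with a small eta-squared, as reported in the study, indicates no meaningful difference in EBP scores across the three groups.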
Discussion
The purpose of this study was to create a measure of EBP use based on a common understanding of the concept that could be understood and used across the three professions that provide the most mental health services in schools. The results of this study provide preliminary evidence for the validity and internal reliability of the single-construct, 13-item EBP instrument for measuring EBP use. Integration of three mental health professions into its development ensures maximal utility of the instrument across ESMH settings. This instrument can provide a numerical understanding of EBP adherence and set a baseline against which improvement can be measured. While still in early stages, the rigor of an expert review, pilot study, and national study shows excellent potential for addressing the research gap by connecting APA’s formal definition of EBPP to daily real-life work environments across the nation. In addition, the instrument is brief and easy to administer via paper or electronic delivery.
Limitations
This is the first instrument of its kind (to the knowledge of this author) measuring EBP, so there are no similar measures or external criteria against which to compare the validity of the results. As there is no existing “gold standard” for measuring EBP, no related construct was included when collecting data to examine the criterion-related validity of the instrument. The evidence-based practice movement has looked different over time across mental health professions and lacks any type of measurement tool that can be used across disciplines; hence, the need for the type of exploratory work reflected in this present study. However, the coherent factor structure and strong internal reliability demonstrated here provide initial support for the scale. It is important to note that other researchers may choose to handle the “does not apply to my position” responses differently rather than using the participant-mean imputation applied in this study. The decision was made for the purpose of this study that coding the “does not apply” responses as 0 or removing them altogether might disproportionately sway the overall results. Different approaches to coding the data would be an interesting direction for future studies further validating this instrument.
Furthermore, as with all self-report studies, social desirability is a concern, as participants may want to paint a more positive picture of their daily practice; however, there is no specific reason why results from this study would be expected to differ from other self-report studies. Another consideration is the response rate. Out of 4000 potential respondents, 418 completed some part of the survey, but only 303 completed each section of the survey. It is unclear how much of this was related to the global COVID-19 pandemic or other factors, such as survey fatigue or recruitment techniques. Postcard and email reminders were sent to mitigate this limitation. It is also unclear whether the 85 participants who did not provide any demographic information and were removed from analyses would have meaningfully contributed to the results. In addition, organizational limitations on the number of member names available for purchase meant that more school counselors were recruited for the study than school psychologists and social workers in order to have a sufficient sample size for the desired analyses; future validation of this instrument will want to include a more balanced sample of the three groups or avoid separation by profession, as no differences were found based on profession.
An additional limitation is possible theory drift. While the social ecological model (SEM) was used to help conceptualize EBP at the origin of the instrument creation, additional expert review is needed to examine whether the final product continues to have a theoretical base. Significant care was taken to ground the instrument development in both professional consensus, through the expert review, and known EBP elements, as defined through examples from APA, to minimize any drift from its theoretical basis. However, real-world responses from mental health providers who are actively working in schools may not exactly align with the SEM or even with the concrete examples of EBPP provided by APA. More research in this area will be needed.
Implications for Practice and Research
First, it will be imperative to further validate the instrument with a variety of settings and populations and conduct additional psychometrics on the EBP instrument. Because EBP is used in mental health care outside of schools, there is potential for this scale to be used in other settings, such as community mental health and primary care. Additional psychometrics and expert review may also be needed to establish a “cut-off” score to determine whether someone is or is not using EBP in their practice. In this present study, participants were simply given a total EBP use score based on their responses. Furthermore, because EBP was conceptualized as a single construct that would apply across disciplines, the 13 items that make up the instrument are intentionally broad. Enhancing the instrument to add more specific behaviors and actions may be a future research direction after additional interdisciplinary review.
In the future, this scale could help clarify the distinction between EBP and EBIs and allow for common language among mental health providers. This scale could also be used in connection with student outcomes when tracking the results of EBP use. EBP has been difficult to pin down and measure throughout its adaptation into the mental health field (Bellamy et al., 2006; Drisko & Grady, 2015; Parrish, 2018). Defining and measuring EBP is one of the first steps toward quality service improvement and could be a significant contribution to the mental health field.
Additionally, more research is needed on how mental health providers work together. School psychologists, school counselors, and social workers have different requirements to enter their respective fields, including differences in education, training, supervision, and licensure (American School Counselor Association, 2018; National Association of School Psychologists, 2017b; National Association of Social Workers, 2017). However, this study suggests the fields may have more in common with respect to EBP than they have differences. Because each school setting may have different employment options for providing mental health care to students, it is important to examine what crossover information is relevant to all three fields in order to best provide EBP. Sharing resources across professions, such as supervision, research journals, and continuing education trainings, may be a useful way to maintain a consistent message and promote collaboration surrounding EBP (Mellin et al., 2014).
Results from this study showed no significant difference in EBP use based on profession. This is especially interesting, since school psychologists, school counselors, and social workers have updated their required trainings on EBP at different times and sometimes with different results (Barrio Minton et al., 2014; Hicks et al., 2014; Mullen et al., 2019; Reddy et al., 2017; Wike et al., 2019). All three groups are equipped to utilize EBP, even though their respective fields have different histories with the adaptation and dissemination of EBP content. This is similar to a finding from Aarons (2004), who created the Evidence-Based Practice Attitude Scale (EBPAS) and examined the attitudes of 322 mental health providers about aspects of EBP; results from that study showed little difference between professional groups on dimensions that influence the use of evidence-based interventions (EBIs). The current study implies that EBP can be conceptualized and measured with one instrument that applies to all three professional groups. Consistency across the literature will be crucial to ensuring providers are talking about the same thing when trying to make changes and improve the quality of mental health services (Anderson & Bronstein, 2012; Mellin et al., 2014).
Finally, this instrument can be used as a reference point when studying facilitators of and barriers to EBP. By providing a numerical measure of EBP use, the instrument can help distinguish between providers' perceptions of facilitators and barriers and the reality of actual practice. It can also help clarify the severity of facilitators' and barriers' effects (i.e., whether barriers can be ignored, overcome with ease, or are completely debilitating).
Conclusion
This study was conducted with the aim of improving mental health service delivery to one of our nation's most vulnerable populations: our children. Global events that transpired throughout this study's completion have only placed further emphasis on the growing need for quality mental health care in schools. School psychologists, school counselors, and social workers are currently on the front lines of helping children understand how issues like systemic racism and a health pandemic affect their daily lives (American School Counselor Association, 2020; National Association of School Psychologists, 2020; National Association of Social Workers, 2020). EBP is an important framework for ensuring a standard of care as these mental health services are provided. This research offers a starting point for conversation about commonalities among different groups of providers and about what it looks like to measure a set of practices consistent with a shared definition. Through a rigorous process of literature review, expert review, pilot study, and national study, this research has created a brief and convenient way to identify and measure EBP use across three different professions. The absence of significant differences among groups marks an exciting new chapter in measuring best practice standards.
Data Availability
The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
Code Availability
Not applicable.
References
Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6(2), 61–74.
Addis, M. E., & Krasnow, A. D. (2000). A national survey of practicing psychologists’ attitudes toward psychotherapy treatment manuals. Journal of Consulting and Clinical Psychology, 68(2), 331–339. https://doi.org/10.1037//0022-006X.68.2.331
American Psychological Association. (2005). Report of the 2005 presidential task force on evidence-based practice. Author.
American Psychological Association Task Force on Evidence-Based Practice for Children and Adolescents. (2008). Disseminating evidence-based practice for children and adolescents: A systems approach to enhancing care. American Psychological Association.
American Psychological Association. (2017). Multicultural guidelines: An ecological approach to context, identity, and intersectionality. Retrieved from: http://www.apa.org/about/policy/multicultural-guidelines.pdf
American Psychological Association (2018). About APA. Retrieved November 2, 2018 from https://www.apa.org/about/
American School Counselor Association (2018). Role of the school counselor. Retrieved November 2, 2018 from https://www.schoolcounselor.org/administrators/role-of-the-school-counselor.aspx
American School Counselor Association. (2020). School counseling during COVID-19: Online lessons and resources. Author. Retrieved June 6, 2020 from https://www.schoolcounselor.org/school-counselors/professional-development/learn-more/coronavirus-resources
Anderson, E. M., & Bronstein, L. R. (2012). Examining interdisciplinary collaboration within an Expanded School Mental Health framework: A community-university initiative. Advances in School Mental Health Promotion, 5(1), 23–37.
Ardito, R. B., & Rabellino, D. (2011). Therapeutic alliance and outcome of psychotherapy: Historical excursus, measurements, and prospects for research. Frontiers in Psychology, 2, 270.
Bacon, V.L. (2008). A new vision for school counseling: Evidence-based practice. [Review of the book Evidence-based school counseling: Making a difference with data-driven practices. C. Dimmitt, J. C. Carey & T. Hatch]. PsycCRITIQUES, 53(9). https://doi.org/10.1037/a0010165
Bagalman, E., & Cornell, A.S. (2018). Prevalence of mental illness in the United States: Data sources and estimates. Congressional Research Service. Retrieved from www.crs.gov.
Barrio Minton, C. A., Wachter Morris, C. A., & Yaites, L. D. (2014). Pedagogy in counselor education: A 10-year content analysis of journals. Counselor Education and Supervision, 53, 162–177. https://doi.org/10.1002/j.1556-6978.2014.00055.x
Baskin, T. W., Slaten, C. D., Sorenson, C., Glover-Russell, J., & Merson, D. N. (2010). Does youth psychotherapy improve academically related outcomes? A meta-analysis. Journal of Counseling Psychology, 57(3), 290–296.
Baumann, B. L., Kolko, D. J., Collins, K., & Herschell, A. D. (2006). Understanding practitioners’ characteristics and perspectives prior to the dissemination of an evidence-based intervention. Child Abuse and Neglect, 30, 771–787.
Beidas, R. S., & Kendall, P. C. (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17(1), 1–30.
Bellamy, J., Bledsoe, S. E., & Traube, D. (2006). The current state of evidence based practice in social work: A review of the literature and qualitative analysis of expert interviews. Journal of Evidence-Based Social Work, 3, 23–38. https://doi.org/10.1300/J394v03n01_02
Breslau, J., Miller, E., Breslau, N., Bohnert, K., Lucia, V., & Schweitzer, J. (2009). The impact of early behavior disturbances on academic achievement in high school. Pediatrics, 123, 1472–1476.
Bronfenbrenner, U. (1979). The ecology of human development. Harvard University Press.
Centers for Disease Control and Prevention. (2014). The social-ecological model: A framework for prevention. Retrieved November 14, 2018 from https://www.cdc.gov/violenceprevention/overview/social-ecologicalmodel.html
Comrey, A. L., & Lee, H. B. (1992). A first course in factor analysis (2nd ed.). Lawrence Erlbaum.
Connelly, L. M. (2008). Pilot studies. MEDSURG Nursing, 17(6), 411–412.
DeVellis, R. F. (2012). Scale development: Theory and applications (3rd ed.). SAGE Publications Inc.
Domitrovich, C. E., Bradshaw, C. P., Poduska, J. M., Hoagwood, K., Buckley, J. A., Olin, S., & Ialongo, N. S. (2008). Maximizing the implementation quality of evidence-based preventive interventions in schools: A conceptual framework. Advances in School Mental Health Promotion, 1(3), 6–28.
Drisko, J. (2014). Split or synthesis: The odd relationship between clinical practice and research in social work and in social work education. Clinical Social Work Journal, 42, 182–192. https://doi.org/10.1007/s10615-014-0493-2
Drisko, J., & Grady, M. (2015). Evidence-based practice in social work: A contemporary perspective. Clinical Social Work Journal, 43(3), 274–282.
Dziuban, C. D., & Shirkey, E. C. (1974). When is a correlation matrix appropriate for factor analysis? Some Decision Rules. Psychological Bulletin, 81(6), 358–361. https://doi.org/10.1037/h0036316
Eiraldi, R., Wolk, C. B., Locke, J., & Beidas, R. (2015). Clearing hurdles: The challenges of implementation of mental health evidence-based practices in under-resourced schools. Advances in School Mental Health Promotion, 8(3), 124–145. https://doi.org/10.1080/1754730X.2015.1037848
Ennett, S. T., Ringwalt, C. L., Thorne, J., Rohrbach, L. A., Vincus, A., Simons-Rudolph, A., & Jones, S. (2003). A comparison of current practice in school-based substance use prevention programs with meta-analysis findings. Prevention Science, 4(1), 1–14.
Essock, S. M., Goldman, H. H., & Van Tosh, L. (2003). Evidence-based practices: Setting the context and responding to concerns. Psychiatric Clinics of North America, 26, 919–938.
Evans, S. W., & Weist, M. D. (2004). Implementing empirically supported treatments in the schools: What are we asking? Clinical Child and Family Psychology Review, 7, 263–267. https://doi.org/10.1007/s10567-004-6090-0.
Fabiano, G. A., & Pyle, K. (2019). Best practices in school mental health for attention-deficit/hyperactivity disorder: A framework for intervention. School Mental Health: A Multidisciplinary Research and Practice Journal, 11(1), 72–91. https://doi.org/10.1007/s12310-018-9267-2
Hicks, T. B., Shahidullah, J. D., Carlson, J. S., & Palejwala, M. H. (2014). Nationally certified school psychologists’ use and reported barriers to using evidence-based interventions in schools: The influence of graduate program training and education. School Psychology Quarterly, 29(4), 469–487.
Hill, R. (1998). What sample size is “enough” in Internet survey research? Interpersonal Computing and Technology: An Electronic Journal for the 21st Century, 6, 3–4.
Hoover, S.A. (2018). When we know better, we don’t always do better: Facilitating the research to practice and policy gap in school mental health. School Mental Health 10. https://doi.org/10.1007/s12310-018-9271-6
Ikart, E. M. (2019). Survey questionnaire survey pretesting method: An evaluation of survey questionnaire via expert review technique. Asian Journal of Social Science Studies, 4(2), 1–17.
Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Author.
Isaac, S., & Michael, W. B. (1995). Handbook in research and evaluation. Educational and Industrial Testing Services.
Jensen, P. S., & Foster, M. (2010). Closing the research to practice gap in children’s mental health: Structures, solutions, and strategies. Administration and Policy in Mental Health, 37, 111–119.
Karekla, M., Lundgren, J., & Forsyth, J. (2004). A survey of graduate training in empirically supported and manualized treatments: A preliminary report. Cognitive and Behavioral Practice, 11, 230–242.
Kelley, K., Clark, B., Brown, V., & Sitzia, J. (2003). Good practice in the conduct and reporting of survey research. International Journal for Quality in Health Care, 15(3), 261–266. https://doi.org/10.1093/intqhc/mzg031
Locke, J., Beidas, R. S., & Marcus, S. (2016). A mixed methods study of individual and organizational factors that affect implementation of interventions for children with autism in public schools. Implementation Science, 11, 135.
Lyon, A. R., & Bruns, E. J. (2019). User-centered redesign of evidence-based psychosocial interventions to enhance implementation—Hospitable soil or better seeds? Journal of the American Medical Association Psychiatry, 76, 3–4. https://doi.org/10.1001/jamapsychiatry.2018.3060.
Markle, R. S., Splett, J. W., Maras, M. A., & Weston, K. J. (2014). Effective school teams: Benefits, barriers, and best practices. In M. D. Weist, N. A. Lever, C. P. Bradshaw, & J. S. Owens (Eds.), Handbook of school mental health: Research, training, practice, and policy. Springer.
McCance-Katz, E. (2019). Guidance to states and school systems on addressing mental health and substance use issues in schools: Joint informational bulletin. Substance Abuse and Mental Health Services Administration (SAMHSA). Centers for Medicare & Medicaid Services (CMS).
McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school-based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40, 659–672. https://doi.org/10.1901/jaba.2007.659-672
Mellin, E. A., Taylor, L., & Weist, M. D. (2014). The Expanded School Mental Health Collaboration Instrument [School Version]: Development and initial psychometrics. School Mental Health, 6, 151–162. https://doi.org/10.1007/s12310-013-9112-6
Mullen, E. J., Bledsoe, S. E., & Bellamy, J. L. (2008). Implementing evidence-based social work practice. Research on Social Work Practice, 18(4), 325–338. https://doi.org/10.1177/1049731506297827
Mullen, P. R., Stevens, H., & Chae, N. (2019). School counselors’ attitudes toward evidence-based practices. Professional School Counseling, 22. https://doi.org/10.1177/2156759X18823690
National Association of School Psychologists. (2017a). Who are school psychologists. Author. Retrieved January 18, 2018 from https://www.nasponline.org/about-school-psychology/who-are-school-psychologists
National Association of School Psychologists. (2017b). NASP Practice Model 10 Domains. Author. Retrieved January 18, 2018 from https://www.nasponline.org/standards-and-certification/nasp-practice-model/nasp-practice-model-implementation-guide/section-i-nasp-practice-model-overview/nasp-practice-model-10-domains
National Association of School Psychologists. (2020). COVID-19: Resource center. Guidance and Supports. Author. Retrieved June 1, 2020 from https://www.nasponline.org/resources-and-publications/resources-and-podcasts/covid-19-resource-center
National Association of School Psychologists. (2021). Comprehensive school-based mental and behavioral health services and school psychologists [handout]. Author.
National Association of Social Workers. (2017). Why choose the social work profession? Author. Retrieved January 18, 2018 from https://www.socialworkers.org/Careers/Career-Center/Explore-Social-Work/Why-Choose-the-Social-Work-Profession
National Association of Social Workers. (2020). Social work is grappling with two pandemics: COVID-19 and racism. Author. Retrieved June 6, 2020 from http://www.socialworkblog.org/advocacy/2020/06/two-pandemics/
Owens, P. L., Hoagwood, K., Horwitz, S. M., Leaf, P. J., Poduska, J. M., Kellam, S. G., & Ialongo, N. S. (2002). Barriers to children’s mental health services. Journal of the American Academy of Child and Adolescent Psychiatry, 41(6), 731–738.
Parrish, D. E. (2018). Evidence-based practice: A common definition matters. Journal of Social Work Education, 54(3), 407–411. https://doi.org/10.1080/10437797.2018.1498691
Patalay, P., Giese, L., Stankovic, M., Curtin, C., Moltrecht, B., & Gondek, D. (2016). Mental health provision in schools: Priority, facilitators and barriers in 10 European countries. Child and Adolescent Mental Health, 21(3), 139–147.
Payton, J., Weissberg, R. P., Durlak, J. A., Dymnicki, A. B., Taylor, R. D., Schellinger, K. B., & Pachan, M. (2008). The positive impact of social and emotional learning for kindergarten to eighth-grade students: Findings from three scientific reviews. Collaborative for Academic, Social, and Emotional Learning.
Plemmons, G., Hall, M., Doupnik, S., Gay, J., Brown, C., Browning, W., & Williams, D. (2018). Hospitalization for suicide ideation or attempt: 2008–2015. Pediatrics, 141(6), e20172426. https://doi.org/10.1542/peds.2017-2426
President's New Freedom Commission on Mental Health. (2003, April). Retrieved January 31, 2018 from http://www.mentalhealthcommission.gov/mission.html
Reddy, L. A., Forman, S. G., Stoiber, K. C., & Gonzalez, J. E. (2017). A national investigation of school psychology trainers’ attitudes and beliefs about evidence-based practices. Psychology in the Schools, 54(3), 261–278.
Rubin, A., & Parrish, D. (2007). Views of evidence-based practice among faculty in master of social work programs: A national survey. Research on Social Work Practice, 17, 110–122.
Schaeffer, C. M., Bruns, E., Weist, M., Stephan, S. H., Goldstein, J., & Simpson, Y. (2005). Overcoming challenges to using evidence-based interventions in schools. Journal of Youth and Adolescence, 34(1), 15–22. https://doi.org/10.1007/s10964-005-1332-0
Shernoff, E. S., Bearman, S. K., & Kratochwill, T. R. (2017). Training the next generation of school psychologists to deliver evidence-based mental health practices: Current challenges and future directions. School Psychology Review, 46(2), 219–232.
Spiel, C. F., Evans, S. W., & Langberg, J. M. (2014). Evaluating the content of individualized education programs and 504 plans of young adolescents with attention deficit/hyperactivity disorder. School Psychology Quarterly, 29(4), 452–468. https://doi.org/10.1037/spq0000101
Substance Abuse and Mental Health Services Administration. (2020). Key substance use and mental health indicators in the United States: Results from the 2019 National Survey on Drug Use and Health (HHS Publication No. PEP20–07–01–001, NSDUH Series H-55). Rockville, MD: Center for Behavioral Health Statistics and Quality, Substance Abuse and Mental Health Services Administration. Retrieved from https://www.samhsa.gov/data/
Swick, D., & Powers, J. D. (2018). Increasing access to care by delivering mental health services in schools: The school-based support program. School Community Journal, 28(1), 129–144.
Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Pearson.
Teich, J. L., Robinson, G., & Weist, M. D. (2008). What kinds of mental health services do schools in the United States provide? Mental Health Promotion, 1(1), 13–22. https://doi.org/10.1080/1754730X.2008.9715741
Weist, M. D. (2005). Fulfilling the promise of school-based mental health: Moving toward a Public Mental Health Promotion approach. Journal of Abnormal Child Psychology, 33(6), 735–741.
Weist, M. D., Hoover, S., Lever, N., Youngstrom, E. A., George, M., McDaniel, H. L., & Hoagwood, K. (2019). Testing a package of evidence-based practices in school mental health. School Mental Health, 11, 692–706.
Weisz, J. R., Sandler, I. N., Durlak, J. A., & Anton, B. S. (2005). Promoting and protecting youth mental health through evidence-based prevention and treatment. American Psychologist, 60(6), 628–648. https://doi.org/10.1037/0003-066X.60.6.628
Wike, T. L., Grady, M., Massey, M., Bledsoe, S. E., Bellamy, J. L., Stim, H., & Putzu, C. (2019). Newly educated MSW social workers’ use of evidence-based practice and evidence-supported interventions: Results from an online survey. Journal of Social Work Education, 55(3), 504–518.
Acknowledgements
The author would like to thank Dr. Wilfridah Mucherah for her guidance with overseeing this project, Dr. Julia Craner for her encouragement to publish this research, and Dr. Ethan Blocher-Smith for his support with formatting and proofreading.
Author information
Contributions
All work was completed by the sole author of this manuscript including study conception and design, data collection and analysis, and writing.
Ethics declarations
Ethics Approval
This study was approved by the Ball State University IRB Committee.
Consent to Participate
Informed consent was requested and granted by all participants.
Consent for Publication
I, Lindsay G. Flegge, hereby declare that I participated in the study and development of the manuscript titled The Evidence-Based Practice (EBP) Instrument (School Version): Development and Initial Psychometrics of a New Interdisciplinary Scale. I have read the final version and give my consent for the article to be published in Contemporary School Psychology.
Conflict of Interest
The author declares no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix 1
Cite this article
Flegge, L.G. The Evidence-Based Practice (EBP) Instrument (School Version): Development and Initial Psychometrics of a New Interdisciplinary Scale. Contemp School Psychol 27, 581–592 (2023). https://doi.org/10.1007/s40688-022-00424-6