Implementation science is a field that seeks to bridge the gap between research and practice (Forman et al., 2013). Within the system of a school, many areas have been researched, such as academic problems and externalizing and internalizing behaviors (Forman et al., 2013; Owens et al., 2008; Moceri et al., 2012). This research has yielded evidence-based interventions: interventions that have been scientifically tested and shown to have an impact (Forman et al., 2013). While a specific intervention might have been proven to be the best course of action for addressing an issue, it might not be reaching the students who need it because it is not being implemented in the schools. Implementation science describes the process of putting a program in place, as opposed to simply studying the intervention that is being enacted (Forman et al., 2013).
If a Positive Behavioral Interventions and Supports (PBIS) system is being enacted at a school, implementation science examines the training given to staff and the institutional support provided, not whether the students are receiving the best intervention. Another key difference is that intervention outcomes and implementation outcomes are different constructs (Forman et al., 2013). An intervention outcome in a PBIS system might be a reduction in bullying, while an implementation outcome would refer to the fidelity of the intervention provided.
The fidelity of an intervention refers to how closely it is implemented in the way the researchers verified through scientific research. Another important distinction is the difference between sustained implementation and sustainability. Sustained implementation occurs when a program is used with fidelity over time, while sustainability is the “potential of sustained implementation” based on the intervention characteristics (Turri et al., 2016, p. 6). Moceri, Elias, Fishman, Pandina, and Reyes-Portillo (2012) contend that implementation and sustainability are, for all practical purposes, the same concept.
Implementation can also be defined by its essential components or the stages that take place. Forman, Shapiro, Codding, Gonzalez, Reddy, Rosenfield, Sanetti, and Stoiber (2013) contended that the necessary components are: a new program that is introduced; communication between the people who want to implement it and the people who will be part of the program; a social system, which in the context of school psychology would be a school; and a person who is actively attempting to make the change, possibly a school psychologist. The stages involved in implementation closely mirror the components.
One way to conceptualize the stages is: information on the program being shared, the decision to try the program, the first attempt to enact the program, and then maintenance of the program to ensure fidelity (Forman et al., 2013). Another way to conceptualize the stages is based on the amount of time the intervention has been enacted: initial implementation, full operation, and sustainability (Turri et al., 2016). There does not seem to be a standard in the field for the stages involved in implementation science; each researcher seems to have their own conceptualization.
Historically, this field has been interdisciplinary and was not primarily focused on psychological issues (Forman et al., 2013). The key terms that have developed in this area refer to the process and have specialized meanings. Diffusion describes the stage where information about the program is shared, dissemination refers to the communication between the people who want to implement the program and the people who will implement it, and implementation is the institutional support provided while enacting the program (Forman et al., 2013).
Implementation science was originally based in systems theory, social learning theory, and behaviorism, but now has its own theory and framework (Forman et al., 2013; Cappella, Reinke, & Hoagwood, 2011). The APA Division 16 Working Group proposed that a successful model for implementation would include: an organization with an expert in the intervention, clear instructions for the intervention, proper training to enact the intervention, institutional support, and differentiation of guidance based on the stage of intervention (Forman et al., 2013).
The aim of implementation science is to identify the issues in implementing interventions and to find ways to overcome those problems. Equally important is understanding the context of the situation and choosing an appropriate intervention (Forman et al., 2013; Cappella, Reinke, & Hoagwood, 2011). This is an especially prevalent issue for school psychologists because of the need to choose appropriate interventions and to encourage schools to use evidence-based interventions. Evidence-based treatment is the basis for the services that school psychologists provide.
These are interventions and treatments that have been scientifically tested and shown to be effective. The fidelity of the intervention, along with the development of the student, depends on proper implementation. Unfortunately, the rates at which evidence-based interventions are successfully implemented in schools are low (Forman et al., 2013). The low rates are a result of the many barriers to successful implementation, which span every stage of the process and are also embedded in the context of schools.
Some of the overarching barriers include: systems-level factors, such as funding, laws, and politics; personal characteristics of the implementer; the fit of the program; training problems; support at the various levels of the school (institutional support); staff turnover; enmeshing the new program with existing ones; time; goals of the intervention; and the impression of improvement (Forman et al., 2013; Turri et al., 2016). One discrepancy Moceri et al. (2012) found was that staff turnover did not affect the fidelity of implementation.
Staff turnover is suspected to be a barrier based on the reasoning that if a key staff member leaves and a new, untrained staff member takes their place, the fidelity of or motivation to continue an intervention would suffer (Turri et al., 2016). Moceri et al. (2012) hypothesized that their results were the product of a similar process, but that, at the same time, unsupportive staff were being replaced with people who would approve of the implementation, creating an insignificant correlation between staff turnover and implementation. Barriers to successful implementation change based on the role being filled.
School psychologists face slightly different barriers than teachers or staff members. The most important factors are belief in the effectiveness of the intervention, the availability of resources, and the acceptability to the school (Forman et al., 2013). The people who normally implement these interventions are the teachers, and they have a specific set of barriers of their own. Barriers for teachers include insufficient training, skills, and funding, the efficacy of the intervention, and not having the knowledge to choose proper interventions (Forman et al., 2013; Turri et al., 2016). One of the biggest concerns with research on implementation is that a large portion of the research on barriers needs to be interpreted carefully because it relies heavily on focus groups and interviews (Forman et al., 2013; Turri et al., 2016; Moceri et al., 2012).
This is a common complaint with qualitative data, and while it is important to keep its limitations in mind, there are not always other feasible methods for measuring certain constructs. Turri et al. (2016) aimed to eliminate some of the skepticism of qualitative data by using a survey to examine barriers previously identified in the literature. Other surveys are also being developed for more quantitative analysis of implementation and sustainability, such as the Schools Implementing Towards Sustainability (SITS) scale (Moceri et al., 2012). While the general consensus is that there needs to be more quantitative data in the field of implementation science, there are no established scales or measures to use at the moment. One of the most pressing issues in the field is the importance of fidelity.
Higher fidelity of implementation is reflected in better effect sizes, yet some studies do not monitor the fidelity of their interventions. In schools, when there are higher perceptions of barriers, there is lower fidelity in the beginning years of implementation (Turri et al., 2016). When only parts of an intervention are implemented, or it is not rigorously enforced, the outcomes cannot be predicted (Forman et al., 2013). What is known is that implementations adapted to the specific needs of a school are more likely to continue being used (Forman et al., 2013).
Maintenance is important to consider when implementing a new program, but if the program fails or is ineffective, the culprit could be either a lack of fidelity or a problem in the program itself (Forman et al., 2013). Possible solutions for the lack of fidelity in schools include proper training and support for the staff (Forman et al., 2013). Training staff to be competent in an intervention should include concrete examples and the opportunity to practice new strategies (Forman et al., 2013). Proper maintenance of the program requires feedback to the staff so that corrections can be made and the fidelity can be kept high (Cappella, Reinke, & Hoagwood, 2011).
Moceri et al. (2012) found that one of the most important constructs for continued success was support from experts outside of the school, and Owens et al. (2008) also found this to be an important characteristic. Another possible solution to a lack of fidelity could be addressing possible barriers before the new program is implemented (Turri et al., 2016). Also, having a program that spans school and home life could improve both implementation and intervention outcomes (Forman et al., 2013; Owens et al., 2008).
One of the issues facing school psychologists today is working with diverse populations and being culturally competent. Working in a school means working with all socioeconomic levels, ethnicities, orientations, and family situations. The samples for many evidence-based treatments are not representative of these diverse populations (Forman et al., 2013; Owens et al., 2008). Considering diversity when implementing a new program is also important because research has shown that when parents or the larger community support a program, there are better outcomes for implementation (Forman et al., 2013; Owens et al., 2008).
If families from diverse backgrounds approve of a program, it is more likely to work than if they feel it is not targeting their needs. Owens, Murphy, Richerson, Girio, and Himawan (2008) examined the translation of behavioral interventions in rural, underserved populations in the Appalachian region of the U.S. The context of a rural school is distinct from that of an urban school; oftentimes mental health services are “not available, not accessible, or not acceptable” in rural communities (Owens et al., 2008, p. 435).
The locations used in this study had high poverty, low education, and a shortage of mental health professionals. The intervention was carefully chosen to fit the needs of the community and students, and other best practices for implementation were followed. The results indicated that evidence-based interventions could maintain their positive outcomes when translated to underserved communities (Owens et al., 2008). It is interesting to note that the researchers in this study carefully chose the best intervention and went to great lengths to ensure high implementation fidelity.
Their results suggest that successful implementation techniques identified in the literature might be the solution to successfully translating interventions to diverse populations. School psychologists, as scientist-practitioners, are required to generate and evaluate new research in order to suggest the interventions that best fit each specific client or school. Fundamentally, this is implementation science. There are a variety of barriers that have been identified in the literature, but also suggestions to improve implementation.