Adoption, implementation and sustainability of school-based physical activity and sedentary behaviour interventions in real-world settings: a systematic review

Abstract

Background

Globally, many children fail to meet the World Health Organization’s physical activity and sedentary behaviour guidelines. Schools are an ideal setting to intervene, yet despite many interventions in this setting, success when delivered under real-world conditions or at scale is limited. This systematic review aims to i) identify which implementation models are used in school-based physical activity effectiveness, dissemination, and/or implementation trials, and ii) identify factors associated with the adoption, implementation and sustainability of school-based physical activity interventions in real-world settings.

Methods

The review followed PRISMA guidelines and included a systematic search of seven databases from January 1st, 2000 to July 31st, 2018: MEDLINE, EMBASE, CINAHL, SPORTDiscus, PsycINFO, CENTRAL, and ERIC. A forward citation search of included studies using Google Scholar was performed on the 21st of January 2019, including articles published until the end of 2018. Study inclusion criteria: (i) a primary outcome of increasing physical activity and/or decreasing sedentary behaviour among school-aged children and/or adolescents; (ii) intervention delivery within school settings; (iii) use of implementation models to plan or interpret study results; and (iv) interventions delivered under real-world conditions. Exclusion criteria: (i) efficacy trials; (ii) studies applying or testing school-based physical activity policies; and (iii) studies targeting special schools or pre-school and/or kindergarten aged children.

Results

Twenty-seven papers comprising 17 unique interventions were included. Fourteen implementation models (e.g. RE-AIM, Rogers’ Diffusion of Innovations, the Precede-Proceed model) were applied across the 27 papers. Implementation models were mostly used to interpret results (n = 9), for planning the evaluation and interpreting results (n = 8), for planning the evaluation (n = 6), for intervention design (n = 4), or for a combination of designing the intervention and interpreting results (n = 3). We identified 269 factors related to barriers (n = 93) and facilitators (n = 176) for the adoption (n = 7 studies), implementation (n = 14 studies) and sustainability (n = 7 studies) of interventions.

Conclusions

Implementation model use was predominantly centred on the interpretation of results and analyses, with few examples of use across all study phases as a planning tool and to understand results. This limited application of implementation models may partly explain the limited success of interventions when delivered under real-world conditions or at scale.

Trial registration

PROSPERO (CRD42018099836).

Contributions to the literature

  • Real-world implementation and scale-up of school-based physical activity and sedentary behaviour studies remains uncommon, but critical to achieving population health goals.

  • This paper identifies where and how to improve efforts to enhance the adoption, implementation and sustainability of school-based physical activity interventions under real-world conditions, a necessary ingredient for advancing implementation science in this field and setting.

  • Improving the use of theory/model-driven approaches and common language across the implementation research spectrum in school-based interventions, from planning through to measurement and interpretation, is highlighted. This push to include theory-driven approaches and to further outline best practices for terminology and reporting is common across disciplines but important to discuss specifically in relation to physical activity interventions.

Background

Physical inactivity is a worldwide pandemic and a leading cause of non-communicable disease [1]. Increased physical activity and decreased sedentary behaviour are associated with positive health impacts and healthy development in children [2, 3], and physical activity provides benefits for school-related outcomes such as classroom behaviour, cognitive function, and academic achievement [4,5,6]. Nonetheless, the 2018 Global Matrix 3.0 Physical Activity Report Card, which included 49 countries, showed that only a minority of school-aged children meet internationally recognised guidelines for physical activity (27–33%), which recommend accumulating at least 60 min of moderate-to-vigorous-intensity physical activity daily, or for sedentary behaviour (34–39%), which recommend no more than 2 h of screen time per day [7].

Schools have been proposed as an ideal setting to intervene [8], with numerous calls from the WHO to implement school-wide physical activity promotion programmes [9, 10]. This has led to a number of studies and systematic reviews of efficacy trials which provide evidence of reduced sedentary behaviour and increased time spent in overall and in-school physical activity for children exposed to school-based interventions [11, 12]. In a 2013 Cochrane Review, Dobbins et al. showed that increases in physical activity ranged from five to 45 min per day and that television watching, as a marker of sedentary behaviour, was reduced by five to 60 min per day [11]. Despite these promising findings, this review and others to date have mostly focused on investigating interventions delivered in controlled settings, or have included studies of school policies rather than interventions, and have not reported on the implementation frameworks, models and theories (‘implementation models’) used to support this evaluation process [13,14,15,16]. There has also been far less research describing how interventions are adopted, implemented and sustained under real-world conditions (e.g. implementation studies, or studies which tested the effectiveness, scale-up, dissemination or translation of interventions) [17,18,19,20]. By ‘real-world’ we are referring to interventions delivered by school employees during their standard practice in the education system. Real-world interventions require a better understanding of the complex systems in which contextual factors, including organisations, intervention agents (i.e. implementers), the target population and setting-level social influences (e.g. organisational culture), are typically less controlled than they are in efficacy research designs [21]. In this instance, adoption occurs when an organisation (e.g. school) makes a formal decision to commit to using an intervention or policy [22], whereas implementation refers to the processes involved in integrating interventions or policy within organisations and settings [23]. Sustainability relates to the continued use of an intervention with ongoing positive intervention outcomes [24].

Understanding how and what affects the real-world adoption, implementation and sustainability of interventions is critical, as interventions need to be designed for delivery in real-world conditions to have a population-wide impact [18]. We know from a 2015 review by Naylor et al. [14] that the level of implementation is linked to the efficacy and outcomes of school-based physical activity interventions. Their review also describes factors that facilitated and hindered implementation based on Durlak and DuPre’s implementation model [25]. An acknowledged limitation of the review is that its findings may not be generalisable to real-world systems, as they stem predominantly from efficacy trials, and more work is needed to assess interventions delivered under real-world conditions at scale [14]. Systematic review evidence from obesity prevention research suggests scaled-up interventions are less effective than their initial efficacy trials [26]. The difficulty of achieving intervention effects at scale may, in part, be due to adaptations which are necessary to translate complex interventions originally delivered under controlled circumstances into real-world settings [26]. This may also highlight the level of planning required for effective real-world implementation [27, 28] and the inherent limitations of attempting to translate interventions from highly controlled conditions into ‘real-world’ settings [29]. Thus, to better understand how to improve the real-world impact of physical activity and sedentary behaviour interventions, there is a need to review the factors associated with adoption, implementation, and sustainability of interventions delivered in real-world settings.

Schools face many challenges in translating evidence-based interventions into routine practice (e.g. funding, school climate, teacher self-efficacy, curriculum demands, and implementation support, among others) [30,31,32]. Therefore, the use of implementation theory is recommended to underpin the processes of planning, implementing and evaluating interventions, especially in the case of complex, multifaceted health promotion programs [33, 34]. To this end, numerous implementation theories, frameworks and models have been developed and collated [33, 34]. Unfortunately, despite the existence of multiple implementation models and appeals for more systematic reviews investigating the application of evidence-based programs in everyday practice [35], there remains a lack of research, particularly regarding issues of sustained practice [19, 20]. Whilst we know ‘why’ implementation models are selected (i.e. empirical support, description of implementation processes, or researcher familiarity) [36], it is unclear ‘how’ they are used in the practice of school-based physical activity prevention research. This review aims to offer important insights into future intervention development and delivery at a population level by: 1) identifying which implementation theories, frameworks, and models (hereafter referred to as “implementation models”) are used in real-world school-based physical activity and/or sedentary behaviour trials; and 2) identifying barriers and facilitators associated with the adoption, implementation and sustainability of interventions in real-world settings.

Methods

This review was prospectively registered with PROSPERO (CRD42018099836) and follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [37] (Additional file 5).

Eligibility criteria

Inclusion criteria were studies which: i) included school-aged children or adolescents; ii) involved interventions delivered in the school setting during school hours with a primary outcome of increasing physical activity and/or decreasing sedentary behaviour; iii) applied implementation models to plan or to interpret study results; and iv) were conducted in real-world settings (e.g. effectiveness, scale-up, dissemination, translation, and implementation studies). As this review focuses on studies conducted in real-world settings, inclusion of a control group was not a criterion for eligibility. Studies were excluded when they: i) tested efficacy (e.g. randomised controlled trials, feasibility and pilot studies); ii) were conducted with special schools or pre-school and/or kindergarten aged children; and iii) applied or tested school-based physical activity policy (i.e. no program was implemented).

Information sources and searches

We searched the online databases MEDLINE, EMBASE, CINAHL, SPORTDiscus, PsycINFO, CENTRAL, and ERIC for peer-reviewed English-language articles published between January 1st, 2000 and July 31st, 2018. A research librarian was consulted during the development and testing of search terms (Additional file 1). Reference lists of included studies were hand-searched for eligible interventions, and a forward citation search using Google Scholar was performed on the 21st of January 2019, including articles published until the end of 2018.

Study selection

One author (SC) screened article titles. All abstracts and full texts were screened by two authors (SC and AM), with discrepancies on study inclusion discussed and consensus agreement reached by five authors (SC, AM, HK, JS, and AT). Reference list and forward citation searching were undertaken by SC, and inclusion decisions were made by consensus agreement among four authors (SC, HK, JS, and AT).

Data collection process

Data were extracted by one author (SC), with other authors (AM, HK, JS, and AT) consulted for clarification where necessary. Data extracted included: date, study population, study design, intervention strategies and location, implementation model use, implementation strategies, implementation measures, factors related to adoption, implementation and sustainability, and results and comments. As the studies included in this review did not all include evidence on the effectiveness of the interventions, we were unable to report the impact of each of the factors described above on overall intervention success. Research into the relative importance of these factors is therefore highlighted as a priority for future work.

Data synthesis, extraction and quality assessment

Implementation models applied in the included studies (Aim 1) were first grouped within Nilsen’s five categories [33]: (i) process models (used to describe or guide implementation), (ii) determinant frameworks (helpful to understand what influences implementation outcomes), (iii) classic theories (stemming from fields outside implementation research and used to understand or explain aspects of implementation), (iv) implementation theories (which aim to describe and understand features of implementation), and (v) evaluation frameworks (to guide the evaluation of relevant features of successful implementation). Secondly, for each included study, implementation models were characterised according to their reported application: to (i) design the intervention, (ii) plan the evaluation, (iii) interpret the results, or any combination of the three. Factors related to adoption, implementation and sustainability, and barriers and facilitators related to implementation, were extracted and grouped (Aim 2). Factors were then categorised according to Durlak and DuPre’s [25] framework, which highlights 23 contextual factors related to five domains: the delivery system, the support system, the providers, aspects of the intervention, and the communities in which they are implemented. Following categorisation, factors were consolidated and intervention-specific terminology was generalised. All factor categories were discussed among SC, AT, JS and HK before consensus decisions were made on final groupings. Analysing factors within the scope of this framework enabled comparisons of factors between studies, including those found to be relevant in Naylor et al.’s [14] review.
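
To make the grouping step concrete, the sketch below is a minimal illustration only (not the authors’ code): it tags hypothetical extracted factors with a phase, a direction (barrier or facilitator) and one of the five Durlak and DuPre domains, then tallies them. The factor records and domain assignments shown are examples, not data from the included studies.

```python
from collections import Counter
from dataclasses import dataclass

# The five Durlak and DuPre domains used to organise the extracted factors
DOMAINS = (
    "Community level factors",
    "Provider characteristics",
    "Characteristics of the innovation",
    "Prevention delivery system",
    "Prevention support system",
)

@dataclass
class Factor:
    phase: str      # "adoption", "implementation" or "sustainability"
    direction: str  # "facilitator" or "barrier"
    category: str   # consolidated category group, e.g. "Compatibility"
    domain: str     # one of DOMAINS

# Hypothetical extracted factors; domain assignments are illustrative only
factors = [
    Factor("adoption", "facilitator", "Compatibility", "Characteristics of the innovation"),
    Factor("implementation", "barrier", "Specific staffing considerations", "Prevention delivery system"),
    Factor("sustainability", "facilitator", "Managerial support", "Prevention delivery system"),
]

# Tally factors by phase and direction to compare coverage across the dissemination continuum
tally = Counter((f.phase, f.direction) for f in factors)
for (phase, direction), n in sorted(tally.items()):
    print(f"{phase:<15} {direction:<12} {n}")
```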

The Mixed Methods Appraisal Tool (MMAT) was used independently by two authors (SC and AM) to assess study quality [38]. The MMAT was developed to enable the assessment of different study designs by offering a single tool consisting of different criteria for quantitative, qualitative and mixed methods studies [39]. The tool includes two screening questions, in addition to five questions per study design, with response options of ‘yes’, ‘no’ and ‘can’t tell’. For the purposes of this review, questions relating to qualitative, non-randomised, quantitative descriptive, and mixed-methods studies were included. Where multiple publications existed for any one intervention, publications were grouped and an overall assessment was made for the intervention. As overall scores assigned to interventions are discouraged, because they do not allow readers to see which aspects of the studies have or have not been covered, the MMAT instead recommends presenting the ratings for each criterion (see Additional file 6). Initial inter-rater reliability showed 86.1% agreement, with moderate chance-corrected agreement (Cohen’s κ = 0.56).
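
For readers unfamiliar with the agreement statistics reported above, the snippet below is a minimal sketch (not taken from the paper) of how percentage agreement and Cohen’s κ can be computed for two reviewers’ include/exclude decisions; the screening decisions shown are hypothetical.

```python
from collections import Counter

def agreement_and_kappa(rater_a, rater_b):
    """Return observed percentage agreement and Cohen's kappa for two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n                  # P_o
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)     # P_e
    kappa = (observed - expected) / (1 - expected)                                # (P_o - P_e) / (1 - P_e)
    return observed, kappa

# Hypothetical screening decisions by two reviewers for ten records
reviewer_1 = ["include", "exclude", "exclude", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]
reviewer_2 = ["include", "exclude", "include", "include", "exclude",
              "exclude", "exclude", "exclude", "exclude", "exclude"]

p_o, kappa = agreement_and_kappa(reviewer_1, reviewer_2)
print(f"agreement = {p_o:.1%}, kappa = {kappa:.2f}")  # agreement = 80.0%, kappa = 0.52
```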

Results

Study selection

The study selection and screening process is outlined in Fig. 1. The electronic database search identified 33,445 unique records. An additional 12 records were identified from reference searching and 708 records from forward searching, resulting in 34,175 records for screening. A total of 33,888 records were excluded at the title level and 175 at the abstract level; thus, 112 full texts were assessed for eligibility. Full texts were excluded (n = 85) due to publication type (e.g. editorials, commentary papers), an outcome other than physical activity/sedentary behaviour, inappropriate study design, absence of an implementation model, or inappropriate delivery setting/time (e.g. outside school hours). Thus, 27 papers comprising 17 unique interventions were included in this review [40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66].

Fig. 1 PRISMA flow diagram: search strategy and inclusion

Study characteristics

Of the 27 papers included in this review and outlined in Table 1, five employed a qualitative study design [45, 54, 57, 59, 61], nine a quantitative design [42, 44, 48, 50,51,52, 60, 62, 65], ten used mixed methods [40, 41, 43, 46, 47, 55, 59, 63, 64, 66], and three were summary articles which collated previous findings and discussed lessons learned across multiple publications for a specific intervention [49, 53, 56]. Interventions were delivered in six high-income countries, as categorised by the World Bank [67]: the USA [41,42,43,44, 48,49,50,51, 56, 58, 60, 65, 66], Canada [52, 53, 59], the Netherlands [46, 47, 61,62,63,64], the United Kingdom [45, 57], Australia [40], and Denmark [54, 55]. Further, interventions were conducted in a range of school settings including primary/elementary [40, 42, 44,45,46,47, 49, 51, 54,55,56,57], middle [43, 48, 66], primary/middle [52, 53, 59], high [58], pre-vocational [61,62,63,64], and all ages (primary, middle and high) [41, 50].

Table 1 Intervention implementation models and factors associated with adoption, implementation, and sustainability

Quality assessment scores are reported in Additional file 6. Briefly, the three qualitative studies scored a ‘yes’ for all seven items. The quantitative studies were of comparatively lower quality, with four of the six studies receiving a ‘no’ for the item ‘Is the risk of nonresponse bias low?’, and one ‘can’t tell’ and one ‘no’ for the item ‘Is the sample representative of the target population?’. Of the eight mixed-methods studies, two scored a ‘yes’ for all 17 related items. For the other six studies, items relating to qualitative aspects were least likely to receive a ‘yes’, with the items ‘Are the qualitative data collection methods adequate to address the research question?’ and ‘Is there coherence between qualitative data sources, collection, analysis and interpretation?’ both receiving four ‘can’t tell’ responses.

Implementation model application

Fourteen implementation models were applied 34 times in the 27 included papers (Fig. 2). Eight implementation models were utilised by at least two separate interventions, including: RE-AIM [21], Rogers’ Diffusion of Innovations theory [22], the Ecological framework for understanding effective implementation [25], the Consolidated Framework for Implementation Research (CFIR) [68], Determinants of innovation within health care organizations [69], the Multilevel implementation quality framework [30], the Precede-Proceed model [70], and A Conceptual Framework for Implementation [71]. Of the 14 implementation models applied in the included studies, all five of Nilsen’s [33] categories were represented: evaluation frameworks (n = 5), implementation theories (n = 3), determinant frameworks (n = 3), process models (n = 2) and classic theories (n = 1), underlining the variety of models used in the field.

Fig. 2 Frequency of implementation model use per intervention

The most common use of implementation models across studies was to interpret results (n = 9), followed by a combination of planning the evaluation and interpreting the results (n = 8). Implementation models were also used to plan the evaluation (n = 6), solely in the design of the intervention (n = 4), to design the intervention and interpret results (n = 3), to design the intervention and plan the evaluation (n = 1), and finally in a combination of all three aspects, to design the intervention, plan the evaluation and interpret the results (n = 1).

Barriers and facilitators in intervention adoption, implementation and sustainability phases

Of the included papers reviewed, seven described factors pertinent to adoption, 14 considered aspects related to implementation, and seven discussed influences on sustainability. A total of 269 factors were reported across the three phases, with 52 factors related to adoption (facilitators n = 36, barriers n = 16), 154 factors linked to implementation (facilitators n = 107, barriers n = 47), and 63 factors linked to sustainability (facilitators n = 33, barriers n = 30). A full list of these factors is organised under the five domains of the Durlak and DuPre model [25]: community level factors; provider characteristics; characteristics of the innovation; factors relevant to the delivery system; and factors related to the prevention support system (see Additional file 2, Additional file 3, and Additional file 4).
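
As a quick arithmetic check of how these counts combine (illustrative only, using the per-phase figures reported above), the snippet below sums facilitators and barriers per phase and overall.

```python
# Facilitator and barrier counts per phase, as reported above
counts = {
    "adoption":       {"facilitators": 36,  "barriers": 16},
    "implementation": {"facilitators": 107, "barriers": 47},
    "sustainability": {"facilitators": 33,  "barriers": 30},
}

per_phase = {phase: sum(c.values()) for phase, c in counts.items()}
print(per_phase)  # {'adoption': 52, 'implementation': 154, 'sustainability': 63}
print("total factors:", sum(per_phase.values()))                                # 269
print("total facilitators:", sum(c["facilitators"] for c in counts.values()))   # 176
print("total barriers:", sum(c["barriers"] for c in counts.values()))           # 93
```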

Table 2 highlights the domains covered for each individual phase of adoption, implementation, and sustainability to illustrate the impact (barriers/facilitators) and coverage of factors across the dissemination continuum. The following section contains a list outlining the category groups covered for each phase with examples in parentheses taken from included articles. In total, there were seven category groups reported as a facilitator for all three phases of adoption, implementation and sustainability: 1) Policy (e.g. Aligned with state education standard); 2) Perceived benefits of innovation (e.g. Classroom behaviour benefits); 3) Compatibility (e.g. Feasible and acceptable); 4) Adaptability (e.g. Flexible approach to commencing implementation); 5) Integration of new programming (e.g. Easy to integrate in organisations); 6) Coordination with other agencies (e.g. Willingness/aptitude to collaborate); and 7) Managerial support (e.g. Teachers encouraged/ supported by school to trial intervention).

Table 2 Durlak and DuPre domains covered by each dissemination phase

Correspondingly, five category groups were reported as barriers across each phase: 1) Perceived need for innovation (e.g. Low priority relative to other academic subjects); 2) Compatibility (e.g. Program too complex for education level); 3) Integration of new programming (e.g. Need for simplified methods, instruments, protocols, and tasks); 4) Specific staffing considerations (e.g. Teacher attrition); and 5) Leadership (e.g. Change in school leadership). Aspects related to the compatibility and integration of new programming were the only two category groups listed as both facilitators and barriers across all three phases. Further, several category groups were listed in at least two of the phases, with the majority of these listed as facilitators stemming from factors relevant to the delivery system (schools’ organisational capacity) and the prevention support system. A full list of facilitators and barriers relating to adoption, implementation and sustainability is reported in Additional file 2, Additional file 3, and Additional file 4.

Adoption

Facilitating factors specifically related to adoption were identified across 16 category groups. Facilitators relevant to the domains ‘characteristics of the innovation’ (n = 13) and ‘the prevention delivery system’ (n = 15) were presented most frequently. Adoption barriers were reported across nine category groups, with factors related to ‘the prevention delivery system’ (n = 9) the most frequently represented.

Implementation

Implementation facilitators were reported across all five domains and comprised 21 category groups. Factors relating to ‘the prevention delivery system’ (n = 42) were represented most frequently. Implementation barriers were mentioned across all domains with the exception of community level factors, covering a total of 15 different categories; of these, factors relevant to the ‘prevention delivery system’ (n = 39) were most frequently reported.

Sustainability

Facilitators for the sustainability of school-based interventions were reported across all five domains and included factors from 16 category groups. Sustainability barriers again covered all five domain headings across 14 category groups. Factors under ‘the prevention delivery system’ domain were the most prevalent for both sustainability facilitators (n = 12) and barriers (n = 18).

Discussion

This review assessed the use of implementation models in 17 school-based interventions aiming to increase physical activity and/or reduce sedentary behaviour that were implemented under real-world conditions, and identified facilitators and barriers associated with the adoption, implementation and sustainability of these interventions. The review contributes to the existing evidence base by identifying and comparing factors relevant to implementation under largely uncontrolled conditions and mapping them against a well-recognised implementation framework [25] to identify patterns that will move implementation research on school-based physical activity interventions forward. However, we faced difficulties in comparing identified factors and themes because of the variability in terminology across implementation research, previously described as a ‘Tower of Babel’ [72]. Thus, it is important for future studies to clearly and systematically label intervention strategies and outcomes [73,74,75,76], and to follow recommended reporting mechanisms such as the purpose-designed Standards for Reporting Implementation Studies (StaRI) statement [77].

In reviewing facilitators and barriers for real-world physical activity and sedentary behaviour interventions in schools, we encountered a broader evidence base for factors which influence the implementation phase (such as implementation support strategies and implementation fidelity), in comparison to literature discussing influences pertinent to the adoption or sustainability of interventions. Further research on factors associated with adoption and sustainability of interventions is warranted given that previous studies show barriers and facilitators differ across phases [19, 25, 30, 31].

The application of implementation models in school-based intervention studies

In total, 14 different implementation models were applied across interventions, with eight applied on at least two occasions and three (RE-AIM [21], Rogers’ Diffusion of Innovations theory [22], and the Ecological framework for understanding effective implementation [25]) standing out as most often utilised. Implementation model use was predominantly centred on the interpretation of results and analyses, with few examples of studies which applied implementation models as a tool across all phases of the study (e.g. as a planning tool for intervention components, as a tool to evaluate the intervention effect, and as a tool to interpret study results/findings). This is underlined by the Nilsen [33] groupings, as implementation models under the category of ‘evaluation frameworks’ were most commonly cited across studies. The unsystematic application of implementation models at different phases, and in some cases in a retrospective manner, precludes their applicability as a guiding tool throughout the entire intervention process, and may contribute to limitations in the field’s understanding of key mechanisms and phases [34]. Our findings are in line with a previous systematic review of studies citing the Consolidated Framework for Implementation Research (CFIR) [78], which found that more than 80% of studies did not apply the model in a meaningful manner (i.e. CFIR was not used to guide the methodology of study design, analysis or interpretation of results) [79]. Their review also highlighted that more than half of the included studies used the implementation model for data analysis purposes and, further, that only 23% of studies applied the framework to guide both data collection and analysis. The authors report that using an implementation model was advantageous as a checklist in guiding data collection and helped ensure that important factors were not left unmeasured, only to be uncovered during data analysis [79]. The selective and sporadic application of implementation models in their review appears to mirror our findings and alludes to the seemingly ad hoc application of models also noted in the implementation literature [36, 73]. In recognition of this under-utilisation and ad hoc application of implementation models, and the understanding that researchers may need support in the selection and application of implementation models [80], a number of publications [34, 36, 81] and tools [82,83,84] have been developed which aim to guide this process. For researchers and practitioners seeking to plan clinical and community interventions implemented at scale, the PRACTical planning for Implementation and Scale-up (PRACTIS) guide represents another example of recent work providing practical direction [28].

Barriers and facilitators to adoption, implementation and sustainability

Despite these potential differences across phases, our review suggests that several barriers and facilitators, in particular factors relating to intervention compatibility and the integration of new programming, remained common across the three phases of adoption, implementation, and sustainability (See Table 2). We report on these category groups here as they represent action areas which may prove to be a list of ‘best buys’ for intervention planning and development.

Across all three phases of adoption, implementation and sustainability, factors relating to the school ‘Delivery system’ were most often cited as facilitators and barriers. This implies the importance of schools and change agents (including researchers) addressing these barriers through organisational policies and practices which support the delivery of new interventions. We encourage schools and change agents wanting to adopt, implement and sustain new interventions to consider how they can best prepare their staff when introducing new interventions. In particular, schools and intervention developers should work together to limit the impact of anticipated barriers and to harness the benefits of identified facilitators.

One way to increase the likelihood of implementation of interventions in everyday practice is to utilise tools such as the PRACTIS guide, which encourages early planning for anticipated barriers at the individual, organisational and systems levels [28]. These barriers can then be linked to implementation strategies which best address the specific contextual determinants of implementation [85]. The school-level organisational factors reported above, including managerial support, coordination with other agencies, and specific staffing considerations, are key determinants of successful implementation and have been described as such both within and outside the education sector [86,87,88,89]. Perceptions regarding the need for and benefits of the intervention also seem central, as do the compatibility and adaptability of programs, thus supporting Rogers’ seminal Diffusion of Innovations model [22] among others [25, 30, 78]. For example, designing interventions which involve changes to pedagogical style (e.g. active lessons) rather than changes in curriculum may be a useful strategy moving forward. Additionally, it seems pertinent to promote the need for and benefits of these interventions using school-related outcomes (e.g. improvements in classroom focus and academic performance) rather than the traditional approach of highlighting the impact of physical inactivity on health.

Despite several factors being relevant across the dissemination continuum, our review found various phase-specific factors and therefore supports the recommendation put forward in the Conceptual Model of School-Based Implementation that implementation strategies need to be tailored for each phase [30]. This suggests schools, researchers and change agents should consider that strategies utilised during the adoption phase are not necessarily those needed during the implementation phase and, further, that a separate set of conditions and factors may be relevant to ensuring sustainability [31].

Limitations

Major strengths of this review include the application of the Durlak and DuPre model [25], an established implementation model, to enable the comparison of facilitators and barriers with other reviews [14]. Secondly, our review demonstrates the diversity in application of implementation models in real-world trials across the three phases of the dissemination continuum. However, this review is not without limitations. Firstly, there may be other interventions implemented under real-world conditions that have collated factors relevant to adoption, implementation, or sustainability but are not included in this review because they did not meet the inclusion criterion of using an implementation model. This is therefore not an exhaustive list of all factors relevant to the adoption, implementation and sustainability of real-world interventions. Papers rarely reported separately on the implementation of physical activity and sedentary behaviour components, and it is certainly possible that barriers and facilitators to implementation could differ between them. We further note the absence of studies stemming from low- and middle-income countries, and suggest further research is needed to complement our findings and expand the literature base regarding issues faced in these countries. Results discussing use of models may not capture the full application of a model, as use was extracted from the authors’ descriptions, and there may be instances where one use was inferred to have led to its application in another form. Finally, the identified facilitators and barriers may not necessarily be ‘significant’ or result in meaningful changes in effectiveness, and some may share the same name but have been measured in different ways (e.g. qualitative interviews or focus groups vs quantitative surveys, or different definitions of variables).

Conclusions

Our review highlights the selective and sporadic application of implementation model components and alludes to a seemingly ad hoc application which focuses more on the interpretation of results than on a holistic application across the lifespan of an intervention (i.e. designing the intervention, planning the evaluation, and interpreting the results). Further, this study reviews the growing literature describing school-based physical activity interventions conducted under real-world conditions by mapping factors related to adoption, implementation and sustainability against a recognised implementation model. The key message for practice is that applying implementation models from intervention inception can help researchers and practitioners leverage known facilitators and mitigate the impact of barriers. Finally, further research is needed, particularly on the adoption and sustainability phases, to assist in the development of strategies which facilitate the process of implementing school-based physical activity interventions in real-world conditions at scale.

Availability of data and materials

N/A.

Abbreviations

CFIR:

Consolidated Framework for Implementation Research

PRACTIS:

PRACTical planning for Implementation and Scale-up

StaRI:

Standards for Reporting Implementation Studies

WHO:

World Health Organization

References

  1. Lee IM, Shiroma EJ, Lobelo F, Puska P, Blair SN, Katzmarzyk PT. Lancet physical activity series working group. Effect of physical inactivity on major non-communicable diseases worldwide: an analysis of burden of disease and life expectancy. Lancet. 2012;380(9838):219–29.

  2. Carson V, Hunter S, Kuzik N, Gray CE, Poitras VJ, Chaput JP, Saunders TJ, Katzmarzyk PT, Okely AD, Connor Gorber S, Kho ME. Systematic review of sedentary behaviour and health indicators in school-aged children and youth: an update. Appl Physiol Nutr Metab. 2016;41(6):S240–65.

  3. Poitras VJ, Gray CE, Borghese MM, Carson V, Chaput JP, Janssen I, Katzmarzyk PT, Pate RR, Connor Gorber S, Kho ME, Sampson M. Systematic review of the relationships between objectively measured physical activity and health indicators in school-aged children and youth. Appl Physiol Nutr Metab. 2016;41(6):S197–239.

  4. Donnelly JE, Hillman CH, Castelli D, Etnier JL, Lee S, Tomporowski P, Lambourne K, Szabo-Reed AN. Physical activity, fitness, cognitive function, and academic achievement in children: a systematic review. Med Sci Sports Exerc. 2016;48(6):1197.

  5. Fedewa AL, Ahn S. The effects of physical activity and physical fitness on children's achievement and cognitive outcomes: a meta-analysis. Res Q Exerc Sport. 2011;82(3):521–35.

  6. Watson A, Timperio A, Brown H, Best K, Hesketh KD. Effect of classroom-based physical activity interventions on academic and physical activity outcomes: a systematic review and meta-analysis. Int J Behav Nutr Phys Act. 2017;14(1):114.

  7. Aubert S, Barnes JD, Abdeta C, Abi Nader P, Adeniyi AF, Aguilar-Farias N, Andrade Tenesaca DS, Bhawra J, Brazo-Sayavera J, Cardon G, Chang CK. Global matrix 3.0 physical activity report card grades for children and youth: results and analysis from 49 countries. J Phys Act Health. 2018;15(Supplement 2):S251–73.

  8. Naylor PJ, McKay HA. Prevention in the first place: schools a setting for action on physical inactivity. Br J Sports Med. 2009;43(1):10–3.

  9. World Health Organization. School policy framework: implementation of the WHO global strategy on diet, physical activity and health. Geneva: World Health Organization; 2008.

  10. World Health Organization. Global action plan on physical activity 2018–2030: more active people for a healthier world. Geneva: World Health Organization; 2018.

  11. Dobbins M, Husson H, DeCorby K, LaRocca RL. School-based physical activity programs for promoting physical activity and fitness in children and adolescents aged 6 to 18. Cochrane Database Syst Rev. 2013;2:e12542.

  12. Kriemler S, Meyer U, Martin E, van Sluijs EM, Andersen LB, Martin BW. Effect of school-based interventions on physical activity and fitness in children and adolescents: a review of reviews and systematic update. Br J Sports Med. 2011;45(11):923–30.

  13. Nathan N, Elton B, Babic M, McCarthy N, Sutherland R, Presseau J, Seward K, Hodder R, Booth D, Yoong SL, Wolfenden L. Barriers and facilitators to the implementation of physical activity policies in schools: a systematic review. Prev Med. 2018;107:45–53.

  14. Naylor PJ, Nettlefold L, Race D, Hoy C, Ashe MC, Higgins JW, McKay HA. Implementation of school based physical activity interventions: a systematic review. Prev Med. 2015;72:95–115.

  15. Olstad DL, Campbell EJ, Raine KD, Nykiforuk CI. A multiple case history and systematic review of adoption, diffusion, implementation and impact of provincial daily physical activity policies in Canadian schools. BMC Public Health. 2015;15(1):385.

  16. Shah S, Allison KR, Schoueri-Mychasiw N, Pach B, Manson H, Vu-Nguyen K. A review of implementation outcome measures of school-based physical activity interventions. J Sch Health. 2017;87(6):474–86.

  17. Milat AJ, Bauman AE, Redman S, Curac N. Public health research outputs from efficacy to dissemination: a bibliometric analysis. BMC Public Health. 2011;11(1):934.

  18. Reis RS, Salvo D, Ogilvie D, Lambert EV, Goenka S, Brownson RC. Lancet physical activity series 2 executive committee. Scaling up physical activity interventions worldwide: stepping up to larger and smarter approaches to get people moving. Lancet. 2016;388(10051):1337–48.

  19. Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7(1):17.

  20. Wolfenden L, Chai LK, Jones J, McFadyen T, Hodder R, Kingsland M, et al. What happens once a program has been implemented? A call for research investigating strategies to enhance public health program sustainability. Aust N Z J Public Health. 2019;43:3.

  21. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.

  22. Rogers EM. Diffusion of innovations. 5th ed. New York, NY: Free Press, c2003; 2003.

  23. Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manag Rev. 1996;21(4):1055–80.

  24. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–67.

  25. Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41(3–4):327–50.

  26. McCrabb S, Lane C, Hall A, Milat A, Bauman A, Sutherland R, Yoong S, Wolfenden L. Scaling-up evidence-based obesity interventions: a systematic review assessing intervention adaptations and effectiveness and quantifying the scale-up penalty. Obes Rev. 2019;13:964.

  27. Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2018.

  28. Koorts H, Eakin E, Estabrooks P, Timperio A, Salmon J, Bauman A. Implementation and scale up of population physical activity interventions for clinical and community settings: the PRACTIS guide. Int J Behav Nutr Phys Act. 2018;15(1):51.

  29. Glasgow RE, Lichtenstein E, Marcus AC. Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–7.

  30. Domitrovich CE, Bradshaw CP, Poduska JM, Hoagwood K, Buckley JA, Olin S, Romanelli LH, Leaf PJ, Greenberg MT, Ialongo NS. Maximizing the implementation quality of evidence-based preventive interventions in schools: a conceptual framework. Adv School Ment Health Promot. 2008;1(3):6–28.

  31. Greenberg MT, Domitrovich CE, Graczyk PA, Zins JE. The study of implementation in school-based preventive interventions: Theory, research, and practice, Promotion of Mental Health and Prevention of Mental and Behavioral Disorders 2005 Series V3; 2005. p. 21.

  32. Kearns NE, Kleinert JO, Dupont-Versteegden EE. Implementing multilevel school-based physical activity interventions using Core implementation components model. J Sch Health. 2019;89(5):427–31.

  33. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.

  34. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.

  35. Yoong SL, Clinton-McHarg T, Wolfenden L. Systematic reviews examining implementation of research into practice and impact on population health are needed. J Clin Epidemiol. 2015;68(7):788–91.

  36. Birken SA, Powell BJ, Shea CM, Haines ER, Kirk MA, Leeman J, Rohweder C, Damschroder L, Presseau J. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implement Sci. 2017;12(1):124.

  37. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264–9.

  38. Pace R, Pluye P, Bartlett G, Macaulay AC, Salsberg J, Jagosh J, Seller R. Testing the reliability and efficiency of the pilot mixed methods appraisal tool (MMAT) for systematic mixed studies review. Int J Nurs Stud. 2012;49(1):47–53.

  39. Hong QN, Pluye P, Fàbregues S, Bartlett G, Boardman F, Cargo M, Dagenais P, Gagnon MP, Griffiths F, Nicolau B, O’Cathain A. Improving the content validity of the mixed methods appraisal tool: a modified e-Delphi study. J Clin Epidemiol. 2019;111:49–59.

  40. Austin G, Bell T, Caperchione C, Mummery WK. Translating research to practice: using the RE-AIM framework to examine an evidence-based physical activity intervention in primary school settings. Health Promot Pract. 2011;12(6):932–41.

  41. Beck J, Jensen LH, Hill JO. Implications of facilitators and barriers to implementing fuel up to play 60. Health Behav Policy Rev. 2015;2(5):388–400.

  42. Bice MR, Brown SL, Parry T. Retrospective evaluation of factors that influence the implementation of CATCH in southern Illinois schools. Health Promot Pract. 2014;15(5):706–13.

  43. Bogart LM, Fu CM, Eyraud J, Cowgill BO, Hawes-Dawson J, Uyeda K, Klein DJ, Elliott MN, Schuster MA. Evaluation of the dissemination of SNaX, a middle school-based obesity prevention intervention, within a large US school district. Transl Behav Med. 2018;8(5):724–32.

  44. Carlson JA, Engelberg JK, Cain KL, Conway TL, Geremia C, Bonilla E, Kerner J, Sallis JF. Contextual factors related to implementation of classroom physical activity breaks. Transl Behav Med. 2017;7(3):581–92.

  45. Chalkley AE, Routen AC, Harris JP, Cale LA, Gorely T, Sherar LB. A retrospective qualitative evaluation of barriers and facilitators to the implementation of a school-based running programme. BMC Public Health. 2018;18(1):1189.

  46. de Meij JS, Chinapaw MJ, Kremers SP, Jurg ME, Van Mechelen W. Promoting physical activity in children: the stepwise development of the primary school-based JUMP-in intervention applying the RE-AIM evaluation framework. Br J Sports Med. 2010;44(12):879–87.

  47. de Meij JS, van der Wal MF, van Mechelen W, Chinapaw MJ. A mixed methods process evaluation of the implementation of JUMP-in, a multilevel school-based intervention aimed at physical activity promotion. Health Promot Pract. 2013;14(5):777–90.

  48. Dunton GF, Lagloire R, Robertson T. Using the RE-AIM framework to evaluate the statewide dissemination of a school-based physical activity and nutrition curriculum:“exercise your options”. Am J Health Promot. 2009;23(4):229–32.

  49. Franks A, Kelder S, Dino GA, Horn KA, Gortmaker SL, Wiecha JL, Simoes EJ. School-based programs: lessons learned from CATCH, planet health, and not-on-tobacco. In: School nutrition and activity: impacts on well-being. Oakville, ON: Apple academic press; 2015. p. 147–62.

  50. Graziose M, Bandelli L, Economos C, Murray R, Ragalie-Carr J. Adoption and implementation of the fuel up to play 60 program in schools. Health Behav Policy Rev. 2017;4(5):436–45.

  51. Hoelscher DM, Feldman HA, Johnson CC, Lytle LA, Osganian SK, Parcel GS, Kelder SH, Stone EJ, Nader PR. School-based health education programs can be maintained over time: results from the CATCH institutionalization study. Prev Med. 2004;38(5):594–606.

  52. Mâsse LC, McKay H, Valente M, Brant R, Naylor PJ. Physical activity implementation in schools: a 4-year follow-up. Am J Prev Med. 2012;43(4):369–77.

  53. McKay HA, Macdonald HM, Nettlefold L, Masse LC, Day M, Naylor PJ. Action schools! BC implementation: from efficacy to effectiveness to scale-up. Br J Sports Med. 2015;49(4):210–8.

  54. Nielsen JV, Klakk H, Bugge A, Andreasen ML, Skovgaard T. Implementation of triple the time spent on physical education in pre-school to 6th grade: a qualitative study from the programme managers’ perspective. Eval Program Plann. 2018;70:51–60.

  55. Nielsen JV, Skovgaard T, Bredahl TV, Bugge A, Wedderkopp N, Klakk H. Using the RE-AIM framework to evaluate a school-based municipal programme tripling time spent on PE. Eval Program Plann. 2018;70:1–1.

  56. Osganian SK, Parcel GS, Stone EJ. Institutionalization of a school health promotion program: background and rationale of the CATCH-ON study. Health Educ Behav. 2003;30(4):410–7.

  57. Ryde GC, Booth JN, Brooks NE, Chesham RA, Moran CN, Gorely T. The daily mile: what factors are associated with its implementation success? PLoS One. 2018;13(10):e0204988.

  58. Saunders RP, Pate RR, Dowda M, Ward DS, Epping JN, Dishman RK. Assessing sustainability of lifestyle education for activity program (LEAP). Health Educ Res. 2011;27(2):319–30.

  59. Storey KE, Spitters H, Cunningham C, Schwartz M, Veugelers PJ. Implementing comprehensive school health: teachers’ perceptions of the Alberta project promoting active living and healthy eating in schools-APPLE schools. Revue phénEPS/PHEnex J. 2011;21(2):3.

  60. Totura CM, Figueroa HL, Wharton C, Marsiglia FF. Assessing implementation of evidence-based childhood obesity prevention strategies in schools. Prev Med Rep. 2015;2:347–54.

  61. van Nassau F, Singh AS, Broekhuizen D, van Mechelen W, Brug J, Chinapaw MJ. Barriers and facilitators to the nationwide dissemination of the Dutch school-based obesity prevention programme DOiT. Eur J Pub Health. 2016;26(4):611–6.

  62. van Nassau F, Singh AS, Hoekstra T, van Mechelen W, Brug J, Chinapaw MJ. Implemented or not implemented? Process evaluation of the school-based obesity prevention program DOiT and associations with program effectiveness. Health Educ Res. 2016;31(2):220–33.

  63. van Nassau F, Singh AS, van Mechelen W, Brug J, Chin A, Paw MJ. In Preparation of the Nationwide Dissemination of the School-Based Obesity Prevention Program DOiT: Stepwise Development Applying the Intervention Mapping Protocol. J Sch Health. 2014;84(8):481–92.

  64. van Nassau F, Singh AS, van Mechelen W, Paulussen TG, Brug J, Chinapaw MJ. Exploring facilitating factors and barriers to the nationwide dissemination of a Dutch school-based obesity prevention program “DOiT”: a study protocol. BMC Public Health. 2013;13(1):1201.

  65. Welk GJ, Bai Y, Saint-Maurice PF, Allums-Featherston K, Candelaria N. Design and evaluation of the NFL PLAY 60 FITNESSGRAM® partnership project. Res Q Exerc Sport. 2016;87(1):1–3.

  66. Wiecha JL, El Ayadi AM, Fuemmeler BF, Carter JE, Handler S, Johnson S, Strunk N, Korzec-Ramirez D, Gortmaker SL. Diffusion of an integrated health education program in an urban school system: planet health. J Pediatr Psychol. 2004;29(6):467–74.

  67. World Bank. High income data. [Cited 2019 Apr 13]. Available from: https://data.worldbank.org/income-level/high-income

  68. Damschroder L, Hall C, Gillon L, Reardon C, Kelley C, Sparks J, Lowery J. The consolidated framework for implementation research (CFIR): progress to date, tools and resources, and plans for the future. Implement Sci. 2015;10(1):A12.

  69. Fleuren M, Wiefferink K, Paulussen T. Determinants of innovation within health care organizations: literature review and Delphi study. Int J Qual Health Care. 2004;16(2):107–23.

  70. Green L, Kreuter M. The precede–proceed model. Health promotion planning: an educational approach. 3rd ed. Mountain View (CA): Mayfield Publishing Company; 1999. p. 32–43.

  71. Fixsen DL, Naoom SF, Blase KA, Friedman RM. Implementation research: a synthesis of the literature; 2005.

  72. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Haynes RB, Straus SE. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a tower of babel? Implement Sci. 2010;5(1):16.

  73. Hailemariam M, Bustos T, Montgomery B, Barajas R, Evans LB, Drahota A. Evidence-based intervention sustainability strategies: a systematic review. Implement Sci. 2019;14(1):57.

  74. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the expert recommendations for implementing change (ERIC) project. Implement Sci. 2015;10(1):21.

  75. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76.

  76. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.

  77. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A, Sheikh A. Standards for reporting implementation studies (StaRI) statement. Bmj. 2017;356:i6795.

  78. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  79. Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the consolidated framework for implementation research. Implement Sci. 2015;11(1):72.

  80. Tabak RG, Padek MM, Kerner JF, Stange KC, Proctor EK, Dobbins MJ, Colditz GA, Chambers DA, Brownson RC. Dissemination and implementation science training needs: insights from practitioners and researchers. Am J Prev Med. 2017;52(3):S322–9.

  81. Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, Foy R, Duncan EM, Colquhoun H, Grimshaw JM, Lawton R. A guide to using the theoretical domains framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12(1):77.

  82. Dissemination & Implementation Models in Health Research & Practice. [Cited 2019 Apr 13]. Available from: http://www.dissemination-implementation.org/

  83. EPIS Framework. [Cited 2019 Apr 13]. Available from: https://episframework.com/

  84. RE-AIM. Resources and tools. [Cited 2019 Apr 13]. Available from: http://www.re-aim.org/resources-and-tools/2019

  85. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):42.

  86. Locke J, Lee K, Cook CR, Frederick L, Vázquez-Colón C, Ehrhart MG, Aarons GA, Davis C, Lyon AR. Understanding the organizational implementation context of schools: a qualitative study of school district administrators, principals, and teachers. Sch Ment Heal. 2019;11(3):379–99.

  87. Lyon AR, Cook CR, Brown EC, Locke J, Davis C, Ehrhart M, Aarons GA. Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implement Sci. 2018;13(1):5.

  88. Thaker S, Steckler A, Sánchez V, Khatapoush S, Rose J, Hallfors DD. Program characteristics and organizational factors affecting the implementation of a school-based indicated prevention program. Health Educ Res. 2007;23(2):238–48.

  89. Weiner BJ, Lewis MA, Linnan LA. Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Educ Res. 2008;24(2):292–305.


Acknowledgements

The authors would like to thank Rachel West for her expert help with designing the search strategy for this review.

Funding

Samuel Cassar is funded by an Australian Government Research Training Program (RTP) Scholarship.

Author information

Authors and Affiliations

Authors

Contributions

SC, JS, AT, and HK conceived and designed the study. SC, JS, AT, RW, and HK developed the search strategy. SC and AM screened articles against the inclusion criteria, and JS, AT, and HK acted as consensus screeners. SC extracted and coded the data, and JS, AT, and HK acted as consensus reviewers. All co-authors participated in the interpretation of results. SC drafted the manuscript, and all co-authors contributed to the critical revision of the manuscript and approved the final manuscript.

Corresponding author

Correspondence to Samuel Cassar.

Ethics declarations

Ethics approval and consent to participate

N/A.

Consent for publication

N/A.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Search terms and databases.

Additional file 2.

Factors related to the adoption of real-world school-based interventions.

Additional file 3.

Factors related to the implementation of real-world, school-based interventions.

Additional file 4.

Factors related to the sustainability of real-world, school-based interventions.

Additional file 5.

PRISMA checklist.

Additional file 6.

MMAT risk of bias tool.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Cassar, S., Salmon, J., Timperio, A. et al. Adoption, implementation and sustainability of school-based physical activity and sedentary behaviour interventions in real-world settings: a systematic review. Int J Behav Nutr Phys Act 16, 120 (2019). https://doi.org/10.1186/s12966-019-0876-4


Keywords