UBC Faculty Research and Publications

Is reporting on interventions a weak link in understanding how and why they work? A preliminary exploration… Riley, Barbara L; MacDonald, JoAnne; Mansi, Omaima; Kothari, Anita; Kurtz, Donna; vonTettenborn, Linda I; Edwards, Nancy C May 20, 2008


Full Text

Implementation Science (BioMed Central)
Research article, Open Access

Is reporting on interventions a weak link in understanding how and why they work? A preliminary exploration using community heart health exemplars

Barbara L Riley*1, JoAnne MacDonald2, Omaima Mansi3, Anita Kothari†4, Donna Kurtz†5, Linda I vonTettenborn†6 and Nancy C Edwards2,7

Address: 1Centre for Behavioural Research and Program Evaluation, University of Waterloo, Waterloo, Ontario, Canada; 2School of Nursing, University of Ottawa, Ottawa, Ontario, Canada; 3School of Nursing, McGill University, Montreal, Quebec, Canada; 4Bachelor of Health Sciences Program, University of Western Ontario, London, Ontario, Canada; 5School of Nursing, University of British Columbia Okanagan, Kelowna, British Columbia, Canada; 6Bachelor of Science in Nursing Program, Faculty of Health Sciences, Douglas College, New Westminster, British Columbia, Canada; 7Department of Epidemiology and Community Medicine, University of Ottawa, Ottawa, Ontario, Canada

Email: Barbara L Riley* - briley@healthy.uwaterloo.ca; JoAnne MacDonald - jmacd069@uottawa.ca; Omaima Mansi - omaima.mansi@mcgill.ca; Anita Kothari - akothari@uwo.ca; Donna Kurtz - donna.kurtz@ubc.ca; Linda I vonTettenborn - vontettenbornl@douglas.bc.ca; Nancy C Edwards - nedwards@uottawa.ca

* Corresponding author; † Equal contributors

Abstract

Background: The persistent gap between research and practice compromises the impact of multi-level and multi-strategy community health interventions. Part of the problem is a limited understanding of how and why interventions produce change in population health outcomes. Systematic investigation of these intervention processes across studies requires sufficient reporting about interventions.
Guided by a set of best processes related to the design, implementation, and evaluation of community health interventions, this article presents preliminary findings of intervention reporting in the published literature using community heart health exemplars as case examples.

Methods: The process to assess intervention reporting involved three steps: selection of a sample of community health intervention studies and their publications; development of a data extraction tool; and data extraction from the publications. Publications from three well-resourced community heart health exemplars were included in the study: the North Karelia Project, the Minnesota Heart Health Program, and Heartbeat Wales.

Results: Results are organized according to six themes that reflect best intervention processes: integrating theory, creating synergy, achieving adequate implementation, creating enabling structures and conditions, modifying interventions during implementation, and facilitating sustainability. In the publications for the three heart health programs, reporting on the intervention processes was variable across studies and across processes.

Conclusion: Study findings suggest that limited reporting on intervention processes is a weak link in research on multiple intervention programs in community health. While it would be premature to generalize these results to other programs, important next steps will be to develop a standard tool to guide systematic reporting of multiple intervention programs, and to explore reasons for limited reporting on intervention processes. It is our contention that a shift to more inclusive reporting of intervention processes would help lead to a better understanding of successful or unsuccessful features of multi-strategy and multi-level interventions, and thereby improve the potential for effective practice and outcomes.

Published: 20 May 2008. Implementation Science 2008, 3:27. doi:10.1186/1748-5908-3-27
Received: 10 November 2006; Accepted: 20 May 2008
This article is available from: http://www.implementationscience.com/content/3/1/27
© 2008 Riley et al; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Background

Scholars commonly acknowledge inconsistent and sparse reporting about the design and implementation of complex interventions within the published literature [1-3]. Complex interventions (also referred to as multiple interventions) deliberately apply coordinated and interconnected intervention strategies, which are targeted at multiple levels of a system [4]. Variable and limited reporting of complex interventions compromises the ability to answer questions about how and why interventions work through systematic assessment across multiple studies [3]. In turn, limited evidence-based guidance is available to inform the efforts of those responsible for the design and implementation of interventions, and the gap remains between research and practice.

The momentum within the last five years to identify promising practices in many fields [5-7] increases the urgency and relevance of understanding how and why interventions work. However, complex community health programs involve a set of highly complex processes [8-10].
It has been argued that much of the research on these programs has treated the complex interactions among intervention elements, and between intervention components and the external context, as a 'black box' [4,11-14]. Of particular relevance to these programs are failures to either describe or take into account community involvement in the design stages of an intervention [8]; the dynamic, pervasive, and historical influences of inner and outer implementation contexts [12,14-17]; or pathways for change [13,14]. A comprehensive set of propositions to guide the extraction of evidence relevant to the planning, implementation, and evaluation of complex community health programs is missing.

Our research team was interested in applying a set of propositions that arose out of a multiple intervention framework to examine reports on community health interventions [4]. To this end, we present a set of propositions that reflects best practices for intervention design, implementation, and evaluation for multiple interventions in community health, and we conduct a preliminary assessment of information reported in the published literature that corresponds to the propositions.

Propositions for the design, implementation and evaluation of community health interventions

The initial sources for propositions were primary studies and a series of systematic and integrative reviews of many large-scale multiple intervention programs in community health (e.g., in the fields of tobacco control, heart health, injury prevention, HIV/AIDS, and workplace health) [8,10,18-24]. By multiple interventions, we mean multi-level and multi-strategy interventions [4]. Common to many of these were notable failures of well-designed research studies to achieve expected outcomes. Authors of these reviews have elaborated reasons why some multiple intervention programs may not have had their intended impact. Insights for propositions include researchers' reflections on the failure of their multiple intervention effectiveness studies to yield hypothesized outcomes, and reviews of community trials elaborating reasons why some multiple intervention programs have not demonstrated their intended impact [8,10,22,23,25,26]. The predominant and recurring reasons for multiple intervention research failures are addressed in the initial set of propositions for how and why interventions contribute to positive outcomes.

The propositions arise from and are organized within a multiple interventions program framework (see Figure 1 and Table 1). The framework is based on social ecological principles and supported by theoretical and empirical literature describing the design, implementation, and evaluation of multiple intervention programs [8-10,18-21,25-29]. The framework has four main elements, and several processes within these elements. The propositions address some of the common reasons reported to explain failures in multiple intervention research.

Methods

The preliminary assessment involved three main steps: selection of a sample of multiple intervention projects and publications, development of a data extraction tool, and data extraction from the publications.

Selection of a sample of multiple intervention projects and publications

A first set of criteria was established to guide the selection of a pool of community-based multi-strategy and multi-level programs to use as case examples. The intent was not to be exhaustive, but to identify a set of programs that address a particular health issue that we anticipated might report details relevant to the propositions. The team decided reporting of such intervention features would most likely be represented in: a community-based primary prevention intervention program; a program that was well-resourced and evaluated, and thus represented a favorable opportunity for a pool of publications that potentially reported key intervention processes; and a health issue that had been tackled using multiple intervention programs for a prolonged period, thus providing the maturation of ideas in the field.

In the last 30 years, community-based cardiovascular disease prevention programs have been conducted worldwide and their results have been abundantly published. The first pioneer community-based heart health program was the North Karelia Project in Finland, launched in 1971 [30]. Subsequent pioneering efforts included research and demonstration projects in the United States and Europe that included the Minnesota Heart Health Program and Heartbeat Wales [9,31,32]. Although specific interventions varied across these projects, the general approach was similar. Community interventions were designed to reduce major modifiable risk factors in the general population and priority subgroups, and were implemented in various community settings to reach well-defined population groups. Interventions were theoretically sound and were informed by research in diverse fields such as individual behaviour change, diffusion of innovations, and organizational and community change. Combinations of interventions employed multiple strategies (e.g., media, education, policy) and targeted multiple layers of the social ecological system (e.g., individual, social networks, organizations, communities). Many of these exemplar community heart health programs were well-resourced relative to other preventive and public health programs, including large budgets for both process and outcome evaluations. Thus, community-based cardiovascular disease program studies were chosen as the case exemplar upon which to select publications to explore whether specific features of interventions as defined by the propositions were in fact described.

Figure 1: Multiple Interventions Program Framework (adapted from Edwards, Mill & Kothari, 2004; reproduced with permission). The framework elements and their associated propositions are: describe socio-ecological features of problem; identify intervention options (integrating theory, proposition 1); optimize potential impact of interventions (creating synergy, propositions 2 and 3; achieving adequate implementation, propositions 4 and 5; creating enabling structures and conditions, proposition 6); monitor process, impact, spin-offs and sustainability (modifying interventions during implementation, propositions 7 and 8; facilitating sustainability, proposition 9).

To guide the selection of a pool of published literature on community-based heart health programs, a second set of criteria was established. These included: studies representative of community-based heart health programs that were designed and recognized as exemplars of multiple intervention programs; studies deemed to be methodologically sound in an existing systematic review; and reports published in English. Selection of published articles meeting these criteria involved a two-step process. First, a search of the Effective Public Health Practice Project [33] was conducted to identify a systematic review of community-based heart health programs. The most recent found was by Dobbins and Beyers [25]. Dobbins and Beyers identified a pool of ten heart health programs deemed to be moderate or strong methodologically.
From this pool, a subset of three projects was selected: the North Karelia Heart Health Project (1971–1992), Heartbeat Wales (1985–1990), and the Minnesota Heart Health Program (1980–1993), which were all well-resourced, extensively evaluated, and provided a pool of rigorous studies describing intervention effectiveness.

Second, a subset of primary publications identified in the Dobbins and Beyers [25] systematic review was retrieved for each of the three programs. In total, four articles were retrieved and reviewed for the Minnesota Heart Health Program [34-37] and five articles for Heartbeat Wales [38-42]. For Heartbeat Wales, a technical report was also used because several of the publications referred to it for descriptions of the intervention [43]. The primary studies and detailed descriptions of the project design, implementation and evaluation for the North Karelia Project were retrieved from its book compilation [30].

Development of a data extraction tool

The team was interested in identifying the types of intervention information reported, or not reported, in the published literature that corresponded with the identified best processes in the design, delivery, and evaluation of multiple intervention programs featured in the propositions. To enhance consistency, accuracy, and completeness of this extraction, a systematic method to extract the intervention information reported in the selected research studies was used. Existing intervention extraction forms [44,45] were first critiqued to determine their relevancy for extracting the types of intervention information corresponding to the propositions. These forms provided closed-ended responses for various characteristics of interventions, but did not allow for the collection of information on the more complex intervention processes reflected in the propositions.

Table 1: Summary of propositions for multiple interventions in community health

Identify intervention options
  Integrating theory
    1. Relevant theories are integrated to contribute to a multi-level and multi-strategy intervention plan.
Optimize potential impact of interventions
  Creating synergy
    2. Combinations and sequences of interventions within and across levels of the system are used to create synergy.
    3. Interventions create synergy through coordinating and integrating intervention efforts across sectors and jurisdictions.
  Achieving adequate implementation
    4. Implementation of the interventions is sufficient to achieve population impacts.
    5. The timing, the effort, and the features of the intervention strategies are tailored to the implementation context.
  Creating enabling structures and conditions
    6. Relevant enabling structures and conditions at professional, organizational, community, and other system levels support the interventions.
Monitor process, impact, spin-offs and sustainability
  Modifying interventions during implementation
    7. Interventions are continuously adapted to the contextual environment (e.g., setting, leadership, structures, culture), while maintaining integrity with theoretical underpinnings.
    8. Evaluation feedback is used to design interventions and to modify them throughout implementation.
  Facilitating sustainability
    9. Sustainability (a focus on continuing and extending benefits of interventions) is addressed during planning, implementation, and maintenance phases of interventions.
Thus, the research team designed a data extraction tool that would guide the extraction of intervention information compatible with the propositions.

To this end, an open-ended format was used to extract verbatim text from the publications. Standard definitions for the propositions were developed (see Tables 2 through 7 in the results section), informed by key sources that described pertinent terms and concepts (e.g., sustainability, synergy) [46-51]. In order to enhance completeness and consistency of data extraction, examples were added to the definitions following an early review of data extraction (see below).

Data extraction from the publications

Pairs from the research team were assigned to one of the three heart health projects. Information from the studies was first extracted independently, and then the pairs for each project compared results to identify any patterns of discrepancies. Throughout the process, all issues and questions related to the data extraction were synthesized by a third party. Early on, examples were added to the definitions of the propositions to increase consistency of information extracted with respect to content and level of detail. Through discussion within pairs and across the research team, consensus was reached on information pertinent to the propositions, and each pair consolidated the information onto one form for each project. The consolidated form containing the consensus decisions from each pair was then used to compare patterns across the full set of articles. All members of the research team participated in the process to identify trends and issues related to reporting on relevant intervention processes. These trends and issues are described in the next section.

Results

Results are reported for each proposition in order from one through nine, and grouped according to the themes shown in the multiple interventions program framework (Figure 1).
For each proposition, results are brieflydescribed in the text. These descriptions are accompaniedby a table that includes the operational definition for theproposition, findings related to reporting on the proposi-tion, and illustrative verbatim examples from one or moreof the projects.Integrating theory (proposition one)Information regarding the use of theories was most oftenpresented as a list, with limited description of the comple-mentary or unifying connections among the theories inthe design of the interventions. Commonly, interventionprograms projected changes at multiple socio-ecologicallevels, such as individual behaviour changes, in additionto macro-environmental changes. However, while theo-ries were used for interventions targeting various levels ofthe system, the integration of multiple theories was gener-ally implicit and simply reflected in the anticipated out-puts. Although less common, the use of several theorieswas made more explicit through description of the use ofa program planning tool, such as a logic model (Table 2).Creating synergy (propositions two and three)General references were frequently made regarding therationale for combining, sequencing, and staging inter-Table 2: Summary of data reported for integrating theory (proposition one)Operational Definition Information Reported on Propositions Illustrative ExamplesProposition one: Integration of relevant theoriesDescriptions of theories, including any references regarding the relationships among the specific mid-range theories for the various dimensions of Multiple Intervention Programs including: the targets of change, channels, settings, and intervention strategiesA 'shopping list' of theories was reported The 'program operated at the individual, group and community levels and encompassed a wide range of strategies stimulated by social learning theory, persuasive communications theory and models for the involvement of community leaders and institutions' [35:p.203]Most often, use of 
isolated theories was described for specific intervention design features'The innovation of diffusion theory provided a central framework for the project team... the role of the project as a change agent was to promote the diffusion of the lifestyle innovations of quitting smoking and adopting low fat diets' [30: p.42]Organizational change theory was directed at improving the 'macro environment' while influencing individuals 'choices and opportunities to change' [38: p.8]Some reporting about the relationships among theoretical concepts through use of planning tool, such as a logic model'The approaches described above are unified...to depict the behavioural/social model of community intervention found to be most Page 5 of 12(page number not for citation purposes)relevant' [30: p.43]Implementation Science 2008, 3:27 http://www.implementationscience.com/content/3/1/27ventions as an approach to optimizing overall programeffectiveness and/or sustainability. In particular, refer-ences to this were most often found in proposed explana-tions for shortfalls in expected outcomes. However,specific details regarding how intervention strategies werecombined, sequenced, or staged across levels, as well asacross sectors and jurisdictions, were usually absent. Thus,insufficient information was provided to understandpotential synergies that may have arisen from coordinat-ing interventions across sectors and jurisdictions. In con-within levels of the system (i.e., a series of interventionsdirected at the intrapersonal level) (Table 3).Achieving adequate implementation (propositions four and five)Proposition four specifically considers the quantitativeaspects of implementation. Information reported rangedfrom general statements to specific details. 
Although thepopulation subgroups targeted by the intervention wereoften clearly identified, information regarding the esti-Table 3: Summary of data reported for creating synergy (propositions two and three)Operational Definition Information Reported on Propositions Illustrative ExamplesProposition two: Combinations and sequencing/staging of interventionsDescriptions of the deliberate combination of interventions (implemented at the same time) and sequencing/staging of interventions (ordered in time) within and across levels of the system relative to their potential for enhanced synergistic and minimized antagonistic effectsDescription regarding the combining and sequencing/staging of interventions at multiple levels of the system as an approach to optimizing overall program effectiveness and/or sustainability ranged from inferences to explicit details'Staff training was implemented in work sites and churches to facilitate offering of health promotion programs such as quit smoking [30: p.203]The program consists of a 'complex set of projects and initiative which combine and interact in different ways to produce overall effect which is being measured through the outcome evaluation' [38: p.14]'The aim is to promote synergism whereby each component reinforces the others' [43: p.89]Some referencing regarding the combining and sequencing/staging of interventions potentially attributable to both the anticipated positive outcomes, as well as explanation for shortfalls in expected outcomes.The 'combination of mass communication and community organization.... 
was a valuable device for accelerating the diffusion of health innovation' [30: p.321]'Intervention program may have focused on the wrong population segments or used the wrong mix of intervention components' [36: p.1391]More specific details were reported for the combining and sequencing/staging of interventions within levels of the system (such as interventions directed at the intrapersonal individual level), compared to across levels in the system (such as a combination of intrapersonal and policy level changes)'In the two direct intervention schools, butter used on bread was replaced by soft margarine...These changes were also recommended for...meals at home...a nutritionist visited the homes of the children... Healthy diet was also discussed during school lessons. Parent gatherings, leaflets, posters, written recommendations, a project magazine, and the general mass media were used... Screening results were explained... A school nurse repeated the screening...and good advice and counseling to children...' [30: p.293]Compared to...'With an effective political system, public health leaders can gain authority to strenuously exert influence over personal behaviours without arousing resistance.... this was accomplished through a blended approach which included both manipulation and empowerment [30: p.319]Reporting on the timing (sequential versus simultaneous) of interventions spanned from specific detail to general descriptions'Actual screening programmes were often run simultaneously.' [30: p.97]'Staggered entry of communities to intervention to allow for gradual development of the intervention program and strengthened the design through replication' [36: p.1384]'The model Choice-Change-Champion process for health promotion' [was] constructed for 'idealized sequence of events' and intended to 'guide planning and priority setting'. [38: p.9]'...individuals are supported to move from stage one of having a 'choice' for lifestyle... 
through stage two of making 'changes' successfully... and stage three becoming a 'champion' for health at the local level which requires whereby individuals move from being a recipient to provider' [43: p.48]Proposition three: Coordinating and integrating intervention effortsDescriptions of complementary interventions across sectors (e.g., health, education, recreation, labour, environment, housing, etc) and across jurisdictions (i.e., local/regional, provincial/state, federal/national).Reporting on the importance and deliberate combining and sequencing/staging of interventions through use of multiple channels that crossed sectors and jurisdictions was both implicit and explicit'The programme must be founded on intersectoral activity, community organization and grassroots participation.' [30: p.34]The development of advisory boards 'were made up of influential political business, health, and other leaders in the community and citizen task force' [35: p.202]'The intervention comprises a wide range of locally organized projects together with centrally led initiatives...across all sectors of Welsh life, including the health and educational authorities, local and central government, commerce, industry, mass media, agricultural and voluntary sectors' [38: p.6]Page 6 of 12(page number not for citation purposes)trast, more specific details were reported for thecombining, sequencing, and staging of interventionsmated reach of the intervention was generally non-spe-cific. The amount of time for specific interventionImplementation Science 2008, 3:27 http://www.implementationscience.com/content/3/1/27strategies and the overall program tended to be reportedin time periods such as weeks, months or years. Informa-tion regarding specific exposure times for interventionsstrategies that included the passive receipt of information,interaction, and/or environmental changes. 
A descriptionof investment levels is also a marker of the intensity of anTable 4: Summary of data reported for achieving adequate implementation (propositions four and five)Operational Definition Information Reported on Propositions Illustrative ExamplesProposition four: Adequate implementationQuantitative descriptions of the intervention implementation, the amount and extent of engagement, include:1. duration (time period);2. intensity (depth of engagement such as passive receipt of information, interaction, or an environmental change);3. exposure (total educational time, total minutes/hours/years of exposure);4. investment (direct funding or in-kind contributions from various sources);5. reach (e.g., total number of participants, proportion of population)General information was often reported on the targeted audience rather than the reach (estimated numbers or proportions receiving intervention)'Programme activities are usually simple and practical in order to facilitate their enactment by the widest spectrum of the community. Rather than the highly sophisticated services are generally simple basic services for a few people, simple basic services are generally provided for the largest possible stratum of the population' [30: p.48]'All eighth graders enrolled in public schools' [34: p.219]Duration was generally reported for the overall program; total time for specific interventions was reported less frequently.A TV series of 15 programmes called 'Key to Health' was broadcast during the 1984–85 school year.' 
[Table 4, continued]
... [30: p.300]
'Systematic risk factor screening and education were conducted during the first 3 years of the intervention program' [35: p.202]
'first intervention – competition: took place over a 4 week community-wide competition' [34: p.219]

Information reported: Descriptions provided regarding the depth of engagement, ranging from passive receipt of information, to interaction, to environmental change.
Illustrative examples:
'The following list gives some idea of the extent to which print media were exploited during the first five years of the project (1972–77): local newspaper articles (877,000 column mm) 1509;...Health education leaflets (series of five) 278,000 copies...' [30: p.279]
'Activities were experiential – designed to require active participation' [37: p.1211]
'Activity was encouraged through a competition...role modeling...and environmental change' [34: p.219]

Information reported: Challenges to reporting cost and cost-benefits, as well as information regarding investment, were described.
Illustrative examples:
In evaluating the smoking component, cost-benefits were not calculated based on per-capita investment because a) the cost of the smoking programme and its administration is 'impossible to estimate, or differentiate from usual operation', and b) the 'cost to some units such as volunteers is not calculated' because of the 'difficulty estimating it' [39: p.131]
'In 1990 the North Karelia Project employed nine full-time and eight part-time field office staff, who worked a total of over 18 000 hours that year' [30: p.66]
'The money to employ staff and finance the work has come from various sources' [39: p.72]

Proposition five: Appropriate implementation
Operational definition: Qualitative descriptions regarding the quality of the intervention, including:
1. fidelity (implementing all essential components of interventions as intended);
2. alignment with changing context (to ensure best fit);
3. implementing the most potent 'active ingredients'.
Information reported: No explicit data were reported regarding the quality of implementation. Descriptions regarding the quality of implementation were implicit, embedded in reporting of:
1. program features, such as priority setting or strategies undertaken to enhance quality implementation;
2. explanations for problems with intervention fidelity relevant to explaining the results.
Illustrative examples:
'One third (1/3) of the budget was dedicated to funding well-defined projects initiated locally that serve the objective of the program....' [38: p.17]
'Over its 20 years, the project has initiated or been otherwise involved in hundreds of training seminars. Although the nature of the seminars has changed, the focus has always been the discussion of practical tasks (derived from the objectives), action needed, and progress and feedback.' [30: p.278]
'After [the early years of the project] it became both possible and necessary to introduce more specialized services to support the basic activities. These were prepared and tested by the project and implemented gradually.' [30: p.274]

...tended to be unavailable. The intensity of interventions was provided in some reports, with authors describing intervention strategy. However, investment descriptions were quite variable, ranging from no information to general information on investment of human and financial resources. In addition, challenges to reporting costs and benefits were often acknowledged.

Proposition five considers the quality of implementation, represented by qualitative descriptions of the intervention. Reporting regarding the quality of the implementation was primarily implicit (Table 4).

Creating enabling structures and conditions (proposition six)
Reporting of information relative to the deliberate creation of structures and conditions was limited and generally implicit, often embedded in the details of intervention implementation (Table 5).

Modifying interventions during implementation (propositions seven and eight)
Although authors acknowledged the importance of flexibility in intervention delivery, information regarding adaptations to environmental circumstances was vague. Reference to context was often in discussion sections of studies, and provided as a partial explanation for unintended or unexpected outcomes. There was minimal description regarding the modification of interventions in response to information gained from process/formative evaluation, outcomes, or population trends – the core of proposition eight. Again, authors acknowledged the significance of process/formative evaluation; at times, reporting described the information gathered. At other times, in the summative evaluation, reporting focused on using process evaluation results to explain why expected outcomes were or were not achieved, rather than how the process evaluation results did or did not shape the interventions during implementation. Suggestions for improved program success, based on information gained from formative evaluations, were noted in some discussions (Table 6).

Facilitating sustainability (proposition nine)
Reporting on elements regarding the intention to facilitate sustainability of multiple intervention benefits was also variable. Authors made reference to the notion of sustainability at the onset of projects and described the conditions and supports that were in place to facilitate continued and extended benefits. Elements of sustainability represented in program outcomes were also described in some detail.
In other examples, reporting only focused on sustainability of the program during the initial research phase of program implementation and discussed the desirability of continuing the program beyond the research phase (Table 7).

Table 5: Summary of data reported for creating enabling structures and conditions (proposition six)

Proposition six: Enabling structures and conditions
Operational definition: Descriptions of the creation of structures (infrastructure) and conditions (processes and relationships) at system levels that support the design, implementation and/or evaluation of interventions, such as: media support; incentive grants; capacity building (for providers, organizations, communities); mechanisms for monitoring, evaluation, surveillance; networks; active citizen participation; opinion leader support.
Information reported: Information regarding the deliberate creation of enabling structures and conditions was embedded in descriptions of intervention implementation.
Illustrative examples:
'There was great stress placed on efforts to teach practical skills for change such as smoking cessation techniques and ways of buying and cooking healthier foods. For the latter, close co-operation with the local housewives' association has been proven invaluable. Activities have been coordinated to provide social support, expand options and availability (i.e., production and marketing of healthier foods), and ultimately to organize the community to function in a healthier mode' [30: p.40]
'Information gained from the community, clinical and youth baseline surveys about knowledge and lifestyles was shared in community meetings, with professional opinion leaders and published in easily understandable form for the local population...This served as a great force for...winning commitment from key decision makers, and motivating change among individuals and organizations.' [38: p.17]

Table 6: Summary of data reported for modification of interventions during implementation (propositions seven and eight)

Proposition seven: Adaptation to the contextual environment
Operational definition: Descriptions regarding the adjusting or tailoring of interventions to ongoing and unpredictable contextual changes, while maintaining theoretical underpinnings and integrity. Changes include such factors as: demographics; political priorities; organizational changes or priorities; economic environment; community events; network/coalition development, etc.
Information reported and illustrative examples:
Authors described the importance of context and the need for flexibility in intervention delivery: 'Even when the framework of an intervention is well-defined...the actual implementation must be flexible enough to respond to changing community situations and to take advantage of any fresh opportunities' [30: p.33]
Details regarding what modifications were made to initial intervention implementation plans were vague, most often reported as part of the discussion of findings: 'Project leaders and staff immersed themselves in the community and among the people, where they developed and adjusted programme activities according to the available local options and circumstances' [30: p.33]

Proposition eight: Responsive to evaluation feedback
Operational definition: Descriptions regarding the collection and utilization of information about the process of intervention implementation, intervention outcomes (preliminary or later stage), or broader trends on risk factors or conditions, demographics, morbidity and mortality, etc.
Information reported and illustrative examples:
Importance of process evaluation described as a tool for improving programs: Process evaluation '...is intended to identify features of a project which enhance or hinder its chances of success as the project develops' [38: p.14]
Some description of how interventions were guided in response to preliminary evaluative information and population trends: 'The project field office is actively involved with many aspects relating to process and formative evaluations. The health behaviour surveys have questions about the person's exposure to various intervention activities, which provides immediate feedback. The health education materials and media campaigns rely heavily on the results of the monitoring' [30: p.71]; 'The 1987 population survey found that the decrease in population cholesterol means had leveled off. Novel and intensified activities began in North Karelia and across the country, coinciding with new national cholesterol guidelines' [30: p.108]
Reporting on formative evaluation as a post hoc activity in an attempt to explicate why expected outcomes were or were not achieved: 'There was suggestive evidence, however, that innovative modification in format could lead to renewed interest in contests' [35: p.204]

Table 7: Summary of data reported for facilitating sustainability (proposition nine)

Proposition nine: Sustainability
Operational definition: Discussion regarding the continuation or extension of the issue, program, partnerships, benefits, etc. Includes planning at the outset.
Information reported and illustrative examples:
Reporting on the notion of sustainability at the outset of the project: 'In principle, a community-based project can vary from a relatively restricted academic study, or local effort, to a major programme with strong nationwide involvement. The North Karelia Project definitely falls into the latter category. At the very onset the national health authorities decided that the North Karelia Project would be a pilot for all Finland.' [30: p.51]
Description of conditions and supports in place that would facilitate sustainability, such as finances, partnerships, and previous experience: 'The fact that the project director represented North Karelia in the National Parliament from 1987–1991 was important in this respect. The cooperation of the local health services and health personnel has guaranteed a firm foundation for the project activities. Numerous community organizations have also contributed greatly over the years. Because project activities have been integrated into the existing health services and broad community participation has been a key feature, the overall costs of the programme have been kept modest.' [30: pp.71–72]; 'The project has arranged numerous competitions in collaboration with the food-industry, the media, schools, sports clubs, voluntary organizations etc. over the past twenty years' [30: p.287]; 'During the project several of its leading members have been active in various health and health research policy functions' [30: p.287]
Descriptions of sustainability evidenced in outcomes of the program, such as policy change and extension of the issue, illustrated by the role of projects as a catalyst for other jurisdictions: 'The creation by the Secretary of State for Wales of the Welsh Health Promotion Authority with a clear brief to sustain and support the program provides longer possibilities for Heartbeat Wales' [38: p.17]; these 'new administrative arrangements...ensure the future and...support the complementary initiatives on health promotion for young people and sensible drinking' [40: p.346]; 'The project became associated with healthy public policy in many ways, by contributing to anti-smoking legislation, for instance.' [30: p.43]; 'The project has been a major and diverse contributor to many policy decisions on the national and local levels' [39: pp.71–72]; 'The North Karelia Project has itself been a model for imitation and acceleration of similar activities around the world' [30: p.322]; 'It was considered worthwhile for the project to continue operating beyond the initial five-year period, but at the same time to expand activities to contribute to national developments. So while North Karelia continued to be an active demonstration area the project evolved a national dimension to its activities' [30: p.360]

Discussion
The primary purpose of this paper was to conduct a preliminary assessment of information reported in published literature on 'best' processes for multiple interventions in community health. It is only with this information that questions of how and why interventions work can be studied in systematic reviews and other synthesis methods (e.g., realist synthesis). The best processes were a set of propositions that arise from and were organized within a multiple interventions program framework. Community-based heart health exemplars were used as case examples.

Although some information was reported for each of the nine propositions, there was considerable variability in the quantity and specificity of information provided, and in the explicit nature of this information across studies.

Several possible explanations may account for the insufficient reporting of implementation information. Authors are bound by word count restrictions in journal articles, and consequently, process details such as program reach might be excluded in favour of reporting methods and outcomes [3]. Reporting practices reflect what traditionally has been viewed as important in intervention research. There is emphasis on reporting to prove the worth of interventions over reporting to improve community health interventions. This follows from the emphasis on answering questions of attribution (does a program lead to the intended outcomes?), rather than questions of adaptation (how does a dynamic program respond to changing community readiness, shifting community capacity, and policy windows that suddenly open?) [16,52].

An alternative explanation is that researchers are not attending to the processes identified in the propositions when they design multiple intervention programs. Following these propositions requires a transdisciplinary approach to integrating theory, implementation models that allow for contextual adaptation and feedback processes, and mixed methods designs that guide the integrative analysis of quantitative and qualitative findings. These all bring into question some of the fundamental principles that have long been espoused for community health intervention research, including issues of fidelity, the use of standardized interventions, the need to adhere to predictive theory, and the importance of following underlying research paradigms. When coupled with the challenges of operationalizing a complex community health research study that is time- and resource-limited, it is perhaps not surprising that the propositions were unevenly and weakly addressed.

It would be premature to generalize these results to other programs. The three multiple intervention programs (the North Karelia Project, Heartbeat Wales, and the Minnesota Heart Health Program) selected for this study were implemented between 1971 and 1993, and represented the 'crème de la crème' of heart health programs in terms of study resources and design. In particular, the North Karelia project continues to receive considerable attention due to the impressive outcomes achieved [17]. We think it would be useful to apply the data extraction tool developed by our team to some of the more contemporary multiple intervention programs targeting chronic illness. Our findings would provide a useful basis of comparison to determine whether or not there has been an improvement over the past decade in the reporting of information that is pertinent to the propositions. Before embarking on this step, it would be helpful to have further input on the data extraction tool, particularly from those who are involved in the development of new approaches to extract data on the processes of complex interventions with the Cochrane initiative [3].

Conclusion
Study findings suggest that limited reporting on intervention processes is a weak link in published research on multiple intervention programs in community health. Insufficient reporting prevents the systematic study of processes contributing to health outcomes across studies. In turn, this prevents the development and implementation of evidence-based practice guidelines. Based on the findings, and recognizing the preliminary status of the work, we offer two promising directions.

First, it is clear that a standard tool is needed to guide systematic reporting of multiple intervention programs. Such a tool could inform the design of such research, as well as ensure that important information is available to readers of this literature and to those conducting systematic analyses across studies. In addition, a research tool that describes best processes for interventions could benefit practitioners who are responsible for program design, delivery, and evaluation.

Second, the reasons for limited reporting on intervention processes need to be understood. Some issues to explore include the influence of publication policies of relevant journals, and the types of research questions and processes that are used.

It is through a more concerted effort to describe and understand the black box processes of multiple intervention programs that we will move this field of research and practice forward. It is our contention that a shift to more inclusive reporting of intervention processes would help lead to a better understanding of successful or unsuccessful features of multi-strategy and multi-level interventions, and thereby improve the potential for effective practice and outcomes.

Competing interests
The authors declare that they have no competing interests.

Authors' contributions
BR conceived of the study, managed the project, and was the lead writer. JM led development of the data extraction tool. OM led the description of results. NE conceived of the multiple interventions framework and co-developed the propositions with BR. All authors contributed substantively to the operational definitions, data extraction, and writing.
All authors have read and approved the final manuscript.

Acknowledgements
We wish to acknowledge the research internship that brought us together as a team – Dr. Nancy Edwards' three-month Research Internship in Multiple Interventions in Community Health [53]. Also, thanks to Ms. Christine Herrera and Heather McGrath for their research and technical support in preparation of this article. Contributions were supported by awards to Dr. Riley (personnel award from the Heart and Stroke Foundation of Canada and the Canadian Institutes of Health Research), Dr. Kothari (Career Scientist Award from the Ontario Ministry of Health and Long Term Care) and Dr. Edwards (Nursing Chair funded by the Canadian Health Services Research Foundation, Canadian Institutes of Health Research, and the Government of Ontario).

References
1. Oakley A, Strange V, Bonell C, Allen E, Stephenson J, Ripple Study Team: Process evaluation in randomized control trials of complex interventions. BMJ 2006, 332:413-416.
2. Rychetnik L, Frommer M, Hawe P, Shiell A: Criteria for evaluating evidence on public health interventions. J Epidemiol Community Health 2002, 56:119-127.
3. Armstrong R, Waters E, Moore L, Riggs E, Cuervo LG, Lumbiganon P, Hawe P: Improving the reporting of public health intervention research: advancing TREND and CONSORT. J Public Health (Oxf), in press. 2008 Jan 19.
4. Edwards N, Mill J, Kothari AR: Multiple intervention research programs in community health. Can J Nurs Res 2004, 36:40-54.
5. Ciliska D, Jull A, Thompson C, (Eds): Purpose and procedure. Evid Based Nurs 2008, 11:1-2.
6. Haynes B, Glasziou P, (Eds): Purpose and procedure. Evid Based Med 2007, 12:161.
7. Reid S, (Ed): Purpose and procedure. Evid Based Ment Health 2008, 11:1-2.
8. Merzel C, D'Afflitti J: Reconsidering community-based health promotion: promise, performance, and potential. Am J Public Health 2003, 93:557-574.
9. Edwards N, MacLean L, Estable A, Meyer M: Multiple intervention program recommendations for MHPSG technical review committees. Ottawa, Ontario: Community Health Research Unit; 2006.
10. Smedley B, Syme SL: Promoting health: intervention strategies from social and behavioral research. Am J Health Promot 2001, 15:149-166.
11. Brownson RC, Haire-Joshu D, Luke DA: Shaping the context of health: a review of environmental and policy approaches in the prevention of chronic diseases. Annu Rev Public Health 2006, 27:341-370.
12. Dooris M, Poland B, Kolbe L, deLeeuw E, McCall D, Wharf-Higgins J: Healthy settings: building evidence for the effectiveness of whole system health promotion – challenges & future directions. In Global Perspectives on Health Promotion Effectiveness Volume 1. Edited by: McQueen D, Jones C. New York: Springer; 2007.
13. Krieger N: Proximal, distal, and the politics of causation: what's level got to do with it? Am J Public Health 2008, in press. 2008 Jan 2.
14. Pawson R: Evidence-based policy: a realist perspective. Thousand Oaks: Sage; 2006.
15. Edwards N, Clinton K: Context in health promotion and chronic disease prevention. Background document prepared for Public Health Agency of Canada; 2008.
16. Greenhalgh T, Robert G, MacFarlane F, Bate SP, Kyriakidou O, Peacock R: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004, 82:581-629.
17. McLaren L, Ghali LM, Lorenzetti D, Rock M: Out of context? Translating evidence from The North Karelia project over place and time. Health Educ Res 2007, 22:414-424.
18. COMMIT Research Group: Community intervention trial for smoking cessation (COMMIT): summary of design and intervention. J Natl Cancer Inst 1991, 83:1620-1628.
19. Green LW, Glasgow RE: Evaluating the relevance, generalization, and applicability of research. Eval Health Prof 2006, 29:126-153.
20. CART Project Team: Community action for health promotion: a review of methods and outcomes 1990–1995. Am J Prev Med 1997, 13:229-239.
21. Best A, Stokols D, Green LW, Leischow S, Holmes B, Buchholz K: An integrative framework for community partnering to translate theory into effective health promotion strategy. Am J Health Promot 2003, 18:168-176.
22. Deschesnes M, Martin C, Hill AJ: Comprehensive approaches to school health promotion: how to achieve broader implementation? Health Promot Int 2003, 18:387-396.
23. Ebrahim S, Smith GD: Exporting failure? Coronary heart disease and stroke in developing countries. Int J Epidemiol 2001, 30:201-205.
24. Koepsell TD, Wagner EH, Cheadle AC, Patrick DL, Martin DC, Diehr PH, Perrin EB, Kristal AR, Allan-Andrilla CH, Dey LJ: Selected methodological issues in evaluating community-based health promotion and disease prevention programs. Annu Rev Public Health 1992, 13:31-57.
25. Dobbins M, Beyers J: The effectiveness of community-based heart health projects: a systematic overview update. Hamilton, Ontario: Effective Public Health Practice Project; 1999.
26. Goodman RM: Bridging the gap in effective program implementation: from concept to application. J Community Psychol 2000, 28:309-321.
27. Centers for Disease Control and Prevention (CDC): Best practices for comprehensive tobacco control programs. Atlanta, US: Department of Health and Human Services, CDC, National Center for Chronic Disease Prevention and Health Promotion; 1999.
28. Stokols D: Translating social ecological theory into guidelines for community health promotion. Am J Health Promot 1996, 10:282-298.
29. Richard L, Lehoux P, Breton E, Denis J, Labrie L, Léonard C: Implementing the ecological approach in tobacco control programs: results of a case study. Eval Program Plann 2004, 27:409-421.
30. Puska P, Tuomilehto J, Nissinen A, Vartiainen E, (Eds): The North Karelia project: 20 year results and experiences. Helsinki: National Public Health Institute; 1995.
31. Anderson LM, Brownson RC, Fullilove MT, Teutsch SM, Novick LF, Fielding J, Land GH: Evidence-based public health policy and practice: promises and limits. Am J Prev Med 2005, 28:226-230.
32. Pang T, Sadana R, Hanney S, Bhutta ZA, Hyder AA, Simon J: Knowledge for better health: a conceptual framework and foundation for health research systems. Bull World Health Organ 2003, 81:815-820.
33. Public Health Research, Education and Development Program: Effective public health practice project. Hamilton: McMaster University; 2005 [http://www.myhamilton.ca/myhamilton/CityandGovernment/HealthandSocialServiceResearch/EPHPP/AboutEPHPP.asp].
34. Kelder S, Perry C, Klepp K: Community-wide youth exercise promotion: long-term outcomes of the Minnesota heart health program and the class of 1989 study. J Sch Health 1993, 63:218-223.
35. Lando H, Pechacek T, Pirie P, Murray D, Mittelmark M, Lichtenstein E, Nothwehr F, Gray C: Changes in adult cigarette smoking in the Minnesota heart health program. Am J Public Health 1995, 85:201-208.
36. Luepker R, Murray D, Jacobs D, Mittelmark M: Community education for cardiovascular disease prevention: risk factor changes in the Minnesota heart health program. Am J Public Health 1994, 84:1383-1393.
37. Perry C, Kelder S, Murray D, Klepp K: Communitywide smoking prevention: long term outcomes of the Minnesota heart health program and class of 1989 study. Am J Public Health 1992, 82:1210-1216.
38. Nutbeam D, Catford J: The Welsh heart programme evaluation strategy: progress, plans and possibilities. Health Promot 1987, 2:5-18.
39. Nutbeam D, Smith C, Simon M, Catford J: Maintaining evaluation design in long term community based health promotion programmes: Heartbeat Wales case study. J Epidemiol Community Health 1993, 47:127-133.
40. Smail S, Parish R: Heartbeat Wales – a community programme. Practitioner 1989, 233:343-347.
41. Tudor-Smith C, Nutbeam D, Moore L, Catford J: Effects of the Heartbeat Wales programme over five years on behavioural risks for cardiovascular disease: quasi-experimental comparison of results from Wales and matched areas. BMJ 1998, 316:818-822.
42. Phillips CJ, Prowle MJ: Economics of a reduction in smoking: case study from Heartbeat Wales. J Epidemiol Community Health 1993, 47:215-223.
43. Directorate of the Welsh Heart Programme: Take Heart: A consultative document on the development of community-based heart health initiatives within Wales. Cardiff; 1985.
44. Dobbins M, Ciliska D, Cockerill R, Barnsley J, DiCenso A: A framework for the dissemination and utilization of research for health-care policy and practice. Online J Knowl Synth Nurs 2002, 9:1-12.
45. Riley B: OHHP: Taking Action for Healthy Living Local Reporting Forms. Data collection tools prepared for the Ontario Ministry of Health and Long-Term Care for monitoring and evaluating the OHHP (2003–2008); 2003.
46. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F: Implementation Research: A Synthesis of the Literature. Tampa: University of Florida; 2005.
47. Last J: A Dictionary of Epidemiology. Oxford: Oxford University Press; 2001.
48. Mancini JA, Marek LI: Sustaining community-based programs for families: conceptualization and measurement. Fam Relat 2004, 53:339.
49. McLeroy K, Bibeau D, Steckler A, Glanz K: An ecological perspective on health promotion programs. Health Educ Q 1988, 15:351-377.
50. Richard L, Potvin L, Kishchuk N, Prlic H, Green LW: Assessment of the integration of the ecological approach in health promotion programs. Am J Health Promot 1996, 10:318-328.
51. Rychetnik L, Hawe P, Waters E, Barratt A, Frommer M: A glossary for evidence based public health. J Epidemiol Community Health 2004, 58:538-545.
52. Mol A: Proving or improving: on health care research as a form of self-reflection. Qual Health Res 2006, 16:405-414.
53. Multiple interventions for community health [http://aix1.uottawa.ca/~nedwards]

