Qualitative process evaluation from a complex systems perspective: A systematic review and framework for public health evaluators

  • Elizabeth McGill,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    elizabeth.mcgill@lshtm.ac.uk

    Affiliation Department of Health Services Research and Policy, London School of Hygiene & Tropical Medicine, London, United Kingdom

  • Dalya Marks,

    Roles Formal analysis, Methodology, Supervision, Validation, Writing – review & editing

    Affiliation Department of Public Health, Environments and Society, London School of Hygiene & Tropical Medicine, London, United Kingdom

  • Vanessa Er,

    Roles Data curation, Formal analysis, Writing – review & editing

    Affiliation Department of Health Services Research and Policy, London School of Hygiene & Tropical Medicine, London, United Kingdom

  • Tarra Penney,

    Roles Data curation, Formal analysis, Writing – review & editing

    Current address: School of Global Health, Faculty of Health, York University, Toronto, Canada

    Affiliation MRC Epidemiology Unit, Centre for Diet and Activity Research (CEDAR), University of Cambridge, Cambridge, United Kingdom

  • Mark Petticrew,

    Roles Funding acquisition, Methodology, Supervision, Writing – review & editing

    Affiliation Department of Public Health, Environments and Society, London School of Hygiene & Tropical Medicine, London, United Kingdom

  • Matt Egan

    Roles Formal analysis, Funding acquisition, Methodology, Supervision, Validation, Writing – review & editing

    Affiliation Department of Public Health, Environments and Society, London School of Hygiene & Tropical Medicine, London, United Kingdom

Abstract

Background

Public health evaluation methods have been criticized for being overly reductionist and failing to generate suitable evidence for public health decision-making. A “complex systems approach” has been advocated to account for real-world complexity. Qualitative methods may be well suited to understanding change in complex social environments, but guidance on applying a complex systems approach to qualitative research remains limited and underdeveloped. This systematic review analyzes published examples of process evaluations that use qualitative methods and take a complex systems perspective, and it proposes a framework for qualitative complex systems process evaluations.

Methods and findings

We conducted a systematic search to identify complex systems process evaluations that involve qualitative methods by searching electronic databases from January 1, 2014–September 30, 2019 (Scopus, MEDLINE, Web of Science), citation searching, and expert consultations. Process evaluations were included if they self-identified as taking a systems- or complexity-oriented approach, integrated qualitative methods, reported empirical findings, and evaluated public health interventions. Two reviewers independently assessed each study to identify concepts associated with the systems thinking and complexity science traditions. Twenty-one unique studies were identified evaluating a wide range of public health interventions in, for example, urban planning, sexual health, violence prevention, substance use, and community transformation. Evaluations were conducted in settings such as schools, workplaces, and neighborhoods in 13 different countries (9 high-income and 4 middle-income). All reported some utilization of complex systems concepts in the analysis of qualitative data. In 14 evaluations, the consideration of complex systems influenced intervention design, evaluation planning, or fieldwork. The identified studies used systems concepts to depict and describe a system at one point in time. Only 4 evaluations explicitly utilized a range of complexity concepts to assess changes within the system resulting from, or co-occurring with, intervention implementation over time. Limitations of our approach include the restriction to English-language papers, reliance on study authors' reporting of their utilization of complex systems concepts, and the subjective judgments reviewers made about which concepts featured in each study.

Conclusion

This study found no consensus on what bringing a complex systems perspective to public health process evaluations with qualitative methods looks like in practice and that many studies of this nature describe static systems at a single time point. We suggest future studies use a 2-phase framework for qualitative process evaluations that seek to assess changes over time from a complex systems perspective. The first phase involves producing a description of the system and identifying hypotheses about how the system may change in response to the intervention. The second phase involves following the pathway of emergent findings in an adaptive evaluation approach.

Author summary

Why was this study done?

  • Process evaluations are used in public health to understand how and why an intervention works (or does not work), for which population groups, and in which settings.
  • Process evaluations often use qualitative methods—such as interviewing people and observing people in their daily and work routines—in order to draw their conclusions.
  • Researchers in public health have contended that we need to do research in a manner that considers the broader system in which policies and interventions take place—something we call a “complex systems perspective.”
  • To date and to our knowledge, there is no specific framework that describes how researchers can use a complex systems perspective when they conduct a process evaluation with qualitative methods.

What did the researchers do and find?

  • We conducted a systematic literature review that looked for examples of qualitative process evaluations that self-identify as using a complex systems perspective to evaluate public health interventions.
  • We found 21 different evaluations of many different types of public health interventions, including interventions to address student and employee health, sexual health, child development and safety, community empowerment, violence prevention, and substance use.
  • We found that these evaluations describe the systems in which public health efforts take place but are less effective at analyzing how changes affecting health occur within these systems.

What do these findings mean?

  • There is little evidence of a commonly shared understanding of how best to bring a complex systems perspective to process evaluations using qualitative methods, particularly how to assess the ways in which interventions interact with a changing system.
  • We developed a 2-phase framework to guide researchers who want to apply a complex systems perspective to qualitative process evaluations.
  • This review excluded studies that do not self-identify as using a complex systems perspective, so we may have missed literature that uses this perspective but not the associated terminology.

Introduction

There has been a growing call [1] for the application of complex systems approaches to intervention planning, service delivery, and evaluation in order to aid understandings of intervention implementation and impacts in real-world environments [2–4]. Complex systems have been framed as a kind of antidote to reductionist approaches to health research [5]. Finding ways to bring a complex systems perspective to public health evaluation could, it is hoped, shed new light on how to address public health challenges in a complex world. A complex systems perspective can be applied to many different types of research design and methodology. In this paper, we focus on how such a perspective has been applied to process evaluations that utilize qualitative methods. The remainder of this section elaborates on what is meant by complex systems and process evaluations and discusses why qualitative methods are a particular area of interest for public health evaluators interested in complex systems.

Complex systems

Systems are combinations of elements that interact. A distinction is often made between “complex” systems and systems that are “simple” or “complicated” [6–8]. What makes complex systems unique is a number of attributes, including nonlinearity, their dynamic and unpredictable nature, and the ways in which they co-evolve with their environment and produce emergent outcomes [9–11]. Elements within a complex system (for example, individuals, organizations, activities, and environmental characteristics) interact with each other and are connected in nonlinear ways [6,12–14]. Over time, the behavior of system elements leads the individual elements and the system as a whole to adapt and co-evolve with the broader environment—that is, the system is dynamic [6,7,12,13]. There may or may not be a central authority within the system, such as a president, local authority, or management team, but a complex system is assumed to adapt and behave in ways that cannot be reduced to simple organizational hierarchies. Because of this, a complex system and its elements are considered to be self-organizing [6]. The individual interactions among system elements collectively generate emergent, system-level behavior wherein the system displays attributes that cannot be reduced to its individual parts [2,6,12,15].

Research into complex systems takes place across academic disciplines and has roots in both systems thinking and complexity science. Although often grouped together because of some conceptual similarities, systems thinking and complexity science can be considered as distinct yet overlapping traditions [16,17]. Systems thinking may be best described as an orientation that prompts researchers to take a holistic, rather than reductionist, view of phenomena and study them in the context of their real-world systems, which are open to and interact with surrounding systems. Systems thinking draws on theories, concepts, and methods from a range of disciplinary fields [18]. Complexity science, on the other hand, is more strongly rooted in the mathematical sciences and has drawn on complexity theory, which emphasizes uncertainty and nonlinearity, to create and refine specific methodological approaches to modeling complex systems in order to estimate and predict their emergent behavior over time. Systems thinking prompts researchers and practitioners to consider the boundaries of the system they are studying or in which they are working [19] and places an emphasis on the interactions and relationships between system elements and of the system with its broader environment [1,6]. Further applying concepts from complexity science prompts a consideration of how those interactions create nonlinear chains of cause and effect, are unpredictable, unfold over time, and give rise to system-level emergent outcomes [20].

Complexity has been part of the vocabulary of public health evaluators for decades [16,21]. However, public health evaluations have tended to focus on the complexity of interventions rather than of the systems within which interventions are implemented [22]. A “complex intervention” is one that has a number of interacting parts, targets different organizational levels or groups of people, and aims to affect a number of outcomes [16,17]. In contrast, a complex systems perspective considers complexity as an attribute of the system. The intervention itself may also be complex, for example, a coordinated program of interventions that affect different parts of a system. However, simple interventions can also be theorized to have complex consequences if they are implemented within and interact with a complex system. For example, a single change in a law affecting the price of products that affect health (such as an alcohol or sugar-sweetened beverage tax) can be described as an (initially) simple intervention that quickly becomes connected to a complex chain of interactions between industry, retailers, public opinion, consumer behavior, media, and policy—each of which may have an impact on future implementation and effects of the intervention itself [15,23]. The way a complex system responds to an intervention may lead to emergent consequences that could amplify or dampen the intervention’s impacts, change the characteristics and behavior of the system over time, and affect future decision-making [15,24]. From a complex systems perspective, the role of the evaluator is to make sense of the interplay between the complex system and the (simple or complex) intervention to help explain health and other impacts and inform future decisions about implementation [1].

Process evaluations and qualitative methods

Traditional evaluations of simple or complex public health interventions often focus on measuring impacts on a single (or small number of) prespecified health and health-related outcomes [10]. However, impact evaluations alone offer little opportunity to explore the mechanisms behind an intervention’s success or failure, particularly when impacts are unevenly distributed among different population groups. For this reason, other forms of evaluation, particularly process evaluation, have been developed and utilized in order to understand intervention implementation and the mechanisms by which interventions may lead to impacts across a population [17,25]. There is no single definition of a process evaluation, but the Medical Research Council’s (MRC) Guidance on Process Evaluations of Complex Interventions argues that they “can be used to assess fidelity and quality of implementation, clarify causal mechanisms, and identify contextual factors associated with variation in outcomes” [26, p. 30]. A process evaluation is often, although not always, conducted alongside an outcome or impact evaluation that quantifies the impact of an intervention on a range of outcomes [16].

Process evaluations of public health interventions may benefit from an explicit adoption of a complex systems perspective. The application of systems thinking and insights from the complexity sciences can provide a means through which to evaluate and understand the nonlinear ways in which interventions may lead to a number of impacts within a system. This could include impacts considered to be of interest when the evaluation is initially planned and impacts that emerge as potentially important as the evaluation progresses. By bringing an explicitly relational focus to the evaluation design and placing the wider context in the foreground of the analysis [24], a complex system approach to a process evaluation may help to make sense of intervention mechanisms within a real-world context. An explicit complex systems perspective may also help evaluators construct a narrative that explores the trajectory of a given system. This could include considering how the intervention acts as an event that prompts a series of changes in the way a complex system behaves [15]. Furthermore, it could include consideration of how the intervention itself changes, as system elements and the system as a whole adapt and respond to it [15,24].

Although process evaluations can include quantitative assessments of intervention outputs, they typically draw on a range of qualitative methods. Qualitative methods are well suited for unpacking complex causal chains, understanding changes in implementation, representing varying experiences of the intervention, and generating new theories to inform future decision-making [17]. Proponents of explicitly using complexity theory within qualitative designs argue that doing so “has potential to capture and understand complex dynamics that might otherwise be unexplored” [27, p. 3]. Bringing a complex systems perspective to a qualitative process evaluation could have a range of methodological implications. For example, it could involve mapping the system of interest, a sampling strategy that seeks to recruit participants relevant to different parts of that system, a form of data collection geared towards assessing relationships within a system, and an analysis framework that incorporates concepts drawn from systems thinking and complexity science.
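To make one of these implications concrete, such a sampling strategy might be operationalized as a purposive frame organized by system level rather than by individual role. The following minimal sketch (in Python, using invented system levels and roles rather than examples from any study discussed here) illustrates the idea:

```python
# A hypothetical purposive sampling frame organized by system level.
# Levels and roles are illustrative assumptions, not drawn from the review.
sampling_frame = {
    "policy level": ["local authority officer", "public health commissioner"],
    "delivery level": ["programme manager", "frontline practitioner"],
    "community level": ["resident", "local business owner"],
}

TARGET_PER_LEVEL = 4  # recruit per system level, not per individual role

for level, roles in sampling_frame.items():
    print(f"{level}: recruit >= {TARGET_PER_LEVEL} participants across {roles}")
```

Sampling by level in this way helps ensure that data collection reaches different parts of the mapped system rather than clustering around the most accessible actors.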

There is a large body of literature on quantitative methods for complex systems approaches and some examples of such methods being applied to the study of policies and interventions that may affect population health [28–33]. Many of these approaches build simulation models that estimate and predict the impact of interventions on outcomes of interest [34]. These approaches have been developed within the complexity sciences and include methods such as system dynamics modeling, microsimulation modeling, and agent-based modeling [3,20,35,36]. Although these methods may begin with some qualitative work, such as participatory workshops to map a system of interest, their aim is to generate quantitative estimates of future or hypothetical impacts [31]. Compared with quantitative methods, there is little consensus, and less has been written, on how to explicitly draw on a complex systems approach for process evaluations that use qualitative methods. This represents an underdeveloped area for complex systems evaluation.

This systematic review therefore aimed to identify the concepts and methods currently used in public health evaluations that apply a complex systems perspective to process evaluations involving qualitative methods. Specifically, this review sought to answer 3 research questions: (1) What types of public health interventions have been subjected to process evaluations that use qualitative methods and apply a complex systems perspective? (2) What are the qualitative methods used in this body of literature? (3) What concepts and theories associated with complex systems are used in process evaluations that use qualitative methods? Drawing on this body of literature, we then had a secondary aim of developing a framework for qualitative process evaluation from a complex systems perspective. We sought to develop an evaluative framework that researchers (working in academic or practice settings) can use as an overarching structure to guide evaluative efforts [37]. In our Discussion section we therefore present our framework and provide some guidance for researchers on the potential role of qualitative data in identifying and understanding aspects of complexity within process evaluations.

Methods

Data sources and screening

Relevant process evaluations were identified through several different search methods. First, we conducted an expert consultation whereby we contacted 32 academics with an interest or experience in complex systems thinking and its application to public health and asked them to identify any relevant examples of complex systems evaluations. The academics were identified through an ongoing familiarization with the literature on complex systems and public health, as well as through our own professional networks. In the original consultation, we did not request permission to be named, but those who did provide permission during the review process are named in the Acknowledgments. We then identified 2 relevant systematic reviews on systems thinking and public health [35] and complexity theory applied to evaluation [20]. From the studies identified in these reviews, we selected evaluations that met our inclusion criteria (described below). Finally, we conducted an electronic search covering January 1, 2014–September 30, 2019 using 3 databases: Scopus, MEDLINE, and Web of Science. The search dates were set to capture evaluations published after the 2 systematic reviews. The electronic search strategy included terms and synonyms for systems thinking, complexity science, evaluation, and public health and was restricted to English-language publications. An example of the full search strategy can be found in S1 Text. This study is reported as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guideline (S1 PRISMA Checklist).
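To illustrate the shape of such a strategy, the Boolean logic can be sketched as three AND-combined concept blocks. The snippet below is a hedged reconstruction with invented search terms, not the authors' actual strategy, which is reported in S1 Text:

```python
# Hypothetical reconstruction of a three-block Boolean search strategy.
systems_terms = ["systems thinking", "complexity science",
                 "complex adaptive system*", "complexity theory"]
evaluation_terms = ["process evaluation", "evaluat*"]
health_terms = ["public health", "population health", "health promotion"]

def or_block(terms):
    """Join synonyms for one concept into a parenthesized OR block."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Concept blocks are combined with AND so results must match all three.
query = " AND ".join(or_block(block) for block in
                     (systems_terms, evaluation_terms, health_terms))
print(query)  # paste into the advanced search of Scopus, MEDLINE, or Web of Science
```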

Titles and abstracts were screened initially by one reviewer, and all potentially relevant studies were independently screened by 2 reviewers. In cases in which a decision was not clear cut, or the reviewers disagreed, a discussion was held with a third reviewer. The review had 4 inclusion criteria, which we describe in more detail next. In brief, studies were included in the review if they (1) self-identified as taking a systems- or complexity-informed approach; (2) were relevant to public health; (3) were process evaluations of interventions with empirical findings; and (4) utilized qualitative methods.

Studies were eligible for inclusion if they self-identified as using a systems and/or complexity perspective at any stage of the evaluative process, including during the design, data collection, analysis, or interpretation phases.

We took a broad view of public health to include upstream determinants of population health, which include alcohol, the built environment, community health, community safety, education, employment, environmental health, food, health promotion, housing, illicit substances, obesity, policing, regeneration, sexual health, social welfare, tobacco, trading standards, transport, and urban planning. Studies that covered topics not included in the aforementioned list were considered if they concerned population health; decisions in these instances were made between 3 reviewers. Studies concerning treatment in health service settings were excluded.

Studies were only included if they reported empirical findings of a process evaluation; protocols and discussion pieces describing evaluations without presenting results were excluded. Process evaluations alongside outcome evaluations were eligible for inclusion, although our analysis focused solely on the process evaluation component.

Finally, studies were eligible for inclusion if they used qualitative methods, which included interviews, group interviews or focus group discussions, (participant) observation, document review, free-form responses on questionnaires, and participatory and visual methods, including, for example, mapping workshops and photography. Evaluations employing mixed methods (wherein qualitative data were integrated into the assessment of the intervention alongside other methods) were included, as long as there was a substantive component that generated and analyzed qualitative data. To operationalize this criterion, we considered the ways in which the mixed methods research was designed, and we included studies that generated qualitative and quantitative data concurrently to evaluate an intervention (triangulation design); studies in which the researchers primarily utilized a qualitative design with some supporting quantitative output or outcome data (embedded design); studies in which the qualitative data were used to make sense of intervention outcomes (explanatory design); or studies in which qualitative research was used to generate hypotheses about the intervention that could be tested quantitatively (exploratory design) [26,38]. Studies utilizing these mixed method designs were eligible for inclusion even if the authors did not label the design or describe the rationale for the chosen approach. A substantive qualitative component referred to the authors both describing the qualitative methods, including data collection and analysis, as well as presenting qualitative data. Covidence software was used to help facilitate the screening process [39].
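Viewed procedurally, the 4 criteria act as a conjunctive checklist: a study is included only if every criterion holds. The sketch below (with hypothetical field names; actual screening was performed by 2 reviewers in Covidence, with a third reviewer resolving disagreements) makes this explicit:

```python
# A minimal sketch of the review's four inclusion criteria as a checklist.
# Field names are hypothetical; real screening decisions involved judgment.
from dataclasses import dataclass

@dataclass
class Study:
    systems_or_complexity_informed: bool     # criterion 1
    public_health_relevant: bool             # criterion 2
    empirical_process_findings: bool         # criterion 3
    substantive_qualitative_component: bool  # criterion 4

def include(study: Study) -> bool:
    """All four criteria must hold; borderline cases go to a third reviewer."""
    return all((
        study.systems_or_complexity_informed,
        study.public_health_relevant,
        study.empirical_process_findings,
        study.substantive_qualitative_component,
    ))

print(include(Study(True, True, True, True)))   # included
print(include(Study(True, True, False, True)))  # excluded: a protocol, say
```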

Data extraction and synthesis

The analysis began with an in-depth reading of, and familiarization with, the included studies, with specific attention paid to the ways in which they drew on systems thinking and/or complexity science and the methods utilized to achieve their evaluative aims. Data were extracted on each study using a template designed for this review. Specifically, data on the study’s research question, public health area, country, intervention, the application of complex systems thinking, the methods and analytical approach, and system map (if presented) were extracted (see Table 1). The “complex systems perspective and evaluation stage” column shows how systems thinking and/or complexity science featured in each evaluation and at which stage in the evaluation (i.e., design, data collection, analysis). The system map column reports the studies that included a map of the system and describes what the map detailed. If the evaluators published a logic model, it is noted in this column. Where studies gave rise to more than one publication, we considered them “linked” and extracted data from across the identified studies. The data extraction process was completed by one reviewer and double-checked by a second.

Alongside the data extraction process, a list of concepts from systems thinking and complexity science was generated through an ongoing familiarization with these bodies of literature. A number of papers and books that are frequently referenced within the public health literature on complex systems were selected during this familiarization period [1,6,7,9,12,15,22,40], and from this, a master list of systems and complexity terms was generated. Our aim was that this list captured the key principles associated with each of the traditions and could be used by those wishing to gain a familiarization with systems thinking and complexity science. We found that not all authors describe the same concepts within these traditions, and they often use different language. As a result, there was a subjective element to generating the list, with the research team making choices about which concepts to feature and how to define them. In particular, although many authors describe “context” as a key systems thinking concept, and we initially also included it in our list, we ultimately chose to exclude it due to its substantial overlap with many other concepts. “Context” describes the factors in the environment that affect the system, particularly historical, temporal, geographical, political, and social factors [13]. As a result, arguably the entire system represents the “context,” and it therefore does not represent a meaningful category when trying to describe and analyze a changing system. In addition, we recognize that there is conceptual overlap between many of the concepts and that the boundaries between them may be somewhat fluid. A glossary of these terms, and how they might be applied within a process evaluation using qualitative methods, is presented in the Discussion section.

Critical appraisal

No tools exist to assess the quality of process evaluations informed by a complex systems perspective. Therefore, for this review, we critically appraised how systems thinking and complexity science were employed in each paper. Specifically, we assessed the degree to which each included study described, captured, measured, or applied each concept in a meaningful way. The decisions were depicted using a traffic light color scheme. A green color code was applied when a study explicitly applied a concept at any stage of the evaluation process, including the design and planning stage, data collection, analysis, or interpretation. For example, a study would receive a green code if it explicitly described the boundaries of the system under inquiry at any stage in the evaluation. Evaluators might use the idea of boundaries, for instance, to shape the evaluation scope by designating clear system boundaries to bound the evaluation, or the concept might be applied within the interpretation of the data, to gain, for example, an understanding of how system elements view the boundaries of their own system. A yellow coding represented a study in which there was some attempt to apply a concept, but it was limited or addressed in an implicit manner. A red color code represented instances in which the concept was not utilized. The aim of this appraisal was not to be overly critical about individual studies but rather to understand the ways in which concepts from systems thinking and complexity science are applied in this body of literature. This process required us to make judgments, and in some instances, the decisions were not necessarily clear cut. In order to increase the validity of this process, 2 reviewers (EM and DM; or EM and ME) independently assessed each study, and disagreements were reconciled through discussion.
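As an illustration of the record-keeping underlying this appraisal, each rating can be stored as a mapping from reviewer, study, and concept to a traffic light color, with disagreements flagged for reconciliation. The sketch below uses invented study and concept identifiers; the full coding appears in Fig 2:

```python
# A minimal sketch of the traffic-light appraisal records; study IDs and
# the concept list shown here are hypothetical.
from enum import Enum

class Rating(Enum):
    GREEN = "explicitly applied"
    YELLOW = "limited or implicit application"
    RED = "not applied"

# reviewer -> study -> concept -> rating
appraisals = {
    "EM": {"study_A": {"boundaries": Rating.GREEN, "feedback": Rating.YELLOW}},
    "DM": {"study_A": {"boundaries": Rating.GREEN, "feedback": Rating.RED}},
}

def disagreements(r1, r2):
    """Yield (study, concept) pairs the 2 reviewers must reconcile by discussion."""
    for study, concepts in appraisals[r1].items():
        for concept, rating in concepts.items():
            if appraisals[r2].get(study, {}).get(concept) != rating:
                yield study, concept

print(list(disagreements("EM", "DM")))  # -> [('study_A', 'feedback')]
```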

Results

Evaluation characteristics

A total of 21 unique evaluations (in 25 separate publications) were identified (see Fig 1). Their characteristics are presented in Table 1, and in-depth descriptions of 2 evaluations, one rooted in systems thinking [41] and another in complexity science [42], are presented in S2 Text. The in-depth descriptions were written to give clear examples of how these approaches have been applied in practice. A range of public health topics were represented in the sample, including social work [43–45], school health [41,46–50], workplace health [51,52], sexual health [48,53,54], health equity policy [55,56], urban planning [57,58], substance use [53,56,59–61], child development [62], public–private partnerships [63], community empowerment and transformation [42,64], and violence prevention [65]. The studies were conducted in 13 countries, which included 9 high-income and 4 middle-income settings: Australia [51,52,57,65], Brazil [55], Chile [62], El Salvador [54], Finland [56], India [62], Israel [43,44], the Netherlands [46,47,58], New Zealand [50], South Africa [48,62], Sweden [59], the United Kingdom [41,42,49,60,61,63,64], and the United States [45,52].

The primary studies in this review were notable for their diversity in terms of the theories and frameworks used to inform the evaluation design and the focus of the analysis. Prominent theories included explicit applications of complexity theory [42,50,60] and diffusion of innovation theory [49]. Studies also used a number of frameworks to structure the analysis and to draw out evaluative findings. This included existing frameworks such as the Cynefin framework [48], Consolidated Framework for Implementation Research [59], a complex adaptive systems framework [54,62], and the socioecological model [41]. Other evaluations featured bespoke frameworks for analysis, including ones that focused on the role of critical events in an intervention’s trajectory [55], a systems framework focusing on governmental subsystems [56], and a framework that was used to identify and categorize different types of “by-effects” or unintended consequences [58].

The process evaluations in this literature base varied in terms of the stage of evaluation planning and conduct in which they drew on complex systems thinking concepts and frameworks. Although the reporting was not always clear, 14 evaluation teams used some facets of systems thinking and complexity science when planning and designing their evaluations [41–47,49,51–53,57,58,60–62,64,65], which ranged from asking systems-oriented research questions to informing the sampling strategy (e.g., a conscious effort to sample different elements or from different levels within the system) and data collection tools (i.e., interview topic guides). Other evaluators used complex systems concepts, theories, or frameworks solely to structure their analyses [48,50,54–56,59,63].

The evaluations identified also drew on a wide range of qualitative methodologies. Ten studies applied a case study design [41–45,50–52,56,60–62,64]. The nature and boundary of a case varied from evaluation to evaluation. Some studies (n = 3), for example, defined a case based on geographical boundaries, and each case represented a geographical locality [42,60,61,64]. Other case study examples included individual families [43,44] or schools [41] or the specific application of a policy [56].

Evaluators utilized a number of different methods for data collection, and 13 applied a mixed methods approach, which included using multiple qualitative data collection methods [41–45,48–50,55–58,62,64]. Seven studies employed a mix of qualitative and quantitative methods [46,47,51–53,59–61,63,65], although all of these studies had substantive qualitative findings. Not all evaluators articulated their rationales for choosing and combining certain qualitative methods, but in general, the different methods were employed to access, understand, and analyze different elements, structures, and relationships within the system. For example, speaking to a range of different actors within the system, through interviews (semi-structured, in-depth, or narrative) and focus groups [41–65], was used to assess different perspectives about an intervention, relationships, and theories of change within the broader system and to make sense of system trajectories. Documentary review and analysis were also relatively common, being used in 7 studies [41,43,44,46,47,50,62–64], and a range of documents were reviewed, including media reports, community plans, evaluation documents, and case reports. Documents were used to understand intervention development and implementation and to generate data at different levels within systems, for example, with some evaluators choosing to review national-level documentation and subsequently conduct regional- or local-level interviews [41]. Seven of the evaluations identified also conducted both participant and nonparticipant observation, which ranged from observations of meetings to community events [41–44,46,47,49,51,52,64]. In addition to these researcher-led qualitative methods, some evaluators (n = 10) utilized more participatory research techniques, including research seminars and workshops, mapping exercises, the creation of intervention timelines, and other types of group exercises [41,42,48,55,64]. Participatory methods were utilized both as a means of bringing in the perspectives of those affected directly by the intervention and as a method to check and present interim findings.

Several of the identified process evaluations were conducted alongside or after impact/outcome evaluations of the same intervention. Knai and colleagues integrated data from several evaluative strands, including impact and process evaluations [63]. Five studies reported accompanying outcome evaluations, but those results were not presented alongside the process evaluation reports [43,44,46,47,59,64]. Three studies presented outcome data alongside their process evaluations [50–52,60,61]. Finally, 2 papers reported independent outcome evaluations that were not linked to their own process evaluations [49,58].

The identified evaluations varied in the extent to which they produced and utilized system maps; 11 produced system maps of some description [41,46–48,51–57,60,63]; of these, only one used a formal system mapping technique: a causal-loop diagram [63]. The other system maps were bespoke maps that depicted different types of logic models [60,63], maps of the system structure [41,53,54], and maps that showed interactions between system elements [51,54,55,57].

Application of concepts from systems thinking and complexity science

Evaluations varied in the extent to which they applied concepts from systems thinking and complexity science to their evaluation design or analysis, and concepts from systems thinking were utilized to a far greater extent than complexity concepts. Fig 2 shows this using a traffic light coloring scheme. The figure is structured with different concepts from systems thinking and complexity science in each of the columns. The concepts are presented as belonging along a continuum, with systems thinking on the far left-hand side and complexity science on the far right-hand side. Moving along the spectrum, from systems thinking to complexity science, represents a movement from static to dynamic. Key systems thinking concepts, on the left-hand side of the figure, are the structure of a system, its elements, and the relationships between them. Utilizing these allows researchers to create relatively static depictions of a system. Moving toward the middle of the figure, concepts from complexity science are introduced, which include attributes and dimensions of an intervention, and then a system undergoing change. The far right-hand side of the figure includes concepts that feature within the complexity sciences to computationally model complex systems in order to simulate and predict behavior and outcomes and to understand an evolving system.

Fig 2. Included studies and the degree to which they apply concepts from systems thinking and complexity science.

Each color-coded circle denotes the degree to which an evaluation applied the associated concept to any stage of the evaluation process. Green: study explicitly applied the concept; yellow: study attempted, or implicitly applied the concept; red: concept was not applied.

https://doi.org/10.1371/journal.pmed.1003368.g002

The evaluations identified in this review consistently applied key concepts from systems thinking: the identification and description of the system structure, including the different system elements and their differing perspectives. Thinking systemically also means making sense of the boundaries of a system and making decisions about what constitutes “the system” and what might be considered within or outside of it. Although system maps are not a necessary element of systems thinking, they can be helpful for making sense of and depicting system boundaries, as articulated by both those acting within the system (“first-order” boundary judgments) and those studying it (“second-order” boundary judgments) [66]. Few evaluations (n = 3) in the sample [42,45,64] had explicit discussions of boundaries and the ways in which, or indeed if, boundary judgments were made. By contrast, 11 studies produced some form of system diagram [41,46–48,51–57,60,63], implying that boundary judgments were likely at least implicitly considered by evaluators. The identified papers focused analytically on the relationships between system elements. Such a focus is understandable and, indeed, a prerequisite for being labeled a systems approach; without a focus on relationships and interactions—the key tenet of systems thinking—the approach fails to be systemic.

Somewhat surprisingly, only 4 evaluations explicitly utilized a range of complexity concepts to assess changes within the system resulting from, or co-occurring with, intervention implementation over time [42,46,47,50,55]. By their nature, public health problems and the systems in which they are created and shaped are complex [40], and as a result, we might expect to see a more explicit attempt to use complexity concepts to generate evidence on public health interventions. Complexity science introduces a number of additional concepts that may be of value to researchers who seek to evaluate the mechanisms by which public health interventions have impacts in real-world environments. These concepts are used to describe, analyze, measure, and estimate attributes of change. The change first occurs within and across the system elements, and these collective changes result in emergent system change.

In the body of literature identified in this review, concepts from the complexity sciences, such as those that are used to understand change within systems, were utilized less frequently compared with concepts that could be used to describe static “snapshots” of systems. Although some papers were notable for applying a number of complexity concepts [42,46,47,50,55], the majority drew on only a few complexity-informed concepts in order to describe key mechanisms that might drive system change, such as a feedback loop. Researchers did not always provide a rationale for how the concepts had been chosen or specifically considered within the context of data collection and analysis. An exception to this was one study that created an explicit analytic framework to identify and explain a range of by-effects (unintended consequences stemming from an intervention) [58]. The framework categorized policy achievements as foreseen or unforeseen and desired or undesired [58]. Within the evaluations identified, the complexity concepts that were most frequently used included nonlinearity, feedback, and adaptation.

Discussion

We conducted a systematic search to identify examples of public health evaluations that apply a complex systems perspective to process evaluations involving qualitative methods. We then reviewed the systems and complexity concepts and methods currently used in this literature and found that evaluations of this nature draw on systems thinking to describe and analyze a system’s structure at one point in time, whereas fewer draw on concepts from complexity science to assess change in a system over time.

We identified evaluations of a wide range of interventions affecting population health or its social determinants. These include interventions in school, workplace, and neighborhood settings in high- and middle-income countries, addressing behavior change, urban planning, community empowerment, health policy, and public–private partnerships. Public health process evaluations with a complex systems perspective have roots in a range of different disciplines and draw on a number of theories and frameworks to understand intervention implementation in real-world settings. The kinds of qualitative methods used in the included studies are in many ways similar to those found in other (i.e., not focused on complex systems) forms of qualitative research: for example, in-depth and semi-structured interviews, focus groups, document review, and participatory methods. As such, the methods are not particularly novel; rather, this body of literature is characterized by existing tools being paired with a complex systems perspective.

Just over half of the included studies (11 of 21) produced some form of visual representation of the system they sought to describe. In most cases, these maps did not use formal system mapping techniques, and the diagrams varied greatly from study to study. Concepts associated with complex systems also seemed to be applied by many of the included studies in an ad hoc manner, rather than drawing from established theories and frameworks associated with the complex systems literature. Most studies claimed that their systems perspective was planned at the design stage of their evaluation, but few reported basing their approach around an established systems theory or framework [42,48,50,54]. Evaluators’ attempts to utilize a complex systems perspective were most evident in the analysis stage of included studies, typically in the form of concepts from systems thinking and (less frequently) complexity science referred to in the analysis of qualitative data.

Included papers primarily utilized concepts from systems thinking to produce relatively static descriptions of systems and the interventions introduced within them. Although most evaluations concerned themselves to some degree with understanding mechanisms of, or barriers to, change, many did not make extensive use of the conceptual tools associated with complexity science that could help them better understand and unpack changes to the system of interest. In addition, although the evaluations identified in this body of literature drew on a range of qualitative methods, with many evaluators using a mix of qualitative methods within one evaluation design, it was often unclear why certain methods were chosen and what value each method added.

From this summary of the review’s main findings, we suggest that approaches to designing, conducting, and reporting qualitative process evaluations that have a complex systems perspective are frequently underdeveloped and poorly specified. It is unclear to what extent systems thinking and complexity science influenced the key evaluation stages of study design, sampling, and data collection. The underlying theories informing evaluations are often unclear. The tendency to focus on systems concepts that describe a static system, rather than those best suited for assessing system change, seems counterintuitive, given that process evaluations are intended to assess mechanisms of change. We note that this rather critical assessment applies to many but not all of the studies we identified.

We would argue that all these studies are, in a sense, finding their way within an emerging field in which standards of best practice have yet to be established. We also believe that a contribution to the field would be a framework that seeks to address some of the problems identified in this review. Several authors have noted that although there are growing calls to utilize a complex systems approach, there have been fewer attempts to describe specific approaches or frameworks for doing so [35,71]. In particular, we advocate integrating a complex systems approach at the beginning of an evaluation design, to ensure that the perspective informs the evaluators’ theoretical position, the evaluation focus, sampling strategy, data collection methods, analysis, and interpretation of findings.

In order to advance this area of public health evidence generation, we now consider some potential ways forward by proposing a framework for qualitative process evaluations from a complex systems perspective. Fig 3 shows our proposed evaluation framework, which involves 2 distinct phases. The first phase is intended to produce a static system description at an early time point. This is then followed by a second phase focused on analyzing how that system undergoes change. Specific steps in the evaluation are shown in the squares with directions and prompts to the evaluators at each step provided in italics. The figure underscores the ways in which the outputs of Phase 1 inform the direction and scope of inquiry during Phase 2. Table 2 also shows the role of qualitative methods in a process evaluation and how these map onto the application of concepts from systems thinking and complexity science.

Fig 3. Framework for a process evaluation from a complex systems perspective.

Evaluation stages are shown in squares; the italicized font provides directions and prompts for evaluators at each stage.

https://doi.org/10.1371/journal.pmed.1003368.g003

Table 2. Applying concepts from systems thinking and complexity science in a process evaluation.

https://doi.org/10.1371/journal.pmed.1003368.t002

Phase 1: A static system description

In the first part of this 2-phase framework, we propose that evaluators conduct a period of research in order to gain an initial understanding of the system, including the system structure, the boundaries, the constituent elements, and the relationships between these [6,14] at a given time point [24]. This description represents a snapshot of the system at one point in time. For many evaluators, it may make sense to capture the “initial conditions” or “initial state” of the system at the time the intervention is first implemented. In these cases, the evaluation would involve a period of familiarization and the first part of data collection as the intervention is being implemented or shortly thereafter. In this stage, evaluators would also begin to hypothesize some of the ways that the intervention may lead to change within the system (which may be informed by the intervention’s theory of change, if one is articulated). If the intervention designers have not described a theory of change, evaluators at this stage should articulate one by mapping out the initial hypotheses of system change.

In Phase 1, evaluators would begin to make sense of and document the “local rules” that govern both the intervention and the system, including the rules that govern how different system elements interact and relate to each other and how the intervention operates and relates to different parts of the system. In undertaking Phase 1, evaluators would draw on concepts that are most closely aligned with systems thinking (the left-hand side of Fig 2 and first half of Table 2) and use these to structure the initial data collection and analysis. Following the identification of the system structure, elements, boundaries, and relationships, evaluators should begin to consider some of the ways in which the intervention may lead to changes within the system. Evaluators could ask how the system elements respond to the intervention, comparing different stakeholder perspectives. Evaluators could also begin to assess system coherence by analyzing the degree to which the intervention is aligned with the interests of those in the system or the instances in which the intervention may “swim against the tide” [72,73].

In Phase 1, data should be collected from a range of different actors within the system. Evaluators may find a number of different data collection methods useful, including, but not limited to, an initial documentary review, interviews, and workshops. The boundary decision and the identification of system elements will inform from whom data are collected and through which methods [14].

As part of this process and as a way of analyzing the data collected in Phase 1, it may be helpful to create a map of the system. The type of map created will depend on the role it is to play in the evaluation. For example, if a map is made to visually represent the system structure and boundaries to help depict and understand the system structure and relationships between the system elements [57], it may be created through a semi-structured brainstorming session or interviews and the analysis of the data collected in Phase 1. Alternatively, evaluators may choose to create more structured system maps, drawing on established mapping methods, such as concept mapping or group model building, in order to map out causal linkages between system variables [74]. In these instances, Phase 1 represents an opportunity for initial preparatory work for the map creation process.
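To make the mapping step concrete, a system map can be encoded as a signed directed graph in the spirit of a causal-loop diagram, which also allows feedback loops to be identified mechanically. The sketch below is illustrative only, using a hypothetical alcohol-pricing system rather than a map from any included study:

```python
# A minimal causal-loop-diagram sketch: nodes are system variables and each
# edge carries a sign ("+": change in the same direction, "-": opposite).
import networkx as nx

G = nx.DiGraph()
edges = [  # hypothetical alcohol-pricing example
    ("minimum unit price", "alcohol affordability", "-"),
    ("alcohol affordability", "consumption", "+"),
    ("consumption", "alcohol-related harm", "+"),
    ("alcohol-related harm", "public support for policy", "+"),
    ("public support for policy", "minimum unit price", "+"),  # closes a loop
]
for src, dst, sign in edges:
    G.add_edge(src, dst, sign=sign)

# A loop with an odd number of "-" links is balancing; otherwise reinforcing.
for loop in nx.simple_cycles(G):
    signs = [G[a][b]["sign"] for a, b in zip(loop, loop[1:] + loop[:1])]
    kind = "balancing" if signs.count("-") % 2 else "reinforcing"
    print(kind, "loop:", " -> ".join(loop))
```

Encoding the map this way keeps the Phase 1 output revisable: as new elements and relationships surface, edges can be added and the loop analysis rerun.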

The output of Phase 1 would be relatively descriptive and static: a qualitative description of the system structure, elements, boundaries, and relationships, which may well be depicted on a map, as well as some hypotheses about how the intervention may lead to system change, including the ways in which the elements and the system as a whole adapt and co-evolve in response. The hypotheses of system change may be depicted as a theory of change, which maps out how the intervention could lead to impacts, with particular consideration given to the pathways and mechanisms by which that change is brought about [6]. The initial system description and possible pathways for system change would then inform Phase 2.

Phase 2: A system undergoing change

The second phase of evaluation would examine emergent properties of the system and explore system change stemming from the intervention, drawing on a complexity perspective. In Phase 2, evaluators should be prepared to follow the pathway of emergent findings. In this sense, the evaluation needs to be adaptable, flexible, and agile; incorporate multiple perspectives; and deal with uncertainty to support real-time decision-making. Evaluators would use the data collected in Phase 1 (particularly the emerging hypotheses about system change) to develop specific research questions about the intervention and the system. In defining the research questions, there is an opportunity to explicitly apply some of the complexity concepts—for example, by asking questions about the adaptive responses within different elements of the system, unintended consequences of the intervention for different population groups, or emergent system outcomes as the system co-evolves with its broader environment. It is not our suggestion that evaluators attempt to apply all complexity concepts to any one evaluation but rather focus on those that can generate useful evidence for decision-making [71]. Although the timing of Phase 2 may be determined by the theory of change, it may also be influenced by the timing of other types of data collection. For example, the process evaluation may accompany an impact evaluation that prespecifies time points for data collection [16,17].

At this stage, a more formal period of sampling and data collection would begin, to complement data collected in Phase 1 and to focus the sampling and data collection strategies to better answer the research questions. The specific sampling strategy and data collection methods will vary from evaluation to evaluation, but any process evaluation applying a complex systems perspective would sample multiple types of participants (e.g., different system elements) and use multiple methods [6,66]. As the papers in this review underscore, the careful use and reporting of different qualitative methods underpinned by complex systems theoretical principles can help an evaluator assess different perspectives across and within system levels, as well as different types of information [27]. Analyzing data generated through different qualitative methods can be used to bring a dynamic component to the evaluative research; for example, documents can be used to understand previous decisions and interviews or observations could then be used to understand the trajectory of those decisions and their impact across the system on different population groups [27]. Evaluators should consider the timing and ordering of mixed methods; a document review might, for example, provide important context in order to inform interview schedules [27]. Complexity concepts have traditionally been used within the context of quantitative and modeling methods. However, we argue that there is no reason that these concepts should not be of interest within a process evaluation using qualitative methods, particularly as many deal specifically with system changes upon which qualitative research could shed light [41,48].

During the analysis stage, the evaluators would begin to make sense of the emerging findings through the application of relevant complexity concepts. For example, an evaluation concerned with understanding the ways in which the intervention may lead to the amplification or dampening down of certain kinds of systemic change would have an explicit focus on identifying feedback loops within the system [75], or it might make sense (based on hypotheses generated in Phase 1) to focus the analysis on understanding how the system’s history influences its trajectory and adaptation in response to the introduction of an intervention [76]. As the analysis is undertaken, there is likely to be a need to collect more data, in a kind of evaluative feedback loop. Such a process will be familiar to those who apply iterative research designs [17,77]. Throughout the analysis, evaluators would revisit, revise, and refine the theory of change and system map in light of the new data.
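As a simple illustration of this analytic step, coded excerpts can be tallied against the complexity concepts they instantiate, making visible which mechanisms the data speak to and where further data collection is needed. The transcripts, excerpts, and codes below are invented:

```python
# A minimal sketch of tallying complexity-concept codes across qualitative data.
from collections import Counter

# (transcript_id, excerpt, concept) triples produced by qualitative coding;
# all entries here are hypothetical.
coded_excerpts = [
    ("int_01", "retailers changed prices before the law took effect", "adaptation"),
    ("int_02", "media coverage increased support, which widened uptake", "feedback"),
    ("int_02", "the scheme produced benefits nobody had planned for", "emergence"),
    ("int_03", "a small early change snowballed across the network", "nonlinearity"),
]

by_concept = Counter(concept for _, _, concept in coded_excerpts)
for concept, n in by_concept.most_common():
    print(f"{concept}: {n} excerpt(s)")
```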

Generating outputs can be a challenge for public health evaluators applying a systems perspective. It is difficult to convey complex findings in a manner that is useful and timely for decision makers and does not result in an overly reductionist account or a confusingly “complex” set of findings. This is particularly a concern for qualitative research in which large volumes of data are collected. We suggest that one way to present the findings from a complex systems process evaluation is to create a “system story,” wherein the evaluator describes and analyzes how the intervention embeds and co-evolves with the system and its elements over time [3].

A more traditional approach to process evaluation is often rooted in the intervention itself, rather than the system in which that intervention is implemented. As a result of this orientation, such an evaluation generally considers the intervention and its immediate implementation processes and mechanisms, although there may be some consideration of more distal mechanisms and impacts [17]. In addition, more traditional process evaluations tend to adhere to research protocols that may themselves be relatively inflexible. A process evaluation from a complex systems perspective takes the system as the initial starting point of the analysis and considers the ways in which the intervention may lead to immediate, as well as more distal, impacts and the ways in which that intervention may change how the system elements—and the system as a whole—behave. Doing so will inherently require a flexible, adaptive, and iterative design. The framework presented here suggests at least 2 phases of data collection, with the understanding that the second phase will likely include an iterative process of defining research questions and collecting and analyzing data.

Utilizing a longitudinal design, with data collected over a relatively lengthy period of time or at more than one time point in order to capture a dynamic system undergoing change [24,67,71], may be a challenge for public health evaluators because it implies longer timescales [78], a move away from more standard evaluative approaches, and a degree of risk with which some funders and decision makers may be uncomfortable. In addition, it may challenge traditional public health evaluation methods that strictly follow protocols in an attempt to control for internal validity [16]. In contrast, a complex systems approach to evaluation must inherently plan to adapt and change in response to early evaluative findings, as well as in response to the changing intervention and broader system. As a result of an adaptive evaluation design, the distinction between different types of evaluation (such as formative, process, outcome, and impact) may be less clearly defined. As evaluators follow the pathways of emergent hypotheses and findings, it may well make sense to, for example, measure or predict impacts alongside process mechanisms.

Finally, further work remains on the ways in which realist and mixed methods approaches can more explicitly contribute to a process evaluation from a complex systems perspective, but this is beyond the scope of the current review.

Limitations

The nature of the review topic required the research team to make a number of judgments throughout the review process. First, judgments were made regarding which studies to include or exclude on the basis of their public health relevance and the degree to which they featured a complex systems perspective. Although the majority of decisions were clear-cut, the reviewers, in discussion with one another, had to make judgments in cases that were less obvious, and it is possible that other review teams would have made different decisions. In addition, there was a subjective element in deciding which concepts from systems thinking and complexity science to highlight; we sought to capture the key principles associated with each tradition so that the resulting list could be used by those wishing to draw on systems thinking and complexity science within the context of public health evaluation. We recognize that other reviewers might have chosen to highlight other concepts. Finally, the critical appraisal of the studies again required judgment. To increase validity, 2 reviewers completed the process independently and reconciled their decisions, but those decisions were not always clear-cut.

Another limitation of this review is the focus on studies that self-identify as taking a systems- and/or complexity-informed approach. This focus has 2 possible limitations: First, it excludes studies that may be compatible with systems thinking but do not cite systems literature or draw explicitly on systems concepts, and second, it may include studies that utilize the terminology of complex systems, which has become somewhat fashionable in recent years, but fail to apply the concepts in a manner that investigates complex uncertainties to generate better evidence for decision-making [71]. Taking the first concern, many rigorous qualitative studies foreground context in their research focus and analyses, considering the broader economic, social, political, cultural, environmental, and historical factors that impact interventions’ trajectories and influence diverse population groups [79]. As we have contended, “system” and “context” are broadly synonymous, in that all of a system can arguably be considered “contextual.” Therefore, qualitative research that actively engages with the broader context may apply a perspective that is compatible with systems thinking without using the accompanying systems terminology. Indeed, the MRC guidance on “Process Evaluation of Complex Interventions” made limited reference to complex systems theory and terminology but nevertheless advocated a systems-compatible approach to process evaluation, namely, an approach that explores the “dynamic relationships between implementation, mechanisms and context, the importance of understanding the temporally situated nature of process data in understanding the evolution of an intervention within its system” [17,71]. With regard to the second concern, complex systems thinking is currently in vogue in public health, as seen in the growing number of calls for the application of a complex systems perspective to public health practice and research [1,35,80,81]. Although many researchers are grappling with how to harness insights from the systems thinking and complexity science traditions to improve public health research, there is some concern that complex systems literature and concepts have been used without researchers truly engaging with the underlying theory [71]. These limitations suggest a number of opportunities for further research in this field. In particular, future research could fruitfully explore the degree to which the public health literature on intervention development and evaluation is compatible with a complex systems perspective, even when not explicitly described as such. Other research might identify process evaluations that do not explicitly adopt a complex systems approach and analyze the added value of an explicit engagement with the systems and complexity literature.

Finally, we limited our search to English-language publications and relied on 2 previous reviews and an expert consultation to identify qualitative process evaluations from a complex systems perspective published prior to 2014, which limits our search’s sensitivity. The studies identified through these means may have been influenced by other researchers’ interpretations and possible biases, and any papers our search did not identify might have added further to our methodological synthesis and the recommendations we put forward in the Discussion.

Conclusions

We have conducted a systematic review to identify qualitative process evaluations of public health interventions that consider themselves to be informed by systems thinking and/or complexity science, and we have analyzed the extent to which they feature key concepts from these fields. We found that this area of public health evidence generation is still in the early stages of development and that there is little consensus on a general approach. Informed by our evidence synthesis, we have therefore developed a framework for process evaluations that assess change within the context of a wider complex adaptive system. We suggest that, to do this, evaluations themselves need to be designed from a complex systems perspective, which requires being agile and adaptable in order to capture the system change they seek to assess. We are currently testing this approach in an evaluation of how a system and its elements adapt and co-evolve in response to a local alcohol intervention that raises additional revenue to police and manage the night-time economy. We hope that this 2-phase framework can be used, and further refined, by public health practitioners and researchers who seek to produce evidence to improve health in complex social settings.

Supporting information

S1 PRISMA Checklist. PRISMA, Preferred Reporting Items for Systematic Reviews and Meta-Analyses.

https://doi.org/10.1371/journal.pmed.1003368.s001

(DOC)

S2 Text. Case study examples from the systems thinking and complexity science traditions.

https://doi.org/10.1371/journal.pmed.1003368.s003

(DOCX)

Acknowledgments

We thank the wider research team who have worked on the National Institute for Health Research, School for Public Health Research (NIHR SPHR) project “Developing a systems perspective for the evaluation of local public health interventions: theory, methods and practice.” Rachel Anderson de Cuevas (University of Liverpool), Steven Cummins (London School of Hygiene & Tropical Medicine), Frank de Vocht (University of Bristol), Karen Lock (London School of Hygiene & Tropical Medicine), Petra Meier (University of Sheffield), Lois Orton (University of Liverpool), Jennie Popay (Lancaster University), Harry Rutter (University of Bath), Natalie Savona (London School of Hygiene & Tropical Medicine), Richard Smith (University of Exeter), Margaret Whitehead (University of Liverpool), and Martin White (University of Cambridge) commented on the search terms and/or helped identify potentially relevant studies. In addition, we thank those academics who responded to our expert consultation; these include Zaid Chalabi (London School of Hygiene & Tropical Medicine), Peter Craig (University of Glasgow), Seanna Davidson (The Australian Prevention Partnership Centre), Ana Diez Roux (Drexel University), Anna Dowrick (Queen Mary University of London), Diane Finegood (Simon Fraser University), Penny Hawe (The University of Sydney), Vittal Katikireddi (University of Glasgow), Laurence Moore (University of Glasgow), David Peters (Johns Hopkins University), Mat Walton (Massey University), and Katrina Wyatt (University of Exeter).

References

1. Rutter H, Savona N, Glonti K, Bibby J, Cummins S, Finegood DT, et al. The need for a complex systems model of evidence for public health. Lancet. 2017;390(10112):2602–4. pmid:28622953
2. Public Health England. Whole Systems Approach to Obesity: A Guide to Support Local Approaches to Promoting a Healthy Weight. London: Public Health England. 2019 [cited 2020 Jan 8]. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/820783/Whole_systems_approach_to_obesity_guide.pdf
3. Egan M, McGill E, Penney T, Anderson De Cuevas R, Er V, Orton L, et al. NIHR SPHR Guidance on Systems Approaches to Local Public Health Evaluation. Part 1: Introducing Systems Thinking. London: National Institute for Health Research, School for Public Health Research. 2019 [cited 2020 Jan 8]. Available from: https://sphr.nihr.ac.uk/research/developing-a-systems-perspective-for-the-evaluation-of-local-public-health-interventions-theory-methods-and-practice/
4. Egan M, McGill E, Penney T, Anderson De Cuevas R, Er V, Orton L, et al. NIHR SPHR Guidance on Systems Approaches to Local Public Health Evaluation. Part 2: What to Consider When Planning a Systems Evaluation. London: National Institute for Health Research, School for Public Health Research. 2019 [cited 2020 Jan 8]. Available from: https://sphr.nihr.ac.uk/research/developing-a-systems-perspective-for-the-evaluation-of-local-public-health-interventions-theory-methods-and-practice/
5. Heng HHQ. The conflict between complex systems and reductionism. JAMA. 2008;300(13):1580–1. pmid:18827215
6. Finegood DT, Johnston LM, Steinberg M, Matteson CL, Deck P. Complexity, systems thinking and health behavior change. In: Kahan S, Green L, Gielen A, Fagen P, editors. Health behavior change in populations. Baltimore: Johns Hopkins University Press; 2014. pp. 435–58.
7. Rickles D, Hawe P, Shiell A. A simple guide to chaos and complexity. J Epidemiol Community Health. 2007;61(11):933–7. pmid:17933949
8. Snowden DJ, Boone ME. A leader's framework for decision making. Harv Bus Rev. 2007;85(11):68–76. pmid:18159787
9. Byrne D, Callaghan G. Complexity theory and the social sciences: the state of the art. Abingdon: Routledge; 2014.
10. Grant RL, Hood R. Complex systems, explanation and policy: implications of the crisis of replication for public health research. Crit Public Health. 2017;27(5):525–32.
11. Walton M. Expert views on applying complexity theory in evaluation: opportunities and barriers. Evaluation. 2016;22(4):410–23.
12. Boulton JG, Allen PM, Bowman C. Embracing complexity: strategic perspectives for an age of turbulence. Oxford: Oxford University Press; 2015.
13. Kania A, Patel AB, Roy A, Yelland GS, Nguyen DTK, Verhoef MJ. Capturing the complexity of evaluations of health promotion interventions–a scoping review. Can J Program Eval. 2012;27(1):65–91.
14. Westhorp G. Using complexity-consistent theory for evaluating complex systems. Evaluation. 2012;18(4):405–20.
15. Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol. 2009;43(3–4):267–76. pmid:19390961
16. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337(7676):a1655.
17. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258. pmid:25791983
18. Gates EF. Making sense of the emerging conversation in evaluation about systems thinking and complexity science. Eval Program Plann. 2016;59:62–73. pmid:27591941
19. Cilliers P. Boundaries, hierarchies and networks in complex systems. International Journal of Innovation Management. 2001;5(2):135–47.
20. Walton M. Applying complexity theory: a review to inform evaluation design. Eval Program Plann. 2014;45:119–26. pmid:24780280
21. Campbell M, Fitzpatrick R, Haines A, Kinmonth AL, Sandercock P, Spiegelhalter D, et al. Framework for design and evaluation of complex interventions to improve health. BMJ. 2000;321:694–6. pmid:10987780
22. Shiell A, Hawe P, Gold L. Complex interventions or complex systems? Implications for health economic evaluation. BMJ. 2008;336:1281–3. pmid:18535071
23. White M, Cummins S, Rayner M, Smith R, Rutter H, Adams J, et al. Protocol—Evaluation of the Health Impacts of the UK Treasury Soft Drinks Industry Levy [SDIL]. NIHR Journals Library. 2017 [cited 2020 Feb 23]. Available from: https://www.journalslibrary.nihr.ac.uk/programmes/phr/1613001/#/
24. Hawe P, Bond L, Butler H. Knowledge theories can inform evaluation practice: what can a complexity perspective add? New Dir Eval. 2009;2009(124):89–100.
25. Linnan L, Steckler A. Process evaluation for public health interventions and research. San Francisco: Jossey-Bass; 2002.
26. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process Evaluation of Complex Interventions: Medical Research Council guidance. London: MRC Population Health Science Research Network. 2014 [cited 2020 Feb 23]. Available from: https://mrc.ukri.org/documents/pdf/mrc-phsrn-process-evaluation-guidance-final/
27. Gear C, Eppel E, Koziol-Mclain J. Advancing complexity theory as a qualitative research methodology. Int J Qual Methods. 2018;17(10).
28. Allender S, Brown AD, Bolton KA, Fraser P, Lowe J, Hovmand P. Translating systems thinking into practice for community action on childhood obesity. Obes Rev. 2019;20(Suppl 2):179–83.
29. Atkinson JA, Prodan A, Livingston M, Knowles D, O'Donnell E, Room R, et al. Impacts of licensed premises trading hour policies on alcohol-related harms. Addiction. 2018;113(7):1244–51. pmid:29396879
30. Jalali MS, Rahmandad H, Bullock SL, Lee-Kwan SH, Gittelsohn J, Ammerman A. Dynamics of intervention adoption, implementation, and maintenance inside organizations: the case of an obesity prevention initiative. Soc Sci Med. 2019;224:67–76. pmid:30763824
31. Lich KH, Urban JB, Frerichs L, Dave G. Extending systems thinking in planning and evaluation using group concept mapping and system dynamics to tackle complex problems. Eval Program Plann. 2017;60:254–64. pmid:27825622
32. Rosas S, Knight E. Evaluating a complex health promotion intervention: case application of three systems methods. Crit Public Health. 2019;29(3):337–52.
33. Urwannachotima N, Hanvoravongchai P, Ansah JP. Sugar-sweetened beverage tax and potential impact on dental caries in Thai adults: an evaluation using the group model building approach. Syst Res Behav Sci. 2019;36(1):87–99.
34. Burke JG, Lich KH, Neal JW, Meissner HI, Yonas M, Mabry PL. Enhancing dissemination and implementation research using systems science methods. Int J Behav Med. 2015;22(3):283–91. pmid:24852184
35. Carey G, Malbon E, Carey N, Joyce A, Crammond B, Carey A. Systems science and systems thinking for public health: a systematic review of the field. BMJ Open. 2015;5(12):e009002. pmid:26719314
36. Rusoja E, Haynie D, Sievers J, Mustafee N, Nelson F, Reynolds M, et al. Thinking about complexity in health: a systematic review of the key systems thinking and complexity ideas in health. J Eval Clin Pract. 2018;24(3):600–6. pmid:29380477
37. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.
38. Cresswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Los Angeles: SAGE Publications; 2011.
39. Covidence systematic review software. Veritas Health Innovation. Melbourne, Australia. Available from: www.covidence.org
40. Diez Roux AV. Complex systems thinking and current impasses in health disparities research. Am J Public Health. 2011;101(9):1627–34. pmid:21778505
41. Rothwell H, Marks R, Shepherd M, Murphy S, Burgess S, Townsend N, et al. Implementing a social-ecological model of health in Wales. Health Educ. 2010;110(6):471–89.
42. Durie R, Wyatt K. Connecting communities and complexity: a case study in creating the conditions for transformational change. Crit Public Health. 2013;23(2):174–87.
43. Alfandari R. Systemic barriers to effective utilization of decision making tools in child protection practice. Child Abuse Negl. 2017;67:207–15. pmid:28282594
44. Alfandari R. Multi-professional work in child protection decision-making: an Israeli case study. Child Youth Serv Rev. 2019;98:51–7.
45. Schelbe L, Randolph KA, Yelick A, Cheatham LP, Groton DB. Systems theory as a framework for examining a college campus-based support program for the former foster youth. J Evid Inf Soc Work. 2018;15(3):277–95. pmid:29412067
46. Bartelink N, van Assema P, Jansen M, Savelberg H, Moore G, Hawkins J, et al. Process evaluation of the Healthy Primary School of the Future: the key learning points. BMC Public Health. 2019;19(1):698. pmid:31170941
47. Bartelink N, van Assema P, Jansen M, Savelberg H, Willeboordse M, Kremers S. The Healthy Primary School of the Future: a contextual action-oriented research approach. Int J Environ Res Public Health. 2018;15(10):243.
48. Burman CJ, Aphane MA. Leadership emergence: the application of the Cynefin framework during a bio-social HIV/AIDS risk-reduction pilot. Afr J AIDS Res. 2016;15(3):249–60. pmid:27681149
49. Evans R, Murphy S, Scourfield J. Implementation of a school-based social and emotional learning intervention: understanding diffusion processes within complex systems. Prev Sci. 2015;16(5):754–64. pmid:25726153
50. Walton M. Setting the context for using complexity theory in evaluation: boundaries, governance and utilisation. Evid Policy. 2016;12(1):73–89.
51. Crane M, Bauman A, Lloyd B, McGill B, Rissel C, Grunseit A. Applying pragmatic approaches to complex program evaluation: a case study of implementation of the New South Wales Get Healthy at Work program. Health Promot J Austr. 2019;30(3):422–32. pmid:30860630
52. Crane M, Bohn-Goldbaum E, Lloyd B, Rissel C, Bauman A, Indig D, et al. Evaluation of Get Healthy at Work, a state-wide workplace health promotion program in Australia. BMC Public Health. 2019;19(1):183. pmid:30760237
53. Czaja SJ, Valente TW, Nair SN, Villamar JA, Brown CH. Characterizing implementation strategies using a systems engineering survey and interview tool: a comparison across 10 prevention programs for drug abuse and HIV sexual risk behavior. Implement Sci. 2016;11(1):70.
54. Dickson-Gomez J, Glasman LA, Bodnar G, Murphy M. A social systems analysis of implementation of El Salvador's national HIV combination prevention: a research agenda for evaluating Global Health Initiatives. BMC Health Serv Res. 2018;18(1):848. pmid:30419904
55. Figueiro AC, de Araujo Oliveira SR, Hartz Z, Couturier Y, Bernier J, do Socorro Machado Freire M, et al. A tool for exploring the dynamics of innovative interventions for public health: the critical event card. Int J Public Health. 2017;62(2):177–86. pmid:27572490
56. Shankardass K, Muntaner C, Kokkinen L, Shahidi FV, Freiler A, Oneka G, et al. The implementation of Health in All Policies initiatives: a systems framework for government action. Health Res Policy Syst. 2018;16(1):26. pmid:29544496
57. Fisher M, Milos D, Baum F, Friel S. Social determinants in an Australian urban region: a 'complexity' perspective. Health Promot Int. 2016;31(1):163–74. pmid:25107921
58. van Twist M, Kort M, van der Steen M. Assessing and appraising the effects of policy for wicked issues: including unforeseen achievements in the evaluation of the district policy for deprived areas in The Netherlands. International Journal of Public Administration. 2015;38(8):596–605.
59. Haggard U, Trolldal B, Kvillemo P, Guldbrandsson K. Implementation of a multicomponent Responsible Beverage Service programme in Sweden—a qualitative study of promoting and hindering factors. Nordic Studies on Alcohol and Drugs. 2015;16(1):73–90.
60. McGill E, Marks D, Sumpter C, Egan M. Consequences of removing cheap, super-strength beer and cider: a qualitative study of a UK local alcohol availability intervention. BMJ Open. 2016;6(9):e010759. pmid:27687895
61. Sumpter C, McGill E, Dickie E, Champo E, Romeri E, Egan M. Reducing the Strength: a mixed methods evaluation of alcohol retailers' willingness to voluntarily reduce the availability of low cost, high strength beers and ciders in two UK local authorities. BMC Public Health. 2016;16(1):448.
62. Pérez-Escamilla R, Cavallera V, Tomlinson M, Dua T. Scaling up Integrated Early Childhood Development programs: lessons from four countries. Child Care Health Dev. 2018;44(1):50–61. pmid:29235170
63. Knai C, Petticrew M, Douglas N, Durand M, Eastmure E, Nolte E, et al. The public health responsibility deal: using a systems-level analysis to understand the lack of impact on alcohol, food, physical activity, and workplace health sub-systems. Int J Environ Res Public Health. 2018;15(12):2895.
64. Orton L, Halliday E, Collins M, Egan M, Lewis S, Ponsford R, et al. Putting context centre stage: evidence from a systems evaluation of an area based empowerment initiative in England. Crit Public Health. 2016;27(4):477–89.
65. Kearney S, Leung L, Joyce A, Ollis D, Green C. Applying systems theory to the evaluation of a whole school approach to violence prevention. Health Promot J Austr. 2016;27(3):230–5. pmid:27719735
66. Buijs J-M, Eshuis J, Byrne D. Approaches to researching complexity in public management. In: Teisman G, van Buuren A, Gerrits L, editors. Managing complex governance systems: dynamics, self-organization and coevolution in public investments. London: Routledge; 2009. pp. 37–55.
67. Eppel E, Matheson A, Walton M. Applying complexity theory to New Zealand public policy: principles for practice. Policy Quarterly. 2011;7(1):48–55.
68. Byrne D. Evaluating complex social interventions in a complex world. Evaluation. 2013;19(3):217–28.
69. Foster-Fishman PG, Nowell B, Yang H. Putting the system back into systems change: a framework for understanding and changing organizational and community systems. Am J Community Psychol. 2007;39(3–4):197–215. pmid:17510791
70. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health. 2015;36:307–23. pmid:25581153
71. Moore GF, Evans RE, Hawkins J, Littlecott H, Melendez-Torres G, Bonell C, et al. From complex social interventions to interventions in complex social systems: future directions and unresolved questions for intervention development and evaluation. Evaluation. 2019;25(1):23–45. pmid:30705608
72. Egan M, Bambra C, Thomas S, Petticrew M, Whitehead M, Thomson H. The psychosocial and health effects of workplace reorganisation. 1. A systematic review of organisational-level interventions that aim to increase employee control. J Epidemiol Community Health. 2007;61(11):945–54. pmid:17933951
73. Whitehead M, Orton L, Pennington A, Nayak S, King A, Petticrew M, et al. Is control in the living environment important for health and wellbeing, and what are the implications for public health interventions? Final report. London: Public Health Research Consortium. 2014 [cited 2020 Jan 8]. Available from: http://phrc.lshtm.ac.uk/papers/PHRC_004_Final_Report.pdf
74. Brennan LK, Sabounchi NS, Kemner AL, Hovmand P. Systems thinking in 49 communities related to healthy eating, active living, and childhood obesity. J Public Health Manag Pract. 2015;21(S3):S55–69.
75. Petticrew M, Knai C, Thomas J, Rehfuess EA, Noyes J, Gerhardus A, et al. Implications of a complexity perspective for systematic reviews and guideline development in health decision making. BMJ Glob Health. 2019;4(Suppl 1):e000899. pmid:30775017
76. Booth A, Noyes J, Flemming K, Moore G, Tunçalp Ö, Shakibazadeh E. Formulating questions to explore complex interventions within qualitative evidence synthesis. BMJ Glob Health. 2019;4(Suppl 1):e001107.
77. Patton MQ. Developmental evaluation. Eval Pract. 1994;15(3):311–19.
78. McGill E, Egan M, Petticrew M, Mountford L, Milton S, Whitehead M, et al. Trading quality for relevance: non-health decision-makers' use of evidence on the social determinants of health. BMJ Open. 2015;5(4):e007053. pmid:25838508
79. Korstjens I, Moser A. Practical guidance to qualitative research. Part 2: context, research questions and designs. Eur J Gen Pract. 2017;23(1):274–9. pmid:29185826
80. Public Health England. Community-Centred Public Health: Taking a Whole System Approach. London: Public Health England. 2020 [cited 2020 Feb 23]. Available from: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/857029/WSA_Briefing.pdf
81. Medical Research Council. PHIND: Systems Based Approaches to Public Health Intervention Development. 2019 [cited 2020 Feb 23]. Available from: https://mrc.ukri.org/funding/browse/pre-call-public-health-intervention-development-scheme-phind-november-2019/phind-systems-based-approaches-to-public-health-intervention-development/