Melissa Mouthaan
OECD
Kathryn Oliver
London School of Hygiene and Tropical Medicine
This chapter examines a subset of 85 education papers drawn from an external systematic review of barriers to and facilitators of the use of evidence by policy makers. It also presents the latest trends in research on factors shaping evidence use in education policy. It analyses the main trends in the education subset, draws comparisons with trends in other knowledge domains, and situates these findings within broader OECD research on knowledge mobilisation in education.
This chapter analyses a subset of 85 education papers included in an external systematic review of barriers to and facilitators of the use of evidence by policy makers. It highlights several key trends in research on knowledge mobilisation in education.
Much research on knowledge mobilisation emerges from just a few Anglo-Saxon countries and contexts. More research on knowledge mobilisation from non-Anglo-Saxon contexts is therefore urgently needed. It is also important to better understand how research from Anglo-Saxon contexts can be applied elsewhere, and which aspects of these studies are generalisable. In tandem, international research collaborations could help to redress this balance.
Drawing on lessons learned in other sectors is one way to address long-standing challenges in knowledge mobilisation. Many of these challenges are remarkably similar across knowledge domains, and some domains have advanced further in implementing certain types of knowledge mobilisation mechanisms or interventions. Cross-sectoral learning therefore remains a highly valuable activity.
Many studies provide descriptions of knowledge mobilisation initiatives; however, comparatively few studies in education also examine their effectiveness. More targeted efforts to learn “what works” in these initiatives would therefore be beneficial. This would, in turn, ensure that new initiatives build on lessons learned from past ones, while building understanding of and consensus on what quality use of evidence in policy processes looks like.
While systematic reviews can be of great value compared to individual pieces of evidence, there is little in these data to indicate that education policy makers make substantive use of them. As evidence characteristics remain the main factor influencing the use of evidence in policy, further research could explore how different types of evidence can be better incorporated into policy.
The expectation that education systems should strive to be evidence-informed is now widespread. Interventions designed to make policies more evidence-informed have emerged predominantly in the healthcare sector (Biesta, 2007[1]) – as a result, studies of these interventions, and systematic reviews of these studies, were often by default limited to health policy. However, this is changing. The growth in these interventions has led to increased “research on research use” in policy and practice, which is now much more commonplace across policy and knowledge domains – including not only education but also conservation and environment, city and regional planning, and public administration in general (among other domains). In the education sector, this is partly the result of a strong push towards using evidence by stakeholders in many systems and regions, and yet the impact of research on education policy is still found to be limited (Pellegrini and Vivanet, 2020[2]).
In education, this growth in knowledge mobilisation initiatives takes place against a backdrop of ongoing challenges concerning the existing evidence base in this field. There remain concerns that qualitative research in this field is too often of low quality, while others criticise the strong priority given to numerical or statistical data over other types of evidence when designing policies (Gorard, See and Siddiqui, 2020[3]; Burkhardt and Schoenfeld, 2003[4]; Aldridge et al., 2018[5]; Wainwright et al., 2020[6]; Wrigley, 2018[7]). Overall, a lack of replications of previous studies implies that education research is failing to build a cumulative knowledge base (Van Damme, 2024[8]). The question of how narrowly we should define the types of evidence that are suitable to inform education policy also continues to be the subject of much debate. It has been suggested that this should extend beyond the rigorous qualitative or quantitative enquiry that is often suggested as the minimum evidence standard within the education research (academic) community (Bainbridge, Troppe and Bartley, 2022[9]). Increasingly, there is also recognition that research evidence will not fundamentally inform policy or practice in education and other domains unless it is integrated with other sources of knowledge, e.g. professional knowledge (Rickinson et al., 2022[10]; Nutley et al., 2019[11]). Research knowledge does not enter a void but rather competes for space with other types of knowledge and other inputs to decisions. This recognition has led to a decisive shift in discourse, where the generally agreed ambition is now to make policy “evidence-informed” rather than “evidence-based” (Nutley et al., 2019[11]).
Drawing on a wider systematic review, conducted in 2024, of studies that examine factors influencing the use of evidence in policy (see Box 2.1), this chapter contributes to these debates by presenting findings from an analysis of a subset of 85 empirical, mostly peer-reviewed education studies published between 2012 and 2022 (see complete list of studies included in the subset in Annex 2.B).
Oliver et al. (2014[12]) conducted an initial systematic review of factors influencing evidence use in policy in 2014 across multiple knowledge domains. In this review, 145 studies were included, with over half published between 2010 and 2014.
In 2024, an updated review was conducted in which 2 199 studies identified across multiple knowledge domains were included (for a detailed methodology, see Annex 2.A). As the authors note, the goal of the 2024 review was to determine how research activity in this area has changed since 2014 (Oliver et al., forthcoming[13]). The substantially larger number of studies included in the 2024 review points to a boom in published, peer-reviewed research on the topic of evidence-informed policy.
The subset of studies examined in this chapter is taken from the 2024 review (see (EPPI, 2024[14])). The 85 papers analysed were all coded as belonging to the education knowledge domain.
The chapter follows a three-pronged approach:
It analyses the main trends in the subset of 85 education papers.
It draws comparisons with other knowledge domain subsets.
It draws comparisons with OECD research conducted since 2021 on the topic of evidence use in education policy.
The chapter describes the main characteristics of the education subset, including geographical focus, study design, methodological focus and study focus. It then details what factors were found to influence evidence use in the studies. Finally, it explores specific knowledge mobilisation interventions that are described and sometimes evaluated in the studies. Examples of studies from the education subset are included where these have been chosen for interest and representativeness.
The subset of education papers represents 3.8% of the total studies included in the 2024 systematic review. Around one-third were simultaneously coded as other knowledge domains (n = 30), and public administration was the knowledge domain most often coded together with education (n = 15). Education is the sixth-largest knowledge domain in the review (see Figure 2.1). The vast majority of included studies were coded as health (n = 1 254). The following sections detail the main characteristics of the subset and the implications for future work on knowledge mobilisation and supporting research.
Note: Data show the number of times studies were coded as a given knowledge domain.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
A large majority of education papers describe research in North America, Europe and/or Oceania (82% of assigned region codes). Within Europe, education studies examining evidence use in the UK context hugely dominated (n = 17), followed by Norway and the Netherlands (n = 4 each). In North America, 30 education studies reported findings in the US context and 9 in Canada. This confirms an ongoing phenomenon in knowledge mobilisation research: English-language research in this field is very much dominated by Anglo-Saxon countries, a pattern also visible across the main review (42% of country codes assigned to studies across all knowledge domains were the United Kingdom, Canada, the United States, Australia or New Zealand). While this is partly explained by the focus on English-language studies in the systematic review, it indicates that transferring knowledge on knowledge mobilisation from Anglo-Saxon to other cultural or national contexts will remain an important avenue to explore for the foreseeable future. At the same time, there is a risk of generating an increasingly narrow evidence base if studies on knowledge mobilisation are limited to a few cultural contexts. More research on knowledge mobilisation in non-Anglo contexts is urgently needed, while international research collaborations could also help to redress this balance.
The most common methodological approach in the education subset was documentary analysis, followed by interviews (Figure 2.2). The popularity of documentary analysis (for example, the analysis of policy documents) can be partly explained by the relatively lower cost of this type of data collection and analysis. A notably higher percentage of studies in the education subset are systematic reviews when compared with any other knowledge domain.
The majority of education studies used one approach in their methodological design (n = 52) (e.g. documentary analysis alone). However, 18% of the subset used three or more approaches. Documentary analysis was most often combined with interviews (n = 17) or a case study approach (papers which reported focusing on a single setting, n = 9).
Studies that have not been triangulated through diverse data collection methods may be limited in their reliability (how replicable the study is) and validity (how accurately the study measures what it sets out to measure). Given the concerns that have been voiced about the overall quality and rigour of educational research in general (Van Damme, 2024[8]), one possibility for ensuring more robustness in education research would be the more widespread use of mixed-methods research design.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
The review coded the types of data collected in the studies, including whose views and experiences were gathered.
Note: “Case study” refers to papers which reported focusing on a single setting.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
Education studies were slightly more likely to draw on documents compared to other knowledge domains. Education papers were less likely to collect data from the private sector than papers in other knowledge domains. The most common type of data collected in the main dataset was policy makers’ views or experiences (in 52% of studies), while researchers’ views and experiences were, similarly to the education subset, also frequently collected (in 46% of studies in the main dataset).
The focus on collecting information and data from policy and research communities is understandable given that the systematic review focused on explaining the gaps that persist between these two communities. At the same time, it suggests that researchers across knowledge domains tend to collect data from within their own communities. In some examples within the education subset, findings on the policy impact of research are based on researchers’ perceptions rather than policy makers’ perceptions (see, for example, Cherney et al. (2013[15]) and O’Connor (2022[16]), where the authors acknowledge this limitation). Another implication of an over-representation of researchers’ views is a possibly distorted picture of knowledge mobilisation that emphasises a “research push” approach. While there is a diverse landscape of actors participating in research mobilisation, these data collection trends suggest that it is still mostly the views and experiences of research producers (researchers) and users (policy makers) that are considered. Yet the reality of knowledge mobilisation is much more complex (OECD, 2022[17]). Future research could therefore benefit from including a greater diversity of views and insights, in particular from under-represented stakeholders such as parents, pupils and intermediaries.
Over half (51%) of the studies in the education subset reported on factors influencing evidence use at the level of national policy. Yet almost one-quarter (24%) reported on local policy making, and one-fifth (21%) on regional-level policy making, showing that these levels of policy are also being substantively researched. This broadly reflects the trend in the main dataset, where 46% of studies reported on evidence use in national policy, 23% on local policy and 28% on regional policy. The availability of evidence on knowledge mobilisation at local and regional levels of policy provides a knowledge base to tap into. This is especially important for decentralised education systems, which often involve local authorities or regional administrations extensively, and where student outcomes are heavily influenced by local conditions and factors.
Most of the papers in the education subset focused their analysis on the mobilisation of evidence (n = 36) or the use of evidence (n = 37) (Box 2.2). A minority focused on the production of research evidence (n = 12). The review also coded other aspects of the focus of studies (see Figure 2.4).
Evidence production was used to describe studies which primarily focused on research activities, research production or the outputs of (often) collaborative research endeavours.
Mobilisation of evidence was used to describe studies which primarily focused on how knowledge was translated, made more accessible, or otherwise made more available to decision makers.
Evidence use was used only for studies that examined how decision-making settings or processes were influenced by evidence; for example, looking at how teaching practices changed in response to evidence provision.
Note: The data show the focus of included studies in three different knowledge domain subsets: education (n = 85); health (n = 1 254); and conservation and environment (n = 548); and the average for all studies included in the review.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
Of the 26 studies describing evidence use interventions in education captured in Figure 2.4, 9 focused on evaluations. The methods adopted in these studies vary, and include two systematic reviews of studies evaluating the impact of research on policy, and effective mechanisms to ensure uptake of research evidence in policy (Ashcraft, Quinn and Brownson, 2020[18]; Gorard, See and Siddiqui, 2020[3]), as well as a qualitative case study approach drawing on documentary research to examine the impact of strategies implemented in a Canadian knowledge mobilisation network (Campbell et al., 2017[19]). Overall, the gap between descriptive accounts and reporting on evaluation findings is larger in education than in other domains. Van Damme’s (2022[20]) pyramid of evidence framework classifies descriptive research as valuable when seeking to identify characteristics and trends – it can lead to conceptualisation and some generalisation. However, its primary value is as a “stepping stone” towards other research approaches. A better balance between description and evaluation would allow for an improved understanding of not only the “what” but the “how” of knowledge mobilisation interventions. Evaluating knowledge mobilisation is far from being a systematic practice in many organisations in the education landscape across countries and is often not readily acknowledged as being a valuable activity. A lack of funding for conducting evaluations is also a commonly reported barrier that could explain this gap (see Chapter 6).
Studies that measure the use of evidence can provide valuable insight into the quality of research use in policy communities. Education papers that measured the use of evidence (n = 21) include the study by O’Connor (2022[16]), which draws on survey data in Ireland and develops a research use scale ranging from simple awareness of research to the use of research in policy design, implementation and evaluation. The author concludes that there is a poor uptake of educational research among Irish policy makers. Similarly, Cherney et al. (2013, p. 255[15]) refer to a “ladder of [research] utilisation” and draw on survey data to examine levels of social science research impact among Australian policy makers, where they conclude that a combination of linear and relational mechanisms to mobilise research findings has a significant positive bearing on research use.
Ensuring evidence uptake in policy requires organisational structures, governance and a regulatory environment conducive to uptake (Slade, Philip and Morris, 2018[21]; Mouthaan and Révai, 2023[22]). Analytical, operational and political skills and competencies are necessary at individual, organisational and systemic levels in order to perform policy functions effectively (Wu, Ramesh and Howlett, 2015[23]). Studies examining the feasibility of research use tend to take a more holistic view of state capacity to integrate research evidence into policy, and can provide valuable insights into capacity gaps. For example, Ion, Marin and Proteasa (2018[24]) signal the issue of weak administrative capacity for the development of evidence-informed education policies in former communist states, and recommend that governments prioritise the systematic review and evaluation of policy programmes. The low number of education studies examining the feasibility of research use suggests that this perspective is missing.
As noted in the introduction, there is significant debate around the types of evidence that can be suitable to influence policy. The review aimed to capture the different types of evidence that were reported as being used in education policy processes (see Figure 2.5). This gives a picture of where there are still gaps: in the types of evidence available to inform education policy decision making; and in our understanding of how policy makers are using different types of evidence.
Around half of the education subset examined policy makers’ or practitioners’ use of general social sciences research (47%); 34% explored the use of general scientific research (used to denote studies which did not specify a particular kind of evidence). Social sciences research was a broad code applied when studies referred to academic social research, whether quantitative or qualitative (e.g. (Cherney et al., 2012[25])) or for example research reports presenting evaluation findings (e.g. (Forbes, 2022[26])). One-quarter examined the use of surveillance or administrative data, which includes quantitative data such as on student achievement (see, for example, the study by Daly et al. (2014[27])).
Note: Data show the relative frequency of studies in the education subset that examined the uptake of different types of evidence in policy or practice. The categories represented are those applied to at least 4% of the education subset.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
A relatively low percentage of papers found or examined professional views as input to policy (12%). This suggests that professional knowledge, such as that of teachers or education administrators, has not been the focus of much study in this research, likely because such views or knowledge are not regarded as a form of evidence. In comparison, around one-quarter of studies in the main review explored professional expertise as input to policy making (EPPI, 2024[14]). The low percentage of studies examining the use of evaluation data as a form of evidence as input into education policy may reflect the fact that there is often little evaluation data to draw on (see Chapter 6).
Similarly, synthesised evidence – such as that from systematic reviews or meta-analyses – was explored as input to policy only infrequently, and two such studies are systematic reviews themselves. This is a far smaller proportion than the 114 studies in health (of which 17 are systematic reviews) that examined synthesised evidence as input to policy. Education is not the only knowledge domain in which the use of synthesised evidence in policy is either low or under-researched. In economics, criminal justice, city and regional planning and other knowledge domains, there were no studies reporting or exploring the use of systematic review evidence or evidence synthesis in policy. In comparison, in international development, 15% of studies reported on the use of synthesised evidence. The challenge for knowledge domains where use of systematic review evidence is low is two-fold: more systematic synthesis of research is needed, and more research on the use of such synthesised evidence in policy is needed.
One important aspect that may explain these differences between knowledge domains is the availability of synthesised evidence overall; in education, a lack of evidence syntheses is still a major barrier and remains an important issue to address (Mouthaan and Steponavičius, 2023[28]; OECD, 2023[29]). A consequence for policy may be that using individual studies instead of systematic reviews or other forms of synthesised evidence can be less reliable in terms of deriving implications or recommendations. The OECD 2021 policy survey1 found that among 37 respondent education systems, less than two in three systems synthesise and disseminate research findings through user-friendly tools to policy makers or practitioners (OECD, 2022[17]). Evidence synthesis is much more routine in the health sector, which produces 26 times more syntheses than the education sector (Education.org, 2021[30]). A growing number of intermediaries are addressing this challenge by contributing to evidence synthesis, including the Education Endowment Foundation (EEF), whose Teaching and Learning Toolkit has been adapted to different regional contexts (Mouthaan and Steponavičius, 2023, p. 44[28]); the Norwegian Knowledge Centre; and DIPF | Leibniz Institute for Research and Information in Education, among others. While some research on the use of such syntheses exists (e.g. (Higgins, 2020[31]) on the use of the EEF toolkit), this has been published outside of peer-reviewed journals and does not appear in this review. Moreover, the effects of recent efforts to improve evidence synthesis and uptake of synthesised evidence may not yet be fully observable. Further research would be useful to understand how (or if) policy and practice communities indeed make use of synthesised evidence.
In education, and in the dataset as a whole, relatively few studies examined the use of evidence from RCTs as input to policy (n = 26 in the main dataset). RCTs have significant cost implications and their use in education has been controversial. Their critics emphasise that they are not always an appropriate or useful form of investigation (McKnight and Morgan, 2019[32]), and there are challenges in measuring what works in classroom contexts where human agency is at the heart of interventions, unlike in medical trials (Wrigley and McCusker, 2019[33]; Parra and Edwards, 2024[34]). For instance, the EEF’s commitment to a “what works” agenda is seen by some as a reductionist approach that reduces appetite for innovation by encouraging only the adoption of “proven” approaches (see Greany in (Pino-Yancovic et al., 2023[35])). Certain findings on the impact of RCT evidence have reinforced negative views. The EEF itself showed in an impact study that the deployment of teaching assistants to support disadvantaged students had not improved their educational attainment (see EEF (2021[36]) and the EEF case study in Chapter 6). On the other hand, proponents stress that RCTs can help to identify effective pedagogical interventions and create a pool of key approaches that can improve student learning. In this sense, RCTs allow for rigorous testing of the effectiveness of new pedagogical approaches. The EEF, for example, has a pipeline model that supports schools in developing innovative solutions to address their context-specific problems and in testing these (EEF, 2024[37]). To move from beliefs about experimental designs to evidence on their usefulness, it would be important to invest in better understanding how and to what extent data from RCTs are being used in education policy, how this type of evidence is perceived by policy makers in terms of its breadth of applicability and relevance, and its impact on educational outcomes.
The majority of studies across the main 2024 dataset found evidence characteristics and organisation and resources to play a role in determining evidence use (negatively or positively) (see Figure 2.6). Overall, the picture is very similar across the knowledge domains, which – perhaps unsurprisingly – suggests that a lot of knowledge and know-how on evidence use is transferable across sectors. In many cases, factors influencing evidence use are deeply connected to each other. The following sub-sections explore the findings on factors affecting use of evidence and some of their important ties.
Note: The data show the factors found to influence evidence use in three different knowledge domain subsets and the average for all studies included in the review.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
Evidence characteristics include the type of evidence, its source, availability, accessibility, credibility, legitimacy, relevance and factors such as how actionable the evidence or recommendations are. Researcher characteristics include researchers’ skills, policy awareness, will and interest, and credibility. Policy maker characteristics include skills, research literacy, and will and motivation to use research evidence.
Evidence characteristics are a key factor influencing evidence use in education and other knowledge fields, while their significance depends in large part on policy maker characteristics. In the main dataset, many studies discuss the legitimacy and credibility of evidence, such as whether the evidence was considered trustworthy, and if it took a holistic or, conversely, a narrow political lens on an issue. In the main review, credibility of evidence was usually linked to researcher characteristics, such as the trustworthiness of individual researchers or their organisations, and was frequently discussed in relation to rigour, peer review and research quality (Oliver et al., forthcoming[13]).
Studies in the education subset reported researcher characteristics as influencing evidence use more frequently than studies in other knowledge domains. Researchers’ policy awareness (n = 22) and skills (n = 18) were the most frequently reported factors in this category. One education paper, reporting findings from a study of academic researchers in Australia, demonstrated that a high awareness of the priorities of non‑academic end users meant that research projects and findings were tailored to meet these end-user needs (Cherney et al., 2012[25]). While beyond the scope of this chapter, a more fine-grained and comprehensive content analysis of the individual studies examining these factors would provide insight into whether these factors were more often reported as barriers or as facilitators, and whether the education sector faces more challenges than other sectors in fostering a culture of research use in policy among academic and policy communities.
In the education subset, the type of evidence (n = 40) and the relevance of evidence (n = 40) were the most frequently cited evidence characteristics. These studies often interrogate the extent to which research evidence is “policy-ready”, i.e. whether outputs incorporate or respond to a discourse originated by policy makers (as reported by Brown (2012[38]; 2014[39])). They also often point to a situation where different types of evidence and their value are not well understood or regarded (e.g. Gleeson et al. (2020[40])), which inevitably limits the possibilities for quality use of research. In practice, this might mean that researchers have a limited ability to convey the benefit of evidence, or that policy makers have a limited ability to understand different evidence types and use them in effective ways. In both cases, there may be a need for professional learning to strengthen the first key step in evidence uptake: understanding and interpreting evidence itself (OECD, 2024[41]). In the OECD 2021 policy survey, over half of respondent ministries considered that extensive learning opportunities to develop policy makers’ research knowledge and skills were absent in their systems (Hill and Torres, 2023, p. 71[42]). Crucially, the quality of learning opportunities for improving research engagement skills is as relevant as their quantity (Hill and Torres, 2023[42]).
Policy characteristics include the competing pressures on policy decision making, framing, and the context of decision making. Studies that found organisation and resources to play a role examined factors such as funding, staff turnover and continuity, communication channels, time, culture, managerial support and legislative support.
The context of decision making (n = 31) emerged as the most frequently reported factor in this category in the education subset. One study examined the role of power and race relations in the context of higher education policy in an Australian state, and concluded that selectively determining policy success using quantitative data masked real racial inequalities (Street et al., 2021[43]). A study conducted in England found that ministerial views on teacher education led to evidence from national reviews being cherry-picked to fit the decision-making context (Helgetun and Menter, 2020[44]).
Managerial support (n = 28), staff and personnel resources (n = 32) and material resources (n = 29) emerged as the most frequently reported organisation- and resources-related factors in the education subset. This echoes OECD recommendations that tailored human resource strategies are needed to build the necessary skills within a ministry or policy organisation to allow systematic and thoughtful engagement with research, as well as organisational structures and processes that ensure resources and time to engage with research and provide tools to do so (OECD, 2024[41]). Ideally, these structures need to be stable and long-term so that they are resistant to the organisational changes, staff turnover and political shifts that occur in policy organisations (OECD, 2024[41]). Similarly, studies in the United States on research-practice partnerships have argued for the long-term continuity of such partnerships (Penuel et al., 2015[45]; Farley-Ripple, Oliver and Boaz, 2020[46]). Education studies in the review reported contradictory pressures on academic researchers that disincentivise time spent on activities aimed at feeding evidence into the policy process, such as investing time and resources in research-practice and research-policy collaborations (Matthews et al., 2018[47]; Heinrich and Good, 2018[48]). This suggests there is still significant scope for systems to consider how they can better incentivise policy-oriented research and research-policy partnerships; dedicated funding for such initiatives and a rethinking of academic incentives are important avenues to explore.
The 2014 systematic review also found organisation and resources as both key barriers and facilitators to the use of evidence in policy (Oliver et al., 2014[12]). The persistence of these organisational barriers to evidence use in policy over time suggests that implementing changes to well-established cultures within academic and policy organisations remains a difficult task (Mouthaan and Révai, 2023[22]).
Studies exploring the role of intermediaries in determining evidence use have been increasing steadily. The role of intermediaries was not coded separately from research and researcher characteristics in the systematic review conducted in 2014 (Oliver et al., 2014[12]), suggesting that intermediaries did not feature prominently in earlier research.
In the present review, around one-fifth of studies in both the main dataset and the education subset reported that factors relating to intermediaries play a role in determining evidence use. In the education subset, studies explored the role of intermediaries active at the local, national and global levels. These studies examine, for instance, the effects of the Indicators of Education Systems Programme of the OECD on the organisation's growing status as a knowledge broker in education (Grek and Ydesen, 2021[49]); the role of US state higher education governing agencies as information hubs that state policy makers and campus officials turn to in order to inform their decision making (Rubin and Ness, 2019[50]); and the role of professional associations as knowledge brokers in the United States, in enabling members' access to research and facilitating between-state research exchanges (Hopkins et al., 2018[51]). Of the 16 education studies that reported on intermediaries influencing evidence use, 13 were published after 2018 and the remaining 3 between 2013 and 2017, suggesting a surge in interest in researching intermediaries in very recent years. This surge is indicative of the proliferation of intermediaries actively brokering knowledge in the education sectors of different systems (Torres and Steponavičius, 2022[52]). These actors seem likely to gain further importance in the evidence landscape, and more attention is needed to increase their effectiveness in ensuring that evidence is being produced, mobilised and used.
On the surface, the data seem to indicate that contact and collaboration are factors of comparatively less importance in education than in other knowledge domains. While this descriptive study is not suitable for a comparative evaluation of how much different factors matter across policy domains, one interpretation is that contact and collaboration may already be well established in the education sector compared to sectors such as conservation. Education may have moved on from discussions around the importance of collaboration, placing more emphasis on the need to incentivise researchers to generate relevant evidence. The data may also indicate that while there is growing recognition that trusting relationships are important, they are not sufficient for evidence use. In this sense, contact and collaboration on their own do not constitute a sufficient facilitator of evidence use – they need to be supported within an organisation with high absorptive capacity and leadership.
A number of studies reviewed specific knowledge mobilisation interventions (see Box 2.3). These studies provide a descriptive account of the intervention, and sometimes an evaluation of the intervention’s impact or effectiveness. Such analyses provide concrete insights into the mechanisms being deployed in specific contexts, organisations, systems or countries to mobilise research in policy. Development, health and education were the knowledge domains with the highest percentage of studies that described a knowledge mobilisation intervention (Figure 2.7).
The described interventions include intermediary organisation activities; policy briefs; policy dialogues; networks; research-practice and research-policy partnerships; training or capacity-building activities; the provision of a resource (such as indicators, a database or map); and other such interventions aimed at increasing the use of research in policy.
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
The data suggest that knowledge mobilisation interventions are most frequently observed in applied sciences and social sciences. While a comparatively high proportion of education studies describe a knowledge mobilisation intervention, only a small proportion of these report on their effectiveness. This echoes findings from the 2023 OECD Survey of Knowledge Mobilisation in Education (the intermediaries’ survey) that much knowledge mobilisation in the education sector is simply not evaluated. Understanding effectiveness would allow for building a cumulative knowledge base on what works in knowledge mobilisation, but requires significantly more systematic and quality evaluation of knowledge mobilisation activity (see Chapter 6 for further detail).
Figure 2.8. Types of interventions described as a percentage of the total knowledge domain subset
Source: EPPI (2024[14]), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616.
Health intervention studies were more likely to report policy briefs or to provide a tool or resource than education intervention studies. This confirms other research that shows that mechanisms and processes to render research accessible to policy audiences are more common in the health domain (Education.org, 2021[30]). The OECD 2021 policy survey found that among respondent systems, policy makers perceived some types of education research to be quite or highly relevant to policy needs, but not very accessible (Mouthaan and Steponavičius, 2023[28]). While the exact causes of low research accessibility may vary by context or system, policy briefs and toolkits can help address issues of accessibility. Similarly, regular consultations with policy makers on their research needs can help to ensure alignment of outputs and tools with actual research gaps and needs. Intermediary organisations can play a role in supporting these initiatives and identifying specific needs.
Education studies were far less likely than those in other knowledge domains to describe training or capacity-building activities, with only two of the 33 education intervention studies describing such interventions (see Figure 2.8). In comparison, 16% of intervention studies in public administration (12 out of 75 studies), 35% in sustainable development (7 out of 20 studies) and 19.5% in health (105 out of 536 studies) described training or capacity-building initiatives. This is surprising given the relatively high level of capacity-building activities of educational organisations reported in Chapter 3.
Collaborations between research and policy communities did not feature in descriptions of interventions in education, yet we know that different systems are increasingly experimenting with establishing these types of partnerships, including networks between science and policy actors (for country examples, see OECD (2024[41])). In comparison, seven intervention studies in education described research-practice partnerships. These include a study of partnerships in the United States between researchers and school districts, which showed that the involvement of practice partners in the scheduling and co-ordination of research activities, as well as researcher-developed learning opportunities for practitioners, helped practitioners to better understand how to administer language and literacy interventions (Alonzo et al., 2022[53]). Another study presented and tested a conceptual framework that aims to explain the mechanisms in these partnerships that lead to evidence-based decision making among practitioners; the authors note that an intentional process of measuring and examining outputs in a partnership can help to improve the partnership's outcomes (Wentworth, Mazzeo and Connolly, 2017[54]). What these studies also have in common is that they emphasise the importance, for a successful collaboration, of allocating time to developing relationships of trust within the partnership and of continuous and open communication. All seven studies on research-practice partnerships focus on the North American context, suggesting that research on these types of partnerships in other geographical contexts would be highly valuable.
Education papers detailing knowledge mobilisation interventions more often described research infrastructures for knowledge mobilisation (e.g. a new evidence synthesis unit or a funding stream) and intermediary organisation activities than papers describing interventions in health, criminal justice or public administration. As noted earlier, this could be indicative of a shift in focus in education research from interventions centred on relationships and collaboration towards the resources needed to support knowledge mobilisation. Of the education studies describing knowledge mobilisation interventions (n = 33), nine discussed such research infrastructures (see Figure 2.8). These include the study by Zapp and Powell (2017[55]) describing the growth of federal programme- and project-based funding in Germany that many organisations rely on, and the study by Ranchod (2017[56]) examining the development of a research-policy nexus as a space for exchanges between researchers, government officials and other stakeholders in a research-based national-level government department in South Africa. Descriptions of these kinds of systems approaches give insight into co-ordination mechanisms at the system level that are critical to supporting research engagement (Maxwell, Sharples and Coldwell, 2022[57]). Eight of the 33 education intervention studies described intermediary organisation activities – a welcome addition to the literature, given that little is known about the characteristics and impact of intermediaries in the knowledge mobilisation sphere in education (Torres and Steponavičius, 2022[52]).
In their review of the main dataset, Oliver et al. (forthcoming[13]) highlight a few implications of a proliferation of knowledge mobilisation initiatives. First, it may imply that taking a systems perspective (i.e. mapping these initiatives and their associated actors) will need to be done routinely, which will be challenging. Second, in an ideal situation, new initiatives build on knowledge derived from existing initiatives (e.g. by drawing on evaluation data) in order to take an evidence-informed approach to what works in knowledge mobilisation. However, this kind of data is not always publicly accessible (see Chapter 6), and better ensuring its existence and accessibility will require substantial concerted effort and resources. Overall, it remains concerning that the now well-established organisational barriers to using evidence in policy persist, and may continue to hamper the effectiveness of new knowledge mobilisation initiatives.
This review of the subset of education papers included in the Oliver et al. (forthcoming[13]) review aimed to detail the main trends in research on knowledge mobilisation in education. A few key messages emerge from this analysis, with implications for future research co-ordination and collaboration in the education sector.
While the review was limited to studies written in English, the analysis confirms an already established phenomenon: that much research on knowledge mobilisation emerges from just a few Anglo-Saxon countries and contexts. If we interpret this as an indication that there is a vibrant community of knowledge mobilisation research in institutions in these countries, then it can be expected that this will continue to be the case for the foreseeable future. It is therefore important to dig deep to better understand how this research can be applied to other contexts, and which aspects of these studies are generalisable. On the other hand, this chapter has reiterated the need for more research on knowledge mobilisation in countries beyond the Anglosphere. Some notable efforts have been made to showcase knowledge mobilisation initiatives from non-Anglo-Saxon contexts, including the Evidence Informed Policymaking in Education in Europe project (see Gough et al. (2011[58]); Oliver et al. (2022[59])) and OECD efforts (OECD, 2022[17]; OECD, 2023[29]), including this publication.
The comparative analysis also confirms that cross-sectoral learning in the field of knowledge mobilisation remains a highly valuable activity (OECD, 2022[17]; Boaz, Oliver and Hopkins, 2022[60]). Learning from other sectors is one way to address long-standing challenges in knowledge mobilisation, and this chapter has highlighted instances where other knowledge domains can offer inspiration in areas where the education sector appears to be weaker. This includes implementing much-needed capacity-building initiatives in policy communities, but also establishing well-functioning mechanisms for synthesising evidence for policy (both of which are more established in the health sector).
While the growth in descriptions of knowledge mobilisation initiatives is a promising development, in domains such as education too little is being done to examine their effectiveness. Are these new initiatives learning from past initiatives? Are they connected to and building on one another? The evidence use field thus needs to consider how it can ensure learning on “what works” in these initiatives, which requires resources for conducting evaluations; and how it can make this learning available to those who need it, which necessitates better co-ordination and use of resources within the system (Oliver et al., forthcoming[13]). Crucially, systems approaches are needed to support and co-ordinate this growth in initiatives and ensure they are indeed effective, while adding to the cumulative knowledge base on effectiveness.
The data in this chapter show that while systematic reviews can be of great value, little is known about whether and how education policy makers use them. As evidence characteristics remain the main factor influencing use of evidence in policy, it remains important to address quality concerns in education research, while further research can explore how different types of evidence can be better incorporated into policy. Again, understanding how this is done in other policy sectors is a good starting point. Finally, while this chapter has focused on analysing the overall observable trends in the education subset in terms of their focus, there is also considerable value in understanding the substantive conclusions of such studies and collaboratively exploring their implications for policy and practice.
[5] Aldridge, D. et al. (2018), “Why the nature of educational research should remain contested: A statement from the new editors of the British Educational Research Journal”, British Educational Research Journal, Vol. 44/1, pp. 1-4, https://doi.org/10.1002/berj.3326.
[53] Alonzo, C. et al. (2022), “Building Sustainable Models of Research–Practice Partnerships Within Educational Systems”, American Journal of Speech-Language Pathology, Vol. 31/3, pp. 1-13, https://doi.org/10.1044/2021_ajslp-21-00181.
[18] Ashcraft, L., D. Quinn and R. Brownson (2020), “Strategies for effective dissemination of research to United States policymakers: a systematic review”, Implementation Science, Vol. 15/1, https://doi.org/10.1186/s13012-020-01046-3.
[9] Bainbridge, A., T. Troppe and J. Bartley (2022), “Responding to research evidence in Parliament: A case study on selective education policy”, Review of Education, Vol. 10/1, https://doi.org/10.1002/rev3.3335.
[1] Biesta, G. (2007), “Why “What Works” Won’t Work: Evidence-Based Practice and the Democratic Deficit in Educational Research”, Educational Theory, Vol. 57/1, pp. 1-22, https://doi.org/10.1111/j.1741-5446.2006.00241.x.
[60] Boaz, A., K. Oliver and A. Hopkins (2022), “Linking research, policy and practice: Learning from other sectors”, in Who Cares about Using Education Research in Policy and Practice?: Strengthening Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/70c657bc-en.
[39] Brown, C. (2014), “The policy agora: how power inequalities affect the interaction between researchers and policy makers”, Evidence & Policy, Vol. 10/3, pp. 421-438, https://doi.org/10.1332/174426514x672353.
[38] Brown, C. (2012), “The ‘policy-preferences model’: a new perspective on how researchers can facilitate the take-up of evidence by educational policy makers”, Evidence & Policy, Vol. 8/4, pp. 455-472, https://doi.org/10.1332/174426412x660106.
[4] Burkhardt, H. and A. Schoenfeld (2003), “Improving Educational Research: Toward a More Useful, More Influential, and Better-Funded Enterprise”, Educational Researcher, Vol. 32/9, pp. 3-14, https://doi.org/10.3102/0013189x032009003.
[19] Campbell, C. et al. (2017), “Developing a knowledge network for applied education research to mobilise evidence in and for educational practice”, Educational Research, Vol. 59/2, pp. 209-227, https://doi.org/10.1080/00131881.2017.1310364.
[15] Cherney, A. et al. (2013), “The utilisation of social science research – the perspectives of academic researchers in Australia”, Journal of Sociology, Vol. 51/2, pp. 252-270, https://doi.org/10.1177/1440783313505008.
[25] Cherney, A. et al. (2012), “What influences the utilisation of educational research by policy-makers and practitioners?: The perspectives of academic educational researchers”, International Journal of Educational Research, Vol. 56, pp. 23-34, https://doi.org/10.1016/j.ijer.2012.08.001.
[27] Daly, A. et al. (2014), “Misalignment and Perverse Incentives”, Educational Policy, Vol. 28/2, pp. 145-174, https://doi.org/10.1177/0895904813513149.
[30] Education.org (2021), Calling for an Education Knowledge Bridge: A White Paper to Advance Evidence Use in Education, https://whitepaper.education.org/download/white_paper.pdf.
[37] EEF (2024), EEF’s programme pipeline of evidence generation, https://educationendowmentfoundation.org.uk/projects-and-evaluation/evaluation/process-and-people/pipeline-of-eef-trials (accessed on 8 November 2024).
[36] EEF (2021), Maximising the Impact of Teaching Assistants: Evaluation Report, https://educationendowmentfoundation.org.uk/projects-and-evaluation/projects/maximising-the-impact-of-teaching-assistants.
[14] EPPI (2024), Barfac visualisation: What factors influence evidence use?, https://eppi.ioe.ac.uk/eppi-vis/Review/Index/616 (accessed on 16 September 2024).
[46] Farley-Ripple, E., K. Oliver and A. Boaz (2020), “Mapping the community: use of research evidence in policy and practice”, Humanities and Social Sciences Communications, Vol. 7/1, https://doi.org/10.1057/s41599-020-00571-2.
[26] Forbes, C. (2022), “Exploring barriers and solutions to encouraging evidence‐into‐use within an embedded evaluation approach: Reflections from the field”, Review of Education, Vol. 10/2, https://doi.org/10.1002/rev3.3351.
[40] Gleeson, J. et al. (2020), “Challenges and Opportunities of Evidence Use in Practice in Australian Children’s Development Programs”, Journal of Evidence-Based Social Work, Vol. 17/5, pp. 593-610, https://doi.org/10.1080/26408066.2020.1781727.
[3] Gorard, S., B. See and N. Siddiqui (2020), “What is the evidence on the best way to get evidence into use in education?”, Review of Education, Vol. 8/2, pp. 570-610, https://doi.org/10.1002/rev3.3200.
[58] Gough, D. et al. (2011), Evidence Informed Policymaking in Europe: EIPEE Final Project Report, Institute of Education, University of London.
[49] Grek, S. and C. Ydesen (2021), “Where science met policy: governing by indicators and the OECD’s INES programme”, Globalisation, Societies and Education, Vol. 19/2, pp. 122-137, https://doi.org/10.1080/14767724.2021.1892477.
[48] Heinrich, C. and A. Good (2018), “Research-informed practice improvements: exploring linkages between school district use of research evidence and educational outcomes over time”, School Effectiveness and School Improvement, Vol. 29/3, pp. 418-445, https://doi.org/10.1080/09243453.2018.1445116.
[44] Helgetun, J. and I. Menter (2020), “From an age of measurement to an evidence era? Policy-making in teacher education in England”, Journal of Education Policy, Vol. 37/1, pp. 88-105, https://doi.org/10.1080/02680939.2020.1748722.
[31] Higgins, S. (2020), “The development and worldwide impact of the Teaching and Learning Toolkit”, in Getting Evidence into Education, Routledge.
[42] Hill, J. and J. Torres (2023), “Terms of engagement: Where learning meets culture”, in Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/bfd04a1f-en.
[51] Hopkins, M. et al. (2018), “Brokering research in science education policy implementation: the case of a professional association”, Evidence & Policy, Vol. 14/03, pp. 459-476, https://doi.org/10.1332/174426418x15299595170910.
[24] Ion, G., E. Marin and C. Proteasa (2018), “How does the context of research influence the use of educational research in policy-making and practice?”, Educational Research for Policy and Practice, Vol. 18/2, pp. 119-139, https://doi.org/10.1007/s10671-018-9236-4.
[47] Matthews, P. et al. (2018), “Everyday stories of impact: interpreting knowledge exchange in the contemporary university”, Evidence and Policy, Vol. 14/04, pp. 665-682, https://doi.org/10.1332/174426417x14982110094140.
[57] Maxwell, B., J. Sharples and M. Coldwell (2022), “Developing a systems‐based approach to research use in education”, Review of Education, Vol. 10/3, https://doi.org/10.1002/rev3.3368.
[32] McKnight, L. and A. Morgan (2019), “A broken paradigm? What education needs to learn from evidence-based medicine”, Journal of Education Policy, Vol. 35/5, pp. 648-664, https://doi.org/10.1080/02680939.2019.1578902.
[22] Mouthaan, M. and N. Révai (2023), “Building a culture of research engagement in education”, in Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/c8ebdafe-en.
[28] Mouthaan, M. and M. Steponavičius (2023), “Co-ordinating the production of education research: Towards a system-level culture”, in Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/dd194cf4-en.
[11] Nutley, S. et al. (2019), “New development: What works now? Continuity and change in the use of evidence to improve public policy and service delivery”, Public Money & Management, Vol. 39/4, pp. 310-316, https://doi.org/10.1080/09540962.2019.1598202.
[16] O’Connor, J. (2022), “Evidence based education policy in Ireland: insights from educational researchers”, Irish Educational Studies, Vol. 43/1, pp. 21-45, https://doi.org/10.1080/03323315.2021.2021101.
[41] OECD (2024), “Yes Minister, Yes Evidence: Structures and skills for better evidence use in education policy”, OECD Education Policy Perspectives, No. 96, OECD Publishing, Paris, https://doi.org/10.1787/6f97bcda-en.
[29] OECD (2023), Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/bc641427-en.
[17] OECD (2022), Who Cares about Using Education Research in Policy and Practice?: Strengthening Research Engagement, Educational Research and Innovation, OECD Publishing, Paris, https://doi.org/10.1787/d7ff793d-en.
[59] Oliver, K. et al. (2022), “What works to promote research-policy engagement?”, Evidence & Policy, Vol. 18/4, pp. 691-713, https://doi.org/10.1332/174426421x16420918447616.
[12] Oliver, K. et al. (2014), “A systematic review of barriers to and facilitators of the use of evidence by policymakers”, BMC Health Services Research, Vol. 14/1, https://doi.org/10.1186/1472-6963-14-2.
[13] Oliver, K. et al. (forthcoming), “What Factors Influence Evidence Use in Policy?”, Evidence & Policy.
[34] Parra, J. and D. Edwards (2024), “Challenging the gold standard consensus: Randomised controlled trials (RCTs) and their pitfalls in evidence-based education”, Critical Studies in Education, pp. 1-18, https://doi.org/10.1080/17508487.2024.2314118.
[2] Pellegrini, M. and G. Vivanet (2020), “Evidence-Based Policies in Education: Initiatives and Challenges in Europe”, ECNU Review of Education, Vol. 4/1, pp. 25-45, https://doi.org/10.1177/2096531120924670.
[45] Penuel, W. et al. (2015), “Conceptualizing Research–Practice Partnerships as Joint Work at Boundaries”, Journal of Education for Students Placed at Risk (JESPAR), Vol. 20/1-2, pp. 182-197, https://doi.org/10.1080/10824669.2014.988334.
[35] Pino-Yancovic, M. et al. (2023), “Organisational and network culture: A lens on leadership”, in Mouthaan, M. (ed.), Who Really Cares about Using Education Research in Policy and Practice?: Developing a Culture of Research Engagement, OECD Publishing, Paris, https://doi.org/10.1787/ae4354f1-en.
[56] Ranchod, R. (2017), “Building the research-policy nexus in South Africa: insights from a skills planning policy intervention”, Evidence and Policy, Vol. 13/4, pp. 669-686, https://doi.org/10.1332/174426416x14748859703063.
[10] Rickinson, M. et al. (2022), “A framework for understanding the quality of evidence use in education”, Educational Research, Vol. 64/2, pp. 133-158, https://doi.org/10.1080/00131881.2022.2054452.
[50] Rubin, P. and E. Ness (2019), “State Higher Education Governing Agencies and the Knowledge Brokering Process: Investigating Their Role as Multi-facing Organizations in the United States”, Higher Education Policy, Vol. 34/3, pp. 643-663, https://doi.org/10.1057/s41307-019-00155-z.
[21] Slade, S., K. Philip and M. Morris (2018), “Frameworks for embedding a research culture in allied health practice: A rapid review”, Health Research Policy and Systems, Vol. 16/1, pp. 1-15, https://doi.org/10.1186/S12961-018-0304-2/TABLES/3.
[43] Street, C. et al. (2021), “Do numbers speak for themselves? Exploring the use of quantitative data to measure policy ‘success’ in historical Indigenous higher education in the Northern Territory, Australia”, Race Ethnicity and Education, Vol. 25/3, pp. 309-330, https://doi.org/10.1080/13613324.2021.2019003.
[52] Torres, J. and M. Steponavičius (2022), “More than just a go-between: The role of intermediaries in knowledge mobilisation”, OECD Education Working Papers, No. 285, OECD Publishing, Paris, https://doi.org/10.1787/aa29cfd3-en.
[8] Van Damme, D. (2024), Keynote speech: Conference: Evidence-informed education, policy and practice. Belgian presidency Council of the European Union.
[20] Van Damme, D. (2022), The Power of Proofs: (Much) Beyond RCTs, https://curriculumredesign.org/wp-content/uploads/The-Power-of-Proofs.pdf (accessed on 24 October 2024).
[6] Wainwright, E. et al. (2020), “Why educational research should remain mindful of its position: Questions of boundaries, identity and scale”, British Educational Research Journal, Vol. 46/1, pp. 1-5, https://doi.org/10.1002/berj.3594.
[54] Wentworth, L., C. Mazzeo and F. Connolly (2017), “Research practice partnerships: a strategy for promoting evidence-based decision-making in education”, Educational Research, Vol. 59/2, pp. 241-255, https://doi.org/10.1080/07391102.2017.1314108.
[7] Wrigley, T. (2018), “The power of ‘evidence’: Reliable science or a set of blunt tools?”, British Educational Research Journal, Vol. 44/3, pp. 359-376, https://doi.org/10.1002/berj.3338.
[33] Wrigley, T. and S. McCusker (2019), “Evidence-based teaching: a simple view of “science””, Educational Research and Evaluation, Vol. 25/1-2, pp. 110-126, https://doi.org/10.1080/13803611.2019.1617992.
[23] Wu, X., M. Ramesh and M. Howlett (2015), “Policy capacity: A conceptual framework for understanding policy competences and capabilities”, Policy and Society, Vol. 34/3-4, pp. 165-171, https://doi.org/10.1016/j.polsoc.2015.09.001.
[55] Zapp, M. and J. Powell (2017), “Moving towards Mode 2? Evidence-based policy-making and the changing conditions for educational research in Germany”, Science and Public Policy, https://doi.org/10.1093/scipol/scw091.
The 2024 updated systematic review screened on title and abstract (n = 18 175) and on full text (n = 5 444), of which 2 199 papers were included in the final review. Fifteen electronic databases were searched using search strings and keywords reflecting three concepts: “evidence”; “barriers and facilitators”; and “policy makers”. To be included, studies had to be empirical research (describing a study design) or systematic reviews describing or explaining how evidence is used in policy or practice. Studies that did not report on factors influencing evidence use were excluded, as were studies in languages other than English. Multiple reviewers from different knowledge domains were involved in coding studies. Studies were stored, screened and keyworded using the Evidence for Policy and Practice Information (EPPI) Reviewer software, where keywording was done in categories such as knowledge domain, overall focus of the study and the study’s aim in relation to exploring evidence use.
For some categories – such as factors found to influence evidence use, and characterisation of interventions – a pre-defined list was used to code but was iteratively updated if reviewers proposed new factors or types of interventions based on their reviews. For example, “factors relating to intermediaries” was added as a code for factors found to influence evidence use when identified by several reviewers.
Further details on the protocol for the updated 2024 systematic review on factors affecting evidence use in policy can be found in Oliver et al. (forthcoming[13]).
| Author(s) | Year | Title |
|---|---|---|
| Afdal, Hilde Wågsås | 2013 | Policy making processes with respect to teacher education in Finland and Norway |
| Alonzo, Crystle N; Komesidou, Rouzana; Wolter, Julie A; Curran, Maura; Ricketts, Jessie; Hogan, Tiffany P. | 2022 | Building Sustainable Models of Research-Practice Partnerships Within Educational Systems |
| Archibald, T. | 2015 | "They just know": the epistemological politics of "evidence-based" non-formal education |
| Ashcraft, L.E; Quinn, D.A; Brownson, R.C. | 2020 | Strategies for effective dissemination of research to United States policy makers: a systematic review |
| Azuh, D; Joshua, S; Ibietan, J. | 2016 | The use of data in policy-making in Nigeria's educational sector: Implications for national development |
| Baek, Chanwoong | 2022 | Internalising externalisation: utilisation of international knowledge in education policy making |
| Bainbridge, A; Troppe, T; Bartley, J. | 2022 | Responding to research evidence in Parliament: A case study on selective education policy |
| Bartlett, Will | 2013 | Obstacles to Evidence-based Policy-making in the EU Enlargement Countries: The Case of Skills Policies |
| Bélanger, N; Dulude, E. | 2021 | Investigating the challenges and opportunities of a bilingual equity knowledge brokering network: A critical and reflective perspective from university partners |
| Bonetti, Sara | 2017 | Identifying the complexities of early care and education professionals' use and production of quantitative data |
| Briscoe, Patricia; Persad, Robin | 2021 | Building entrepreneurial researcher capacity to increase positive changes in practice |
| Brown, C. | 2012 | The 'policy-preferences model': a new perspective on how researchers can facilitate the take-up of evidence by educational policy makers |
| Brown, Chris | 2014 | The policy agora: How power inequalities affect the interaction between researchers and policy makers |
| Brown, Chris; Rogers, Sue | 2014 | Measuring the effectiveness of knowledge creation as a means of facilitating evidence-informed practice in early years settings in one London borough |
| Bushouse, B.K; Mosley, J.E. | 2018 | The intermediary roles of foundations in the policy process: building coalitions of interest |
| Campbell, Carol; Pollock, Katina; Briscoe, Patricia; Carr-Harris, Shasta; Tuters, Stephanie | 2017 | Developing a knowledge network for applied education research to mobilise evidence in and for educational practice |
| Cherney, Adrian; Povey, Jenny; Head, Brian; Boreham, Paul; Ferguson, Michele | 2012 | What influences the utilisation of educational research by policy-makers and practitioners? The perspectives of academic educational researchers |
| Cherney, Adrian; Head, Brian; Povey, Jenny; Boreham, Paul; Ferguson, Michele | 2015 | The utilisation of social science research: The perspectives of academic researchers in Australia |
| Cohen, A.K; Ozer, E.J; Abraczinskas, M; Voight, A; Kirshner, B; Devinney, M. | 2020 | Opportunities for youth participatory action research to inform school district decisions |
| Connolly, Faith; Durant, Tracey L; Durham, Rachel E. | 2020 | Indicators to Assist in Addressing Equity Through Policy Adoption |
| Cooper, Amanda | 2014 | Knowledge mobilisation in education across Canada: a cross-case analysis of 44 research brokering organisations |
| Daly, Alan J; Finnigan, Kara S; Jordan, Stuart; Moolenaar, Nienke M; Che, Jing | 2014 | Misalignment and Perverse Incentives: Examining the Politics of District Leaders as Brokers in the Use of Research Evidence |
| De Lisle, Jerome | 2016 | Evolving data use policy in Trinidad and Tobago: The search for actionable knowledge on educational improvement in a small island developing state |
| De Ruiter, R; Schalk, J. | 2017 | Explaining cross-national policy diffusion in national parliaments: A longitudinal case study of plenary debates in the Dutch parliament |
| Farley-Ripple, Elizabeth N. | 2012 | Research Use in School District Central Office Decision Making: A Case Study |
| Faubert, B.C. | 2019 | Transparent resource management: Implications for leadership and democracy in education |
| Fitzgerald, K.G; Tipton, E. | 2022 | The Meta-Analytic Rain Cloud Plot: A New Approach to Visualizing Clearinghouse Data |
| Forbes, S. | 2017 | Determining state sector statistics training priorities |
| Forbes, C. | 2022 | Exploring barriers and solutions to encouraging evidence-into-use within an embedded evaluation approach: Reflections from the field |
| Gándara, Denisa; Hearn, James C. | 2019 | College Completion, the Texas Way: An Examination of the Development of College Completion Policy in a Distinctive Political Culture |
| Gleeson, J; Walsh, L; Rickinson, M; Kirkby, J; O'Donovan, R; Grimmett, H. | 2020 | Challenges and Opportunities of Evidence Use in Practice in Australian Children's Development Programs |
| Gorard, S; See, B.H; Siddiqui, N. | 2020 | What is the evidence on the best way to get evidence into use in education? |
| Green, Chris; Taylor, Celia; Buckley, Sharon; Hean, Sarah | 2016 | Beyond synthesis: augmenting systematic review procedures with practical principles to optimise impact and uptake in educational policy and practice |
| Grek, Sotiria; Ydesen, Christian | 2021 | Where science met policy: governing by indicators and the OECD's INES programme |
| Head, B.W. | 2016 | Toward More "Evidence-Informed" Policy Making? |
| Heinrich, Carolyn J; Good, Annalee | 2018 | Research-informed practice improvements: exploring linkages between school district use of research evidence and educational outcomes over time |
| Helgetun, Jo B; Menter, Ian | 2022 | From an age of measurement to an evidence era? Policy-making in teacher education in England |
| Hodgson, Ann; Spours, Ken | 2016 | Restrictive and expansive policy learning: challenges and strategies for knowledge exchange in upper secondary education across the four countries of the United Kingdom |
| Hopkins, M; Wiley, K.E; Penuel, W.R; Farrell, C.C. | 2018 | Brokering research in science education policy implementation: the case of a professional association |
| Hordósy, Rita | 2017 | How do different stakeholders utilise the same data? The case of school leavers and graduates information systems in three European countries |
| Ion, G; Marin, E; Proteasa, C. | 2019 | How does the context of research influence the use of educational research in policy-making and practice? |
| Jimerson, Jo Beth; Childs, Joshua | 2017 | Signal and Symbol: How State and Local Policies Address Data-Informed Practice |
| Johns, Carolyn; MacLellan, Duncan | 2020 | Public administration in the cross-hairs of evidence-based policy and authentic engagement: School closures in Ontario |
| Lassnigg, Lorenz | 2012 | 'Use of current best evidence': Promises and illusions, limitations and contradictions in the triangle of research, policy and practice |
| Lewis, David | 2018 | Peopling policy processes? Methodological populism in the Bangladesh health and education sectors |
| Lewis, Maria M; Bray, Laura E. | 2019 | A call for amicus briefs as a means to influence special education policy: Lessons learned from Endrew F. |
| Lotz-Sisitka, Heila; Rosenberg, Eureta; Ramsarup, Presha | 2021 | Environment and sustainability education research as policy engagement: (re-) invigorating 'politics as potential' in South Africa |
| Lundin, Martin; Öberg, PerOla | 2014 | Expert knowledge use and deliberation in local policy making |
| MacGregor, Stephen; Malin, Joel R; Farley-Ripple, Elizabeth N. | 2022 | An Application of the Social-ecological Systems Framework to Promoting Evidence-informed Policy and Practice |
| Marin, Elena; Ion, Georgeta; Stîngu, Mihaela | 2019 | How can researchers facilitate the utilisation of research by policy-makers and practitioners in education? |
| Marin, Patricia; Yun, John T; Garces, Liliana M; Horn, Catherine L. | 2020 | Bridging Divides: Understanding Knowledge Producers' Perspectives on the Use of Research in Law |
| Matthews, P; Rutherfoord, R; Connelly, S; Richardson, L; Durose, C; Vanderhoven, D. | 2018 | Everyday stories of impact: interpreting knowledge exchange in the contemporary university |
| McDonnell, Lorraine M; Weatherford, M. | 2013 | Evidence use and the Common Core State Standards movement: From problem definition to policy adoption |
| McShane, Ian | 2016 | 'Educare' in Australia: Analysing policy mobility and transformation |
| Molla, Tebeje | 2014 | Knowledge aid as instrument of regulation: World Bank's non-lending higher education support for Ethiopia |
| Montz, Burrell E; Galluppi, Kenneth J; Losego, Jessica L; Smith, Catherine F. | 2015 | Winter weather decision-making: North Carolina school closures, 2010-2011 |
| Nathanail, E; Adamos, G; Mitropoulos, L; Karakikes, I; Yatskiv, I. | 2020 | How efficiently educational programs prepare professionals to meet current and future challenges of transport interchanges |
| Niemann, Dennis; Martens, Kerstin | 2018 | Soft governance by hard fact? The OECD as a knowledge broker in education policy |
| O'Connor, J. | 2022 | Evidence based education policy in Ireland: insights from educational researchers |
| Orr, L.L; Olsen, R.B; Bell, S.H; Schmid, I; Shivji, A; Stuart, E.A. | 2019 | Using the Results from Rigorous Multisite Evaluations to Inform Local Policy Decisions |
| Pasachoff, Eloise | 2017 | Two Cheers for Evidence: Law, Research, and Values in Education Policy Making and Beyond |
| Pizmony-Levy, Oren; McDermott, Meredith; Copeland, Thaddeus T. | 2021 | Improving Environmental and Sustainability Education (ESE) policy through research-practice partnerships: Reflections and analysis from New York City |
| Project Evident | 2021 | Actionable evidence: toward equitable outcomes: practical framework for research, evaluation, and technical assistance in the education and social sectors |
| Ramot, R; Bialik, G. | 2020 | Researchers as Knowledge Brokers: A Step toward Research-Informed Policy? Lessons from the Israeli Case |
| Ranchod, R. | 2017 | Building the research-policy nexus in South Africa: insights from a skills planning policy intervention |
| Reckhow, Sarah; Tompkins-Stange, Megan | 2018 | Financing the education policy discourse: philanthropic funders as entrepreneurs in policy networks |
| Rickinson, Mark; de Bruin, Kate; Walsh, Lucas; Hall, Matthew | 2017 | What can evidence-use in practice learn from evidence-use in policy? |
| Rickinson, M; Walsh, L; de Bruin, K; Hall, M. | 2019 | Understanding evidence use within education policy: a policy narrative perspective |
| Rubin, Paul G; Ness, Erik C. | 2021 | State Higher Education Governing Agencies and the Knowledge Brokering Process: Investigating Their Role as Multi-facing Organizations in the United States |
| Schlaufer, C. | 2016 | Global evidence in local debates: the Programme for International Student Assessment (PISA) in Swiss direct-democratic debates on school policy |
| See, Beng Huat | 2018 | Evaluating the evidence in evidence-based policy and practice: Examples from systematic reviews of literature |
| Segerholm, Christina; Lindgren, Joakim; Novak, Judit | 2022 | Evidence-Based Governing? Educational Research in the Service of the Swedish Schools Inspectorate |
| Shepherd, Jonathan | 2014 | How to achieve more effective services |
| Sirat, Morshidi; Azman, Norzaini | 2014 | Malaysia's National Higher Education Research Institute (IPPTN): narrowing the research-policy gap in a dynamic higher education system |
| Skalitzky, E; Joyner, H; Weymouth, L. | 2022 | Local Data for Action: Statewide Dissemination of School Wellness Policy Evaluations in Wisconsin |
| Steiner-Khamsi, Gita; Karseth, Berit; Baek, Chanwoong | 2020 | From science to politics: commissioned reports and their political translation into White Papers |
| Stratford, R; Wals, A.E. | 2020 | In search of healthy policy ecologies for education in relation to sustainability: Beyond evidence-based policy and post-truth politics |
| Street, C; Guenther, J; Smith, J; Robertson, K; Ludwig, W; Motlap, S; Woodroffe, T; Ober, R; Gillan, K; Larkin, S. | 2022 | Do numbers speak for themselves? Exploring the use of quantitative data to measure policy 'success' in historical Indigenous higher education in the Northern Territory, Australia |
| Wentworth, Laura; Mazzeo, Christopher; Connolly, Faith | 2017 | Research practice partnerships: A strategy for promoting evidence-based decision-making in education |
| Winton, Sue; Evans, Michael P. | 2016 | Consulting, Mediating, Conducting, and Supporting: How Community-Based Organizations Engage with Research to Influence Policy |
| Wollscheid, Sabine; Stensaker, Bjørn; Bugge, Markus M. | 2019 | Evidence-Informed Policy and Practice in the Field of Education: The Dilemmas Related to Organizational Design |
| Wyse, Dominic; Torgerson, Carole | 2017 | Experimental trials and 'what works?' in education: The case of grammar for writing |
| Yingling, D.L; Mallinson, D.J. | 2020 | Explaining variation in evidence-based policy making in the American states |
| Zapp, Mike; Powell, Justin J.W. | 2017 | Moving towards Mode 2? Evidence-based policy-making and the changing conditions for educational research in Germany |
| Zhang, Q.T. | 2018 | Theory, practice and policy: A longitudinal study of university knowledge exchange in the United Kingdom |
1. The 2021 OECD Strengthening the Impact of Education Research policy survey (see OECD (2022[17])) collected data from 37 ministries and departments of education in 29 countries to map the knowledge mobilisation mechanisms, actors and challenges across education systems. Responses were collected from ministries of education at the national or sub-national level (e.g. state, province, canton).