
Policy responses to the Covid-19 crisis have put a spotlight on the tensions and constraints of making evidence-informed policy decisions. Against a backdrop of growing anti-science rhetoric, fuelled by populist politicians and some media outlets, the record speed at which scientists across the world mobilised to understand, treat and prevent Covid-19 has been an unquestionable triumph of science. At the same time, disagreement among experts, ineffective communications, and uncertainty and ambiguity regarding the most adequate measures (perhaps all inevitable given the unprecedented nature of this crisis) have shown the limits of ‘following the science’ in policymaking.

Covid-19 might not quite be the ‘revenge of the experts’, but should we reconcile ourselves with the idea that evidence-based policymaking is doomed to be an oxymoron? At the GO Lab, we think not. And while we haven’t stumbled upon the perfect formula for evidence-based policymaking, here are three ingredients for better use of evidence in policy and practice, tried and tested through our work. We’ve distilled these with help from three of our Visiting Fellows of Practice, Tim Hobbs, Sam Windett and Gen Maitland-Hudson. We’re hugely grateful to them for inspiring us to reflect more deeply on the challenges and opportunities of using evidence.

Embrace a broader understanding of evidence 

In the world of academia, randomised controlled trials (RCTs) are seen as the gold standard for evaluating the effectiveness of interventions, treatments or programmes. In health research there is a well-defined hierarchy of evidence, but in the infinitely complex world of social policy it is often far less clear what constitutes appropriate evidence. Even the best-designed RCTs can only tell part of the story: an RCT might show whether a programme worked, but won’t tell us why it worked, or whether it will work just as effectively in a different context. It also can’t tell us whether the programme was desirable from the perspective of the intended beneficiaries, or what lessons were learnt by those implementing it.

This is why it’s important to understand what ‘good evidence’ for policy looks like, beyond hierarchies, and to generate evidence using tools that are relevant to specific policy goals and appropriate in the local context. We do this in our own work at the GO Lab, for example in the way we have approached our role as an evaluator and learning partner of the Life Chances Fund (LCF). To investigate whether and how impact bonds deliver better outcomes and value for money compared to alternative commissioning approaches, we employ a range of research methods. This includes collecting data through the fund administration process and from stakeholder surveys administered across all LCF projects, alongside more in-depth evaluations of selected projects comparing the provision of the same or very similar services under impact bonds and other funding mechanisms.

Co-design your research with policymakers 

Data-hungry researchers are often driven by intellectual curiosity as much as the desire to help address pressing social challenges. But a mismatch between research and policy priorities (which can often change at a much swifter pace than a research study can ever shift course) can significantly limit the effective use of evidence in policy decisions. 

To ensure evidence speaks to policymaking priorities, it can be helpful to co-design and co-produce research with policymakers. This may not always be possible, or even advisable. Co-designing research with policymakers, especially when they are funding the research, can introduce power dynamics that risk compromising the rigour, integrity and credibility of the evidence. From our partnership with the UK Government, we have seen that these tensions can be mitigated by being clear upfront about the research questions and respective methodologies, whilst operating in a culture that prioritises openness, transparency and a learning mindset.

Constantly building or consolidating bridges between evidence users and evidence generators is key, and in the long-term we should strive to move towards a shared culture of genuine data sharing and transparency across government and academia. This inspired our INDIGO data collaborative – our effort to galvanise a community of data lovers, practitioners, and policymakers that share, curate and use data effectively. Through our INDIGO peer learning group, we make data more accessible to policymakers, highlight the practical value of sharing data, and explore the many ways in which existing data can inform policy decisions. Our Hack and Learn events bring together not just data enthusiasts, but also civil servants, lawyers, academics, students, and social entrepreneurs. They offer engaging and highly participatory ways for policymakers to look under the bonnet of how data is collated, analysed, and synthesised to generate new insights.  

Get your voice heard by building consensus 

Talented researchers shine brightest when their work enables them to challenge widely held (but misplaced) beliefs. Critical questioning and probing to unearth surprising, perhaps uncomfortable, truths is at the heart of good research. This can make it hard to generate consensus among the scientific community itself (as Covid-19 has demonstrated), let alone among policymakers and the myriad groups of stakeholders competing for their ear. 

Even the highest quality, most compelling evidence only gives policymakers one perspective among the many seeking to inform and influence policymaking. At the GO Lab, we are committed to working collaboratively with other researchers and, crucially, the community of practitioners, to not only generate fresh evidence but also synthesise existing knowledge and build consensus about the implications of what the evidence tells us. The peer learning networks we curate provide an open space for knowledge sharing, but also for constructive debate. By acting as a neutral but credible convenor, we hope to move communities of practitioners and academics closer to consensus, not just on challenges but also on pragmatic ways forward.

In the world of impact bonds, we are still far from unequivocal consensus. But we are much encouraged that broad agreement is emerging globally around the need to view impact bonds as one tool (among others) for achieving better social outcomes, and to understand what, beneath the buzzwords and hype, are the active ingredients that may make impact bonds a more effective tool for tackling complex social problems.

So, how do we make evidence great again?  

By making it more open, fun and inclusive. This is a mindset we seek to nurture among all those we work with – policymakers and beyond. At the GO Lab, we’ve been proactive in building this culture within our community, but changing mindsets and practice across government is infinitely more challenging.  

At the start of the coronavirus outbreak, governments across the world made strong public commitments to following scientific evidence and advice to inform Covid-19 policies, with countries taking a range of approaches to bringing evidence into policy responses. But the way the crisis has been handled over the past twelve months has shown the challenges of getting evidence into practice. A lack of transparency around decision-making, uncertain or shifting evidence, public misinformation and an overabundance of (sometimes contradictory) information can combine to form a lethal cocktail, leading not only to poor policymaking but also declining trust in the scientific community and government. We hope that at the GO Lab we’ve got some of the antidotes. 

Want to join us on our journey to make evidence great again?

There is no better opportunity to do so than our annual Social Outcomes Conference. We want the conference to serve as an inclusive platform that fosters greater awareness of, and engagement with, the varied types of evidence emerging in the field of social outcomes. The ever-growing number of participants in the conference gives us hope that there is increasing recognition of the need to look at evidence holistically, and a genuine appetite for global cooperation to contextualise existing evidence and generate new insights. To add your voice to the vibrant community that the conference gathers annually, why not check out our Call for Papers and Presentations and make a submission?