No more grand declarations—it’s time for action, say Stephen Curry and James Wilsdon
The stunning interim results of Moderna's Covid-19 vaccine trial, hot on the heels of a similar announcement from Pfizer and BioNTech last week, are the best news many of us have had for months. It's another reminder, if any were needed, of the international research community's remarkable response to the pandemic.
But before we get swept up in self-congratulation, we should pause to reflect on how this year’s crisis has also illuminated the inner workings of research. In lots of ways, both good and bad, 2020 has intensified scrutiny of how research is funded, practised, disseminated and evaluated, and how research cultures can be made more open, inclusive and impactful.
The uncertain possibilities opened up by this moment follow a period of growing concern over several long-standing problems, all linked to research assessment.
First, there is the misapplication of narrow criteria and indicators of research quality or impact, in ways that distort incentives, create unsustainable pressures on researchers, and exacerbate problems with research integrity and reproducibility.
Second, this narrowing of criteria and indicators has reduced the diversity of research missions and purposes, leading institutions and researchers to adopt similar strategic priorities, or to focus on lower-risk, incremental work.
Third, systemic biases against those who do not meet (or choose not to prioritise) narrow criteria and indicators of quality, impact or career progression have reduced the diversity, vitality and representative legitimacy of the research community.
Finally, there has been a diversion of policy and managerial attention towards things that can be measured, at the expense of less tangible or quantifiable qualities, impacts, assets and values. The rise of flawed university league tables has exacerbated this trend.
Swelling tide
We've been involved in diagnosing these problems, assembling evidence and banging drums about them, through initiatives such as the Declaration on Research Assessment (Dora), the Metric Tide report and the UK Forum for Responsible Research Metrics.
So we welcome signs that attention is shifting towards implementing solutions, and coalescing around a more expansive agenda for responsible research assessment (RRA). Early debates on metrics and measurement have expanded to encompass questions about how to create a healthy work culture for researchers, how to promote research integrity, how to move from closed to open scholarship, and how to embed the principles of equality, diversity and inclusion across the research community.
This more holistic approach can be seen, for example, in UK Research and Innovation’s commitment to a healthy research culture, and in the recent guidelines on good research practice from the German Research Foundation (DFG).
Next week’s Global Research Council virtual conference on RRA—hosted by UKRI in collaboration with the UK Forum for Responsible Research Metrics and South Africa’s National Research Foundation—comes at a pivotal time.
State of play
Ahead of the conference, we're today publishing a working paper through the Research on Research Institute, together with our colleagues Sarah de Rijcke, scientific director and professor of science, technology and innovation at the Centre for Science and Technology Studies (CWTS); Anna Hatch, programme director at Dora; Gansen Pillay, deputy chief executive officer of the National Research Foundation; and Inge van der Weijden, lecturer and PhD coordinator at CWTS. It is intended as both a primer and a conversation starter.
The paper explores what RRA is, and where it comes from, by outlining 15 initiatives that have influenced the content, shape and direction of current debates, and the responses they have elicited. We focus on the role of research funders, who have more freedom and power to experiment and drive change than many other actors in research systems.
We also present the findings of a survey of RRA policies and practices among GRC participant organisations—mostly public funding agencies—with responses from 55 organisations worldwide.
Their responses show a shift away from reliance on metrics towards more qualitative or mixed-methods modes of assessment. Alternative CV formats are now being piloted or implemented by almost 60 per cent of respondents from all regions.
This is real progress, particularly as the conversation has expanded beyond Europe and North America to become genuinely global. Some of the most exciting innovations in RRA are coming out of Latin America and Africa.
Time for action
Declarations and statements of principle have been an important part of this story. But even though we have co-authored some of these, we feel the time for grand declarations has passed. They risk becoming substitutes for action.
RRA now needs to focus on action and implementation—testing and identifying what works in building a healthy and productive research culture. Institutional commitments must be followed by the hard graft of reforming cultures, practices and processes.
The research community also needs an open, global forum where common values and important differences can be articulated and debated, and where emerging good practices can be shared. The Global Research Council is ideally placed to play a role here, bringing in views and voices from across global research.
Whether you’re an advocate, a critic or entirely agnostic about RRA, we hope you’ll join us and more than 500 others in making next week’s conference the start of a fresh chapter in these debates.
Stephen Curry is professor of structural biology and assistant provost for equality, diversity and inclusion at Imperial College London. He is also chair of the Declaration on Research Assessment and a co-author of The Metric Tide
James Wilsdon is director of the Research on Research Institute and Digital Science Professor of Research Policy at the University of Sheffield. He chaired the Metric Tide review and the European Commission's Next Generation Metrics Expert Group, and is a member of the UK Forum for Responsible Research Metrics