MetaROR

Translating Publish-Review-Curate outputs into actionable signals for research funders: A MetaROR Study

Summary

The Publish-Review-Curate (PRC) model adopted by MetaROR (MetaResearch Open Review) removes the binary “accept/reject” gate from the assessment of scholarly contributions. The evaluation of articles submitted to the platform, through peer reviews and an editorial assessment, is openly accessible, making the value of each contribution apparent in qualitative terms.

While this transparency accelerates knowledge flow, funders still struggle to recognise PRC outputs when allocating scarce resources. This mixed‑methods project will:

  1. Map funders’ readiness to use reviewed preprints
  2. Uncover barriers faced by reviewers and panel members
  3. Lay the foundations for co-designing a prototype and piloting machine‑ and human‑readable artefacts (“PRC Evaluation Summaries”) that let evaluators grasp at a glance the strengths, weaknesses, and societal relevance of research hosted on MetaROR

Funders involved include:

  • Austrian Science Fund (FWF)
  • Dutch Research Council (NWO)
  • Swiss National Science Foundation (SNSF)
  • Social Sciences and Humanities Research Council of Canada (SSHRC)

Background

  • Recognition & rewards in flux: Global initiatives (DORA, the Leiden Manifesto, CoARA) are shifting assessment away from metrics such as the Journal Impact Factor and the h-index and promoting recognition of a broader set of research outputs. Yet many funders, particularly those in the Global South, resist implementing these new approaches. Likewise, even when funders take innovative action, some reviewers hold on to traditional methods and revert to legacy metrics when decisions become difficult.
  • MetaROR’s promise: By publishing transparent reviews and assessment reports for each contribution, MetaROR surfaces nuanced judgements of rigour, novelty and usefulness that could replace binary proxies.
  • Gap: Funders lack simple, trustworthy ways to interpret narrative information at scale. A structured “PRC Evaluation Summary” layer, co‑designed with funders, could bridge this gap.
  • Main actors and audience: This project focuses on two distinct groups: funders and evaluators, the latter represented by the leadership of each funder’s evaluator panels, a group that helps shape evaluation strategies. Recommendations from the project will also be directed to these two groups.
Objectives and research questions

  • Map perceptions: How do funders currently view reviewed preprints and PRC artefacts? Which incentives or constraints shape adoption?
  • Diagnose resistance: Why do some expert reviewers continue to consult traditional bibliometrics despite new policies? (+funders)
  • Co‑design solutions: Which metadata fields, visual cues, and narrative elements help evaluators make confident, fair judgements under time pressure?
  • Pilot & evaluate: Does the prototype “PRC Evaluation Summary” improve funder decision‑making efficiency, transparency, and satisfaction?
Work packages, activities, and deliverables

  1. Landscape exploration (funders)
     Activities: Discussion with the MetaROR steering group
     Deliverables: Survey design based on the findings
  2. Funder survey
     Activities: Online survey of organisations in the RoRI network
     Deliverables: Report on funder perceptions and readiness
  3. In‑depth interviews
     Activities: Semi‑structured interviews with programme officers selected from the survey
     Deliverables: Thematic analysis identifying pain points and emerging good practice
  4. Reviewer survey (conditional)
     Activities: If respondents cite reviewer resistance, deploy a survey to reviewers (via funders)
     Deliverables: Dataset on reviewer attitudes
  5. Synthesis & dissemination
     Activities: Analyse results from the survey(s) and interviews, relating them to the relevant literature
     Deliverables: Policy brief; working paper (preprint)
  6. Design sprints with the MetaROR team
     Activities: Co‑create metadata schema, UX wireframes, and dashboard output
     Deliverables: Alpha prototypes; documentation
  7. Pilot implementation
     Activities: Integrate prototypes into MetaROR
     Deliverables: An evaluation component integrated into MetaROR

The MetaROR platform launched in November 2024; this project, which builds on the platform, runs until April 2026.

Work packages and months:

  1. Landscape exploration (funders): April – June 2025
  2. Funder survey: July – August 2025
  3. In‑depth interviews: September – October 2025
  4. Reviewer survey (conditional): November – December 2025
  5. Synthesis & dissemination: January – April 2026
  6. Design sprints with the MetaROR team: TBD
  7. Pilot implementation: TBD

The MetaROR project is at an early stage, as the MetaROR platform needed to be operational for some time before this research could take place. We anticipate project outputs to include:

  • Evidence on funder and reviewer behaviour around PRC outputs
  • An interoperable “PRC Evaluation Summary” specification ready for MetaROR deployment
  • Policy guidance enabling funders to cite PRC outputs in calls and panel briefs
  • A prototype dashboard visualising PRC signals, such as traffic‑light badges and narrative highlights (linked to the “Pilot & evaluate” objective)
  • A contribution to a cultural shift away from journal‑level metrics toward more equitable funding decisions
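To make the “PRC Evaluation Summary” artefact more concrete, the following is a minimal, purely illustrative sketch of how a machine-readable summary record and its human-readable traffic-light rendering might look. Every field name, value, and identifier below is a hypothetical assumption for illustration, not part of any actual MetaROR specification; the real metadata schema is exactly what the design sprints are meant to co-create.

```python
import json

# Hypothetical sketch of a machine-readable "PRC Evaluation Summary".
# All field names and values are illustrative assumptions,
# not an actual MetaROR schema.
prc_summary = {
    "platform": "MetaROR",
    "article_doi": "10.1234/example.12345",  # placeholder DOI
    "review_count": 3,
    "editorial_assessment": {                # qualitative judgements, not scores
        "rigour": "strong",
        "novelty": "moderate",
        "usefulness": "strong",
    },
    "narrative_highlights": [
        "Transparent reporting of methods",
        "Generalisability beyond the sampled funders is limited",
    ],
}

def render_badges(assessment: dict) -> str:
    """Map qualitative judgements onto a compact traffic-light string."""
    colours = {"strong": "green", "moderate": "amber", "weak": "red"}
    return ", ".join(f"{dim}: {colours[level]}" for dim, level in assessment.items())

# Serialise for machine consumption; render badges for human evaluators.
record = json.dumps(prc_summary, indent=2)
print(render_badges(prc_summary["editorial_assessment"]))
# prints "rigour: green, novelty: amber, usefulness: green"
```

A structured record like this could be consumed by funders’ grant-management systems, while the badge rendering gives panel members the at-a-glance view described above.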

Additional information

To build MetaROR into a community-driven collaboration that reflects the rich and growing diversity of metaresearch, we hope to further expand its project and editorial team. We invite anyone interested in contributing to MetaROR’s development and implementation to reach out to us.