Peer Review Market Design: Effort-Based Matching and Admission Control
Speaker: Raghav Singal (Dartmouth)
Date: 5/6/25
Abstract: Peer review is fundamental to academic research, yet its effectiveness hinges on reviewers voluntarily exerting costly effort to evaluate submissions. Current mechanisms used by conference organizers, such as random or affinity-based matching, fail to adequately incentivize effort, leading to suboptimal outcomes. This study investigates how peer review markets can be designed to maximize the acceptance of high-quality submissions while minimizing the acceptance of low-quality ones. We model a one-sided market in which each agent serves as both an author and a reviewer: agents submit papers of privately known quality and decide whether to exert effort when reviewing. The designer (e.g., a conference organizer) employs mechanisms to match papers to reviewers, with the goal of maximizing market welfare. We first analyze two existing mechanisms: Random, which induces no effort, and Symmetric, which incentivizes effort but can be exploited by low-quality authors. To address these shortcomings, we propose a novel Admission Control (AC) mechanism, which penalizes reviewers who fail to exert effort by rejecting their own submissions. Under perfectly observable effort, AC achieves first-best welfare, dominating the other mechanisms. When effort is observed only noisily, AC leverages stochastic penalties to preserve incentive compatibility, and we establish theoretically that it guarantees at least half of the optimal welfare, with simulations demonstrating significantly stronger performance. Robustness analyses confirm the superiority of AC across several extensions, including noisy review outcomes, author biases, and status disclosure through preprints. Our findings underscore the importance of linking reviewer effort to submission acceptance in peer review markets. The AC mechanism provides a practical framework for doing so, suggesting that conference organizers should enforce effort-based participation criteria.
While noisy observation of effort poses challenges, our results show that carefully calibrated penalties can mitigate these effects. Additionally, investing in technologies that improve effort detection significantly enhances outcomes. These insights offer actionable strategies for improving the integrity and efficiency of peer review in academic conferences. This is joint work with Craig Fernandes and James Siderius.
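The core incentive logic the abstract describes can be sketched in a few lines. This is a hypothetical toy model, not the paper's actual formulation: the benefit of acceptance, the effort cost, and the detection probability below are illustrative parameters I introduce, and the functions simply compare a reviewer's expected penalty against the cost of effort.

```python
# Toy sketch of the effort incentive under an Admission-Control-style
# mechanism, as described in the abstract. All parameters (benefit, cost,
# detect_prob) are hypothetical placeholders, not values from the paper.

def exerts_effort_under_ac(benefit: float, cost: float, detect_prob: float) -> bool:
    """Under AC, a reviewer who shirks is caught with probability
    `detect_prob` (noisy observation of effort) and then has their own
    submission rejected, forfeiting `benefit`. Exerting effort is a best
    response whenever the expected penalty exceeds the effort cost."""
    return detect_prob * benefit > cost

def exerts_effort_under_random(cost: float) -> bool:
    """Under Random matching, acceptance of one's own paper is decoupled
    from one's reviewing, so any positive effort cost makes shirking the
    best response -- matching the abstract's claim that Random induces
    no effort."""
    return cost <= 0

# Illustrative comparison with made-up numbers:
# acceptance is worth 1.0, effort costs 0.3, shirking is caught half the time.
ac_effort = exerts_effort_under_ac(benefit=1.0, cost=0.3, detect_prob=0.5)
random_effort = exerts_effort_under_random(cost=0.3)
```

With these numbers, AC sustains effort (expected penalty 0.5 > cost 0.3) while Random does not; lowering `detect_prob` below 0.3 breaks the AC incentive, which mirrors the abstract's point that better effort-detection technology improves outcomes.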