YouTube

Inconsistency in Conference Peer Review: Revisiting the 2014 NeurIPS Experiment

Yannic Kilcher via YouTube

Overview

Explore a detailed analysis of the 2014 NeurIPS peer review experiment in this video. Delve into the subjective nature of conference paper reviews, examining how consistently independent committees rate the same papers and what happens to rejected ones. Learn about the experiment's findings, including the weak correlation between review scores and later citation counts for accepted papers, and the implications for assessing researcher quality. Gain insights into potential improvements to the reviewing process and the broader context of peer review at machine learning conferences.

Syllabus

- Intro & Overview
- Recap: The 2014 NeurIPS Experiment
- How much of reviewing is subjective?
- Validation via simulation
- Can reviewers predict future impact?
- Discussion & Comments

Taught by

Yannic Kilcher
