Fairness and Bias in Online Selection
Fairness in Online Selection Problems

Two of the most studied models in online decision making are the secretary problem and the prophet inequality problem. Both capture the challenge of making irrevocable choices under uncertainty. But what happens when candidates come from different groups and fairness enters the picture? In "Fairness and bias in online selection," José Correa, Andrés Cristi, Paul Dütting, and Ashkan Norouzi-Fard introduce and analyze multicolor variants of these problems. In these models, each candidate belongs to a "color," and comparisons are meaningful only within the same color. This captures real-world situations where cross-group rankings are unreliable or biased—for instance, when evaluating students from different schools or job applicants from diverse backgrounds. For the multicolor secretary problem, the authors characterize the optimal online algorithm. In contrast to the offline optimum—which always selects from the most promising group—the optimal online algorithm is inherently fairer. For the multicolor prophet inequality, they design algorithms that enforce target selection probabilities across groups, ensuring equitable treatment.
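To make the within-color comparison constraint concrete, here is a minimal illustrative sketch: the classical 1/e secretary rule run independently within each color, accepting the first candidate who beats the benchmark of their own color. This is a simplified baseline for intuition only, not the optimal multicolor algorithm characterized in the paper; the function name and input format are assumptions for the example.

```python
import math

def secretary_per_color(candidates):
    """Baseline sketch: classical 1/e secretary rule applied
    independently within each color. `candidates` is an arrival-ordered
    list of (color, value) pairs; values are compared only within a
    color. Returns the first accepted (color, value), or None.
    NOTE: this is an illustration of the model, not the optimal
    multicolor algorithm from the paper."""
    totals = {}                       # expected arrivals per color
    for color, _ in candidates:
        totals[color] = totals.get(color, 0) + 1
    cutoffs = {c: n / math.e for c, n in totals.items()}

    best_seen = {}                    # best observed value per color
    counts = {}                       # arrivals processed per color
    for color, value in candidates:
        counts[color] = counts.get(color, 0) + 1
        if counts[color] <= cutoffs[color]:
            # Observation phase for this color: record, never accept.
            best_seen[color] = max(best_seen.get(color, -math.inf), value)
        elif value > best_seen.get(color, -math.inf):
            # First candidate to beat their own color's benchmark.
            return (color, value)
    return None

# Example: 'A' has 3 arrivals (cutoff ~1.1), so the second 'A'
# candidate is the first eligible to be accepted.
print(secretary_per_color([('A', 1), ('A', 2), ('B', 5), ('A', 10)]))
```

Note that this per-color baseline simply accepts whichever color first clears its own threshold; it does not by itself guarantee target selection probabilities across groups, which is precisely the property the paper's algorithms are designed to enforce.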
University of Chile
École Polytechnique Fédérale de Lausanne
Google (Switzerland)
Google (Switzerland)
2025-09-24
opre.2021.0662
REVIEWED