Title: SOoD: Self-Supervised Out-of-Distribution Detection Under Domain Shift for Multi-Class Colorectal Cancer Tissue Types
Authors: Bozorgtabar, Behzad; Vray, Guillaume; Mahapatra, Dwarikanath; Thiran, Jean-Philippe
Dates: 2022-02-28 (record); 2021-01-01 (publication)
DOI: 10.1109/ICCVW54120.2021.00371
Handle: https://infoscience.epfl.ch/handle/20.500.14299/185792
Web of Science ID: WOS:000739651103047
Type: conference paper (conference proceedings)
Subjects: Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications; Imaging Science & Photographic Technology; Computer Science

Abstract: The goal of out-of-distribution (OoD) detection is to identify unseen categories of inputs that differ from those seen during training, an important requirement for the safe deployment of deep neural networks in computational pathology. In addition, OoD detection in clinical applications must cope with shifts in the image data distribution. This paper argues that practical OoD detection should handle both semantic shift and data distribution shift simultaneously. We propose a new self-supervised OoD detector for colorectal cancer tissue types based on a clustering scheme. The core of our approach is multi-view consistency learning with a supplementary view based on style augmentation to mitigate domain shift. The learned representation is then adapted by minimizing the predictive entropy of images on the target data domain, segregating in-distribution examples from OoD examples. We evaluated our method on two public colorectal tissue-type datasets. Our method achieved state-of-the-art OoD detection performance over various self-supervised baselines. The code, data, and models are available at https://github.com/BehzadBozorgtabar/SOoD.
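
The abstract describes separating in-distribution from OoD samples on the target domain via predictive entropy. A minimal illustrative sketch of entropy-based OoD scoring — not the authors' released implementation; the function names and the threshold value are hypothetical:

```python
import numpy as np

def predictive_entropy(probs):
    """Shannon entropy of a softmax distribution; higher means more uncertain."""
    eps = 1e-12  # avoid log(0)
    return -np.sum(probs * np.log(probs + eps), axis=-1)

def is_ood(probs, threshold):
    """Flag a sample as OoD when its predictive entropy exceeds the threshold."""
    return predictive_entropy(probs) > threshold

# A confident in-distribution prediction vs. a near-uniform (suspected OoD) one
p_id  = np.array([0.96, 0.01, 0.01, 0.01, 0.01])
p_ood = np.array([0.22, 0.20, 0.19, 0.20, 0.19])
```

In this sketch, the in-distribution prediction has low entropy and the near-uniform one has entropy close to log(5), so a fixed threshold between the two separates them; in practice the threshold would be chosen on held-out data.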