Abstract

Training deep neural networks requires well-annotated datasets. However, real-world datasets are often noisy, especially in the multi-label scenario, i.e., where each data point can be assigned to more than one class. To address this, we propose a regularization method for learning multi-label classification networks from noisy data. The regularization is based on the assumption that semantically close classes are more likely to appear together in a given image. Accordingly, we encode label correlations from prior knowledge and use them to regularize noisy network predictions. To evaluate its effectiveness, we perform experiments on a multi-label aerial image dataset contaminated with controlled levels of label noise. Results indicate that networks trained with the proposed method outperform those learned directly from noisy labels, and that the benefit grows with the amount of noise present.
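
The abstract does not specify the exact form of the regularizer. The following is a minimal sketch of the general idea, assuming a fixed class co-occurrence prior matrix `prior_corr` and a quadratic penalty on co-activating class pairs that the prior deems unlikely; both the matrix name and the penalty form are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch (assumed formulation) of a label-correlation regularizer
# for multi-label classification with noisy labels.
import torch
import torch.nn as nn


class LabelCorrelationRegularizer(nn.Module):
    def __init__(self, prior_corr: torch.Tensor, weight: float = 0.1):
        super().__init__()
        # prior_corr[i, j] in [0, 1]: prior likelihood that classes i and j
        # co-occur in an image (derived from domain knowledge; assumption).
        self.register_buffer("prior_corr", prior_corr)
        self.weight = weight

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        probs = torch.sigmoid(logits)  # (batch, num_classes)
        # Penalize joint activation of class pairs with low prior correlation:
        # sum_ij p_i * p_j * (1 - C_ij), averaged over the batch.
        penalty = (probs @ (1.0 - self.prior_corr)) * probs
        return self.weight * penalty.sum(dim=1).mean()


# Usage sketch: combine with the usual multi-label loss on noisy targets.
# total_loss = nn.functional.binary_cross_entropy_with_logits(
#     logits, noisy_targets) + regularizer(logits)
```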
