Infoscience
conference paper

Towards Self-Supervised Covariance Estimation in Deep Heteroscedastic Regression

Shukla, Megh • Shameem, Aziz • Salzmann, Mathieu • Alahi, Alexandre
April 24, 2025
Proceedings of the Thirteenth International Conference on Learning Representations (ICLR) 2025 [Forthcoming publication]
13th International Conference on Learning Representations (ICLR 2025)

Deep heteroscedastic regression models the mean and covariance of the target distribution through neural networks. The challenge arises from heteroscedasticity, which implies that the covariance is sample dependent and often unknown. Consequently, recent methods learn the covariance through unsupervised frameworks, which unfortunately incur a trade-off between computational complexity and accuracy. While this trade-off could be alleviated through supervision, obtaining labels for the covariance is non-trivial. Here, we study self-supervised covariance estimation in deep heteroscedastic regression. We address two questions: (1) How should we supervise the covariance, assuming ground truth is available? (2) How can we obtain pseudo-labels in the absence of ground truth? We address (1) by analysing two popular measures: the KL divergence and the 2-Wasserstein distance. Subsequently, we derive an upper bound on the 2-Wasserstein distance between normal distributions with non-commutative covariances that is stable to optimize. We address (2) through a simple neighborhood-based heuristic algorithm which yields surprisingly effective pseudo-labels for the covariance. Our experiments over a wide range of synthetic and real datasets demonstrate that the proposed 2-Wasserstein bound, coupled with pseudo-label annotations, results in computationally cheaper yet accurate deep heteroscedastic regression.
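For readers who want the abstract's ingredients made concrete, below is a minimal sketch, not the authors' implementation: all function names are illustrative, and the paper's actual 2-Wasserstein upper bound is derived in the paper itself, not reproduced here. The sketch shows (a) the Gaussian negative log-likelihood that deep heteroscedastic regression minimizes when a network predicts both mean and covariance, and (b) the exact closed-form squared 2-Wasserstein distance between two multivariate normals; the matrix square roots in (b) are what make the exact distance unstable to optimize, which motivates a bound.

```python
# Minimal sketch (assumed PyTorch implementation; not the authors' code).
import math
import torch

def gaussian_nll(y, mean, cov):
    """Negative log-likelihood of y under N(mean, cov).

    y, mean: (batch, d); cov: (batch, d, d), symmetric positive definite.
    """
    d = y.shape[-1]
    diff = (y - mean).unsqueeze(-1)                       # (batch, d, 1)
    # Mahalanobis term (y - m)^T cov^{-1} (y - m), via a linear solve.
    maha = (diff.transpose(-1, -2)
            @ torch.linalg.solve(cov, diff)).squeeze(-1).squeeze(-1)
    return 0.5 * (maha + torch.logdet(cov) + d * math.log(2 * math.pi))

def _sqrtm(S):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    vals, vecs = torch.linalg.eigh(S)
    return vecs @ torch.diag_embed(vals.clamp_min(0).sqrt()) @ vecs.transpose(-1, -2)

def w2_squared(m1, S1, m2, S2):
    """Exact squared 2-Wasserstein distance between N(m1, S1) and N(m2, S2):
    ||m1 - m2||^2 + Tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})."""
    S2h = _sqrtm(S2)
    cross = _sqrtm(S2h @ S1 @ S2h)
    cov_term = torch.diagonal(S1 + S2 - 2 * cross, dim1=-2, dim2=-1).sum(-1)
    return ((m1 - m2) ** 2).sum(-1) + cov_term
```

One plausible instance of a neighborhood-based pseudo-labeling heuristic, again an assumption rather than the paper's exact algorithm: use the empirical covariance of the targets of each sample's k nearest neighbors in input space as a covariance pseudo-label.

```python
def covariance_pseudo_labels(X, Y, k=20):
    """X: (n, p) inputs, Y: (n, d) targets; returns (n, d, d) pseudo-labels."""
    # k nearest neighbors in input space (the trivial self-match is included).
    idx = torch.cdist(X, X).topk(k, largest=False).indices   # (n, k)
    neigh = Y[idx]                                           # (n, k, d)
    centered = neigh - neigh.mean(dim=1, keepdim=True)
    return centered.transpose(1, 2) @ centered / (k - 1)     # empirical covariances
```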

Details
Type
conference paper
Author(s)
Shukla, Megh (EPFL) • Shameem, Aziz • Salzmann, Mathieu (EPFL) • Alahi, Alexandre (EPFL)

Date Issued
2025-04-24
Publisher
ICLR
Published in
Proceedings of the Thirteenth International Conference on Learning Representations (ICLR) 2025 [Forthcoming publication]
Subjects
deep regression • heteroscedastic • uncertainty • 2-Wasserstein • KL-Divergence • Negative Log-Likelihood

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
VITA
Event name: 13th International Conference on Learning Representations (ICLR 2025)
Event acronym: ICLR 2025
Event place: Singapore
Event date: 2025-04-24 - 2025-04-28

Funder: Swiss National Science Foundation
Funding(s): Narratives from the Long Tail: Transforming Access to Audiovisual Archives
Grant Number: CRSII5 198632
Grant URL: https://www.futurecinema.live/project/
Available on Infoscience
February 19, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/247071