Infoscience

research article

Eigendecomposition-Free Training of Deep Networks for Linear Least-Square Problems

Dang, Zheng • Yi, Kwang Moo • Hu, Yinlin • Wang, Fei • Fua, Pascal • Salzmann, Mathieu
September 1, 2021
IEEE Transactions on Pattern Analysis and Machine Intelligence

Many classical Computer Vision problems, such as essential matrix computation and pose estimation from 3D to 2D correspondences, can be tackled by solving a linear least-square problem, which can be done by finding the eigenvector corresponding to the smallest, or zero, eigenvalue of a matrix representing a linear system. Incorporating this in deep learning frameworks would allow us to explicitly encode known notions of geometry, instead of having the network implicitly learn them from data. However, performing eigendecomposition within a network requires the ability to differentiate this operation. While theoretically doable, this introduces numerical instability in the optimization process in practice. In this paper, we introduce an eigendecomposition-free approach to training a deep network whose loss depends on the eigenvector corresponding to a zero eigenvalue of a matrix predicted by the network. We demonstrate that our approach is much more robust than explicit differentiation of the eigendecomposition using two general tasks, outlier rejection and denoising, with several practical examples including wide-baseline stereo, the perspective-n-point problem, and ellipse fitting. Empirically, our method has better convergence properties and yields state-of-the-art results.
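The two ideas in the abstract can be sketched numerically. The snippet below is a minimal NumPy illustration, not the paper's exact training objective: it shows (a) the classical route, recovering the solution of min ||Ax|| subject to ||x|| = 1 as the eigenvector of M = AᵀA with the smallest eigenvalue, and (b) an eigendecomposition-free surrogate, ||Me||², which is zero exactly when e spans the null space of M and is differentiable in the entries of M without ever calling an eigensolver. The names (`e_true`, `eig_free_loss`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth solution of the homogeneous least-square problem:
# a unit vector that the rows of A should be orthogonal to.
e_true = rng.normal(size=5)
e_true /= np.linalg.norm(e_true)

# Build a measurement matrix A with e_true in its null space by
# projecting random rows onto the orthogonal complement of e_true.
B = rng.normal(size=(100, 5))
A = B - np.outer(B @ e_true, e_true)

# (a) Classical route: the minimizer of ||A x|| with ||x|| = 1 is the
# eigenvector of M = A^T A for the smallest (here zero) eigenvalue.
M = A.T @ A
eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
e_eig = eigvecs[:, 0]

# (b) Eigendecomposition-free surrogate: if e is the zero-eigenvalue
# eigenvector of M, then M e = 0, so ||M e||^2 can be driven to zero
# as a training loss without differentiating through eig().
def eig_free_loss(M, e):
    r = M @ e
    return float(r @ r)

print(eig_free_loss(M, e_true))    # ~0: e_true lies in the null space of M
print(abs(float(e_eig @ e_true)))  # ~1: eigensolver recovers e_true up to sign
```

In a network, M would be predicted from data and the loss back-propagated through the matrix product, avoiding the numerical instability of differentiating the eigendecomposition itself.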

Type
research article
DOI
10.1109/TPAMI.2020.2978812
Web of Science ID

WOS:000681124300024

Author(s)
Dang, Zheng  
Yi, Kwang Moo  
Hu, Yinlin  
Wang, Fei
Fua, Pascal  
Salzmann, Mathieu  
Date Issued

2021-09-01

Publisher

IEEE Computer Society

Published in
IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume

43

Issue

9

Start page

3167

End page

3182

Subjects

  • Computer Science, Artificial Intelligence
  • Engineering, Electrical & Electronic
  • Computer Science
  • Engineering
  • eigenvalues and eigenfunctions
  • three-dimensional displays
  • machine learning
  • optimization
  • computer vision
  • task analysis
  • training
  • end-to-end learning
  • eigendecomposition
  • singular value decomposition
  • geometric vision

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
CVLAB  
Available on Infoscience
August 28, 2021
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/180953
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.