Infoscience

EPFL, École polytechnique fédérale de Lausanne

 
Conference paper

Large Scale Variational Bayesian Inference for Structured Scale Mixture Models

Ko, Young Jun • Seeger, Matthias

2012

Proceedings of the 29th International Conference on Machine Learning (ICML 2012)

Natural image statistics exhibit hierarchical dependencies across multiple scales. Representing such prior knowledge in non-factorial latent tree models can boost performance of image denoising, inpainting, deconvolution or reconstruction substantially, beyond standard factorial "sparse" methodology. We derive a large scale approximate Bayesian inference algorithm for linear models with non-factorial (latent tree-structured) scale mixture priors. Experimental results on a range of denoising and inpainting problems demonstrate substantially improved performance compared to MAP estimation or to inference with factorial priors.
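The model class in the abstract can be illustrated with a much simpler factorial special case. The sketch below is my own illustration, not the paper's algorithm: it denoises y = u + noise under a factorial Laplace prior, using the fact that the Laplace distribution is a Gaussian scale mixture, so EM over the latent per-coefficient scales reduces to iteratively reweighted ridge updates. The names `tau`, `noise_var`, and the IRLS scheme are illustrative assumptions; the paper's contribution is replacing such factorial priors with latent tree-structured ones and scaling variational inference to that setting.

```python
import numpy as np

def irls_denoise(y, tau=1.0, noise_var=0.1, iters=50, eps=1e-6):
    """MAP denoising under a factorial Laplace prior, viewed as a
    Gaussian scale mixture: EM over the latent scales gives
    iteratively reweighted per-coordinate ridge updates."""
    u = y.astype(float).copy()
    for _ in range(iters):
        # E-step: expected inverse scale for each coefficient, tau / |u_i|
        w = tau / np.maximum(np.abs(u), eps)
        # M-step: closed-form ridge update for the identity "design matrix"
        u = y / (1.0 + noise_var * w)
    return u

# Toy usage: a sparse signal corrupted by Gaussian noise.
rng = np.random.default_rng(0)
signal = np.zeros(20)
signal[[3, 8, 15]] = [2.0, -3.0, 1.5]
y = signal + 0.3 * rng.standard_normal(20)
u_hat = irls_denoise(y)
```

At the fixed point this behaves like soft-thresholding: large coefficients shrink by roughly `noise_var * tau`, while sub-threshold coefficients collapse toward zero, which is the sparsifying effect the factorial baseline in the paper relies on.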

File

Name: icml12_struct_sparse.pdf
Access type: Open access
Size: 749.99 KB
Format: Adobe PDF
Checksum (MD5): 5585221c0f09c6e2c042cd6b77e49d45
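A downloaded copy can be checked against the MD5 checksum listed above. A minimal sketch using the standard library (the filename is taken from the listing; chunked reading is an implementation choice, not anything the repository prescribes):

```python
import hashlib

def md5_of(path, chunk=8192):
    """Compute the MD5 hex digest of a file, reading in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

# Compare against the listed checksum, e.g.:
# md5_of("icml12_struct_sparse.pdf") == "5585221c0f09c6e2c042cd6b77e49d45"
```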


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.