research article

Visual Distortions in 360-degree Videos

De Albuquerque Azevedo, Roberto Gerson • Birkbeck, Neil • De Simone, Francesca • Janatra, Ivan • Adsumilli, Balu • Frossard, Pascal
2020
IEEE Transactions on Circuits and Systems for Video Technology

Omnidirectional (or 360-degree) images and videos are emerging signals used in many areas, such as robotics and virtual/augmented reality. In particular, for virtual reality applications, they allow an immersive experience in which the user, wearing a head-mounted display, can interactively navigate through a scene with three degrees of freedom. Current approaches for capturing, processing, delivering, and displaying 360-degree content, however, present many open technical challenges and introduce several types of distortions in the visual signal. Some of these distortions are specific to the nature of 360-degree images and often differ from those encountered in classical visual communication frameworks. This paper provides a first comprehensive review of the most common visual distortions that alter 360-degree signals as they pass through the different processing elements of the visual communication pipeline. While their impact on viewers' visual perception and on the immersive experience at large is still unknown, and thus an open research topic, this review proposes a taxonomy of the visual distortions that can be encountered in 360-degree signals and identifies their underlying causes in the end-to-end 360-degree content distribution pipeline. Such a taxonomy is essential as a basis for comparing different processing techniques, such as visual enhancement, encoding, and streaming strategies, and for the effective design of new algorithms and applications. It is also a useful resource for the design of psycho-visual studies aiming to characterize human perception of 360-degree content in interactive and immersive applications.
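The abstract does not tie the distortions to any particular representation, but one well-known source of projection-specific geometric distortion in 360-degree pipelines is the sphere-to-plane mapping used to store the frames. As a purely illustrative sketch (not taken from the paper; the function name and frame height are arbitrary), the Python snippet below computes the per-row horizontal oversampling factor of an equirectangular frame, which grows as 1/cos(latitude) and is largest near the poles.

```python
import numpy as np

def equirectangular_row_stretch(height: int) -> np.ndarray:
    """Horizontal oversampling factor for each row of an equirectangular frame.

    Every image row spans 360 degrees of longitude, but the corresponding
    circle of latitude shrinks towards the poles, so the per-row stretch
    factor is 1 / cos(latitude): about 1.0 at the equator and very large
    near the poles. This illustrates one kind of projection-specific
    geometric distortion; it is an example, not the paper's taxonomy.
    """
    # Latitude of each row centre, from +pi/2 (top row) to -pi/2 (bottom row).
    latitudes = (0.5 - (np.arange(height) + 0.5) / height) * np.pi
    return 1.0 / np.maximum(np.cos(latitudes), 1e-6)

if __name__ == "__main__":
    stretch = equirectangular_row_stretch(1080)
    print(f"equator row stretch:   {stretch[540]:.2f}")  # ~1.0
    print(f"near-pole row stretch: {stretch[5]:.1f}")    # roughly 60x
```

For a 1080-row frame this gives a stretch factor close to 1 at the equator and above 60 a few rows from the pole, which is why pole regions are heavily oversampled in storage yet can still look distorted after rendering.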

Details
Type
research article
DOI
10.1109/TCSVT.2019.2927344
Author(s)
De Albuquerque Azevedo, Roberto Gerson  
Birkbeck, Neil
De Simone, Francesca
Janatra, Ivan
Adsumilli, Balu
Frossard, Pascal
Date Issued
2020
Published in
IEEE Transactions on Circuits and Systems for Video Technology
Volume
30
Issue
8
Start page
2524
End page
2537
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
LTS4
Available on Infoscience
August 7, 2019
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/159566