Single Image Deraining Using Time-Lapse Data

Cho, Jaehoon • Kim, Seungryong • Min, Dongbo • Sohn, Kwanghoon
January 1, 2020
IEEE Transactions on Image Processing

Leveraging recent advances in deep convolutional neural networks (CNNs), single image deraining has been studied as a learning task, achieving outstanding performance over traditional hand-designed approaches. Current CNN-based deraining approaches adopt a supervised learning framework that relies on massive training data generated with synthetic rain streaks, which limits their generalization to real rainy images. To address this problem, we propose a novel learning framework for single image deraining that leverages time-lapse sequences instead of synthetic image pairs. The deraining networks are trained on time-lapse sequences in which both the camera and the scene are static except for time-varying rain streaks. Specifically, we formulate a background consistency loss such that the deraining networks consistently generate the same derained image from the frames of a time-lapse sequence. We additionally introduce two loss functions: a structure similarity loss that encourages the derained image to be similar to the input rainy image, and a directional gradient loss built on the assumption that the estimated rain streaks are likely to be sparse and have dominant directions. To handle various rain conditions, we employ a dynamic fusion module that effectively fuses multi-scale features. We also build a novel large-scale time-lapse dataset providing real-world rainy images under various rain conditions. Experiments demonstrate that the proposed method outperforms state-of-the-art techniques on synthetic and real rainy images, both qualitatively and quantitatively. For high-level vision tasks under severe rainy conditions, we show that the proposed method can serve as a preprocessing step for subsequent tasks.
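The abstract describes the three training losses only at a high level. The snippet below is a minimal PyTorch-style sketch of how such terms could be combined, assuming the rain layer is taken as the rainy input minus the derained output and that streaks are roughly vertical; the function names, loss weights, and the choice of L1 penalties are illustrative assumptions, not the paper's implementation.

import torch
import torch.nn.functional as F

def background_consistency_loss(derained):
    # derained: (T, C, H, W) outputs for T frames of one static time-lapse scene.
    # A static background implies all derained frames should agree, so deviation
    # from their per-pixel mean is penalized (one simple way to enforce consistency).
    mean_bg = derained.mean(dim=0, keepdim=True)
    return (derained - mean_bg).abs().mean()

def structure_similarity_loss(derained, rainy):
    # The paper uses a structure-similarity term between the derained image and the
    # rainy input; an L1 proxy keeps this sketch self-contained (1 - SSIM is a
    # common alternative).
    return F.l1_loss(derained, rainy)

def directional_gradient_loss(rain_layer):
    # rain_layer: (T, C, H, W), assumed here to be rainy input minus derained output.
    # Sparse, roughly vertical streaks suggest small horizontal gradients and a
    # sparse rain layer overall; the sparsity weight below is an assumption.
    dx = rain_layer[..., 1:] - rain_layer[..., :-1]          # horizontal gradient
    return dx.abs().mean() + 0.1 * rain_layer.abs().mean()   # plus sparsity term

def total_loss(derained, rainy, w_bc=1.0, w_ss=1.0, w_dg=1.0):
    # Weighted sum of the three terms; the weights are placeholders, not the paper's.
    rain_layer = rainy - derained
    return (w_bc * background_consistency_loss(derained)
            + w_ss * structure_similarity_loss(derained, rainy)
            + w_dg * directional_gradient_loss(rain_layer))

In training, derained would be the network outputs for all frames of one time-lapse clip and rainy the corresponding inputs.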

Details
Type
research article
DOI
10.1109/TIP.2020.3000612
Web of Science ID
WOS:000553851400003
Author(s)
Cho, Jaehoon • Kim, Seungryong • Min, Dongbo • Sohn, Kwanghoon
Date Issued
2020-01-01
Published in
IEEE Transactions on Image Processing
Volume
29
Start page
7274
End page
7289
Subjects
  • Computer Science, Artificial Intelligence
  • Engineering, Electrical & Electronic
  • Computer Science
  • Engineering
  • rain
  • training data
  • task analysis
  • convolutional neural networks
  • rendering (computer graphics)
  • training
  • feature extraction
  • single image deraining
  • convolutional neural networks (cnns)
  • time-lapse dataset
  • dynamic fusion module
  • removal

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
IVRL  
Available on Infoscience
August 13, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/170823