Infoscience
conference paper not in proceedings

A cross-sensor approach for marine litter detection with self-supervised learning

Dalsasso, Emanuele • Russwurm, Marc • Donner, Christian • de Vries, Robin • Volpi, Michele • Tuia, Devis
March 18, 2025
EGU25, the 27th EGU General Assembly

Marine litter is a growing ecological, economic, and societal concern that must be addressed at a global scale. Floating material aggregates under the effect of oceanic processes to form so-called windrows, which are used as proxies for marine litter. Windrows reach sizes that make them visible to high-resolution optical satellites. Recently, the availability of labeled datasets of Sentinel-2 images (MARIDA, FloatingObjects) has enabled the use of deep learning for large-scale marine litter monitoring: a segmentation model can be trained in a supervised manner to predict the presence of floating objects. However, the temporal resolution of Sentinel-2 (up to 6 days between consecutive acquisitions) limits the operational impact of such tools. In this context, PlanetScope images can fill the temporal gaps of Sentinel-2: they are acquired daily and have a higher spatial resolution than Sentinel-2 (3 m vs. 10 m). Nevertheless, labeled PlanetScope images for the specific purpose of marine debris detection are lacking.

To address this gap, we propose a cross-sensor training strategy that allows a model to transfer knowledge from Sentinel-2 to PlanetScope without extra supervision. In particular, we leverage self-supervised learning to pre-train a model that learns a common latent space between the two sensors. Sensor-specific embedding layers project their features into a common U-Net model, itself trained to remove noise from the input images as a self-supervised learning task. Thanks to this self-supervised task, the model learns the semantics of the data without requiring any labels. Next, the model is fine-tuned on labeled Sentinel-2 images, as in most recent deep learning solutions.

Since self-supervised cross-sensor pre-training has forced the model to learn a common representation of the two satellite sources, the model co-learns to segment PlanetScope data while learning to identify marine litter on Sentinel-2 images. Thus, at prediction time, the model can be directly applied to PlanetScope images with excellent results.

We evaluate the performance of the developed model on a manually annotated validation set of PlanetScope images: both visual inspection and quantitative assessment highlight the significant improvement of the proposed model over a fully supervised model trained on Sentinel-2 only. This demonstrates the effectiveness of the proposed pre-training strategy as a promising solution to enable continuous large-scale mapping of marine litter with optical satellites.
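The core idea above — sensor-specific embedding layers projecting images with different band counts into one shared latent space, with a shared model pre-trained on a denoising task — can be sketched as a toy in numpy. This is an illustrative sketch, not the authors' implementation: the band counts (12 for Sentinel-2, 4 for PlanetScope), the latent width, and the per-pixel linear layers standing in for the actual U-Net are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT = 16  # width of the shared latent space (illustrative choice)

# Sensor-specific embedding layers, modeled as 1x1 convolutions, i.e.
# per-pixel linear projections over the channel axis.
W_s2 = rng.normal(size=(12, LATENT)) * 0.1   # Sentinel-2: 12 bands (assumed)
W_ps = rng.normal(size=(4, LATENT)) * 0.1    # PlanetScope: 4 bands (assumed)

W_shared = rng.normal(size=(LATENT, LATENT)) * 0.1  # stand-in for the shared U-Net
W_s2_head = rng.normal(size=(LATENT, 12)) * 0.1     # head mapping back to S2 bands

def embed(image, W):
    """Project an (H, W, C) image into the shared latent space: (H, W, LATENT)."""
    return image @ W

def shared_model(z):
    """Stand-in for the common U-Net operating on the shared latent space."""
    return z @ W_shared

s2_patch = rng.normal(size=(32, 32, 12))  # synthetic Sentinel-2 patch
ps_patch = rng.normal(size=(32, 32, 4))   # synthetic PlanetScope patch

z_s2 = embed(s2_patch, W_s2)
z_ps = embed(ps_patch, W_ps)
# Both sensors, despite different band counts, reach the same latent shape,
# so a single downstream model can process either one.
assert z_s2.shape == z_ps.shape == (32, 32, LATENT)

# Self-supervised denoising objective: corrupt the input and reconstruct the
# clean image — no human labels required.
noisy = s2_patch + rng.normal(scale=0.5, size=s2_patch.shape)
recon = shared_model(embed(noisy, W_s2)) @ W_s2_head
loss = float(np.mean((recon - s2_patch) ** 2))
print(recon.shape, np.isfinite(loss))
```

After such pre-training, the shared weights would be fine-tuned for segmentation on labeled Sentinel-2 only; at inference, swapping in the PlanetScope embedding layer lets the same model segment PlanetScope imagery.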

  • Details
  • Metrics
Type
conference paper not in proceedings
DOI
10.5194/egusphere-egu25-8279
Author(s)
Dalsasso, Emanuele (EPFL)
Russwurm, Marc
Donner, Christian
de Vries, Robin
Volpi, Michele (EPFL)
Tuia, Devis (EPFL)

Date Issued

2025-03-18

URL

https://meetingorganizer.copernicus.org/EGU25 (EGU General Assembly 2025 website)
Written at

EPFL

EPFL units
ECEO  
Event name: EGU25, the 27th EGU General Assembly
Event acronym: EGU25
Event place: Vienna, Austria
Event date: 2025-04-27 to 2025-05-02

Available on Infoscience
March 21, 2025
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/248148

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.