research article

M3BAT: Unsupervised Domain Adaptation for Multimodal Mobile Sensing with Multi-Branch Adversarial Training

Meegahapola, Lakmal • Hassoune, Hamza • Gatica-Perez, Daniel
May 1, 2024
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)

Over the years, multimodal mobile sensing has been used extensively for inferences regarding health and well-being, behavior, and context. However, a significant challenge hindering the widespread deployment of such models in real-world scenarios is the issue of distribution shift: the phenomenon where the distribution of data in the training set differs from the distribution of data in the real world, i.e., the deployment environment. While extensively explored in computer vision and natural language processing, and while prior research in mobile sensing briefly addresses this concern, current work primarily focuses on models dealing with a single modality of data, such as audio or accelerometer readings; consequently, there is little research on unsupervised domain adaptation for multimodal sensor data. To address this gap, we conducted extensive experiments with domain-adversarial neural networks (DANN), showing that they can effectively handle distribution shifts in multimodal sensor data. Moreover, we propose a novel improvement over DANN, called M3BAT (unsupervised domain adaptation for multimodal mobile sensing with multi-branch adversarial training), to account for the multimodality of sensor data during domain adaptation with multiple branches. Through extensive experiments conducted on two multimodal mobile sensing datasets, three inference tasks, and 14 source-target domain pairs, including both regression and classification, we demonstrate that our approach performs effectively on unseen domains. Compared to directly deploying a model trained in the source domain to the target domain, our approach shows performance improvements of up to 12% AUC (area under the receiver operating characteristic curve) on classification tasks, and up to 0.13 MAE (mean absolute error) on regression tasks.
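
The abstract describes a domain-adversarial setup in which each sensing modality is handled by its own branch with its own adversarial domain discriminator. The following is a minimal sketch of that idea in PyTorch, assuming a simple two-modality input and concatenation-based fusion; the layer sizes and the names MultiBranchDANN and grad_reverse are illustrative assumptions, not the authors' published M3BAT architecture.

    # Minimal sketch of multi-branch domain-adversarial training for multimodal
    # sensor data, in the spirit of DANN. Architecture details (layer sizes,
    # number of branches, fusion, loss weighting) are illustrative assumptions,
    # not the exact M3BAT configuration.
    import torch
    import torch.nn as nn

    class GradReverse(torch.autograd.Function):
        """Identity on the forward pass; reverses (and scales) gradients on backward."""
        @staticmethod
        def forward(ctx, x, lambd):
            ctx.lambd = lambd
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lambd * grad_output, None

    def grad_reverse(x, lambd=1.0):
        return GradReverse.apply(x, lambd)

    class MultiBranchDANN(nn.Module):
        def __init__(self, modality_dims, hidden=64, num_classes=2):
            super().__init__()
            # One feature extractor per sensing modality (e.g., accelerometer, app usage).
            self.extractors = nn.ModuleList(
                nn.Sequential(nn.Linear(d, hidden), nn.ReLU()) for d in modality_dims
            )
            # One adversarial domain discriminator per branch (source vs. target domain).
            self.domain_heads = nn.ModuleList(
                nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 2))
                for _ in modality_dims
            )
            # Task head operates on the fused (concatenated) representation.
            self.task_head = nn.Linear(hidden * len(modality_dims), num_classes)

        def forward(self, modality_inputs, lambd=1.0):
            feats = [ext(x) for ext, x in zip(self.extractors, modality_inputs)]
            task_logits = self.task_head(torch.cat(feats, dim=1))
            # Gradient reversal pushes each branch toward domain-invariant features.
            domain_logits = [head(grad_reverse(f, lambd))
                             for head, f in zip(self.domain_heads, feats)]
            return task_logits, domain_logits

    if __name__ == "__main__":
        model = MultiBranchDANN(modality_dims=[12, 8])  # two toy modalities
        x = [torch.randn(4, 12), torch.randn(4, 8)]
        task_logits, domain_logits = model(x)
        print(task_logits.shape, [d.shape for d in domain_logits])

In training, the task loss would be computed on labeled source-domain data while each per-branch domain loss is computed on both domains; the reversed gradients encourage modality-specific features that the discriminators cannot tell apart across domains.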

Details
Type: research article
DOI: 10.1145/3659591
Web of Science ID: WOS:001229316000005
Author(s): Meegahapola, Lakmal • Hassoune, Hamza • Gatica-Perez, Daniel
Date Issued: 2024-05-01
Publisher: Association for Computing Machinery
Published in: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)
Volume: 8
Issue: 2
Start page: 46
Subjects: Technology • Mobile and Wearable Sensing • Multimodal Sensing • Domain Adaptation • Distribution Shift • Generalization • Transfer Learning • Mood • Social Context • Energy Expenditure Estimation
Editorial or Peer reviewed: REVIEWED
Written at: EPFL
EPFL units: LIDIAP
Funder: European Union
Grant Number: 823783
Available on Infoscience: June 19, 2024
Use this identifier to reference this record: https://infoscience.epfl.ch/handle/20.500.14299/208622