Infoscience
 
conference paper

SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data

Shahreza, Hatef Otroshi • George, Anjith • Marcel, Sebastien
January 1, 2023
2023 IEEE International Joint Conference on Biometrics (IJCB)
IEEE International Joint Conference on Biometrics (IJCB)

State-of-the-art face recognition networks are often computationally expensive and cannot be used for mobile applications. Training lightweight face recognition models also requires large identity-labeled datasets. Meanwhile, there are privacy and ethical concerns with collecting and using large face recognition datasets. While generating synthetic datasets for training face recognition models is an alternative option, it is challenging to generate synthetic data with sufficient intra-class variations. In addition, there is still a considerable gap between the performance of models trained on real and synthetic data. In this paper, we propose a new framework (named SynthDistill) to train lightweight face recognition models by distilling the knowledge of a pretrained teacher face recognition model using synthetic data. We use a pretrained face generator network to generate synthetic face images and use the synthesized images to learn a lightweight student network. We use synthetic face images without identity labels, mitigating the problems in the intra-class variation generation of synthetic datasets. Instead, we propose a novel dynamic sampling strategy from the intermediate latent space of the face generator network to include new variations of the challenging images while further exploring new face images in the training batch. The results on five different face recognition datasets demonstrate the superiority of our lightweight model compared to models trained on previous synthetic datasets, achieving a verification accuracy of 99.52% on the LFW dataset with a lightweight network. The results also show that our proposed framework significantly reduces the gap between training with real and synthetic data. The source code for replicating the experiments is publicly released.
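For readers who want a concrete picture of the training loop the abstract describes, the following is a minimal PyTorch-style sketch of label-free distillation from generated faces. It is an illustration under assumptions, not the authors' released implementation: the generator G, frozen teacher, lightweight student, cosine-based embedding loss, and the simple "keep perturbed copies of the hardest latents" rule are all placeholders standing in for the paper's dynamic sampling from the generator's intermediate latent space.

```python
# Sketch only: G maps latent codes to face images, teacher is a frozen face
# recognition model (images -> embeddings), student is the lightweight model
# being trained. All three are assumed torch.nn.Module objects.
import torch
import torch.nn.functional as F

def train_step(G, teacher, student, optimizer, w):
    """One distillation step on a batch of latent codes w (no identity labels)."""
    with torch.no_grad():
        images = G(w)            # synthesize a batch of faces
        t_emb = teacher(images)  # target embeddings from the frozen teacher
    s_emb = student(images)      # student embeddings
    # Train the student to match the teacher's embedding space (cosine loss here;
    # the actual loss in the paper may differ).
    per_sample = 1.0 - F.cosine_similarity(s_emb, t_emb, dim=1)
    loss = per_sample.mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return per_sample.detach()   # per-sample losses, used to pick "hard" latents

def next_batch(w, per_sample, latent_dim, batch_size, frac_hard=0.5, noise=0.1):
    """Toy stand-in for the dynamic sampling strategy: re-explore around the
    latents the student found hardest, and fill the rest with fresh samples
    (a Gaussian latent is assumed here for simplicity)."""
    n_hard = int(frac_hard * batch_size)
    hard = w[per_sample.topk(n_hard).indices]
    hard = hard + noise * torch.randn_like(hard)
    fresh = torch.randn(batch_size - n_hard, latent_dim, device=w.device)
    return torch.cat([hard, fresh], dim=0)
```

The point of re-sampling around hard latents is the one the abstract makes: because no fixed synthetic dataset with identity labels is needed, new variations of challenging images can be generated on the fly during training. For the actual method, see the source code the authors have publicly released.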

Type
conference paper
DOI
10.1109/IJCB57857.2023.10448642
Web of Science ID
WOS:001180818700016
Author(s)
Shahreza, Hatef Otroshi
George, Anjith
Marcel, Sebastien
Corporate authors
IEEE
Date Issued
2023-01-01
Publisher
IEEE
Publisher place
New York
Published in
2023 IEEE International Joint Conference on Biometrics (IJCB)
ISBN of the book
979-8-3503-3726-6
Subjects
Technology
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
LIDIAP
Event name
IEEE International Joint Conference on Biometrics (IJCB)
Event place
Ljubljana, Slovenia
Event date
September 25-28, 2023
Funder
H2020 TReSPAsS-ETN Marie Sklodowska-Curie early training network (grant number 860813)
Funder
Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA) (grant number 2022-21102100007)
Available on Infoscience
April 17, 2024
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/207186