Infoscience

Supplementary Material - AL2: Progressive Activation Loss for Learning General Representations in Classification Neural Networks

El Helou, Majed • Dümbgen, Frederike • Süsstrunk, Sabine • 2020

In this supplementary material, we present the details of the neural network architecture and training settings used in all our experiments; these settings apply to the experiments in the main paper as well as to those reported here. We also show the summary results of all 96 of our experiments (test accuracy, training cross-entropy loss, and regularization loss), sampled at 100-epoch intervals. We analyze these results for each of the benchmark datasets, namely MNIST, Fashion-MNIST, and CIFAR10, and highlight global observations that hold across the entire experiment set.
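The abstract above mentions a training cross-entropy loss alongside a regularization loss but does not spell out the method, and the paper itself is not reproduced on this page. Purely as an illustration of what a "progressive activation loss" could look like, the sketch below combines cross-entropy with an L2 penalty on the penultimate activations whose weight ramps up linearly over training. The model, the penalty form, the linear schedule, and every name (SmallClassifier, activation_loss_weight, training_loss, max_weight) are assumptions inferred from the title, not the authors' AL2 method.

```python
# Hedged sketch only: all choices below are inferred from the paper title,
# not taken from the paper itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallClassifier(nn.Module):
    """Classifier that also returns its penultimate activations."""
    def __init__(self, in_features: int, num_classes: int, hidden: int = 128):
        super().__init__()
        self.body = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_features, hidden),
            nn.ReLU(),
        )
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):
        feats = self.body(x)          # penultimate activations
        return self.head(feats), feats

def activation_loss_weight(epoch: int, total_epochs: int,
                           max_weight: float = 1e-3) -> float:
    """Hypothetical linear ramp standing in for the 'progressive' schedule."""
    return max_weight * epoch / total_epochs

def training_loss(model, x, y, epoch, total_epochs):
    logits, feats = model(x)
    ce = F.cross_entropy(logits, y)   # training cross-entropy loss
    reg = feats.pow(2).mean()         # activation (regularization) loss
    return ce + activation_loss_weight(epoch, total_epochs) * reg, ce, reg

# Example: one forward pass on a fake MNIST-shaped batch.
model = SmallClassifier(in_features=28 * 28, num_classes=10)
x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
total, ce, reg = training_loss(model, x, y, epoch=50, total_epochs=100)
```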

Type
working paper
Author(s)
El Helou, Majed  
Dümbgen, Frederike  
Süsstrunk, Sabine  
Date Issued
2020
Editorial or Peer reviewed
NON-REVIEWED
Written at
EPFL
EPFL units
IVRL
LCAV
Available on Infoscience
October 21, 2019
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/162164
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.