Infoscience

Research article

EFLOP: A Sparsity-aware Metric for Evaluating Computational Cost in Spiking and Non-spiking Neural Networks

Narduzzi, Simon • Zenke, Friedemann • Liu, Shih-Chii
September 1, 2025
Neuromorphic Computing And Engineering

Deploying energy-efficient deep neural networks on energy-constrained edge devices is an important research topic in both the machine learning and circuit design communities. Both artificial neural networks (ANNs) and spiking neural networks (SNNs) have been proposed as candidates for these tasks. In particular, SNNs are considered energy-efficient because they leverage temporal sparsity in their outputs. However, existing computational frameworks fail to accurately estimate the cost of running sparse networks on modern time-stepped hardware, which exploits sparsity by skipping zero-valued operations. Meanwhile, weight sparsity-aware training remains underexplored for SNNs and lacks systematic benchmarking against optimized ANNs, making fair comparisons between the two paradigms difficult. To bridge this gap, we introduce the effective floating-point operation (EFLOP), a metric that accounts for the sparse operations during pre-activation updates of both ANNs and SNNs. Applying weight sparsity-aware training to both SNNs and ANNs, we achieve up to an 8.9x reduction in EFLOPs for gated recurrent unit models and 3.6x for LIF models by sparsifying weights by 80%, without sacrificing accuracy on the Spiking Heidelberg Digits and Spiking Speech Commands datasets. These findings highlight the critical role of network sparsity in designing energy-efficient neural networks and establish EFLOPs as a robust framework for cross-paradigm comparisons.
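
This record does not reproduce the paper's exact EFLOP formula, so the snippet below is only a minimal sketch of the idea the abstract describes: a multiply-accumulate is counted as "effective" only when both the input value (activation or spike) and the weight are nonzero, so that activation sparsity and weight pruning both lower the count. The function name effective_ops, the counting rule, and the toy sparsity levels are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (illustrative, not the paper's implementation): count "effective"
# multiply-accumulates for one linear pre-activation update, assuming hardware
# skips any multiply in which either the input value or the weight is zero.
import numpy as np


def effective_ops(inputs: np.ndarray, weights: np.ndarray) -> int:
    """Count nonzero MACs in y = inputs @ weights.T.

    inputs  : (batch, n_in)  dense activations (ANN) or binary spikes (SNN)
    weights : (n_out, n_in)  possibly pruned weight matrix
    """
    # For input feature i: how many samples have a nonzero value there ...
    active_per_input = (inputs != 0).sum(axis=0)    # shape (n_in,)
    # ... and how many output units still carry a nonzero weight on it.
    fanout_per_input = (weights != 0).sum(axis=0)   # shape (n_in,)
    return int(active_per_input @ fanout_per_input)


# Toy comparison: ~10% active inputs and ~80% pruned weights vs. the dense count.
rng = np.random.default_rng(0)
x = (rng.random((32, 700)) < 0.1).astype(np.float32)               # sparse spike-like inputs
w = rng.normal(size=(128, 700)) * (rng.random((128, 700)) >= 0.8)  # ~80% weight sparsity
dense_macs = x.shape[0] * w.shape[0] * w.shape[1]
print(f"{effective_ops(x, w)} effective MACs vs. {dense_macs} dense MACs")
```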

Files
Name: Narduzzi_2025_Neuromorph._Comput._Eng._5_034011.pdf
Type: Main Document
Version: Published version
Access type: Open access
License Condition: CC BY
Size: 1.44 MB
Format: Adobe PDF
Checksum (MD5): 6d08cf3d972ac5c4704b8cba8de94ef9

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.