Infoscience — EPFL, École polytechnique fédérale de Lausanne
Doctoral thesis

SPAD Image Sensors with Embedded Intelligence

Lin, Yang  
2025

Single-photon avalanche diodes (SPADs) are solid-state photodetectors that can detect individual photons with picosecond timing precision, enabling powerful time-resolved imaging across scientific, industrial, and biomedical applications. Despite their unique sensitivity, conventional SPAD imaging workflows passively collect photons, transfer large volumes of raw data off-chip, and reconstruct results through offline post-processing, leading to inefficiencies in photon usage, high latency, and limited adaptability. This thesis explores the potential of embedded artificial intelligence (AI) for efficient, real-time, intelligent processing in SPAD imaging through hardware-software co-design, bringing computation directly to the sensor to process photon data in its native form.

Two general frameworks are proposed, each representing a paradigm shift from the conventional process. The first framework is inspired by the power of artificial neural networks (ANNs) in computer vision. It employs recurrent neural networks (RNNs) that operate directly on timestamps of photon arrival, extracting temporal information in an event-driven manner. The RNN is trained and evaluated for fluorescence lifetime estimation, achieving high precision and robustness. Quantization and approximation techniques are explored to enable FPGA implementation. Based on this, an imaging system integrating a SPAD image sensor with an on-FPGA RNN is developed, enabling real-time fluorescence lifetime imaging and demonstrating generalizability to other time-resolved tasks.

The second framework is inspired by the human visual system, employing spiking neural networks (SNNs) that operate directly on the asynchronous pulses generated by SPAD avalanche breakdown upon photon arrival, thereby enabling temporal analysis with ultra-low latency and energy-efficient computation.
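The RNN estimator itself is not specified in this abstract. As an illustrative point of reference only, the classical baseline that learned fluorescence-lifetime estimators are commonly compared against is the center-of-mass (mean-timestamp) method: for a mono-exponential decay observed over an effectively unbounded window, the maximum-likelihood lifetime estimate is simply the mean of the photon arrival timestamps. A minimal sketch on synthetic data (all names and values here are hypothetical, not from the thesis):

```python
import numpy as np

def cmm_lifetime(timestamps: np.ndarray) -> float:
    """Center-of-mass lifetime estimate from photon arrival timestamps (ns)."""
    return float(np.mean(timestamps))

rng = np.random.default_rng(0)
tau_true = 2.5                               # ground-truth lifetime, ns
photons = rng.exponential(tau_true, 10_000)  # synthetic photon timestamps
tau_hat = cmm_lifetime(photons)              # estimate converges to tau_true
```

With 10,000 photons the standard error of this estimate is roughly tau/sqrt(N) ≈ 0.025 ns; event-driven learned estimators such as the RNN described above aim to improve on this kind of baseline under realistic conditions (finite gating windows, background counts, multi-exponential decays).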
Two hardware-friendly SNN architectures, the Transporter SNN and the Reversed start-stop SNN, are proposed, which transform the phase-coded spike trains into density-coded and inter-spike-interval-coded representations, enabling more efficient training and processing. Dedicated training methods are explored, and both architectures are validated through fluorescence lifetime imaging. Based on the Transporter SNN architecture, the first SPAD image sensor with an on-chip spike encoder for active time-resolved imaging is developed.

This thesis encompasses a full-stack imaging workflow, spanning SPAD image sensor design, FPGA implementation, software development, neural network training and evaluation, mathematical modeling, fluorescence lifetime imaging, and optical system setup. Together, these contributions establish new paradigms of intelligent SPAD imaging, where sensing and computation are deeply integrated. The proposed frameworks demonstrate significant gains in photon efficiency, processing speed, robustness, and adaptability, illustrating how embedded AI can transform SPAD systems from passive detectors into intelligent, adaptive, and autonomous imaging platforms for next-generation applications.
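The Transporter and Reversed start-stop architectures are not detailed in this abstract, but the two generic re-codings it names can be sketched. The following illustrative (and purely assumed) helpers turn phase-coded spike times, where the arrival time itself carries the value, into a density code (spike counts per time bin) and an inter-spike-interval code (gaps between consecutive spikes):

```python
import numpy as np

def density_code(spike_times, t_max, n_bins):
    """Density coding: spike counts per time bin over the window [0, t_max)."""
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, t_max))
    return counts

def isi_code(spike_times):
    """Inter-spike-interval coding: gaps between consecutive (sorted) spikes."""
    return np.diff(np.sort(np.asarray(spike_times)))

spikes = [0.5, 1.25, 2.0, 3.5]   # phase-coded spike times (arbitrary units)
dc = density_code(spikes, t_max=4.0, n_bins=4)   # → array([1, 1, 1, 1])
ic = isi_code(spikes)                            # → array([0.75, 0.75, 1.5])
```

Both target codes are insensitive to a global time offset in a way raw phase codes are not, which is one plausible reason such re-codings ease training; the actual on-chip spike encoder in the thesis is, of course, a hardware circuit rather than array arithmetic.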

Name: EPFL_TH11448.pdf
Type: Main Document
Version: Not Applicable (or Unknown)
Access type: openaccess
License Condition: N/A
Size: 30.53 MB
Format: Adobe PDF
Checksum (MD5): de74a51ce6e1e44d061c6fb400c922fa

Contact: infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.