Conference poster (not in proceedings)

Learning, Inference, and Replay of Hidden State Sequences in Recurrent Spiking Neural Networks

Corneil, Dane Sterling • Neftci, Emre • Indiveri, Giacomo
2014
COSYNE 2014

Learning to recognize, predict, and generate spatio-temporal patterns and sequences of spikes is a key feature of nervous systems, and essential for solving basic tasks like localization and navigation. How this can be done by a spiking network, however, remains an open question. Here we present an STDP-based framework, extending a previous model [1], that can simultaneously learn to abstract hidden states from sensory inputs and learn transition probabilities [2] between these states in recurrent connection weights.
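
The framework itself is detailed in the poster PDF below; as a rough illustration of the kind of plasticity rule the abstract refers to, the sketch below implements a generic pairwise STDP weight update on a recurrent weight matrix using exponential spike traces. This is a minimal sketch under assumed parameters, not the authors' model: the neuron count, time constants, amplitudes (`A_plus`, `A_minus`, `tau_plus`, `tau_minus`), and the random input drive are all illustrative assumptions.

    # Minimal sketch of a generic pairwise STDP update on a recurrent weight
    # matrix, using exponential pre-/post-synaptic traces. NOT the model from
    # the poster; all names and parameter values are illustrative assumptions.
    import numpy as np

    N = 50                              # number of recurrent neurons (assumed)
    dt = 1.0                            # simulation time step, ms (assumed)
    tau_plus, tau_minus = 20.0, 20.0    # trace time constants, ms (assumed)
    A_plus, A_minus = 0.01, 0.012       # potentiation / depression amplitudes (assumed)

    rng = np.random.default_rng(0)
    W = rng.uniform(0.0, 0.1, size=(N, N))   # recurrent weights, W[post, pre]
    np.fill_diagonal(W, 0.0)                 # no self-connections

    x_pre = np.zeros(N)                 # presynaptic spike traces
    x_post = np.zeros(N)                # postsynaptic spike traces

    def stdp_step(spikes, W, x_pre, x_post):
        """One STDP update given a boolean spike vector for this time step."""
        # Decay the traces.
        x_pre *= np.exp(-dt / tau_plus)
        x_post *= np.exp(-dt / tau_minus)
        s = spikes.astype(float)
        # Potentiate synapses onto neurons that just spiked (pre-before-post),
        # depress synapses from neurons that just spiked (post-before-pre).
        W += A_plus * np.outer(s, x_pre) - A_minus * np.outer(x_post, s)
        np.clip(W, 0.0, 1.0, out=W)
        # Add the new spikes to the traces.
        x_pre += s
        x_post += s
        return W, x_pre, x_post

    # Drive the rule with random spikes as a stand-in for actual network activity.
    for _ in range(1000):
        spikes = rng.random(N) < 0.02
        W, x_pre, x_post = stdp_step(spikes, W, x_pre, x_post)

In the actual framework, the recurrent weights shaped by such a rule are what encode the learned transition probabilities between hidden states, while the hidden states themselves are abstracted from the sensory inputs.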

Files
Name: Cosyne14_Corneil.pdf
Access type: openaccess
Size: 4.2 MB
Format: Adobe PDF
Checksum (MD5): 45e47d219ebc61697ee4d820b607344e


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.