research article

Anthropic Correction of Information Estimates and Its Application to Neural Coding

Gastpar, Michael C. • Gill, Patrick R. • Huth, Alexander G. • Theunissen, Frederic E.
2010
IEEE Transactions on Information Theory

Information theory has been used as an organizing principle in neuroscience for several decades. Estimates of the mutual information (MI) between signals acquired in neurophysiological experiments are believed to yield insights into the structure of the underlying information processing architectures. With the pervasive availability of recordings from many neurons, several information and redundancy measures have been proposed in the recent literature. A typical scenario is that only a small number of stimuli can be tested, while ample response data may be available for each of the tested stimuli. The resulting asymmetric information estimation problem is considered. It is shown that the direct plug-in information estimate has a negative bias. An anthropic correction is introduced that has a positive bias. These two complementary estimators and their combinations are natural candidates for information estimation in neuroscience. Tail and variance bounds are given for both estimates. The proposed information estimates are applied to the analysis of neural discrimination and redundancy in the avian auditory system.
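The negative bias of direct plug-in estimates mentioned in the abstract can be illustrated with the classic, well-known case of plug-in (maximum-likelihood) entropy estimation from small samples; this is a minimal sketch of that general phenomenon, not the paper's anthropic correction or its asymmetric stimulus–response setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# True distribution: uniform over 8 symbols -> H = 3 bits exactly
K = 8
p = np.full(K, 1.0 / K)
H_true = -np.sum(p * np.log2(p))

def plugin_entropy(samples, K):
    """Plug-in (maximum-likelihood) entropy estimate in bits."""
    counts = np.bincount(samples, minlength=K)
    q = counts / counts.sum()
    q = q[q > 0]  # 0 * log 0 = 0 by convention
    return -np.sum(q * np.log2(q))

# Average the plug-in estimate over many small samples of size n:
# with few samples per distribution, the estimate is biased downward.
n, trials = 20, 2000
est = np.mean([plugin_entropy(rng.integers(0, K, size=n), K)
               for _ in range(trials)])

print(f"true H = {H_true:.3f} bits, mean plug-in estimate = {est:.3f} bits")
```

On average the plug-in estimate falls below the true entropy; the classic Miller–Madow correction adds roughly (K − 1)/(2n ln 2) bits to compensate. The symbol counts and sample sizes here are illustrative choices, not values from the paper.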

Type
research article
DOI
10.1109/TIT.2009.2037053
WOS:000275032900019

Author(s)
Gastpar, Michael C.  
Gill, Patrick R.
Huth, Alexander G.
Theunissen, Frederic E.
Date Issued
2010
Published in
IEEE Transactions on Information Theory
Volume
56
Start page
890
End page
900

Subjects
Anthropic principle • information estimation • neural coding • neuron • redundancy • Mutual Information • Neurons • Entropy • Representation • Distributions • Responses • Synergy • Cortex • Code

Editorial or Peer reviewed
REVIEWED
Written at
OTHER
EPFL units
LINX  
Available on Infoscience
October 17, 2011
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/71647