Anthropic Correction of Information Estimates and Its Application to Neural Coding
Information theory has been used as an organizing principle in neuroscience for several decades. Estimates of the mutual information (MI) between signals acquired in neurophysiological experiments are believed to yield insights into the structure of the underlying information processing architectures. With the pervasive availability of recordings from many neurons, several information and redundancy measures have been proposed in the recent literature. A typical scenario is that only a small number of stimuli can be tested, while ample response data may be available for each of the tested stimuli. The resulting asymmetric information estimation problem is considered. It is shown that the direct plug-in information estimate has a negative bias. An anthropic correction is introduced that has a positive bias. These two complementary estimators and their combinations are natural candidates for information estimation in neuroscience. Tail and variance bounds are given for both estimates. The proposed information estimates are applied to the analysis of neural discrimination and redundancy in the avian auditory system.
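The direct plug-in estimate referred to above is the maximum-likelihood estimator: empirical frequencies are substituted for the true probabilities in the mutual-information formula. A minimal sketch follows; the function name and the toy stimulus–response count table are illustrative, not taken from the paper, and the sketch does not reproduce the asymmetric few-stimuli regime in which the paper analyzes the estimator's negative bias.

```python
import numpy as np

def plugin_mi(counts):
    """Plug-in (maximum-likelihood) estimate of mutual information, in bits.

    counts: 2-D array where counts[s, r] is the number of trials on which
    stimulus s elicited response r.
    """
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()            # empirical joint distribution
    ps = p.sum(axis=1, keepdims=True)    # empirical stimulus marginal
    pr = p.sum(axis=0, keepdims=True)    # empirical response marginal
    nz = p > 0                           # 0 log 0 = 0 convention
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

# Deterministic stimulus-response map over 4 equiprobable stimuli:
# the plug-in estimate recovers log2(4) = 2 bits.
counts = np.eye(4, dtype=int) * 100
print(plugin_mi(counts))  # 2.0
```

With finitely many trials per stimulus the empirical frequencies deviate from the true conditional distributions, which is the source of the estimator's bias studied in the paper.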
Keywords: anthropic principle; information estimation; mutual information; neural coding; neurons; redundancy; synergy; entropy; representation; cortex