Similarity of interspike interval distributions and information gain in a stationary neuronal firing

Publication: Biol. Cybern.

The Kullback-Leibler (KL) information distance is proposed as a measure of similarity between two interspike interval (ISI) distributions. The method is applied by comparing four common ISI descriptors with the exponential model, which is characterized by the highest entropy. Under the condition of equal mean ISI, the KL distance corresponds to the information gained in moving from the state described by the exponential distribution to the state described by the chosen ISI model. It is shown that information can be transmitted without changing either the spike rate or the coefficient of variation (CV). Furthermore, the KL distance offers an indication of the exponentiality of the chosen ISI descriptor (or data): the distance is zero if, and only if, the ISIs are distributed exponentially. Finally, an application to experimental data from the olfactory sensory neurons of rats is presented.
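As an illustration of the idea (not taken from the paper), the KL distance of an ISI model from the exponential distribution with the same mean can be evaluated numerically. The sketch below uses a gamma ISI model as an example descriptor; the mean ISI value, the shape parameter, and the function names are illustrative assumptions. Note that for shape 1 the gamma density reduces to the exponential one, so the distance vanishes, matching the if-and-only-if property stated above.

```python
import numpy as np
from scipy import stats, integrate

def kl_to_exponential(p_pdf, mean_isi):
    """Numerical KL distance D(p || exp) of an ISI density p (with the
    given mean) from the exponential density with the same mean ISI."""
    lam = 1.0 / mean_isi  # rate of the matching exponential model

    def integrand(x):
        px = p_pdf(x)
        if px <= 0.0:
            return 0.0  # 0 * log 0 is taken as 0
        qx = lam * np.exp(-lam * x)
        return px * (np.log(px) - np.log(qx))

    val, _ = integrate.quad(integrand, 0.0, np.inf)
    return val

mean_isi = 0.05  # hypothetical 50 ms mean ISI

# Gamma ISI model with shape k = 4 and the same mean (scale = mean / k):
# same spike rate as the exponential model, yet nonzero KL distance.
k = 4.0
d_gamma = kl_to_exponential(stats.gamma(a=k, scale=mean_isi / k).pdf, mean_isi)

# Shape k = 1 recovers the exponential distribution itself: distance zero.
d_exp = kl_to_exponential(stats.gamma(a=1.0, scale=mean_isi).pdf, mean_isi)
```

Because both densities share the same mean, a positive `d_gamma` reflects information carried purely by the ISI shape, with the firing rate held fixed.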