The effect of interspike interval statistics on the information gain under the rate coding hypothesis

Publication
Math. Biosci. Eng.

Abstract

We investigate how much information can theoretically be gained from a variable neuronal firing rate compared with a constant average firing rate. We employ the statistical concept of information based on the Kullback-Leibler divergence, and assume rate-modulated renewal processes as a model of spike trains. We show that if the firing rate variation is sufficiently small and slow (with respect to the mean interspike interval), the information gain can be expressed in terms of the Fisher information. Furthermore, under certain assumptions, the smallest possible information gain is provided by gamma-distributed interspike intervals. The methodology is illustrated and discussed using several different statistical models of neuronal activity.
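The small-perturbation relationship described above can be sketched numerically. The snippet below (a minimal illustration, not code from the paper) takes gamma-distributed interspike intervals with shape k and mean 1/λ, uses the closed-form KL divergence between two same-shape gamma densities, and compares it with the quadratic Fisher-information approximation KL ≈ ½ J(λ) δ² for a small rate perturbation δ; the parameter values are arbitrary.

```python
import math

def kl_gamma_same_shape(k, lam1, lam2):
    """KL divergence KL(p1 || p2) between two gamma ISI densities with
    common shape k and firing rates lam1, lam2 (mean ISI = 1/lam,
    so the gamma rate parameter is beta = k * lam)."""
    ratio = lam1 / lam2
    return k * (math.log(ratio) + 1.0 / ratio - 1.0)

def fisher_info_gamma(k, lam):
    """Fisher information of the gamma ISI family with respect to the
    firing rate lam; for this family J(lam) = k / lam**2."""
    return k / lam**2

# Hypothetical values: shape k, baseline rate lam (Hz), small perturbation delta.
k, lam, delta = 4.0, 10.0, 0.1

kl_exact = kl_gamma_same_shape(k, lam, lam + delta)
kl_approx = 0.5 * fisher_info_gamma(k, lam) * delta**2

# For small, slow rate modulation the two quantities nearly coincide.
print(f"exact KL:  {kl_exact:.6e}")
print(f"Fisher approx: {kl_approx:.6e}")
```

As δ shrinks relative to λ, the exact divergence and the Fisher approximation agree to leading order, which is the regime the abstract refers to as "sufficiently small and slow" rate variation.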