Information Theory

Asymptotic Properties of the Plug-in Estimator of the Discrete Entropy under Dependence

We consider the estimation of the entropy of a discretely supported time series through a plug-in estimator. We derive a bias correction and study the asymptotic properties of the estimator. We show that the widely used correction proposed by Roulston (1999) is incorrect, as it does not remove the $O\left(N^{-1}\right)$ part of the bias, while ours does. We derive the asymptotic distribution and show that it differs depending on whether the values taken by the marginal distribution of the process are equiprobable (a situation that we call *degeneracy*) or not. We introduce estimators of the bias, the variance, and the distribution under degeneracy, and we study the estimation error. Finally, we propose a goodness-of-fit test based on entropy and give two motivations for it. The theoretical results are supported by specific numerical examples.
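As a minimal illustration of the quantities discussed in the abstract, the following Python sketch computes the plug-in entropy of a sample together with the Miller–Madow-type $(M-1)/(2N)$ adjustment advocated by Roulston (1999), where $M$ is the number of observed states and $N$ the sample size. The function names are ours, the Markov-chain example is an assumption chosen to match the time-series setting, and the paper's own corrected estimator is not reproduced here.

```python
import numpy as np

def plug_in_entropy(x):
    """Plug-in (maximum-likelihood) entropy estimate in nats:
    H_hat = -sum_i p_hat_i * log(p_hat_i), with p_hat the empirical pmf."""
    counts = np.unique(np.asarray(x), return_counts=True)[1]
    p_hat = counts / counts.sum()
    return float(-np.sum(p_hat * np.log(p_hat)))

def roulston_corrected_entropy(x):
    """Roulston (1999)-style correction: H_hat + (M - 1) / (2 N), with M the
    number of observed states. Per the abstract, under dependence this does
    not remove the O(N^{-1}) part of the bias."""
    x = np.asarray(x)
    m = np.unique(x).size
    return plug_in_entropy(x) + (m - 1) / (2 * x.size)

# Example on a dependent binary series: a two-state Markov chain with
# asymmetric transition probabilities, so the marginal distribution is
# not equiprobable (i.e., the non-degenerate case).
rng = np.random.default_rng(0)
x = np.empty(10_000, dtype=int)
x[0] = 0
for t in range(1, x.size):
    stay = 0.9 if x[t - 1] == 0 else 0.7  # P(stay in state 0), P(stay in state 1)
    x[t] = x[t - 1] if rng.random() < stay else 1 - x[t - 1]
print(plug_in_entropy(x), roulston_corrected_entropy(x))
```

On a dependent series like this one, the gap between the two estimates reflects the $O\left(N^{-1}\right)$ bias that the paper analyses; under degeneracy (equiprobable marginal values) the asymptotic distribution additionally changes, as stated in the abstract.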

Asymptotic properties of the plug-in estimator of the discrete entropy under dependence

Seminar

Statistical properties of a divergence measure

Seminar