A New Estimator of Shannon Entropy with Application to Goodness-of-Fit Test to Normal Distribution
Mbanefo S. Madukaife *
Department of Statistics, University of Nigeria, Nsukka, Nigeria.
*Author to whom correspondence should be addressed.
Abstract
In this paper, a new estimator of the Shannon entropy of a random variable X having a probability density function \(f(x)\) is obtained based on window-size spacings. Through an extensive simulation study at sample sizes 10, 20, and 30, the estimator is shown to have relatively low bias and low RMSE under the standard normal, standard exponential, and uniform distributions. Based on these results, it is recommended as a good estimator of entropy. The new estimator is also applied to a goodness-of-fit test for normality. The resulting statistic is affine invariant and consistent, and the results show that it is a good statistic for assessing the univariate normality of datasets.
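The abstract does not reproduce the new estimator's formula, so the sketch below illustrates only the window-size-spacing family it builds on: the classic m-spacing estimator of Vasicek (1976), together with a Monte Carlo bias/RMSE check of the kind the abstract describes. The window heuristic m ≈ √n, the function names, and the replication count are assumptions for illustration, not details from the paper; the target values are the exact entropies 0.5·log(2πe) for N(0,1), 1 for Exp(1), and 0 for U(0,1).

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek-type m-spacing estimator of Shannon entropy (a sketch;
    the paper's new estimator modifies this window-size-spacing idea).

        H_hat = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),

    with order statistics clamped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, round(np.sqrt(n)))    # assumed window-size heuristic
    i = np.arange(n)
    upper = x[np.minimum(i + m, n - 1)]  # x_(i+m), clamped to x_(n)
    lower = x[np.maximum(i - m, 0)]      # x_(i-m), clamped to x_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

def bias_rmse(sampler, true_h, n, reps=10_000, rng=None):
    """Monte Carlo bias and RMSE of the estimator at sample size n."""
    rng = np.random.default_rng(rng)
    est = np.array([vasicek_entropy(sampler(rng, n)) for _ in range(reps)])
    return est.mean() - true_h, np.sqrt(np.mean((est - true_h) ** 2))

# The three null distributions of the abstract's simulation study,
# paired with their exact Shannon entropies.
cases = {
    "N(0,1)": (lambda g, n: g.standard_normal(n), 0.5 * np.log(2 * np.pi * np.e)),
    "Exp(1)": (lambda g, n: g.exponential(size=n), 1.0),
    "U(0,1)": (lambda g, n: g.uniform(size=n), 0.0),
}
for name, (sampler, h) in cases.items():
    for n in (10, 20, 30):
        b, r = bias_rmse(sampler, h, n, reps=2000, rng=0)
        print(f"{name:7s} n={n:2d}  bias={b:+.3f}  RMSE={r:.3f}")
```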
Keywords: Entropy estimator, window size spacing, bias of an estimator, root mean square error of an estimator, test for normality
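The paper's test statistic is likewise not given on this page; the sketch below shows how a spacing-based entropy estimate is typically turned into a normality test (in the spirit of Vasicek, 1976), reusing vasicek_entropy from the sketch above. The statistic K = exp(Ĥ)/s is location-scale (hence, in one dimension, affine) invariant, since shifting and rescaling the data by a > 0 adds log a to Ĥ and multiplies s by a. Small values of K reject, because the normal maximizes entropy for a fixed variance, so exp(H)/σ = √(2πe) at the normal and is smaller elsewhere. The simulated critical value and the default settings are assumptions for illustration.

```python
import numpy as np

def entropy_normality_test(x, n_mc=10_000, alpha=0.05, rng=None):
    """Entropy-based normality test (sketch, not the paper's statistic).

    K = exp(H_hat) / s is affine invariant; under H0 it concentrates
    near sqrt(2*pi*e) ~ 4.13 and is smaller under alternatives, so the
    test rejects in the left tail. The critical value is simulated
    under N(0,1), which is valid for any location and scale."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = len(x)
    k_obs = np.exp(vasicek_entropy(x)) / np.std(x, ddof=1)
    # Null distribution of K by Monte Carlo under the standard normal
    k_null = np.array([
        np.exp(vasicek_entropy(z)) / np.std(z, ddof=1)
        for z in rng.standard_normal((n_mc, n))
    ])
    crit = np.quantile(k_null, alpha)       # left-tailed critical value
    return k_obs, crit, bool(k_obs < crit)  # True => reject normality

# Example: a clearly non-normal (exponential) sample is usually rejected
g = np.random.default_rng(1)
print(entropy_normality_test(g.exponential(size=30), rng=2))
```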