Bounds for the Varentropy of Basic Discrete Distributions and Characterization of Some Discrete Distributions

Document Type: Original Paper

Author

Department of Statistics, Faculty of Mathematical Sciences, University of Kashan, Kashan, Islamic Republic of Iran

Abstract

Given the importance of varentropy in information theory, and since a closed form cannot be derived for some discrete distributions, we aim to establish bounds for the varentropy of these distributions and to introduce the past varentropy for discrete random variables. In this article, we first obtain lower and upper bounds for the varentropy of the Poisson, binomial, negative binomial, and hypergeometric distributions. Since the resulting upper bounds are expressed as squared logarithmic expectations, we provide an equivalent formulation in terms of squared logarithmic difference coefficients; similarly, we present lower bounds in terms of logarithmic difference coefficients. Furthermore, an upper bound is derived for the variance of a function of the discrete reversed residual lifetime. We also investigate inequalities involving the moments of selected functions via the reversed hazard rate and characterize certain discrete distributions through the Cauchy-Schwarz inequality.
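For context, the varentropy of a discrete random variable X with probability mass function p is Var(-log p(X)) = E[(log p(X))^2] - (E[log p(X)])^2, the quantity for which the bounds above are derived. The sketch below is illustrative only and not taken from the paper (the function name poisson_varentropy is ours): it approximates this quantity for a Poisson distribution by truncated summation over the support, the kind of direct computation that the lack of a closed form makes necessary.

```python
# Illustrative sketch (not from the paper): numerically approximate the
# varentropy Var(-log p(X)) of X ~ Poisson(lam) by summing over the support
# until almost all of the probability mass has been accumulated.
import math

def poisson_varentropy(lam, tol=1e-12):
    """Return an approximation of Var(-log p(X)) for X ~ Poisson(lam)."""
    k, mass, m1, m2 = 0, 0.0, 0.0, 0.0
    while mass < 1.0 - tol:
        # log p(k) = -lam + k*log(lam) - log(k!), kept on the log scale for stability
        log_p = -lam + k * math.log(lam) - math.lgamma(k + 1)
        p = math.exp(log_p)
        mass += p
        m1 += p * (-log_p)      # accumulates E[-log p(X)] (the Shannon entropy)
        m2 += p * log_p ** 2    # accumulates E[(log p(X))^2]
        k += 1
    return m2 - m1 ** 2         # varentropy = E[(log p)^2] - (E[log p])^2

if __name__ == "__main__":
    print(poisson_varentropy(3.0))  # varentropy of Poisson(3), for example
```

The truncation tolerance controls how much tail mass is ignored, and working on the log scale keeps the mass function stable for moderately large rates.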

Keywords

Main Subjects

References

  1. Jiang J, Wang R, Pezeril M and Wang QA. Application of varentropy as a measure of probabilistic uncertainty for complex networks, Science Bulletin. 2011; 56: 3677–3682.
  2. Li J, Fradelizi M and Madiman M. Information concentration for convex measures, IEEE International Symposium on Information Theory, Barcelona. 2016; 1128–1132.
  3. Fradelizi M, Madiman M and Wang L. Optimal concentration of information content for log-concave densities. In C. Houdré, D. Mason, P. Reynaud-Bouret and J. Rosiński (eds.), High Dimensional Probability VII. Progress in Probability, vol. 71, Cham, Springer. 2016; 45–60.
  4. Gupta BB and Badve OP. GARCH and ANN-based DDoS detection and filtering in cloud computing environment, International Journal of Embedded Systems. 2017; 9: 391–400.
  5. De Gregorio A and Iacus SM. On Rényi information for ergodic diffusion processes, Information Sciences. 2009; 179: 279–291.
  6. Goodarzi F, Amini M and Mohtashami Borzadaran GR. Characterizations of continuous distributions through inequalities involving the expected values of selected functions, Applications of Mathematics. 2017(a); 62: 493–507.
  7. Goodarzi F, Amini M and Mohtashami Borzadaran GR. On lower bounds for the variance of functions of random variables, Applications of Mathematics. 2021; 66: 767–788.
  8. Sharma A and Kundu C. Varentropy of doubly truncated random variable, Probability in the Engineering and Informational Sciences. 2022; 37(3): 852–871.
  9. Maadani S, Mohtashami Borzadaran GR and Rezaei Roknabadi AH. Varentropy of order statistics and some stochastic comparisons, Communications in Statistics - Theory and Methods. 2022; 51: 6447–6460.
  10. Goodarzi F, Amini M and Mohtashami Borzadaran GR. Some results on upper bounds for the variance of functions of the residual life random variables, Journal of Computational and Applied Mathematics. 2017(b); 320: 30–42.
  11. Goodarzi F, Amini M and Mohtashami Borzadaran GR. On upper bounds for the variance of functions of the inactivity time, Statistics and Probability Letters. 2016; 117: 62–71.
  12. Buono F and Longobardi M. Varentropy of past lifetimes, Mathematical Methods of Statistics. 2022; 31: 57–73.
  13. Goodarzi F. Characterizations of some discrete distributions and upper bounds on discrete residual varentropy, Journal of the Iranian Statistical Society. 2022; 21(2): 233–250.
  14. Alizadeh Noughabi H and Shafaei Noughabi M. Varentropy estimators with applications in testing uniformity, Journal of Statistical Computation and Simulation. 2023; 93: 2582–2599.
  15. Kontoyiannis I and Verdú S. Optimal lossless data compression: non-asymptotics and asymptotics, IEEE Transactions on Information Theory. 2014; 60: 777–795.
  16. Nair NU and Sankaran PG. Characterizations of discrete distributions using reliability concepts in reversed time, Statistics and Probability Letters. 2013; 83: 1939–1945.
  17. Cacoullos T and Papathanasiou V. Characterizations of distributions by variance bounds, Statistics and Probability Letters. 1989; 7: 351–356.
  18. Kelley CT. Solving nonlinear equations with Newton's method, SIAM, Philadelphia. 2003.
  19. Song KS. Rényi information, loglikelihood and an intrinsic distribution measure, Journal of Statistical Planning and Inference. 2001; 93: 51–69.
  20. Cheraghchi M. Expressions for the entropy of basic discrete distributions, IEEE Transactions on Information Theory. 2019; 65: 3999–4009.
  21. Gupta L. Properties of reliability functions of discrete distributions, Communications in Statistics - Theory and Methods. 2015; 44: 4114–4131.