
Jensen-variance distance measure: a unified framework for statistical and information measures


Abstract

The Jensen-variance (JV) distance measure is introduced and some of its properties are developed. The JV distance measure can be expressed using two interesting representations: the first one is based on mixture covariances, and the second one is in terms of the scaled variance of the absolute difference of two random variables. The connections between the JV distance measure and some well-known information measures, such as Fisher information, Gini mean difference, cumulative residual entropy, Fano factor, varentropy, varextropy, and chi-square distance measures, are examined. Specifically, the JV distance measure possesses metric properties and unifies most of the information measures within a general framework. It also includes variance and conditional variance as special cases. Furthermore, an extension of the JV distance measure in terms of transformed variables is provided. Finally, to demonstrate the usefulness of the proposed methods, the JV distance is applied to a real-life dataset related to the fish condition factor index, and some numerical results assuming skew-normal-distributed samples are presented.


Data availability

The data used in Sect. 3.3 and the R code of this paper will be made available from the corresponding author upon reasonable request.

References

  • Abid SH, Quaez UJ, Contreras-Reyes JE (2021) An information-theoretic approach for multivariate skew-\(t\) distributions and applications. Mathematics 9:146


  • Azzalini A (2013) The skew-normal and related families, vol 3. Cambridge University Press, Cambridge, UK


  • Balakrishnan N, Buono F, Calì C, Longobardi M (2023) Dispersion indices based on Kerridge inaccuracy measure and Kullback-Leibler divergence. Commun Stat Theory Methods. https://doi.org/10.1080/03610926.2023.2222926


  • Balakrishnan N, Kharazmi O (2022) Cumulative past Fisher information measure and its extensions. Braz J Prob Stat 36:540–559


  • Barlow RE, Proschan F (1975) Statistical theory of reliability and life testing: probability models. Florida State University, Tallahassee


  • Bobkov SG (2019) Moments of the scores. IEEE Trans Inf Theory 65:5294–5301


  • Bercher JF (2013) Some properties of generalized Fisher information in the context of nonextensive thermostatistics. Physica A 392:3140–3154


  • Casella G, Berger RL (2021) Statistical inference. Cengage Learning, Sao Paulo


  • Chiogna M (2005) A note on the asymptotic distribution of the maximum likelihood estimator for the scalar skew-normal distribution. Stat Methods Appl 14:331–341


  • Contreras-Reyes JE (2016) Analyzing fish condition factor index through skew-Gaussian information theory quantifiers. Fluct Noise Lett 15:1650013


  • Contreras-Reyes JE (2021) Fisher information and uncertainty principle for skew-Gaussian random variables. Fluct Noise Lett 20:2150039


  • Contreras-Reyes JE (2023a) Information quantity evaluation of nonlinear time series processes and applications. Physica D 445:133620


  • Contreras-Reyes JE (2023b) Information quantity evaluation of multivariate SETAR processes of order one and applications. Stat Papers. https://doi.org/10.1007/s00362-023-01457-6


  • Contreras-Reyes JE, Canales TM, Rojas PM (2016) Influence of climate variability on anchovy reproductive timing off northern Chile. J Mar Syst 164:67–75


  • Cover TM, Thomas JA (2006) Elements of information theory. Wiley, New York


  • Di Crescenzo A, Paolillo L (2021) Analysis and applications of the residual varentropy of random lifetimes. Prob Eng Inf Sci 35:680–698


  • Feller W, Morse PM (1958) An introduction to probability theory and its applications. Wiley, New York


  • Fisher RA (1929) Tests of significance in harmonic analysis. Proc R Soc Lond Series A 125:54–59


  • Gupta RC, Brown N (2001) Reliability studies of the skew-normal distribution and its application to a strength-stress model. Commun Stat Theory Methods 30:2427–2445


  • Hastie T, Tibshirani R, Friedman JH (2009) The elements of statistical learning: data mining, inference, and prediction, vol 2. Springer, New York


  • Johnson O (2004) Information theory and the central limit theorem. World Scientific, Singapore


  • Kattumannil SK, Sreelakshmi N, Balakrishnan N (2020) Non-parametric inference for Gini covariance and its variants. Sankhya A 84:790–807


  • Kharazmi O, Asadi M (2018) On the time-dependent Fisher information of a density function. Braz J Prob Stat 32:795–814


  • Kharazmi O, Balakrishnan N, Jamali H (2022) Cumulative residual \(q\)-Fisher information and Jensen-cumulative residual \(\chi ^2\) divergence measures. Entropy 24:341


  • Kharazmi O, Contreras-Reyes JE, Balakrishnan N (2023a) Jensen-Fisher information and Jensen-Shannon entropy measures based on complementary discrete distributions with an application to Conway’s game of life. Physica D 453:133822


  • Kharazmi O, Jamali H, Contreras-Reyes JE (2023b) Fisher information and its extensions based on infinite mixture density functions. Physica A 624:128959


  • Kharazmi O, Balakrishnan N, Ozonur D (2023c) Jensen-discrete information generating function with an application to image processing. Soft Comput 27:4543–4552


  • Lin J (1991) Divergence measures based on the Shannon entropy. IEEE Trans Inf Theory 37:145–151


  • Mehrali Y, Asadi M, Kharazmi O (2018) A Jensen-Gini measure of divergence with application in parameter estimation. Metron 76:115–131


  • Montgomery DC, Runger GC (2020) Applied statistics and probability for engineers. Wiley, Chichester


  • Nielsen F, Nock R (2013) On the chi square and higher-order chi distances for approximating \(f\)-divergences. IEEE Signal Process Lett 21:10–13


  • Noughabi HA, Noughabi MS (2023) Varentropy estimators with applications in testing uniformity. J Stat Comput Simul 93:2582–2599


  • Ramachandran KM, Tsokos CP (2020) Mathematical statistics with applications in R. Academic Press, Hoboken


  • Ross SM (2014) Introduction to probability models. Academic Press, San Diego


  • Sánchez-Moreno P, Zarzo A, Dehesa JS (2012) Jensen divergence based on Fisher’s information. J Phys A 45:125305


  • Shao J (2003) Mathematical statistics. Springer, New York


  • Wooldridge JM (2015) Introductory econometrics: a modern approach. Cengage Learning, Mason



Acknowledgements

The authors would like to thank the editor and an anonymous referee for their helpful comments and suggestions.

Funding

No funds received.

Author information


Corresponding author

Correspondence to Javier E. Contreras-Reyes.

Ethics declarations

Conflict of interest

The authors declare that they have no known conflict of interest/competing interest that could have appeared to influence the work reported in this paper.

Additional information

Communicated by Clémentine Prieur.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A: \((\alpha ,w)\)-Jensen-variance distance measure

Definition 4

The \((\alpha,w)\)-JV distance between two random variables X and Y, for \(\alpha, w \in (0,1)\), is defined as

$$\begin{aligned} \mathcal{J}_{w,\alpha}(X,Y) &= w\,Var\big((1-\alpha)X+\alpha Y\big)+(1-w)\,Var\big(\alpha X+(1-\alpha)Y\big)\\ &\quad -Var\big((1-\bar{s})X+\bar{s}Y\big), \end{aligned}$$
(18)

where \({\bar{s}}=w\alpha +(1-w)(1-\alpha )\).
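As a rough numerical illustration of (18), the following R sketch computes a plug-in estimate of \(\mathcal{J}_{w,\alpha}(X,Y)\) from two simulated samples; the normal distributions, parameter values and sample size are arbitrary choices of ours, not taken from the paper.

```r
# Plug-in estimate of the (alpha, w)-JV distance in (18).
# The normal samples, parameter values and sample size are illustrative only.
set.seed(1)
n     <- 1e5
x     <- rnorm(n, mean = 0, sd = 1)      # sample from X
y     <- rnorm(n, mean = 2, sd = 1.5)    # sample from Y (independent of X)
alpha <- 0.3
w     <- 0.6
s_bar <- w * alpha + (1 - w) * (1 - alpha)

jv_alpha_w <- w * var((1 - alpha) * x + alpha * y) +
  (1 - w) * var(alpha * x + (1 - alpha) * y) -
  var((1 - s_bar) * x + s_bar * y)
jv_alpha_w
```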

Theorem 10

The connection between the \((\alpha,w)\)-JV distance and the JV distance is given by

$$\begin{aligned} \mathcal{J}_{w,\alpha }(X,Y)=(1-2\alpha )^2 \mathcal{J}_w(X,Y). \end{aligned}$$
(19)

Proof

From the definition of the \(\mathcal{J}_{w,\alpha}(X,Y)\) measure in (18) and upon making use of Theorem 3, we have

$$\begin{aligned} \mathcal{J}_{w,\alpha}(X,Y) &= w(1-w)\,Var\big((1-\alpha)X+\alpha Y-(\alpha X+(1-\alpha)Y)\big)\\ &= w(1-w)\,Var\big((1-2\alpha)(X-Y)\big)\\ &= (1-2\alpha)^2\,w(1-w)\,Var(X-Y)\\ &= (1-2\alpha)^2\,\mathcal{J}_w(X,Y), \end{aligned}$$

as required. \(\square \)
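Continuing the sketch above (and reusing x, y, alpha, w and jv_alpha_w), the identity (19) can be checked numerically, with \(\mathcal{J}_w(X,Y)\) computed through the representation \(w(1-w)Var(X-Y)\) used in the proof. Because the identity is algebraic in the sample variances and covariances, the two plug-in estimates agree up to rounding error.

```r
# Check of Theorem 10 with the samples above:
# J_{w,alpha}(X,Y) = (1 - 2*alpha)^2 * J_w(X,Y),
# where J_w(X,Y) = w*(1-w)*Var(X - Y) (the representation used in the proof).
jv_w <- w * (1 - w) * var(x - y)
c(lhs = jv_alpha_w, rhs = (1 - 2 * alpha)^2 * jv_w)   # agree up to rounding error
```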

Corollary 4

If X and Y are two independent random variables, then from the inequality in (5) and Theorem 10, we find that

$$\begin{aligned} \mathcal{J}_{w,\alpha}(X,Y) \le \mathcal{J}_w(X,Y) \le \frac{Var(X)+Var(Y)}{4}. \end{aligned}$$
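Since the samples x and y in the sketch above were generated independently, Corollary 4 can also be checked directly; this is only a sanity check under the same illustrative choices.

```r
# Sanity check of Corollary 4 for the (independently generated) samples above:
# J_{w,alpha}(X,Y) <= J_w(X,Y) <= (Var(X) + Var(Y)) / 4.
c(jv_alpha_w = jv_alpha_w,
  jv_w       = jv_w,
  bound      = (var(x) + var(y)) / 4)   # expected to be non-decreasing
```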

Appendix B: Connection between Jensen-variance distance and Fano factor measure

Let X be a random variable with finite mean E[X] and variance Var(X). Then, the ratio

$$\begin{aligned} \mathcal{F}(X)=\frac{Var(X)}{E[X]} \end{aligned}$$

is known as the Fano factor (Feller and Morse 1958; Cover and Thomas 2006). The Fano factor is a normalized measure that quantifies the dispersion of event occurrences within a given time window; it is often used to characterize the variability or clustering behavior of events, particularly when events occur randomly or intermittently.
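As a small illustration of ours (not from the paper), consider Poisson counts: the Fano factor of a Poisson random variable equals 1, since its variance equals its mean, so an empirical estimate should be close to that value.

```r
# Empirical Fano factor of simulated Poisson counts; the theoretical value is 1
# because a Poisson variable has variance equal to its mean. lambda is arbitrary.
set.seed(2)
counts <- rpois(1e5, lambda = 4)
fano   <- var(counts) / mean(counts)
fano   # close to 1
```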

Lemma 2

The connection between the JV distance and the Fano factor is given by

$$\begin{aligned} \mathcal{F}(X)=\mathcal{J}_\alpha \bigg (\frac{X}{\sqrt{\alpha (1-\alpha )E[X]}},W\bigg ), \end{aligned}$$

where W is a degenerate random variable at a point c, so that \(E[W]=c\) and \(Var(W)=0\).

Proof

From the definition of the JV distance, making use of Theorem 3 and the fact that W is degenerate (so that \(Var(W)=0\)), we have

$$\begin{aligned} \mathcal{J}_\alpha\bigg(\frac{X}{\sqrt{\alpha(1-\alpha)E[X]}},W\bigg) &= \alpha(1-\alpha)\,Var\bigg(\frac{X}{\sqrt{\alpha(1-\alpha)E[X]}}-W\bigg)\\ &= \alpha(1-\alpha)\,\frac{Var(X)}{\alpha(1-\alpha)E[X]}\\ &= \frac{Var(X)}{E[X]}\\ &= \mathcal{F}(X), \end{aligned}$$

as required. \(\square \)
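The identity in Lemma 2 can also be checked with plug-in estimates. The sketch below uses the Poisson counts from the previous example, a degenerate W placed at an arbitrary constant, and a small helper jv_a() of ours implementing the plug-in JV distance.

```r
# Plug-in check of Lemma 2: with W degenerate at a constant, the JV distance between
# the rescaled variable and W recovers the Fano factor of the Poisson counts above.
# jv_a(), the weight a0 and the constant 7 are illustrative choices of ours.
jv_a <- function(u, v, a) a * var(u) + (1 - a) * var(v) - var(a * u + (1 - a) * v)

a0    <- 0.3                                  # any alpha in (0, 1) works
w_deg <- rep(7, length(counts))               # "sample" from W, degenerate at c = 7
x_scl <- counts / sqrt(a0 * (1 - a0) * mean(counts))
c(fano = fano, jv = jv_a(x_scl, w_deg, a0))   # the two estimates coincide
```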

Theorem 11

Let X and Y be two random variables with non-zero means. Then, a connection between the JV measure and the Fano factor is given by

$$\begin{aligned} \frac{\mathcal{J}_\alpha (X,Y)}{\mu _\alpha }=\Lambda \mathcal{F}(X)+(1-\Lambda )\mathcal{F}(Y)-\mathcal{F}\big (\alpha X+(1-\alpha )Y\big ), \end{aligned}$$
(20)

where \(\Lambda =\frac{\alpha E[X]}{\mu _\alpha }\) and \(\mu _\alpha =\alpha E[X]+(1-\alpha )E[Y].\)

Proof

From the definition of \({\mathcal{J}_\alpha (X,Y)}\) and considering \(\Lambda =\frac{\alpha E[X]}{\mu _\alpha }\) and \(\mu _\alpha =\alpha E[X]+(1-\alpha )E[Y]\), we have

$$\begin{aligned} \mathcal{J}_\alpha(X,Y) &= \alpha\,Var(X)+(1-\alpha)\,Var(Y)-Var\big(\alpha X+(1-\alpha)Y\big)\\ &= \alpha E[X]\frac{Var(X)}{E[X]}+(1-\alpha)E[Y]\frac{Var(Y)}{E[Y]}-\mu_\alpha\frac{Var\big(\alpha X+(1-\alpha)Y\big)}{\mu_\alpha}\\ &= \mu_\alpha\bigg\{\Lambda\frac{Var(X)}{E[X]}+(1-\Lambda)\frac{Var(Y)}{E[Y]}-\frac{Var\big(\alpha X+(1-\alpha)Y\big)}{\mu_\alpha}\bigg\}\\ &= \mu_\alpha\bigg\{\Lambda\,\mathcal{F}(X)+(1-\Lambda)\,\mathcal{F}(Y)-\mathcal{F}\big(\alpha X+(1-\alpha)Y\big)\bigg\}, \end{aligned}$$

as required. \(\square \)

From Theorem 11 and the non-negativity of \(\mathcal{J}_\alpha(X,Y)\), it is clear that

$$\begin{aligned} \mathcal{F}\big (\alpha X+(1-\alpha )Y\big )\le \Lambda \mathcal{F}(X)+(1-\Lambda )\mathcal{F}(Y). \end{aligned}$$
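To close, the decomposition (20) and the inequality above can be illustrated numerically. The gamma samples below are illustrative choices with positive means, so that every Fano factor is well defined; fano_f() is a helper of ours, and jv_a() is reused from the previous sketch.

```r
# Plug-in check of Theorem 11 (eq. 20); jv_a() is reused from the previous sketch and
# fano_f() is a small helper of ours. The gamma samples are illustrative only.
fano_f <- function(u) var(u) / mean(u)

set.seed(3)
x2 <- rgamma(1e5, shape = 2, rate = 1)    # E[X] = 2,   Var(X) = 2
y2 <- rgamma(1e5, shape = 5, rate = 2)    # E[Y] = 2.5, Var(Y) = 1.25
a  <- 0.4

mu_a   <- a * mean(x2) + (1 - a) * mean(y2)
lambda <- a * mean(x2) / mu_a

lhs <- jv_a(x2, y2, a) / mu_a
rhs <- lambda * fano_f(x2) + (1 - lambda) * fano_f(y2) - fano_f(a * x2 + (1 - a) * y2)
c(lhs = lhs, rhs = rhs)   # identical up to rounding; both are non-negative,
                          # which illustrates the Fano-factor inequality above
```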

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Kharazmi, O., Contreras-Reyes, J.E. & Basirpour, M.B. Jensen-variance distance measure: a unified framework for statistical and information measures. Comp. Appl. Math. 43, 144 (2024). https://doi.org/10.1007/s40314-024-02666-x

