Rényi entropy rate for Gaussian processes
Introduction
The Shannon entropy was first axiomatized by Shannon [30]; some of these axioms were later modified by Khinchin [21] and others [13]. The Rényi entropy was axiomatized by Rényi [28] and modified by several authors [13]. Recently, Jizba and Arimitsu [20] used the existing axioms for the Shannon and Rényi entropies to introduce a new set of axioms from which both entropies can be derived.
Based on the axioms used to introduce the Shannon entropy, other types of entropy, such as the conditional Shannon entropy, were introduced. The definition of the conditional Rényi entropy, however, was not based on the axioms underlying the Rényi entropy. Building on the definition of the conditional Shannon entropy, some authors, such as Cachin [5], proposed a definition of the conditional Rényi entropy. But the axioms introduced by Jizba and Arimitsu [20] make it possible to give another definition of the conditional Rényi entropy that is more suitable than the one given by Cachin [12].
After Shannon proved that the entropy rate exists for stationary stochastic processes [30], entropy rates were defined for general stochastic processes. The Shannon entropy rate has been extensively studied, especially for stationary processes in discrete or continuous time (see [10] and the references therein). For example, the Shannon entropy rate of a stationary Gaussian process was obtained by Kolmogorov [23].
The Rényi entropy rate for stochastic processes was obtained by Rached et al. [26], who also gave an operational characterization of the Rényi entropy rate in coding theory. Chen and Alajaji [6] obtained another application of the Rényi entropy rate by deriving a bound on the error probability of information transmission for source coding based on discrete-time processes with a finite state space. Shimokawa [31] later extended this result to a countable state space. The Rényi entropy and the Rényi entropy rate have applications in statistics and related fields [1], [14], [15], [19], [33], biomedical engineering [18], statistical mechanics [8], [22], economics [3], stochastic processes [12], [17], and other areas [4], [9], [24], [25], [27], [29].
Among a family of stochastic processes, choosing the process with maximum entropy is equivalent to adding as little information as possible to the problem under consideration. Maximum entropy methods are extensively used in the study of stochastic processes (see, for instance, [11]). It is therefore necessary to obtain the entropy rate of such processes. The maximization of the Shannon entropy for stationary Gaussian processes has already been studied. In this paper, we obtain the Rényi entropy rate for this class of processes, which enables us to handle Gaussian processes through maximization of the Rényi entropy.
This paper is organized as follows. In Section 2, we introduce a suitable definition of the conditional Rényi entropy. Then, we use this definition to show that the chain rule holds for the Rényi entropy. In Section 3, we obtain the Rényi entropy rate for stationary Gaussian processes. Finally, in Section 4, we show that for stationary Gaussian processes the Rényi entropy rate is bounded by the Shannon entropy rate, and that the Rényi entropy rate reduces to the Shannon entropy rate as α → 1.
Conditional Rényi entropy
Let X be a random variable having an absolutely continuous distribution with density function f. The Rényi entropy of order α (α > 0, α ≠ 1) is defined as

H_α(X) = (1/(1 − α)) log ∫ f^α(x) dx,

and lim_{α→1} H_α(X) = H(X) = −∫ f(x) log f(x) dx is the Shannon entropy, provided both integrals exist.

Remark 2.1

For the normal distribution N(μ, σ²), the Rényi entropy is

H_α(X) = (1/2) log(2πσ²) + log α / (2(α − 1)).

Now, we review some properties of the Rényi entropy: the Rényi entropy can be negative, and the Rényi entropy is invariant under a location transformation of the random variable.
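As a numerical check of the definition, the following sketch (in Python, assuming NumPy/SciPy; the function names are ours) evaluates ∫ f^α(x) dx for a normal density by quadrature and compares it with the closed form for the normal distribution, H_α = (1/2) log(2πσ²) + log α / (2(α − 1)); it also verifies that the Rényi entropy approaches the Shannon entropy as α → 1:

```python
import numpy as np
from scipy.integrate import quad

def renyi_entropy_numeric(alpha, sigma2):
    """Rényi entropy of order alpha for N(0, sigma2), by numerical integration."""
    f = lambda x: np.exp(-x**2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
    integral, _ = quad(lambda x: f(x) ** alpha, -np.inf, np.inf)
    return np.log(integral) / (1 - alpha)

def renyi_entropy_normal(alpha, sigma2):
    """Closed form for N(0, sigma2): (1/2)log(2*pi*sigma2) + log(alpha)/(2*(alpha-1))."""
    return 0.5 * np.log(2 * np.pi * sigma2) + np.log(alpha) / (2 * (alpha - 1))

sigma2 = 2.0
shannon = 0.5 * np.log(2 * np.pi * np.e * sigma2)  # Shannon entropy of N(0, sigma2)

# quadrature agrees with the closed form for several orders
for alpha in (0.5, 2.0, 3.0):
    assert abs(renyi_entropy_numeric(alpha, sigma2) - renyi_entropy_normal(alpha, sigma2)) < 1e-7

# as alpha -> 1 the Rényi entropy approaches the Shannon entropy
assert abs(renyi_entropy_normal(1.0001, sigma2) - shannon) < 1e-4
```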
The rate of entropy for Gaussian processes
Let X = {X_n, n ≥ 1} be a discrete-time process, where each X_n is a real random variable defined on a probability space (Ω, F, P). The rate of Rényi entropy is defined as the limit of the entropy of (X_1, …, X_n), divided by n, when the limit exists, namely

H̄_α(X) = lim_{n→∞} (1/n) H_α(X_1, …, X_n).

This is the standard definition of an entropy rate. By using the chain rule, we can obtain another expression for the Rényi entropy rate, based on the limit of the conditional Rényi entropy.
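The definition above can be illustrated numerically. The sketch below (in Python; the AR(1) example and function names are our own choices, not the paper's) uses the closed-form Rényi entropy of an n-dimensional Gaussian vector, H_α = (1/2) log((2π)^n det Σ) + n log α / (2(α − 1)), and checks that H_α(X_1, …, X_n)/n converges as n grows; for an AR(1) process the limiting value (1/2) log(2πσ_ε²) + log α / (2(α − 1)) follows from Szegő's theorem applied to the log-determinant of the Toeplitz covariance:

```python
import numpy as np
from scipy.linalg import toeplitz

def renyi_entropy_gaussian_vec(alpha, cov):
    """Closed-form Rényi entropy of order alpha for N(0, cov) on R^n:
    (1/2)*log((2*pi)^n * det(cov)) + n*log(alpha)/(2*(alpha-1))."""
    n = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (n * np.log(2 * np.pi) + logdet) + n * np.log(alpha) / (2 * (alpha - 1))

# AR(1): X_t = phi*X_{t-1} + eps_t with innovation variance sig2_eps;
# its autocovariance is gamma(k) = sig2_eps * phi**|k| / (1 - phi**2)
phi, sig2_eps = 0.5, 1.0
gamma0 = sig2_eps / (1 - phi**2)

alpha = 2.0
limit = 0.5 * np.log(2 * np.pi * sig2_eps) + np.log(alpha) / (2 * (alpha - 1))

errors = []
for n in (50, 200, 800):
    cov = toeplitz(gamma0 * phi ** np.arange(n))  # Toeplitz covariance matrix
    rate = renyi_entropy_gaussian_vec(alpha, cov) / n
    errors.append(abs(rate - limit))

assert errors[0] > errors[1] > errors[2]  # per-coordinate entropy converges to the rate
assert errors[-1] < 1e-3
```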
Bounds for the Rényi entropy rate
In this section, we show that the Rényi entropy rate is bounded by the Shannon entropy rate and that the Rényi entropy rate reduces to the Shannon entropy rate as α → 1.
From Property 4, we have the following inequalities:

H_α(X) ≥ H(X) for 0 < α < 1,  (4.1)

H_α(X) ≤ H(X) for α > 1,  (4.2)

where H(X) is the Shannon entropy.
Now, by using (4.1), (4.2), we obtain the bounds for the Rényi entropy rate of a stationary Gaussian process.
For a random vector (X_1, …, X_n), the inequality (4.1) becomes

H_α(X_1, …, X_n) ≥ H(X_1, …, X_n) for 0 < α < 1,

and (4.2) becomes

H_α(X_1, …, X_n) ≤ H(X_1, …, X_n) for α > 1.
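These inequalities can be verified numerically in the Gaussian case. The sketch below (in Python; the helper name is ours) uses the closed-form Rényi entropy of N(0, σ²), namely (1/2) log(2πσ²) + log α / (2(α − 1)), and checks that it dominates the Shannon entropy for 0 < α < 1, is dominated by it for α > 1, and is non-increasing in α:

```python
import numpy as np

def renyi_normal(alpha, sigma2):
    """Rényi entropy of N(0, sigma2); returns the Shannon entropy at alpha == 1."""
    if alpha == 1.0:
        return 0.5 * np.log(2 * np.pi * np.e * sigma2)
    return 0.5 * np.log(2 * np.pi * sigma2) + np.log(alpha) / (2 * (alpha - 1))

sigma2 = 1.5
shannon = renyi_normal(1.0, sigma2)

# (4.1): for 0 < alpha < 1 the Rényi entropy dominates the Shannon entropy
assert all(renyi_normal(a, sigma2) >= shannon for a in (0.1, 0.5, 0.9))
# (4.2): for alpha > 1 it is dominated by the Shannon entropy
assert all(renyi_normal(a, sigma2) <= shannon for a in (1.1, 2.0, 10.0))
# Property 4: the Rényi entropy is non-increasing in alpha
alphas = (0.1, 0.5, 0.9, 1.0, 1.1, 2.0, 10.0)
vals = [renyi_normal(a, sigma2) for a in alphas]
assert all(v1 >= v2 for v1, v2 in zip(vals, vals[1:]))
```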
Conclusion
In this paper, we introduced a definition of the conditional Rényi entropy and demonstrated that the chain rule holds for this definition. We then used the chain rule to derive another relation for the Rényi entropy rate. Using this relation and the properties of the Rényi entropy, we obtained the Rényi entropy rate of stationary Gaussian processes and showed that this rate depends on the spectral density function of the process. Furthermore, we showed that the Shannon entropy rate is a bound for the Rényi entropy rate.
References (33)

- On the geometry of generalized Gaussian distributions, Journal of Multivariate Analysis (2009)
- et al., Topics in Stochastic Processes (1975)
- et al., Long memory and volatility clustering: is the empirical evidence consistent across stock markets?, Physica A (2008)
- On some entropy functionals derived from Rényi information divergence, Information Sciences (2008)
- C. Cachin, Entropy measures and unconditional security in cryptography, Ph.D. Thesis, Swiss Federal Institute of...
- et al., Csiszár's cutoff rates for arbitrary discrete sources, IEEE Transactions on Information Theory (2001)
- et al., The Elements of Information Theory (1991)
- et al., On Rényi information for ergodic diffusion processes, Information Sciences (2009)
- et al., Robust coding for a class of sources: applications in control and reliable communication over limited capacity channels, Systems & Control Letters (2008)
- et al., Entropy rate and maximum entropy methods for countable semi-Markov chains, Communications in Statistics: Theory and Methods (2004)
- Entropy maximization for Markov and semi-Markov processes, Methodology and Computing in Applied Probability
- Some properties of Rényi entropy and Rényi entropy rate, Information Sciences
- Information Theory with Applications
- Streaming algorithms for estimating entropy, IEEE Information Theory Workshop
- Information Theory for Continuous Systems