Nonsymmetric entropy and maximum nonsymmetric entropy principle
Introduction
Entropy (Boltzmann’s entropy) is the most important concept in statistical physics. Shannon’s entropy, which measures the degree of uncertainty of information, is likewise the most basic concept in information theory. The concept of entropy has been studied and applied extensively and deeply (see, for example, [1], [2], [3], [4], [5], [6], [7], [8]); for instance, entropy has important applications in El Naschie’s E-infinity theory and in superstring theory [4], [5], [6], [7]. At the same time, the concept of entropy has been generalized to various forms. For example, in 1988, Tsallis [8] proposed a generalized entropy to deal with nonextensive statistics, and Tsallis’s entropy has since been applied to other fields (see, for example, [9], [10]). In addition, there are the Rényi entropy, the r-order entropy, and so on. These generalized entropies have found applications in various special cases.
On the other hand, in 1957, Jaynes [11] proposed the maximum-entropy principle for setting up probability distributions on the basis of partial knowledge. But there are some systems for which Boltzmann’s entropy (or Shannon’s entropy) and other entropies cannot be used to obtain the distributions through the corresponding maximum entropy principle. For example, consider a system in which the two sides of a coin come down with different probabilities, denoted $p_1$ and $p_2$. If we assume $p_1 > p_2$, then we find that the maximum of the Shannon entropy $S = -p_1\ln p_1 - p_2\ln p_2$ cannot be reached. In fact, using $p_2 = 1 - p_1$, we have $\frac{\mathrm{d}S}{\mathrm{d}p_1} = \ln\frac{1-p_1}{p_1} < 0$ for $p_1 > \frac{1}{2}$, so $S$ has no interior maximum on the region $p_1 > p_2$; the unconstrained maximum sits at $p_1 = p_2 = \frac{1}{2}$, which contradicts the assumption. Thus, for the Shannon entropy (or Boltzmann’s entropy), we cannot obtain the needed distribution from the maximum entropy principle, and it is necessary to introduce another entropy to deal with this kind of system. Recently, I proposed a new entropy, named nonsymmetric entropy, from which Zipf’s law is derived naturally by maximization [12]. My key idea is to regard every event as carrying a kind of auxiliary information, so that the total information of the $j$th event is $I_j = -\ln p_j - \ln\beta_j$, where $\beta_j$ is an auxiliary parameter of the $j$th event and $-\ln\beta_j$ is its auxiliary information. We define the nonsymmetric entropy as the average total information over the events. Maximizing the nonsymmetric entropy leads to Zipf’s law when $\beta_j = j^{\alpha}$. Of course, we can take other values of $\beta_j$ to obtain other distribution laws, indeed any distribution law. I hope that the concept of nonsymmetric entropy has potential value for the study of complexity; in fact, for the time being, I do not know how to develop this concept and its further implications.
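To make this concrete, here is a sketch of the maximization, assuming the reconstruction above of the total information as $I_j = -\ln p_j - \ln\beta_j$, so that the nonsymmetric entropy reads $S = \sum_j p_j I_j = -\sum_j p_j \ln(\beta_j p_j)$; the Lagrange-multiplier computation itself is standard:

```latex
% Sketch: maximize S = -\sum_j p_j \ln(\beta_j p_j) subject to \sum_j p_j = 1.
\begin{align*}
  L &= -\sum_{j=1}^{m} p_j \ln(\beta_j p_j)
       + \lambda\Big(\sum_{j=1}^{m} p_j - 1\Big),\\
  0 = \frac{\partial L}{\partial p_j}
    &= -\ln(\beta_j p_j) - 1 + \lambda
    \;\Longrightarrow\; \beta_j p_j = e^{\lambda-1}
    \ \text{(the same constant for every $j$)},\\
  p_j &= \frac{C}{\beta_j}, \qquad
  C = \Big(\sum_{k=1}^{m}\beta_k^{-1}\Big)^{-1};
  \quad \beta_j = j^{\alpha}
  \;\Longrightarrow\;
  p_j = \frac{j^{-\alpha}}{\sum_{k=1}^{m} k^{-\alpha}}
  \quad \text{(Zipf's law)}.
\end{align*}
```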
In the present paper, I discuss the nonsymmetric entropy further and obtain the maximum nonsymmetric entropy principle. As an application, we derive the power law from the maximum nonsymmetric entropy principle.
This paper is organized as follows. In Section 2, we give all definitions and theorems, together with some discussion. In Section 3, we give some applications. The last section is the conclusion.
Maximum nonsymmetric entropy principle
In Ref. [12], I defined the concept of nonsymmetric entropy in a linguistic model. Here I give a general definition in a statistical model. We first give some hypotheses: the set of states is X = {x1, … , xm}, and $p_i$ is the probability of $x_i$.

Definition 1. The auxiliary information of a state $x_j$ is defined by $A_j = -\ln\beta_j$, where $\beta_j$ is the auxiliary information parameter. Correspondingly, we define the total information of the state $x_j$ by $I_j = -\ln p_j - \ln\beta_j$.

Definition 2. We define the nonsymmetric entropy as the average total information of the states, $S(p_1,\ldots,p_m) = \sum_{j=1}^{m} p_j I_j = -\sum_{j=1}^{m} p_j \ln(\beta_j p_j)$.
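As an illustration, the following minimal numerical sketch (function and parameter names are my own; the entropy form is the reconstruction $S = -\sum_j p_j\ln(\beta_j p_j)$ from Definitions 1 and 2) checks that the distribution $p_j \propto 1/\beta_j$ indeed maximizes the nonsymmetric entropy:

```python
import numpy as np

def nonsymmetric_entropy(p, beta):
    # Average total information: S = -sum_j p_j * ln(beta_j * p_j),
    # i.e. Shannon information -ln p_j plus auxiliary information -ln beta_j.
    p = np.asarray(p, dtype=float)
    beta = np.asarray(beta, dtype=float)
    return -np.sum(p * np.log(beta * p))

m, alpha = 10, 1.5                        # illustrative sizes (my choice)
beta = np.arange(1, m + 1) ** alpha       # auxiliary parameters beta_j = j^alpha

# Maximizer predicted by the theory: p_j proportional to 1/beta_j = j^(-alpha).
p_star = (1.0 / beta) / np.sum(1.0 / beta)

# Sanity check: no random distribution on the simplex beats p_star
# (S is strictly concave, so p_star is the unique global maximizer).
rng = np.random.default_rng(0)
S_star = nonsymmetric_entropy(p_star, beta)
assert all(nonsymmetric_entropy(rng.dirichlet(np.ones(m)), beta) <= S_star
           for _ in range(10_000))
print(p_star)                             # Zipf's law: p_j ~ j^(-alpha)
```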
Application
If $\int_X \frac{\mathrm{d}x}{\beta(x)} < +\infty$, we have $p(x) = \frac{C}{\beta(x)}$, which is the maximal nonsymmetric entropy distribution, where $C = \big(\int_X \frac{\mathrm{d}x}{\beta(x)}\big)^{-1}$.
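This maximizer follows from the same Lagrange-multiplier argument as in the discrete case, assuming the continuous nonsymmetric entropy takes the form $S[p] = -\int_X p(x)\ln(\beta(x)p(x))\,\mathrm{d}x$; a sketch:

```latex
% Sketch: stationarity of the constrained functional gives p = C / beta.
\begin{align*}
  \delta\!\left[-\int_X p\,\ln(\beta p)\,\mathrm{d}x
      + \lambda\Big(\int_X p\,\mathrm{d}x - 1\Big)\right] = 0
  &\;\Longrightarrow\; -\ln\big(\beta(x)\,p(x)\big) - 1 + \lambda = 0,\\
  &\;\Longrightarrow\; \beta(x)\,p(x) = e^{\lambda-1} \equiv C
  \;\Longrightarrow\; p(x) = \frac{C}{\beta(x)}.
\end{align*}
```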
If we take $\beta(x) = x^{\alpha}$, $\alpha > 1$, and assume the range of the random variable X is $(k, +\infty)$, then the maximum nonsymmetric entropy distribution is $p(x) = (\alpha-1)\,k^{\alpha-1}\,x^{-\alpha}$, which is just the power law distribution in the continuous case. If there are additional constraints, we will obtain other distributions, similar to those in Theorem 3 and Theorem 4.
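For completeness, the normalization constant here follows from a routine integral:

```latex
% Normalizing p(x) = C / beta(x) on (k, +infty) with beta(x) = x^alpha, alpha > 1:
\[
  1 = \int_{k}^{+\infty} \frac{C}{x^{\alpha}}\,\mathrm{d}x
    = \frac{C\,k^{1-\alpha}}{\alpha - 1}
  \;\Longrightarrow\; C = (\alpha - 1)\,k^{\alpha - 1},
  \qquad
  p(x) = (\alpha - 1)\,k^{\alpha - 1}\,x^{-\alpha}.
\]
```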
The power law, or Zipf’s law, is an important non-exponential distribution law that arises widely in complex systems.
Conclusion
The above results suggest that the nonsymmetric entropy may be a useful concept that will play a suitable role in complexity science. Of course, the meaning of the nonsymmetric entropy still needs a further reasonable explanation. I hope the concept of nonsymmetric entropy can stimulate more investigations in the future.
References

- et al. Complexity principle of extremality in evolution of living organisms by information-theoretic entropy. Chaos, Solitons & Fractals (2002).
- et al. Intermittency and scale-free networks: a dynamical model for human language complexity. Chaos, Solitons & Fractals (2004).
- Entropy principle of extremality as a driving force in the discrete dynamics of complex and living systems. Chaos, Solitons & Fractals (2000).
- Dimensions and Cantor spectra. Chaos, Solitons & Fractals (1994).
- On the topological ground state of E-infinity spacetime and the super string connection. Chaos, Solitons & Fractals (2007).
- Superstrings, entropy and the elementary particles content of the standard model. Chaos, Solitons & Fractals (2006).
- Internal Cantor distance and entropy of multidimensional Peano–Hilbert spaces. Chaos, Solitons & Fractals (1993).
- Entropic nonextensivity: a possible measure of complexity. Chaos, Solitons & Fractals (2002).
- et al. Tsallis statistics and turbulence. Chaos, Solitons & Fractals (2002).
- et al. Statistical mechanical foundations for systems with nonexponential distributions. Chaos, Solitons & Fractals (2002).
- Tsallis C. Possible generalization of Boltzmann–Gibbs statistics. J Stat Phys (1988).