Rényi Entropy and Rényi Divergence in Product MV-Algebras

This article deals with new concepts in a product MV-algebra, namely, with the concepts of Rényi entropy and Rényi divergence. We define the Rényi entropy of order q of a partition in a product MV-algebra and its conditional version and we study their properties. It is shown that the proposed concepts are consistent, in the case of the limit of q going to 1, with the Shannon entropy of partitions in a product MV-algebra defined and studied by Petrovičová (Soft Comput. 2000, 4, 41–44). Moreover, we introduce and study the notion of Rényi divergence in a product MV-algebra. It is proven that the Kullback–Leibler divergence of states on a given product MV-algebra introduced by Markechová and Riečan in (Entropy 2017, 19, 267) can be obtained as the limit of their Rényi divergence. In addition, the relationship between the Rényi entropy and the Rényi divergence as well as the relationship between the Rényi divergence and Kullback–Leibler divergence in a product MV-algebra are examined.


Introduction
The Shannon entropy [1] and the Kullback-Leibler divergence [2] are the most significant and most widely used quantities in information theory [3]. Due to their successful use, many attempts have been made to generalize them. Among their most important generalizations are the Rényi entropy and the Rényi divergence [4], respectively. These quantities have many significant applications, for example in statistics, in ecology, and in quantum information.
Shannon's entropy is defined in the context of a probabilistic model in the following way: if we consider a probability space (Ω, S, P) and a measurable partition E = {E_1, E_2, . . . , E_n} of (Ω, S, P), then the Shannon entropy of E is defined as the number H(E) = −∑_{i=1}^n P(E_i) · log P(E_i) (with the usual convention that P(E_i) · log P(E_i) = 0 for P(E_i) = 0). If E = {E_1, E_2, . . . , E_n} and F = {F_1, F_2, . . . , F_m} are two measurable partitions of (Ω, S, P), then the conditional Shannon entropy of E assuming a realization of F is defined as the number H(E/F) = −∑_{i=1}^n ∑_{j=1}^m P(E_i ∩ F_j) · log (P(E_i ∩ F_j)/P(F_j)) (with the usual convention that 0 · log (0/x) = 0 if x ≥ 0). If E = {E_1, E_2, . . . , E_n} is a measurable partition of (Ω, S, P) with probabilities p_i = P(E_i), i = 1, 2, . . . , n, then its Rényi entropy of order q, where q ∈ (0, 1) ∪ (1, ∞), is defined as the number H_q(E) = (1/(1−q)) · log ∑_{i=1}^n p_i^q. It can be shown that lim_{q→1} H_q(E) = ∑_{i=1}^n p_i · log (1/p_i); thus the Shannon entropy is the limiting case of the Rényi entropy for q → 1. There is no universally accepted definition of conditional Rényi entropy. The paper [5] describes three definitions of conditional Rényi entropy that can be found in the literature. In [6], one can also find a brief overview of various approaches to defining the conditional Rényi entropy; in addition, a new definition of conditional Rényi entropy was proposed there. In [7], the authors introduced a general type of conditional Rényi entropy which contains some of the previously defined conditional Rényi entropies as special cases. The proposed concepts have successfully been used in information theory [8], time series analysis [9], and cryptographic applications [10]. However, none of the proposed generalizations satisfies all the basic properties of the Shannon conditional entropy. The choice of definition therefore depends on the intended application.
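The classical quantities just recalled are easy to experiment with numerically. The following Python sketch (purely illustrative; the function names are ours) computes the Shannon entropy and the Rényi entropy of a probability vector and shows that H_q approaches H as q → 1:

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i * log2(p_i), with the convention 0 * log 0 = 0
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, q):
    # H_q(p) = (1/(1-q)) * log2 sum_i p_i^q, for q in (0,1) or (1,inf)
    assert q > 0 and q != 1
    return math.log2(sum(pi ** q for pi in p if pi > 0)) / (1 - q)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))        # 1.75 bits
print(renyi_entropy(p, 0.999))   # close to 1.75
print(renyi_entropy(p, 1.001))   # close to 1.75
```

For the uniform distribution on n outcomes, both quantities equal log n for every admissible order q.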
The present article is devoted to the study of Rényi entropy and Rényi divergence in product MV-algebras. An MV-algebra [11] is the most useful instrument for describing multivalued processes, especially after Mundici's characterization of it as an interval in a lattice-ordered group (cf. [12,13]). At present, this algebraic structure is studied by many researchers, and it is natural that there are also results regarding entropy in this structure; we refer, for instance, to [14,15]. In addition, a measure theory (cf. [16]) and a probability theory (cf. [17]) have been developed on MV-algebras. Of course, in some problems of probability it is necessary to introduce a product on an MV-algebra, an operation outside the corresponding group addition. The operation of a product on an MV-algebra was suggested independently in [18] from the viewpoint of mathematical logic and in [19] from the viewpoint of probability. The approach from the viewpoint of general algebra suggested in [20] also seems interesting. We note that the notion of a product MV-algebra generalizes some classes of fuzzy sets; a full tribe of fuzzy sets (see, e.g., [21]) is an example of a product MV-algebra.
A suitable entropy theory of Shannon and Kolmogorov-Sinai type for the case of a product MV-algebra has been provided by Petrovičová in [22,23]. We remark that in our article [24], based on the results of Petrovičová, we proposed the notions of Kullback-Leibler divergence and mutual information of partitions in a product MV-algebra. In the present article, we continue studying entropy and divergence in a product MV-algebra, by defining and studying the Rényi entropy and the Rényi divergence.
The rest of the paper is structured as follows. In the following section, preliminaries and related works are given. Our main results are presented in Sections 3-5. In Section 3, we define the Rényi entropy of a partition in a product MV-algebra and examine its properties. It is shown that for q → 1 the Rényi entropy of order q converges to the Shannon entropy of a partition in a product MV-algebra introduced in [22]. In Section 4, we introduce the concept of conditional Rényi entropy of partitions in a product MV-algebra and study its properties. It is shown that the proposed definition of the conditional Rényi entropy is consistent, in the limit of q going to 1, with the conditional Shannon entropy of partitions studied in [22], and that it satisfies the property of monotonicity and a weak chain rule. In the final part of this section, we define the Rényi information about a partition X in a partition Y as an example of further use of the proposed concept of conditional Rényi entropy. Section 5 is devoted to the study of Rényi divergence in a product MV-algebra. It is shown that the Kullback-Leibler divergence in a product MV-algebra introduced by the authors in [24] can be obtained as the limit of the Rényi divergence. We illustrate the results with numerical examples. The last section contains a brief summary.

Preliminaries and Related Works
We start by recalling the definitions of basic terms and some known results that will be used in the article. We also mention some works related to the subject of this article, without claiming completeness.
Several different (but equivalent) axiom systems have been used to define the notion of MV-algebra (cf., e.g., [19,25,26]). In this paper, we use the definition of MV-algebra given by Riečan in [27], which is based on the Mundici representation theorem. Based on Mundici's theorem [12] (see also [13]), MV-algebras can be considered as intervals of an abelian lattice-ordered group (shortly, l-group). We recall that by an l-group (cf. [28]) we understand a triplet (G, +, ≤), where (G, +) is an abelian group, (G, ≤) is a partially ordered set being a lattice, and x ≤ y implies x + z ≤ y + z.

Definition 1 ([27]). An MV-algebra is an algebraic structure A = (A, ⊕, *, 0, u) satisfying the following conditions: (i) there exists an l-group (G, +, ≤) such that A = [0, u] = {x ∈ G; 0 ≤ x ≤ u}, where 0 is the neutral element of (G, +) and u is a strong unit of G (i.e., u ∈ G such that u > 0 and to every x ∈ G there exists a positive integer n with the property x ≤ nu); (ii) ⊕ and * are binary operations on A satisfying the identities x ⊕ y = (x + y) ∧ u and x * y = (x + y − u) ∨ 0.

We note that MV-algebras provide a generalization of Boolean algebras in the sense that every Boolean algebra is an MV-algebra satisfying the condition x ⊕ x = x. For this reason, in order to generalize the concept of probability on Boolean algebras, Mundici introduced in [29] the notion of a state on an MV-algebra in the following way. Let A = (A, ⊕, *, 0, u) be an MV-algebra. A mapping s : A → [0, 1] is a state on A whenever s(0) = 0 and, for every x, y ∈ A, the following condition is satisfied: if x * y = 0, then s(x ⊕ y) = s(x) + s(y).
Since the definitions of product MV-algebra and of partition in a product MV-algebra are based on the Mundici representation theorem (i.e., the MV-algebra operation ⊕ is replaced by the group operation + in the abelian l-group corresponding to the considered MV-algebra), in this contribution we shall use the following (equivalent) definition of a state, which is also based on the Mundici representation theorem. This means that the sum in the following definition of a state, and subsequently in what follows, denotes the sum in the abelian l-group that corresponds to the given MV-algebra.

Definition 2 ([27]). A mapping s : A → [0, 1] is a state on an MV-algebra A = (A, ⊕, *, 0, u) if the following conditions are satisfied: (i) s(u) = 1; (ii) if x, y ∈ A such that x + y ≤ u, then s(x + y) = s(x) + s(y).

Definition 3 ([19]).
A product MV-algebra is an algebraic structure (A, ⊕, * , ·, 0, u), where (A, ⊕, * , 0, u) is an MV-algebra and · is an associative and abelian binary operation on A with the following properties: (i) for every x ∈ A, u · x = x; (ii) if x, y, z ∈ A such that x + y ≤ u, then z · x + z · y ≤ u, and z · (x + y) = z · x + z · y.
For brevity, we will write (A, ·) instead of (A, ⊕, * , ·, 0, u). A relevant probability theory for the product MV-algebras was developed by Riečan in [30], see also [31,32]; the entropy theory of Shannon and Kolmogorov-Sinai type for the product MV-algebras was proposed in [22,23]. We present the main idea and some results of these theories that will be used in the following text.
Proposition 1. If X = (x_1, x_2, . . . , x_n) and Y = (y_1, y_2, . . . , y_m) are partitions in a product MV-algebra (A, ·), then their join X ∨ Y = (x_i · y_j ; i = 1, . . . , n, j = 1, . . . , m) is a partition in (A, ·) that refines both X and Y.

Proof. The proof can be found in [33].
The following example shows that the model studied in this article generalizes the classical case.

Example 1. Let us consider a probability space (Ω, S, P) and put A = {I_E ; E ∈ S}, where I_E : Ω → {0, 1} is the indicator function of the set E ∈ S. The class A is closed with respect to the product of indicator functions, and it represents a special case of product MV-algebras. The mapping s : A → [0, 1] defined, for every I_E ∈ A, by s(I_E) = P(E), is a state on the considered product MV-algebra (A, ·). A measurable partition {E_1, E_2, . . . , E_n} of (Ω, S, P) can be viewed as a partition in the product MV-algebra (A, ·) if we consider the n-tuple (I_{E_1}, I_{E_2}, . . . , I_{E_n}) instead of {E_1, E_2, . . . , E_n}.

Example 2. Let (Ω, S, P) be a probability space, and let A be the class of all S-measurable functions f : Ω → [0, 1], the so-called full tribe of fuzzy sets (cf., e.g., [21,34]). The class A is closed with respect to the natural product of fuzzy sets, and it represents a significant case of product MV-algebras. The map s : A → [0, 1] defined, for every f ∈ A, by s(f) = ∫_Ω f dP, is a state on the product MV-algebra (A, ·). The notion of a partition in the product MV-algebra (A, ·) coincides with the notion of a fuzzy partition (cf. [34]).

Definition 4.
Let s be a state on a product MV-algebra (A, ·). We say that partitions X, Y in (A, ·) are statistically independent with respect to s, if s(x · y) = s(x) · s(y), for every x ∈ X, and y ∈ Y.
The following definition of entropy of Shannon type was introduced in [22].

Definition 5. Let X = (x_1, x_2, . . . , x_n) be a partition in a product MV-algebra (A, ·) and let s : A → [0, 1] be a state. Then the entropy of X with respect to s is defined by Shannon's formula

H_s(X) = −∑_{i=1}^n s(x_i) · log s(x_i).   (1)

If X = (x_1, x_2, . . . , x_n) and Y = (y_1, y_2, . . . , y_m) are two partitions in (A, ·), then the conditional entropy of X given y_j ∈ Y is defined by

H_s(X/y_j) = −∑_{i=1}^n (s(x_i · y_j)/s(y_j)) · log (s(x_i · y_j)/s(y_j)).   (2)

The conditional entropy of X given Y is defined by

H_s(X/Y) = ∑_{j=1}^m s(y_j) · H_s(X/y_j).   (3)

Here, the usual convention that 0 · log 0 = 0 is used. The base of the logarithm can be any positive real number, but as a rule one takes logarithms to the base 2; the entropy is then expressed in bits. The entropy and the conditional entropy of partitions in a product MV-algebra satisfy all properties that correspond to properties of Shannon's entropy of measurable partitions in the classical case; for more details, see [22].
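Because a state assigns to the elements of a partition a probability vector (cf. Example 1), Formulas (1)-(3) can be evaluated directly from state values. The following illustrative Python sketch does this; representing a pair of partitions by the matrix of joint state values s(x_i · y_j) is our own convention, not part of the paper's formalism:

```python
import math

def entropy(state_values):
    # Formula (1): H_s(X) = -sum_i s(x_i) * log2 s(x_i), with 0 * log 0 = 0
    return -sum(s * math.log2(s) for s in state_values if s > 0)

def conditional_entropy(joint):
    # joint[i][j] = s(x_i . y_j); Formulas (2) and (3) combined give
    # H_s(X/Y) = -sum_ij s(x_i . y_j) * log2( s(x_i . y_j) / s(y_j) )
    m = len(joint[0])
    s_y = [sum(row[j] for row in joint) for j in range(m)]
    h = 0.0
    for row in joint:
        for j, sij in enumerate(row):
            if sij > 0:
                h -= sij * math.log2(sij / s_y[j])
    return h

# When X and Y determine each other, conditioning removes all uncertainty
print(conditional_entropy([[0.5, 0.0], [0.0, 0.5]]))  # 0.0
```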
In [24], the concepts of mutual information and Kullback-Leibler divergence in a product MV-algebra were introduced in the following way.

Definition 6. Let X, Y be partitions in a product MV-algebra (A, ·), and let s : A → [0, 1] be a state. We define the mutual information between X and Y by the formula

I_s(X, Y) = H_s(X) − H_s(X/Y).   (4)

Definition 7. Let s, t be states defined on a given product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·). Then we define the Kullback-Leibler divergence D_X(s ∥ t) by the formula

D_X(s ∥ t) = ∑_{i=1}^n s(x_i) · log (s(x_i)/t(x_i)).   (5)

The logarithm in this formula is taken to the base 2 and information is measured in units of bits. We use the conventions that x · log (x/0) = ∞ if x > 0, and 0 · log (0/x) = 0 if x ≥ 0. In the proofs, we shall use the Jensen inequality, which states that for a real concave function ϕ, real numbers a_1, a_2, . . . , a_n in its domain, and nonnegative real numbers c_1, c_2, . . . , c_n such that ∑_{i=1}^n c_i = 1, it holds

ϕ(∑_{i=1}^n c_i a_i) ≥ ∑_{i=1}^n c_i · ϕ(a_i),   (6)

and the inequality is reversed if ϕ is a real convex function. The equality holds if and only if a_1 = a_2 = . . . = a_n or ϕ is linear. Further, we recall the following notions.
Definition 9. Let f : D → ℝ be a real function defined on a non-empty finite set D. The q-norm, for 1 ≤ q < ∞, or the q-quasinorm, for 0 < q < 1, of f is defined as

∥f∥_q = (∑_{x∈D} |f(x)|^q)^{1/q}.

The Rényi Entropy of a Partition in a Product MV-Algebra
In this section, we define the Rényi entropy of a partition in a product MV-algebra (A, ·) and examine its properties. In the following, we assume that s : A → [0, 1] is a state.

Definition 10. Let X = (x_1, x_2, . . . , x_n) be a partition in a product MV-algebra (A, ·). Then we define the Rényi entropy of order q, where q ∈ (0, 1) ∪ (1, ∞), of the partition X with respect to s as the number

H_q^s(X) = (1/(1−q)) · log ∑_{i=1}^n s(x_i)^q.   (7)

Remark 1. In accordance with the classical theory, the log is to the base 2 and the Rényi entropy is expressed in bits. Let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·). If we consider the function s_X : X → [0, 1] defined, for every x_i ∈ X, by s_X(x_i) = s(x_i), then we have

∥s_X∥_q = (∑_{i=1}^n s(x_i)^q)^{1/q},

and the Formula (7) can be expressed in the following equivalent form:

H_q^s(X) = (q/(1−q)) · log ∥s_X∥_q.

Example 3. Let X = (x_1, x_2, . . . , x_n) be any partition in a product MV-algebra (A, ·). Let s : A → [0, 1] be a state uniform over X, i.e., s(x_i) = 1/n for i = 1, 2, . . . , n. Then

H_q^s(X) = (1/(1−q)) · log ∑_{i=1}^n (1/n)^q = (1/(1−q)) · log n^{1−q} = log n.

Example 4. Let us consider any product MV-algebra (A, ·) and the partition E = (u) in (A, ·), which represents an experiment resulting in a certain event. It is easy to see that H_q^s(E) = (1/(1−q)) · log s(u)^q = 0.
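The equivalence of the defining Formula (7) and its q-norm form can be checked numerically. A Python sketch (the function names are ours; the state values of a partition are passed as a list):

```python
import math

def renyi_entropy(sx, q):
    # Formula (7): H_q(X) = (1/(1-q)) * log2 sum_i s(x_i)^q
    return math.log2(sum(s ** q for s in sx if s > 0)) / (1 - q)

def renyi_entropy_norm_form(sx, q):
    # Equivalent form: H_q(X) = (q/(1-q)) * log2 ||s_X||_q
    norm_q = sum(s ** q for s in sx if s > 0) ** (1.0 / q)
    return (q / (1 - q)) * math.log2(norm_q)

sx = [0.5, 0.3, 0.2]
for q in (0.5, 2.0, 5.0):
    assert abs(renyi_entropy(sx, q) - renyi_entropy_norm_form(sx, q)) < 1e-9
```

A uniform state over n elements gives log n for every order q, in agreement with Example 3.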

Remark 2.
It is possible to verify that the Rényi entropy H_q^s(X) is always nonnegative. Namely, for q ∈ (0, 1), we have 1/(1−q) > 0 and s(x_i)^q ≥ s(x_i), for i = 1, 2, . . . , n; hence ∑_{i=1}^n s(x_i)^q ≥ ∑_{i=1}^n s(x_i) = 1, and therefore H_q^s(X) ≥ 0. On the other hand, for q ∈ (1, ∞), we have 1/(1−q) < 0 and s(x_i)^q ≤ s(x_i), for i = 1, 2, . . . , n; hence ∑_{i=1}^n s(x_i)^q ≤ 1, and again H_q^s(X) ≥ 0. At q = 1 the quantity H_q^s(X) is undefined, since its defining expression leads to the indeterminate form 0/0. In the following theorem, it is shown that for q → 1 the Rényi entropy H_q^s(X) converges to the Shannon entropy of a partition X in (A, ·) defined by the Formula (1).

Theorem 1. Let X = (x_1, x_2, . . . , x_n) be any partition in a product MV-algebra (A, ·). Then

lim_{q→1} H_q^s(X) = H_s(X).

Proof. Put f(q) = log ∑_{i=1}^n s(x_i)^q and g(q) = 1 − q, for every q ∈ (0, ∞). The functions f, g are differentiable, f(1) = g(1) = 0, and by L'Hôpital's rule,

lim_{q→1} H_q^s(X) = lim_{q→1} f(q)/g(q) = lim_{q→1} f′(q)/g′(q),

under the assumption that the right-hand side exists. It holds g′(q) = −1, and

f′(q) = (∑_{i=1}^n s(x_i)^q · ln s(x_i)) / (∑_{i=1}^n s(x_i)^q · ln 2).

Note that the calculation of the derivative of the function f is easily done by using the identity b^α = e^{α ln b}. We get

lim_{q→1} H_q^s(X) = −(1/ln 2) · ∑_{i=1}^n s(x_i) · ln s(x_i) = −∑_{i=1}^n s(x_i) · log s(x_i),

which is the Shannon entropy of X defined by the Formula (1).
In the following theorem, it is proved that the function H_q^s(X) is monotonically decreasing in q.

Theorem 2.
Let X be any partition in a given product MV-algebra (A, ·), and let q_1, q_2 ∈ (0, 1) ∪ (1, ∞) with q_1 ≥ q_2. Then H_{q_1}^s(X) ≤ H_{q_2}^s(X).

Proof. Suppose that X = (x_1, x_2, . . . , x_n) and q_1, q_2 ∈ (1, ∞) such that q_1 ≥ q_2. Then the claim is equivalent to the inequality

(∑_{i=1}^n s(x_i)^{q_1})^{1/(1−q_1)} ≤ (∑_{i=1}^n s(x_i)^{q_2})^{1/(1−q_2)}.

The above inequality follows by applying the Jensen inequality to the convex function ϕ defined, for every x ∈ (0, ∞), by ϕ(x) = x^{(1−q_1)/(1−q_2)}, with c_i = s(x_i) and a_i = s(x_i)^{q_2−1}, for i = 1, 2, . . . , n. The case where q_1, q_2 ∈ (0, 1) is obtained in an analogous way. Finally, the case where q_1 ∈ (1, ∞) and q_2 ∈ (0, 1) is obtained by transitivity.
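The monotonicity of Theorem 2 can be illustrated numerically; the following sketch evaluates H_q for an increasing sequence of orders q and checks that the values do not increase (an illustration of the theorem, not a proof):

```python
import math

def renyi_entropy(sx, q):
    # H_q(X) = (1/(1-q)) * log2 sum_i s(x_i)^q
    return math.log2(sum(s ** q for s in sx if s > 0)) / (1 - q)

sx = [0.6, 0.3, 0.1]
qs = [0.25, 0.5, 0.9, 1.1, 2.0, 5.0, 20.0]
values = [renyi_entropy(sx, q) for q in qs]
# Theorem 2: q1 >= q2 implies H_{q1}(X) <= H_{q2}(X)
assert all(values[k] >= values[k + 1] - 1e-12 for k in range(len(values) - 1))
```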

Example 5.
Consider any product MV-algebra (A, ·) and a state s : A → [0, 1]; the values of H_q^s computed for different orders q are consistent with the property proven in the previous theorem.

Theorem 3. Let X = (x_1, x_2, . . . , x_n) and Y = (y_1, y_2, . . . , y_m) be partitions in a product MV-algebra (A, ·) such that Y is a refinement of X, i.e., there exists a partition {J(1), J(2), . . . , J(n)} of the set {1, 2, . . . , m} such that x_i = ∑_{j∈J(i)} y_j, and hence s(x_i) = ∑_{j∈J(i)} s(y_j), for i = 1, 2, . . . , n. Then H_q^s(X) ≤ H_q^s(Y).

Proof. (i) Consider the case where q ∈ (1, ∞). Then s(x_i)^q = (∑_{j∈J(i)} s(y_j))^q ≥ ∑_{j∈J(i)} s(y_j)^q, for i = 1, 2, . . . , n, and consequently ∑_{i=1}^n s(x_i)^q ≥ ∑_{j=1}^m s(y_j)^q. In this case we have 1/(1−q) < 0, hence H_q^s(X) ≤ H_q^s(Y). (ii) Consider the case where q ∈ (0, 1). Then s(x_i)^q = (∑_{j∈J(i)} s(y_j))^q ≤ ∑_{j∈J(i)} s(y_j)^q, for i = 1, 2, . . . , n, and consequently ∑_{i=1}^n s(x_i)^q ≤ ∑_{j=1}^m s(y_j)^q. In this case we have 1/(1−q) > 0, hence H_q^s(X) ≤ H_q^s(Y).

As an immediate consequence of the previous theorem and Proposition 1, we obtain the following result.

Corollary 1.
For every pair of partitions X, Y in a product MV-algebra (A, ·), it holds

H_q^s(X ∨ Y) ≥ max{H_q^s(X), H_q^s(Y)}.   (9)

For illustration, let X = (f_1, f_2) and Y = (g_1, g_2) be partitions in the product MV-algebra of fuzzy subsets of [0, 1], with the state values 1/2, 1/2 and 1/3, 2/3 of the corresponding elements, respectively. The join of the partitions X and Y is the system X ∨ Y = (f_1 · g_1, f_1 · g_2, f_2 · g_1, f_2 · g_2) with the corresponding state values. For instance, for q = 2 we get H_q^s(Y) = −log (1/9 + 4/9) = log 1.8 = 0.848 bit. It can be seen that the inequality (9) applies.

Theorem 4.
If partitions X, Y in a product MV-algebra (A, ·) are statistically independent with respect to s, then H_q^s(X ∨ Y) = H_q^s(X) + H_q^s(Y).

The Conditional Rényi Entropy in a Product MV-Algebra
In this section, we introduce the concept of conditional Rényi entropy H_q^s(X/Y) of partitions in a product MV-algebra (A, ·), analogously to [6]. It is shown that the proposed definition is consistent, in the case of the limit of q going to 1, with the conditional Shannon entropy defined by Equation (3). Subsequently, by using the proposed notion of conditional Rényi entropy, we define the Rényi information about a partition X in a partition Y.
Let X = (x_1, x_2, . . . , x_n) and Y = (y_1, y_2, . . . , y_m) be two partitions in a product MV-algebra (A, ·), and let y_j ∈ Y be fixed. If we consider the function s_{X/y_j} : X → [0, 1] defined, for every x_i ∈ X, by s_{X/y_j}(x_i) = s(x_i/y_j) = s(x_i · y_j)/s(y_j), then we have

∥s_{X/y_j}∥_q = (∑_{i=1}^n s(x_i/y_j)^q)^{1/q}.

Definition 11. Let X = (x_1, x_2, . . . , x_n) and Y = (y_1, y_2, . . . , y_m) be partitions in (A, ·). We define the conditional Rényi entropy of order q, where q ∈ (0, 1) ∪ (1, ∞), of X given Y by the formula

H_q^s(X/Y) = (q/(1−q)) · log ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q.

Remark 3. In the same way as in the unconditional case, it can be verified that the conditional Rényi entropy H_q^s(X/Y) is always nonnegative. Let X = (x_1, x_2, . . . , x_n) be any partition in (A, ·), and let E = (u). Since s_{X/u}(x_i) = s(x_i/u) = s(x_i) = s_X(x_i), for i = 1, 2, . . . , n, it holds ∥s_{X/u}∥_q = ∥s_X∥_q, and consequently H_q^s(X/E) = H_q^s(X).

Proposition 2. Let X = (x_1, x_2, . . . , x_n) be a partition in a product MV-algebra (A, ·) and let s : A → [0, 1] be a state. Then: (i) ∑_{i=1}^n s(x_i · y) = s(y), for any element y ∈ A; (ii) ∑_{i=1}^n s(x_i/y) = 1, for any element y ∈ A such that s(y) > 0.
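The conditional Rényi entropy of Definition 11 can be computed from the matrix of joint state values s(x_i · y_j). A Python sketch (the matrix representation is our own convention); it also checks the identity H_q(X/E) = H_q(X) stated in Remark 3:

```python
import math

def renyi_entropy(sx, q):
    # H_q(X) = (1/(1-q)) * log2 sum_i s(x_i)^q
    return math.log2(sum(s ** q for s in sx if s > 0)) / (1 - q)

def conditional_renyi_entropy(joint, q):
    # H_q(X/Y) = (q/(1-q)) * log2 sum_j s(y_j) * ||s_{X/y_j}||_q, where
    # ||s_{X/y_j}||_q = ( sum_i (s(x_i.y_j)/s(y_j))^q )^(1/q)
    m = len(joint[0])
    s_y = [sum(row[j] for row in joint) for j in range(m)]
    total = 0.0
    for j in range(m):
        if s_y[j] > 0:
            norm_q = sum((row[j] / s_y[j]) ** q
                         for row in joint if row[j] > 0) ** (1.0 / q)
            total += s_y[j] * norm_q
    return (q / (1 - q)) * math.log2(total)

# Conditioning on the certain partition E = (u) changes nothing
sx = [0.5, 0.3, 0.2]
single_column = [[s] for s in sx]
assert abs(conditional_renyi_entropy(single_column, 2.0) - renyi_entropy(sx, 2.0)) < 1e-9
```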
Theorem 5. Let X, Y be partitions in a product MV-algebra (A, ·). Then lim_{q→1} H_q^s(X/Y) = H_s(X/Y), where H_s(X/Y) is the conditional Shannon entropy of X given Y defined by Equation (3).

Proof. Put f(q) = log ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q and g(q) = (1 − q)/q, for every q ∈ (0, ∞); then H_q^s(X/Y) = f(q)/g(q). The functions f and g are differentiable and, evidently, lim_{q→1} g(q) = 0. It can also easily be verified that lim_{q→1} f(q) = 0. Indeed, if we put δ = {j; s(y_j) > 0}, then using Proposition 2, we get

lim_{q→1} f(q) = log ∑_{j∈δ} s(y_j) · ∑_{i=1}^n s(x_i/y_j) = log ∑_{j∈δ} s(y_j) = log 1 = 0.

Using L'Hôpital's rule, it follows that lim_{q→1} H_q^s(X/Y) = lim_{q→1} f′(q)/g′(q), under the assumption that the right-hand side exists. Let us calculate the derivatives of the functions f and g. We have g′(q) = −1/q², and f′(q) = h′(q)/(h(q) · ln 2), where h is the continuous function defined, for every q ∈ (0, ∞), by the formula h(q) = ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q, with a continuous derivative h′ for which it holds h(1) = 1 and

h′(1) = ∑_{j∈δ} s(y_j) · ∑_{i=1}^n s(x_i/y_j) · ln s(x_i/y_j).

Analogously as in the proof of Theorem 1, the identity b^α = e^{α ln b} is used to calculate the derivative of the function h. We get

lim_{q→1} H_q^s(X/Y) = f′(1)/g′(1) = −(1/ln 2) · ∑_{j∈δ} s(y_j) · ∑_{i=1}^n s(x_i/y_j) · ln s(x_i/y_j) = −∑_{j=1}^m s(y_j) · ∑_{i=1}^n s(x_i/y_j) · log s(x_i/y_j),

which is the conditional Shannon entropy of X given Y defined by Equation (3).
Theorem 6 (monotonicity). Let X and Y be partitions in a product MV-algebra (A, ·). Then H_q^s(X/Y) ≤ H_q^s(X).

Proof. Let X = (x_1, x_2, . . . , x_n), Y = (y_1, y_2, . . . , y_m). Then, by Proposition 2, it holds s(x_i) = ∑_{j=1}^m s(x_i · y_j) = ∑_{j=1}^m s(y_j) · s(x_i/y_j), for i = 1, 2, . . . , n, i.e., s_X = ∑_{j=1}^m s(y_j) · s_{X/y_j}. Suppose that q ∈ (1, ∞). Then, using the triangle inequality of the q-norm, we get

∥s_X∥_q ≤ ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q.

It follows that log ∑_{i=1}^n s(x_i)^q ≤ q · log ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q, and consequently H_q^s(X) = (1/(1−q)) · log ∑_{i=1}^n s(x_i)^q ≥ (q/(1−q)) · log ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q = H_q^s(X/Y).
For the case where q ∈ (0, 1), we put r = 1/q; then r > 1. By writing the Rényi entropy in terms of the 1/q-norm and using the triangle inequality for the 1/q-norm, we obtain in the same way the inequality H_q^s(X) ≥ (q/(1−q)) · log ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q = H_q^s(X/Y).

Theorem 7.
If partitions X, Y in a product MV-algebra (A, ·) are statistically independent with respect to s, then H_q^s(X/Y) = H_q^s(X).
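Theorems 6 and 7 are easy to illustrate numerically: under statistical independence the joint state values factorize, s(x_i · y_j) = s(x_i) · s(y_j), and conditioning then changes nothing, while for a dependent joint state conditioning cannot increase the Rényi entropy. A Python sketch, with partitions represented by their state values and pairs of partitions by the matrix of joint state values (our own convention):

```python
import math

def renyi_entropy(sx, q):
    return math.log2(sum(s ** q for s in sx if s > 0)) / (1 - q)

def conditional_renyi_entropy(joint, q):
    # H_q(X/Y) = (q/(1-q)) * log2 sum_j s(y_j) * ||s_{X/y_j}||_q
    m = len(joint[0])
    s_y = [sum(row[j] for row in joint) for j in range(m)]
    total = 0.0
    for j in range(m):
        if s_y[j] > 0:
            total += s_y[j] * sum((row[j] / s_y[j]) ** q
                                  for row in joint if row[j] > 0) ** (1.0 / q)
    return (q / (1 - q)) * math.log2(total)

q = 2.0
sx, sy = [0.5, 0.3, 0.2], [0.25, 0.75]
independent = [[a * b for b in sy] for a in sx]
dependent = [[0.3, 0.2], [0.1, 0.4]]

# Theorem 7: independence gives H_q(X/Y) = H_q(X)
assert abs(conditional_renyi_entropy(independent, q) - renyi_entropy(sx, q)) < 1e-9
# Theorem 6: H_q(X/Y) <= H_q(X) in general
sx_dep = [sum(row) for row in dependent]
assert conditional_renyi_entropy(dependent, q) <= renyi_entropy(sx_dep, q) + 1e-12
```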
In the following theorem, a weak chain rule for the Rényi entropy of partitions in a product MV-algebra (A, ·) is given.

Theorem 8 (weak chain rule). Let X and Y = (y_1, y_2, . . . , y_m) be partitions in a product MV-algebra (A, ·). Then

H_q^s(X/Y) ≥ H_q^s(X ∨ Y) − log β,

where β = max{1/s(y_j); y_j ∈ supp(s), j = 1, 2, . . . , m}.

Proof. Suppose that q ∈ (1, ∞); the case q ∈ (0, 1) is analogous. Since s(y_j) ≥ 1/β for every y_j ∈ supp(s), it holds s(y_j)^{1−q} ≤ β^{q−1}, and therefore

∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q^q = ∑_{j=1}^m s(y_j)^{1−q} · ∑_{i=1}^n s(x_i · y_j)^q ≤ β^{q−1} · ∑_{i=1}^n ∑_{j=1}^m s(x_i · y_j)^q.

Moreover, by the Jensen inequality applied to the convex function ϕ defined, for every x ∈ (0, ∞), by ϕ(x) = x^q, we have (∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q)^q ≤ ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q^q. Since 1 − q < 0, for q ∈ (1, ∞), it follows

H_q^s(X/Y) = (1/(1−q)) · log (∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q)^q ≥ (1/(1−q)) · log (β^{q−1} · ∑_{i=1}^n ∑_{j=1}^m s(x_i · y_j)^q) = H_q^s(X ∨ Y) − log β.

Theorem 9. Let X, Y be partitions in a product MV-algebra (A, ·). Then H_q^s(X/Y) ≤ H_q^s(X ∨ Y).

Proof. Consider first the case where q ∈ (0, 1). Since ∑_j t_j^q ≥ (∑_j t_j)^q for nonnegative real numbers t_1, . . . , t_m and q ∈ (0, 1), we get, with t_j = s(y_j) · ∥s_{X/y_j}∥_q,

∑_{i=1}^n ∑_{j=1}^m s(x_i · y_j)^q = ∑_{j=1}^m (s(y_j) · ∥s_{X/y_j}∥_q)^q ≥ (∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q)^q.

Since 1/(1−q) > 0, it follows H_q^s(X ∨ Y) ≥ (q/(1−q)) · log ∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q = H_q^s(X/Y). Consider now the case where q ∈ (1, ∞). Then the reverse inequality ∑_{j=1}^m (s(y_j) · ∥s_{X/y_j}∥_q)^q ≤ (∑_{j=1}^m s(y_j) · ∥s_{X/y_j}∥_q)^q holds; since 1 − q < 0, for q ∈ (1, ∞), we get the same conclusion.
Corollary 2. For arbitrary partitions X, Y in a product MV-algebra (A, ·), it holds H_q^s(X/Y) ≤ min{H_q^s(X), H_q^s(X ∨ Y)}.

Proof. The claim is a direct consequence of Theorems 6 and 9.
Definition 12. Let X, Y be partitions in (A, ·). We define the Rényi information of order q, where q ∈ (0, 1) ∪ (1, ∞), about X in Y by the formula

I_q^s(X, Y) = H_q^s(X) − H_q^s(X/Y).

Theorem 10. Let X, Y be partitions in (A, ·). Then lim_{q→1} I_q^s(X, Y) = I_s(X, Y), where I_s(X, Y) is the mutual information of the partitions X, Y defined by Equation (4).
Proof. The claim is obtained as a direct consequence of Theorems 1 and 5.
Theorem 11. For arbitrary partitions X, Y in (A, ·), it holds I_q^s(X, Y) ≥ 0. Moreover, if X, Y are statistically independent with respect to s, then I_q^s(X, Y) = 0.
Proof. The claim is a direct consequence of Theorems 6 and 7.
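Theorems 10 and 11 suggest a direct numerical check of the Rényi information: it is nonnegative, and it vanishes for statistically independent partitions. A Python sketch (a pair of partitions is represented by the matrix of joint state values s(x_i · y_j), which is our own convention):

```python
import math

def renyi_entropy(sx, q):
    return math.log2(sum(s ** q for s in sx if s > 0)) / (1 - q)

def conditional_renyi_entropy(joint, q):
    m = len(joint[0])
    s_y = [sum(row[j] for row in joint) for j in range(m)]
    total = 0.0
    for j in range(m):
        if s_y[j] > 0:
            total += s_y[j] * sum((row[j] / s_y[j]) ** q
                                  for row in joint if row[j] > 0) ** (1.0 / q)
    return (q / (1 - q)) * math.log2(total)

def renyi_information(joint, q):
    # Definition 12: I_q(X, Y) = H_q(X) - H_q(X/Y)
    sx = [sum(row) for row in joint]
    return renyi_entropy(sx, q) - conditional_renyi_entropy(joint, q)

q = 2.0
dependent = [[0.3, 0.2], [0.1, 0.4]]
independent = [[a * b for b in (0.5, 0.5)] for a in (0.4, 0.6)]
assert renyi_information(dependent, q) >= 0           # Theorem 11
assert abs(renyi_information(independent, q)) < 1e-9  # independence gives I_q = 0
```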

The Rényi Divergence in a Product MV-Algebra
In this section, we introduce the concept of the Rényi divergence in a product MV-algebra (A, ·). We will prove basic properties of this quantity, and for illustration, we provide some numerical examples.
Definition 13. Let s, t be states on a given product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·) such that t(x_i) > 0, for i = 1, 2, . . . , n. Then we define the Rényi divergence D_q^X(s ∥ t) of order q, where q ∈ (0, 1) ∪ (1, ∞), of the state s from the state t with respect to X as the number

D_q^X(s ∥ t) = (1/(q−1)) · log ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q}.   (13)

Remark 5. The logarithm in the Formula (13) is taken to the base 2 and information is measured in bits. It is easy to see that D_q^X(s ∥ s) = 0. Namely,

D_q^X(s ∥ s) = (1/(q−1)) · log ∑_{i=1}^n s(x_i)^q · s(x_i)^{1−q} = (1/(q−1)) · log ∑_{i=1}^n s(x_i) = (1/(q−1)) · log 1 = 0.
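The Rényi divergence of Definition 13 is easy to probe numerically; in the following Python sketch, a state is represented simply by its vector of state values on X (our own convention). The sketch checks D_q(s ∥ s) = 0 and that D_q approaches the Kullback-Leibler divergence of Formula (5) as q → 1:

```python
import math

def kl_divergence(s, t):
    # Formula (5): D_X(s||t) = sum_i s(x_i) * log2( s(x_i)/t(x_i) )
    return sum(si * math.log2(si / ti) for si, ti in zip(s, t) if si > 0)

def renyi_divergence(s, t, q):
    # Formula (13): D_q(s||t) = (1/(q-1)) * log2 sum_i s(x_i)^q * t(x_i)^(1-q)
    assert q > 0 and q != 1
    return math.log2(sum(si ** q * ti ** (1 - q)
                         for si, ti in zip(s, t) if si > 0)) / (q - 1)

s = [0.5, 0.3, 0.2]
t = [0.2, 0.3, 0.5]
assert abs(renyi_divergence(s, s, 2.0)) < 1e-9             # D_q(s||s) = 0
assert abs(renyi_divergence(s, t, 1.0001) - kl_divergence(s, t)) < 1e-3
```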
The following theorem states that for q → 1 the Rényi divergence D_q^X(s ∥ t) converges to the Kullback-Leibler divergence D_X(s ∥ t) defined by the Formula (5).
Theorem 12. Let s, t be states on a given product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·) such that t(x_i) > 0, for i = 1, 2, . . . , n. Then

lim_{q→1} D_q^X(s ∥ t) = D_X(s ∥ t).

Proof. We have D_q^X(s ∥ t) = f(q)/g(q), where f, g are continuous functions defined, for every q ∈ (0, ∞), in the following way: f(q) = log ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q}, and g(q) = q − 1. By continuity of the functions f, g, we have lim_{q→1} f(q) = f(1) = log ∑_{i=1}^n s(x_i) = 0, and lim_{q→1} g(q) = g(1) = 0. Using L'Hôpital's rule, we get lim_{q→1} D_q^X(s ∥ t) = lim_{q→1} f′(q)/g′(q), under the assumption that the right-hand side exists. Since g′(q) = 1, and f′(q) = h′(q)/(h(q) · ln 2), where h(q) = ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q}, h(1) = 1, and

h′(1) = ∑_{i=1}^n s(x_i) · ln (s(x_i)/t(x_i)),

we obtain

lim_{q→1} D_q^X(s ∥ t) = (1/ln 2) · ∑_{i=1}^n s(x_i) · ln (s(x_i)/t(x_i)) = ∑_{i=1}^n s(x_i) · log (s(x_i)/t(x_i)) = D_X(s ∥ t).

The following theorem states that the Rényi entropy H_q^s(X) can be expressed in terms of the Rényi divergence D_q^X(s ∥ t) of a state s from a state t that is uniform over X.
Theorem 13. Let s be a state on a product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·). If a state t : A → [0, 1] is uniform over X, i.e., t(x_i) = 1/n, for i = 1, 2, . . . , n, then

D_q^X(s ∥ t) = log n − H_q^s(X).

Proof. Let us calculate:

D_q^X(s ∥ t) = (1/(q−1)) · log ∑_{i=1}^n s(x_i)^q · (1/n)^{1−q} = (1/(q−1)) · ((1−q) · log (1/n) + log ∑_{i=1}^n s(x_i)^q) = log n − (1/(1−q)) · log ∑_{i=1}^n s(x_i)^q = log n − H_q^s(X).

On the other hand, as shown by Example 3, it holds H_q^t(X) = log n.

Theorem 14. Let s, t be states on a given product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·) such that s(x_i) > 0 and t(x_i) > 0, for i = 1, 2, . . . , n. Then

D_q^X(s ∥ t) ≥ 0,   (14)

with the equality if and only if s(x_i) = t(x_i), for i = 1, 2, . . . , n.

Proof. The inequality follows by applying the Jensen inequality to the function ϕ defined, for every x ∈ (0, ∞), by ϕ(x) = x^{1−q}, with c_i = s(x_i) and a_i = t(x_i)/s(x_i), for i = 1, 2, . . . , n. Let us consider the case of q ∈ (1, ∞). Then 1 − q < 0; therefore, the function ϕ is convex. By the Jensen inequality we obtain

∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q} = ∑_{i=1}^n c_i · ϕ(a_i) ≥ ϕ(∑_{i=1}^n c_i a_i) = (∑_{i=1}^n t(x_i))^{1−q} = 1,

and consequently D_q^X(s ∥ t) = (1/(q−1)) · log ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q} ≥ 0. Let q ∈ (0, 1). Then the function ϕ is concave, and therefore we get ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q} ≤ 1; since 1/(q−1) < 0, it again follows that D_q^X(s ∥ t) ≥ 0. The equality in (14) holds if and only if t(x_i)/s(x_i) is constant, for i = 1, 2, . . . , n, i.e., if and only if t(x_i) = k · s(x_i), for i = 1, 2, . . . , n, for some constant k. By summing over all i = 1, 2, . . . , n, we get ∑_{i=1}^n t(x_i) = k · ∑_{i=1}^n s(x_i), which implies that k = 1. Hence s(x_i) = t(x_i), for i = 1, 2, . . . , n. Therefore, we conclude that D_q^X(s ∥ t) = 0 if and only if s(x_i) = t(x_i), for i = 1, 2, . . . , n.
Corollary 3. Let s be a state on a product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·) such that s(x_i) > 0, for i = 1, 2, . . . , n. Then H_q^s(X) ≤ log n, with the equality if and only if the state s is uniform over X.
Proof. Let t : A → [0, 1] be a state uniform over X, i.e., t(x_i) = 1/n, for i = 1, 2, . . . , n. Then, according to Theorems 14 and 13, it holds 0 ≤ D_q^X(s ∥ t) = log n − H_q^s(X), which implies that H_q^s(X) ≤ log n. Since the equality D_q^X(s ∥ t) = 0 applies if and only if s(x_i) = t(x_i), for i = 1, 2, . . . , n, the equality H_q^s(X) = log n holds if and only if the state s is uniform over X.
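Theorem 13 and Corollary 3 lend themselves to a direct numerical check. A Python sketch (illustrative; a state is represented by its vector of state values on X):

```python
import math

def renyi_entropy(sx, q):
    return math.log2(sum(s ** q for s in sx if s > 0)) / (1 - q)

def renyi_divergence(s, t, q):
    return math.log2(sum(si ** q * ti ** (1 - q)
                         for si, ti in zip(s, t) if si > 0)) / (q - 1)

sx = [0.5, 0.3, 0.2]
n = len(sx)
uniform = [1.0 / n] * n
for q in (0.5, 2.0, 5.0):
    # Theorem 13: D_q(s || uniform) = log2(n) - H_q(X)
    assert abs(renyi_divergence(sx, uniform, q)
               - (math.log2(n) - renyi_entropy(sx, q))) < 1e-9
    # Corollary 3: H_q(X) <= log2(n)
    assert renyi_entropy(sx, q) <= math.log2(n) + 1e-9
```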
For q = 1/2, we get D_q^X(s_1 ∥ s_2) = 0.0122 bit. Evidently, in both of the cases mentioned above, the triangle inequality is violated. This means that the triangle inequality for the Rényi divergence generally does not apply; consequently, the Rényi divergence is not a metric in the true sense.
Theorem 15. Let s, t be states on a given product MV-algebra (A, ·), and let X = (x_1, x_2, . . . , x_n) be a partition in (A, ·) such that s(x_i) > 0 and t(x_i) > 0, for i = 1, 2, . . . , n. Then:

(i) D_q^X(s ∥ t) ≤ D_X(s ∥ t), for q ∈ (0, 1);
(ii) D_q^X(s ∥ t) ≥ D_X(s ∥ t), for q ∈ (1, ∞),

where D_X(s ∥ t) is the Kullback-Leibler divergence defined by the Formula (5).

Proof. We prove the claims by applying the Jensen inequality to the concave function ϕ defined, for every x ∈ (0, ∞), by ϕ(x) = log x. If we put c_i = s(x_i) and a_i = (s(x_i)/t(x_i))^{q−1}, for i = 1, 2, . . . , n, in the inequality (6), we get

(q − 1) · ∑_{i=1}^n s(x_i) · log (s(x_i)/t(x_i)) = ∑_{i=1}^n c_i · log a_i ≤ log ∑_{i=1}^n c_i a_i = log ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q}.

(i) Suppose that 0 < q < 1. Then 1/(q−1) < 0, and therefore we obtain D_X(s ∥ t) ≥ (1/(q−1)) · log ∑_{i=1}^n s(x_i)^q · t(x_i)^{1−q} = D_q^X(s ∥ t). (ii) Suppose that q > 1. Then 1/(q−1) > 0, and we get D_X(s ∥ t) ≤ D_q^X(s ∥ t).

To illustrate the result of the previous theorem, let us consider a continuation of Example 6. For the states s_1, s_2 considered there, direct calculation for q = 1/3 and q = 2 (one of the obtained values being 0.07736 bit) shows the following. For q = 1/3, we have D_q^X(s_1 ∥ s_2) < D_X(s_1 ∥ s_2) and D_q^X(s_2 ∥ s_1) < D_X(s_2 ∥ s_1), and for q = 2, we have D_q^X(s_1 ∥ s_2) > D_X(s_1 ∥ s_2) and D_q^X(s_2 ∥ s_1) > D_X(s_2 ∥ s_1). The obtained results correspond to the claim of Theorem 15. Based on the previous results, we can also see that the Rényi divergence D_q^X(s ∥ t), as well as the Kullback-Leibler divergence D_X(s ∥ t), is not symmetric.
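The ordering in Theorem 15 is again easy to check numerically for particular states (an illustration with state vectors of our own choosing):

```python
import math

def kl_divergence(s, t):
    return sum(si * math.log2(si / ti) for si, ti in zip(s, t) if si > 0)

def renyi_divergence(s, t, q):
    return math.log2(sum(si ** q * ti ** (1 - q)
                         for si, ti in zip(s, t) if si > 0)) / (q - 1)

s1 = [0.5, 0.3, 0.2]
s2 = [0.4, 0.4, 0.2]
# Theorem 15: D_q <= D_KL for q in (0,1) and D_q >= D_KL for q in (1,inf)
assert renyi_divergence(s1, s2, 1 / 3) <= kl_divergence(s1, s2)
assert renyi_divergence(s1, s2, 2.0) >= kl_divergence(s1, s2)
# Neither divergence is symmetric in general
assert abs(kl_divergence(s1, s2) - kl_divergence(s2, s1)) > 1e-6
```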

Conclusions
The aim of this paper was to generalize the results concerning the Shannon entropy and the Kullback-Leibler divergence in a product MV-algebra given in [22] and [24] to the case of Rényi entropy and Rényi divergence. The results are contained in Sections 3-5. In Section 3, we introduced the concept of Rényi entropy H_q^s(X) of a partition X in a product MV-algebra (A, ·) and examined the properties of this entropy measure. In Section 4, we defined the conditional Rényi entropy of partitions in the studied algebraic structure. It was shown that the proposed concepts are consistent, in the limit q → 1, with the Shannon entropy of partitions defined and studied in [22]. Moreover, it was shown that the Rényi entropy H_q^s(X) as well as the conditional Rényi entropy H_q^s(X/Y) are monotonically decreasing functions of the parameter q. In the final part of Section 4, we defined the Rényi information about a partition X in a partition Y as an example of further use of the proposed concept of conditional Rényi entropy. Section 5 was devoted to the study of Rényi divergence in (A, ·). We proved that the Kullback-Leibler divergence of states defined on a product MV-algebra can be derived from their Rényi divergence as the limiting case for q going to 1. Theorem 14 allows the Rényi divergence to be interpreted as a distance measure between two states (over the same partition) defined on a given product MV-algebra. In addition, we investigated the relationship between the Rényi entropy and the Rényi divergence (Theorem 13), as well as the relationship between the Rényi divergence and the Kullback-Leibler divergence (Theorem 15), in a product MV-algebra.
In the proofs, we used L'Hôpital's rule, the triangle inequality of the q-norm, and the Jensen inequality. To illustrate the results, we have provided several numerical examples.
As has been shown in Example 1, the model studied in this article generalizes the classical case; that is, the Rényi entropy and the Rényi divergence defined in this paper are generalizations of the classical concepts of Rényi entropy and Rényi divergence. On the other hand, MV-algebras enable the study of more general situations. We note that MV-algebras can, for example, be interpreted by means of some Ulam games (see, e.g., [35-37]). The obtained results could therefore be useful for research on this subject.
In Example 2, we mentioned that the full tribe of fuzzy sets represents a special case of product MV-algebras; therefore, the results of the article can be immediately applied to this important class of fuzzy sets. We recall that by a fuzzy subset of a non-empty set Ω (cf. [38]), we understand any map f : Ω → [0, 1]. The value f(ω) is interpreted as the degree of belongingness of ω ∈ Ω to the considered fuzzy set f. In [39], Atanassov generalized fuzzy sets by introducing the idea of an intuitionistic fuzzy set, a set having with each member a degree of belongingness as well as a degree of non-belongingness. From the application point of view, it is interesting that for a given class F of intuitionistic fuzzy sets, one can construct an MV-algebra A such that F can be embedded into A. Also, an operation of product on F can be introduced in such a way that the corresponding MV-algebra is a product MV-algebra. Therefore, all the results of this article can also be applied to the intuitionistic fuzzy case.