Article

A New Insight into Entropy Based on the Fuzzy Operators, Applied to Useful Information Extraction from Noisy Time-Frequency Distributions

by József Dombi 1, Ana Vranković Lacković 2,* and Jonatan Lerga 2,3
1 Department of Computer Algorithms and Artificial Intelligence, University of Szeged, 6720 Szeged, Hungary
2 Faculty of Engineering, University of Rijeka, 51000 Rijeka, Croatia
3 Center for Artificial Intelligence and Cybersecurity, University of Rijeka, 51000 Rijeka, Croatia
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(3), 505; https://doi.org/10.3390/math11030505
Submission received: 17 December 2022 / Revised: 13 January 2023 / Accepted: 15 January 2023 / Published: 17 January 2023
(This article belongs to the Section Mathematics and Computer Science)

Abstract:
In this paper, we study the connections between generalized mean operators and entropies, where the mean value operators are related to the strictly monotone logical operators of fuzzy theory. We propose a new entropy measure based on the family of generalized Dombi operators. Namely, this measure is obtained by using the Dombi operator as a generator function in the general solution of the bisymmetric functional equation. We show how the proposed entropy can be used in a fuzzy system, where its performance in choosing the best alternative in a multiple attribute decision-making problem is consistent. This newly defined entropy was also applied to the problem of extracting useful information from time-frequency representations of noisy, nonstationary, and multicomponent signals. The denoising results were compared to those obtained with the Shannon and Rényi entropies. The proposed entropy measure is shown to significantly outperform the competing ones in terms of denoising classification accuracy and F1-score due to its sensitivity to small changes in the probability distribution.

1. Introduction

The concept of entropy is frequently used today as a quantitative measure of the uncertainty associated with a random variable. Entropy was introduced as a measure of information in Shannon’s 1948 paper entitled “A Mathematical Theory of Communication” [1]. However, Shannon’s entropy is not suitable for quantifying uncertainty arising from vagueness, ambiguity, or missing information that occurs in real-world scenarios.
Since the concept of probability was not sufficient to model uncertainty in ambiguous systems, Zadeh proposed a new theory, called fuzzy set theory, as a generalization of classical set theory [2]. From then on, fuzziness became a new tool to measure uncertainty. A fuzzy entropy function is based on a degree of membership that is different from probability, and therefore it has a different form from a probabilistic entropy like Shannon’s.
De Luca and Termini introduced a fuzzy entropy function as a weighted Shannon entropy and formulated the axioms that a fuzzy entropy measure must satisfy [3]. After De Luca and Termini, numerous authors proposed generalizations of fuzzy entropy. Pal and Pal proposed an exponential fuzzy entropy based on exponential functions [4]. Kapur in [5] proposed a fuzzy entropy of type (α, β) based on the paper by Sharma and Taneja [6], in which they proposed a generalization of the functional equation [7] that led to a new entropy measure. Fan and Xie [8] proposed a method for generating fuzzy entropy by a distance measure based on the exponential operation. Based on the work of Sharma and Mittal [9], where the authors characterized non-additive entropies of discrete probability distributions, Hooda proposed new measures of fuzzy entropy [10]. Some additional parametric generalizations of [3] were studied by Fan and Ma [11]. Verma and Sharma introduced the generalized exponential fuzzy entropy of order α [12], which is a generalization of both the exponential entropy [4] and the logarithmic entropy [3]. Joshi and Kumar introduced two parametric exponential fuzzy entropies and tested the application of the proposed measure in multiple attribute decision-making problems [13]. Tian and Yang proposed an exponential entropy measure for intuitionistic fuzzy sets [14]. Intuitionistic fuzzy sets were proposed by Atanassov [15] as an extension of fuzzy sets: he added a non-membership function, with the constraint that the sum of the membership and non-membership functions is not greater than one.
Entropies of intuitionistic fuzzy sets were also proposed in several papers [16,17,18,19]. Other descriptors of similar fuzzy measures and applications can be found in many other papers, like [20,21,22,23].
This paper proposes a new entropy of type ( α , β ) based on the generalized Dombi operator [24]. We apply this novel measure to fuzzy sets and compare it with other fuzzy entropies of type ( α , β ) .
We also apply the proposed measure to the problem of extracting useful content from a noisy signal in the time-frequency domain. Namely, classical probabilistic entropies are already widely used in signal analysis, and there are several useful content extraction methods from time-frequency distributions based on Shannon and Rényi entropies [25,26,27], but this is the first time this type of entropy from the family of fuzzy entropies has been implemented. Here, we improve the local entropy method presented in [28] and compare the obtained results with those obtained utilizing other entropy measures.
The main contributions of this work may be summarized as follows:
  • A new entropy measure based on the family of generalized Dombi operators is introduced and studied.
  • Comparison of the new measure to some existing fuzzy entropies of type ( α , β ) through a multi-criteria decision-making model is provided. The performance of the entropy measure with changes in parameters is found to be consistent in terms of choosing the best alternative.
  • Comparison of the differences between new entropy measures and some classical entropies is studied.
  • The efficiency of the proposed entropy measure is illustrated in examples of extracting useful information content from noisy signal time-frequency distributions.
The rest of the paper is structured as follows. In Section 2, we provide the theoretical background and present the concept of the (α, β) entropy measures. In Section 3, we generalize the operator-dependent entropy measures using composite functions. Examples, including the extraction of useful information from noisy time-frequency distributions, and a comparison of various entropy measures based on different operators, are elaborated on in Section 4. Lastly, in Section 5, we draw some conclusions and make some suggestions for future work.

2. Preliminaries

2.1. Shannon Entropy

Let us start with the basic and well-known Shannon entropy, which has been applied in various fields. It is fundamental to information theory, although Ludwig Boltzmann first introduced the same idea in statistical mechanics. Shannon in [1] introduced the concept of information from a discrete source without memory as a function that quantifies the uncertainty of a random variable. The average of this information is known as the Shannon entropy measure.
Consider a source that produces messages by broadcasting a sequence of symbols from the source alphabet. Treating the values in the sequence as successive observations of the variable Z of a random experiment, we will assume that the probability associated with the value z i is p i , and the sum of all p i is 1.
The information associated with the outcome $Z = z_i$ is denoted by:
$$h(p_i) = -\log p_i.$$
One can also derive this measure axiomatically from a set of natural assumptions mentioned below. The expected information of the ensemble $\{(z_1, p_1), (z_2, p_2), \ldots, (z_n, p_n)\}$ is
$$H_S(p_1, p_2, \ldots, p_n) = -c \sum_{i=1}^{n} p_i \log_b p_i,$$
where $b$ and $c$ are positive constants, and $b \neq 1$. Each choice of the values $b$ and $c$ determines the unit in which the uncertainty is measured. The most common choice is to define the unit of measurement by the requirement that the magnitude of the uncertainty is 1 if $Z = \{z_1, z_2, \ldots, z_n\}$ and $p(z_1) = p(z_2) = \ldots = p(z_n) = \frac{1}{n}$. This requirement can be expressed by the equation
$$-c \log_b \frac{1}{n} = 1.$$
For $b = 2$ and $c = 1$, the resulting unit of measurement is the bit. If $b = e$, then $c = \frac{1}{\ln(n)}$, and the resulting unit of measure is the nat. By definition, for $p_i = 0$, $p_i \log_b p_i = 0$.
Let us consider a simple case of $n = 2$:
$$F(p) = H_S(p, 1-p) = -\left(p \log_2 p + (1-p)\log_2(1-p)\right).$$
For p = 1 or p = 0 , F ( 1 ) = F ( 0 ) = 0 .
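As a quick illustration (ours, not part of the original text), the following Python sketch computes the Shannon entropy in bits and evaluates the binary case F(p) above; the function name is our own.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    """Shannon entropy -sum(p_i * log_b p_i), with 0*log 0 treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0                      # skip zero probabilities (0 log 0 = 0 by definition)
    return -np.sum(p[nz] * np.log(p[nz]) / np.log(base))

# Binary case F(p) = H_S(p, 1 - p): maximal (1 bit) at p = 0.5, zero at p = 0 or 1.
for p in (0.0, 0.25, 0.5, 1.0):
    print(p, shannon_entropy([p, 1.0 - p]))
```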
The Shannon entropy H s ( p ) has several important properties, such as nonnegativity, expansibility, symmetry, recursivity, additivity, and monotonicity. Below, we give a mathematical description of some of the properties that characterize the Shannon entropy [29].
  • If $p_i = 1$ for some $i$, then $H_S(\mathbf{p}) = 0$.
  • The entropy is a symmetric function of its arguments, i.e., the entropy $H_S(p_1, p_2, \ldots, p_n)$ does not depend on the order of the $p_i$:
    $$H_S(p_1, p_2, \ldots, p_n) = H_S(p_{\sigma_1}, p_{\sigma_2}, \ldots, p_{\sigma_n}),$$
    where $(\sigma_1, \ldots, \sigma_n)$ is a permutation of $(1, 2, \ldots, n)$.
  • Maximum entropy is reached when all probabilities are equal:
    $$H_S(p_1, p_2, \ldots, p_n) \leq H_S\left(\tfrac{1}{n}, \tfrac{1}{n}, \ldots, \tfrac{1}{n}\right) = 1.$$
  • $H_S$ is a concave function of all its arguments.
    Let us also mention that the Shannon entropy can be characterized by a functional equation. If
    $$f(x) + (1-x)\, f\!\left(\frac{y}{1-x}\right) = f(y) + (1-y)\, f\!\left(\frac{x}{1-y}\right)$$
    is satisfied, then $f(x)$ is the Shannon entropy.
Remark 1.
It is worth noting that the Shannon entropy can be uniquely determined from the above four properties (1, 2, 3, and 4), which are treated as axioms.

2.2. Rényi’s Measure of Uncertainty

Alfréd Rényi, a Hungarian mathematician, was one of the first to define new measures of uncertainty. He sought the most general definition of uncertainty measures that would preserve additivity for independent events and be compatible with the axioms of probability. His information measure, known as the Rényi entropy [30], was defined as
$$H_R^{(\alpha)}(p_1, p_2, \ldots, p_n) = \frac{1}{1-\alpha} \log_2 \sum_{i=1}^{n} p_i^{\alpha}.$$
Let us here summarize the properties of Rényi's entropy measure, where $\mathbf{p} = (p_1, p_2, \ldots, p_n)$:
  • $H_R^{(\alpha)}(\mathbf{p})$ is a symmetric function of its variables.
  • $H_R^{(\alpha)}(\mathbf{p})$ is a continuous function of $\mathbf{p}$.
  • $H_R^{(\alpha)}\left(\tfrac{1}{n}, \tfrac{1}{n}, \ldots, \tfrac{1}{n}\right) = 1$.
  • The additivity property is valid for $H_R^{(\alpha)}(\mathbf{p})$.
  • $H_R^{(\alpha)}(\mathbf{p}) \geq H_R^{(\alpha')}(\mathbf{p})$ for $\alpha \leq \alpha'$.
  • When $\alpha \to 1$, the Rényi entropy becomes the Shannon entropy. That is,
    $$\lim_{\alpha \to 1} H_R^{(\alpha)}(\mathbf{p}) = H_S(\mathbf{p}).$$
The above conditions 1 to 4 characterize the Rényi entropy, i.e., they are necessary and sufficient conditions for Equation (8). Note, however, that the Rényi entropy does not satisfy recursivity.
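For illustration (ours, not from the paper), a small Python sketch of the Rényi entropy also shows numerically how it approaches the Shannon entropy as α → 1:

```python
import numpy as np

def renyi_entropy(p, alpha, base=2.0):
    """Rényi entropy (1/(1 - alpha)) * log_b(sum p_i^alpha), for alpha > 0, alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / ((1.0 - alpha) * np.log(base))

def shannon_entropy(p, base=2.0):
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]) / np.log(base))

p = [0.5, 0.25, 0.125, 0.125]
for alpha in (0.5, 2.0, 1.001):          # alpha close to 1 approaches the Shannon value
    print(alpha, renyi_entropy(p, alpha))
print("Shannon:", shannon_entropy(p))
```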
In the next section, we establish an interesting connection between the generalized arithmetic mean and the generalized geometric mean and show how the Shannon entropy is related to the Rényi entropy in this context.

2.3. Shannon and Rényi Entropy with Respect to Mean Operators

Here, we will use arithmetic and geometric means to describe the relationship between the Shannon and Rényi entropy. Sharma and Mittal were among the first ones to explore the notion of averaging and nonadditivity in this context [9].
The generalized arithmetic mean, where $w_i \geq 0$ and $\sum_{i=1}^{n} w_i = 1$, is
$$A^{(\alpha)}(\mathbf{x}, \mathbf{w}) = \left(\sum_{i=1}^{n} w_i x_i^{\alpha}\right)^{\frac{1}{\alpha}}.$$
Additionally, the generalized or power geometric mean is defined as
$$G(\mathbf{x}, \mathbf{w}) = \prod_{i=1}^{n} x_i^{w_i}.$$
There is an interesting connection between the generalized arithmetic mean and the generalized geometric mean that was investigated in [31]:
$$\lim_{\alpha \to 0} A^{(\alpha)}(\mathbf{x}, \mathbf{w}) = G(\mathbf{x}, \mathbf{w}).$$
Assuming that the value $x_i$ and the weight $w_i$ are equal, i.e., $x_i = w_i = p_i$, we can write the two mean operators in the following way:
$$G(\mathbf{p}, \mathbf{p}) = \prod_{i=1}^{n} p_i^{p_i},$$
$$A^{(\alpha)}(\mathbf{p}, \mathbf{p}) = \left(\sum_{i=1}^{n} p_i\, p_i^{\alpha}\right)^{\frac{1}{\alpha}} = \left(\sum_{i=1}^{n} p_i^{\alpha + 1}\right)^{\frac{1}{\alpha}}.$$
To find the correct connection with the entropies, we will carry out a parameter transformation where the parameter β is defined as α + 1 . Here, α has a positive value, which means that β is always greater than 1.
We can now make a connection between the mean operators and entropy measures using a transformation function. The Shannon and Rényi entropies are monotonic transformations of $G(\mathbf{p}, \mathbf{p})$ and $A^{(\alpha)}(\mathbf{p}, \mathbf{p})$, respectively, when $F(x) = -k \ln x$ is used as the monotone transformation function.
Let us apply the transformation $F(x)$ to obtain the Shannon entropy:
$$H_S(\mathbf{p}) = F\left(G(\mathbf{p}, \mathbf{p})\right) = -k \ln \prod_{i=1}^{n} p_i^{p_i} = -k \sum_{i=1}^{n} p_i \ln p_i.$$
If we use the arithmetic mean, where $k = \frac{1}{\ln n}$, we obtain the Rényi entropy:
$$H_R^{(\alpha)}(\mathbf{p}) = F\left(A^{(\alpha)}(\mathbf{p}, \mathbf{p})\right) = \frac{k}{1-\beta} \ln \sum_{i=1}^{n} p_i^{\beta}.$$
Namely, with the transformation of the geometric power mean, we obtain the Shannon entropy, while with the transformation of the generalized arithmetic mean, we obtain the Rényi entropy.
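This relationship can be checked numerically; the sketch below (ours) applies F(x) = -k ln x, with k = 1/ln n, to the two weighted means G(p, p) and A^(α)(p, p) and compares the results with the corresponding normalized Shannon and Rényi expressions.

```python
import numpy as np

p = np.array([0.5, 0.25, 0.15, 0.10])
n = len(p)
k = 1.0 / np.log(n)                          # normalization so the uniform distribution gives 1
F = lambda x: -k * np.log(x)                 # transformation function

G = np.prod(p ** p)                          # weighted geometric mean with weights p
shannon = -k * np.sum(p * np.log(p))
print(F(G), shannon)                         # matching values

alpha = 1.5                                  # beta = alpha + 1
beta = alpha + 1.0
A = np.sum(p * p ** alpha) ** (1.0 / alpha)  # weighted power mean with weights p
renyi = k / (1.0 - beta) * np.log(np.sum(p ** beta))
print(F(A), renyi)                           # matching values
```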
It can be shown that the limit of the Rényi entropy is equal to the Shannon entropy. Let us now use mean operators to prove this assertion.
Theorem 1.
The limit of the Rényi entropy as $\beta \to 1$ is the Shannon entropy [30].
Proof. 
Starting from Equation (4), we know that if $\alpha \to 0$, then $\beta \to 1$, and we have
$$\lim_{\beta \to 1} \frac{k}{1-\beta} \ln \sum_{i=1}^{n} p_i^{\beta} = -k \sum_{i=1}^{n} p_i \ln p_i.$$
In the limit $\beta \to 1$, the Rényi entropy equals the Shannon entropy. From Equation (15), we know that the same holds for the arithmetic and geometric means, i.e., when $\alpha \to 0$, the arithmetic mean tends to the geometric mean. □
This relationship was also examined in the article by Valverde-Albacete and Pelaez-Moreno [32].

2.4. Fuzzy Entropy Function

Next, we will elaborate on some of the basic concepts of fuzzy entropy and give an overview of some fuzzy entropies, later used for comparison with the proposed measure.
Let us now define a fuzzy set A as
$$A = \{ x_i \mid \mu_A(x_i) : i = 1, 2, \ldots, n \},$$
where $\mu_A(x)$ is a membership function:
$$\mu_A(x) = \begin{cases} 0, & x \text{ does not belong to } A, \\ 1, & x \text{ belongs to } A, \\ \in (0, 1), & x \text{ is partially a member of } A. \end{cases}$$
The fuzzy entropy of a fuzzy set is the measure of the fuzziness caused by the ambiguity of a fuzzy set. It is a key concept in the context of measuring the fuzziness of a fuzzy set, where the measure can be called a fuzzy entropy measure if it at least satisfies the following axioms [3]:
  • $H(A)$ attains a minimum if and only if A is a crisp set, i.e., $\mu_A(x) = 0$ or $1$ for all $x$.
  • $H(A)$ attains a maximum if and only if A is the fuzziest set, i.e., $\mu_A(x) = 0.5$ for all $x$.
  • $H(A^{*}) \leq H(A)$, where $A^{*}$ is a sharpened version of A.
  • $H(A) = H(\neg A)$, where $\neg A$ is the complement set of A.
Since the measure proposed by De Luca and Termini satisfied all the above axioms, it was accepted as a valid measure of fuzzy entropy [3]. Over the years, several new fuzzy entropy measures have been introduced. Here, we will focus on some of type (α, β).
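As an illustration of these axioms (ours, not part of the original text), the sketch below uses the normalized De Luca–Termini entropy as a stand-in fuzzy entropy and checks the minimum, maximum, and complement properties on a small membership vector.

```python
import numpy as np

def de_luca_termini(mu):
    """Normalized De Luca-Termini fuzzy entropy of a membership vector mu in [0, 1]^n."""
    mu = np.asarray(mu, dtype=float)
    eps = 1e-12                                    # avoid log(0) for crisp memberships
    m = np.clip(mu, eps, 1.0 - eps)
    h = -(m * np.log(m) + (1.0 - m) * np.log(1.0 - m))
    return float(np.sum(h) / (len(mu) * np.log(2.0)))

crisp    = [0.0, 1.0, 1.0, 0.0]
fuzziest = [0.5, 0.5, 0.5, 0.5]
a        = [0.2, 0.7, 0.4, 0.9]

print(de_luca_termini(crisp))                      # ~0  (axiom 1: crisp set)
print(de_luca_termini(fuzziest))                   # 1   (axiom 2: fuzziest set)
print(de_luca_termini(a), de_luca_termini([1 - x for x in a]))  # equal (axiom 4: complement)
```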
One of the first entropies of type (α, β), based on the generalization of the functional equation [33], was introduced by Sharma and Taneja in [6] as
$$H_{\alpha}^{\beta}(A) = \frac{1}{2^{1-\alpha} - 2^{1-\beta}} \left[ \sum_{i=1}^{n} \mu_A(x_i)^{\alpha} - \sum_{i=1}^{n} \mu_A(x_i)^{\beta} \right],$$
where $\alpha \neq \beta$.
With this generalized entropy, the authors used the difference of two averages. Fan and Ma proposed a fuzzy entropy in [11] based on the generalized exponential entropy of a probability distribution described in [34]. It is
$$H_{\alpha}^{\beta}(A) = \left[ \sum_{i=1}^{n} \left( \mu_A^{\alpha}(x_i)\,(1-\mu_A(x_i))^{\alpha} - \mu_A^{\beta}(x_i)\,(1-\mu_A(x_i))^{\beta} \right) \right]^{\frac{1}{\beta - \alpha}},$$
where $\alpha \neq \beta$.
Hooda in [10] proposed the following measure of fuzzy entropy based on the article by Sharma and Mittal [9]:
$$H_{\alpha}^{\beta}(A) = \frac{1}{1-\beta}\left[\left(\sum_{i=1}^{n}\left(\mu_A^{\alpha}(x_i) + (1-\mu_A(x_i))^{\alpha}\right)\right)^{\frac{\beta - 1}{\alpha - 1}} - 1\right],$$
where $\alpha \neq \beta$, $\alpha, \beta > 0$, and $\alpha \neq 1$.
One of the more recent propositions of an exponential fuzzy entropy of order (α, β) is given by Joshi and Kumar [13] as
$$H_{\alpha}^{\beta}(A) = \frac{1}{n\left(e^{1-0.5^{\alpha}} - e^{1-0.5^{\beta}}\right)} \sum_{i=1}^{n}\left[\left(\mu_A(x_i)\,e^{1-\mu_A^{\alpha}(x_i)} + (1-\mu_A(x_i))\,e^{1-(1-\mu_A(x_i))^{\alpha}}\right) - \left(\mu_A(x_i)\,e^{1-\mu_A^{\beta}(x_i)} + (1-\mu_A(x_i))\,e^{1-(1-\mu_A(x_i))^{\beta}}\right)\right],$$
where either $\alpha > 1$ and $0 < \beta < 1$, or $0 < \alpha < 1$ and $\beta > 1$.
This measure generalizes the Verma and Sharma entropy [12], Pal and Pal exponential entropy [4], and De Luca and Termini logarithmic entropy [3].
Atanassov introduced the intuitionistic fuzzy set as an extension of the fuzzy set by adding a non-membership function [15]. An intuitionistic fuzzy set A in X is defined as
$$A = \{ \langle x, \mu_A(x), \nu_A(x) \rangle \mid x \in X \},$$
where
$$\mu_A(x): X \to [0, 1] \quad \text{and} \quad \nu_A(x): X \to [0, 1].$$
Here, $\mu_A(x)$ is the membership function, $\nu_A(x)$ is the non-membership function, and $0 \leq \mu_A(x) + \nu_A(x) \leq 1$.
For each of Atanassov's intuitionistic fuzzy sets A, there is a so-called hesitation degree $\pi_A(x)$, defined as
$$\pi_A(x) = 1 - \mu_A(x) - \nu_A(x).$$
When $\pi_A(x) = 0$, we obtain an ordinary fuzzy set, defined as
$$A = \{ \langle x, \mu_A(x), 1 - \mu_A(x) \rangle \mid x \in X \}.$$
Next, we shall summarize the general concept of entropy.

2.5. The General Concept of Entropy

From the mean operators, we can generate the entropy measures, while the mean can be generated by a generator function.
If $f(x)$ is a logarithmic function, then $F(\mathbf{x})$ is a product of the $x_i$. The crucial property of the entropy function is that it attains its maximum value at $x_i = \frac{1}{n}$, i.e., when all the probability values are equal. We will show that the generalized entropy always satisfies this requirement.
Based on the generalized mean [35]
$$F(\mathbf{x}) = f^{-1}\left(\sum_{i=1}^{n} w_i f(x_i)\right), \qquad \sum_{i=1}^{n} w_i = 1,$$
if we define $w_i = x_i$, we obtain the entropy function
$$F(\mathbf{x}) = f^{-1}\left(\sum_{i=1}^{n} x_i f(x_i)\right), \qquad x_i \in [0, 1],$$
where f is the generator function of the entropy, and it is a strictly monotonic and concave or convex function on [ 0 , 1 ] . This generator function of the entropy is normally used in fuzzy theory as a generator function of the operator [33].
In the case of the entropy function, the normalization constant has the value of n.
Theorem 2.
The maximum value of $F(\mathbf{x})$ under the constraint $\sum_{i=1}^{n} x_i = 1$ is obtained when all $x_i = \frac{1}{n}$:
$$1 = F\left(\tfrac{1}{n}, \ldots, \tfrac{1}{n}\right) \geq n\, f^{-1}\left(\sum_{i=1}^{n} x_i f(x_i)\right),$$
with equality if $x_i = \tfrac{1}{n}$, $i = 1, \ldots, n$.
Note that $1 \geq F(\mathbf{x})$ holds only when $f(x) \neq a + \frac{b}{x}$.
Proof. 
It is trivial to show that $F\left(\tfrac{1}{n}, \ldots, \tfrac{1}{n}\right) = 1$.
Let us show that $F(\mathbf{x})$ takes its maximum value at $x_i = \frac{1}{n}$ for all $i$.
The sum $\sum_{i=1}^{n} x_i f(x_i)$ attains its extremum at $x_i = \frac{1}{n}$ under the constraint $1 - \sum_{i=1}^{n} x_i = 0$.
Using a Lagrange multiplier, we set
$$\frac{\partial}{\partial x_i}\left[\sum_{i=1}^{n} x_i f(x_i) + \lambda\left(1 - \sum_{i=1}^{n} x_i\right)\right] = 0,$$
which gives
$$G(x_i) = x_i f'(x_i) + f(x_i) = \lambda.$$
If $f(x) = a + \frac{b}{x}$, then
$$G(x_i) = x_i\left(-\frac{b}{x_i^2}\right) + a + \frac{b}{x_i} = a.$$
In Equation (32), $\lambda$ is a constant, so $G(x_i) = G(x_j)$ for all $i, j$; this forces $x_i = x_j$ unless $G$ is constant, and $x f'(x) + f(x)$ is constant exactly when $f(x) = a + \frac{b}{x}$.
Because all $x_i$ have to be equal in order to satisfy $\sum_{i=1}^{n} x_i = 1$, we get $x_i = \frac{1}{n}$, $i = 1, \ldots, n$. □
Remark 2.
Suppose we take the Dombi generator $f(x) = \left(\frac{1-x}{x}\right)^{\alpha}$. If $\alpha = 1$, then $f(x) = \frac{1}{x} - 1$ has the excluded form $a + \frac{b}{x}$, so we do not obtain an entropy function. However, for $\alpha \neq 1$, we obtain an entropy function.

3. Entropy Measures Based on the Generalized Dombi Operator

With different composite equations, we can obtain different operators. Here, we examine the solutions of the associativity and bisymmetric equations when different functions are used.
Let us first consider the associativity equation solution with the function $f(x) = \ln(x)$. The solution is the fuzzy product operator
$$c(x, y) = f^{-1}(f(x) + f(y)) = e^{\ln x + \ln y} = e^{\ln(xy)} = xy.$$
If we define $f(x) = \left(\frac{1-x}{x}\right)^{\alpha}$, using the associativity equation solution, we obtain
$$c(x, y) = \frac{1}{1 + \left[\left(\frac{1-x}{x}\right)^{\alpha} + \left(\frac{1-y}{y}\right)^{\alpha}\right]^{\frac{1}{\alpha}}}.$$

3.1. Continuous Valued Logic and Measure of the Uncertainty

A strictly monotone increasing operator in fuzzy theory has the form
$$a(\mathbf{w}, \mathbf{x}) = f^{-1}\left(\sum_{i=1}^{n} w_i f(x_i)\right),$$
where $f: [0, 1] \to [0, \infty]$ is the generator function and $1 \geq w_i \geq 0$. If $w_i = 1$ for all $i$, then the operator is associative, while for $\sum_{i=1}^{n} w_i = 1$, the operator is bisymmetric.
By comparing Equations (28) and (36), we can say that every logical system generated by such f has its own entropy measure.
For $f(x) = \ln x$, we obtain the Shannon entropy ($\ln(x)$ is the generator of the probabilistic (product) operator $C(x_1, \ldots, x_n) = \prod_{i=1}^{n} x_i$). There are a number of operators available in fuzzy logic theory, such as min–max, Hamacher, Einstein, product, Frank, Łukasiewicz, Aczél–Alsina, and Dombi. Here, we will focus on the Dombi operator family [24].

3.2. Dombi Operator Family

The generalized Dombi operator has two parameters, and in the case of conjunctive or disjunctive operators, only one parameter is needed.
With $f(x) = \left(\frac{1-x}{x}\right)^{\alpha}$, which is the generator of the Dombi operator, we obtain an entropy measure from the associativity equation solution in Equation (35):
$$E_p(\mathbf{x}) = \frac{1}{1 + \left[\sum_{i=1}^{n} x_i \left(\frac{1-x_i}{x_i}\right)^{\alpha}\right]^{\frac{1}{\alpha}}}.$$
When the Dombi generator is replaced with the generalized Dombi generator $f(x) = \ln\left(1 + \beta\left(\frac{1-x}{x}\right)^{\alpha}\right)$, we obtain the following measure:
$$D(\mathbf{x}) = \frac{1}{1 + \left[\frac{1}{\beta}\left(\prod_{i=1}^{n}\left(1 + \beta\left(\frac{1-x_i}{x_i}\right)^{\alpha}\right) - 1\right)\right]^{\frac{1}{\alpha}}}.$$
Transforming this measure using $x_i$ as a weight, we obtain an entropy measure:
$$E_D(\mathbf{x}) = \frac{1}{1 + \left[\frac{1}{\beta}\left(\prod_{i=1}^{n}\left(1 + \beta\left(\frac{1-x_i}{x_i}\right)^{\alpha}\right)^{x_i} - 1\right)\right]^{\frac{1}{\alpha}}}.$$
Figure 1 shows the behavior of the entropy for different α and β values.
Let us now show that the proposed entropy function satisfies the axioms mentioned in Section 2.4.
Theorem 3.
The fuzziness measure defined above has the following properties P1, P2, P3, and P4:
Proof. 
P1: Let $E_D(\mathbf{x}) = 0$; then
$$(1 - x_i)\, x_i = 0, \quad \forall x_i.$$
This will only hold if $x_i = 1$ or $x_i = 0$, i.e., $\mathbf{x}$ is a crisp set.
P2: In Theorem 2, we showed that any measure obtained from the generator function has a maximum for $n = 2$ when $x_i = \frac{1}{2}$. If $f(x) = \left(\frac{1-x}{x}\right)^{\alpha}$ and $\alpha \neq 1$, then for an intuitionistic fuzzy set the fuzziest set is the one with $\mu_A(x) = 0.5$ and $\nu_A(x) = 0.5$. In that case, the proposed measure corresponds to the case $n = 2$ and $A = \{\langle x, 0.5, 0.5 \rangle\}$ for all $x$.
P3: Since $E_D(\mathbf{x})$ is an increasing function of $x_i$ on $[0, 0.5)$ and a decreasing function of $x_i$ on $[0.5, 1]$,
$$x_i^{*} \leq x_i \;\Rightarrow\; E_D(\mathbf{x}^{*}) \leq E_D(\mathbf{x}) \quad \text{on } [0, 0.5),$$
and
$$x_i^{*} \geq x_i \;\Rightarrow\; E_D(\mathbf{x}^{*}) \leq E_D(\mathbf{x}) \quad \text{on } [0.5, 1].$$
Hence,
$$E_D(\mathbf{x}^{*}) \leq E_D(\mathbf{x}),$$
where $\mathbf{x}^{*}$ is a sharpened version of $\mathbf{x}$.
P4: It is evident from the definition that E D ( x ) = E D ( ¬ x ) . Hence E D ( x ) satisfies all the properties of fuzzy entropy. □
The generalized Dombi operator has several special cases. One special case is the Dombi operator, obtained for $\beta \to 0$. In this case, Equation (38) takes the form
$$D(\mathbf{x}) = \frac{1}{1 + \left[\sum_{i=1}^{n}\left(\frac{1-x_i}{x_i}\right)^{\alpha}\right]^{\frac{1}{\alpha}}}.$$
When $\beta = 1$ and $\alpha = \pm 1$, we obtain the product operator
$$D(\mathbf{x}) = 1 - \prod_{i=1}^{n}(1 - x_i).$$
The Dombi operator class also includes the Einstein and the Hamacher operators. The Einstein operator has the form
$$d_E(x, y) = \frac{x + y}{1 + xy}.$$
Namely, for $\beta = 2$ and $\alpha = \pm 1$ in Equation (38), the Einstein operator is just a particular case of the generalized Dombi operator, and the solution of the Einstein general law of velocity addition is
$$v = c\left(1 + \frac{2}{\prod_{i=1}^{n}\left(1 + \frac{2 v_i}{c - v_i}\right) - 1}\right)^{-1}.$$
The Hamacher operator
$$o_{\beta}^{(\alpha)}(\mathbf{x}) = \frac{1}{1 + \left[\frac{1}{\beta}\left(\prod_{i=1}^{n}\left(1 + \beta\left(\frac{1-x_i}{x_i}\right)^{\alpha}\right) - 1\right)\right]^{\frac{1}{\alpha}}}$$
is a particular case of the generalized Dombi operator for $\beta \in \langle 0, \infty \rangle$ and $\alpha = \pm 1$.
Table 1 summarizes the special cases of the generalized Dombi operator. A more detailed description of the special cases can be found in [24].
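To illustrate the special cases numerically (our sketch, using the product form of the generalized Dombi operator as reconstructed above), the snippet below evaluates the operator for two operands at β = 2, α = −1 and compares it with the Einstein operator (x + y)/(1 + xy):

```python
import numpy as np

def generalized_dombi(x, alpha, beta):
    """Generalized Dombi operator for a list of operands (product form, no weights)."""
    x = np.asarray(x, dtype=float)
    t = ((1.0 - x) / x) ** alpha
    inner = (np.prod(1.0 + beta * t) - 1.0) / beta
    return 1.0 / (1.0 + inner ** (1.0 / alpha))

def einstein(x, y):
    return (x + y) / (1.0 + x * y)

x, y = 0.3, 0.6
print(generalized_dombi([x, y], alpha=-1.0, beta=2.0))  # disjunctive case, alpha = -1
print(einstein(x, y))                                    # same value, ~0.7627
```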
Whichever type of continuous-valued operator we use, the operator system it generates produces its own entropy measure.
Remark 3.
The classical Shannon entropy function for $n = 2$ has the form
$$E(x) = -\left(x \log_2(x) + (1 - x)\log_2(1 - x)\right),$$
while an approximation of it by the proposed entropy function is
$$E_D^{(\alpha)}(x) = \frac{2}{1 + \left[\frac{1}{2}\left(\left(\frac{1-x}{x}\right)^{\alpha} + \left(\frac{x}{1-x}\right)^{\alpha}\right)\right]^{\frac{1}{\alpha}}}.$$
If $\alpha = 1$,
$$E_D^{(1)}(x) = 4x(1 - x).$$
We can see that
$$\max_x \left| E(x) - E_D^{(1)}(x) \right| \leq 0.15.$$
When $\alpha = \frac{6}{\pi^2}$,
$$\max_x \left| E(x) - E_D^{(\alpha)}(x) \right| \leq 0.025.$$
In Figure 2, we can see the differences between the Shannon entropy and the proposed entropy measure.

4. Examples and Numerical Results

Let us now present several numerical examples, together with an analysis and discussion of the above entropy measures and their comparison with commonly used entropy functions.

4.1. Application of the Proposed Entropy in the Multiple Attribute Decision-Making Problem

In many areas, including investment decision-making and project evaluation, multiple attribute decision-making (MADM) approaches are extensively used. They involve obtaining decision information, compiling that information in a specific manner, evaluating the options, and choosing the best alternative. Numerous MADM techniques can be found in [36,37,38,39]. As a first example, we recreated the experiment conducted in [13] with multiple attribute decision-making, which means making decisions in the presence of multiple, usually conflicting, criteria. The model proposed in [13] is as follows:
  • Let $Z = (\mu_P(z_i, a_j))_{n \times m} = (\mu_{ij})_{n \times m}$ be the fuzzy decision matrix.
  • The weight of the attribute $a_j$, $j = 1, 2, \ldots, m$, is determined by
    $${}_{\alpha}^{\beta}\upsilon_j = \frac{1 - E_{\alpha}^{\beta}(a_j)}{m - \sum_{j=1}^{m} E_{\alpha}^{\beta}(a_j)},$$
    where $E_{\alpha}^{\beta}$ is a fuzzy entropy measure.
  • The scores are calculated as
    $$S_{\alpha}^{\beta}(z_i) = \sum_{j=1}^{m} \mu_P(z_i, a_j) \times {}_{\alpha}^{\beta}\upsilon_j, \quad i = 1, 2, \ldots, n.$$
An alternative with the highest score is considered the best.
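A minimal sketch (ours) of this weighting-and-scoring pipeline is given below; the fuzzy entropy $E_{\alpha}^{\beta}$ is passed in as a function, and the normalized De Luca–Termini entropy is used here only as a placeholder, not as the paper's proposed measure. The decision matrix is the one from the buyer example that follows.

```python
import numpy as np

def deluca_termini(mu):
    """Normalized De Luca-Termini entropy, used here only as a stand-in for E_alpha^beta."""
    m = np.clip(np.asarray(mu, float), 1e-12, 1 - 1e-12)
    return float(np.mean(-(m * np.log(m) + (1 - m) * np.log(1 - m))) / np.log(2.0))

def rank_alternatives(Z, entropy_fn):
    """Entropy-based weights per attribute (column), then weighted scores per alternative (row)."""
    E = np.array([entropy_fn(Z[:, j]) for j in range(Z.shape[1])])
    weights = (1.0 - E) / (Z.shape[1] - E.sum())
    scores = Z @ weights                      # scores as in Equation (55)
    return weights, scores

Z = np.array([[0.7, 0.5, 0.6, 0.6],
              [0.7, 0.5, 0.7, 0.4],
              [0.6, 0.5, 0.5, 0.6],
              [0.8, 0.6, 0.3, 0.6],
              [0.6, 0.4, 0.7, 0.5]])

weights, scores = rank_alternatives(Z, deluca_termini)
print(weights, scores, np.argsort(-scores) + 1)   # ranking of z_1 ... z_5
```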
Let us now consider the example of a buyer as in [13], with five alternatives being taken into consideration, and four attributes used to rank the apartments: price ( a 1 ), locality ( a 2 ), design ( a 3 ), and safety ( a 4 ). Let the characteristics of the alternatives z i ( i = 1 , , 5 ) be represented by a fuzzy decision matrix Z = ( μ P ( z i , a j ) )
$$Z = \begin{array}{c|cccc}
 & a_1 & a_2 & a_3 & a_4 \\
\hline
z_1 & 0.7 & 0.5 & 0.6 & 0.6 \\
z_2 & 0.7 & 0.5 & 0.7 & 0.4 \\
z_3 & 0.6 & 0.5 & 0.5 & 0.6 \\
z_4 & 0.8 & 0.6 & 0.3 & 0.6 \\
z_5 & 0.6 & 0.4 & 0.7 & 0.5
\end{array}$$
For this example, we will take α = 2 and β = 19 .
Using the model and the measure given in Equation (39), we obtain $\sum_{j=1}^{4} E_{\alpha}^{\beta}(a_j) = 0.6036$. The other values are shown in Table 2.
By calculating Equation (55), we get the scores $S_{\alpha}^{\beta}(z_1) = 0.5961$, $S_{\alpha}^{\beta}(z_2) = 0.5688$, $S_{\alpha}^{\beta}(z_3) = 0.5487$, $S_{\alpha}^{\beta}(z_4) = 0.5707$, and $S_{\alpha}^{\beta}(z_5) = 0.5458$. We have also calculated the scores for other values of α and β and listed them in Table 3.
Based on the results obtained, the rankings of the alternatives $z_i$ ($i = 1, \ldots, 5$) are:
  • For $\alpha = 2$, $\beta = 19$: $z_1 \succ z_4 \succ z_2 \succ z_3 \succ z_5$.
  • For $\alpha = 1$, $\beta = 1$: $z_1 \succ z_4 \succ z_2 \succ z_3 \succ z_5$.
  • For $\alpha = 2$, $\beta = 0.8$: $z_1 \succ z_2 \succ z_4 \succ z_3 \succ z_5$.
  • For $\alpha = 50$, $\beta = 0.8$: $z_1 \succ z_2 \succ z_4 \succ z_5 \succ z_3$.
  • For $\alpha = 2$, $\beta = 50$: $z_1 \succ z_4 \succ z_2 \succ z_3 \succ z_5$.
In each case, z 1 is the best choice. We can see that varying the values of parameters in the proposed measure does not change the choice of the best alternative.
A comparison to other measures: Now let us calculate other measures and compare the results for the newly proposed measure.
  • When we apply the entropy proposed by Sharma and Taneja, given in Equation (20), to the example (56), the score functions take the values (for $\alpha = 0.2$, $\beta = 0.8$)
    $S_{\alpha}^{\beta}(z_1) = 0.6078$; $S_{\alpha}^{\beta}(z_2) = 0.5851$; $S_{\alpha}^{\beta}(z_3) = 0.5534$; $S_{\alpha}^{\beta}(z_4) = 0.5855$; $S_{\alpha}^{\beta}(z_5) = 0.5571$,
    and this results in the ranking $z_1 \succ z_4 \succ z_2 \succ z_5 \succ z_3$.
  • When we apply the entropy proposed by Fan and Ma, given in Equation (21), to the example (56), the score functions take the values (for $\alpha = 0.5$, $\beta = 18$)
    $S_{\alpha}^{\beta}(z_1) = 0.5895$; $S_{\alpha}^{\beta}(z_2) = 0.5554$; $S_{\alpha}^{\beta}(z_3) = 0.5476$; $S_{\alpha}^{\beta}(z_4) = 0.5709$; $S_{\alpha}^{\beta}(z_5) = 0.5347$,
    and this results in the ranking $z_1 \succ z_4 \succ z_2 \succ z_3 \succ z_5$.
  • When we apply the entropy proposed by Hooda, given in Equation (22), to the example (56), the score functions take the values (for $\alpha = 0.3$, $\beta = 0.5$)
    $S_{\alpha}^{\beta}(z_1) = 0.5987$; $S_{\alpha}^{\beta}(z_2) = 0.5726$; $S_{\alpha}^{\beta}(z_3) = 0.5497$; $S_{\alpha}^{\beta}(z_4) = 0.5742$; $S_{\alpha}^{\beta}(z_5) = 0.5482$,
    and this results in the ranking $z_1 \succ z_4 \succ z_2 \succ z_3 \succ z_5$.
  • When we apply the entropy proposed by Joshi and Kumar, given in Equation (23), to the example (56), the score functions take the values (for $\alpha = 1.2$, $\beta = 1$)
    $S_{\alpha}^{\beta}(z_1) = 0.6400$; $S_{\alpha}^{\beta}(z_2) = 0.6533$; $S_{\alpha}^{\beta}(z_3) = 0.5575$; $S_{\alpha}^{\beta}(z_4) = 0.5819$; $S_{\alpha}^{\beta}(z_5) = 0.6133$,
    resulting in the ranking $z_2 \succ z_1 \succ z_5 \succ z_4 \succ z_3$.
We see that varying the parameter values in the proposed measure does not change the choice of the best alternative, but it does change the order of the remaining alternatives. The β value does not seem to influence the ordering of the alternatives in most cases; however, with larger α values, the priority of the alternatives changes according to the largest differences in the attribute values. In both cases where the α value was 50, alternative $z_2$ was given priority over alternative $z_4$ based on attribute $a_3$. The same happened with alternatives $z_3$ and $z_5$, where $z_5$ was favored because of attribute $a_3$. We will take a closer look at how the entropy value changes with the parameters in the next example.

4.2. A Comparison of the Proposed Measure with Classical Probability Entropies

In the next example, we applied the proposed entropy measures to two examples of data from the Household Finance and Consumption Survey, a joint project of the central banks and national statistical offices of the European Union (EU) [40]. The dataset provides detailed household-level statistics on various aspects of household balances and related economic and demographic variables. To test the entropy measures listed earlier, we chose two examples, namely the distribution of household size and educational attainment in four different EU member states and aggregate information obtained for the EU. The EU members selected were Belgium (BE), Germany (DE), Croatia (HR), and Hungary (HU).
Table 4 shows the percentage distribution of household sizes in the EU and four EU countries. For each country, a list of entropies was calculated for the given probability distribution. The entropy measure found by using the generalized Dombi operator is given in the table for different values of α and β , as well as the Shannon and Rényi entropies. This example clearly indicates the large difference in entropy values.
The above entropy measures were also calculated for the distribution of the educational level of the population in the EU and the four countries studied, as shown in Table 5. Figure 3 shows the values of the entropy measure with different values of α or β . The proposed entropy measure is more sensitive to the choice of α , as can be seen in Figure 3a. From Figure 3b, we see that the choice of β affects the measure most strongly when the α value is between −10 and 10.
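For reference, the Shannon and Rényi rows of Table 4 and Table 5 can be reproduced with the short sketch below (ours). The tabulated entropies appear to be normalized by log n; the Rényi order is not stated explicitly in the text, but α = 2 reproduces the tabulated values, so it is assumed here.

```python
import numpy as np

def shannon_norm(p):
    p = np.asarray(p, float)
    return float(-np.sum(p * np.log(p)) / np.log(len(p)))

def renyi_norm(p, alpha=2.0):      # alpha = 2 is our assumption, see lead-in
    p = np.asarray(p, float)
    return float(np.log(np.sum(p ** alpha)) / ((1 - alpha) * np.log(len(p))))

# Household-size distribution for the EU (Table 4), converted from percentages.
eu = np.array([34.6, 31.6, 15.4, 12.9, 5.5]) / 100.0
print(shannon_norm(eu))            # ~0.8966
print(renyi_norm(eu))              # ~0.8300
```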

4.3. Extracting Useful Information Content from Noisy Time-Frequency Distributions

Motivated by the differences in entropy measure values in the previous examples, we decided to implement the entropy measure based on the generalized Dombi operator in the state-of-the-art 2D Local Entropy Method (2DLEM) to extract useful content from noisy signal time-frequency distributions proposed in [28]. The flowchart of the method is shown in Figure 4.
The 2DLEM method is based on a local 2D entropy calculation, and the useful content is extracted from the entropy map. The entropy level in the time-frequency domain is significantly different when signal components (useful information content) are present compared to domain regions containing mainly noise. The original method used Rényi entropy for information extraction. This study modified the technique to utilize the proposed entropy measure based on the Dombi operator instead and compared the denoising results with those obtained using the 2DLEM with the Shannon and Rényi entropy.
The 2DLEM method consists of several steps. The method works with time-frequency distributions; more precisely, with the spectrogram (the squared magnitude of the short-time Fourier transform) of the noisy signal, as well as other quadratic time-frequency distributions from Cohen’s class [41,42]). The local 2D entropy is calculated by treating the time-frequency distribution as a probability density function. For each point in the distribution, the entropy is calculated for different window sizes. The entropy value for the optimal window size is determined using the relative intersection of confidence intervals (RICI) algorithm [43,44,45]. After the process is completed for each point in the distribution, we obtain the entropy map.
In the second step, the method again uses the RICI algorithm to extract useful content from the entropy map. The signal energy is calculated for different threshold values. Based on this calculation, the RICI algorithm extracts information from the entropy map by comparing the intersection of the confidence intervals of the signal energy for the particular threshold with the confidence intervals of the other thresholds.
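The following is a highly simplified sketch (ours) of the first step: a local entropy map computed over a time-frequency distribution with a fixed square window and a fixed threshold. In the actual 2DLEM method, both the window size and the threshold are selected adaptively by the RICI algorithm.

```python
import numpy as np

def local_entropy_map(tfd, half_win=4):
    """Local Shannon entropy of a time-frequency distribution, fixed (2*half_win+1)^2 window."""
    tfd = np.abs(np.asarray(tfd, dtype=float))
    rows, cols = tfd.shape
    emap = np.zeros_like(tfd)
    for i in range(rows):
        for j in range(cols):
            patch = tfd[max(0, i - half_win):i + half_win + 1,
                        max(0, j - half_win):j + half_win + 1]
            p = patch / (patch.sum() + 1e-12)    # treat the patch as a probability distribution
            p = p[p > 0]
            emap[i, j] = -np.sum(p * np.log2(p))
    return emap

# Toy example: low-level noise plus one strong "component" region.
rng = np.random.default_rng(0)
tfd = rng.random((64, 64)) * 0.1
tfd[20:30, 10:50] += 1.0
emap = local_entropy_map(tfd)
# Threshold the entropy map to separate the two entropy regimes
# (RICI-based threshold selection in the paper, a fixed median threshold here).
mask = (emap < np.median(emap)).astype(int)
```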
By thresholding the entropy map, we obtain a classification mask containing 0s and 1s, where 1 represents the useful content. In the calculation of the 2DLEM method with the proposed entropy, we selected different α and β values. The results of extracting useful information from noisy signals are given for two synthetic, nonstationary test signals. The first tested signal consists of three signal atoms, while the second is a multicomponent frequency-modulated signal. The time series of the signals are shown in Figure 5, while Figure 6 shows their time-frequency distributions.
The performance of the applied technique was analyzed for different signal-to-noise ratios (SNRs) ranging from −3 dB to 6 dB. The base model used for the evaluation was a noise-free signal spectrogram. Next, hard thresholding of 5% was performed to obtain a classification mask containing only ones and zeros, where ones are treated as useful content in the probability distribution. The ideal classification masks obtained this way for the two signals are shown in Figure 7a,b. The feasibility and performance of this method were studied in previous articles, but only the Rényi entropy was applied [28,46].
The efficiency of extracting useful information from the noisy time-frequency domain is measured using the accuracy and F1 score.
Accuracy is defined as
$$\text{accuracy} = \frac{TP + TN}{TP + TN + FP + FN},$$
where TP is true-positive, TN is true-negative, FP is false-positive, and FN is false-negative. Ideal mask extraction is shown in Figure 7. The masks are obtained by thresholding the entropy maps (Figure 8, Figure 9 and Figure 10). The result of subtracting the obtained classification masks (Figure 11, Figure 12 and Figure 13) from the ideal mask is used to calculate the metric values. Elements where the overlapping masks are equal count as TP or TN (overlapping ones give TP, and overlapping zeros give TN). If the subtraction result is 1, the obtained mask did not recognize the useful content, and the element is counted as an FN. If the subtraction result is −1, there was no useful content, but the element was wrongly "recognized" as useful content and is counted as an FP.
The F1 score takes into account both the precision and recall of the classification. It is the harmonic mean of these two scores, defined as
$$F1 = \frac{2 \times \text{precision} \times \text{recall}}{\text{precision} + \text{recall}},$$
where
$$\text{precision} = \frac{TP}{TP + FP} \quad \text{and} \quad \text{recall} = \frac{TP}{TP + FN}.$$
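A small sketch (ours) of this evaluation step, computing TP, TN, FP, and FN from the ideal and obtained binary masks exactly as described above, and then the accuracy and F1 score:

```python
import numpy as np

def classification_metrics(ideal_mask, obtained_mask):
    """Accuracy and F1 score from two binary masks (1 = useful content)."""
    ideal = np.asarray(ideal_mask, dtype=int)
    obtained = np.asarray(obtained_mask, dtype=int)
    diff = ideal - obtained
    tp = np.sum((ideal == 1) & (obtained == 1))
    tn = np.sum((ideal == 0) & (obtained == 0))
    fn = np.sum(diff == 1)        # useful content missed by the obtained mask
    fp = np.sum(diff == -1)       # noise wrongly marked as useful content
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1

ideal = np.array([[1, 1, 0, 0], [0, 1, 0, 0]])
obtained = np.array([[1, 0, 0, 1], [0, 1, 0, 0]])
print(classification_metrics(ideal, obtained))
```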
Table 6 shows the classification results for the first analyzed signal consisting of signal atoms. The proposed entropy measure based on the generalized Dombi operator improved the classification accuracy compared to the Shannon and Rényi entropies for all tested SNRs. The α and the β values for the results reported in Table 6 and Table 7 are 2 and 0.8, respectively. The proposed entropy outperforms the Rényi entropy by 0.0134 to 0.024 in accuracy and by 0.007 to 0.0134 in F1 score. The largest improvement in both accuracy and F1 score was observed for the highest tested noise level of SNR = −3 dB.
The results for the second tested signal are shown in Table 7. As in the previous case, the proposed entropy measure based on the generalized Dombi operator outperforms other entropy measures, while for this test signal, the Shannon entropy outperforms the Rényi entropy. Namely, for the SNR of −3 dB, the proposed entropy improves the accuracy by up to 0.0118 compared to the Shannon entropy and by up to 0.0667 compared to the Rényi entropy. F1 is improved by up to 0.0081 and 0.0439 compared to the Shannon entropy and the Rényi entropy, respectively. Moreover, the difference is significant at higher SNR values. The greatest difference was for SNR = 6, and the accuracy was improved by 0.042 and the F1 by 0.0256.
Figure 8 and Figure 9 show the entropy maps computed with the Shannon and the Rényi entropies, respectively, while Figure 10 shows the entropy map computed using the proposed entropy measure. These were then used as input for the next step of the 2DLEM method, which extracted the useful content in the form of a classification mask, as shown in Figure 11 for the Shannon measure, in Figure 12 for the Rényi measure, and in Figure 13 for the measure based on the generalized Dombi operator.
Inspecting Figure 11, Figure 12 and Figure 13 and comparing the quantitative results given in Table 6 and Table 7, we see that the proposed entropy measure based on the generalized Dombi operator significantly outperformed the Rényi entropy in extracting useful information content from the noisy signal in the time-frequency domain. By applying this measure, more of the useful content is preserved and less noise is included in the final result. Therefore, the novel technique produced a higher classification accuracy and F1 score for all SNRs. Results for different α and β values for the 2DLEM method in combination with the proposed entropy measure are reported in Table 8 and Table 9. There are slight differences when different values are used, but regardless of the values used, the proposed entropy measure still outperforms the Shannon and the Rényi entropy. The choice of values was motivated by Section 4.2, where we saw that the α parameter has a greater influence on the results and that the largest difference is found between −10 and 10 (as can be seen from Figure 3). This result clearly shows the potential of our entropy function based on the generalized Dombi operator to analyze the information content of nonstationary noisy data in the time-frequency domain, and its potential in similar areas where traditionally the Shannon or the Rényi entropy is applied.

5. Conclusions

In this paper, we established a connection between mean operators and entropies using transformation functions. We formulated a new method for defining an entropy measure for a logical system. It was shown that different operators can be obtained by using different composite equations. We demonstrated that each logical system generated by an operator has its own entropy measure. We concentrated on the family of Dombi operators and showed that a system generated by the Dombi operator has its own entropy measure. The new entropy measure was defined using a bisymmetric equation solution with a generalized Dombi operator. The introduced fuzzy entropy of order (α, β) was tested in the framework of fuzzy set theory, and it was shown to be consistent in multiple attribute decision-making. Next, the proposed entropy was applied in the framework of classical probability. It was also tested in the method previously based on the Rényi entropy to extract useful information from noisy signal time-frequency representations. Our numerical results confirmed that the proposed entropy outperformed both the Shannon and the Rényi entropies because it is more sensitive to small changes in the probability distributions. Namely, the proposed measure improved the classification metrics (accuracy and F1) for all tested nonstationary signals and SNRs.
As a future research direction, it would be worth looking at additional applications of the proposed measure in fuzzy theory, intuitionistic fuzzy theory, and soft fuzzy set theory. The proposed entropy measure could be extended to cover intuitionistic fuzzy and interval-valued intuitionistic fuzzy sets. It would also be interesting to study the measure’s behavior in nonlinear dynamical systems. Note that in [47], a comparison was performed between fuzzy entropies and entropies commonly used in time series analysis, such as the approximate entropy and the sample entropy, and the results tell us that systems with fuzzy structures have better performance scores. With this in mind, it would be interesting to see whether the proposed measure could be modified for applications with time series, and compared with the family of entropy measures used in such systems. This is intended for future work.

Author Contributions

Conceptualization, J.D. and A.V.L.; data curation, A.V.L. and J.L.; formal analysis, J.D. and A.V.L.; funding acquisition, J.L.; investigation, J.D.; methodology, J.D. and A.V.L.; project administration, J.L.; resources, A.V.L.; software, A.V.L.; supervision, J.D. and J.L.; validation, J.D., A.V.L. and J.L.; visualization, A.V.L.; writing—original draft, J.D. and A.V.L.; writing—review and editing, A.V.L. and J.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Croatian Science Foundation under the project IP-2018-01-3739, IRI2 project “ABsistemDCiCloud” (KK.01.2.1.02.0179), and the University of Rijeka under the projects uniri-tehnic-18-17 and uniri-tehnic-18-15.

Data Availability Statement

The data presented in this study is available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  2. Zadeh, L. Probability measures of Fuzzy events. J. Math. Anal. Appl. 1968, 23, 421–427. [Google Scholar] [CrossRef] [Green Version]
  3. De Luca, A.; Termini, S. A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory. Inf. Control 1972, 20, 301–312. [Google Scholar] [CrossRef] [Green Version]
  4. Pal, N.; Pal, S. Object-background segmentation using new definitions of entropy. IEE Proc. E-Comput. Digit. Tech. 1989, 136, 284–295. [Google Scholar] [CrossRef] [Green Version]
  5. Kapur, J.N. Measures of Fuzzy Information; Mathematical Science Trust Society: New Delhi, India, 1997. [Google Scholar]
  6. Sharma, B.; Taneja, I. Entropy of type (α,β) and other generalized measures in information theory. Metrika 1975, 22, 205–215. [Google Scholar] [CrossRef]
  7. Chaundy, T.W.; McLeod, J.B. On a Functional Equation. Math. Notes 1960, 43, 7–8. [Google Scholar] [CrossRef] [Green Version]
  8. Fan, J.; Xie, W. Distance measure and induced fuzzy entropy. Fuzzy Sets Syst. 1999, 104, 305–314. [Google Scholar] [CrossRef]
  9. Sharma, B.D.; Mittal, D.P. New non-additive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28–40. [Google Scholar]
  10. Hooda, D.S. On generalized measures of fuzzy entropy. Math. Slovaca 2004, 54, 315–325. [Google Scholar]
  11. Fan, J.; Ma, Y. Some new fuzzy entropy formulas. Fuzzy Sets Syst. 2002, 128, 277–284. [Google Scholar] [CrossRef]
  12. Verma, R.S.; Sharma, B. On Generalized Exponential Fuzzy Entropy. World Acad. Sci. Eng. Technol. 2011, 5, 1895–1898. [Google Scholar] [CrossRef]
  13. Joshi, R.; Kumar, S.A. New Exponential Fuzzy Entropy of Order-(α,β) and its Application in Multiple Attribute Decision-Making Problems. Commun. Math. Stat. 2017, 5, 213–229. [Google Scholar] [CrossRef]
  14. Tian, D.; Yang, Z. An exponential entropy on intuitionistic fuzzy sets. In Proceedings of the 2015 International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR), Guangzhou, China, 12–15 July 2015; pp. 198–202. [Google Scholar] [CrossRef]
  15. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  16. Xuan Thao, N. Some new entropies and divergence measures of intuitionistic fuzzy sets based on Archimedean t-conorm and application in supplier selection. Soft Comput. 2021, 25, 5791–5805. [Google Scholar] [CrossRef]
  17. Zhu, Y.; Li, D. A new definition and formula of entropy for intuitionistic fuzzy sets. J. Intell. Fuzzy Syst. 2016, 30, 3057–3066. [Google Scholar] [CrossRef]
  18. Verma, R.; Sharma, B. Exponential entropy on intuitionistic fuzzy sets. Kybernetika Praha 2013, 49, 114–127. [Google Scholar]
  19. Ye, J. Two effective measures of intuitionistic fuzzy entropy. Computing 2010, 87, 55–62. [Google Scholar] [CrossRef]
  20. Singh, S.; Vaishno, S.; Sharma, S. On Generalized Fuzzy Entropy and Fuzzy Divergence Measure with Applications. Int. J. Fuzzy Syst. Appl. 2019, 8, 67–69. [Google Scholar] [CrossRef]
  21. Aggarwal, M. Bridging the Gap Between Probabilistic and Fuzzy Entropy. IEEE Trans. Fuzzy Syst. 2020, 28, 2175–2184. [Google Scholar] [CrossRef]
  22. Zhang, Q.; Chen, Y.; Yang, J.; Wang, G. Fuzzy Entropy: A More Comprehensible Perspective for Interval Shadowed Sets of Fuzzy Sets. IEEE Trans. Fuzzy Syst. 2020, 28, 3008–3022. [Google Scholar] [CrossRef]
  23. Mishra, A.; Hooda, D.S.; Jain, D. On Exponential Fuzzy Measures of Information and Discrimination. Int. J. Comput. Appl. 2015, 119, 1–7. [Google Scholar] [CrossRef]
  24. Dombi, J. The Generalized Dombi operator family and the multiplicative utility function. In Soft Computing Based Modeling in Intelligent Systems; Springer: Berlin, Germany, 2009; pp. 115–131. [Google Scholar] [CrossRef]
  25. Hild, K.E., II; Erdogmus, D.; Principe, J.C. An analysis of entropy estimators for blind source separation. Signal Process. 2006, 86, 182–194. [Google Scholar] [CrossRef] [Green Version]
  26. Erdogmus, D.; Hild, K.E., II; Principe, J.C. Blind source separation using Renyi’s α-marginal entropies. Neurocomputing 2002, 49, 25–38. [Google Scholar] [CrossRef] [Green Version]
  27. Saulig, N.; Milanovic, Z.; Ioana, C. A Local Entropy-Based Algorithm for Information Content Extraction from Time-frequency Distributions of Noisy Signals. Digit. Signal Process. 2017, 70, 155–165. [Google Scholar] [CrossRef]
  28. Vrankovic, A.; Lerga, J.; Saulig, N. A novel approach to extracting useful information from noisy TFDs using 2D local entropy measures. EURASIP J. Adv. Signal Process. 2020, 2020, 1–19. [Google Scholar] [CrossRef]
  29. Aczel, J.; Daroczy, Z. On Measure of Information and Their Characterizations; Dover: Downers Grove, IL, USA, 1975. [Google Scholar]
  30. Rényi, A. On Measures of Entropy and Information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics; University of California Press: Berkeley, CA, USA, 1961; pp. 547–561. [Google Scholar]
  31. Fink, A.M.; Jodeit, M. A generalization of the arithmetic-geometric means inequality. Proc. Am. Math. Soc. 1976, 61, 255–261. [Google Scholar] [CrossRef]
  32. Valverde-Albacete, F.J.; Peláez-Moreno, C. The Case for Shifting the Rényi Entropy. Entropy 2019, 21, 46. [Google Scholar] [CrossRef] [Green Version]
  33. Aczel, J. Lectures on Functional Equations and Their Applications; Academic Press: Cambridge, MA, USA, 1966. [Google Scholar]
  34. Koski, T.; Persson, L. Some properties of generalized exponential entropies with applications to data compression. Inf. Sci. 1992, 62, 103–132. [Google Scholar] [CrossRef]
  35. Landsberg, P. A generalized mean. J. Math. Anal. Appl. 1980, 76, 209–212. [Google Scholar] [CrossRef] [Green Version]
  36. Torun, H. Group decision making with intuitionistic fuzzy preference relations. Knowl.-Based Syst. 2014, 70, 33–43. [Google Scholar] [CrossRef]
  37. Wan, S.P.; Dong, J. Decision Making Theories and Methods Based on Interval-Valued Intuitionistic Fuzzy Sets; Springer Nature: Berlin, Germany, 2020. [Google Scholar] [CrossRef]
  38. Jiang, H.; Hu, B. A novel three-way group investment decision model under intuitionistic fuzzy multi-attribute group decision-making environment. Inf. Sci. 2021, 569, 557–581. [Google Scholar] [CrossRef]
  39. Liu, P.; Wang, Y.; Jia, F.; Fujita, H. A multiple attribute decision making three-way model for intuitionistic fuzzy numbers. Int. J. Approx. Reason. 2020, 119, 177–203. [Google Scholar] [CrossRef]
  40. Lamarche, P. Estimating Consumption in the HFCS: Experimental Results on the First Wave of the HFCS; Statistics Paper Series 22; European Central Bank: Frankfurt, Germany, 2017. [Google Scholar]
  41. Boashash, B. Time-Frequency Signal Analysis and Processing: A Comprehensive Reference; Elsevier Academic Press: Canberra, Australia, 2016. [Google Scholar]
  42. Cohen, L. Time-frequency distributions—A review. Proc. IEEE 1989, 77, 941–981. [Google Scholar] [CrossRef] [Green Version]
  43. Lerga, J.; Vrankic, M.; Sucic, V. A Signal Denoising Method Based on the Improved ICI Rule. IEEE Signal Process. Lett. 2008, 15, 601–604. [Google Scholar] [CrossRef]
  44. Lerga, J.; Sucic, V.; Sersic, D. Performance analysis of the LPA-RICI denoising method. In Proceedings of the ISPA 2009—Proceedings of the 6th International Symposium on Image and Signal Processing and Analysis, Salzburg, Austria, 16–18 September 2009; pp. 28–33. [Google Scholar] [CrossRef]
  45. Segon, G.; Lerga, J.; Sucic, V. Improved LPA-ICI-Based Estimators Embedded in a Signal Denoising Virtual Instrument. Signal Image Video Process. 2017, 11, 211–218. [Google Scholar] [CrossRef]
  46. Vranković, A.; Ipšić, I.; Lerga, J. Entropy-Based Extraction of Useful Content from Spectrograms of Noisy Speech Signals. In Proceedings of the 2021 International Symposium ELMAR, Zagreb, Croatia, 13–15 September 2021; pp. 83–86. [Google Scholar] [CrossRef]
  47. Cao, Z.; Lin, C. Inherent Fuzzy Entropy for the Improvement of EEG Complexity Evaluation. IEEE Trans. Fuzzy Syst. 2018, 26, 1032–1035. [Google Scholar] [CrossRef]
Figure 1. Proposed entropy measure for different α and β values for n = 2. (a) Entropy measure values for α = 1, and β = 1, β = 0.8, β = 0.6, β = 0.4. (b) Entropy measure values for β = 0.9, and α = 1, α = 2, α = 3, α = 4.
Figure 2. Comparison of the proposed entropy measure and the Shannon entropy. (a) Values of the proposed entropy measure and the Shannon entropy for α = 1 and β = 1. (b) Difference between values of the proposed entropy measure and the Shannon entropy for α = 6/π², and β = 1.
Figure 3. The proposed entropy based on the Dombi operator for different values of the α and β parameters. (a) Values of the proposed entropy measure for α = 1, α = 2, α = 2, and α = 50 with β ranging from 0.1 to 1. (b) Values of the proposed entropy measure for β = 1, β = 0.8, β = 4, β = 8, and β = 50 with α ranging from −10 to 70.
Figure 4. Flowchart of the 2DLEM method.
Figure 5. Time domain plots of the noisy signals. (a) The first signal. (b) The second signal.
Figure 6. Time-frequency distributions of two signals for SNR = 0 dB. (a) Time-frequency distribution of the first signal. (b) Time-frequency distribution of the second signal.
Figure 7. Ideal useful content extraction for two signals. (a) Useful information content for the first signal. (b) Useful information content for the second signal.
Figure 8. Entropy maps obtained by applying the 2DLEM method with the Shannon entropy for SNR = 0 dB. (a) Entropy map for the first signal. (b) Entropy map for the second signal.
Figure 9. Entropy maps obtained by applying the 2DLEM method with the Rényi measure for SNR = 0 dB. (a) Entropy map for the first signal. (b) Entropy map for the second signal.
Figure 10. Entropy maps obtained by applying the 2DLEM method with the proposed entropy based on the generalized Dombi operator for SNR = 0 dB. (a) Entropy map for the first signal. (b) Entropy map for the second signal.
Figure 11. Extracted useful information content obtained by applying the 2DLEM method with the Shannon entropy for SNR = 0 dB. (a) Classification mask for the first signal. (b) Classification mask for the second signal.
Figure 12. Extracted useful information content obtained by applying the 2DLEM method with the Rényi entropy for SNR = 0 dB. (a) Classification mask for the first signal. (b) Classification mask for the second signal.
Figure 13. Extracted useful information content obtained by applying the 2DLEM method with the proposed measure based on the generalized Dombi operator for SNR = 0 dB. (a) Classification mask for the first signal. (b) Classification mask for the second signal.
Table 1. Special cases of the generalized Dombi operator.
Type of Operator | Value of β | Value of α (Conjunction) | Value of α (Disjunction)
Dombi | 0 | 0 < α | 0 < α
Product | 1 | 1 | −1
Einstein | 2 | 1 | −1
Hamacher | ⟨0, ∞⟩ | 1 | −1
Drastic | — | 0 < α | 0 < α
Min-max | 0 | — | —
Table 2. Values of E α β ( Z ) and α β υ for α = 2 and β = 19 .
 | a1 | a2 | a3 | a4
E_α^β | 0.2323 | 0.1000 | 0.1568 | 0.1146
υ | 0.2260 | 0.2650 | 0.2483 | 0.2607
Table 3. Values of S α β ( Z ) and α β υ for different α , and β .
 | α = 1, β = 1 | α = 2, β = 0.8 | α = 50, β = 0.8 | α = 2, β = 50
S_α^β(z1) | 0.5970 | 0.5940 | 0.5976 | 0.5985
S_α^β(z2) | 0.5705 | 0.5675 | 0.5733 | 0.5724
S_α^β(z3) | 0.5489 | 0.5473 | 0.5482 | 0.5496
S_α^β(z4) | 0.5716 | 0.5660 | 0.5669 | 0.5737
S_α^β(z5) | 0.5468 | 0.5450 | 0.5502 | 0.5482
Table 4. Entropies for the distribution of household sizes in EU and four EU countries.
Household Size | EU | BE | DE | HR | HU
1 | 34.6 | 34.7 | 40.6 | 24.6 | 29.5
2 | 31.6 | 31.5 | 34.2 | 26.9 | 32.0
3 | 15.4 | 11.8 | 12.4 | 18.4 | 18.4
4 | 12.9 | 14.3 | 9.0 | 18.1 | 12.7
5+ | 5.5 | 7.7 | 3.8 | 12.0 | 7.4

Measure (α, β) | EU | BE | DE | HR | HU
Proposed measure (1, 1) | 0.8495 | 0.8577 | 0.7569 | 0.9635 | 0.8908
Proposed measure (2, 0.8) | 0.7151 | 0.7130 | 0.6293 | 0.8480 | 0.7595
Proposed measure (50, 0.8) | 0.6639 | 0.6626 | 0.5645 | 1 | 1
Proposed measure (2, 50) | 0.7365 | 0.7292 | 0.6557 | 0.8716 | 0.7820
Proposed measure (50, 50) | 0.6642 | 0.6624 | 0.5645 | 1 | 1
Shannon entropy | 0.8966 | 0.9065 | 0.8281 | 0.9436 | 0.9264
Rényi entropy | 0.8300 | 0.8371 | 0.7343 | 0.9791 | 0.8742
Table 5. Entropies for the distribution of education level in the EU and four EU countries.
Education Level | EU | BE | DE | HR | HU
Basic or no education | 30.4 | 22.8 | 10.1 | 24.6 | 23.7
Secondary | 40.7 | 31.0 | 56.7 | 59.3 | 51.1
Tertiary | 28.9 | 46.2 | 33.2 | 16.1 | 25.2

Measure (α, β) | EU | BE | DE | HR | HU
Proposed measure (1, 1) | 0.9784 | 0.9274 | 0.7415 | 0.7418 | 0.8777
Proposed measure (2, 0.8) | 0.8621 | 0.7629 | 0.5924 | 0.5492 | 0.6808
Proposed measure (2, 0.5) | 0.8620 | 0.7622 | 0.5884 | 0.5447 | 0.6790
Proposed measure (4, 0.8) | 0.8474 | 0.7387 | 0.5571 | 0.5186 | 0.6511
Proposed measure (4, 0.5) | 0.8474 | 0.7386 | 0.5550 | 0.5153 | 0.6505
Shannon entropy | 0.9891 | 0.9620 | 0.8368 | 0.8637 | 0.9390
Rényi entropy | 0.9663 | 0.8940 | 0.6893 | 0.6737 | 0.8263
Table 6. Classification results for the first signal.
SNR | Shannon Entropy | Rényi Entropy | Proposed Entropy
Accuracy
−3 dB | 0.929 | 0.954 | 0.978
0 dB | 0.942 | 0.964 | 0.9807
3 dB | 0.935 | 0.96 | 0.9821
6 dB | 0.949 | 0.972 | 0.9854
F1 score
−3 dB | 0.966 | 0.975 | 0.9884
0 dB | 0.9714 | 0.981 | 0.9897
3 dB | 0.968 | 0.979 | 0.9905
6 dB | 0.975 | 0.985 | 0.9923
Table 7. Classification results for the second signal.
SNR | Shannon Entropy | Rényi Entropy | Proposed Entropy
Accuracy
−3 dB | 0.8859 | 0.8310 | 0.8977
0 dB | 0.8785 | 0.8687 | 0.9346
3 dB | 0.8966 | 0.9135 | 0.9459
6 dB | 0.9331 | 0.9174 | 0.9751
F1 score
−3 dB | 0.9356 | 0.8998 | 0.9437
0 dB | 0.927 | 0.9213 | 0.9616
3 dB | 0.9376 | 0.9466 | 0.9686
6 dB | 0.9587 | 0.9495 | 0.9843
Table 8. Classification results for the first signal with proposed entropy measure and different parameter values.
SNR | α = 10, β = 2 | α = 1, β = 1 | α = 10, β = 0.5 | α = 0.5, β = 10
Accuracy
−3 dB | 0.9388 | 0.937 | 0.9523 | 0.9346
0 dB | 0.9610 | 0.9623 | 0.9625 | 0.9686
3 dB | 0.9736 | 0.9768 | 0.9757 | 0.9755
6 dB | 0.9728 | 0.9728 | 0.9798 | 0.9757
F1 score
−3 dB | 0.9668 | 0.9662 | 0.9744 | 0.9646
0 dB | 0.9786 | 0.9794 | 0.9796 | 0.9827
3 dB | 0.9854 | 0.9872 | 0.9866 | 0.9866
6 dB | 0.9847 | 0.9847 | 0.9888 | 0.9864
Table 9. Classification results for the second signal with the proposed entropy measure and different parameter values.
SNR | α = 10, β = 2 | α = 1, β = 1 | α = 10, β = 0.5 | α = 0.5, β = 10
Accuracy
−3 dB | 0.8922 | 0.8965 | 0.8866 | 0.8880
0 dB | 0.9172 | 0.9233 | 0.9297 | 0.9325
3 dB | 0.9552 | 0.9565 | 0.9602 | 0.956
6 dB | 0.9718 | 0.9689 | 0.9682 | 0.9674
F1 score
−3 dB | 0.9381 | 0.9349 | 0.9355 | 0.9421
0 dB | 0.9506 | 0.9558 | 0.9592 | 0.9611
3 dB | 0.9723 | 0.9744 | 0.9767 | 0.9744
6 dB | 0.9833 | 0.9814 | 0.9810 | 0.9805
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
