Novel Algorithm for Identifying and Fusing Conflicting Data in Wireless Sensor Networks

Interest in research on multi-sensor data fusion technology is continuously increasing. Because Dempster's rule of combination can be problematic when dealing with conflicting data, numerous issues make data fusion a challenging task, including the exponential explosion, Zadeh's paradox, and the one-vote veto. These issues can lead to a great difference between the fusion results and the real results. This paper applies the idea of analyzing evidence conflicts on the basis of distance, introduces the concept of vector space, and proposes a new cosine theorem-based method for identifying and expressing conflicting data. In addition, this paper proposes a new data fusion algorithm based on the degree of mutual support between beliefs, which builds on the Jousselme distance-based combination rule proposed by Deng et al. Simulation results demonstrate that the presented algorithm achieves great improvements in the accuracy of both identifying and fusing conflicting data.


Introduction
The Dempster-Shafer (DS) theory, also known as evidential theory, can be considered a generalization of Bayesian theory. Rather than using specific probabilities to describe uncertainty, the DS theory uses collections of hypotheses to construct the general framework of an uncertainty reasoning model. By establishing a correspondence between hypotheses and collections, the uncertainty of a hypothesis can be transformed into that of the corresponding collection. Under circumstances where a priori probabilities are hard to obtain, evidential theory can distinguish between the uncertain and the unknown, and thus it has more flexibility.
In the field of multi-sensor data fusion, the DS theory is a popular method to express and fuse the uncertainty information, and is especially suitable for decision level fusion. However, the appearance of conflicting evidence results in a series of problems when conducting fusion using the DS theory. Zadeh discovered that simply using Dempster's rule of combination to deal with highly conflicting data will produce some counter-intuitive results. This is known as Zadeh's Paradox [1]. In addition, there is another problem called the one-vote veto; namely, when the description of one piece of evidence is completely inconsistent with other pieces of evidence for the same hypothesis (in other words, the belief in the hypothesis is 0), whatever the belief in the other pieces of evidence for the hypothesis, the belief in the fusion results computed by Dempster's rule of combination will always be 0. The one-vote veto issue brings about the following problem: because of the conclusion of one piece of evidence, the belief in other pieces of evidence is invalid for the final fusion result. Because this problem is caused by conflicting evidence, there is a real need to identify and express highly conflicting data before using the DS theory.
This paper applies a distance-based idea to deal with evidence conflict and introduces the concept of vector space. On this basis, we propose a cosine theorem-based method that can effectively identify and explicitly express the conflicting data. On the basis of the above, we propose a new multi-sensor data fusion algorithm based on the degree of mutual support between beliefs, which can solve the data conflict.
The rest of this paper is organized as follows: in Section 2, we introduce the background and related work. The cosine theorem-based method for identifying and expressing conflicting evidence is described in Section 3. Section 4 provides the data fusion algorithm based on the degree of mutual support between beliefs on the basis of Section 3, and simulation verification is performed in Section 5. In Section 6, we present the concluding remarks for this paper, as well as the outlook of our future work.

Related Work
The DS theory uses Θ to denote a set of mutually exclusive and independent hypotheses about some problem domain. Thus, Θ is composed of all the possible answers to a question and is called the sample space or frame of discernment, which is indicated as Θ = {θ_1, θ_2, ..., θ_n}. We can define P(Θ) as the set of all possible combinations of its elements, i.e., the power set P(Θ) = 2^Θ. A function m: 2^Θ → [0, 1] satisfying m(∅) = 0 and Σ_{A⊆Θ} m(A) = 1 is called a basic probability assignment (BPA); the quantity m(A) represents the measure of belief that is committed exactly to hypothesis A, and A is called a focal element if m(A) > 0. Let m_1, m_2 be two basic probability assignments on discernment frame Θ. For A ⊆ Θ, A ≠ ∅, the combination of m_1 and m_2 is given by the following formula, which is called Dempster's rule of combination:

m(A) = (1 / (1 − K)) Σ_{B∩C=A} m_1(B) m_2(C)

where 1 − K is called the normalization constant, such that

K = Σ_{B∩C=∅} m_1(B) m_2(C)

However, conflicting data poses a challenge for the application of the DS theory. Although many researchers have studied data conflict in evidence theory, there has been no clear definition of it. Here, we introduce a simple and intuitive definition. Definition 1. (The notion of conflict): in DS evidence theory, the conflict between two beliefs can be qualitatively expressed as follows: one source of evidence strongly supports a hypothesis, another source of evidence strongly supports an alternative hypothesis, and the two hypotheses are incompatible.
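As a minimal, self-contained sketch of how Dempster's rule of combination can be implemented (the dictionary-based BPA representation and the function name `dempster_combine` are our own illustrative choices, not part of the original formulation):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs (dicts mapping frozenset -> mass) with Dempster's rule.

    Returns the normalized combined BPA and the conflicting coefficient K.
    """
    K = 0.0            # conflicting mass: sum over pairs with B ∩ C = ∅
    combined = {}
    for (B, mB), (C, mC) in product(m1.items(), m2.items()):
        inter = B & C
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mB * mC
        else:
            K += mB * mC
    if K >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined (K = 1)")
    # normalization by 1 - K redistributes the conflicting mass
    return {A: v / (1.0 - K) for A, v in combined.items()}, K
```

Running this on Zadeh's classic two-source setup (m_1(A) = 0.99, m_1(B) = 0.01 versus m_2(C) = 0.99, m_2(B) = 0.01) yields K = 0.9999 and assigns all fused belief to B, reproducing the counter-intuitive behavior described above.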
In fact, the degree of conflict between pieces of evidence can also be described in terms of the degree of mutual support. If the degree of conflict is small, the conclusions of the pieces of evidence can support each other; otherwise, they cannot. There have been several improvements and schemes for resolving conflicting-data problems when using Dempster's rule of combination. The existing research can be roughly classified into two directions: one improves the analysis, judgment, and expression of conflicting evidence, and the other improves the fusion algorithm for conflicting data.

Research on Analysis, Judgment, and Expression of Conflicting Evidence
As the fusion of conflicting data can be problematic when using Dempster's rule of combination, it is very important to reasonably and effectively judge the degree of data conflict. There have been various studies on this topic.

Conflicting Coefficient-Based Expression Methods
The earliest analyses of conflicting data were based on the quantity K computed by Dempster's rule of combination. As can be observed from the formula, K is the sum of the products of the BPAs of every pair of completely conflicting propositions B and C. It is called the conflicting coefficient, and its value can partly reflect the proportion of uncertain information caused by conflicting evidence. However, practical experience has shown that there is no direct relationship between the value of K and the conflicting relation of two pieces of evidence. Thus, it is unreasonable to measure the degree of conflict between pieces of evidence using the quantity K alone.

Distance-Based Conflict Expression Methods
Jousselme et al. introduced the notion of vector space into evidential theory to propose a conflict expression method based on the distance between pieces of evidence [2]. The authors proposed regarding every basic probability assignment as a multi-dimensional vector. Thus, the degree of conflict between different pieces of evidence can be measured by the distance between the corresponding vectors, with a greater distance indicating a higher degree of conflict. Conversely, a shorter distance indicates a lower degree of conflict and a better degree of mutual support between pieces of evidence.

Definition 2. (Jousselme distance): let Θ be the frame of discernment composed of N mutually exclusive and exhaustive hypotheses, and let m_1, m_2 be two basic probability assignments on Θ. The distance between the two pieces of evidence is

d(m_1, m_2) = sqrt( (1/2) (m_1 − m_2)^T D (m_1 − m_2) )

Here, m_1 and m_2 are the vector representations of the two assignments, whose dimensionality is 2^N. Meanwhile, D is a 2^N × 2^N matrix whose elements are calculated as

D(A, B) = |A ∩ B| / |A ∪ B|,  A, B ⊆ Θ

The introduction of matrix D has the advantage of taking the similarity between basic probability assignments into consideration; hence, it avoids the disadvantage of the Euclidean distance. However, the complexity of the algorithm is very high because of the great size of matrix D.
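A brute-force sketch of the Jousselme distance, assuming BPAs are stored as dictionaries from frozensets to masses (the exhaustive power-set enumeration below makes the 2^N complexity noted above explicit; names are illustrative):

```python
import math
from itertools import combinations

def powerset(theta):
    """All 2^N subsets of the frame, as frozensets, in a fixed order."""
    s = sorted(theta)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def jousselme_distance(m1, m2, theta):
    subsets = powerset(theta)                 # 2^N focal-element coordinates
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in subsets]
    # D(A, B) = |A ∩ B| / |A ∪ B|; the term vanishes when both sets are empty
    acc = 0.0
    for i, Ai in enumerate(subsets):
        for j, Aj in enumerate(subsets):
            union = len(Ai | Aj)
            if union:
                acc += diff[i] * diff[j] * len(Ai & Aj) / union
    return math.sqrt(0.5 * acc)
```

For identical assignments the distance is 0, and for two BPAs fully committed to disjoint singletons it is 1, matching the intuition that distance grows with conflict.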

Pignistic Probability-Based Conflict Expression Methods
Some researchers have started from the results of the basic probability assignments and proposed Pignistic probability-based conflict expression methods.

Definition 3.
(Pignistic probability function) [3]: suppose m is the basic probability assignment on sample space Θ, and A is a subset of Θ. Then, for each element θ of Θ, the Pignistic probability function BetP is defined as follows:

BetP(θ) = Σ_{A⊆Θ, θ∈A} m(A) / |A|

Here, |A| indicates the number of elements in A. The Pignistic probability of a subset A is

BetP(A) = Σ_{θ∈A} BetP(θ)

The transformation from basic probability assignment m to the Pignistic probability is called the Pignistic probability transformation. The Pignistic probability can effectively reflect the belief assigned to a given hypothesis, because it takes into consideration the influence on the probability of the inclusion relations between hypotheses.
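The Pignistic transformation above can be sketched in a few lines, assuming the closed-world condition m(∅) = 0 (the dictionary-based representation and the function name `pignistic` are our own):

```python
def pignistic(m, theta):
    """BetP(θ) = Σ_{A ∋ θ} m(A) / |A| for a BPA m (dict: frozenset -> mass).

    Each focal element spreads its mass evenly over its member hypotheses.
    """
    betp = {t: 0.0 for t in theta}
    for A, mass in m.items():
        if not A:                 # ignore the empty set (closed world)
            continue
        share = mass / len(A)
        for t in A:
            betp[t] += share
    return betp
```

For instance, with m({A, B}) = 0.6 and m({A}) = 0.4, the transformation gives BetP(A) = 0.7 and BetP(B) = 0.3, illustrating how mass on a compound hypothesis is split among its members.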

Gambling Credibility Distance-Based Conflict Expression Methods
On the basis of Pignistic probability-based conflict expression methods, Liu proposed using the gambling credibility distance to express the degree of conflict [4]. The gambling credibility distance accords with the original definition of evidence conflict: it judges the degree of conflict by determining the maximum difference between the degrees of support given by different pieces of evidence to the same hypothesis,

difBetP(m_1, m_2) = max_{A⊆Θ} |BetP_1(A) − BetP_2(A)|

If this difference is relatively great, the difference between the pieces of evidence is large, which indicates the existence of conflict. Conversely, if the maximum difference is very small, the conflict between the pieces of evidence is small. The computational complexity of this method is significantly lower than that of the Jousselme distance-based method, but it has the disadvantage of not being sensitive enough to the degree of conflict.
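Liu's measure can be sketched as a brute-force maximization over all subsets of the frame (exponential in the frame size, but simple; the function name `dif_betp` and the inputs as element-level BetP dictionaries are our own illustrative choices):

```python
from itertools import combinations

def dif_betp(betp1, betp2):
    """max over all non-empty subsets A of |BetP1(A) - BetP2(A)|.

    Inputs are element-level Pignistic probabilities (dict: element -> BetP);
    subset probabilities are obtained by summing member elements.
    """
    theta = sorted(betp1)
    best = 0.0
    for r in range(1, len(theta) + 1):
        for combo in combinations(theta, r):
            d = abs(sum(betp1[t] for t in combo) -
                    sum(betp2[t] for t in combo))
            best = max(best, d)
    return best
```

For singleton-valued BetP distributions such as (0.8, 0.1, 0.1) and (0.5, 0.1, 0.4), the measure evaluates to 0.3, the largest gap in betting commitment over any subset.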
Example 1. Consider the basic probability assignments of pieces of evidence E1, E2, and E3 under discernment frame Θ, where A, B, and C are singleton sets with no intersection; thus, the Pignistic probabilities coincide with the belief assignment results. According to the concept of the gambling credibility distance, the degrees of conflict of E2 and E3 with E1 are the same. However, even though the maximum difference in Pignistic probability between E2 and E1 equals that between E3 and E1, it is observed that, with the belief assigned by E3 to A unchanged, the belief assigned to B increases. Thus, E3 is closer to E1, and its degree of conflict with E1 is smaller than that of E2.

Axiomatic Definition of Conflict between Belief Functions
Destercke and Burger [5] proposed measuring the conflict between two bodies of evidence from a different perspective. They started by examining consistency and conflict on sets, extracting from this setting the basic properties that measures of consistency and conflict should have. They then extended this basic scheme to belief functions in different ways. Two main ideas motivate this study and its results. The first is that the properties of conflict measures between belief functions should be based on properties that appear natural when measuring conflict between sets. The second is that a measure of conflict between sources should not depend a priori on a specific dependence assumption between the sources.
In particular, they do not make any a priori assumption about source dependence and only consider such assumptions as possible additional information.

A New Combination Rule to Keep the Initial Meaning of the Conflict
Lefèvre and Elouedi [6] proposed a combination rule (CWAC) that preserves the main role of conflict as an alarm signal indicating disagreement between sources. The rule provides an adaptive weighting between Dempster's rule and the conjunctive rule, which makes it possible to keep the initial meaning of the conflict obtained during the combination and thus to restore its initial role as an alarm. The conflict thereby regains its original sense of simply signaling that there is a problem.
Using a Discounting Rate to Reduce the Weight of Evidence

Klein and Colot [7] used evidential conflict analysis to mine singular sources. They regarded the degree of conflict as a function of parameters called discounting rates, thereby introducing a criterion to weight the contribution of each basic belief assignment (bba). A bba is assigned a large discounting rate when its source is unreliable, so that its weight in the combination process is reduced. The proposed criterion is more robust to parameters such as the proportion of conflicting bbas and the number of bbas, with better or equivalent efficiency.

Research on Fusion Algorithm for Conflicting Data
Nowadays, researchers mainly focus on two aspects of fusion algorithms: improving Dempster's combination rule and modifying the evidence source model.

Improvement in Dempster's Combination Rule
When the pieces of evidence are highly conflicting, Dempster's combination rule normalizes the conflicting data directly without a reasonable belief assignment, which leads to unreasonable results after data fusion. In view of this problem, some researchers have proposed new combination rules.
Yager considered the discernment frame to be exhaustive under a closed world assumption. Therefore, the conflicting data should be regarded as unknown and uncertain information and contained in m(Θ), which removes the normalization step of the original combination rule [8]. In contrast, Smets held the view that conflict is also a kind of information, caused by an incomplete discernment frame in an unknown environment. Thus, Smets assigned the conflicting data to the empty set obtained in the combination, and proposed the transferable belief model [9]. Dubois and Prade suggested assigning the conflicting data to the union of the conflicting focal elements [10]. Lefèvre et al. focused on the assignment of conflicting pieces of evidence, and proposed a generic form of conflicting evidence assignment after synthesizing several methods for conflict resolution [11]. Later studies were primarily based on the above-mentioned ideas.

Improvement of Evidence Source Model
Other researchers maintained that the unreasonable fusion results come from direct fusion without preprocessing the conflicting data.
Some representative methods are as follows. Shafer introduced a discount factor to modify the mass function when some uncertain information is given [12]. Murphy put forward the average evidence combination rule after analyzing the existing methods [13]. However, the average combination rule simply averages the pieces of evidence without taking the correlation between them into consideration; thus, it is sensitive to data errors. On the basis of Murphy's method, many improved methods have been proposed. Deng et al. [14] introduced the Jousselme distance, presented in Section 2.1, into Murphy's method to obtain a weighted average of multi-source evidence, where the weight depends on the degree of mutual support between pieces of evidence. Han et al. [15] took the uncertainty of evidence into consideration on the foundation of Deng's work, but it is unreasonable to measure the credibility of evidence by the belief assignment result alone; for example, it is unreasonable to consider m_2 more credible than m_1 merely because the uncertainty of the belief assignment of m_2 is smaller. Liu et al. [16] suggested using the gambling credibility distance and the conflicting coefficient introduced above to modify the mass function.
In general, there are deficiencies in all of the existing methods for improving the analysis, judgment, and expression of conflicting data, as well as the fusion algorithm for such data. In the following section, we will introduce our solution to deal with conflicting data.

Open World Assumption and Closed World Assumption
In brief, the Closed World Assumption means that all situations in the current environment are known, whereas the Open World Assumption means that there are unknown situations in the current environment. In evidence theory, there is a special condition: m(∅) = 0. This condition belongs to the Closed World Assumption, since all belief is allocated within the known frame of discernment. However, in the Transferable Belief Model (TBM) proposed by Smets and Kennes [17], the value of m(∅) can be larger than zero, which means that belief can be distributed beyond the frame of discernment. Thus, the application of evidence theory has been generalized to the Open World Assumption. In this article, we discuss the basic belief allocation policy under the Open and Closed World Assumptions, respectively.

Cosine Theorem-Based Method for Identifying and Expressing Conflicting Data
This paper references Jousselme's idea of using distance to analyze conflicting pieces of evidence. We introduce the concept of vector space and combine it with the angle concept from the cosine theorem to analyze and identify the conflicting evidence.
Considering the computational complexity of a data conflict expression method, this paper suggests using an n-dimensional vector space, where n is the number of elements in discernment frame Θ, which is more efficient than the 2^n-dimensional vector space proposed by Jousselme. In addition, under the open world assumption, an (n + 1)-dimensional vector space is needed to analyze the conflicting evidence, because belief may be assigned to the empty set.
For a discernment frame Θ that contains n elements, there are 2^n possible hypotheses in the problem domain. Therefore, we should perform the transition from a 2^n-dimensional belief assignment vector to an n-dimensional vector, where the converted n-dimensional vector is obtained from the Pignistic probability function introduced in Definition 3.
Assume that Θ = {θ_1, θ_2, ..., θ_n}, namely n mutually exclusive and exhaustive elements, and that m is the basic probability assignment over the 2^n hypotheses. The n-dimensional Pignistic probability vector of m is calculated by the Pignistic transformation of Definition 3:

PignisticVector_m = (BetP(θ_1), BetP(θ_2), ..., BetP(θ_n))

This accomplishes the transition from the 2^n-dimensional belief assignment vector to the n-dimensional vector. Under the open world assumption, to distinguish it from the conflicting coefficient K of Dempster's combination rule, we denote by BetP(∅) the (n + 1)-th dimension of PignisticVector_m, which expresses the Pignistic probability of hypotheses outside the discernment frame. According to the definition of evidence theory, the basic probability assignment m is greater than or equal to 0 and less than or equal to 1; therefore, the Pignistic probabilities of the n elements of Θ still lie between 0 and 1 after the transition to the n-dimensional vector PignisticVector_m.
Above all, the belief assignment result for every piece of evidence is mapped to an n-dimensional vector starting from the origin and ending at the point given by the Pignistic transformation.
According to Example 1, in a case where the PignisticVector is the same as the original basic probability assignment, the corresponding vector space is shown in Figure 1. We have thus completed the transition of the belief assignment result to an n-dimensional vector. According to the definition of the basic probability assignment, the sum of the assignment results equals 1; therefore, the sum of the n Pignistic probabilities of a PignisticVector will also be 1, which ensures that different PignisticVectors map to different space vectors (there is a certain angle between them). This fits the intuitive understanding that only identical assignment results imply the absence of conflict. Therefore, using the concept of angle to judge the degree of conflict between pieces of evidence is intuitively reasonable. A more detailed analysis follows. As the space vector coordinates are known, it is easy to obtain the angle between the vectors using the cosine theorem. For discernment frame Θ = {θ_1, θ_2, ..., θ_n}, let m_1, m_2 be two basic probability assignments on Θ. We can get

cos(PignisticVector_{m_1}, PignisticVector_{m_2}) = ( Σ_{i=1}^{n} BetP_1(θ_i) BetP_2(θ_i) ) / ( sqrt(Σ_{i=1}^{n} BetP_1(θ_i)^2) · sqrt(Σ_{i=1}^{n} BetP_2(θ_i)^2) )

Because the Pignistic probability is limited to the interval [0, 1], the angle between PignisticVectors lies in [0°, 90°], and the corresponding cosine values lie in [0, 1]. When the angle is 0°, the cosine value is 1, which means the two pieces of evidence fully support each other. When the angle is 90°, the cosine value is 0, which means the degree of conflict is the largest. As the cosine function is monotonically decreasing on [0, π/2], the cosine value decreases as the angle between the vectors increases.
To express the degree of conflict between pieces of evidence, this article uses the following definition: ConfDegree denotes the degree of conflict between evidence m_1 and m_2,

ConfDegree(m_1, m_2) = 1 − cos(PignisticVector_{m_1}, PignisticVector_{m_2})

and ConfDegree monotonically increases with the degree of conflict. When the belief assignment results of two pieces of evidence are the same, ConfDegree = 0, whereas when the pieces of evidence conflict completely, ConfDegree = 1. Note that the reduction to the n-dimensional Pignistic probability space vector may obscure the real belief allocation results: two pieces of evidence E1 and E2 can be mapped to the same PignisticVector even though their initial belief allocation results are different.
This result may seem to violate the principle of analyzing conflicting data. However, when m_2(Θ) = 1, E2 does not know the concrete belief allocation; according to the principle of retaining unknown information with maximum entropy, a uniform Pignistic allocation (e.g., 0.5 to each of two hypotheses) is then reasonable. Inevitably, the existence of mixed (non-singleton) sets can also lead to situations in which the PignisticVectors are the same. Under these conditions, we can calculate the cosine of the initial belief allocation results without the mixed sets, and then compare this value with cos(PignisticVector). If the two values are clearly inconsistent, we consider the dimension-reduction procedure unnecessary, and the cosine is calculated directly from the initial belief allocation results.
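The conflict measure of this section can be sketched as follows, assuming the Pignistic vectors have already been computed as element-level dictionaries (the function name `conf_degree` is our own illustrative choice):

```python
import math

def conf_degree(betp1, betp2):
    """ConfDegree = 1 - cos(angle between the two Pignistic vectors)."""
    theta = sorted(betp1)
    v1 = [betp1[t] for t in theta]
    v2 = [betp2[t] for t in theta]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # cosine in [0, 1] since all components are non-negative
    return 1.0 - dot / (n1 * n2)
```

Identical assignments give ConfDegree = 0 (angle 0°, cosine 1), while two assignments fully committed to different singletons give ConfDegree = 1 (angle 90°, cosine 0), matching the boundary cases above.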

Evidence Fusion Algorithm Based on Degree of Support between Pieces of Evidence
In Section 3, the method for computing the degree of support between pieces of evidence was defined. Building on this, an evidence fusion algorithm is proposed based on the degree of support, which is calculated from the cosine between the pieces of evidence. The constructed correlation matrix considers the relativity among pieces of evidence. As a result, the proposed algorithm has a stronger ability to eliminate conflict between the pieces of evidence.
According to the obtained masses, the presented algorithm calculates the angle between the pieces of evidence, which is the metric for the degree of support between them. Then, we calculate the weighted mean of the credibility by using Deng's method. Finally, we apply Murphy's method to fuse the data and obtain the fused masses. On the basis of the results, we can consider whether it is necessary to add new evidence or refine the frame of discernment.
Using Expression (10), the angle between two pieces of evidence can be obtained. Assume that there are k data sources. After calculating the degrees of support, the correlation matrix can be constructed as

S = [s_ij], s_ij = cos(PignisticVector_{m_i}, PignisticVector_{m_j}), i, j = 1, 2, ..., k

In the correlation matrix S, the elements on the diagonal are self-support degrees, which are all equal to 1. The sum of all the elements in row i is the degree of support for evidence i, so the support vector for the k pieces of evidence is

Sup(m_i) = Σ_{j=1}^{k} s_ij, i = 1, 2, ..., k

It is important to note that the proposed algorithm sums all the elements in a row, including the element on the diagonal (equal to 1). This differs from Deng's method, which excludes the diagonal elements. Intuitively, the result calculated by our method can better reflect the real degree of mutual support between the pieces of evidence.
For example, suppose that there are three pieces of evidence m_1, m_2, and m_3 with correlation matrix S. The steps of the algorithm are as follows:
(1) Calculate the correlation coefficients and construct correlation matrix S. After the Pignistic transformation, the credibility assignment is reduced from 2^n dimensions to n dimensions. Using the cosine theorem, the degrees of support between the pieces of evidence are obtained, after which correlation matrix S is constructed.
(2) Calculate the support vector and credibility assignments of the pieces of evidence. Using Expressions (12)-(14), the support vector and credibility values are obtained. When one piece of evidence is in high conflict with the others, it is unreasonable to use only the other pieces of evidence to calculate its degree of support; therefore, the evidence itself is included in Expression (12) to avoid very low support values.
(3) Calculate the weighted mean of the credibility assignments by using the obtained degrees of support and credibility values. The transferred credibility assignments can be calculated using Expression (15).
(4) Fuse the obtained credibility by using Dempster's combination rule and Murphy's method.
In Murphy's method, the evidence's mass is reformed by simple averaging. Here, we reform the mass using a weighted mean, which not only retains the advantages of Murphy's method in dealing with conflicting pieces of evidence at a high calculation speed, but also takes into account the correlation between pieces of evidence.
(5) Under the Open World or Closed World Assumption, make the decision based on the data fusion results.
As is well known, highly conflicting data is equivalent to a great difference in the belief allocated to the same hypothesis. Because this paper adopts a weighted-average evidence fusion method, there may be a potential problem: if the difference in belief allocation obtained after fusing highly conflicting data is less than a preset difference threshold th, it is difficult to make a decision. The value of th is set according to the actual situation. In order to make a decision, under the Closed World Assumption we need to increase the number of pieces of evidence, and under the Open World Assumption we need to add new hypotheses to the frame of discernment.
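The steps above can be sketched end-to-end as follows. This is a minimal sketch under our reading of the algorithm (Pignistic reduction, cosine-based correlation matrix with diagonal included, support-weighted mean, then Murphy-style repeated combination); all names are illustrative:

```python
import math
from itertools import product

def pignistic_vector(m, theta):
    """Convert a BPA (dict: frozenset -> mass) to its n-dimensional Pignistic vector."""
    betp = {t: 0.0 for t in theta}
    for A, mass in m.items():
        if A:                                  # skip the empty set
            for t in A:
                betp[t] += mass / len(A)
    return [betp[t] for t in sorted(theta)]

def cosine(v1, v2):
    dot = sum(a * b for a, b in zip(v1, v2))
    return dot / (math.sqrt(sum(a * a for a in v1)) *
                  math.sqrt(sum(b * b for b in v2)))

def dempster(m1, m2):
    """Dempster's rule for two BPAs, normalized by 1 - K."""
    K, out = 0.0, {}
    for (B, mB), (C, mC) in product(m1.items(), m2.items()):
        inter = B & C
        if inter:
            out[inter] = out.get(inter, 0.0) + mB * mC
        else:
            K += mB * mC
    return {A: v / (1.0 - K) for A, v in out.items()}

def fuse(bpas, theta):
    """Support-weighted averaging of the BPAs, then k - 1 Dempster combinations."""
    vecs = [pignistic_vector(m, theta) for m in bpas]
    k = len(bpas)
    # Steps 1-2: row sums of the correlation matrix (diagonal included)
    sup = [sum(cosine(vecs[i], vecs[j]) for j in range(k)) for i in range(k)]
    total = sum(sup)
    weights = [s / total for s in sup]         # normalized credibility per source
    # Step 3: weighted-mean BPA over all focal elements
    avg = {}
    for w, m in zip(weights, bpas):
        for A, mass in m.items():
            avg[A] = avg.get(A, 0.0) + w * mass
    # Step 4: combine the averaged BPA with itself k - 1 times (Murphy's approach)
    fused = avg
    for _ in range(k - 1):
        fused = dempster(fused, avg)
    return fused
```

For instance, with five hypothetical sources over Θ = {A, B, C} in which one source (analogous to the abnormal sensor discussed later) assigns 0.9 to B while the other four favor A, `fuse` returns a distribution dominated by A, so the single dissenting source cannot veto the result.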
The flowchart of the process is given in Figure 2.

Results for Degree of Conflict Based on Cosine Theorem
The proposed method defines a new way to measure the conflict between pieces of evidence, and hence the degree of support. As this measure is essential to the method, it is necessary to validate whether it reasonably reflects the conflict level.
Given the discernment frame Θ = {A, B}, there are two pieces of evidence with the credibility assignments

E1: m_1(A) = x, m_1(B) = 1 − x; E2: m_2(A) = 0.5, m_2(B) = 0.5

where x denotes the credibility that E1 assigns to hypothesis A, and the credibility of hypothesis B is 1 − x. The degree of support between the two pieces of evidence as x increases is shown in Figure 3, where the x-axis is the belief allocation result and the y-axis is the similarity of pieces of evidence E1 and E2. As shown in Figure 3, when x increases from 0 to 0.5, the degree of support increases from 0.7071 to 1.0; in the interval [0.5, 1], the degree of support decreases with an increase in x, and the cosine curve is symmetrical about x = 0.5. This change reasonably reflects the relativity between the pieces of evidence. To simulate a situation with a large conflict level, we assume that the credibility values of E1 and E2 are

E1: m_1(A) = x, m_1(B) = 1 − x; E2: m_2(A) = 1 − x, m_2(B) = x

The obtained degree of support is shown in Figure 4. Next, we discuss a comparison simulation of three methods of measuring conflict: d_BPA based on the Jousselme distance, difBetP based on the Pignistic probability, and ConfDegree based on the proposed method. In [2], the situation is given as Θ = {θ_1, θ_2, ..., θ_20}, and the credibility assignments of pieces of evidence E1 and E2 follow the example in [2], in which the focal element A of E1 is enlarged step by step as shown in Table 1.

Table 1. Change process of set A in each step.
With the increase in the number of elements in A, the conflict between the two pieces of evidence is shown in Figure 5. These results prove that the proposed method is also an effective and reasonable way to measure the conflict between pieces of evidence. Compared with the two existing methods, the proposed method is similar in trend to Jousselme's method, and these two methods reflect the conflicts more reasonably. Moreover, the amount of calculation needed by the proposed method is lower than that of Jousselme's method. The method based on the Pignistic probability has a very low computational complexity; however, it does not show the increasing trend after Step 8, which shows that it is not sensitive to the change in the conflict between pieces of evidence.
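The support curve of Figure 3 can be checked with a short sketch, assuming E1's assignment is (x, 1 − x) and E2's is fixed at (0.5, 0.5) as stated above:

```python
import math

def support(x):
    """Cosine between the vectors (x, 1 - x) and (0.5, 0.5)."""
    dot = 0.5 * x + 0.5 * (1 - x)          # always 0.5
    n1 = math.sqrt(x * x + (1 - x) ** 2)
    n2 = math.sqrt(0.5)
    return dot / (n1 * n2)
```

Evaluating `support` at x = 0 gives 0.7071, at x = 0.5 gives 1.0, and `support(x) = support(1 − x)`, confirming the endpoint values, the maximum, and the symmetry about x = 0.5 reported for Figure 3.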

Fusion Results Based on Degree of Support between Pieces of Evidence
To verify the rationality and validity of the proposed fusion method based on the degree of support, we use the example in [14] to test the algorithm and analyze the results.
Assume that there are three objects in a target recognition system; thus, the discernment frame is Θ = {A, B, C}. There are five different kinds of sensors observing the object: a CCD sensor (S1), a sound sensor (S2), an infrared sensor (S3), a radar (S4), and an ESM sensor (S5). The object actually belongs to A, and the pieces of evidence obtained from the five sensors follow the example in [14]. In this situation, the perception of the sound sensor (S2) is abnormal, which may cause wrong fusion results. Table 2 lists the fusion results when using different combination rules. As shown in Table 2, the fusion results with the Dempster combination rule are abnormal when S2's evidence conflicts with that of the other sensors, which causes wrong decision results. This shows that the Dempster combination rule produces counter-intuitive results, with the abnormal evidence acting as a veto on the fusion results. When more pieces of evidence are included, the combination rules of Yager, Murphy, and Deng and the proposed method all produce accurate and reasonable results, which solves the "veto problem" effectively. Yager's method has a lower convergence speed than the other methods. Murphy's method reforms the pieces of evidence using simple averaging and is sensitive to a change in a single piece of evidence. Deng's method and the proposed method both take a weighted-averaging approach; their change trends and convergence speeds are similar. However, the proposed method is easier to use and requires a lower amount of calculation.
To validate the rationality of the proposed method, we performed experiments in different situations. Group 1 includes two pieces of evidence with a high degree of conflict, Group 2 includes two pieces of evidence with low conflict, and Group 3 includes three pieces of evidence with high degrees of conflict. The discernment frame is Θ = {A, B, C}. In Table 3, the fusion results for Groups 1 and 3 show that the proposed algorithm is able to effectively avoid the Zadeh paradox: with different pieces of evidence, the fusion results obtain a more reasonable credibility assignment. The proposed method's results are also reasonable in a low-conflict situation. The experimental results prove that the proposed method is a rational and effective way to detect and resolve the conflict between pieces of evidence in data fusion using the DS evidence theory.

Table 3. Fusion results for proposed method in different situations.

Conclusions
In this paper, the main problems in data fusion were first briefly introduced. Considering the deep influence of conflicts between pieces of evidence on the fusion results, a new method based on the cosine theorem was developed to measure the conflicts between pieces of evidence. The proposed method is easy to implement and is able to show the degree of support between pieces of evidence. Moreover, this paper presented a new improved evidence fusion algorithm based on the degree of support. Experimental results proved that the proposed method is able to produce reasonable fusion results. It is an effective way to deal with the Zadeh paradox, veto problem, and fair credibility assignment problem.
At the same time, there are still problems to be solved. In this paper, the exponential explosion in the DS evidence theory was not taken into consideration. In addition, in the weighted mean step using the degree of support, we considered only the degree of support between the different pieces of evidence and ignored the influence of other factors. In future studies, the improvements may include the following: (1) applying the degree of support to develop a flexible and effective new way to deal with the exponential explosion problem and (2) improving the weighted mean step by considering additional influence factors.