On the Uncertainty Properties of the Conditional Distribution of the Past Life Time

For a system observed at time t, the past entropy serves as a measure of the uncertainty about its past life-time. We consider a coherent system with n components that have all failed at time t. To assess the predictability of the life-time of such a system, we use the signature vector to determine the entropy of its past life-time. We explore various analytical results, including expressions, bounds, and ordering properties, for this measure. Our results provide valuable insight into the predictability of the coherent system's life-time, which may be useful in a number of practical applications.


Introduction
The process of quantifying and managing uncertainty about the random life-time of a system is a major task for engineers. As uncertainty increases, the reliability of a system decreases, so systems with a longer life-time and a lower level of uncertainty are preferable (see, e.g., Ebrahimi and Pellerey [1]). The concept of uncertainty has far-reaching applications, as highlighted in Shannon's seminal work on information theory [2], and information theory has provided valuable tools for evaluating and managing uncertainty in engineering systems. Let X be the life-time of a system or other living organism with an absolutely continuous cumulative distribution function (cdf) F(x) and probability density function (pdf) f(x). Shannon's differential entropy is the well-known measure

H(X) = −∫_0^∞ f(x) log f(x) dx, (1)

where "log" stands for the natural logarithm. If X represents the life-time of a new system, then H(X) quantifies the uncertainty about the life-time of the system. In certain scenarios, operators may have only partial knowledge about the current age of a system. For example, an operator may know that a system was in service at a specified time t and may wish to quantify the uncertainty about the life-time remaining after age t, commonly referred to as the remaining or residual life-time. According to Ebrahimi [3], the residual entropy of X is the entropy of X_t = [X − t | X > t]. Formally, for all t > 0, the residual life-time entropy of X is

H(X_t) = −∫_t^∞ (f(x)/(1 − F(t))) log(f(x)/(1 − F(t))) dx, (2)

which can equivalently be written as

H(X_t) = 1 − (1/(1 − F(t))) ∫_t^∞ f(x) log λ(x) dx, (3)

where λ(x) = f(x)/(1 − F(x)) is the hazard rate function. If we already know that an object has survived to time t, then H(X_t) quantifies the uncertainty contained in the distribution of its remaining life-time. By analogy with the residual entropy in Equation (3), Di Crescenzo and Longobardi [4] proposed a notion of past entropy over the interval (0, t), defined by

H̄(t) = −∫_0^t (f(x)/F(t)) log(f(x)/F(t)) dx = 1 − (1/F(t)) ∫_0^t f(x) log τ(x) dx,

where τ(x) = f(x)/F(x) is known as the reversed hazard rate function.
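As a quick numerical illustration of these three measures (a sketch of ours, not from the paper; the exponential example and function names are assumptions), the following approximates H(X), the residual entropy H(X_t), and the past entropy by direct numerical integration:

```python
import math

def entropy(pdf, a, b, m=20000):
    """Approximate -integral of pdf(x) log pdf(x) over (a, b) by the midpoint rule."""
    h = (b - a) / m
    total = 0.0
    for i in range(m):
        x = a + (i + 0.5) * h
        fx = pdf(x)
        if fx > 0:
            total -= fx * math.log(fx) * h
    return total

lam = 2.0
f = lambda x: lam * math.exp(-lam * x)   # exponential pdf
F = lambda x: 1.0 - math.exp(-lam * x)   # exponential cdf
t = 1.0

H = entropy(f, 0.0, 40.0)                                  # H(X); closed form 1 - log(lam)
H_res = entropy(lambda x: f(x) / (1.0 - F(t)), t, 40.0)    # residual entropy H(X_t)
H_past = entropy(lambda x: f(x) / F(t), 0.0, t)            # past entropy over (0, t)

print(round(H, 4), round(H_res, 4), round(H_past, 4))
```

For the exponential law the residual entropy coincides with H(X) = 1 − log λ by memorylessness, which the check above makes visible numerically.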
Various aspects and statistical perspectives of past entropy have been treated in the literature; see Di Crescenzo and Longobardi [4], Nair and Sunoj [5], Loperfido [6], and Shangari and Chen [7], as well as the references therein. In particular, Gupta et al. [8] obtained results concerning the residual and past entropy of order statistics, presented several relevant stochastic ordering properties, and provided some characterization results; see also [9]. Recently, Toomaj et al. [10] applied the residual entropy to coherent systems and obtained several related properties. Kayid and Alshehri [11] studied the uncertainty of coherent structures using Tsallis entropy, and Mesfioui et al. [12] studied the uncertainty in the life-time of a coherent system using the Rényi entropy. In this paper, we consider a coherent structure in which all components have failed at time t, and we use the system signature approach to compute the differential entropy of the past life-time.
The contents of this paper are organized as follows: In Section 2, we derive a formula for the Shannon differential entropy of the past life-time of a coherent system whose components are all inactive at time t. The system signature method is applicable when the random life-times of the components are independent and identically distributed (i.i.d.). In Section 3, several useful bounds are derived. In Section 4, the Jensen-Shannon divergence of coherent systems is considered. Some concluding remarks are given in Section 5.

The Past Life-Time Uncertainty in Coherent Systems
Here, in order to define the past-life entropy of coherent structures, we apply the signature vector of the underlying structure. We assume that all components of the system have become inactive at time t. A coherent system is a system with no irrelevant components and a monotone structure function. The vector p = (p_1, ..., p_n), whose ith element is p_i = P(T = X_i:n), i = 1, 2, ..., n, is known as the signature vector (see [13]). We consider a coherent structure with components having i.i.d. random life-times X_1, ..., X_n and a given signature p = (p_1, ..., p_n). If T_t = [t − T | X_n:n ≤ t] stands for the past life-time of the system, given that all components have become inactive at time t, then from the results of Khaledi and Kochar [14], the survival function of T_t can be obtained as

P(T_t > x) = Σ_{i=1}^n p_i P(t − X_i:n > x | X_n:n ≤ t), (4)

where P(t − X_i:n > x | X_n:n ≤ t) denotes the survival function of the past life-time of an i-out-of-n system given that all components have failed by time t. Writing T_t^i = [t − X_i:n | X_n:n ≤ t], i = 1, 2, ..., n, for the time elapsed since the failure of the component with life-time X_i:n, given that the system failed at or before time t, it follows from (4) that the pdf of T_t is the mixture

f_{T_t}(x) = Σ_{i=1}^n p_i f_{T_t^i}(x), 0 < x < t, (5)

where

f_{T_t^i}(x) = g_i(F_t(t − x)) f_t(t − x), 0 < x < t, (6)

with F_t(x) = F(x)/F(t) and f_t(x) = f(x)/F(t) for 0 < x < t, and

g_i(u) = (Γ(n + 1)/(Γ(i) Γ(n − i + 1))) u^{i−1} (1 − u)^{n−i}, 0 < u < 1, (7)

such that Γ(·) is the complete gamma function. Note that, conditionally on X_n:n ≤ t, X_i:n is the ith order statistic of n i.i.d. random variables with cdf F_t, so the probability integral transformation U_i:n = F_t(X_i:n) = F_t(t − T_t^i) follows the beta distribution with parameters i and n − i + 1, whose pdf is g_i in (7), for all i = 1, ..., n. This transformation plays a vital role in our study; likewise, V = F_t(t − T_t) has the mixture pdf g_V(u) = Σ_{i=1}^n p_i g_i(u), 0 < u < 1. In the forthcoming theorem, we give a formula for the past entropy of T_t using (5)-(7).
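Since g_V is just a signature-weighted mixture of beta densities, H(V) can be evaluated numerically. The sketch below (ours; the midpoint-rule integration and function names are our choices) computes H(V) for the signature (0, 2/3, 1/3) used later in Example 2, where the value H(V) = −0.0874 is reported:

```python
import math

def beta_pdf(u, i, n):
    """pdf g_i of U_{i:n} ~ Beta(i, n - i + 1)."""
    c = math.factorial(n) / (math.factorial(i - 1) * math.factorial(n - i))
    return c * u ** (i - 1) * (1 - u) ** (n - i)

def H_V(p, m=100000):
    """H(V) = -integral over (0,1) of g_V log g_V, with g_V(u) = sum_i p_i g_i(u)."""
    n = len(p)
    h = 1.0 / m
    s = 0.0
    for k in range(m):
        u = (k + 0.5) * h
        g = sum(p[i - 1] * beta_pdf(u, i, n) for i in range(1, n + 1))
        if g > 0:
            s -= g * math.log(g) * h
    return s

hv = H_V((0, 2/3, 1/3))
print(round(hv, 4))
```

Note that H(V) depends on the signature alone, not on the component distribution, which is what makes the representation below useful.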

Theorem 1.
The past entropy of T_t can be expressed as

H(T_t) = H(V) − E[log f_t(F_t^{−1}(V))], (8)

where V = F_t(t − T_t) has the pdf g_V(u) = Σ_{i=1}^n p_i g_i(u), 0 < u < 1; that is, V is distributed as the life-time of the coherent system whose component life-times are i.i.d. uniform on (0, 1).

Proof. By (1) and (5)-(6), and by substituting z = t − x, we have

H(T_t) = −∫_0^t f_{T_t}(x) log f_{T_t}(x) dx = −∫_0^t g_V(F_t(z)) f_t(z) log[g_V(F_t(z)) f_t(z)] dz.

Changing the variable to u = F_t(z) yields

H(T_t) = −∫_0^1 g_V(u) log g_V(u) du − ∫_0^1 g_V(u) log f_t(F_t^{−1}(u)) du = H(V) − E[log f_t(F_t^{−1}(V))], (9)

and the proof is then completed.
It is important to keep in mind that Equation (8) expresses the entropy of T_t as the sum of two terms: the first term, H(V), does not depend on the distribution of the component past life-times, while the second term does depend on the distribution of the past life-times of the components.
If T_t = [t − T | X_n:n ≤ t] stands for the past life-time of the coherent system under the condition that all components of the system have failed at time t, then H(T_t) measures the expected uncertainty induced by the conditional density of t − T, given X_n:n ≤ t, about the predictability of the past life-time of the system. In particular, for an i-out-of-n system, whose signature is p = (0, ..., 0, 1_i, 0, ..., 0), i = 1, 2, ..., n, Equation (9) reduces to

H(T_t^i) = H(U_i:n) − E[log f_t(F_t^{−1}(U_i:n))], (10)

for all t > 0. The next theorem is a direct consequence of Theorem 1 and uses the property that the reversed hazard rate of X is decreasing. Recall that a random life-time X belongs to the decreasing reversed hazard rate (DRHR) class if τ(x) is a decreasing function of x > 0.

Theorem 2. If X is DRHR, then H(T_t) is increasing in t > 0.

Proof. Equation (9) can be rewritten as

H(T_t) = H(V) − ∫_0^1 g_V(u) log f_t(F_t^{−1}(u)) du. (11)

It is plain to verify that F_t^{−1}(u) = F^{−1}(uF(t)) for all 0 < u < 1, and hence,

f_t(F_t^{−1}(u)) = f(F^{−1}(uF(t)))/F(t) = u τ(F^{−1}(uF(t))), 0 < u < 1. (12)

Since F^{−1}(uF(t)) is increasing in t and τ is decreasing, the integrand in (11) is decreasing in t, so H(T_t) is increasing in t. Using (12), the proof is then completed.
The next example deals with a situation where Theorems 1 and 2 are applied.
Example 1. Consider a coherent system with four components and the signature p = (1/4, 1/4, 1/2, 0). It follows that H(V) = −0.05757. Given the distributions of the components' life-times, Relation (9) can be used to determine the exact value of H(T_t). Let us assume the following life-time distributions for this purpose.
(a) Let X be uniformly distributed on [0, 1]. It holds that f_t(F_t^{−1}(u)) = 1/t for all 0 < u < 1 and 0 < t < 1. From (8), we immediately obtain

H(T_t) = H(V) + log t = −0.05757 + log t, 0 < t < 1.

It is seen that the entropy of T_t is an increasing function of time t. We note that the uniform distribution has the DRHR property, and therefore H(T_t) is an increasing function of time t, as expected from Theorem 2.
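The identity H(T_t) = H(V) + log t for uniform components can be checked numerically by integrating the past life-time density directly. The sketch below is ours (the three-component signature and integration scheme are our choices, not the example's data):

```python
import math

p = (0, 2/3, 1/3)
n = len(p)

def g_V(u):
    # signature-weighted mixture of Beta(i, n-i+1) densities
    s = 0.0
    for i in range(1, n + 1):
        c = math.factorial(n) / (math.factorial(i - 1) * math.factorial(n - i))
        s += p[i - 1] * c * u ** (i - 1) * (1 - u) ** (n - i)
    return s

def entropy(pdf, a, b, m=100000):
    h = (b - a) / m
    s = 0.0
    for k in range(m):
        x = a + (k + 0.5) * h
        v = pdf(x)
        if v > 0:
            s -= v * math.log(v) * h
    return s

t = 0.5
H_V = entropy(g_V, 0.0, 1.0)
# uniform components on [0,1]: f_t = 1/t and F_t(x) = x/t on (0, t), so the
# past life-time T_t has density g_V(1 - x/t) / t there
H_Tt = entropy(lambda x: g_V(1 - x / t) / t, 0.0, t)

print(abs(H_Tt - (H_V + math.log(t))) < 1e-6)   # prints True
```

The agreement holds for any t in (0, 1), since the entropy of a random variable is shifted by log t under a scaling of its support.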
(b) Let us assume that X follows a DRHR cdf F indexed by a parameter k > 0. Upon recalling (9), the exact value of H(T_t) can be obtained for all t > 0. For several choices of k, the exact value of H(T_t) is plotted against time t in Figure 1. As Figure 1 shows, H(T_t) is an increasing function of t for all k > 0, since X is DRHR, as follows from Theorem 2.

The duality of a system is a useful concept in reliability engineering, which makes it possible to reduce, by about half, the computational effort of determining the signatures of all coherent systems of a given size. Kochar et al. [15] proposed a duality relation between the signature of a system and that of its dual: if p = (p_1, ..., p_n) denotes the signature of a coherent system with life-time T, then the signature of its dual system, with life-time T^D, is given by p^D = (p_n, ..., p_1). In the following theorem, we apply the duality property to simplify the calculation of the past entropy of coherent systems. First, we need the following lemma, well known as the Müntz-Szász theorem, which can be found in [16].

Lemma 1 (Müntz-Szász theorem). If η is an integrable function on (0, 1) such that ∫_0^1 x^{n_j} η(x) dx = 0 for an increasing sequence of positive integers n_j with Σ_j 1/n_j = ∞, then η(x) = 0 for almost all x ∈ (0, 1).

Theorem 3. Let T_t and T_t^D denote the past life-times of a coherent system and of its dual system, both with i.i.d. component life-times as above. Then H(T_t) = H(T_t^D) for every signature p of every order n and all t > 0 if and only if

f_t(F_t^{−1}(u)) = f_t(F_t^{−1}(1 − u)) for all 0 < u < 1 and t > 0. (13)

Proof. It is worth noting that Theorem 2.2 of Toomaj and Doostparast [17] asserts the equality of the entropies of V and V^D, i.e., H(V) = H(V^D). To prove sufficiency, let us assume that (13) holds. Note that g_i(1 − u) = g_{n−i+1}(u) for all i = 1, ..., n and 0 < u < 1, which implies g_{V^D}(u) = g_V(1 − u). Consequently, utilizing (9), we obtain

H(T_t^D) = H(V^D) − ∫_0^1 g_V(1 − u) log f_t(F_t^{−1}(u)) du = H(V) − ∫_0^1 g_V(v) log f_t(F_t^{−1}(1 − v)) dv = H(T_t),

and this completes the proof of sufficiency by recalling Equation (8). For necessity, suppose H(T_t) = H(T_t^D) holds for all p and all n. Let p = (1, 0, ..., 0), so that p^D = (0, ..., 0, 1). It then follows from (9) that

∫_0^1 g_1(u) log f_t(F_t^{−1}(u)) du = ∫_0^1 g_n(u) log f_t(F_t^{−1}(u)) du,

where the last equality uses g_1(u) = g_n(1 − u), 0 < u < 1. Putting v = 1 − u on the right-hand side of the above equation leads to

∫_0^1 (1 − u)^{n−1} [log f_t(F_t^{−1}(u)) − log f_t(F_t^{−1}(1 − u))] du = 0 for all n.

Thus, we can apply Lemma 1 to obtain (13), and this concludes the proof.
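The two ingredients of the duality argument, the reflection identity g_i(1 − u) = g_{n−i+1}(u) and the equality H(V) = H(V^D), can both be verified numerically (a sketch of ours; signature and tolerances are illustrative choices):

```python
import math

def beta_pdf(u, i, n):
    c = math.factorial(n) / (math.factorial(i - 1) * math.factorial(n - i))
    return c * u ** (i - 1) * (1 - u) ** (n - i)

def H_V(p, m=100000):
    n = len(p)
    h = 1.0 / m
    s = 0.0
    for k in range(m):
        u = (k + 0.5) * h
        g = sum(p[i - 1] * beta_pdf(u, i, n) for i in range(1, n + 1))
        if g > 0:
            s -= g * math.log(g) * h
    return s

# reflection identity g_i(1 - u) = g_{n-i+1}(u) behind the duality result
n = 4
for i in range(1, n + 1):
    assert abs(beta_pdf(0.3, i, n) - beta_pdf(0.7, n - i + 1, n)) < 1e-12

p = (1/4, 1/4, 1/2, 0)
p_dual = tuple(reversed(p))     # signature of the dual system
hv, hv_dual = H_V(p), H_V(p_dual)
print(round(hv, 4) == round(hv_dual, 4))   # H(V) = H(V^D), prints True
```

Reversing the signature reflects g_V about u = 1/2, and differential entropy is invariant under this reflection, which is exactly why H(V) = H(V^D).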
An immediate consequence of the above theorem is given for i-out-of-n systems.

Corollary 1. The past life-times of an i-out-of-n system and of its dual (n − i + 1)-out-of-n system, i = 1, ..., n, satisfy H(T_t^i) = H(T_t^{n−i+1}) provided that f_t(F_t^{−1}(u)) = f_t(F_t^{−1}(1 − u)) for all 0 < u < 1 and t.

Bounds for the Past Entropy
Hereafter, we provide several useful bounds for H(T_t) using the concept of the system signature. For the first bound, we use the notion of Kullback-Leibler (KL) discrimination information. We recall that the KL discrimination information between two random variables X and Y with pdfs f and g, respectively, is given by

K(X : Y) = ∫ f(x) log(f(x)/g(x)) dx = H(X, Y) − H(X), (14)

where H(X, Y) = −E[log g(X)] is known as the inaccuracy of g relative to f.

Theorem 4. Let T_t = [t − T | X_n:n ≤ t] denote the past life-time of a coherent system with signature p and i.i.d. component life-times X_1, ..., X_n having the common pdf f, given that, at time t, all components of the system have failed. Then, we have

Σ_{i=1}^n p_i H(T_t^i) ≤ H(T_t) ≤ Σ_{i=1}^n p_i H(T_t^i) + Σ_{i=1}^n p_i K(U_i:n : U_{j*:n}), (15)

where j* = arg min_{1≤j≤n} Σ_{i=1}^n p_i K(U_i:n : U_j:n).

Proof. For the lower bound, since the differential entropy is a concave function of the density function, the mixture representation (5) yields

H(T_t) ≥ Σ_{i=1}^n p_i H(T_t^i). (16)

Moreover, the upper bound can be obtained by noting that the KL discrimination information is a non-negative measure. Thus, for any j = 1, ..., n,

H(T_t) ≤ H(T_t, T_t^j) = −E[log f_{T_t^j}(T_t)]. (17)

The upper bound (17) can be rewritten as

−E[log f_{T_t^j}(T_t)] = Σ_{i=1}^n p_i H(T_t^i) + Σ_{i=1}^n p_i K(U_i:n : U_j:n), (18)

where the equality follows from (6) and the change of variable u = F_t(t − x), using the fact that the KL function is invariant under one-to-one transformations, and where

K(U_i:n : U_j:n) = log (Γ(j)Γ(n − j + 1))/(Γ(i)Γ(n − i + 1)) + (i − j)[ψ(i) − ψ(n − i + 1)]

denotes the KL divergence between the beta distributions of U_i:n and U_j:n, with ψ(·) the digamma function (see [18] for details). If we set j* = arg min_{1≤j≤n} Σ_{i=1}^n p_i K(U_i:n : U_j:n), then (18) is minimized at j = j*, and the proof is then completed.
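The optimal index j* can be found by brute force over the n candidates. The sketch below (ours; it evaluates the KL terms by numerical integration rather than the closed form) recovers the index used later in Example 2:

```python
import math

def beta_pdf(u, i, n):
    c = math.factorial(n) / (math.factorial(i - 1) * math.factorial(n - i))
    return c * u ** (i - 1) * (1 - u) ** (n - i)

def kl_beta(i, j, n, m=50000):
    """K(U_{i:n} : U_{j:n}) = integral over (0,1) of g_i log(g_i/g_j), midpoint rule."""
    h = 1.0 / m
    s = 0.0
    for k in range(m):
        u = (k + 0.5) * h
        gi, gj = beta_pdf(u, i, n), beta_pdf(u, j, n)
        s += gi * math.log(gi / gj) * h
    return s

p = (0, 2/3, 1/3)
n = len(p)

# signature-weighted KL term of the upper bound, for each candidate index j
obj = {j: sum(p[i - 1] * kl_beta(i, j, n) for i in range(1, n + 1) if p[i - 1] > 0)
       for j in range(1, n + 1)}
j_star = min(obj, key=obj.get)
print(j_star)   # prints 2
```

Since the KL terms depend only on the signature (through the beta laws of the U_i:n), j* can be tabulated once per signature, independently of the component distribution.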
We remark that, by recalling Equation (11), the lower bound in (15) can be rewritten as

H_L(T_t) = H_L(V) − ∫_0^1 g_V(u) log f_t(F_t^{−1}(u)) du, (19)

where H_L(V) = Σ_{i=1}^n p_i H(U_i:n). It is worth pointing out that, using Equation (11), expression (9) can be rewritten as

H(T_t) = H(V) − H_L(V) + H_L(T_t), (20)

so the difference between the past entropy and its lower bound, H(T_t) − H_L(T_t) = H(V) − H_L(V), is distribution-free and depends only on the system signature. For further information about these bounds and the determination of the optimal index j*, we refer the reader to [10,19]. Numerous authors have investigated the characteristics of coherent systems with various component distributions, including Murthy and Jiang [20], Jiang et al. [21], Castet and Saleh [22], and Qiu et al. [23], as well as the references therein. To compare the bounds derived in Theorems 4 and 5, we present an example of a coherent system with power distribution components.
Example 2. Consider a coherent system having the signature p = (0, 2/3, 1/3). It is easy to see that H(V) = −0.0874, and one can obtain j* = 2 (see, e.g., [24]). The exact value of H(T_t) can be computed using Relation (9) when the component life-time distributions are given. Let us denote the life-time of each component by X, and assume that X has a power distribution with pdf

f(x) = k x^{k−1}, 0 < x < 1, k > 0,

so that f_t(F_t^{−1}(u)) = (k/t) u^{(k−1)/k} for all 0 < u < 1 and 0 < t < 1. From (8), we immediately obtain

H(T_t) = H(V) + log(t/k) + 2(k − 1)/(3k), 0 < t < 1.

Alternatively, from Equation (19), the lower bound is obtained by replacing H(V) with H_L(V) in the above expression, and the upper bound follows from Equation (18) by adding Σ_{i=1}^3 p_i K(U_i:3 : U_2:3) to the lower bound. The entropy of T_t is a monotonically increasing function of time t; indeed, the power distribution possesses the DRHR property, so H(T_t) is increasing in t, as expected from Theorem 2. Figure 2 displays the exact value of H(T_t) together with the lower and upper bounds computed as described above for various values of k. As predicted by Theorem 2, H(T_t) monotonically increases with respect to time t for all k > 0, since X is DRHR. Another useful upper bound can be obtained in the next theorem.
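The concavity lower bound of Theorem 4 can be sanity-checked for this example by integrating the past life-time densities directly. The sketch below is ours (the values k = 2 and t = 0.8 are arbitrary choices, and the densities follow the mixture representation (5)-(6)):

```python
import math

p = (0, 2/3, 1/3)
n, k, t = len(p), 2.0, 0.8    # power-law components: f(x) = k x^(k-1) on (0, 1)

def beta_pdf(u, i):
    c = math.factorial(n) / (math.factorial(i - 1) * math.factorial(n - i))
    return c * u ** (i - 1) * (1 - u) ** (n - i)

f_t = lambda x: k * x ** (k - 1) / t ** k    # component pdf given X <= t
F_t = lambda x: (x / t) ** k                 # component cdf given X <= t

def entropy(pdf, a, b, m=100000):
    h = (b - a) / m
    s = 0.0
    for j in range(m):
        x = a + (j + 0.5) * h
        v = pdf(x)
        if v > 0:
            s -= v * math.log(v) * h
    return s

def dens_i(i):
    # pdf of T_t^i: g_i(F_t(t - x)) f_t(t - x) on (0, t)
    return lambda x: beta_pdf(F_t(t - x), i) * f_t(t - x)

dens_mix = lambda x: sum(p[i - 1] * dens_i(i)(x) for i in range(1, n + 1))

H_exact = entropy(dens_mix, 0.0, t)
H_lower = sum(p[i - 1] * entropy(dens_i(i), 0.0, t)
              for i in range(1, n + 1) if p[i - 1] > 0)
print(H_lower <= H_exact)   # prints True: concavity of the entropy functional
```

For these parameter values the directly integrated H_exact also matches the closed form H(V) + log(t/k) + 2(k − 1)/(3k) stated above.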
Theorem 5. Under the conditions of Theorem 4, we have

H(T_t) ≤ H_L(T_t) − H_L(V),

for all t > 0.

Proof. Due to Lemma 4.1 of Toomaj et al. [24], it holds that H(V) ≤ 0. Upon recalling Equation (20), the proof is then completed.
The following theorem compares the past entropies of two coherent systems that have distinct structures and the same component life-times. We recall that ≤_st stands for the usual stochastic order (see Shaked and Shanthikumar [25]).

Theorem 6. Let T_1,t = [t − T_1 | X_n:n ≤ t] and T_2,t = [t − T_2 | X_n:n ≤ t] represent the past life-times of two coherent systems with signatures p_1 and p_2, respectively, such that p_1 ≤_st p_2, and let the components of both systems be i.i.d. with the common cdf F. Then, for all t > 0:
(i) if H(V_1) ≤ H(V_2) and f_t(F_t^{−1}(u)) is decreasing in u, then H(T_1,t) ≤ H(T_2,t);
(ii) if H(V_1) ≥ H(V_2) and f_t(F_t^{−1}(u)) is increasing in u, then H(T_1,t) ≥ H(T_2,t).

Proof. (i) First, note that Equation (9) can be rewritten as

H(T_k,t) = H(V_k) − E[log f_t(F_t^{−1}(V_k))], k = 1, 2. (25)

The assumption p_1 ≤_st p_2 implies V_1 ≤_st V_2. So, we obtain

E[log f_t(F_t^{−1}(V_1))] ≥ E[log f_t(F_t^{−1}(V_2))], (26)

in which the inequality in (26) is derived from the implication that V_1 ≤_st V_2 implies E[π(V_1)] ≥ E[π(V_2)] for all decreasing functions π, applied to the decreasing function u ↦ log f_t(F_t^{−1}(u)). Therefore, Relation (25) gives

H(T_1,t) ≤ H(V_2) − E[log f_t(F_t^{−1}(V_2))] = H(T_2,t),

where the last inequality is obtained from the assumption H(V_1) ≤ H(V_2), and hence the theorem. Part (ii) is proven analogously.
The following example provides a situation in which Theorem 6 applies.
Example 3. We consider the two coherent systems with four components shown in Figure 3, with past life-times T_1,t (left panel) and T_2,t (right panel). It is easily identified that p_1 = (1/2, 1/2, 0, 0) and p_2 = (1/4, 1/4, 1/2, 0), respectively. Further, we can plainly see that H(V_1) = −0.2970 and H(V_2) = −0.0575, and hence H(V_1) ≤ H(V_2). Moreover, we have p_1 ≤_st p_2. Suppose that the component life-times are i.i.d. with the standard exponential distribution with cdf F(t) = 1 − e^{−t}, t > 0. It is easily seen that

f_t(F_t^{−1}(u)) = (1 − u(1 − e^{−t}))/(1 − e^{−t}), 0 < u < 1,

which is obviously a decreasing function of u for all t > 0. Hence, due to Theorem 6, it holds that H(T_1,t) ≤ H(T_2,t) for all t > 0.
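The ingredients of this example can be reproduced numerically; the sketch below (ours; tolerances are illustrative) checks the stochastic ordering of the signatures and the inequality H(V_1) ≤ H(V_2):

```python
import math

def beta_pdf(u, i, n):
    c = math.factorial(n) / (math.factorial(i - 1) * math.factorial(n - i))
    return c * u ** (i - 1) * (1 - u) ** (n - i)

def H_V(p, m=100000):
    n = len(p)
    h = 1.0 / m
    s = 0.0
    for k in range(m):
        u = (k + 0.5) * h
        g = sum(p[i - 1] * beta_pdf(u, i, n) for i in range(1, n + 1))
        if g > 0:
            s -= g * math.log(g) * h
    return s

p1 = (1/2, 1/2, 0, 0)
p2 = (1/4, 1/4, 1/2, 0)

# p1 <=_st p2: every upper tail sum of p1 is at most the corresponding one of p2
tails = lambda p: [sum(p[j:]) for j in range(len(p))]
assert all(a <= b + 1e-12 for a, b in zip(tails(p1), tails(p2)))

hv1, hv2 = H_V(p1), H_V(p2)
print(round(hv1, 4), round(hv2, 4))   # close to the quoted -0.2970 and -0.0575
print(hv1 <= hv2)                     # prints True
```

Both conditions of Theorem 6(i) that depend on the signatures alone are thus verified; the remaining condition concerns only the component distribution through f_t(F_t^{−1}(u)).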
Σ_{i=1}^n Σ_{j=1}^n p_i p_j K(U_i:n : U_j:n). (30)

Therefore, the second upper bound for H(T_t) can be obtained by substituting (30) in place of H(p) in (29).

Concluding Remarks
In recent decades, researchers in the field of information theory have become increasingly interested in developing measures for evaluating the degree of uncertainty in random variables. The uncertainty associated with the life-time of an engineering system is related to other aspects of the system. For example, imagine a situation in which an inspection at time t reveals to an operator that a number of components that were functioning in a system have become inactive. The event has occurred in the past, but there is still uncertainty about the exact time at which the system, or the components within it, failed. The ability to assess the predictability of the life-time of a system is a valuable criterion in this regard, and Shannon differential entropy has proven to be an attractive measure for quantifying uncertainty in such situations. Assuming that all components of the system have failed at time t, we have established in this work an expression for the entropy of the past life-time of the system. We have also investigated various properties of this measure, including bounds and partial orders between the past life-times of two coherent systems based on their entropies, using the concept of system signature. To demonstrate the effectiveness of our approach, we have given several examples of its application. Our results highlight the potential of this measure for assessing the predictability of system life-times and its usefulness in engineering applications.

Data Availability Statement:
No new data were created or analyzed in this study. Data sharing is not applicable to this article.