Hybrid approach for the assessment of PSA models by means of binary decision diagrams

https://doi.org/10.1016/j.ress.2010.04.016

Abstract

Binary decision diagrams are a well-known alternative to the minimal cutsets approach for assessing Boolean reliability models. They have been applied successfully to improve the assessment of fault tree models. However, their application to large models, and in particular to the event trees coming from the PSA studies of the nuclear industry, remains to date out of reach of an exact evaluation. For many real PSA models it may not be possible to compute the BDD within a reasonable amount of time and memory without considering truncation or simplification of the model.

This paper presents a new approach to estimate the exact probabilistic quantification results (probability/frequency) by combining the calculation of the MCS under truncation limits with the BDD approach, in order to gain better control over the reduction of the model and to properly account for the success branches. The added value of this methodology is that it guarantees a true confidence interval around the exact value, and therefore explicit knowledge of the error bound. Moreover, it can be used to measure the acceptability of the results obtained with traditional techniques. The new method was applied to a real-life PSA study, and the results confirm the applicability of the methodology and open a new viewpoint for further developments.

Introduction

Probabilistic safety assessment is a well-established technique for integrating various reliability models and numerically quantifying the frequency of damage in nuclear facilities. Its use is now widespread in nuclear regulation, as it complements traditional deterministic analysis by providing a comprehensive and structured approach to identifying undesired accident scenarios, computing their likelihood in terms of occurrence frequency, and assessing the consequences and mitigation strategies. In terms of the mathematical tools used, PSA studies rely on the fault tree/event tree (FT/ET) methodology to obtain the response model of the facility.

The majority of computational tools used to assess FT/ET models implement what is called the “classical” approach, namely the kinetic tree theory [1]. This approach, broadly used and accepted, is based on the computation of minimal cutsets (MCSs for short) by means of Boolean reduction and on the use of probabilistic (frequency) cutoffs, owing to the complexity of the models. Truncation cutoffs on probability (or frequency), and also on the order of the MCS, have to be applied to avoid MCS explosion. To limit computational complexity, success (i.e. negated) logic is ignored in the FT/ET evaluation.
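The classical workflow can be sketched in a few lines. The fault tree, gate names, probabilities and cutoff value below are all hypothetical, and the top-down (MOCUS-style) expansion is a simplified illustration of the idea, not any particular tool's implementation:

```python
from itertools import product

# Hypothetical tiny fault tree: gates map to (type, children);
# leaves are basic events with failure probabilities.
GATES = {
    "TOP": ("OR", ["G1", "G2"]),
    "G1": ("AND", ["A", "B"]),
    "G2": ("AND", ["A", "C", "D"]),
}
P = {"A": 1e-2, "B": 1e-3, "C": 1e-3, "D": 1e-4}

def cutsets(node):
    """Top-down (MOCUS-style) expansion of a gate into cutsets."""
    if node in P:
        return [frozenset([node])]
    op, children = GATES[node]
    child_sets = [cutsets(c) for c in children]
    if op == "OR":
        return [cs for group in child_sets for cs in group]
    # AND: cross-product of the children's cutsets
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Discard non-minimal cutsets (proper supersets of another cutset)."""
    return [s for s in sets if not any(t < s for t in sets)]

def prob(cs):
    out = 1.0
    for e in cs:
        out *= P[e]
    return out

def truncated_mcs(top, cutoff):
    """Keep only cutsets above the probabilistic cutoff."""
    return [cs for cs in minimal(cutsets(top)) if prob(cs) >= cutoff]

mcs = truncated_mcs("TOP", cutoff=1e-6)
# Rare-event approximation: sum of the retained cutset probabilities
approx = sum(prob(cs) for cs in mcs)
```

Here the cutoff discards the cutset {A, C, D} (probability 1e-9), so the result is built from {A, B} alone; this is exactly the kind of silent, hard-to-quantify reduction the paper aims to bound.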

Bryant’s binary decision diagrams (BDD) [2], [3] are a well-known alternative to the minimal cutsets approach for assessing Boolean models. BDDs have the remarkable property that their complexity is not related to the number of cutsets of the encoded Boolean formula. In contrast to the classical methodology, the BDD approach involves no approximation in the quantification of the model and is able to handle negative logic (success branches) correctly at a low additional complexity cost. However, BDDs are also subject to combinatorial explosion, as the final size of the BDD is very sensitive to the variable ordering used to build it.
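A minimal sketch of these properties, assuming independent basic events with made-up probabilities: a reduced BDD built with Bryant's apply operation, quantified exactly by Shannon decomposition. Note that negation is a cheap structural pass over the diagram, and that the result involves no rare-event approximation:

```python
from functools import lru_cache

# Hypothetical basic-event probabilities (independent events assumed).
P = {"A": 0.01, "B": 0.001, "C": 0.05}
ORDER = {"A": 0, "B": 1, "C": 2}  # BDD size is very sensitive to this ordering

# A BDD node is (var, low, high); the terminals are the Booleans True/False.
@lru_cache(maxsize=None)
def node(v, low, high):
    return low if low == high else (v, low, high)  # reduction rule

def var(v):
    return node(v, False, True)

@lru_cache(maxsize=None)
def apply_op(op, f, g):
    """Bryant's 'apply': combine two BDDs with AND/OR."""
    if isinstance(f, bool) and isinstance(g, bool):
        return (f and g) if op == "and" else (f or g)
    v = min((x[0] for x in (f, g) if not isinstance(x, bool)), key=ORDER.get)
    def cof(h, val):  # cofactor of h with respect to v
        if isinstance(h, bool) or h[0] != v:
            return h
        return h[2] if val else h[1]
    return node(v,
                apply_op(op, cof(f, False), cof(g, False)),
                apply_op(op, cof(f, True), cof(g, True)))

@lru_cache(maxsize=None)
def neg(f):
    """Negation (success logic): swap terminals -- cheap, unlike with MCS."""
    return (not f) if isinstance(f, bool) else node(f[0], neg(f[1]), neg(f[2]))

@lru_cache(maxsize=None)
def prob(f):
    """Exact top-event probability by Shannon decomposition over the BDD."""
    if isinstance(f, bool):
        return 1.0 if f else 0.0
    v, low, high = f
    return (1 - P[v]) * prob(low) + P[v] * prob(high)

# (A AND B) OR (NOT A AND C): negative logic handled without approximation
f = apply_op("or",
             apply_op("and", var("A"), var("B")),
             apply_op("and", neg(var("A")), var("C")))
print(prob(f))  # exact: 0.01*0.001 + 0.99*0.05
```

Hash-consing via the cached `node` constructor keeps the diagram reduced; a production implementation would also use complement edges and a proper unique table, but the quantification step is the same recursion.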

After more than two decades of application, the BDD methodology has been applied successfully to improve fault tree assessment, and its introduction in the field has renewed its algorithmic framework. In recent years, several works have also undertaken its application to event tree assessment [4], [5], [6]. However, attempts to apply it to very large models, such as the ones coming from the PSA studies of the nuclear industry, which include several thousand basic events and logic gates, remain to date out of reach of a fully automatic treatment. Although some attempts have been successful [4], for such large models it might not be possible to compute the BDD within a reasonable amount of time and computer memory without considering truncation or simplification of the model. Consequently, it is necessary to explore new approaches to the problem. A potential solution is to develop a hybrid approach that combines the calculation of the MCS with the BDD approach, yielding a better and more controllable bounded approximation of the model. The motivation and the basis of this new approach are the principal contribution of the work presented in this paper.

The remainder of this paper is organized as follows: Section 2 introduces some basic terminology, describes the particularities of PSA models, and presents the case study. Section 3 reviews the existing approaches for FT/ET assessment, namely the classical and the BDD approaches. Section 4 focuses specifically on the problem of model simplification as performed within the classical approach. Section 5 presents the mathematical foundation of the hybrid approach. Finally, the experimental results and the conclusions are provided in Sections 6 and 7, respectively.

Section snippets

Description of the PSA models

This section introduces the basic terminology and concepts needed to describe the Boolean models used in PSA studies and presents the case study.

Current approaches for the PSA model assessment

In this section, we review two existing approaches for assessing Boolean models in the context of PSA studies: the classical approach based on the computation of MCS, and the more recent one based on binary decision diagrams.

Model pruning in the MCS approach

Probabilistic analyses are performed with codes whose results have not been contrasted against alternative methodologies for validation. The basis of the MCS approach is to reduce the model to a manageable size by discarding the less significant failure modes and producing only the most relevant failure paths. This approximation is justified by the fact that these cutsets capture, in general, the most relevant parts of the model in terms of their contribution to the top event

Hybrid approach for ET assessment based on syntactical reduction

As concluded in the previous section, the key issue is that the reduction process should be more understandable and better controlled than it is when the model is manipulated through MCS expansion and truncation. Moreover, it must be compatible with the treatment of negative logic. Several works have proposed hybrid methods that apply the reduction during BDD construction, also using truncation limits, leading to
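One simple way to obtain such a controlled bracket around the exact value is sketched below. This illustrates the general bounding idea, not necessarily the paper's exact construction: the exact probability of the union of the retained cutsets is a lower bound on the top-event probability, and adding the rare-event sum of the discarded cutsets gives an upper bound. All names, probabilities and the cutoff are hypothetical; basic events are assumed independent:

```python
from itertools import combinations

P = {"A": 0.01, "B": 0.002, "C": 0.003, "D": 0.0001}  # hypothetical
mcs = [frozenset({"A", "B"}), frozenset({"A", "C"}), frozenset({"D"})]
cutoff = 2.5e-5

def prob(events):
    out = 1.0
    for e in events:
        out *= P[e]
    return out

def union_prob(sets):
    """Exact P(union of cutsets) by inclusion-exclusion (independent events)."""
    total = 0.0
    for k in range(1, len(sets) + 1):
        for combo in combinations(sets, k):
            total += (-1) ** (k + 1) * prob(frozenset().union(*combo))
    return total

kept = [cs for cs in mcs if prob(cs) >= cutoff]
dropped = [cs for cs in mcs if prob(cs) < cutoff]

lower = union_prob(kept)                         # truncation can only lose probability
upper = lower + sum(prob(cs) for cs in dropped)  # Boole bound on the discarded part
exact = union_prob(mcs)
assert lower <= exact <= upper
```

In the hybrid approach the exact part would be evaluated with a BDD rather than inclusion-exclusion (which is itself exponential), but the resulting interval plays the same role: the gap `upper - lower` is an explicit, reportable error bound on the truncation.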

Experimental results

This section presents the numerical results obtained by applying the new hybrid approach to quantify linked fault tree models coming from PSA studies. The analysis considers four of the six accident sequences of the case study presented in Section 2.3; the other two are omitted as trivial. The equations of these sequences are as follows (where X̄ denotes the success, i.e. negation, of functional event X):

S5 = F̄0 · Ē1 · F̄1 · F3 · F4 · F̄5 · F6 · E3
S8 = F̄0 · Ē1 · F̄1 · F3 · F4 · F5 · E2 · E3
S13 = F̄0 · Ē1 · F1 · F̄2 · F3 · F4 · F̄5 · F6 · E3
S16 = F̄0 · Ē1 · F1 · F̄2 · F3 · F4 · F5 · E2 · E3
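Under the simplifying (and purely illustrative) assumption that the functional events are independent with known failure probabilities, a sequence probability is just the product of failure probabilities along failed branches and success probabilities along negated ones. The numbers below are made up; in the real model each header is a fault-tree top sharing basic events with the others, which is exactly why a BDD-based treatment is needed:

```python
# Hypothetical, illustrative header failure probabilities.
p = {"F0": 1e-3, "E1": 1e-2, "F1": 1e-2, "F2": 1e-2, "F3": 1e-1,
     "F4": 1e-1, "F5": 1e-2, "F6": 1e-1, "E2": 1e-2, "E3": 1e-1}

def seq_prob(failed, success):
    """Product form of a sequence under header independence."""
    out = 1.0
    for e in failed:
        out *= p[e]
    for e in success:
        out *= (1 - p[e])
    return out

# S5 = not F0 * not E1 * not F1 * F3 * F4 * not F5 * F6 * E3
p_s5 = seq_prob(failed=["F3", "F4", "F6", "E3"],
                success=["F0", "E1", "F1", "F5"])

# Classical evaluation ignores the success branches, slightly overestimating:
p_classical = seq_prob(failed=["F3", "F4", "F6", "E3"], success=[])
```

Even in this toy setting the success terms lower the result by a few percent; for dependent headers with large success probabilities the discrepancy can be far more significant, motivating the exact treatment of negated logic.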

The three

Discussion and conclusions

Binary decision diagrams have proved to be of great interest from both a practical and a theoretical point of view. In the reliability engineering framework, they can be considered a mature technology. However, large models in general, and specifically some of the ones coming from the PSA studies of the nuclear industry, are still out of reach of an exact evaluation. The models are too large and complex to be fully mastered with any of the currently existing approaches and tools. It can be argued

Acknowledgement

This research has been supported by the Spanish Nuclear Safety Council (CSN).

References (32)

  • R.E. Bryant, Symbolic manipulation with ordered binary decision diagrams, ACM Computing Surveys (1992).
  • O.P.M. Nusbaumer, Analytical solutions of linked fault tree probabilistic assessment using binary decision diagrams with...
  • J.D. Andrews et al., Event-tree analysis using binary decision diagrams, IEEE Transactions on Reliability (2000).
  • S. Beeson et al., Calculating the failure intensity of a non-coherent fault tree using the BDD technique, Quality and Reliability Engineering International (2004).
  • A. Rauzy, Mathematical foundations of minimal cutsets, IEEE Transactions on Reliability (2001).
  • Nuclear Regulatory Commission (NRC), A Guide to the performance of probabilistic risk assessments for nuclear power...