Coupling Design and Validation Analysis of an Integrated Framework of Uncertainty Quantification

Abstract: Uncertainty quantification is an indispensable part of the validation of best-estimate nuclear safety codes. However, uncertainty quantification usually requires the combination of statistical analysis software and nuclear reactor professional codes, and it consumes substantial computing resources. In this paper, a design method for the coupling interface between the DAKOTA Version 6.16 statistical software and nuclear reactor professional simulation codes is proposed, realizing an integrated computing workflow that includes interface pre-processing, code batching operations, and interface post-processing. On this basis, an integrated framework of uncertainty quantification is developed, which is characterized by visualization, convenience, and efficient computing. Meanwhile, a typical example of small-break LOCA analysis of the LOBI test facility was used to validate the reliability of the developed integrated framework of uncertainty quantification. This research work can provide valuable guidance for developing an autonomous uncertainty analysis platform in China.


Introduction
Since the 1988 amendment of the 10 CFR 50.46 rule, Best Estimate Plus Uncertainty (BEPU) methods have been approved by the Nuclear Regulatory Commission (NRC) as an alternative approach for nuclear safety analysis [1]. The BEPU method has been developed internationally and has become a popular nuclear safety analysis method in recent years [2][3][4]. Compared to traditional conservative analysis methods, the BEPU method uses best-estimate nuclear simulation codes (such as RELAP5, TRACE, and MELCOR) to carry out more realistic and accurate calculations, which can bring greater economic benefits on the premise of ensuring nuclear safety. According to the requirements of nuclear safety regulations, uncertainty quantification is an indispensable step in the verification and validation (V&V) of best-estimate nuclear codes [5]. From a statistical perspective, the uncertainty of code calculations is usually required to be quantified to satisfy the "95/95 criterion", i.e., the figures of merit are below the prescribed limits with 95% probability at a 95% confidence level [6].
Uncertainty analysis can reasonably quantify the difference between the computed results and the real situation. Currently, various uncertainty analysis methods have been developed internationally, including the original response surface method applied in the Code Scaling, Applicability, and Uncertainty (CSAU) methodology [7]; the Wilks nonparametric statistical method proposed in the GRS methodology [8]; and the Bayesian optimization method used in Inverse Uncertainty Quantification (IUQ) methods [9], among others. All of these uncertainty analysis methods require multiple statistical sampling calculations. Nowadays, some statistical software packages with uncertainty calculation functions, such as DAKOTA, SPSS, and OpenTURNS, have been developed internationally. Among them, DAKOTA is a widely used statistical analysis package with plentiful functions such as parameter study, experimental design, optimization analysis, and uncertainty quantification [10,11]. The uncertainty quantification function of DAKOTA is widely used in the field of nuclear engineering, especially its statistical sampling and sensitivity analysis, and can be applied to the validation of various nuclear reactor professional simulation codes in safety analyses.
The uncertainty quantification of best-estimate nuclear simulation codes usually requires a combination of statistical analysis software and a nuclear reactor professional simulation code, and it also consumes a large amount of computational resources [12]. During uncertainty quantification, it is necessary to transfer and substitute the sampling data from the statistical analysis software into the nuclear reactor professional simulation code, to perform batch calculations of the nuclear simulation code, and furthermore to extract the calculated results and return them to the statistical software for further data processing and analysis. If these steps are carried out manually, the process is very complex and time consuming. The NRC developed the SNAP platform, which can provide a workflow that couples DAKOTA with several nuclear reactor professional simulation codes, such as RELAP5, TRACE, and MELCOR, to achieve integrated uncertainty calculations [13][14][15]. However, SNAP is a large, comprehensive platform that is not open-source, which makes it inconvenient to conduct targeted uncertainty quantification analysis. Therefore, it is necessary to develop an autonomous uncertainty analysis platform in China.
In this paper, we propose an interface design method coupling the DAKOTA statistical software with nuclear reactor professional codes, which realizes an integrated computing workflow including interface pre-processing, code batching operations, and interface post-processing. On this basis, a visual integrated framework of uncertainty quantification was established using the Qt graphical interface. Moreover, a typical example of the LOBI test facility was used to validate the reliability of the developed integrated framework of uncertainty quantification. The establishment of this integrated framework opens up a key technical path toward developing an autonomous uncertainty analysis platform.

Design of the Coupling Interface between Statistical Software and Nuclear Simulation Codes
The design mechanism of the coupling interface between DAKOTA statistical software and the nuclear reactor professional simulation codes such as RELAP5 is shown in Figure 1. Firstly, the selected uncertainty parameters are sampled by DAKOTA statistical software according to their probability distribution functions, and the results of different sample combinations of uncertainty parameters are used as different calculation conditions for the input files of the nuclear reactor professional simulation code. Then, according to the required number of code calculations, the nuclear reactor professional simulation code with different input conditions is run automatically in batch. Finally, the output results of the professional simulation code are extracted and imported into DAKOTA for statistical processing and analysis, and further uncertainty quantification and sensitivity analysis can be carried out. The coupling interface design process includes interface pre-processing, code batching operations, and interface post-processing.

Interface Pre-Processing
The uncertainty parameters are first sampled by the DAKOTA Version 6.16 statistical software according to their probability distributions and ranges. Before the sampling data are transferred to the nuclear reactor professional simulation code (e.g., RELAP5), the DAKOTA parameter sampling file needs to be pre-processed, i.e., converted to a file format readable by the professional simulation code. The parameters that need to be replaced by sampling results are specified and marked in the input files of the nuclear reactor professional simulation code. Then, the sample values of the parameters obtained by DAKOTA are written in place of the corresponding parameter values calibrated in the original input file of the professional simulation code, and thus multiple sets of new simulation code input files are obtained after the replacement of the parameter samples.
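As an illustration of this sampling replacement step, the following minimal Python sketch (not the framework's actual embedded script; the placeholder marker style and parameter names are assumptions for illustration) substitutes each DAKOTA sample set into a marked input-card template to produce one input file per code run:

```python
def write_sampled_inputs(template_text, samples, marker="@{}@"):
    """Replace marked parameters in a code input template with sampled values.

    template_text : input card containing placeholders such as @cd_break@
    samples       : list of dicts, one per code run, mapping parameter name -> value
    Returns one input-file text per sample set.
    """
    inputs = []
    for sample in samples:
        text = template_text
        for name, value in sample.items():
            text = text.replace(marker.format(name), f"{value:.6g}")
        inputs.append(text)
    return inputs

# Hypothetical fragment of a RELAP5-style input card with two marked parameters
template = "20100010  @cd_break@  0.0\n20200010  @acc_pressure@  300.0\n"
samples = [{"cd_break": 0.85, "acc_pressure": 3.91e6},
           {"cd_break": 1.10, "acc_pressure": 4.05e6}]
cards = write_sampled_inputs(template, samples)
```

Each entry of `cards` would then be written to its own run folder before the batch calculations.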
Figure 1. The design mechanism of the coupling interface between DAKOTA statistical software and nuclear reactor professional codes.



Code Batching Operations
After obtaining the required multiple code input files, each input file is passed to the executable file of the reactor simulation code to realize automated batch running, and thus multiple simulation results are obtained after the code calculations. It is noteworthy that the input file name and output file name need to be appropriately modified in the command line before the calculations, in order to distinguish between the input card files and result files of different runs. In addition, an extension function for coupling different nuclear reactor professional simulation codes is provided in the form of a user-defined command line. For code batching operations, each code run is performed on the basis of the input (.in) file in its own folder and generates an output (.out) result file and restart (.rst) file in the corresponding folder directory.
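The batching step can be sketched as follows in Python; the executable name, folder layout, and command-line flags are illustrative assumptions, not the actual RELAP5 invocation:

```python
import os

def build_batch_commands(n_runs, code_exe="relap5.exe", work_dir="uq_runs"):
    """Prepare one folder per code run and return the command line for each.

    Each run reads its own input (.in) file and writes distinct output (.out)
    and restart (.rst) files, so the results of different runs never collide.
    The executable name and flags are placeholders for the actual code.
    """
    commands = []
    for i in range(1, n_runs + 1):
        run_dir = os.path.join(work_dir, f"run_{i:03d}")
        os.makedirs(run_dir, exist_ok=True)
        cmd = [code_exe,
               "-i", f"run_{i:03d}.in",
               "-o", f"run_{i:03d}.out",
               "-r", f"run_{i:03d}.rst"]
        commands.append((run_dir, cmd))
    return commands

cmds = build_batch_commands(3)
# Each pair could then be launched, e.g. with
# subprocess.run(cmd, cwd=run_dir, check=True)
```

Keeping one folder per run also makes it straightforward to locate the output file belonging to each sample set during post-processing.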

Interface Post-Processing
After the output results are obtained through the code batching operations, data post-processing is required to satisfy the needs of subsequent data reading and statistical analysis. In order to meet the needs of safety analysis and code validation, the post-processing of the result data usually includes obtaining the maximum/minimum values of the parameters of interest, calculating the variance and standard deviation, and computing the sensitivity measures. Among these, an important part is performing sensitivity analysis by calculating the sensitivity measures between the input uncertainty parameters and output response parameters. Sensitivity analysis measures the magnitude of the impact of each input uncertainty parameter on the output response parameter, so that the input parameters with a significant impact can be identified. Some commonly used sensitivity measures include the Spearman rank correlation coefficient and the Pearson correlation coefficient [16].
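A minimal sketch of the basic post-processing statistics, assuming the response values (e.g., PCT samples in K) have already been extracted from the output files; the values below are purely illustrative:

```python
from statistics import mean, stdev

def summarize_response(values):
    """Basic post-processing statistics for one output response (e.g. PCT):
    extreme values, sample mean, and sample standard deviation."""
    return {"max": max(values),
            "min": min(values),
            "mean": mean(values),
            "stdev": stdev(values)}

pct_samples = [726.4, 731.9, 718.2, 742.5, 735.0]  # illustrative PCT values in K
stats = summarize_response(pct_samples)
```

In a nonparametric analysis of this kind, the maximum value over the batch of runs is of particular interest, since it serves as the one-sided tolerance bound.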

Establishment of the Integrated Framework of Uncertainty Quantification
On the basis of the coupling interface design between statistical software and nuclear reactor professional simulation codes, an integrated framework of uncertainty quantification was further developed using the Qt Graphical User Interface (GUI). This framework has the characteristics of visualization, convenience, and integrated computation, which is helpful for carrying out efficient uncertainty quantification analysis.
Qt is a cross-platform framework for C++ GUI applications. It is a fully object-oriented toolkit that provides developers with almost all the functions required to build high-quality graphical applications [17,18]. The Qt GUI provides a graphically displayed user interface for the computer operating environment, allowing user-friendly operation through windows, menus, and keystrokes.
The workflow of integrated computing for uncertainty quantification analysis includes the following steps:
(a) Enter the input card of the nuclear reactor professional simulation code (such as RELAP5) and select the parameters that need to be replaced by different samples.
(b) Import the DAKOTA input file and conduct parameter sampling according to the specified sampling method to obtain the sampling results.
(c) Replace the selected parameters in the input card of the nuclear reactor simulation code with the obtained sampling results.
(d) Conduct automatic batching operations of the nuclear simulation code to obtain multiple sets of calculation results.
(e) Perform data post-processing and sensitivity analysis based on the multiple sets of output calculation results.
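The five steps above can be sketched as a single driver loop; the marker syntax and stage functions are hypothetical stand-ins for the framework's embedded scripts, with the code execution and result extraction stubbed by a callback:

```python
def run_uq_workflow(template_text, samples, extract_response):
    """Sketch of workflow steps (a)-(e): substitute each parameter sample into
    the input card, run the code, and collect the responses.

    extract_response(k, card) stands in for executing the simulation code on
    the k-th input card and parsing the figure of merit from its output.
    """
    responses = []
    for k, sample in enumerate(samples, start=1):
        card = template_text
        for name, value in sample.items():
            card = card.replace(f"@{name}@", str(value))  # step (c)
        responses.append(extract_response(k, card))       # steps (d)-(e)
    return responses

# Toy stand-in: the "code" returns a pseudo-PCT from the sampled pressure
samples = [{"acc_pressure": 3.9}, {"acc_pressure": 4.1}]
pct = run_uq_workflow("press = @acc_pressure@", samples,
                      lambda k, card: 700.0 + 10 * float(card.split("= ")[1]))
```

In the real framework the callback would launch the simulation code in its run folder and read the output file, rather than evaluating a toy expression.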
Through autonomous programming and successful compilation and operation under the Qt GUI framework, all the steps of the integrated computing workflow for uncertainty quantification can be visualized in the user interface of the integrated framework, enabling users to carry out the integrated computing process of uncertainty quantification autonomously via the visualized interface. During execution, the integration with DAKOTA is mainly realized by selecting ps1 or sh files in the DAKOTA interface block to execute the scripts, achieving an integrated uncertainty quantification process.
There are two important developed scripts: (1) A script for interface pre-processing and code batching operations. On the basis of the input files (such as user-specified code input cards, DAKOTA input files, and the selected input uncertainty parameters), a series of operations is carried out for sampling replacement and code batching calculations. (2) A script for interface post-processing, mainly for obtaining the maximum/minimum values of the figures of merit and calculating the sensitivity coefficients. It imports a file containing user-specified response information and produces the output result file after a series of calculations and processing. It is worth noting that these scripts were written and embedded into the integrated framework, enabling automated uncertainty calculation workflows without requiring users to write scripts themselves.
Thus, the Qt-based integrated framework is established for uncertainty quantification, which can visually implement all steps of uncertainty analysis, including the distribution setting of input uncertainty parameters, sampling replacement, code batching operations, and data extraction. The developed visual integrated framework is mainly composed of a view layer and a model layer.
The view layer is a visualization interface that is decomposed into different windows, including the mainwindow, childwindow1 for specifying input uncertainty parameters, and childwindow2 for specifying output response parameters. The mainwindow interface is shown in Figure 2. Organizing the processes into categories allows the framework to be applied in many different situations, and this flexibility minimizes the negative impact of traditional process-oriented workflows and ensures consistency across different sections. For example, when uncertainty analysis is not required, benchmark code calculation cases based on the initial code input files can be calculated independently, while when uncertainty analysis is required, batching operations for uncertainty quantification calculations can be carried out. The model layer realizes some complex functions that are difficult to implement through the visualization interface, including the sampling replacement of input uncertainty parameters, code batching runs, and data extraction, among others. Through the external interface provided by DAKOTA, the underlying code in the model layer can be executed by calling the command line in the view layer, so as to achieve the required functions.
The visual interface diagram of the developed integrated framework of uncertainty quantification is shown in Figure 3. The developed integrated framework is able to couple statistical software with different professional simulation codes, providing a flexible and practical coupling computing platform. This uncertainty framework is an open and ready-to-use tool. According to specific application requirements, users only need to perform simple and convenient operations on the visual interface to complete an efficient uncertainty calculation workflow. The establishment of this integrated framework opens up a key technical path to develop an autonomous uncertainty analysis platform.


Validation Analysis by a Typical LOBI Example
The reliability of the integrated framework of uncertainty quantification was validated and analyzed by using the LOBI test facility under the BL-44 accident condition as a typical example. Firstly, the RELAP5 best-estimate thermal-hydraulic code was used to simulate the LOBI test facility and analyze the BL-44 accident process. Then, 59 code runs of uncertainty calculations were carried out on the basis of the Wilks nonparametric statistical method [19]. Finally, the calculation results of the output response parameters were collected and statistically processed, obtaining the uncertainty bands of figures of merit such as the peak cladding temperature (PCT) and the sensitivity ranking of the related input uncertainty parameters. The results of uncertainty quantification and sensitivity analysis obtained by the developed integrated framework were compared with the corresponding results obtained by the SNAP platform.

Simulation of the LOBI Test Facility
The LOBI facility is an integral thermal-hydraulic test facility built by the nuclear research institute of the European Atomic Energy Community, which is used for the experimental validation of prototypical large-scale pressurized water reactors [20,21].
The BL-44 experiment was a 6% cold-leg small-break loss-of-coolant accident conducted on the LOBI test facility. The total experimental process lasted for about 3000 s. The 6% cold-leg break opened at 500 s, and subsequently the primary system pressure began to drop rapidly. The reactor was shut down, the main pumps coasted down according to the set logic, and the high-pressure safety injection failed. When the primary system pressure dropped to 3.91 MPa, the safety injection tank started to inject. When the peak temperature of the fuel cladding reached 772 K, the low-pressure safety injection was initiated. After the low-pressure injection coolant completely submerged the core, the experiment was terminated [22].
The best-estimate thermal-hydraulic code, RELAP5/MOD 3.4, was used to simulate the BL-44 accident process of the LOBI test facility. The RELAP5 model was established to simulate the primary loop system and safety injection system of the LOBI test facility. Figure 4 shows the RELAP5 nodalization diagram for the LOBI BL-44 model. The models of the main components were divided into the following groups: the 100 series representing the reactor pressure vessel model and upper head simulator; the 200 series representing the intact loop pipeline and main coolant pump; the 300 series representing the broken loop pipelines, main pumps, and break components; the 400 series representing the pressurizer and surge lines; the 500 series representing the intact loop secondary side (SG I); the 600 series representing the broken loop secondary side (SG II); the 700 series representing the steam header and steam pipeline; and the 800 series representing the injection and discharge pipelines of the primary loop.
In order to verify the rationality of the RELAP5 modeling, a comparison of the calculation results and experimental data is provided in Figures 5 and 6. The calculation results showed good consistency with the experimental data, indicating the rationality of the base case model.

Uncertainty Quantification Analysis
Input uncertainty parameters are required to be identified first, and these parameters are sampled according to their statistical information. In this work, the input uncertainty parameters were selected through a combination of the phenomena identification and ranking table (PIRT), the relevant research literature, and engineering experience. Thus, eight main input uncertainty parameters were selected to perform uncertainty quantification, as listed in Table 1. Furthermore, the statistical information of these input uncertainty parameters, including probability distributions and ranges, was determined from comprehensive information including the design documents of the LOBI facility, related data in the literature, and engineering experience [23][24][25], as also shown in Table 1.

Wilks' method was applied in this work to perform the uncertainty analysis. Due to its simplicity and efficiency, Wilks' method is widely used for uncertainty analysis in nuclear engineering [26]. The required number of code runs can be determined by Wilks' formula so as to meet the 95/95 criterion. Here, 59 sampling calculations were performed according to Wilks' formula for one-sided tolerance intervals with a 95% confidence level. The developed integrated framework of uncertainty quantification was applied to carry out 59 code runs of the LOBI BL-44 simulation, and the obtained uncertainty bands of cladding temperature are shown in Figure 7a. Meanwhile, the uncertainty bands of cladding temperature calculated by the SNAP uncertainty analysis platform under the same conditions are shown in Figure 7b. The comparison shows that the uncertainty analysis results of the integrated framework developed in this study are highly consistent with the corresponding calculation results of the SNAP platform.
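The required number of runs follows from the first-order one-sided Wilks formula, 1 − γ^n ≥ β for coverage γ and confidence β, with the sample maximum taken as the tolerance bound; a small sketch reproduces n = 59 for the 95/95 criterion:

```python
def wilks_first_order_n(coverage=0.95, confidence=0.95):
    """Smallest number of runs n such that the sample maximum bounds the
    `coverage` quantile with the given confidence: 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n
```

For γ = β = 0.95 this yields 59 runs, matching the number of code calculations used in this work; tightening the confidence to 99% would raise the count to 90.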
The uncertainty interval of the PCT obtained using the established integrated framework was [712.070, 752.702] K, while the uncertainty interval of the PCT obtained using the SNAP platform was [714.925, 747.347] K. Only small differences were observed, which resulted from the different initial samples generated with different random sampling seeds by the two calculation tools. This comparison demonstrated the reliability of the integrated framework for carrying out uncertainty quantification calculations.


Sensitivity Analysis
Sensitivity analysis is a statistical method to quantify the influence of input uncertainty parameters on the output response parameters. On the basis of the uncertainty quantification results, sensitivity analysis can be further carried out. Here, the Spearman rank correlation coefficient and the Pearson correlation coefficient were used as sensitivity measures in order to investigate the impact of the input uncertainty parameters on output response parameters such as the PCT.
The Spearman rank correlation coefficient provides an effective measure of the correlation between input parameters and output parameters. It is worth noting that it does not directly use the actual data of the parameters for calculation, but rather uses their ranks to represent the relative size of the parameters. Thus, the Spearman rank correlation coefficient retains its effectiveness even when the input parameters and output parameters differ greatly in magnitude. Its range is [−1, 1], and a larger absolute value indicates a stronger correlation. In the absence of ties, the Spearman rank correlation coefficient (ρ_S) is calculated by the following expression:

ρ_S = 1 − 6 Σ_{i=1}^{n} (R_Xi − R_Yi)² / [n(n² − 1)]

where R_Xi is the rank of one input X_i among all the values of the variable X; R_Yi is the rank of one output Y_i among all the values of the variable Y; and n is the total number of samples.

The Pearson correlation coefficient is also a widely used sensitivity measure. It is also known as the Pearson product-moment correlation coefficient and is a linear correlation coefficient, usually used to reflect the degree of linear correlation between two variables X and Y; its value also lies between −1 and 1. The Pearson correlation coefficient (ρ_P) is calculated by the following expression:

ρ_P = Σ_{i=1}^{n} (X_i − X̄)(Y_i − Ȳ) / √[Σ_{i=1}^{n} (X_i − X̄)² · Σ_{i=1}^{n} (Y_i − Ȳ)²]

where X_i is one value of the input variable X; Y_i is one value of the output variable Y; and X̄ and Ȳ are the average values of the input and output variables, respectively.
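The two expressions above can be implemented directly; the sketch below is a pure-Python illustration (ignoring tied ranks for brevity) rather than the framework's actual post-processing code:

```python
def pearson(x, y):
    """Pearson correlation: covariance of X and Y divided by the product of
    their standard deviations (a linear correlation measure)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = (sum((xi - mx) ** 2 for xi in x)
           * sum((yi - my) ** 2 for yi in y)) ** 0.5
    return num / den

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation applied to the ranks,
    so large magnitude differences between parameters do not distort it.
    Tied values are not handled here, for brevity."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))
```

A monotonic but nonlinear relationship (e.g., Y = X³) gives a Spearman coefficient of exactly 1 while the Pearson coefficient stays below 1, which illustrates why both measures are reported in the sensitivity analysis.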
Here, the Spearman rank correlation coefficients and Pearson correlation coefficients between the eight selected input uncertainty parameters and the PCT were calculated by the established integrated framework, and the results were compared with those of the SNAP platform, as shown in Figures 8 and 9. Through the calculation of the sensitivity coefficients, the impact of the input uncertainty parameters on the PCT can be measured. From the results of the Spearman rank correlation coefficients shown in Figure 8, the ACC initial pressure and initial water inventory were identified as the important contributors to the uncertainty of the PCT. From the results of the Pearson correlation coefficients shown in Figure 9, the ACC initial pressure, initial water inventory, and also the initial temperature were considered important influencing factors.
Besides the magnitude of the sensitivity coefficients, which reflects the degree of importance of the input uncertainty parameters for the PCT, the sign of the sensitivity coefficients can reveal the trend of the PCT varying with the input uncertainty parameters. For example, the ACC initial pressure has an obvious positive feedback effect on the PCT; that is, a higher ACC initial pressure will lead to a higher PCT value. Meanwhile, a larger ACC initial water inventory will lead to a lower PCT value, as indicated by the corresponding negative Spearman rank correlation coefficient.
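The interpretation above (magnitude for importance, sign for trend direction) can be sketched as a small ranking step. The input names, distributions, and response model below are purely hypothetical placeholders, not values from the LOBI analysis; the point is only the pattern of ranking parameters by |ρ_S| and reading the sign as the feedback direction.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 93  # e.g., a Wilks-style sample size

# Hypothetical input samples (names and distributions for illustration only)
inputs = {
    "ACC initial pressure":    rng.normal(2.5e6, 1.0e5, n),  # Pa
    "ACC water inventory":     rng.normal(0.35, 0.02, n),    # m^3
    "ACC initial temperature": rng.normal(308.0, 3.0, n),    # K
}

# Hypothetical PCT response: increases with pressure, decreases with inventory
pct = (650.0
       + 4.0e-5 * inputs["ACC initial pressure"]
       - 300.0 * inputs["ACC water inventory"]
       + rng.normal(0.0, 5.0, n))  # unexplained variability

# Rank inputs by |rho_S|; the sign gives the feedback direction
ranked = sorted(
    ((name, stats.spearmanr(x, pct)[0]) for name, x in inputs.items()),
    key=lambda item: abs(item[1]), reverse=True)

for name, rho in ranked:
    direction = "positive" if rho > 0 else "negative"
    print(f"{name:26s} rho_S = {rho:+.3f} ({direction} feedback)")
```

With this construction, the pressure coefficient comes out positive, the inventory coefficient negative, and the temperature coefficient near zero, mirroring how the signs in Figures 8 and 9 are read.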

Conclusions
In this work, we developed a design method for the coupling interface between DAKOTA statistical software and nuclear power professional codes, and further established an integrated framework of uncertainty quantification based on the Qt graphical interface. The small-break LOCA analysis of the LOBI test facility was selected as a typical example to carry out uncertainty quantification and sensitivity analysis using the established integrated framework. The main conclusions are summarized as follows:
(1) The established integrated framework of uncertainty quantification was able to couple DAKOTA statistical software and nuclear power professional codes, realizing integrated calculations including interface pre-processing, code batching operations, and interface post-processing. Meanwhile, the integrated framework of uncertainty quantification had the characteristics of visualization, convenience, and efficient computing. The establishment of this integrated framework opens up a key technical path toward developing an autonomous uncertainty analysis platform.
(2) For a typical example of small-break LOCA analysis of the LOBI test facility, the 95/95 uncertainty bands of the PCT obtained through the integrated framework were highly consistent with the corresponding calculation results of the SNAP platform, thus validating the reliability of the established integrated framework for uncertainty quantification calculations.
(3) The developed integrated framework can perform not only uncertainty quantification but also sensitivity analysis. Through the Spearman rank correlation coefficients and the Pearson correlation coefficients, the important input uncertainty contributors to the output response parameters can be identified.