State estimation of a shop floor using improved resampling rules for particle filtering

https://doi.org/10.1016/j.ijpe.2011.07.003

Abstract

Operational inefficiencies in supply chains cost industries millions of dollars every year. Many of these inefficiencies arise from the lack of a coherent planning and control mechanism, which requires accurate yet timely state estimation of these large-scale dynamic systems given their massive datasets. While Bayesian inferencing procedures based on the particle filtering paradigm may meet these requirements in state estimation, they may end up in a situation called degeneracy, where a single particle abruptly possesses a significant share of the normalized weights. Resampling rules for importance sampling prevent the sampling procedure from generating degenerate particle weights. In this work, we propose two new resampling rules, a variance-based rule (VRR) and a bias-based rule (BRR). The proposed rules are derived theoretically, and their performances are benchmarked against those of the minimized-variance and half-width based resampling rules existing in the literature, using a simulation of a semiconductor die manufacturing shop floor, in terms of their resampling qualities (mean and variance of root mean square errors) and computational efficiencies. We also identify the circumstances under which the proposed resampling rules become particularly useful.

Highlights

► Lack of a coherent control mechanism causes operational inefficiencies in shop floors.
► In control mechanisms, accurate state estimation is crucial given massive datasets.
► Particle filtering with resampling may meet these requirements in state estimation.
► Resampling rules prevent the generation of degenerated weights for particles.
► We theoretically derive two new resampling rules concerning minimized variance and bias, and show their resampling qualities and computational efficiencies.

Introduction

Today, supply chain organizations face the challenge of outperforming their competitors more than ever before due to price-sensitive customers, overseas providers, and costlier, higher-technology machines with specific maintenance requirements. This challenge has pushed organizations that once focused primarily on the strategic concerns of supply chains, involving coordination policies, distribution networks, and market profit differentiation, to pay attention to operational decision making in addition to strategic decision making in order to obtain a sustainable competitive advantage in their markets. Under this competitive environment, the percentage of products that were marked down in the US rose from less than 10% in 1980 to more than 30% in 2000, with increased quality specifications (Lee, 2004, Harvard Business Review, 2006). In spite of improved efficiency motivated by this competition, operational inefficiencies at the production stage inside shop floors, such as preventive maintenance scheduling, part routing scheduling, and inventory control, still exist and cost companies millions of dollars every year. For instance, in a semiconductor supply chain operating in a $300 billion market as of 2009 (KPMG Report, 2009), a small inefficiency originating at a shop floor may cost billions of dollars to the members of this chain as well as the members of its side industries.

Much of the aforementioned operational inefficiency in supply chains and shop floors arises from the lack of a coherent planning and control mechanism across, as well as within, each strategic, tactical, and operational unit, as necessitated by globally competitive markets (Dreyer et al., 2009, Rupp and Ristic, 2000, Mula et al., 2006). In the decision making process of coherent planning and control, effective monitoring, and hence determining the current shop floor status and capabilities without disrupting its ongoing operations, becomes critically important. However, shop floors generate massive datasets at each measurement point due to their large-scale, dynamic, and complex nature, which makes both data collection and analysis quite challenging and computationally intensive. Even though this situation holds true at the strategic and tactical levels, it becomes even more pronounced at the operational level, where the number of parameters as well as the update frequency for each parameter grow significantly. In order to enable timely monitoring, simulation-based analysis, and control of these shop floors at the operational level in an economical and effective way, Celik and Son (in press) proposed a data-driven adaptive simulation scheme incorporating Bayesian inferencing by means of particle filters. Particle filtering (also known as sequential Monte Carlo or sequential importance sampling with resampling) defines a class of simulation-based estimation techniques that have been used to solve various types of sequential Bayesian inference problems encountered in a wide range of areas such as econometrics (Casarin and Sartore, 2008, Flury and Shephard, 2008), signal and image processing (Brasnett et al., 2007, Xu and Li, 2007), robotics (Schulz and Burgard, 2001), and recently supply chain management (Celik and Son, in press).

This study builds on the previous research presented in Celik and Son (in press), where the particle filtering algorithm estimates the actual status of the considered shop floor in terms of the "mean time between failures" of each machine, which is further used in the preventive maintenance and part routing scheduling problems during its control. The idea behind particle filtering resides in effective sampling from a sequence of probability distributions, and the algorithm is structurally composed of two major steps: importance sampling and resampling. Resampling, by definition, is drawing repeated samples from subsets of available data based on a given criterion. Resampling plays a critical role in the performance of the particle filter, as it may resolve the potential issues of weight degeneration, where after a few iterations all but one particle have negligible weight (Ristic et al., 2004; Liu and Chen, 1998), and of wasted computational resources, by replicating particles in proportion to their weights. Developing efficient resampling rules is therefore extremely important for practical implementations. Efficient resampling methods are the key to (1) increased speed and accuracy, (2) faster convergence, (3) on-time data acquisition, and (4) reduced memory intensity (in terms of computational load) of particle filters, regardless of the statistics of the particles. The entire particle filtering method benefits from a well-defined resampling method through reduced complexity and/or improved quality of the resampling step. In this work, we enhance the efficiency of generic particle filtering algorithms by improving their resampling techniques from two different standpoints, namely a variance-based resampling rule (VRR) and a bias-based resampling efficiency rule (BRR). The proposed rules are first derived theoretically. Then, their performances are benchmarked against those of the resampling rule developed by Kong et al. (1994) and the half-width based resampling rule (revisited in this work) in terms of their resampling qualities and computational efficiencies, using a simulation for state estimation of a semiconductor die facility. The rules developed in this study are appropriate for all types of particle filters that use resampling, because the resampling step itself does not depend on any particular application or state-space model. The particular analysis presented in this paper concerns the generic particle filter (sampling importance resampling), implemented for the state estimation of a shop floor. Large-scale organizations employing the proposed resampling rules may benefit the most across the different problems they face (such as the considered state estimation problem in shop floors, which may help them reduce their preventive maintenance costs) by ascertaining the system status accurately with fewer computational cycles. While the validity of the proposed rules is demonstrated using a discrete-event simulation model of the system (a shop floor simulation) embedding a Matlab model for the particle filtering, the analysis can be easily extended to any application of particle filtering that performs resampling (e.g., the auxiliary sampling importance resampling filter).
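To make the benchmark rule concrete: the Kong et al. (1994) criterion referenced above is usually expressed through the effective sample size of the normalized particle weights. Writing $w^i$ for the normalized weight of particle $i$ and $N_T$ for a user-chosen threshold (both symbols are ours, for illustration), resampling is triggered whenever

$\hat{N}_{\mathrm{eff}} = 1 \big/ \sum_{i=1}^{N_s} (w^i)^2 < N_T$,

so that $\hat{N}_{\mathrm{eff}}$ equals $N_s$ when the weights are uniform and drops to 1 in the fully degenerate case where a single particle carries all of the weight.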

The rest of the paper is organized as follows. Section 2 provides background and a literature survey on Bayesian inference and the modeling of particle filters (sequential Monte Carlo algorithms). Section 3 then describes the variance-based and bias-based resampling efficiency rules proposed in this paper for the improvement of particle filtering algorithms. Section 4 describes the experiments designed and the results obtained to demonstrate the efficacy of the proposed approaches. Finally, Section 5 summarizes the conclusions derived from this study and provides insights into the planned future work on this research.


Overview of recursive Bayesian filtering

Bayesian filtering is a nonlinear filtering technique for estimating and tracking the state of a nonlinear stochastic system from non-Gaussian noisy observation data (Haug, 2005, Lee, 2008). Uncertainties about the input parameters and performance of systems can be conveniently represented in Bayesian filtering by employing probability distribution functions (Surekha and Ghali, 2001). Using these probability distribution functions, it provides a basis from which optimal estimates can be derived …
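Although this snippet is truncated, the recursion it refers to is the standard two-step Bayesian filter. In conventional notation (hidden state $x_k$, observations $y_{1:k}$; we adopt these symbols here since the formal definitions are cut off above), the prediction step propagates the posterior through the state transition model,

$p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1}$,

and the update step corrects this prediction with the newest observation via Bayes' rule,

$p(x_k \mid y_{1:k}) \propto p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1})$.

Particle filtering approximates these densities with weighted samples whenever the integrals are analytically intractable, as in the nonlinear, non-Gaussian setting described above.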

Generic particle filter algorithm and importance sampling

In order to provide a consistent notation throughout the article, we briefly describe the nomenclature used in the sequence of operations of the generic particle filter, which is shown in Fig. 2. Here, $x_k = [x_k^1, x_k^2, x_k^3, \ldots, x_k^i, \ldots, x_k^{N_s}]$ is a random vector, where each vector element $x_k^i$ and $N_s$ represent an individual sample drawn at sample $i$ of iteration $k$ and the sample size, respectively. However, throughout this section, by hiding the index $k$, we use only the random vector $x = [x^1, x^2, x^3, \ldots, x^i, \ldots]$ to …
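To make this sequence of operations concrete, a minimal sketch of a generic (sampling importance resampling) particle filter is given below in Python. This is our illustration, not the authors' implementation (the paper embeds a Matlab model, and Fig. 2 is not reproduced here); the toy scalar state-space model, the function names, and the $N_s/2$ effective-sample-size threshold are all assumptions made for the sketch.

import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform offset, evenly spaced positions,
    so particles are replicated roughly in proportion to their weights."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

def sir_particle_filter(observations, n_particles=1000, seed=0):
    """Generic SIR particle filter for an assumed toy scalar model:
    x_k = 0.5*x_{k-1} + v_k,  y_k = x_k + e_k,  v_k, e_k ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)        # initial cloud from the prior
    weights = np.full(n_particles, 1.0 / n_particles)    # uniform initial weights
    estimates = []
    for y_k in observations:
        # Importance sampling: propagate each particle through the state equation
        particles = 0.5 * particles + rng.normal(0.0, 1.0, n_particles)
        # Weight update: multiply by the observation likelihood, then normalize
        weights *= np.exp(-0.5 * (y_k - particles) ** 2)
        weights += 1e-300                                # guard against total underflow
        weights /= weights.sum()
        estimates.append(np.dot(weights, particles))     # posterior-mean state estimate
        # Resample when the effective sample size signals weight degeneracy
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            particles = particles[systematic_resample(weights, rng)]
            weights.fill(1.0 / n_particles)
    return np.array(estimates)

A call such as sir_particle_filter(np.zeros(50)) then returns the sequence of posterior-mean state estimates for 50 noisy observations; the resampling rules studied in this paper replace the fixed effective-sample-size test inside the loop.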

Experiments and results

In this section, we first present an overview of the semiconductor industry, its manufacturing processes, and the semiconductor die manufacturing fab that has been selected as a specific case study to illustrate and demonstrate the proposed work. Next, we provide the experimental results obtained for the proposed resampling rules using a synthetic function as well as this case study.

Conclusions and future work

In this study, improvements are proposed for the resampling rules within particle filtering algorithms, where the theoretical derivations (the variance-based and bias-based relative efficiency rules) rely on the rules existing in the literature. These improved resampling rules are then benchmarked against the revisited half-width based resampling rule as well as the widely known Kong rule for state estimation of a shop floor, using a simulation, in terms of their resampling qualities and computational …

Acknowledgments

This work was supported in part by the National Science Foundation under NSF #0540212. The authors would also like to acknowledge Drs. Ferenc Szidarovszky and Guzin Bayraksan for their valuable comments during the course of this study.

References (62)

  • Bayraksan, G., 2005. Monte Carlo Sampling-based Methods in Stochastic Programming. Dissertation, Submitted to the...
  • Berzuini, C., et al., 1997. Dynamic conditional independent models and Markov chain Monte Carlo methods. Journal of the American Statistical Association.
  • Burke, D., Ghosh, A., Heidrich, W., 2005. Bidirectional importance sampling for direct illumination. In: Bala, K.,...
  • Cadini, F., et al., 2010. A Monte Carlo-based technique for estimating the operation modes of hybrid dynamic systems. Reliability: Theory & Applications.
  • Carpenter, J., et al., 2004. An improved particle filter for non-linear problems. Proceedings of IEE Radar, Sonar, and Navigation.
  • Casarin, R., Sartore, D., 2008. Matrix-state particle filter for Wishart stochastic volatility processes. Discussion...
  • Celik, N., Son, Y. Sequential Monte Carlo-based fidelity selection in dynamic-data-driven adaptive multi-scale...
  • Celik, N., Son, Y., 2010. State estimation of a supply chain using improved resampling rules for particle filtering....
  • Chen, Y., 2001. Sequential Importance Sampling with Resampling: Theory and Applications. Dissertation, Submitted to the...
  • Cheng, J., et al., 2000. AIS-BN: an adaptive importance sampling algorithm for evidential reasoning in large Bayesian networks. Journal of Artificial Intelligence Research.
  • Douc, R., et al., 2008. Limit theorems for weighted samples with applications to sequential Monte Carlo methods. Annals of Statistics.
  • Douc, R., Cappe, O., 2005. Comparison of resampling schemes for particle filtering. In: Proceedings of the Fourth...
  • Doucet, A., 1998. On Sequential Simulation-based Methods for Bayesian Filtering. Technical Report, Department of...
  • Doucet, A., et al. An introduction to sequential Monte Carlo methods.
  • Doucet, A., et al., 2000. On sequential Monte Carlo sampling methods for Bayesian filtering. Statistics and Computing.
  • Dowd, M., 2006. A sequential Monte Carlo approach for marine ecological prediction. Environmetrics.
  • Dreyer, H.C., et al., 2009. Global supply chain control systems: a conceptual framework for the global control centre. Production Planning and Control.
  • Fearnhead, P., 1998. Sequential Monte Carlo methods in filter theory. Dissertation, Submitted to the Faculty of the...
  • Fearnhead, P., et al., 2003. On-line inference for hidden Markov models via particle filters. Journal of the Royal Statistical Society: Series B (Statistical Methodology).
  • Flury, T., Shephard, N., 2008. Bayesian Inference Based only on Simulated Likelihood: Particle Filter Analysis of...