ENBIS-12 in Ljubljana

9 – 13 September 2012 Abstract submission: 15 January – 10 May 2012


The following abstracts have been accepted for this event:

  • Monitoring Multivariate Process Variability: A Unified View From Generalized Variance To Likelihood Ratio Control Charts

    Authors: Emanuel Pimentel Barbosa (Universidade Estadual de Campinas), Mario Antonio Gneri (Universidade Estadual de Campinas), Ariane Meneguetti (Universidade Estadual de Campinas)
    Primary area of focus / application: Process
    Keywords: Control Charts, Generalized Variance, Likelihood Ratio Statistic, Multivariate Processes
    Submitted at 12-Apr-2012 22:02 by Emanuel Pimentel Barbosa
    10-Sep-2012 17:00 Monitoring Multivariate Process Variability: A Unified View From Generalized Variance To Likelihood Ratio Control Charts
    The sample generalized variance |S| is the most widely used statistic for the monitoring and control of multivariate process variability, but it has two serious drawbacks.

    The first is that its usual implementation as a 3-sigma Shewhart-type control chart based on a normal approximation substantially inflates the risk of a false alarm; a solution was presented at ENBIS 11 based on a Cornish-Fisher correction and the Meijer G-function.

    The second is that |S| does not detect all kinds of changes in the process variance-covariance matrix Sigma; a solution is proposed here based on monitoring an auxiliary statistic, trace(S), jointly with |S|.

    Since |S| and trace(S) are the two components of the likelihood ratio for testing hypotheses about Sigma, we propose to combine these two statistics into a single one, resulting in a modified likelihood ratio control chart, which gives a unified view of the problem.

    To illustrate the proposed methods, a couple of numerical examples are provided.
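
    The combination described above can be sketched numerically: for a sample covariance matrix S, both the trace and the generalized variance (determinant) appear together in the likelihood ratio statistic for testing H0: Sigma = Sigma0. The sketch below uses the standard LR form for a known Sigma0; it is an illustration of the idea, not necessarily the authors' exact modified chart statistic.

```python
import numpy as np

def lr_statistic(S, Sigma0, n):
    """Likelihood ratio statistic for H0: Sigma = Sigma0 with n
    observations; it combines trace(Sigma0^{-1} S) and the log of
    the generalized variance |Sigma0^{-1} S| in a single number."""
    p = S.shape[0]
    A = np.linalg.solve(Sigma0, S)           # Sigma0^{-1} S
    sign, logdet = np.linalg.slogdet(A)
    return n * (np.trace(A) - logdet - p)

rng = np.random.default_rng(0)
Sigma0 = np.eye(3)
X = rng.multivariate_normal(np.zeros(3), Sigma0, size=200)
S = np.cov(X, rowvar=False, bias=True)       # MLE of Sigma

gv = np.linalg.det(S)                        # generalized variance |S|
tr = np.trace(S)                             # auxiliary statistic trace(S)
lr = lr_statistic(S, Sigma0, n=200)
print(round(gv, 3), round(tr, 3), round(lr, 3))
```

    In an in-control process the statistic stays small; a shift in Sigma that leaves |S| unchanged (e.g. compensating changes in the eigenvalues) still moves trace(S), and hence the combined statistic.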
  • Robust Control Chart to Monitor the Information System of a Semiconductor Production Plant

    Authors: Michel Lutz (Ecole des Mines de Saint-Etienne), Espéran Padonou (Ecole des Mines de Saint-Etienne), Olivier Roustant (Ecole des Mines de Saint-Etienne)
    Primary area of focus / application: Process
    Keywords: monitoring, robust statistics, time series, control chart, information system, semiconductor industry
    Submitted at 13-Apr-2012 07:20 by Michel Lutz
    Accepted
    12-Sep-2012 09:45 Robust Control Chart to Monitor the Information System of a Semiconductor Production Plant
    This presentation introduces research carried out in a semiconductor production plant whose activity is supported by a highly automated information system. A database recording the activity of this system (volume of activity, performance, …) is maintained, and hundreds of variables are tracked on a daily basis.

    The objective is to place these variables under statistical control (automated monitoring). As time series are considered, implying autocorrelation phenomena, a two-step monitoring procedure is used (Montgomery, 2005): 1) time series modelling, using the Holt-Winters exponential smoothing algorithm; 2) monitoring of the model residuals through an individuals Shewhart control chart.

    However, poor control results were obtained with these basic approaches, which is why robust methods have been implemented: the Qn robust standard-deviation estimator (Rousseeuw & Croux, 1993) is used, as well as a robust Holt-Winters smoothing algorithm (Croux et al., 2011). These methods are already up and running and allow fully automated monitoring of several hundred metrics with fairly good results.

    Further work is ongoing to improve the results and applications:

    1) Simulation: the overall performance of the different methods is evaluated and compared through statistical simulation;

    2) Multivariate testing: for now, the multivariate dimension of the controls is not considered. We have observed that this can lead to unsatisfying outcomes (contradictory monitoring results for correlated variables, false alarms). Consequently, several simultaneous testing strategies are being investigated to enhance the current method.

    Methods and results will be illustrated and compared on the basis of simulated series and real case studies from the considered semiconductor production plant.
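
    The two-step procedure can be sketched as follows. The non-seasonal exponential smoothing below is the simplest Holt-Winters case, and the O(n^2) Qn implementation is a naive version of the Rousseeuw & Croux estimator; the injected outlier and all parameter values are illustrative assumptions, not the plant's configuration.

```python
import numpy as np

def qn_scale(x, c=2.2219):
    """Rousseeuw & Croux (1993) Qn robust scale estimator
    (naive O(n^2) version; c is the Gaussian consistency factor)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    diffs = np.abs(x[:, None] - x[None, :])[np.triu_indices(n, k=1)]
    h = n // 2 + 1
    k = h * (h - 1) // 2
    return c * np.partition(diffs, k - 1)[k - 1]

def residual_chart(y, alpha=0.3, L=3.0):
    """Step 1: non-seasonal exponential smoothing; step 2: an
    individuals Shewhart chart on the one-step-ahead forecast
    errors, with Qn-based control limits. Returns the indices of
    out-of-control residuals."""
    level = y[0]
    resid = []
    for obs in y[1:]:
        resid.append(obs - level)        # one-step-ahead forecast error
        level = alpha * obs + (1 - alpha) * level
    resid = np.asarray(resid)
    sigma = qn_scale(resid)
    centre = np.median(resid)
    return np.nonzero(np.abs(resid - centre) > L * sigma)[0]

rng = np.random.default_rng(1)
series = rng.normal(100.0, 2.0, size=300)
series[150] += 25.0                      # inject a gross outlier
alarms = residual_chart(series)
print(alarms)
```

    The point of the robust scale estimate is visible here: the outlier itself barely inflates the Qn-based control limits, so it is flagged, whereas a classical standard deviation would be dragged upward by the very point it is supposed to detect.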
  • Bayesian Approach to Assess the Probability of Detection of Defects in Nondestructive Evaluation

    Authors: Severine Demeyer (CEA LIST), Frederic Jenson (CEA LIST), Nicolas Dominguez (CEA LIST)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: propagation of uncertainty, design of experiments, Bayesian approach, Metamodelling, nondestructive testing, probability of detection
    Submitted at 13-Apr-2012 08:49 by Séverine Demeyer
    High costs of POD campaigns, combined with the increasing number of configurations needed to establish reliable POD curves, have led to the computation of numerical curves. Numerical approaches aim at delivering POD curves based on a model of the inspection process implemented in a numerical code. In this paper, the resulting approximation of POD curves is balanced against the promising insights offered by statistical simulation through numerical codes.

    Since a numerical code provides an output for each set of values assigned to the input variables, uncertainty about the input variables can easily be propagated through the code and taken into account in the resulting POD curve, making it more reliable. Conversely, the collection of output values combined with observed data can be used to update the knowledge of the uncertain variables used as inputs of the code. Combining these two reverse approaches in a Bayesian framework provides a new way to compute POD curves based on the management of uncertainty. Within this framework, quantitative tools are set up to study the contribution of uncertainty sources to the probability of detecting cracks.

    Such a Bayesian framework is fully described and implemented on a case study involving the modelling of an eddy current inspection of a titanium flat area with uncertain variables.
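
    The forward-propagation half of this framework can be sketched with a toy Monte Carlo: uncertain inputs are sampled, pushed through a stand-in inspection model, and POD(a) is the fraction of simulated signals above the detection threshold. The linear signal model, the lift-off variable, and all coefficients are illustrative assumptions, not the CEA LIST code.

```python
import numpy as np

rng = np.random.default_rng(2)

def pod_curve(crack_sizes, threshold=2.0, n_mc=20_000):
    """Monte Carlo POD: for each crack size a, sample the uncertain
    input variables, evaluate a stand-in signal model, and estimate
    P(signal > threshold)."""
    pod = []
    for a in crack_sizes:
        liftoff = rng.normal(0.0, 0.1, n_mc)      # uncertain input variable
        noise = rng.normal(0.0, 0.5, n_mc)        # measurement noise
        signal = 1.5 * a - 2.0 * liftoff + noise  # stand-in numerical code
        pod.append(np.mean(signal > threshold))
    return np.array(pod)

sizes = np.linspace(0.0, 3.0, 7)
pod = pod_curve(sizes)
print(np.round(pod, 3))
```

    The Bayesian half of the framework would then run in the reverse direction, using observed inspection data to update the distributions assigned to the uncertain inputs before re-propagating them.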
  • Reliability Analysis with Warranty Data

    Authors: Nikolaus Haselgruber (CIS consulting in industrial statistics)
    Primary area of focus / application: Reliability
    Keywords: Reliability, Warranty Analysis, Failure Rate, Incomplete Data
    Submitted at 13-Apr-2012 10:07 by Nikolaus Haselgruber
    Accepted
    11-Sep-2012 11:50 Reliability Analysis with Warranty Data
    Product reliability is a key characteristic of any complex technical product, in particular when the investment costs are considerable for the user and any unforeseen downtime jeopardizes profit or causes customer annoyance.
    Since in such cases repair actions are executed by a network of contracting partners of the OEM, a central warranty database typically collects all claims in a systematic way. It serves as the basis for the financial assessment and refunding of claim costs, and it provides important material for determining relevant reliability figures such as the failure rate, mean time between failures, lifetime distribution quantiles, etc. The main challenge is that not all the data required for an unbiased estimation of the reliability parameters are available in such a warranty database.
    This presentation introduces a new algorithm that considers, step by step, the requirements of a reliability analysis based on warranty data, with remedial measures where information is missing. Practical examples show successful applications in several industrial branches, as well as the potential for bias when relevant aspects such as incomplete, censored or truncated data are ignored.
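
    The censoring issue mentioned above can be sketched with a standard model: units still running at the end of the warranty window must enter the likelihood through their survival probability, otherwise the failure rate is overestimated. The Weibull model, the simulated fleet, and the warranty window below are illustrative assumptions, not the algorithm of the presentation.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_mle(failures, censored):
    """Weibull maximum likelihood with right-censored observations:
    `failures` are observed failure times, `censored` are running
    times of units that have not failed within the warranty window."""
    failures = np.asarray(failures, dtype=float)
    censored = np.asarray(censored, dtype=float)

    def neg_loglik(theta):
        k, lam = np.exp(theta)                # enforce positivity
        ll = np.sum(np.log(k / lam) + (k - 1) * np.log(failures / lam)
                    - (failures / lam) ** k)  # density of observed failures
        ll -= np.sum((censored / lam) ** k)   # survival of censored units
        return -ll

    res = minimize(neg_loglik, x0=[0.0, np.log(np.mean(failures))],
                   method="Nelder-Mead")
    return np.exp(res.x)                      # (shape k, scale lambda)

rng = np.random.default_rng(3)
true_k, true_lam = 1.8, 1000.0
t = true_lam * rng.weibull(true_k, size=500)  # simulated fleet lifetimes
warranty = 800.0                              # observation window
fail = t[t <= warranty]
cens = np.full(np.sum(t > warranty), warranty)
k_hat, lam_hat = weibull_mle(fail, cens)
print(round(k_hat, 2), round(lam_hat, 1))
```

    Dropping the censored term (i.e. fitting only the observed failures) would pull the scale estimate well below its true value, which is one instance of the bias potential the abstract warns about.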
  • Statistical Modelling of a Thermal Spraying Process with Additive Day-effects

    Authors: André Rehage (Technical University Dortmund), Sonja Kuhnt (Technical University Dortmund)
    Primary area of focus / application: Modelling
    Keywords: Thermal Spraying Process, Generalized Linear Models, Linked Generalized Linear Models, Day-effect
    Submitted at 13-Apr-2012 11:09 by André Rehage
    10-Sep-2012 16:45 Statistical Modelling of a Thermal Spraying Process with Additive Day-effects
    Thermal spraying processes are considered which apply a particle coating to a surface, e.g. for wear protection or durable medical instruments (Tillmann et al., 2010). The process often lacks reproducibility due to uncontrollable day-effects. As measurements of coating properties are time-consuming, adjustments of the process cannot easily be made. We aim to solve this problem with prediction models involving in-flight properties of the particles. Particles in flight are easier to measure than the coating properties, and at the same time it can be assumed that they carry relevant information on the day-effects.

    Generalized linear models are used to model the relationship between process parameters and particle properties as well as between particle properties and coating properties. Uncontrollable or latent variables are assumed to have a primarily additive effect on the in-flight particle properties. With only a few initializing experiments on the day of interest we estimate this additive constant (Rehage et al., 2012). Using the resulting estimate, we link the generalized linear models to obtain a more appropriate description of the thermal spraying process for the individual day.

    W. Tillmann, E. Vogli, B. Hussong, S. Kuhnt and N. Rudak: Relations between in-flight particle characteristics and coating properties by HVOF spraying, DVS-Berichte 264, ISBN: 978-3-87155-590-9, 2010.
    A. Rehage, N. Rudak, B. Hussong, S. Kuhnt and W. Tillmann: Prediction of in-flight particle properties in thermal spraying with additive day-effects, SFB 823 Discussion Paper 06/12, 2012.
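
    The linking idea can be sketched with two linear (Gaussian) stages: process parameters to particle properties, then particle properties to coating properties, with the day-effect estimated from a few initializing runs and inserted between the two. All coefficients and the size of the day shift are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stage 1: process parameter x -> in-flight particle property v
# Stage 2: particle property v -> coating property y
b0, b1 = 5.0, 2.0                 # stage-1 coefficients (assumed known)
c0, c1 = 1.0, 0.5                 # stage-2 coefficients (assumed known)

day_effect = 3.0                  # unknown additive day shift on v

# A few initializing experiments on the day of interest:
x_init = np.array([1.0, 2.0, 3.0])
v_init = b0 + b1 * x_init + day_effect + rng.normal(0.0, 0.1, 3)

# Estimate the additive constant as the mean stage-1 residual:
delta_hat = np.mean(v_init - (b0 + b1 * x_init))

def predict_coating(x):
    """Linked prediction for the day: stage 1 plus the estimated
    day offset, fed into stage 2."""
    v = b0 + b1 * x + delta_hat
    return c0 + c1 * v

print(round(delta_hat, 2), round(predict_coating(2.5), 2))
```

    Only the in-flight measurements need to be repeated on each day; the two fitted models are reused, which is what makes the correction cheap compared with re-measuring coating properties.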
  • Designing Choice Experiments by Optimizing the Complexity Level to Individual Abilities

    Authors: Vishva Danthurebandara (Catholic University of Leuven), Jie Yu (Catholic University of Leuven), Martina Vandebroek (Catholic University of Leuven)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Choice experiments, Individual Sequential Optimal designs, Complexity measures, Simulation study
    Submitted at 13-Apr-2012 11:33 by Martina Vandebroek
    Accepted
    10-Sep-2012 17:10 Designing Choice Experiments by Optimizing the Complexity Level to Individual Abilities
    It has been shown in psychology and behavioral decision theory that the complexity of the choice sets affects the consistency of the responses in choice experiments. A few studies in the discrete choice literature model this dependency explicitly when analyzing experimental results (for example, Swait and Adamowicz, 2001; DeShazo and Fermo, 2002; Sàndor and Franses, 2009). They fit a heteroscedastic choice model in which the error variance is modeled as a function of some complexity measures.
    We propose an individual sequential design approach for setting up choice experiments that takes the complexity of the choice sets explicitly into account. In a simulation study we compare this approach with some more standard designs used for estimating preference heterogeneity. We show that the sequential nature of our approach is very beneficial for estimating both the individual-level and the population-level parameters. This is due to the fact that the procedure selects the individual choice sets such that they have a constant medium level of complexity, which yields very informative choices.
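
    The heteroscedastic mechanism can be sketched as a logit whose Gumbel scale shrinks with complexity, so that more complex choice sets produce noisier, flatter choice probabilities. The exponential form of the scale function and its parameter are illustrative assumptions, not the specification of the cited papers.

```python
import numpy as np

def choice_probs(utilities, complexity, gamma=0.8):
    """Heteroscedastic logit: the Gumbel scale mu falls as the
    complexity of the choice set rises, so the error variance grows
    and the choice probabilities flatten toward uniform."""
    mu = np.exp(-gamma * complexity)     # scale decreasing in complexity
    v = mu * np.asarray(utilities, dtype=float)
    e = np.exp(v - v.max())              # numerically stable softmax
    return e / e.sum()

u = [1.0, 0.5, 0.0]                      # systematic utilities
easy = choice_probs(u, complexity=0.0)   # simple choice set
hard = choice_probs(u, complexity=3.0)   # complex choice set
print(np.round(easy, 3), np.round(hard, 3))
```

    This is why a constant medium level of complexity is informative: sets that are too complex yield near-uniform (uninformative) responses, while trivially simple sets reveal only the obvious preference ordering.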