ENBIS-11 in Coimbra

4 – 8 September 2011. Abstract submission: 1 January – 25 June 2011

My abstracts


The following abstracts have been accepted for this event:

  • On Assessing Performance of Sequential Procedures

    Authors: Ron S. Kenett (1), Moshe Pollak (2)
    Affiliation: (1) KPA Ltd., Raanana, Israel and University of Torino, Turin, Italy (2) The Hebrew University, Jerusalem, Israel
    Primary area of focus / application: Process
    Keywords: Average Run Length, Probability of False Alarm, Conditional Expected Delay, Statistical Process Control
    Submitted at 15-May-2011 17:48 by Ron Kenett
    Accepted
    7-Sep-2011 12:00 On Assessing Performance of Sequential Procedures
    Literature on statistical process control has focused mostly on the Average Run Length (ARL) to an alarm as a performance criterion of sequential schemes, with ARL0, the ARL to false alarm, as the in-control operating characteristic and the ARL from the occurrence of a change to its detection as the out-of-control operating characteristic. However, these indices do not tell the whole story, and at times they are not well summarized by a single number.

    When considering false alarms, a quantity of interest is P_in-control(T = n | T ≥ n), the probability that a stopping time T will raise a false alarm at time n, conditional on its not having raised a false alarm previously. This probability may depend on n, so the worst-case scenario is indexed by sup_{n<∞} P_in-control(T = n | T ≥ n), and one may want to keep this quantity low.
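    The conditional false-alarm probability above is easy to estimate by simulation. As a minimal sketch (not from the paper), take a one-sided Shewhart chart on i.i.d. N(0, 1) data with an illustrative threshold h = 2; for this memoryless chart the conditional probability is constant in n:

```python
import numpy as np

rng = np.random.default_rng(1)

def shewhart_run_length(h, n_max=2000):
    """One-sided Shewhart chart on in-control N(0,1) data:
    return the false-alarm time T = first n with X_n > h."""
    x = rng.standard_normal(n_max)
    hits = np.nonzero(x > h)[0]
    return hits[0] + 1 if hits.size else n_max + 1

h = 2.0
runs = np.array([shewhart_run_length(h) for _ in range(10_000)])

# Estimate P(T = n | T >= n) at a few time points n
for n in (1, 5, 20):
    p_n = np.mean(runs[runs >= n] == n)
    print(f"n={n:2d}  P(T=n | T>=n) ~ {p_n:.4f}")
# For this memoryless chart the conditional probability is constant,
# ~ P(X > 2) = 0.0228, so its sup over n equals 1/ARL0.
```

    For charts with memory (EWMA, CUSUM) the same estimator shows how the conditional false-alarm probability varies with n, which is exactly what the sup controls.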

    Often, the post-change ARL is characterized by ARL1, the ARL to detection assuming that the change is in effect at the very start. However, CED(τ) = E_τ[(T − τ)+ | T ≥ τ], the conditional expected delay of detection given that a false alarm has not been raised before the (unknown) time of change τ, may depend on τ, and there is no guarantee that CED as a function of τ is represented well by ARL1. If one has no anticipation of the time of change, sup_{τ<∞} CED(τ) would be an appropriate index (and often it is equal to ARL1). If one feels that there is a good chance that a change will be in effect right at the start, one may be interested in a fast initial response scheme, where CED(0) is made low at the expense of a higher CED at later τ, so that ARL1 does not tell the whole story. If one feels that a change is likely to take place in the distant future, then lim_{τ→∞} CED(τ) may be of interest.
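    To see how CED(τ) can depend on the change point, here is a hedged Monte Carlo sketch using a generic one-sided CUSUM chart; the chart parameters (k = 0.5, h = 4) and unit mean shift are illustrative assumptions, not the procedures studied in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def cusum_alarm_time(tau, delta=1.0, k=0.5, h=4.0, n_max=5000):
    """One-sided CUSUM S_n = max(0, S_{n-1} + X_n - k) on N(0,1) data,
    with a mean shift of size delta in effect from observation tau+1."""
    s = 0.0
    for n in range(1, n_max + 1):
        x = rng.standard_normal() + (delta if n > tau else 0.0)
        s = max(0.0, s + x - k)
        if s > h:
            return n
    return n_max

def ced(tau, reps=1000):
    """Monte Carlo CED(tau): average detection delay over runs that
    do not raise a false alarm before the change point tau."""
    delays = []
    while len(delays) < reps:
        t = cusum_alarm_time(tau)
        if t > tau:
            delays.append(t - tau)
    return float(np.mean(delays))

results = {tau: ced(tau) for tau in (0, 20, 100)}
for tau, v in results.items():
    print(f"tau={tau:3d}  CED(tau) ~ {v:.2f}")
# CED(0) corresponds to ARL1; for later change points the CUSUM statistic's
# head start (S_tau >= 0) typically gives a somewhat different delay.
```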

    We will review the role of various operating characteristics in assessing performance of sequential procedures in comparison to ARL0 and ARL1.
  • A Semi-infinite Programming Based Algorithm to Handle Minimax D-optimal Designs for Logistic Models

    Authors: Belmiro P.M. Duarte, Weng Kee Wong
    Affiliation: Department of Chemical and Biological Engineering, ISEC; Department of Biostatistics, UCLA School of Public Health
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Optimal designs, Minimax D-optimal, General Equivalence Theorem, Semi-infinite programming, Logistic model
    Submitted at 16-May-2011 15:21 by Belmiro Duarte
    Accepted
    6-Sep-2011 11:25 A Semi-infinite Programming Based Algorithm to Handle Minimax D-optimal Designs for Logistic Models
    Minimax optimal designs for regression models are gaining popularity and are increasingly used in different types of experiments in industry and biological sciences. There are iterative algorithms for generating many types of optimal designs but there is none for finding a minimax optimal design. Part of the reason is that the optimality criterion is not differentiable and so minimax designs are technically more difficult to find and study.

    In this work, we use a semi-infinite programming based algorithm and tools from the general algebraic modeling system (GAMS) to find minimax optimal designs for a variety of models. We show that our non-iterative procedure finds the optimal designs efficiently and that they coincide with the small number of theoretical minimax optimal designs reported in the literature. The implication is that our method seems to work and to be applicable to more complicated statistical models and problems. Several illustrative applications using logistic models for a binary response are provided.
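    As a toy illustration of the minimax idea (a brute-force grid search, emphatically not the semi-infinite programming algorithm of the abstract), consider equally weighted two-point designs for the two-parameter logistic model; the uncertainty set Theta below is invented for illustration, and maximizing the worst-case determinant stands in for the minimax D-criterion:

```python
import numpy as np
from itertools import combinations

def det_info(design, a, b):
    """det of the normalized Fisher information of an equally weighted
    two-point design for P(y=1|x) = 1/(1 + exp(-(a + b*x)))."""
    x1, x2 = design
    w = lambda x: np.exp(a + b * x) / (1 + np.exp(a + b * x)) ** 2
    # closed form for two equally weighted points:
    # det = 1/4 * w(x1) * w(x2) * (x1 - x2)^2
    return 0.25 * w(x1) * w(x2) * (x1 - x2) ** 2

grid = np.linspace(-4, 4, 161)            # candidate design points
pairs = list(combinations(grid, 2))

# Locally D-optimal design at nominal (a, b) = (0, 1):
# known support ~ +/-1.5434 (where the response probability is 0.176/0.824)
local = max(pairs, key=lambda d: det_info(d, 0.0, 1.0))
print("locally D-optimal support:", local)

# Crude minimax analogue: maximize the worst-case det over a parameter set
Theta = [(0.0, 0.8), (0.0, 1.0), (0.0, 1.25)]   # illustrative uncertainty set
minimax = max(pairs, key=lambda d: min(det_info(d, a, b) for a, b in Theta))
print("minimax support:", minimax)
```

    The grid search scales poorly and handles neither design weights nor the General Equivalence Theorem check, which is why a dedicated algorithm such as the one in this talk is needed in practice.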
  • Kernel estimator for the bias correction of a monitoring system

    Authors: Giuseppe Curcurù (Ph.D. in Logistics; g.curcuru@unipa.it), Alberto Lombardo (Full Professor in Statistics; alberto.lombardo@unipa.it)
    Affiliation: Dipartimento di Ingegneria Industriale – Università di Palermo
    Primary area of focus / application: Metrology & measurement systems analysis
    Keywords: Kernel estimator, monitoring system, degradation, transfer function
    Submitted at 19-May-2011 13:20 by Alberto Lombardo
    Accepted
    5-Sep-2011 16:30 Kernel estimator for the bias correction of a monitoring system
    Monitoring systems employed in manufacturing and service industries are based on the use of reliable dynamic ‘sensors’, able to translate, by means of a transfer function, an unobservable quantity of interest into an observable one. Although such sensors are reliable, their response can diverge more and more from the correct one over time. This may be caused by intrinsic physical degradation of the monitoring system or by particular operating environmental conditions. In this paper a method is proposed to detect this degradation process by using the distribution function of the observed measures. The methodology is based on the kernel estimator, which is found to work very well in the central region of the distribution but badly in the tails, where only a few observations are generally available.
    The proposed approach can be useful for monitoring the performance of a real monitoring system, in an attempt to extend its natural lifetime and/or to avoid underestimating the degradation process under study. Results are presented through some simulated examples.
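    A minimal sketch of the kind of kernel estimator involved (assuming a Gaussian kernel, a Silverman-style bandwidth and standard-normal toy data; none of these choices are taken from the paper) shows the centre-versus-tail behaviour:

```python
import numpy as np
from math import erf, sqrt

def kernel_cdf(x, data, h):
    """Gaussian-kernel estimate of the distribution function:
    F_hat(x) = (1/n) * sum_i Phi((x - X_i) / h)."""
    return float(np.mean([0.5 * (1 + erf((x - xi) / (h * sqrt(2)))) for xi in data]))

rng = np.random.default_rng(0)
data = rng.standard_normal(200)                  # toy sample
h = 1.06 * data.std() * len(data) ** (-1 / 5)    # Silverman-style bandwidth

true_cdf = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
for x in (0.0, 2.5):                             # central region vs. tail
    print(f"x={x}:  estimate {kernel_cdf(x, data, h):.3f}   true {true_cdf(x):.3f}")
# Near the centre the estimate tracks the true F closely; in the tails the
# relative error grows because few observations fall there.
```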
  • Calculating Run Length Quantiles with the R Library spc -- Go Beyond the Famous and Simple ARL

    Authors: Sven Knoth
    Affiliation: Department of Mathematics and Statistics, Helmut Schmidt University Hamburg, Germany
    Primary area of focus / application: Process
    Keywords: spc, control charts, run length quantiles, statistics software, numerical methods
    Submitted at 20-May-2011 15:50 by Sven Knoth
    Accepted
    7-Sep-2011 12:20 Calculating Run Length Quantiles with the R Library spc -- Go Beyond the Famous and Simple ARL
    While presenting a new version of the R library spc at ENBIS 10 in Antwerp, the need became apparent for functions that conveniently calculate quantiles of the control chart run length (= the number of observations until the chart signals). There are some papers with results on RL (run length) quantiles (especially, of course, the median). Here, the related algorithms -- known ones and new ones -- are packaged as simple functions in a new version of the R library spc. Thus, potential users of, e.g., xewma.arl(), which provides the ARL (the average ...) for several types of mean EWMA charts, may also try xewma.qrl(), which now offers (in the upcoming version 0.4.1 of the R library spc) the corresponding quantile functionality.
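    A rough Monte Carlo illustration of why RL quantiles matter (a hand-rolled EWMA simulation with illustrative parameters, not a use of the spc package itself):

```python
import numpy as np

rng = np.random.default_rng(3)

def ewma_run_length(lam=0.1, c=2.7, n_max=100_000):
    """Two-sided EWMA chart Z_n = (1-lam)*Z_{n-1} + lam*X_n on in-control
    N(0,1) data; signal when |Z_n| > c * sqrt(lam / (2 - lam))."""
    limit = c * np.sqrt(lam / (2 - lam))
    z = 0.0
    for n in range(1, n_max + 1):
        z = (1 - lam) * z + lam * rng.standard_normal()
        if abs(z) > limit:
            return n
    return n_max

rl = np.array([ewma_run_length() for _ in range(1000)])
print("ARL               ~", round(rl.mean(), 1))
print("median, q10, q90  ~", np.percentile(rl, [50, 10, 90]))
# The in-control RL distribution is strongly right-skewed: the median sits
# well below the ARL, which is the point of reporting quantiles as well.
```

    The spc package computes such quantiles by numerical methods rather than simulation, which is far faster and more accurate; the simulation only illustrates the shape of the RL distribution.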
  • Random Events in Safety Critical Maintenance: A Myth?

    Authors: Chris McCollin, Tony Sackfield
    Affiliation: Nottingham Trent University
    Primary area of focus / application: Reliability
    Keywords: Maintenance modelling, NHPP with Weibull intensity, Exponential distribution, Extended models
    Submitted at 24-May-2011 12:21 by Chris McCollin
    Accepted
    7-Sep-2011 12:40 Random Events in Safety Critical Maintenance: A Myth?
    We initially consider an extension to the mean value function of the Duane model, more commonly called the non-homogeneous Poisson process (NHPP) with Weibull intensity. The intractability of this model leads us to consider a simpler version, which provides some insight into the use of the exponential distribution in maintenance modelling. We transform the model to obtain graphical estimates of its parameters from data, and we provide analysis and interpretation for a well-known reliability data set. We then consider the time to first event of this process and determine the mean and variance of special cases of the associated probability density function. Finally, we show a graphical method for determining the parameters of a special case, with an application to a reliability data set, and we provide a possible physical interpretation.
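    For concreteness, here is a small simulation of the baseline NHPP with Weibull (power-law) intensity together with a Duane-style graphical slope estimate; the parameter values are invented for illustration, and this is the standard model, not the authors' extension:

```python
import numpy as np

rng = np.random.default_rng(11)

# Power-law (Duane / Crow-AMSAA) NHPP: intensity (beta/eta)*(t/eta)**(beta-1),
# mean value function m(t) = (t/eta)**beta; beta < 1 means reliability growth.
beta, eta, n_events = 0.7, 10.0, 500

# Inversion: if S_i are cumulative sums of Exp(1) variables, then
# T_i = eta * S_i**(1/beta) are event times of the NHPP (since m(T_i) = S_i).
s = np.cumsum(rng.exponential(size=n_events))
t = eta * s ** (1 / beta)

# Duane-style graphical estimate: log N(t) vs log t is roughly linear with
# slope beta (the first few events are dropped to reduce small-sample noise).
n = np.arange(1, n_events + 1)
slope, intercept = np.polyfit(np.log(t[9:]), np.log(n[9:]), 1)
print(f"fitted slope ~ {slope:.3f}   (true beta = {beta})")
```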
  • GWSDAT (GroundWater Spatiotemporal Data Analysis Tool)

    Authors: Wayne R. Jones (Wayne.W.Jones@shell.com)
    Affiliation: Shell Statistics & Chemometrics
    Primary area of focus / application: Business
    Keywords: Environmental monitoring, spatiotemporal modelling, application development, sampling, software
    Submitted at 24-May-2011 17:27 by Wayne Jones
    Accepted
    6-Sep-2011 12:50 GWSDAT (GroundWater Spatiotemporal Data Analysis Tool)
    The GroundWater Spatiotemporal Data Analysis Tool (GWSDAT) is a user-friendly geostatistical R (www.r-project.org) software application developed by Shell for the analysis and visualisation of trends in environmental (groundwater) monitoring data. GWSDAT has added value to the business by improving the risk-based decision making process.
    In this presentation we shall discuss the underlying statistical methods and demonstrate GWSDAT functionality which includes:

    • User friendly GUI (Graphical User Interface).

    • Time series, spatial and spatiotemporal trend detection.

    • Automatic report generation.