ENBIS-12 in Ljubljana

9 – 13 September 2012
Abstract submission: 15 January – 10 May 2012

My abstracts


The following abstracts have been accepted for this event:

  • Interpretation of Shewhart Control Charts: New Opportunities

    Authors: Yuri Adler (VEI), Olga Maksimova (VEI), Vladimir Shper (VEI)
    Primary area of focus / application: Education & Thinking
    Keywords: Shewhart control chart, special cause, interpretation rules, statistical thinking
    Submitted at 15-Apr-2012 21:08 by Vladimir Shper
    Accepted
    10-Sep-2012 12:30 Interpretation of Shewhart Control Charts: New Opportunities
    At ENBIS-10 in Antwerp we presented a new approach to the analysis of simple Shewhart control charts (SCC). The main idea of our approach is that a special cause of variation can change not only the values of the process parameters but the distribution law itself. This substantially broadens our opportunities to analyse real processes from the viewpoint of statistical thinking.
    This work shows that, within such a framework, the interpretation of signals on Shewhart charts can differ from the commonly used one. For example, the absence of points beyond the 3σ level need not indicate problems with rational subgrouping, as the literature usually explains it, but may instead be caused by a change of the distribution function (DF). Or, another example: a point beyond the control limits may signal not a change of the mean or standard deviation but, once again, a change of the DF. As a result, the technicalities of using SCC decrease and the gap between our models and reality diminishes as well. This means that SCC may lose some characteristics of exact mathematical models but can gain new operational opportunities.
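    To make the first example concrete, here is a minimal simulation (our illustration, not the authors' code): a special cause replaces a normal law by a uniform law with the same mean and standard deviation, and the chart shows no points beyond the 3σ limits even though the DF has changed.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Phase 1: stable process, N(mu=10, sigma=1).
    mu, sigma = 10.0, 1.0
    phase1 = rng.normal(mu, sigma, 100)

    # Phase 2: a special cause changes the distribution *law* to a uniform
    # with the same mean and standard deviation (half-width = sigma*sqrt(3)),
    # so neither the centre line nor sigma moves.
    half_width = sigma * np.sqrt(3)
    phase2 = rng.uniform(mu - half_width, mu + half_width, 100)

    data = np.concatenate([phase1, phase2])

    # 3-sigma limits for an individuals chart (known-parameters sketch).
    ucl, lcl = mu + 3 * sigma, mu - 3 * sigma

    beyond = np.sum((data > ucl) | (data < lcl))
    print(f"points beyond 3-sigma limits: {beyond}")
    # A uniform law cannot reach beyond ~1.73 sigma, so the chart stays
    # 'quiet' although the distribution function has changed.
    ```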
  • A Cointegration Approach to Monitoring Nonstationary Multivariate Processes

    Authors: Bart De Ketelaere (Katholieke Universiteit Leuven)
    Primary area of focus / application: Process
    Keywords: statistical process control, nonstationarity, multivariate, cointegration
    Submitted at 15-Apr-2012 21:46 by Bart De Ketelaere
    10-Sep-2012 12:10 A Cointegration Approach to Monitoring Nonstationary Multivariate Processes
    Statistical Process Control and the related control charts are a well-developed field in cases where stationarity of the process can be assumed. However, for nonstationary cases a unified framework is not at hand. In this contribution, we use the concept of cointegration as a way to handle nonstationarity in more than one process variable. We will briefly review the theory of cointegration and apply the concept to two different case studies. Strengths and weaknesses will be discussed.
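    As a rough illustration of the idea (a sketch on simulated data, not the authors' implementation), the Engle-Granger-style recipe below estimates a cointegrating relation between two nonstationary variables by ordinary least squares and then monitors the stationary residuals with an ordinary Shewhart chart:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Two nonstationary variables sharing one stochastic trend (a random walk).
    trend = np.cumsum(rng.normal(0, 1, n))
    x = 1.0 * trend + rng.normal(0, 0.5, n)
    y = 2.0 * trend + rng.normal(0, 0.5, n)  # cointegrated: y - 2x is stationary

    # Step 1: estimate the cointegrating relation y = a + b*x by OLS.
    X = np.column_stack([np.ones(n), x])
    a, b = np.linalg.lstsq(X, y, rcond=None)[0]

    # Step 2: under cointegration the residuals are stationary and can be
    # monitored with an ordinary Shewhart chart.
    resid = y - (a + b * x)
    cl, s = resid.mean(), resid.std(ddof=1)
    ucl, lcl = cl + 3 * s, cl - 3 * s
    alarms = np.where((resid > ucl) | (resid < lcl))[0]
    print(f"cointegrating slope: {b:.2f}, alarms: {len(alarms)}")
    ```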
  • When the Surgeon Does Not Cut Straight

    Authors: Bernard Francq (Université Catholique de Louvain)
    Primary area of focus / application: Consulting
    Keywords: accuracy resecting bone tumor, mixed model, parametric bootstrap, delta method
    Submitted at 15-Apr-2012 22:57 by Bernard Francq
    Accepted
    10-Sep-2012 17:20 When the Surgeon Does Not Cut Straight
    Resecting bone tumors within the pelvis requires good cutting accuracy to achieve satisfactory safe margins and improve reconstruction stability. Hand-controlled bone cutting can result in serious errors, especially due to the complex three-dimensional geometry, the limited visibility and the restricted working space of the pelvic bone. This experimental study in the ‘hôpital universitaire Saint-Luc’ (Belgium) investigates the bone-cutting accuracy and performance during navigated and non-navigated bone tumor resection within the pelvis.

    In this presentation, I will explain and focus on the statistical details by answering questions such as: How to measure the "quality" of a bone tumor resection? How to analyze the results using a mixed model with fixed, random and nested effects? And, the most important issue for surgeons: what is the probability (with a confidence interval around the estimate) of getting an error greater than X millimetres when resecting a bone tumor, for each technology (navigated or non-navigated), relative to the ‘target’ cut? This question is very simple to pose but less simple to resolve. We will then see the usefulness of the variance-covariance matrix of the variances of the random effects, the delta method and the parametric bootstrap in a mixed model.
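    A simplified sketch of the parametric-bootstrap step (all numbers are hypothetical, and the nested random-effects structure is collapsed into a single total variance for brevity; the real analysis refits the full mixed model):

    ```python
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(1)

    # Hypothetical estimates from a fitted mixed model: overall cutting bias
    # and variance components (e.g. surgeon and residual), in mm / mm^2.
    bias = 0.8
    var_surgeon = 0.6
    var_resid = 1.5
    threshold = 3.0  # clinically relevant error threshold, mm

    def p_exceed(mu, v):
        """P(|cutting error| > threshold) for a normal error N(mu, v)."""
        sd = sqrt(v)
        phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
        return 1 - (phi((threshold - mu) / sd) - phi((-threshold - mu) / sd))

    est = p_exceed(bias, var_surgeon + var_resid)

    # Parametric bootstrap: simulate cut errors from the fitted model,
    # re-estimate mean and variance, and recompute the probability.
    B, n = 2000, 60
    boot = []
    for _ in range(B):
        sim = rng.normal(bias, np.sqrt(var_surgeon + var_resid), n)
        boot.append(p_exceed(sim.mean(), sim.var(ddof=1)))
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"P(|error| > {threshold} mm) = {est:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    ```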
  • How to Accept the Equivalence of Two Measurement Methods?

    Authors: Bernard Francq (Université Catholique de Louvain), Bernadette Govaerts (Université Catholique de Louvain)
    Primary area of focus / application: Metrology & measurement systems analysis
    Keywords: measurement methods comparison, (correlated)-errors-in-variables-regressions, Bland and Altman's plot, tolerance intervals
    Submitted at 15-Apr-2012 23:00 by Bernard Francq
    10-Sep-2012 11:00 How to Accept the Equivalence of Two Measurement Methods?
    The need of industry to quickly assess the quality of products and the performance of manufacturing methods leads to the development and improvement of alternative analytical methods that are sometimes faster, easier to handle, less expensive or even more accurate than the reference method. These so-called alternative methods should ideally lead to results comparable, or equivalent, to those obtained by a standard method known as the reference.

    To compare two measurement methods, a certain characteristic of a sample can be measured by both methods across the experimental domain of interest. Firstly, to statistically test the equivalence of the measurement methods, the measurements given by the reference method and by the alternative one can be modelled by an errors-in-variables regression (a straight line). The estimated parameters are very useful for testing equivalence: an intercept significantly different from zero indicates a systematic analytical bias between the two methods, and a slope significantly different from one indicates a proportional bias. A joint confidence region for the intercept and slope can also be used to test the equivalence. Secondly, the differences between paired measurements can be plotted against their averages and analyzed to assess the degree of agreement between the two methods by computing the limits of agreement. This is the well-known and widely used Bland and Altman approach.

    We review and compare the two methodologies, the Bland and Altman approach and errors-in-variables regression, with simulations and real data. We then propose some improvements, such as adding a tolerance interval to the Bland and Altman approach, and show how to accept equivalence by specifying a practical difference threshold on the results given by the two measurement methods in the errors-in-variables setting. We will conclude by discussing whether it is more suitable to regress in an X-Y plot or to regress the differences on the averages, as in a Bland and Altman plot, when assessing the equivalence of measurement methods.
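    A small numerical sketch of both approaches on simulated paired measurements (our illustration, not the authors' code; the Deming errors-in-variables estimator below assumes an error-variance ratio of one):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Simulated paired measurements of the same samples by two methods:
    # each observes the true value with its own measurement error.
    true = rng.uniform(5, 15, 50)
    x = true + rng.normal(0, 0.4, 50)               # reference method
    y = 0.2 + 1.0 * true + rng.normal(0, 0.4, 50)   # alternative, small bias

    # --- Bland-Altman: limits of agreement on the differences ---
    d = y - x
    loa = (d.mean() - 1.96 * d.std(ddof=1), d.mean() + 1.96 * d.std(ddof=1))

    # --- Errors-in-variables (Deming) regression, variance ratio delta = 1 ---
    sxx, syy = x.var(ddof=1), y.var(ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    delta = 1.0
    slope = ((syy - delta * sxx
              + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2))
             / (2 * sxy))
    intercept = y.mean() - slope * x.mean()

    print(f"limits of agreement: [{loa[0]:.2f}, {loa[1]:.2f}]")
    print(f"Deming fit: intercept {intercept:.2f}, slope {slope:.2f}")
    # Informal equivalence check: intercept near 0 -> no systematic bias,
    # slope near 1 -> no proportional bias.
    ```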
  • Classification and Regression Trees in the Process Industry

    Authors: Stefanie Feiler (AICOS Technologies AG), Philippe Solot (AICOS Technologies AG)
    Primary area of focus / application: Mining
    Keywords: data mining, process industry, CART (Classification and Regression Trees), consulting experience, software usage
    Submitted at 15-Apr-2012 23:54 by Stefanie Feiler
    10-Sep-2012 15:15 Classification and Regression Trees in the Process Industry
    The CART (Classification and Regression Trees) methodology is a classical tool in data mining. The resulting decision trees are easily understandable, even by persons without a statistical background. Although the method is widely used in marketing, finance, and biological / medical settings, applications in manufacturing / industrial contexts are still sparse.

    Here we want to share our experience in this field. For instance, we have used Classification and Regression Trees to compare the performance of different reactors used in parallel in a chemical plant, and to improve a pharmaceutical production process. The related technique of Random Forests has been valuable for assessing parameter importance. Our examples show that the tool is indeed well suited to analysing process-industry data as well. As software tools, we either work with R or use the commercial CART/SPM of Salford Systems (the original developers of the method). We therefore briefly present both tools, discuss their differences, and address some important practical aspects (e.g. dealing with missing values and high-level categorical parameters).
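    A toy example (hypothetical data, our sketch rather than one of the case studies) of a shallow tree and a random forest on simulated process data, here using scikit-learn; note that, unlike the original CART, scikit-learn needs categorical parameters to be numerically encoded:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor, export_text
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(3)
    n = 400

    # Hypothetical process data: yield driven mainly by temperature and
    # by which of three parallel reactors produced the batch.
    temp = rng.uniform(60, 100, n)
    pressure = rng.uniform(1, 5, n)
    reactor = rng.integers(0, 3, n)  # reactors coded 0..2
    yield_ = 50 + 0.3 * temp - 2.0 * (reactor == 2) + rng.normal(0, 1, n)

    X = np.column_stack([temp, pressure, reactor])
    names = ["temp", "pressure", "reactor"]

    # A shallow regression tree: readable even without statistical training.
    tree = DecisionTreeRegressor(max_depth=2).fit(X, yield_)
    print(export_text(tree, feature_names=names))

    # Random forest for parameter-importance screening.
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, yield_)
    for name, imp in zip(names, rf.feature_importances_):
        print(f"{name}: {imp:.2f}")
    ```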
  • How to Market Statistics

    Authors: Winfried Theis (Shell Global Solutions BV)
    Primary area of focus / application: Consulting
    Keywords: marketing, communication, consultancy, promoting statistics
    Submitted at 16-Apr-2012 10:52 by Winfried Theis
    Accepted
    10-Sep-2012 17:20 How to Market Statistics
    At every stage of my career as a statistician I have experienced that it is crucial to promote statistics continuously and to invest time and effort in marketing our profession. That started at TU Dortmund with a campaign to attract more students to the programme, and it continued through all my later assignments. When I joined Shell a little more than a year ago, I discovered that our group was not well known among the colleagues from other groups I met in my first few weeks. So we started a campaign to market our group better and to push for more and better use of Statistics and Chemometrics. The lesson for the group was that sustaining and furthering the proper use of statistics requires constant engagement in outreach. In this presentation I want to show what we have done, but I also want to discuss with the audience what their experiences are and what we could do together to promote our work more easily.