ENBIS-16 in Sheffield

11 – 15 September 2016; Sheffield
Abstract submission: 20 March – 4 July 2016

My abstracts


The following abstracts have been accepted for this event:

  • Going Beyond Six Sigma - Global Data Insight & Analytics at the Ford Motor Company

    Authors: Dr. Thomas Hochkirchen (Ford Motor Company)
    Primary area of focus / application: Business
    Secondary area of focus / application: Six Sigma
    Keywords: Business decision support, Ford Motor Company, Whiz kids, Deming, Six Sigma
    Submitted at 7-Jun-2016 10:13 by Thomas Hochkirchen
    14-Sep-2016 10:50 Going Beyond Six Sigma - Global Data Insight & Analytics at the Ford Motor Company
    In this talk, developments within Ford Motor Company that are exciting for the statistics profession will be described: the creation of a new global skill team devoted to the systematic use of data and analytical methods to support better business decisions.

    Top management of an industrial corporation with a 113-year legacy has identified the value created by the systematic use of data and analytics and has sponsored the creation of a dedicated skill team, so that the use of data sits alongside other core skills such as "Manufacturing", "Product Development" and "Finance".

    We will describe these developments in more detail and show how the ground for them was paved within the corporate mindset – from the “Whiz Kids” of the 1940s via Deming to, of course, the large-scale application of Six Sigma.
  • Thinking about Data

    Authors: Roland Caulcutt (Caulcutt Associates)
    Primary area of focus / application: Education & Thinking
    Secondary area of focus / application: Consulting
    Keywords: Data, Thinking, Brain, Decisions
    Submitted at 12-Jun-2016 21:15 by Roland Caulcutt
    Accepted
    14-Sep-2016 10:10 Thinking about Data
    Statisticians think about data. Their clients also think about data.
    At one extreme we find professional statisticians, with many years of training and experience, who can advise clients what conclusions can reasonably be drawn from their data. At the other extreme we find clients who have such fear of statistics that they condense their data down to two numbers for ease of decision making.
    Clearly there must be a spectrum of analytical competence between these two extremes. Recent research shows us that people at the two extremes use different thinking processes.
    This interactive session will demonstrate these thinking processes and show you where you lie on this spectrum. You may be surprised.
  • Evolutionary Operation for Contemporary Processes

    Authors: Bart De Ketelaere (Catholic University of Leuven), Mohammad Rohail Taimour (Catholic University of Leuven), Peter Goos (Catholic University of Leuven)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Process
    Keywords: Design of Experiments, Sequential Improvement, EVOP, Power, Simulation study
    Submitted at 13-Jun-2016 11:16 by Bart De Ketelaere
    14-Sep-2016 09:20 Evolutionary Operation for Contemporary Processes
    Evolutionary Operation (EVOP) was first proposed by George Box in 1957 as a simple method for improving full-scale processes while they are running. Its rules allowed EVOP to be calculated by hand on a sheet of paper, so that process operators could implement the adapted settings of the process and evaluate the adapted process. This idea of sequentially perturbing the process and learning from it is both simple and effective, but it never became standard practice. The main reason is probably that EVOP was, until recently, never studied in detail for processes that are common in current practice: such processes are governed by a multitude of parameters, and their very short sampling intervals require fast, automatic computation.
    Rutten and co-workers (2014) recognized this gap and investigated EVOP schemes suitable for higher-dimensional processes and automatic execution. In essence, they proposed several rules for implementing EVOP in low and high dimensions and applied them in simulation studies as well as in real-life cases. One of the main open issues they discussed is the choice of base design for EVOP when the dimensionality prohibits the use of the standard full factorial design. In such cases, smaller base designs are required, but it is unclear what those base designs should look like to obtain an optimal EVOP scheme. Optimal here means that the process improvement is achieved with a minimal amount of experimental effort. Preliminary simulations performed by Rutten (2014) pointed in the direction of favoring low-power designs. The aim of this contribution is to refine those simulations and determine the optimal properties of base designs for EVOP, in low as well as high dimensions.
    The approach starts from a simple numerical simulation study in which a linear process needs to be optimized. We initially use an analytical approach to this simulation, yielding more accurate results than those obtained by Rutten (2014). The main conclusions from these studies will be discussed and can be summarized as follows: (1) base designs with low power outperform larger designs with higher power; (2) base designs for higher-dimensional processes call for higher power than those for lower-dimensional processes; and (3) replicating the base design when no direction of process improvement is found is advantageous for base designs with very low power, but not for base designs with higher power.
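The hand-calculation flavour of classical EVOP can be sketched in a few lines. The following toy simulation is not the authors' code: the process model, step size, and move rule are invented for illustration. It runs a 2^2 factorial around the current operating point, estimates the two main effects, and shifts the centre accordingly:

```python
import random
import statistics

random.seed(42)

def process_yield(x1, x2):
    """Synthetic full-scale process with its (unknown) optimum at
    (1.0, 0.5) and a small amount of run-to-run noise."""
    return 80 - (x1 - 1.0) ** 2 - 2 * (x2 - 0.5) ** 2 + random.gauss(0, 0.1)

def evop_cycle(center, step=0.2, reps=3):
    """One EVOP phase: run a 2^2 factorial around the current operating
    point, estimate the two main effects, and move the centre one step
    in the signed direction of each effect."""
    c1, c2 = center
    effects = [0.0, 0.0]
    for s1, s2 in [(-1, -1), (1, -1), (-1, 1), (1, 1)]:
        y = statistics.mean(process_yield(c1 + s1 * step, c2 + s2 * step)
                            for _ in range(reps))
        effects[0] += s1 * y / 2   # effect of x1: mean at (+) minus mean at (-)
        effects[1] += s2 * y / 2   # effect of x2
    return (c1 + step * (1 if effects[0] > 0 else -1),
            c2 + step * (1 if effects[1] > 0 else -1))

center = (0.0, 0.0)
for _ in range(10):
    center = evop_cycle(center)
# after ten cycles the operating point has drifted close to the optimum
```

Near the optimum the estimated effects are dominated by noise and the centre simply hovers around it; practical EVOP schemes add significance rules and replication to decide when to stop moving, which is exactly where the power of the base design enters.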
  • Quality Monitoring, Control and Improvement in Additive Manufacturing: Challenges and Opportunities

    Authors: Bianca Maria Colosimo (Politecnico di Milano)
    Primary area of focus / application: Quality
    Secondary area of focus / application: Process
    Keywords: Quality, Additive manufacturing, Statistical process monitoring, PCA, Cluster analysis
    Submitted at 14-Jun-2016 17:27 by Bianca Maria Colosimo
    14-Sep-2016 09:20 Quality Monitoring, Control and Improvement in Additive Manufacturing: Challenges and Opportunities
    Additive manufacturing (AM) has been attracting growing interest in different industrial sectors (aerospace, biomedical implants, machinery and tooling, creative industries), as it is one of the key technologies for producing complex, customized shapes. Despite the considerable improvements made in AM technology in recent years, process capability is still a major obstacle to its industrial breakthrough. As a matter of fact, different kinds of defects may originate during the process, and the quality of the part depends on hundreds of controllable parameters while also being affected by many nuisance factors. This talk illustrates the challenges and opportunities of applying statistical process monitoring, control and optimization to AM. In particular, a new approach for in-line statistical process monitoring, combining Principal Component Analysis (PCA) and cluster analysis, is described and applied to a real case study of metal additive manufacturing via selective laser melting.
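As a rough, stand-alone illustration of the monitoring idea (synthetic data, not the method of the talk), one can project simple per-region image features onto their first principal component and split the scores into two clusters, flagging the minority cluster as candidate defects:

```python
import math
import random
import statistics

random.seed(0)

# Toy stand-in for in-line features from a selective-laser-melting layer:
# two correlated intensity features per region, plus a few "hot-spot"
# regions (hypothetical data, for illustration only).
nominal = [(random.gauss(1.0, 0.1), random.gauss(2.0, 0.2)) for _ in range(50)]
defects = [(random.gauss(1.8, 0.1), random.gauss(3.5, 0.2)) for _ in range(5)]
data = nominal + defects

# PCA on two features: closed-form angle of the first principal axis of
# the 2x2 covariance matrix, tan(2*theta) = 2*sxy / (sxx - syy).
mx = statistics.mean(x for x, _ in data)
my = statistics.mean(y for _, y in data)
sxx = statistics.mean((x - mx) ** 2 for x, _ in data)
syy = statistics.mean((y - my) ** 2 for _, y in data)
sxy = statistics.mean((x - mx) * (y - my) for x, y in data)
theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
scores = [(x - mx) * math.cos(theta) + (y - my) * math.sin(theta)
          for x, y in data]

# 1-D two-means clustering of the PC scores; the minority cluster of
# regions is flagged as candidate defects.
c_lo, c_hi = min(scores), max(scores)
for _ in range(20):
    lo = [s for s in scores if abs(s - c_lo) <= abs(s - c_hi)]
    hi = [s for s in scores if abs(s - c_lo) > abs(s - c_hi)]
    c_lo, c_hi = statistics.mean(lo), statistics.mean(hi)
flagged = [i for i, s in enumerate(scores) if abs(s - c_lo) > abs(s - c_hi)]
```

In a real in-line setting the projection would be learned from in-control layers first and incoming regions scored against it; the sketch only shows why combining PCA with clustering separates defect signatures from nominal variation.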
  • How to Solve the Probabilistic Problem of Reliability by VMEA in an Aircraft Engine Component

    Authors: Sören Knuts (GKN Aerospace Sweden)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: Reliability, VMEA, Variation, Probabilistic, Safety factors, Aerospace
    Submitted at 15-Jun-2016 12:09 by Sören Knuts
    13-Sep-2016 16:00 How to Solve the Probabilistic Problem of Reliability by VMEA in an Aircraft Engine Component
    Understanding how variation affects reliability is an intriguing but necessary task in today’s optimization of new aircraft engines. The Monte Carlo method is a commonly requested approach for including variation when studying reliability. However, Monte Carlo simulation limits the analytical understanding of variation and of where to attack a design optimization problem to increase the robustness and reliability of the product. In this talk, an example is given of how VMEA (Variation Mode and Effects Analysis) has been used to estimate the reliability of an aircraft engine component. VMEA, or probabilistic VMEA, includes not only variation but also uncertainties, and can be regarded as an alternative formulation of safety factors.
    Note: GKN Aerospace is a Tier 1 supplier in the aerospace business with a large variety of components that are available on more than 90% of all new engines.
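A minimal numerical sketch of the VMEA idea, with invented sensitivities and standard deviations (not GKN data), combines the variation sources additively on the variance scale and reads off both a Pareto of variation and a reliability index:

```python
import math

# Hypothetical VMEA-style breakdown for the life margin of an engine
# component; every source name and number here is illustrative only.
# Each entry: (variation source, sensitivity dy/dx, std deviation of x).
sources = [
    ("load scatter",        1.0, 0.08),
    ("material strength",  -1.0, 0.10),
    ("geometry tolerance",  0.6, 0.05),
    ("model uncertainty",   1.0, 0.12),  # an uncertainty, not a pure variation
]

# VMEA combines the contributions additively on the variance scale.
var_total = sum((s * sd) ** 2 for _, s, sd in sources)
sd_total = math.sqrt(var_total)

# Share of each source in the total variance: a Pareto of variation
# showing where robustness effort would pay off most.
pareto = {name: (s * sd) ** 2 / var_total for name, s, sd in sources}

# Comparing a nominal margin m with the total dispersion gives a
# reliability index, an alternative reading of a safety factor.
m = 0.6
reliability_index = m / sd_total
```

Unlike a Monte Carlo run, this decomposition keeps each source's contribution visible, which is what makes it useful for deciding where to attack a design for robustness.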
  • Using Sensitivity Analysis for the Calibration of a Numerical Model for Concrete

    Authors: Alexios E. Tamparopoulos (University of Wolverhampton), Marco Marcon (Christian Doppler Laboratory, Institute of Structural Engineering, University of Natural Resources and Life Sciences, Vienna), Roman Wendner (Christian Doppler Laboratory, Institute of Structural Engineering, University of Natural Resources and Life Sciences, Vienna)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: Computer simulation, Design of Experiments, Model calibration, Dimensionality reduction
    Submitted at 18-Jun-2016 17:35 by Alexios Tamparopoulos
    We explore the calibration process for a computationally intensive model for concrete. The input parameters are 10 meso-scale properties that are not directly measurable, while the output is the response of the simulated concrete structural element. The challenge is to find the input parameter set that matches the observed responses of two standard tests, namely the three-point bending test and the cube compression test. Since a single model run can take up to several hours, a strategy for efficiently tuning the model is needed to attain this match. To achieve it, we construct a set of independent normalised measures for comparing stress-strain profiles, using the Fréchet distance to measure curve dissimilarity. The sensitivity of the output is measured by means of partial rank correlation coefficients. Latin hypercube sampling is used to obtain a small design space, while principal component analysis further reduces the dimensionality of the set of comparison measures. We show that the 10-dimensional input can be reduced to a three-dimensional one, which can lead to a dramatic decrease in the computational time of the desired simulation.
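The curve-dissimilarity ingredient can be illustrated with the discrete Fréchet distance (the Eiter–Mannila dynamic programme), here applied to two invented stress-strain-like polylines; this is a sketch, not the authors' implementation:

```python
import math
from functools import lru_cache

def discrete_frechet(p, q):
    """Discrete Fréchet distance between two polylines given as lists of
    2-D points: the smallest leash length needed when both curves are
    traversed monotonically, computed by dynamic programming."""
    @lru_cache(maxsize=None)
    def c(i, j):
        dij = math.dist(p[i], q[j])
        if i == 0 and j == 0:
            return dij
        if i == 0:
            return max(c(0, j - 1), dij)
        if j == 0:
            return max(c(i - 1, 0), dij)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), dij)
    return c(len(p) - 1, len(q) - 1)

# two stress-strain-like curves (illustrative points, not model output)
curve_a = [(0, 0), (1, 2), (2, 3), (3, 3)]
curve_b = [(0, 0), (1, 1), (2, 3), (3, 2.5)]
dist = discrete_frechet(curve_a, curve_b)
```

Because it respects the ordering of points along each curve, the Fréchet distance compares the shapes of two stress-strain profiles rather than just their pointwise differences, which is what a calibration objective needs.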