ENBIS-7 in Dortmund

24 – 26 September 2007

My abstracts

 

The following abstracts have been accepted for this event:

  • Implementation of a quality plan for the monitoring of blood treatment processes for the Belgian Red Cross

    Authors: A. Guillet, B. Govaerts, A. Benoit
    Primary area of focus / application:
    Submitted at 8-Sep-2007 19:00 by
    Accepted
    The Belgian Red Cross has to develop quality control procedures to monitor its blood treatment processes in order to comply with Belgian legislation. The project involves adapting statistical quality control techniques to the particular context of blood treatment: the non-normal distributions and the data censored by the measurement equipment for most of the products mean that the classical quality tools have to be adapted.
    First, we defined a sampling design for each measurement. We then carried out a brief descriptive analysis of the new data to check the adequacy of the plan. After several months, we created graphs to verify that the specification limits are respected, and we computed statistics on the collected data in order to compare the results between the three sites, to determine whether the processes are under control and to evaluate the client-provider risk. Some of the graphs are updated automatically every day, whereas the statistics and the remaining graphs are compiled monthly into a report.
    To achieve this, we had to implement the necessary tools in a statistical software package in such a way that every technician can use them. We therefore chose to have the technicians enter the data directly in software worksheets formatted for the different products. They also followed a short course tailored to their use of the software, so that they understand how to operate it, in particular the macros, how to read the outputs, and how to detect a problem in the quality of the products.
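
    As an illustration only (not the Red Cross implementation, and with invented numbers), the following Python sketch shows the kind of check involved: a skewed, left-censored product characteristic is monitored against a specification limit, with control limits taken from empirical percentiles rather than the usual mean +/- 3 sigma.

        import numpy as np

        rng = np.random.default_rng(1)
        # hypothetical measurements of one product at one site (skewed, lognormal-like)
        x = rng.lognormal(mean=1.0, sigma=0.4, size=200)
        detection_limit = 1.5   # values below this are only known to be < 1.5 (censored)
        upper_spec = 6.0        # specification limit to be respected

        censored = x < detection_limit
        observed = x[~censored]

        # share of out-of-spec units (censored values count as conforming here)
        p_out = np.mean(x > upper_spec)

        # control limits from the empirical 0.135% / 99.865% percentiles instead of
        # mean +/- 3 sigma, since the distribution is clearly not normal
        lcl, ucl = np.percentile(observed, [0.135, 99.865])

        print(f"censored share:    {censored.mean():.1%}")
        print(f"out-of-spec share: {p_out:.1%}")
        print(f"empirical limits:  [{lcl:.2f}, {ucl:.2f}]")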
  • Mixing Krigings for Global Optimization

    Authors: D. Ginsbourger, C. Helbert, L. Carraro (Ecole des Mines de Saint-Etienne)
    Primary area of focus / application:
    Submitted at 8-Sep-2007 20:49 by David Ginsbourger
    Accepted
    Over the last 5-10 years, numerical simulations of stochastic and deterministic systems have become more accurate, but also more time-consuming. This makes it impossible to study simulators exhaustively, especially when the number of parameters grows large. Consequently, Computer Experiments is an expanding field of study; application areas include crash-test studies, reservoir forecasting, nuclear criticality, etc. We focus here on surrogate-based global optimization techniques for this kind of complex model.

    Our starting point is the EGO (Efficient Global Optimization) algorithm, which is based on a kriging metamodel. In a first part, we recall in detail how kriging allows sequential exploration strategies dedicated to global optimization to be built (the expected improvement criterion behind EGO is sketched at the end of this abstract). We point out some problems of kernel selection that are often skipped over in the literature on kriging-based optimization.

    In a second part, we introduce an extension of the EGO algorithm based on a mixture of kriging models (MKGO). We emphasize how a mixture of kernels can lower the risk of misspecifying the kernel structure and its hyperparameters, especially when the estimation sample is small. The proposed approach is illustrated with classical deterministic functions (Branin-Hoo, Goldstein-Price) and compared with existing results.

    We also present a study carried out on Gaussian processes and observe the relations between the quality of covariance estimation and the performance obtained in kriging-based optimization. We finally give a Bayesian interpretation of MKGO and discuss the ins and outs of choosing the prior distribution of the covariance hyperparameters.

    This work was conducted within the framework of the DICE (Deep Inside Computer Experiments) Consortium between ARMINES, Renault, EDF, IRSN, ONERA and TOTAL S.A.
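
    A minimal sketch (illustrative only, not the authors' code; Python with scikit-learn) of the expected improvement criterion that drives EGO, here for a single kriging model; the MKGO extension discussed above would combine such criteria over a mixture of kernels.

        import numpy as np
        from scipy.stats import norm
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        def branin(x):
            # classical Branin-Hoo test function on [-5, 10] x [0, 15]
            x1, x2 = x[..., 0], x[..., 1]
            return ((x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5 / np.pi * x1 - 6) ** 2
                    + 10 * (1 - 1 / (8 * np.pi)) * np.cos(x1) + 10)

        rng = np.random.default_rng(0)
        X = rng.uniform([-5, 0], [10, 15], size=(12, 2))   # small initial design
        y = branin(X)

        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)

        def expected_improvement(x_new):
            # EI(x) = (f_min - mu) Phi(z) + s phi(z), with z = (f_min - mu) / s
            mu, sd = gp.predict(np.atleast_2d(x_new), return_std=True)
            f_min = y.min()
            z = (f_min - mu) / np.maximum(sd, 1e-12)
            return (f_min - mu) * norm.cdf(z) + sd * norm.pdf(z)

        # evaluate EI on a coarse grid and pick the most promising next point
        grid = np.stack(np.meshgrid(np.linspace(-5, 10, 50),
                                    np.linspace(0, 15, 50)), axis=-1).reshape(-1, 2)
        print("next point suggested by EI:", grid[np.argmax(expected_improvement(grid))])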
  • Metamodels from Computer Experiments

    Authors: J.J.M. Rijpkema
    Primary area of focus / application:
    Submitted at 9-Sep-2007 14:40 by
    Accepted
    In engineering optimization, a direct coupling between analysis models and optimization routines may be very inefficient, as a large number of iterative calls to possibly time-consuming analysis models may be necessary during optimization. In those situations it is preferable to decouple analysis and optimization through the use of so-called metamodels or surrogate models: fast-to-evaluate approximations of the objective and constraint functions.

    For the construction of metamodels there are a number of approaches available, such as Response Surface Methods (RSM), Kriging, Smoothing Splines and Neural Networks. They all estimate the response for a specific design on the basis of information from the full analysis of a limited number of training designs. However, they differ with respect to their underlying conceptual ideas, the calculation effort needed for training and the applicability to specific situations, such as large-scale optimization problems or analysis models based on numerical simulations.

    In this presentation I will focus on two approaches to metamodelling, namely RSM and Kriging. I will review and compare key concepts and present efficient experimental design strategies for training the models. Furthermore, I will discuss ways to enhance the model building process by taking information on design sensitivities into account. This may reduce the number of full model analyses needed for model training and estimation. It can be very effective, especially in situations where design sensitivities are readily available, as is the case for analysis models based on the Finite Element Method. To illustrate the use of RSM and Kriging, results from a numerical model study will be presented; these shed some light on the strengths and weaknesses of both approaches in practical applications.

    Specifics: Related to the field of approximation models, computer simulation and engineering optimization. The preferred form of presentation is an oral presentation of about 20 minutes (including discussion).
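
    As an illustration of the comparison (a sketch with an invented stand-in for the analysis model, not the numerical model study of the talk), both metamodels can be trained on the same small design and assessed on held-out points:

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_model(x):
            # cheap stand-in for a time-consuming analysis model (e.g. an FE run)
            return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

        rng = np.random.default_rng(2)
        X_train = rng.uniform(-1, 1, size=(20, 2))   # small training design
        y_train = expensive_model(X_train)
        X_test = rng.uniform(-1, 1, size=(500, 2))   # validation points
        y_test = expensive_model(X_test)

        # RSM: a second-order polynomial response surface fitted by least squares
        rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
        rsm.fit(X_train, y_train)

        # Kriging: a Gaussian process interpolator with an RBF covariance
        krig = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
        krig.fit(X_train, y_train)

        for name, model in [("RSM (quadratic)", rsm), ("Kriging", krig)]:
            rmse = np.sqrt(np.mean((model.predict(X_test) - y_test) ** 2))
            print(f"{name:16s} RMSE on held-out points: {rmse:.3f}")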
  • Variability of Electromagnetic Emissions

    Authors: U. Kappel and J. Kunert
    Primary area of focus / application:
    Submitted at 9-Sep-2007 15:51 by
    Accepted
    Modern cars are equipped with an increasing number of electrical and electronic devices. The risk of electromagnetic interference therefore increases as well, and with it the importance of assuring electromagnetic compatibility (EMC). In addition, different cars, even of the same model, are increasingly equipped individually, which adds further complexity to EMC management.

    The presentation discusses the results of a small study on the relative importance of several factors that might influence the electromagnetic emissions of a car subsystem consisting of a video, an audio, and a cell phone component. A fourth factor of interest was the design of the wiring harness connecting these components. All four factors were considered at 4 levels each. For the three components, the levels were chosen such that we would expect progressively fewer problems: no grounding of the component's case at all, poor grounding, good grounding and, as the fourth level, absence of the component. For the harness, we chose the four levels by selecting varying distances between the individual wires and circuits. The experiment was run as a fractional factorial design with 16 runs.

    The response was the average excess of the measured radiated emissions over the emissions limit, where the average was taken over all frequencies of interest.

    Between any two of the 16 runs, the wiring was completely redone, even if the next run had the same level for the harness factor. This gives a measure of the variability caused by rebuilding the wiring, even when we try to rebuild the system in a completely identical way. To check the size of the pure measurement error, each run was measured 5 times, without any changes between the measurements.

    The results seem to indicate that the pure measurement error was relatively small, while the variability caused by rebuilding the wiring was substantial. Compared to this "rebuilding error", the effects of the four levels of the video component, of the audio component and of the planned variation of the wiring itself were negligible. However, we could show that no grounding or poor grounding of the cell phone component increased the average electromagnetic emissions significantly compared to the other two levels.

    These findings could be reproduced in a confirmation experiment.
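
    A minimal sketch (simulated numbers, not the study's measurements, and ignoring the factor effects for simplicity) of how the two error sources can be separated: the spread within each run's 5 repeated measurements estimates the pure measurement error, while the spread of the 16 run means reflects the rebuilding variability.

        import numpy as np

        rng = np.random.default_rng(3)
        n_runs, n_reps = 16, 5
        sigma_rebuild, sigma_meas = 1.0, 0.2      # assumed standard deviations

        run_effect = rng.normal(0.0, sigma_rebuild, size=n_runs)   # rebuilding error per run
        y = run_effect[:, None] + rng.normal(0.0, sigma_meas, size=(n_runs, n_reps))

        # within-run variance -> pure measurement error
        var_meas = y.var(axis=1, ddof=1).mean()
        # variance of run means = rebuilding variance + measurement variance / n_reps
        var_rebuild = y.mean(axis=1).var(ddof=1) - var_meas / n_reps

        print(f"estimated measurement sd: {np.sqrt(var_meas):.2f}")
        print(f"estimated rebuilding sd:  {np.sqrt(max(var_rebuild, 0)):.2f}")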
  • Accounting for Systematic Effects in Metrology and Testing, Namely in Comparisons

    Authors: Franco Pavese (National Institute for Research in Metrology, Torino, Italy)
    Primary area of focus / application:
    Submitted at 9-Sep-2007 17:07 by
    Accepted
    Replication of measurements and the combination of observations are standard and essential practices in metrology. A metrological (or testing) process of evaluating the uncertainty of the measurement results consists, in each laboratory, of basically three steps, which use different methods to fulfil distinct purposes [1]:
    (a) when performed on the same standard, to obtain the statistical features of the observations that allow the repeatability of the value of the standard to be assessed;
    (b) when performed on the same standard, to obtain a measure of the effect on the total uncertainty of the variability of the influence parameters affecting the standard, including dependence on time, i.e. to assess the reproducibility of the value of the standard;
    (c) when performed on several standards of the laboratory, to check whether they have the same value or to establish the differences between their values, and to evaluate the associated uncertainty, i.e. to evaluate the accuracy of the values of the laboratory standards. This exercise can be called an intra-laboratory comparison.

    When the exercise is performed for purpose (c) by comparing one (or more) standards provided by different laboratories, it is called an inter-laboratory comparison. Past experience suggests that one should assume, as a priori knowledge, that the comparisons are performed to detect bias.

    Bias originates from the influence quantities, whose variability can show a non-zero mean and is also the source of the generally higher uncertainty obtained under ‘reproducibility conditions’ compared with ‘repeatability conditions’.

    The paper first briefly recalls the basic terms used in several written standards and international documents, which are not always fully consistent with each other and also show some evolution of the concepts over the past decade, with the consequent risk of confusion arising from the fact that not everybody is talking about the same things when they are assumed to be. It then compares several data models and discusses their merits in taking (or not taking) into account the systematic effects, which are the prevailing cause of systematic errors in most metrology and testing measurements.


    [1] Pavese F and Filipe E 2006 Some metrological considerations about replicated measurements on standards Metrologia 43 419–425
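
    For illustration only (invented values, not from the paper), the quantities involved can be sketched numerically: the repeatability standard deviation from unchanged conditions, the larger reproducibility standard deviation when influence quantities vary, and a possible laboratory bias estimated from a comparison of the same standard between two laboratories.

        import numpy as np

        rng = np.random.default_rng(4)

        # (a) repeated measurements of one standard under repeatability conditions
        repeat = rng.normal(100.000, 0.002, size=10)
        # (b) the same standard measured over months, with influence quantities varying
        reprod = rng.normal(100.000, 0.006, size=10)
        # inter-laboratory case: the same travelling standard measured elsewhere, with a bias
        other_lab = rng.normal(100.005, 0.006, size=10)

        s_r = repeat.std(ddof=1)   # repeatability standard deviation
        s_R = reprod.std(ddof=1)   # reproducibility standard deviation (usually larger)

        diff = other_lab.mean() - reprod.mean()
        u_diff = np.sqrt(other_lab.var(ddof=1) / 10 + reprod.var(ddof=1) / 10)

        print(f"s_r = {s_r:.4f}, s_R = {s_R:.4f}")
        print(f"inter-laboratory difference = {diff:.4f} +/- {2 * u_diff:.4f} (k = 2)")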
  • How engineers learn statistics from motorcycle tires!

    Authors: J.J.M. Rijpkema (Eindhoven University of Technology, The Netherlands)
    Primary area of focus / application:
    Submitted at 10-Sep-2007 01:03 by
    Accepted
    Design Based Learning focuses on an integrated approach to problem solving and engineering design, in which students are stimulated to apply concepts and insights gained from mono-disciplinary courses in a multidisciplinary way. Its aim is to expand students' engineering competencies while they work on authentic problems using up-to-date CAE tools. Design Based Learning offers opportunities to enhance students' awareness of situations where Engineering Statistics can be relevant, since, for example, the design of experiments or the modeling and analysis of data may be part of the solution strategy.

    In this presentation I will explain the ideas behind Design Based Learning and its implications for our teaching of Engineering Statistics. I will illustrate this with details from a design-based project we run for first-year students in Mechanical Engineering at the Eindhoven University of Technology in the Netherlands. In this project, tire characteristics that are crucial for the safe road handling of a motorcycle have to be determined experimentally on a full-scale experimental setup. I will present experiences from students and lessons learned. Finally, I will discuss ways to enhance engineers' statistical competencies through the use of design-based projects in industry.

    Specifics: Related to the field of statistics education for engineers. The preferred form of presentation is an oral presentation of about 20 minutes (including discussion).