ENBIS-18 in Nancy

2 – 5 September 2018; Ecole des Mines, Nancy (France). Abstract submission: 20 December 2017 – 4 June 2018

My abstracts

The following abstracts have been accepted for this event:

  • Designs for Characterization of Rare Events of Monotone Models

    Authors: Rodrigo Cabral Farias (I3S - CNRS/UNS), Elyes Ouerghi (I3S - CNRS/UNS), Luc Pronzato (I3S - CNRS/UNS), Maria Joao Rendas (I3S - CNRS/UNS)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Exceedance probability, Quantile estimation, Monotone functions, Experimental design
    Submitted at 27-Apr-2018 14:57 by Maria Joao Rendas
    Accepted
    4-Sep-2018 09:40 Designs for Characterization of Rare Events of Monotone Models
    Characterisation of rare events is a frequent goal in many distinct domains, including industrial design and environmental studies. Often, the rare event of interest is of the exceedance type, i.e. it corresponds to a function f(·) of a number of exogenous factors exceeding a given threshold η. Several authors have recently addressed the two related problems of estimating a probability of exceedance and of identifying the configurations of exogenous factors that lead to the exceedance event of interest. These problems are particularly difficult when evaluation of the function for each input configuration is computationally costly. Efficient recent methods that resort to surrogate models based on a Gaussian Process (GP) assumption, e.g. [1,2], are especially tailored to this situation.

    In some circumstances, even if the function is not exactly known, qualitative knowledge about its dependency on the input factors exists. In this presentation we exploit the knowledge that f(·) is monotone in its arguments to address the following two related problems: (i) determining the probability α that f(·) exceeds a given level η when upper and lower bounds for α are known; (ii) finding the input configurations and threshold η that map to the α-quantile of f(·).

    Monotonicity assumptions have been explored before, in a similar context, in [3]. Our work differs in that we show that, by exploiting knowledge of the probability distribution of the input factors of f(·) and of the possible range for α, we can restrict the required (costly) evaluations of the function to a small subset of the entire input space. This is important not only for reducing the (large) numerical complexity of state-of-the-art methods based on surrogate models, but also (and this is probably equally important) because it increases overall robustness to the stationarity assumption that underlies these techniques. We discuss the subsequent application of the adaptive method presented in [1] to the identified relevant subset, in particular concerning the choice of an initial design. (A schematic sketch of the monotonicity-based classification step is given after the bibliography below.)
    The results are illustrated numerically on the estimation of a probability of flooding using a hydrodynamic model, showing how the approach impacts overall efficiency and accuracy.

    Bibliography
    [1] J. Bect, L. Li, E. Vazquez, "Bayesian subset simulation", SIAM/ASA Journal on Uncertainty Quantification, 5(1), pp. 762-786, 2017.
    [2] C. Chevalier, J. Bect, D. Ginsbourger, E. Vazquez, V. Picheny, Y. Richet, "Fast parallel kriging-based stepwise uncertainty reduction with application to the identification of an excursion set", Technometrics, 56(4), pp. 455-465, 2014.
    [3] T. Labopin-Richard, V. Picheny, "Sequential design of experiments for estimating quantiles of black-box functions", Statistica Sinica, 2017.
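
    To make the role of monotonicity concrete, here is a minimal Python sketch of the idea (not the authors' algorithm; the sample, the threshold and the two "already evaluated" points are invented for illustration). For a function f increasing in every argument, evaluated points classify, by componentwise dominance, large parts of a Monte Carlo sample drawn from the input distribution, so that costly evaluations are only needed for the undetermined points and bounds on α come for free:

        import numpy as np

        def classify_by_monotonicity(X, X_low, X_high):
            # For f increasing in every argument:
            #   x >= x' componentwise and f(x') >= eta  =>  f(x) >= eta
            #   x <= x' componentwise and f(x') <  eta  =>  f(x) <  eta
            # Returns +1 (exceeds), -1 (does not exceed), 0 (undetermined).
            labels = np.zeros(len(X), dtype=int)
            for i, x in enumerate(X):
                if len(X_high) and np.any(np.all(x >= X_high, axis=1)):
                    labels[i] = +1
                elif len(X_low) and np.any(np.all(x <= X_low, axis=1)):
                    labels[i] = -1
            return labels

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(10000, 2))     # sample from the input distribution
        X_high = np.array([[0.8, 0.7]])      # evaluated points where f >= eta
        X_low = np.array([[0.3, 0.4]])       # evaluated points where f < eta
        lab = classify_by_monotonicity(X, X_low, X_high)
        alpha_lo = np.mean(lab == +1)        # lower bound on alpha
        alpha_up = 1.0 - np.mean(lab == -1)  # upper bound on alpha
        print(f"{alpha_lo:.3f} <= alpha <= {alpha_up:.3f}")

    Only the points with label 0 (neither dominated nor dominating) would then need to be passed to a surrogate-based adaptive procedure such as [1].
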
  • How SPM Can Help Practitioner when Her/His Process Is Unstable

    Authors: Vladimir Shper (Moscow Institute of Steel & Alloys), Yuri Adler (Moscow Institute of Steel & Alloys), Irina Zubkova (Moscow Institute of Steel & Alloys)
    Primary area of focus / application: Process
    Secondary area of focus / application: Consulting
    Keywords: SPM, Control, Charts, Unstable, processes
    Submitted at 27-Apr-2018 15:10 by Vladimir Shper
    Accepted
    4-Sep-2018 09:40 How SPM Can Help Practitioner when Her/His Process Is Unstable
    As a rule, statisticians in their books, papers and presentations on Statistical Process Monitoring (SPM) discuss how to find out whether a process is stable and, once stability has been achieved, how to monitor its future behaviour. There is a well-known statistical tool for this purpose, the control chart, and the discussion usually revolves around which chart to use, and how, in order to extract the maximum from the data under study. Here we discuss a different situation: the process is obviously unstable. Is it useful to construct any charts, plots or diagrams at all, and if so, which ones are most helpful?

    Drawing on a large amount of data from unstable processes, we give recommendations on the procedure for analysing such processes, on which charts to use and in what sequence, and on how to interpret the results. Since unstable processes are far more numerous than stable ones, and we encounter such data very often when working with practitioners in many areas of activity, we are confident this information will be useful to many of them. (A minimal sketch of the basic charting tool referred to above follows this abstract.)
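
    As a purely illustrative sketch of the basic tool mentioned above (the textbook individuals chart, not the authors' recommended procedure; the data are simulated), here is an individuals (XmR) chart applied to a deliberately unstable series with a mid-stream mean shift:

        import numpy as np

        def xmr_limits(x):
            # Individuals-chart limits from the average moving range;
            # d2 = 1.128 is the standard constant for moving ranges of size 2.
            x = np.asarray(x, dtype=float)
            sigma_hat = np.abs(np.diff(x)).mean() / 1.128
            center = x.mean()
            return center - 3 * sigma_hat, center, center + 3 * sigma_hat

        rng = np.random.default_rng(1)
        # an unstable process: the mean shifts halfway through the series
        x = np.concatenate([rng.normal(10, 1, 50), rng.normal(13, 1, 50)])
        lcl, cl, ucl = xmr_limits(x)
        out = np.where((x < lcl) | (x > ucl))[0]
        print(f"LCL={lcl:.2f}, CL={cl:.2f}, UCL={ucl:.2f}; signals at {out}")
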
  • Functional Regression Control Chart for Ship CO2 Emission Monitoring

    Authors: Fabio Centofanti (University of Naples Federico II), Antonio Lepore (University of Naples Federico II), Alessandra Menafoglio (MOX - Politecnico di Milano), Biagio Palumbo (University of Naples Federico II), Simone Vantini (MOX - Politecnico di Milano)
    Primary area of focus / application: Process
    Secondary area of focus / application: Modelling
    Keywords: Functional data analysis, Functional linear regression, Profile monitoring, Statistical Process Control, CO₂ emission monitoring
    Submitted at 27-Apr-2018 15:13 by Antonio Lepore
    Accepted
    5-Sep-2018 09:00 Functional Regression Control Chart for Ship CO2 Emission Monitoring
    In many industrial processes, the quality characteristic is apt to be modelled as a function defined on a compact domain, usually referred to as a profile. Furthermore, because of the rapid development of data acquisition systems, it is not unusual for measurements of other functional and scalar covariates to be available as well.
    In this context, profile monitoring methods can be formulated to use the additional information in these covariates as well as that in the quality characteristic.

    To this end, we propose a new functional control chart that combines the information from all available measurements, with the aim of improving monitoring performance.

    The rationale behind the proposed control chart is to monitor the residuals of a function-on-function linear regression of the quality characteristic on the covariates.

    This allows monitoring the quality characteristic adjusted for the effect of the covariates and, thus, taking into account their explained variance.

    A Monte Carlo simulation study is presented to quantify the performance of the proposed control chart in detecting mean shifts in the quality characteristic, through a comparison with other functional control charts proposed in the literature.

    A real case study in the shipping industry is finally presented, with the purpose of monitoring CO₂ emissions from a Ro-Pax ship owned by the shipping company Grimaldi Group and, in particular, of detecting the CO₂ emission reductions achieved after a specific energy efficiency initiative (EII) was carried out. (A minimal residual-monitoring sketch follows this abstract.)
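
    A minimal, purely illustrative sketch of the residual-monitoring idea on simulated profiles, using a crude grid discretization where the paper would use basis expansions, and an empirical Phase I control limit:

        import numpy as np

        rng = np.random.default_rng(2)
        n, p = 100, 50                    # n profiles on a common p-point grid
        t = np.linspace(0, 1, p)
        X = rng.normal(size=(n, p)).cumsum(axis=1) / np.sqrt(p)  # functional covariate
        beta = np.outer(np.sin(np.pi * t), np.cos(np.pi * t))    # coefficient surface
        Y = X @ beta / p + 0.1 * rng.normal(size=(n, p))         # quality characteristic

        # function-on-function least squares on the grid
        B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
        E = Y - X @ B_hat                 # residual profiles, covariate-adjusted

        # chart the size of each residual profile against an empirical limit
        T = (E**2).mean(axis=1)
        ucl = np.quantile(T, 0.99)
        print("out-of-control profiles:", np.where(T > ucl)[0])
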
  • Comparison of Methods for Summarizing and Simulating Non-Standard Distributions

    Authors: Jody Muelaner (The University of Bath), Kavya Jagan (The National Physical Laboratory)
    Primary area of focus / application: Metrology & measurement systems analysis
    Secondary area of focus / application: Quality
    Keywords: Monte Carlo, Pearson distribution, Johnson distribution, Non-standard distributions, Uncertainty of measurement
    Submitted at 27-Apr-2018 23:00 by Jody Muelaner
    Accepted
    5-Sep-2018 11:10 Comparison of Methods for Summarizing and Simulating Non-Standard Distributions
    When simulating uncertainties within non-linear models, the output distributions are often non-Gaussian and do not conform to any standard distribution. Quantiles can be determined directly from the simulation results. However, a method of concisely describing the output distribution remains desirable so that, at a later date, quantiles for different probabilities may be determined or the complete sample may be regenerated as an input to some further simulation. A method which allows extrapolation to probability levels not feasible by numerical simulation would also be desirable. We compare a number of methods for summarizing non-standard distributions and for generating random numbers from these summaries. The methods compared include: (i) a Pearson distribution; (ii) a Johnson distribution; (iii) distribution fitting based on entropy; and (iv) fitting splines to quantiles. A number of non-linear measurement models are used to simulate uncertainties with non-standard distributions, and the accuracy of each method in recreating these distributions is then evaluated in terms of quantile determination. (A minimal sketch of two of these summaries follows this abstract.)
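
    A minimal, purely illustrative sketch of two of the four summaries (the measurement model and sample size are invented): a Johnson SU fit and a spline through empirical quantiles, each compressing a 20,000-point simulation into a few numbers from which quantiles can be re-read later:

        import numpy as np
        from scipy import stats, interpolate

        rng = np.random.default_rng(3)
        # non-linear measurement model y = x1 * exp(x2): a skewed, non-standard output
        y = rng.normal(1.0, 0.2, 20000) * np.exp(rng.normal(0.0, 0.5, 20000))

        # (ii) Johnson SU: four fitted parameters summarize the whole sample
        params = stats.johnsonsu.fit(y)
        q95_johnson = stats.johnsonsu.ppf(0.95, *params)

        # (iv) cubic spline through 25 empirical quantiles of the sample
        probs = np.linspace(0.01, 0.99, 25)
        qspline = interpolate.CubicSpline(probs, np.quantile(y, probs))
        q95_spline = float(qspline(0.95))

        print(f"95% quantile: empirical={np.quantile(y, 0.95):.3f}, "
              f"Johnson={q95_johnson:.3f}, spline={q95_spline:.3f}")
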
  • Kriging-Based Robust Optimization

    Authors: Celine Helbert (Ecole Centrale de Lyon), Melina Ribaud (Ecole Centrale de Lyon), Christophette Blanchet (Ecole Centrale de Lyon), Frederic Gillot (Ecole Centrale de Lyon)
    Primary area of focus / application: Quality
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: Robust optimization, Kriging, Expected improvement, Genetic algorithm, Robustness criterion
    Submitted at 30-Apr-2018 09:34 by Celine Helbert
    Accepted (view paper)
    3-Sep-2018 14:00 Kriging-Based Robust Optimization
    In this paper we present our benchmarking work on kriging-assisted robust optimization for limited-budget problems. In an industrial context, production fluctuations can often impact the optimal design. Our robustness criterion is therefore naturally defined as the variance of the quantity of interest in the neighbourhood of a design solution. We first introduce two kriging-based approximations of this robustness criterion and show that the second one, based on the observation of first and second derivatives, is much more efficient. We then propose to address the robust optimization issue by solving a multi-objective optimization problem on the quantity of interest and the robustness criterion. We detail ten relevant strategies and compare them for the same budget of 54 evaluated points, with the 2D Six-Hump Camel function as the objective. All strategies are based on the NSGA-II algorithm and 9 starting points; sets of 9 new points are added after each iteration of the optimization algorithm to increase the kriging accuracy of the objective function and of its robustness criterion. The results tend to show that a balance between computational effort and accuracy of the added points is achieved when the Kriging Believer or Constant Liar strategy is used for the function as well as for the robustness criterion. (A minimal sketch of the neighbourhood-variance criterion follows this abstract.)
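
    A minimal, purely illustrative sketch of the neighbourhood-variance robustness criterion: a plug-in Monte Carlo estimate over the kriging mean, standing in for the paper's derivative-based approximations (the neighbourhood radius and the design are invented):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def six_hump_camel(x):
            x1, x2 = x[..., 0], x[..., 1]
            return ((4 - 2.1 * x1**2 + x1**4 / 3) * x1**2
                    + x1 * x2 + (-4 + 4 * x2**2) * x2**2)

        rng = np.random.default_rng(4)
        X = rng.uniform([-2, -1], [2, 1], size=(54, 2))   # budget of 54 points
        gp = GaussianProcessRegressor(kernel=RBF([0.5, 0.5]), alpha=1e-8)
        gp.fit(X, six_hump_camel(X))

        def robustness(x, radius=0.1, m=200):
            # variance of the kriging mean over a uniform neighbourhood of x
            nb = x + rng.uniform(-radius, radius, size=(m, 2))
            return gp.predict(nb).var()

        x0 = np.array([0.0898, -0.7126])  # one of the known global optima
        print(f"f(x0)={six_hump_camel(x0):.4f}, local variance={robustness(x0):.4e}")
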
  • Experimental Designs in Industrial Marketing

    Authors: Paulin Choisy (XLstat), Efthalia Anagnostou (XLstat)
    Primary area of focus / application: Other: Software
    Keywords: Conjoint analysis, Experimental design, Software, Optimal design
    Submitted at 30-Apr-2018 14:59 by Paulin Choisy
    Accepted (view paper)
    3-Sep-2018 12:00 Experimental Designs in Industrial Marketing
    Conjoint analysis is a decision making tool bringing the power of consumer value in the product design and development process. Its principle is to present a set of products to the individuals who will rank, rate, or choose some of them. Ideally, individuals should test all possible products. But in practice it rapidly becomes impossible; the cognitive capacities of a person being limited, and the number of combinations increasing very rapidly with the number of attributes. We therefore use the methods of experimental design to obtain an acceptable number of profiles to be judged while maintaining good statistical properties. In this presentation, we will talk about the mechanism XLSTAT software uses to generate optimal experimental designs following after a brief introduction to conjoint analysis. An example using the software will be also demonstrated.