ENBIS-18 in Nancy

2–6 September 2018; École des Mines, Nancy (France)
Abstract submission: 20 December 2017 – 4 June 2018

My abstracts


The following abstracts have been accepted for this event:

  • Simultaneous Sparse Approximation for Variable Selection and Classification. Application to Near Infrared Spectra and Hyperspectral Images of Wood Wastes

    Authors: El-Hadi Djermoune (Université de Lorraine, CNRS), Thomas Aiguier (Université de Lorraine, CNRS), Cédric Carteret (Université de Lorraine, CNRS), David Brie (Université de Lorraine, CNRS)
    Primary area of focus / application: Other: Machine learning for automatic diagnosis and decision-making
    Keywords: Simultaneous sparse approximation, Variable selection, Classification, NIR spectroscopy, Hyperspectral image
    Submitted at 23-May-2018 22:46 by David Brie
    3-Sep-2018 14:20 Simultaneous Sparse Approximation for Variable Selection and Classification. Application to Near Infrared Spectra and Hyperspectral Images of Wood Wastes
    We propose a simultaneous variable selection method for material sorting based on near-infrared spectroscopy. The objective is to perform quick classification in industrial wood-recycling processes using only a few spectral bands. The spectra are first jointly modeled as linear combinations of explanatory variables drawn from a collection of wavelengths (a dictionary). The aim is to select a common subset of variables shared by several spectra. The selection is then addressed using a simultaneous sparse approximation method in which the coefficients related to different spectra are encouraged to be piecewise constant, i.e. the coefficients associated with successive spectra should have comparable magnitudes. We also propose a nonnegative version of the optimization problem, in which the magnitudes are additionally constrained to be positive. These problems are solved using the fast iterative shrinkage-thresholding algorithm (FISTA). The proposed approaches are illustrated on a dataset of 294 infrared spectrometry measurements of wood wastes, each containing 1647 wavelengths. We show that the selected variables lead to better classification performance than standard approaches. Finally, we show how these methods extend naturally to hyperspectral image classification for pushbroom imagers. Results obtained on real hyperspectral data of wood wastes confirm the effectiveness of the method.
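    A minimal numerical sketch of the joint-selection step described above, using a generic FISTA solver for an ℓ2,1 (row-sparsity) penalty with an optional nonnegativity projection. The function name, the penalty, and the projection heuristic are illustrative assumptions, not the authors' exact formulation; in particular, the piecewise-constant coupling of coefficients is omitted here.

```python
import numpy as np

def fista_l21(D, Y, lam, n_iter=200, nonneg=False):
    """FISTA sketch for  min_X 0.5*||Y - D X||_F^2 + lam * sum_i ||X[i, :]||_2.
    Each row of X corresponds to one dictionary atom (wavelength); the row-wise
    penalty selects a common subset of rows shared by all spectra (columns of Y)."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    X = np.zeros((D.shape[1], Y.shape[1]))
    Z, t = X.copy(), 1.0
    for _ in range(n_iter):
        G = Z - (D.T @ (D @ Z - Y)) / L        # gradient step on the data-fit term
        if nonneg:
            G = np.maximum(G, 0.0)             # simple projection heuristic, not an exact prox
        norms = np.linalg.norm(G, axis=1, keepdims=True)
        shrink = np.maximum(1.0 - (lam / L) / np.maximum(norms, 1e-12), 0.0)
        X_new = G * shrink                     # row-wise soft-thresholding (group prox)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0
        Z = X_new + ((t - 1.0) / t_new) * (X_new - X)   # Nesterov momentum step
        X, t = X_new, t_new
    return X
```

    Variables (wavelengths) would then be selected as the rows of the solution whose norm is nonzero.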
  • Order-of-Addition Experiments

    Authors: Joseph Voelkel (Rochester Institute of Technology)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Experimental design, Orthogonal array, D-efficiency, Minimum chi-square
    Submitted at 28-May-2018 14:45 by Joseph Voelkel
    Accepted
    5-Sep-2018 09:30 Order-of-Addition Experiments
    The order in which components are added to a chemical batch, film, or food product, or in a study of protein transport, may be a primary consideration in an experiment, especially in the earliest stages of studying a process.

    Little investigation has been done into the design of such order-of-addition (OofA) experiments until our recent work. Based on an idea from Van Nostrand (1995), we define and explain a reference standard for OofA experiments by extending the idea of orthogonal arrays. (We try to present this and other ideas as simply as possible.) For strength-2 designs, we find that OofA orthogonal arrays require the number of runs N to be divisible by 12 when the number m of components exceeds 3. (For m = 3, the full design requires only 6 runs.) We consider a chi-square criterion to measure the balance of an OofA array, and show that for strength-2 designs, OofA OAs appear to be equivalent to D-optimal designs. In many situations a number of non-isomorphic (distinct) designs exist; in such cases we use secondary measures to determine an optimal design. We have found OofA orthogonal arrays for m = 4 and 5 when N = 12, and for m = 5, 6, and (near-optimally) 7 when N = 24.

    We also extend these optimal OofA designs to incorporate standard process variables so that, for example, temperature or mixing speeds may be included. Our methods can also take into account natural restrictions that the experimenter may have, such as requiring that one component is always added before another. Finally, we show and analyze examples of some recent experiments in m = 4, 5, and 6 components.
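    As a toy illustration of the balance idea behind a chi-square criterion (the exact criterion used in the talk may differ), one can count, for every pair of components, how often one precedes the other across the runs; in a strength-2 OofA orthogonal array each pairwise ordering appears in exactly half of the N runs, giving a statistic of zero:

```python
from itertools import combinations, permutations

def chi2_balance(design):
    """Chi-square-style imbalance of an order-of-addition design: for every
    component pair (i, j), compare the number of runs in which i precedes j
    with the ideal count N/2. Zero indicates perfect pairwise-order balance."""
    N = len(design)
    stat = 0.0
    for i, j in combinations(range(len(design[0])), 2):
        n_before = sum(run.index(i) < run.index(j) for run in design)
        stat += (n_before - N / 2) ** 2 / (N / 2)
    return stat

full_m3 = list(permutations(range(3)))   # all 6 orderings of m = 3 components
print(chi2_balance(full_m3))             # 0.0: the full design is perfectly balanced
```

    The same count-based measure can screen candidate N-run subsets of the m! orderings when the full design is too large to run.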
  • Spatio-Temporal PCA for Image Monitoring in Additive Manufacturing

    Authors: Bianca Maria Colosimo (Politecnico di Milano), Marco Grasso (Politecnico di Milano)
    Primary area of focus / application: Process
    Secondary area of focus / application: Quality
    Keywords: SPC, Additive manufacturing, PCA, Spatio-temporal, Image monitoring
    Submitted at 28-May-2018 17:16 by Bianca Maria Colosimo
    5-Sep-2018 09:30 Spatio-Temporal PCA for Image Monitoring in Additive Manufacturing
    The contribution presents an approach that combines spatial and temporal information with Principal Component Analysis (PCA) and clustering to quickly detect out-of-control states from video images in metal 3D printing, also known as additive manufacturing. The main idea is to analyse the video images using a PCA in which a weighted variance-covariance matrix is considered, in order to include information on the spatial correlation of pixels in the image. An alarm rule based on clustering is further considered to detect hot-spots, i.e., locations where the cooling transient is out of control with respect to the usual pattern. A comparison with other existing methods and an application to a real case study are shown.
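    A rough sketch of a spatially weighted PCA in this spirit. The Gaussian-kernel weighting, the bandwidth, and the function name are assumptions for illustration, not the authors' exact construction, and the clustering-based alarm rule is not shown.

```python
import numpy as np

def spatial_weighted_pca(frames, coords, bandwidth=1.0, k=2):
    """PCA of vectorised video frames (one row per frame) using a spatially
    weighted variance-covariance matrix: covariances between pixels that are
    close in the image plane are up-weighted, encoding spatial correlation.
    `coords` holds the (row, col) position of every pixel."""
    X = frames - frames.mean(axis=0)                  # centre each pixel over time
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * bandwidth ** 2))          # Gaussian spatial kernel
    S = W * (X.T @ X) / (len(X) - 1)                  # elementwise-weighted covariance
    vals, vecs = np.linalg.eigh(S)                    # S stays symmetric PSD (Schur product)
    top = vecs[:, np.argsort(vals)[::-1][:k]]         # k leading eigenvectors
    return X @ top, top                               # scores over time, spatial loadings
```

    Monitoring would then track the score series over time, with hot-spot pixels flagged from the loadings by the clustering step.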
  • Mixing Natural Language Processing and Image Segmentation for Pre-Diagnosis in Medicine. Application to Breast Cancer

    Authors: Stéphane Jankowski (Quantmetry), Pablo Valverde (Quantmetry), Antoine Simoulin (Quantmetry), Nicolas Bousquet (Quantmetry), Sébastien Molière (Hôpitaux Universitaires de Strasbourg), Carole Mathelin (Hôpitaux Universitaires de Strasbourg)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Modelling
    Keywords: Deep learning, Medical imaging, Natural language processing, Breast cancer
    Submitted at 28-May-2018 22:40 by Nicolas Bousquet
    3-Sep-2018 14:40 Mixing Natural Language Processing and Image Segmentation for Pre-Diagnosis in Medicine. Application to Breast Cancer
    We consider a case where a physician has to radically reduce the time required for a pre-operational diagnosis, which can be established with the simultaneous help of an ontology based on electronic health records (EHR) and the localization of particular shapes in medical imaging. In the case of breast cancer, the search for tumors in three-dimensional magnetic resonance imaging (MRI) can be automated using deep learning algorithms. However, their use sometimes remains controversial, since it can lead to confusions that may have major impacts on surgery choices. We propose an optimization of machine learning techniques applied to MRI by reconciling their results with an automated construction of this ontology, based on natural language processing of EHR and the elicitation of a French dictionary of medical synonyms. The full methodology is presented in this talk, with details on the problems raised by name matching and on the optimization process itself.
  • Statistical Engineering: A Glimpse into the Future

    Authors: Geoff Vining (Virginia Tech Statistics Department)
    Primary area of focus / application: Other
    Keywords: Six Sigma, Scientific method, Solving complex problems, Future of statistics
    Submitted at 29-May-2018 15:43 by Geoff Vining
    Accepted
    3-Sep-2018 12:00 Statistical Engineering: A Glimpse into the Future
    There is a compelling need to create a new discipline, Statistical Engineering, to advance the theory and practice of solving large, complex, unstructured problems. The issue is not the tools required; the basic and even advanced statistical/analytical methodologies are well developed. People in diverse disciplines are beginning to clearly understand the need for interdisciplinary teams and the fundamental issues regarding team dynamics, as well as other aspects of organizational psychology. For example, Lean Six Sigma clearly demonstrates the importance of proper project management in the context of problem solving. The “missing link” is how to put the tools to the most effective use, especially when multiple tools are needed, based on what we have learned from previous solutions to other large, complex, unstructured problems.

    The proper model for this new discipline is chemical engineering and its systems approach to creating chemical processes from “unit operations,” such as distillation, chemical reactor design, and heat transfer. Chemical engineering uses a systems approach to put the proper unit operations together in novel ways to build new chemical processes, as well as to improve existing ones, efficiently and effectively. Obviously, chemical engineering does not replace chemistry; rather, it complements it, figuring out how best to utilize the science of chemistry to develop large-scale chemical processes.

    The statistical engineering analogs of unit operations are: data acquisition, data exploration, analysis/modeling, inference (back to the original problem), deployment of a tentative solution, and solution confirmation. The engineering challenge is how to put these tools together based on previous experience and the unique nature of the problem at hand. Statistical engineering must combine the tools taught in university statistics curricula with practical subject-matter knowledge and experience from previous successful solutions. The scientific method, properly understood, is an important key to success.

    The International Statistical Engineering Association (ISEA) is a new global professional society dedicated to advancing the theory and practice of statistical engineering. The ISEA membership model is based on ENBIS. We are always looking for other people who share our vision and passion for this new discipline.
  • Statistical Engineering from a Corporate Perspective

    Authors: William Brenneman (Procter & Gamble Company)
    Primary area of focus / application: Other: Statistical Engineering
    Keywords: Statistical engineering, Applied statistics, Industrial statistics, Process
    Submitted at 29-May-2018 15:58 by William Brenneman
    3-Sep-2018 11:30 Statistical Engineering from a Corporate Perspective
    The skills needed to solve large unstructured problems in industry are critical to the success of many projects faced by practicing statisticians. Often these skills are learned on the job, informally through observing experienced statisticians or semi-formally through mentoring and coaching. Statistical engineering provides a theory, a framework, and a process to codify the experience of practicing statisticians, and it provides a mechanism for improving that process over time. I will provide some real examples of statistical engineering from industry and challenge the academic community to enhance their statistics curricula to include statistical engineering methods, so that their graduates can have greater impact as practicing statisticians.