ENBIS-18 in Nancy

2 – 5 September 2018; École des Mines, Nancy (France)
Abstract submission: 20 December 2017 – 4 June 2018

My abstracts

 

The following abstracts have been accepted for this event:

  • Structured Low-Rank Matrix Completion for Forecasting in Time Series Analysis

    Authors: Konstantin Usevich (Université de Lorraine, CNRS, CRAN), Jonathan Gillard (Cardiff University)
    Primary area of focus / application: Mining
    Secondary area of focus / application: Modelling
    Keywords: Low-rank matrix completion, Forecasting, Nuclear norm, Hankel matrices
    Submitted at 4-Jun-2018 22:06 by Konstantin Usevich
    Accepted
    4-Sep-2018 12:20 Structured Low-Rank Matrix Completion for Forecasting in Time Series Analysis
    In this talk we consider the low-rank matrix completion problem with specific application to forecasting in time series analysis. Briefly, the low-rank matrix completion problem is the problem of imputing missing values of a matrix under a rank constraint. We consider a matrix completion problem for Hankel matrices and a convex relaxation based on the nuclear norm. Based on new theoretical results and a number of numerical and real examples, we investigate the cases when the proposed approach can work. Our results highlight the importance of choosing a proper weighting scheme for the known observations. This work was supported in part by ERC AdG-2013-320594 DECODA.
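The talk's convex nuclear-norm relaxation would normally be handled by a general-purpose convex solver. As a dependency-free sketch of the underlying structured problem (not the method of the talk), the following uses the closely related alternating-projections heuristic: truncate the Hankel matrix to low rank, restore Hankel structure by anti-diagonal averaging, and re-impose the known observations. All function names are illustrative.

```python
import numpy as np

def hankel(x, L):
    """L x (N-L+1) Hankel matrix with H[i, j] = x[i + j]."""
    N = len(x)
    return np.array([x[i:i + N - L + 1] for i in range(L)])

def antidiag_average(H):
    """Map a matrix back to a series by averaging its anti-diagonals."""
    L, K = H.shape
    x = np.zeros(L + K - 1)
    c = np.zeros(L + K - 1)
    for i in range(L):
        x[i:i + K] += H[i]   # H[i, j] contributes to entry x[i + j]
        c[i:i + K] += 1
    return x / c

def hankel_completion(x_obs, mask, L, rank, n_iter=1000):
    """Impute the entries with mask == False of a series whose Hankel
    matrix is (approximately) low rank, via alternating projections."""
    x = np.where(mask, x_obs, x_obs[mask].mean())    # crude initialisation
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(hankel(x, L), full_matrices=False)
        Hr = (U[:, :rank] * s[:rank]) @ Vt[:rank]    # rank-r truncation
        x = antidiag_average(Hr)                     # back to Hankel structure
        x[mask] = x_obs[mask]                        # keep known observations
    return x
```

Forecasting then amounts to marking the future values of the series as missing entries and completing the Hankel matrix; for a noise-free geometric series the rank-1 completion recovers the continuation.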
  • Industrial Quality Control Based on High Dimensional Data: From the Lab to the Workfloor

    Authors: Bart De Ketelaere (Catholic University of Leuven), Iwein Vranckx (TOMRA sorting NV), Nghia Nguyen (Catholic University of Leuven), Wouter Saeys (Catholic University of Leuven)
    Primary area of focus / application: Other: Multivariate statistics in industry - Alberto Ferrer
    Secondary area of focus / application: Quality
    Keywords: Industrial quality control, Multivariate, Principal component analysis, Partial least squares, Implementation
    Submitted at 4-Jun-2018 22:18 by Bart De Ketelaere
    Accepted
    3-Sep-2018 15:50 Industrial Quality Control Based on High Dimensional Data: From the Lab to the Workfloor
    Novel, fast and objective sensor technologies offer great potential for industry to assess the quality of each product produced and simultaneously monitor the underlying process. The generated data are often of a complex (multivariate) nature, requiring advanced methods to turn them into valuable information. Although many scientific reports demonstrate the added value of these sensors in combination with multivariate methods such as Principal Component Analysis, Partial Least Squares and derived techniques, the implementation rate is still low. In this talk we will touch upon several reasons for this low adoption rate, with a focus on quality control in the agrofood industry, where biological variability and spatio-temporal behavior dominate. Several case studies will be presented, along with remedies aimed at increasing the industrial adoption rate.
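As a minimal sketch of the kind of PCA-based process monitoring the abstract refers to, the snippet below fits a PCA model on in-control (here, simulated) data and flags new observations with Hotelling's T² statistic. The 9.21 limit is the 99% chi-square quantile with 2 degrees of freedom, a common approximation to the exact F-based control limit; the data and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))        # in-control training data (simulated)

mu = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
k = 2                                # number of retained principal components
P = Vt[:k].T                         # loadings (6 x k)
lam = s[:k] ** 2 / (len(X) - 1)      # variances of the retained scores

def hotelling_t2(x):
    """Hotelling T^2 statistic of one new observation in the PCA model."""
    t = (x - mu) @ P                 # project onto the retained components
    return float(np.sum(t ** 2 / lam))

T2_LIMIT = 9.21   # approx. chi-square 99% quantile, 2 df (control limit)
```

A new sample shifted far along the first loading direction exceeds the limit and raises an alarm, while an in-control-like sample does not.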
  • Be Confident, Predictable or Tolerable in Method Comparison Studies. Correlated-Error-in-Variables Regressions in XY and MD Plots.

    Authors: Bernard Francq (GSK, CMC StatS), Marion Berger (Sanofi, Département de Biostatistiques et Programmation)
    Primary area of focus / application: Metrology & measurement systems analysis
    Secondary area of focus / application: Modelling
    Keywords: Tolerance interval, Prediction interval, Confidence interval, Errors-in-variables regression, Agreement, Equivalence, Bland-Altman plot
    Submitted at 4-Jun-2018 23:15 by Bernard Francq
    Accepted (view paper)
    4-Sep-2018 14:10 Be Confident, Predictable or Tolerable in Method Comparison Studies. Correlated-Error-in-Variables Regressions in XY and MD Plots.
    The need for laboratories to quickly assess the quality of samples leads to the development of new analytical measurement methods. These methods should, ideally, lead to ‘similar’ or ‘equivalent’ results notwithstanding the measurement errors. This can be assessed in a classical XY plot or in an MD plot (M=(X+Y)/2, D=Y-X), the Bland-Altman plot.

    The well-known Agreement Interval (AI) in the MD plot will be compared to Tolerance Intervals (TIs). It will be argued that TIs are preferable: they are easier to explain and to interpret.

    The measurement uncertainties in an MD plot are correlated. If this correlation is ignored, the biases soar considerably and the coverage probabilities collapse drastically. The classical Bland-Altman approach, which ignores it, can therefore lead to substantial misunderstandings and misleading conclusions.

    In a recent paper, Francq and Govaerts reconcile the XY plot and the MD plot by providing Confidence Intervals (CIs) and Prediction Intervals (PIs) with, respectively, the BLS (Bivariate Least Squares) regression and the new and promising CBLS (Correlated Bivariate Least Squares) regression.

    In this talk, we generalize these statistical intervals (CIs and PIs) by introducing the concept of a ‘generalized’ interval (GI). This provides flexibility to the practitioners as the equivalence can be assessed on averages (CI), on individual measures (PI) or on averages of a given number of measures (GI). Simulations, animated graphs and real data will be used to illustrate these techniques.

    B.G. Francq, B.B. Govaerts. How to regress and predict in a Bland-Altman plot? Review and contribution based on tolerance intervals and correlated-errors-in-variables models. Statistics in Medicine, 35:2328-2358, 2016.
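To illustrate the tolerance-interval idea the talk advocates for the MD plot (without the correlated-errors CBLS machinery, which is not shown here), the snippet below computes a standard two-sided normal tolerance interval for the differences D = Y − X, using the common approximate factor based on chi-square and normal quantiles. Data and parameter defaults are illustrative.

```python
import numpy as np
from scipy import stats

def normal_tolerance_interval(d, p=0.95, conf=0.95):
    """Two-sided tolerance interval expected to cover a proportion p of
    the population of differences d with confidence conf, using the
    standard approximate tolerance factor."""
    n = len(d)
    m, sd = d.mean(), d.std(ddof=1)
    z = stats.norm.ppf((1 + p) / 2)          # normal quantile for coverage p
    chi2 = stats.chi2.ppf(1 - conf, n - 1)   # lower chi-square quantile
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
    return m - k * sd, m + k * sd
```

Unlike the Bland-Altman agreement interval mean ± 1.96·sd, the tolerance factor k exceeds 1.96 and shrinks towards it as the sample size grows, reflecting the estimation uncertainty of the mean and standard deviation.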
  • Confidence and Prediction Intervals in Linear Mixed Models, Trueness and Accuracy (Trueness + Precision) in Assay Qualification

    Authors: Bernard Francq (GSK, CMC StatS), Dan Lin (GSK, CMC StatS), Walter Hoyer (GSK, CMC StatS)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: Mixed models, Prediction interval, Trueness, Accuracy, Assay qualification
    Submitted at 4-Jun-2018 23:22 by Bernard Francq
    Accepted (view paper)
    3-Sep-2018 15:50 Confidence and Prediction Intervals in Linear Mixed Models, Trueness and Accuracy (Trueness + Precision) in Assay Qualification
    During development of a vaccine, different analytical methods for determining the antigen concentration or the (relative) potency in the produced vaccine batches need to be developed. In assay development, a linear mixed model across all samples is applied to estimate the variance components to evaluate the precision of the method (repeatability and intermediate precision). Trueness is the systematic deviation (bias) between the reference value and the measured concentration. Accuracy is the closeness between the reference value and an individual test value, which takes into account the systematic error (trueness) and random error (precision). Trueness is therefore assessed by using Confidence Intervals (CIs) at each measured concentration while accuracy is evaluated by using Prediction Intervals (PIs).

    The advantages and convergence of these intervals will be discussed in the framework of linear mixed models with different sample sizes. The literature on PIs in mixed models is scarce, as the methodology is often developed for a specific design (a single random factor) using explicit analytical formulae that are not appropriate for unbalanced or more complex designs.

    We propose a PI formula that is generalizable under a wide variety of designs with a variance component structure (random, nested, crossed, balanced or unbalanced designs). The methodology is based on the Hessian matrix which leads to a straightforward generalized solution. Performance of our methodology will be evaluated by means of simulations and application to a case study in assay qualification.
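The Hessian-based generalized PI proposed in the talk is not reproduced here; as a point of reference, the snippet below sketches the classical special case the abstract contrasts it with: a prediction interval for one future observation from a new group in a balanced one-way random-effects design, with variance components estimated by the ANOVA method of moments. The degrees-of-freedom choice (a − 1) is a simplifying assumption.

```python
import numpy as np
from scipy import stats

def one_way_random_pi(y, alpha=0.05):
    """Prediction interval for one future observation from a NEW group,
    for balanced one-way random-effects data y[group, rep]."""
    a, n = y.shape
    gm = y.mean()
    msb = n * ((y.mean(axis=1) - gm) ** 2).sum() / (a - 1)   # between MS
    msw = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (a * (n - 1))
    s2_between = max((msb - msw) / n, 0.0)   # ANOVA variance-component estimate
    # variance of a new observation (between + within) plus the
    # uncertainty of the estimated grand mean, Var(gm) = E[MSB]/(a*n)
    var_pred = s2_between + msw + msb / (a * n)
    t = stats.t.ppf(1 - alpha / 2, a - 1)    # df = a-1: simplifying assumption
    half = t * np.sqrt(var_pred)
    return gm - half, gm + half
```

For unbalanced or multi-factor designs this explicit formula breaks down, which is precisely the motivation for the generalized, Hessian-based approach described in the abstract.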
  • Challenges and Opportunities while Introducing Industry 4.0 within Aerospace Component Manufacturing

    Authors: Sören Knuts (GKN Aerospace Sweden)
    Primary area of focus / application: Process
    Secondary area of focus / application: Quality
    Keywords: Process, SPC, PPAP, Quality, Aerospace, Industrie 4.0
    Submitted at 4-Jun-2018 23:23 by Sören Knuts
    Accepted (view paper)
    5-Sep-2018 11:30 Challenges and Opportunities while Introducing Industry 4.0 within Aerospace Component Manufacturing
    Statistical Process Control (SPC) is an established and requested tool for showing that a manufacturing process is in statistical control. Statistical control demands that the Key Process Variables and Product Key Characteristics of the manufacturing process are measured in order to demonstrate fulfilment, and to give a warning signal whenever a characteristic falls outside the limits. This is a requirement whenever production of a new product is started under the Production Part Approval Process (PPAP).

    A recent study has shown that the process measurements performed today are too few, and that more data are needed to meet the requirements of Industry 4.0, whose implementation is ongoing. The study, which focuses on GKN Aerospace Sweden, identifies the challenges and shows how they can be addressed in both new and current production.

    The talk also addresses the opportunities that come with having more data, such as using machine learning algorithms to detect multivariate relations and to support automatic problem solving and guidance.
  • Equivalence Approach for Multi-Factor Robustness Evaluation with Application in Vaccines Development

    Authors: Bernard Francq (GSK, CMC StatS), Dan Lin (GSK, CMC StatS), Waldemar Miller (GSK, CMC StatS, Universität Magdeburg), Réjane Rousseau (GSK, CMC StatS), Sylvie Scolas (GSK, CMC StatS), Walter Hoyer (GSK, CMC StatS)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Quality
    Keywords: TOST (two one-sided test), Robustness, Flatness, Design of Experiments, Equivalence, Design space
    Submitted at 4-Jun-2018 23:31 by Waldemar Miller
    Accepted
    4-Sep-2018 10:50 Equivalence Approach for Multi-Factor Robustness Evaluation with Application in Vaccines Development
    Current state-of-the-art vaccine development is based on the “Quality-by-Design” paradigm, where risk-based and data-driven decisions are key. A prominent example is the classification of process parameters into “critical” and “non-critical” based on a series of Designs of Experiments (DoE) performed during vaccine development. This helps to understand the relationship between Critical Process Parameters (CPPs) and “Critical Quality Attributes” (CQAs) and then to establish the “Design space”. Design spaces are defined according to the ICH guidance Q8 as a subspace of process parameter combinations “that have been demonstrated to provide assurance of quality.”

    In that context, the robustness of a process is its ability to stay within the specification limits (target ± Δ) after a change in experimental conditions. In analogy to the classical equivalence test (see e.g. Schuirmann’s Two One-Sided Tests (TOST) procedure [1] or its extension by Wiens and Iglewicz [2]), a “DoE for flatness” extends the equivalence test to the multi-dimensional case (continuous or categorical factors, e.g. temperature or duration). We discuss adapting the significance level and tackling the multiplicity issue, as the entire experimental domain is compared to a given reference level by contrasts of mean responses between every point and the reference. The design space is then the subset of the multi-dimensional space where the predicted means are equivalent to the reference level, i.e. where the confidence intervals of the mean contrasts lie within ± Δ.

    Performance of our methodology will be evaluated by means of simulations and applications to case studies within CMC statistics and vaccines development.

    [1] Schuirmann D.J. A comparison of the two one-sided tests procedure and the power approach for assessing equivalence of average bioavailability. Journal of Pharmacokinetics and Biopharmaceutics, 15, 657–680, 1987
    [2] Wiens B.L., Iglewicz B. On testing equivalence of three populations. Journal of Biopharmaceutical Statistics, 9, 465–483, 1999
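The building block of the "DoE for flatness" above is Schuirmann's TOST [1]. As a minimal one-sample sketch (the multi-factor contrast version discussed in the abstract is not shown), the snippet below returns the TOST p-value for equivalence of a mean with a reference value mu0 within ± delta; equivalence is declared when the p-value falls below alpha, which is the same as the (1 − 2α) confidence interval lying within mu0 ± delta.

```python
import numpy as np
from scipy import stats

def tost_p(x, delta, mu0=0.0):
    """Schuirmann's TOST p-value for equivalence of mean(x) with mu0
    within +/- delta (one-sample case)."""
    n = len(x)
    m = x.mean()
    se = x.std(ddof=1) / np.sqrt(n)
    t_low = (m - (mu0 - delta)) / se    # tests H0: mean <= mu0 - delta
    t_high = (m - (mu0 + delta)) / se   # tests H0: mean >= mu0 + delta
    p_low = 1 - stats.t.cdf(t_low, n - 1)
    p_high = stats.t.cdf(t_high, n - 1)
    return max(p_low, p_high)           # both one-sided tests must reject
```

A sample centred near the reference with a generous margin yields a small p-value (equivalence), whereas a sample whose mean sits outside the margin does not.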