ENBIS-18 in Nancy

2 – 6 September 2018; École des Mines, Nancy (France)
Abstract submission: 20 December 2017 – 4 June 2018

My abstracts


The following abstracts have been accepted for this event:

  • Exploratory Study on a Statistical Method to Analyse Time Resolved Data Obtained during Nanomaterial Exposure Measurements

    Authors: Frédéric Clerc (INRS - assurance maladie), Olivier Witschger (INRS - assurance maladie)
    Primary area of focus / application: Metrology & measurement systems analysis
    Secondary area of focus / application: Modelling
    Keywords: Bayesian network, Probabilistic modelling, Nanomaterials, Exposure assessment
    Submitted at 4-Jun-2018 09:18 by Frédéric Clerc
    Accepted
    4-Sep-2018 12:20 Exploratory Study on a Statistical Method to Analyse Time Resolved Data Obtained during Nanomaterial Exposure Measurements
    Most of the measurement strategies suggested at the international level to assess workplace exposure to nanomaterials rely on devices that measure airborne particle concentrations in real time (according to different metrics). Since no aerosol-measuring instrument can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches to this statistical analysis have appeared in the literature, ranging from simple qualitative analysis of graphs to more complex statistical models. To date, there is no consensus on a particular approach, and the search for an appropriate and robust method is still ongoing. In this context, this exploratory study investigates a statistical method, based on a Bayesian probabilistic approach, for analysing time-resolved data.

    To investigate and illustrate the use of this method, we used particle number concentration data from a workplace study that investigated the potential for inhalation exposure during clean-out operations, by sandpapering, of a reactor producing nanocomposite thin films. In that study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis presented here focuses only on data obtained with two handheld condensation particle counters: one measured at the source of the released particles while the other measured in parallel in the far field.

    The Bayesian probabilistic approach allows probabilistic modelling of the data series, with the observed task modelled in the form of probability distributions. The probability distributions obtained from the time-resolved data at the source can then be compared with those obtained in the far field, leading to a quantitative estimate of the airborne particles released at the source while the task is performed. Beyond the results obtained, this exploratory study indicates that such an analysis requires specific experience in statistics.
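
    As a rough illustration of the kind of source/far-field comparison described above, the following minimal sketch (our own construction, not the authors' actual model) treats the log concentrations from each counter as normal with a noninformative prior and derives a posterior for the source contribution. The cpc_near and cpc_far arrays are hypothetical stand-ins for real CPC time series.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical CPC time series (particles/cm^3); replace with real data.
        cpc_near = rng.lognormal(mean=9.0, sigma=0.4, size=300)   # near-field (source)
        cpc_far  = rng.lognormal(mean=8.2, sigma=0.3, size=300)   # far-field (background)

        def posterior_mu_samples(x, n_draws=10_000, rng=rng):
            """Posterior draws of the mean log-concentration under a normal model
            with the Jeffreys prior: mu | data ~ t_{n-1}(xbar, s/sqrt(n))."""
            logx = np.log(x)
            n, xbar, s = len(logx), logx.mean(), logx.std(ddof=1)
            t = rng.standard_t(df=n - 1, size=n_draws)
            return xbar + t * s / np.sqrt(n)

        mu_near = posterior_mu_samples(cpc_near)
        mu_far  = posterior_mu_samples(cpc_far)

        # Posterior of the source contribution on the original scale
        released = np.exp(mu_near) - np.exp(mu_far)
        lo, hi = np.percentile(released, [2.5, 97.5])
        print(f"released: median {np.median(released):.0f} particles/cm^3, "
              f"95% CrI [{lo:.0f}, {hi:.0f}]")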
  • Robustness of Agreement in Ordinal Classifications

    Authors: Amalia Vanacore (University of Naples Federico II), Maria Sole Pellegrino (University of Naples Federico II)
    Primary area of focus / application: Metrology & measurement systems analysis
    Secondary area of focus / application: Quality
    Keywords: Agreement, Robustness, Prevalence, Bias
    Submitted at 4-Jun-2018 10:34 by Amalia Vanacore
    Accepted
    4-Sep-2018 14:50 Robustness of Agreement in Ordinal Classifications
    The quality of subjective evaluations provided by field experts (e.g. physicians or risk assessors) as well as by trained operators (e.g. visual inspectors) is often assessed in terms of inter-/intra-rater agreement via kappa-type coefficients. Despite their popularity, these indices have long been criticized for being affected in complex ways by the presence of bias between raters and by the distribution of data across the rating categories (“prevalence”).

    This paper presents the results of a Monte Carlo simulation study investigating the robustness of four kappa-type indices (viz. Gwet’s AC2 and the weighted variants of Scott’s Pi, Cohen’s Kappa and the Brennan-Prediger coefficient), considering two series of ratings provided either by the same rater (intra-rater agreement) or by two raters (inter-rater agreement). The robustness of the reviewed indices to changes in the frequency distribution of ratings across categories and in the agreement distribution between the two series has been analyzed across several simulation scenarios built by varying the sample size (i.e. the number of rated items), the dimension of the rating scale, and the frequency and agreement distributions between the series of ratings.

    Simulation results suggest that the level of agreement is sensitive to the distribution of items across the rating categories and to the dimension of the rating scale, but not to the sample size. Among the reviewed indices, the Brennan–Prediger coefficient and Gwet’s AC2 are less sensitive than the others to variation in the distribution of items across the categories for a fixed agreement distribution.
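
    For readers unfamiliar with these indices, here is a minimal sketch (under our own simplifying assumptions: a q-point ordinal scale with linear agreement weights) of how two of the coefficients compared in the study, the weighted Cohen's kappa and the Brennan-Prediger coefficient, can be computed. The rating series below are purely illustrative.

        import numpy as np

        def linear_weights(q):
            """Agreement weights w_ij = 1 - |i-j|/(q-1) for a q-point ordinal scale."""
            i, j = np.indices((q, q))
            return 1.0 - np.abs(i - j) / (q - 1)

        def weighted_kappa_bp(r1, r2, q):
            """Weighted Cohen's kappa and Brennan-Prediger coefficient
            for two rating series r1, r2 with categories 0..q-1."""
            w = linear_weights(q)
            edges = np.arange(q + 1)
            p = np.histogram2d(r1, r2, bins=(edges, edges))[0] / len(r1)
            pa    = (w * p).sum()                               # weighted observed agreement
            pe_k  = (w * np.outer(p.sum(1), p.sum(0))).sum()    # Cohen's chance agreement
            pe_bp = w.sum() / q**2                              # Brennan-Prediger chance term
            return (pa - pe_k) / (1 - pe_k), (pa - pe_bp) / (1 - pe_bp)

        # Toy scenario with skewed prevalence on a 5-point scale
        rng = np.random.default_rng(1)
        r1 = rng.choice(5, size=200, p=[0.6, 0.2, 0.1, 0.05, 0.05])
        r2 = np.clip(r1 + rng.integers(-1, 2, size=200), 0, 4)  # rater 2 off by at most 1
        print(weighted_kappa_bp(r1, r2, q=5))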
  • Determining Baseline Profile by Diffusion Maps

    Authors: Francisco Moura Neto (Rio de Janeiro State University), Pedro Souza (Instituto de Matemática Pura e Aplicada), Maysa S. de Magalhães (Instituto Brasileiro de Geografia e Estatística)
    Primary area of focus / application: Process
    Secondary area of focus / application: Quality
    Keywords: Quality control, Statistical Process Control, Diffusion maps, Phase I, Nonlinear profile
    Submitted at 4-Jun-2018 15:54 by Francisco Moura Neto
    Accepted
    A major step in statistical process control is to establish the standard pattern of a product. Some quality characteristics are best represented by a profile, a functional relationship between a response variable and one or more explanatory variables, which usually entails handling high-dimensional data. Here, we propose a method for Phase I analysis based on diffusion maps to investigate historical profile data sets and establish a standard profile. Diffusion maps are powerful data-driven techniques for representing data and reducing dimensionality in many different problems while preserving the local geometric structure of the data set, hence allowing it to be analysed more thoroughly.

    The proposed method for Phase I analysis of nonlinear profiles does not require an explicit functional relation between dependent and independent variables, which allows it to be applied generally to profiles that come from discrete space (or time) measurements. For validation purposes, and also to gain insight into the properties of the method, we present a new explicit analytic expression for the diffusion maps of an idealized noiseless data set. Furthermore, we apply the method to artificially generated data sets based on a nonlinear profile, and to real data from a wood board production process, highlighting its capabilities in analysing sets of profiles for baseline profile estimation. Our results are compared with those in the literature, showing that the proposed method performs well and, moreover, gives a better understanding of the similarities within a real profile data set, thus enhancing its geometric view.
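
    As background, here is a minimal sketch of the standard diffusion-map construction the abstract builds on (not the authors' code): each profile is a row of sampled values, pairwise similarities feed a Markov matrix, and its leading non-trivial eigenvectors give low-dimensional coordinates. The shifted-sigmoid test profiles are hypothetical.

        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from scipy.linalg import eigh

        def diffusion_map(profiles, eps, n_coords=2, t=1):
            """Diffusion-map embedding; profiles is an (n_profiles, n_points) array."""
            d2 = squareform(pdist(profiles, 'sqeuclidean'))
            k = np.exp(-d2 / eps)                    # Gaussian affinity kernel
            d = k.sum(axis=1)
            a = k / np.sqrt(np.outer(d, d))          # symmetrised Markov matrix
            vals, vecs = eigh(a)
            order = np.argsort(vals)[::-1]
            vals, vecs = vals[order], vecs[:, order]
            psi = vecs / vecs[:, [0]]                # right eigenvectors of D^{-1}K
            return (vals[1:n_coords + 1] ** t) * psi[:, 1:n_coords + 1]

        # Toy nonlinear profiles: shifted sigmoids plus noise
        rng = np.random.default_rng(2)
        x = np.linspace(0, 1, 50)
        shifts = rng.uniform(0.3, 0.7, size=40)
        profiles = (1 / (1 + np.exp(-20 * (x - shifts[:, None])))
                    + 0.05 * rng.standard_normal((40, 50)))
        # Bandwidth eps of the order of the median squared distance works well here
        coords = diffusion_map(profiles, eps=5.0)    # 2-D embedding; outliers stand apart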
  • Use of Mechanistic Models for in Silico Trials

    Authors: Mélanie Prague (Univ Bordeaux, Inria/Inserm, Vaccine Research institute), Chloé Pasin (Univ Bordeaux Inria/Inserm), Rodolphe Thiébaut (Univ Bordeaux, Inria/Inserm, Vaccine Research institute)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: Modeling, Mechanistic and dynamical models, Computer-based simulations, In silico trials, Non-linear mixed-effects
    Submitted at 4-Jun-2018 16:15 by Mélanie Prague
    Accepted
    3-Sep-2018 14:40 Use of Mechanistic Models for in Silico Trials
    An in silico clinical trial is an individualized computer simulation used in the development of a medicinal product, device, or intervention. The expected benefits are to provide mechanistic understanding, to optimize delivery strategies and to improve clinical trial designs. We propose to present how in silico trials could be used to investigate simplified antiretroviral therapies (HAART), such as therapies based on two antiretroviral treatments (ANRS 163 ETRAL, ANRS 167 LAMIDOL), dose reduction (ANRS 165 DARULIGHT) or short-cycle treatment interruption (ANRS 162 4D, BREATHER Study, ANRS 170 QUATUOR). We model the dynamics of viral load and CD4 counts with a dynamical model based on ordinary differential equations (ODEs), with non-linear mixed effects (NLME) on the parameters. We account for the pharmacokinetic and pharmacodynamic properties of HAART by using in vitro indicators and in vivo properties pre-established in phase I trials. The model for in silico trials was defined, and its parameters estimated, using data from previous clinical trials and an observational cohort (the Aquitaine cohort). By simulating a phase III in silico trial, we investigate the expected results of ongoing studies. We discuss how to simulate an in silico population that is heterogeneous enough to build reliable confidence bounds. Altogether, we show that such a pipeline for in silico trials is a promising tool for investigating novel treatment strategies and opens the perspective of personalized medicine for HIV.
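
    To make the ODE-plus-random-effects idea concrete, here is a minimal sketch using a generic target-cell-limited HIV model with illustrative parameter values (not the authors' estimated model): each simulated subject draws log-normal random effects on two parameters, and the trajectory is integrated under an assumed drug efficacy.

        import numpy as np
        from scipy.integrate import solve_ivp

        def hiv_model(t, y, lam, d, beta, delta, p, c, eps):
            """Basic target-cell-limited HIV model under treatment efficacy eps."""
            T, I, V = y
            return [lam - d * T - (1 - eps) * beta * T * V,
                    (1 - eps) * beta * T * V - delta * I,
                    p * I - c * V]

        # Illustrative population parameters (not the authors' estimates)
        theta = dict(lam=1e4, d=0.01, beta=8e-7, delta=0.7, p=100.0, c=13.0)
        rng = np.random.default_rng(3)

        # Simulate an in silico arm: log-normal random effects on delta and c
        for subj in range(5):
            pars = dict(theta)
            pars['delta'] *= np.exp(0.3 * rng.standard_normal())
            pars['c']     *= np.exp(0.3 * rng.standard_normal())
            sol = solve_ivp(hiv_model, (0, 60), y0=[1e6, 1e4, 1e5],
                            args=tuple(pars.values()) + (0.9,),  # 90% drug efficacy
                            t_eval=np.linspace(0, 60, 121))
            # log10 viral load at day 60 (floored to avoid log of tiny negatives)
            print(subj, np.log10(max(sol.y[2, -1], 1e-6)))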
  • Statistical Change Detection: Application to Global Navigation Satellite Systems

    Authors: Daniel Egea-Roca (Universitat Autònoma de Barcelona (UAB)), Gonzalo Seco-Granados (Universitat Autònoma de Barcelona (UAB)), José A. López-Salcedo (Universitat Autònoma de Barcelona (UAB))
    Primary area of focus / application: Other: Sequential Analysis
    Secondary area of focus / application: Other: Sequential Analysis
    Keywords: Statistical change detection, Quickest change detection, Transient change detection, Sequential detection, Stopping time, Signal quality monitoring, GNSS
    Submitted at 4-Jun-2018 17:36 by Daniel Egea-Roca
    Accepted
    4-Sep-2018 10:30 Statistical Change Detection: Application to Global Navigation Satellite Systems
    The problem of detecting sudden statistical changes has many important applications, such as quality control, signal segmentation, on-line monitoring of safety-critical infrastructures, or the detection of signals in radar and sonar. This kind of detection falls within the field of so-called statistical change detection (SCD), which includes both quickest change detection (QCD) and transient change detection (TCD).

    QCD is of particular interest for applications in which the statistical change remains forever once it appears. This is the case, for instance, when analysing the quality of the output of an industrial process: when the process is misbehaving, the quality of the product will be degraded as long as the problem is not solved. QCD has been extensively studied and applied to a wide range of applications since the 1950s. The goal of QCD is to detect the statistical change as soon as possible; the optimality criterion is to minimize the detection delay (a minimal sketch of the classical CUSUM rule is given after this abstract).

    In contrast, TCD deals with the detection of changes of finite duration, so the goal in this case is to maximize the probability of detecting the change within a given period of time. For instance, in safety-critical applications we would like to detect any issue with the system before the end-user gets hurt. This may be the case in autonomous navigation when the navigation system fails: if we do not detect the failure within a short period of time, the safety of the end-user may be in danger. Unfortunately, the literature on the fundamentals of TCD is scarce because the field is still under development. Nevertheless, in recent years some contributions have appeared shedding light on the optimal solution to the TCD problem. This has raised interest in the TCD framework in many fields, for instance navigation, drinking-water quality monitoring, or the monitoring of cyber attacks.

    Another sector that can benefit from advances in SCD is that of global navigation satellite systems (GNSSs). GNSSs are becoming an essential tool for many critical sectors of our modern society, such as transportation, communications, and timing for power-grid control or bank transactions. Indeed, as reported in the 2015 GSA market report, by 2020 GNSS will provide revenues of nearly 50 billion euros worldwide. As a matter of fact, the disruption or misbehaviour of GNSS services can have a high economic impact and cause a major setback worldwide. For this reason, it is of paramount importance to promptly detect any anomaly or misleading behaviour that could be endangering the received GNSS signal.

    Based on the above considerations, the goal of our presentation is to give the audience an overview of detection theory, with special emphasis on SCD. To illustrate the practical application of the theoretical concepts of SCD, we show its application to the GNSS field. The idea is to show the audience how to apply SCD to a particular detection problem so that they can get an idea of how to apply it to other problems.
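
    As referenced above, here is a minimal sketch of the classical CUSUM quickest-detection rule for a shift in the mean of Gaussian samples (a textbook illustration, not the authors' GNSS monitor); the simulated carrier-to-noise-style metric is hypothetical. A window-limited variant of the same statistic is a common starting point for TCD.

        import numpy as np

        def cusum(x, mu0, mu1, sigma, h):
            """CUSUM stopping rule for a mean change mu0 -> mu1 in Gaussian samples.
            Returns the alarm index (detection time) or None."""
            g = 0.0
            for k, xk in enumerate(x):
                # log-likelihood ratio of N(mu1, sigma) vs N(mu0, sigma)
                llr = (mu1 - mu0) / sigma**2 * (xk - (mu0 + mu1) / 2)
                g = max(0.0, g + llr)
                if g > h:
                    return k
            return None

        # Toy example: a carrier-to-noise-density-style metric drops at sample 500
        rng = np.random.default_rng(4)
        x = np.concatenate([rng.normal(45, 1, 500), rng.normal(43, 1, 200)])
        print(cusum(x, mu0=45, mu1=43, sigma=1, h=10))   # alarm shortly after 500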
  • Using the Agricultural Land Quality Matrix to Survey and Improve the Agricultural Land in Romania

    Authors: Belloiu Radu-Florian (Geoagri Cadastru), Iorga Danut (Geoagri Cadastru), Scarlat Cezar (University “Politehnica” of Bucharest)
    Primary area of focus / application: Six Sigma
    Keywords: Precision agriculture, Six Sigma, Linear regression, Logistic regression, NDVI, NDWI, Quality matrix
    Submitted at 4-Jun-2018 17:37 by Radu-Florian Belloiu
    Accepted
    Purpose – Precision agriculture is gaining more attention from various national and private organizations. One important tool for creating a predictable agricultural business is satellite crop surveying during the year: processing and interpreting satellite data can help farmers take remedial actions or plan strategies to improve their agricultural land. Another aspect is the multi-year evolution of crops in a specific area compared with neighbouring areas, which provides relevant information on the evolution of land quality due to anthropic interventions. This paper focuses on finding ways to develop an Agricultural Land Quality Matrix that could help farmers and state organizations control and plan strategies to maintain or improve land quality and spend the available resources efficiently.

    Design/Methodology – The current paper analyses a direct satellite survey of a region of Romania (Brăila County) using multispectral images. The images are interpreted by calculating the NDVI (normalized difference vegetation index) and NDWI (normalized difference water index); a minimal sketch of these index calculations is given after this abstract. After mapping the indices with QGIS (an open-source geographic information system), the authors adapted the Six Sigma methodology to develop an Agricultural Land Quality Matrix. In the Analyse step of the DMAIC method, the main tools were linear regression and logistic regression.

    Findings – An Agricultural Land Quality Matrix was defined to track the correlation between factors and results in order to improve and monitor the evolution of the agricultural land. Also, based on this research, the authors considered the opportunity to design an online IT platform to facilitate communication among stakeholders.

    Research implications – Continuous improvement methods and a structured approach to precision agriculture through the Agricultural Land Quality Matrix are a first in Romania. Moreover, the results could pave the way for a national approach that allocates resources more efficiently, by governmental entities as well as by farmers.
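
    As referenced above, here is a minimal sketch of the two index calculations (standard formulas; the reflectance rasters are hypothetical stand-ins for real multispectral bands). Note that NDWI here follows McFeeters' green/NIR definition; Gao's variant uses NIR and SWIR instead.

        import numpy as np

        def ndvi(nir, red):
            """Normalized difference vegetation index: (NIR - Red)/(NIR + Red)."""
            return (nir - red) / (nir + red + 1e-10)

        def ndwi(green, nir):
            """McFeeters' normalized difference water index: (Green - NIR)/(Green + NIR)."""
            return (green - nir) / (green + nir + 1e-10)

        # Hypothetical reflectance rasters (e.g. multispectral bands resampled to a grid)
        rng = np.random.default_rng(5)
        red, green, nir = (rng.uniform(0.02, 0.5, (100, 100)) for _ in range(3))
        v, w = ndvi(nir, red), ndwi(green, nir)
        print(f"mean NDVI {v.mean():.2f}, share of NDVI > 0.4: {(v > 0.4).mean():.1%}")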