ENBIS-15 in Prague

6 – 10 September 2015; Prague, Czech Republic
Abstract submission: 1 February – 3 July 2015

The following abstracts have been accepted for this event:

  • Stochastic Modelling and Prediction of Fatigue Crack Propagation Using Piecewise-Deterministic Markov Processes

    Authors: Anne Gégout-Petit (Université de Lorraine), Marie Touzet (Université de Bordeaux), Anis Ben Abdessalem (Université de Bordeaux), Monique Puiggali (Université de Bordeaux), Romain Azais (INRIA BIGS Nancy Grand Est)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Modelling
    Keywords: Fatigue crack propagation, Piecewise-deterministic Markov processes, Regime-switching models, Prediction
    Submitted at 12-Jun-2015 13:05 by Anne Gégout-Petit
    8-Sep-2015 12:40 Stochastic Modelling and Prediction of Fatigue Crack Propagation Using Piecewise-Deterministic Markov Processes
Fatigue crack propagation is a stochastic phenomenon due to the inherent uncertainties originating from material properties, environmental conditions and loads. Stochastic processes therefore offer an appropriate framework for modelling and predicting crack propagation. In this work, we propose to model and predict fatigue crack growth with piecewise-deterministic Markov processes (PDMPs) associated with deterministic crack laws. First, we propose a regime-switching model with one jump to express the transition between the Paris regime and the rapid crack propagation that occurs before failure. Parameters are adjusted from real data available in the literature; for this, we capture the change of regime of each observed failure through an optimisation algorithm. The second purpose of this investigation is to predict the fatigue crack path and its variability, based on the global model and on information retrieved at the beginning of crack propagation. We show that our method, based on PDMPs combined with an updating procedure, provides reliable predictions and can be an efficient tool for the safety of structures in a wide variety of engineering applications. In addition, the proposed strategy requires only a small amount of information to be effective and is cheap in terms of computing time.
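The abstract does not give its model equations; purely as a hedged illustration, the regime-switching idea can be sketched as Paris-law crack growth whose coefficients jump to a faster regime at a random time (all parameter values below are invented for the example, not taken from the paper):

```python
import math
import random

def simulate_crack(a0=1e-3, d_sigma=100.0, n_cycles=200_000, dn=100, seed=0):
    """Toy PDMP-style crack path: Paris-law growth da/dN = C * (dK)^m,
    with a single random jump to a faster regime (illustrative values)."""
    rng = random.Random(seed)
    slow = (1e-11, 3.0)                   # (C, m) in the Paris regime
    fast = (5e-11, 3.5)                   # (C, m) after the jump (rapid propagation)
    n_jump = rng.expovariate(1.0 / 1e5)   # random regime-switching cycle
    a, path = a0, []
    for n in range(0, n_cycles, dn):
        C, m = slow if n < n_jump else fast
        d_k = d_sigma * math.sqrt(math.pi * a)   # stress intensity factor range
        a += C * d_k ** m * dn                   # deterministic flow between jumps
        path.append((n + dn, a))
    return path

path = simulate_crack()
```

Fitting the jump time and regime parameters to observed crack paths, as the abstract describes, would replace the hard-coded values above with the output of an optimisation step.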
  • Tuning Spatial Interpolation Under Non-Space-Filling Designs for Unmanned Surface Vehicle Applications

    Authors: You Li (Laboratoire d'Informatique, Signaux et Systèmes de Sophia-Antipolis (I3S)), Maria João Rendas (Laboratoire d'Informatique, Signaux et Systèmes de Sophia-Antipolis (I3S))
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Spatial interpolation, Cross validation, Space-filling design, Line survey, Unmanned surface vehicle
    Submitted at 12-Jun-2015 15:07 by You LI
    7-Sep-2015 17:40 Tuning Spatial Interpolation Under Non-Space-Filling Designs for Unmanned Surface Vehicle Applications
Spatial interpolation is the procedure of estimating the value of properties of a
field at unsampled sites using the values measured at a set of positions. All
interpolation algorithms require the definition of a set of user-defined parameters,
whose values should reflect the available knowledge about the spatial coherency of
the field. In the absence of prior knowledge, Cross-Validation (CV) is often used to
select the "best" parameters. Leave-One-Out Cross-Validation (LOOCV), the most
common form of CV, removes one data point at a time and estimates its value using
the remaining samples. The "best" interpolation parameters are then chosen as those
that minimise the residuals between the actual and estimated values, assuming that
the CV residuals are a representative sample of the residuals when interpolating
over the entire area of interest.

Many factors, including sample size, design geometry and data properties, affect the
statistical distribution of the LOOCV residuals. When the sampling design is
space-filling-like, the distribution of the CV residuals is close to the actual
distribution of the interpolation errors and CV leads to good interpolator tuning.
However, this property is violated for linear designs, which are frequent when a
spatial field is observed using a sensor carried on a mobile platform, such as an
oceanographic vessel or an airplane. In this case the sampled points are closely
spaced along the one-dimensional vehicle trajectory, but most often with a much
lower spatial sampling rate in the directions orthogonal to it. This induces a
strong bias in the statistical distribution of the CV residuals, since each point
always has at least one nearby sample.

We propose a randomised CV method that corrects the bias of the residuals
distribution, enabling a better estimation of the expected interpolation error, even
for strongly non-space-filling designs like line transects. Results of numerical
tests on simulated data with a diverse set of interpolators (inverse distance
weighting, locally weighted polynomial regression) show that it leads to algorithm
tuning offering better field reconstruction.

[1] L. Pronzato and W. G. Müller, "Design of computer experiments: space filling
and beyond", Statistics and Computing, 2012.
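As a hedged sketch of the baseline procedure the abstract criticises (plain LOOCV used to tune an interpolator; the authors' randomised correction is not reproduced here), the inverse-distance-weighting exponent can be selected by leave-one-out error on a toy line-transect design:

```python
import math

def idw(x, y, pts, vals, p):
    """Inverse distance weighting estimate at (x, y) with exponent p."""
    num = den = 0.0
    for (xi, yi), v in zip(pts, vals):
        d = math.hypot(x - xi, y - yi)
        if d < 1e-12:
            return v            # exact hit: return the sample itself
        w = d ** (-p)
        num += w * v
        den += w
    return num / den

def loocv_mse(pts, vals, p):
    """Mean squared LOOCV residual for IDW with exponent p."""
    total = 0.0
    for i in range(len(pts)):
        rest_p = pts[:i] + pts[i + 1:]
        rest_v = vals[:i] + vals[i + 1:]
        e = idw(*pts[i], rest_p, rest_v, p) - vals[i]
        total += e * e
    return total / len(pts)

# Two parallel transects: dense along-track, sparse across-track sampling.
pts = [(0.1 * i, y) for y in (0.0, 1.0) for i in range(20)]
vals = [math.sin(x) + y for x, y in pts]
best_p = min((1.0, 2.0, 3.0, 4.0), key=lambda p: loocv_mse(pts, vals, p))
```

Because every held-out point keeps a close along-track neighbour, the LOOCV residuals here underestimate the error of interpolating between the transects, which is precisely the bias the randomised method is designed to correct.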
  • Monetising Data in the Energy and Utility Sectors

    Authors: Warren Yabsley (Enzen Global and Newcastle University), Shirley Coleman (Industrial Statistics Research Unit (ISRU), Newcastle University)
    Primary area of focus / application: Mining
    Secondary area of focus / application: Business
    Keywords: Gas demand, Weather data, Impact, Value, Analytics
    Submitted at 12-Jun-2015 18:08 by Warren Yabsley
Accepted
    8-Sep-2015 10:10 Monetising Data in the Energy and Utility Sectors
Data analytics has repeatedly been highlighted as playing a pivotal role in optimising operations and business strategy. Organisations that embrace this field are shown to gain advantages over those that have not adopted such strategies. Global expenditure on data analytics by the energy and utility sectors is forecast to expand from $700 million in 2012 to $3.8 billion in 2020. (1)

Enzen Global and ISRU at Newcastle University are engaged in a two-year Knowledge Transfer Partnership programme that is part-financed by Innovate UK government funding. In the UK, Enzen focuses on the gas, electricity and water sectors and works closely with many companies in the industry to enhance their practices. This ensures various metrics are satisfied: sufficient and consistent supply to customers, upholding of industry regulations and targets, and furthering of environmental friendliness. All of these must be fulfilled whilst enabling scope for increased profitability of the business. Enzen believes that proactively encouraging the use and implementation of data analysis is a vital extension to its current portfolio of value propositions and business outcome delivery.

An enhanced toolbox of data analysis and statistical techniques, such as regression, forecasting and decision trees, has been employed to monetise data in a number of areas. For the gas sector this has included assessment of pipeline pressure change with time/demand, of daily demand change with weather and/or between different regional distribution networks, and of the effect of variation between local and regional weather on demand predictions. The analysis has been translated into business impact by interpreting the results in monetary terms. For example, it was shown that the current weather measurement for a region is inadequate and unrepresentative of local areas, potentially resulting in an over £100,000 per day discrepancy between forecast and actual demand for the region. The importance of the work lies in the increased value of assets and the monetary savings that the data analysis can reveal, and this emphasis will be demonstrated in the presentation.

    1. http://eandt.theiet.org/magazine/2014/01/data-on-demand.cfm

  • Monitoring of a Quality of Technical Fabrics

    Authors: Lenka Techniková (Technical University of Liberec), Maroš Tunák (Technical University of Liberec)
    Primary area of focus / application: Quality
    Secondary area of focus / application: Process
    Keywords: Fabrics, Sieves, Quality control, Image analysis, Statistical method
    Submitted at 12-Jun-2015 20:26 by Lenka Techniková
Accepted
    7-Sep-2015 17:20 Monitoring of a Quality of Technical Fabrics
This contribution deals with an objective procedure for monitoring the quality of technical fabrics and detecting potential defects. These technical fabrics are used as special sieves for filtering, e.g., friable materials, fluids or air. The sieves are woven in a plain weave with various fabric densities and made from polyester or polyamide monofilaments. In the present case, the potential defects are inhomogeneous pore sizes, which influence the efficiency of the filtration. Generally, the pore size of a fabric is closely connected to the density of the fabric, the area of the pores, and the length of the sides of the pores. Using a subjective method for measuring the pore size and the density of a fabric is often impossible, because the pores can be around a tenth of a millimetre in size. In this work, an objective method is proposed, based on image analysis techniques and statistical methods for quality control. The surface of the technical fabric is captured by a camera, and the pores of the fabric are recognised and analysed using basic image analysis tools (global thresholding, etc.). Data such as the pore sizes are monitored and statistically evaluated by control charts, boxplots, confidence intervals, etc. With these statistical tools we can determine whether the pore sizes meet the standard requirements.
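The abstract does not specify which control chart is used; as a hedged sketch, one common choice for individual pore-size measurements is an individuals (X) chart with limits estimated from the average moving range (the pore data below are invented for illustration):

```python
def individuals_chart_limits(x):
    """Centre line and 3-sigma limits for an individuals (X) control chart,
    with sigma estimated from the average moving range (d2 = 1.128 for n = 2)."""
    mr = [abs(b - a) for a, b in zip(x, x[1:])]
    sigma_hat = (sum(mr) / len(mr)) / 1.128
    centre = sum(x) / len(x)
    return centre - 3 * sigma_hat, centre, centre + 3 * sigma_hat

# Illustrative pore areas (in units of 0.01 mm^2, invented data):
pores = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 9.7, 10.4, 10.0, 9.9]
lcl, centre, ucl = individuals_chart_limits(pores)
out_of_control = [v for v in pores if v < lcl or v > ucl]
```

Points falling outside `[lcl, ucl]` would flag pores whose size departs from the in-control process, complementing the boxplots and confidence intervals mentioned in the abstract.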
  • Picking the Right Statistical Tools, a Case Study Using Office Lighting

    Authors: Jan-Willem Bikker (CQM)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Business
    Keywords: Office lighting, Multiplicity, Missing values, Mixed-effects ordered logistic regression, Mixed-effects model, Non-parametric tests, Dose-response curve
    Submitted at 13-Jun-2015 08:42 by Jan-Willem Bikker
    7-Sep-2015 17:20 Picking the Right Statistical Tools, a Case Study Using Office Lighting
    The Dutch consultancy firm CQM supports many R&D projects with modeling and statistics. This presentation is about statistical support for a Philips Research study that investigated how lighting conditions affect participants with respect to their alertness, mood, vitality, lighting appraisal, visual acuity and performance on a cognitive test battery. Forty-eight subjects participated in the experiment.

The participants came from two age groups (around 30 and around 60 years old), and each had four sessions in a simulated office; the four different lighting conditions were varied over the four sessions within each participant (a within-subject design).

    Analysis and modeling faced a number of choices and trade-offs. The set of responses was quite large and each of them could be summarized in multiple ways. The general question “does lighting affect participants with respect to response Y” was translated into a number of hypothesis tests.

    Some aspects of handling this large set of model approaches are discussed.

As an example: the lighting condition can be modeled as a factor with four levels, or as a continuous predictor using polynomials. The latter sometimes gives sharper conclusions on the pairwise comparison of two adjacent lighting conditions, but is also more difficult to explain.

Another example is an ordered logistic regression model with random effects, as opposed to the classical non-parametric methods for responses on 7-point scales.

Also, the applied models assume no interaction between subject and lighting condition; a partial investigation was done into the validity of that assumption and the consequences for the type I error rate if it were violated.

    The response most strongly affected by lighting was the visual acuity (i.e. how small are the smallest objects a participant can see); comfort and appraisal are also affected. The conclusion was that lowering energy consumption rates in office lighting can be disruptive for visual acuity and comfort, especially within an ageing workforce. Some visualizations and summaries from the models for these responses will be discussed.

    The entire analysis consists of several methods, most of which are well-known. The difficulty of the analysis lies in finding the right combination of tools, considering aspects such as plausible model assumptions, complexity, and customs in publishing.
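The factor-versus-polynomial choice discussed above can be made concrete with standard orthogonal polynomial contrasts for four equally spaced levels (a generic coding sketch, not the study's actual code):

```python
# Two codings of a 4-level lighting condition for a design matrix row.
LEVELS = [1, 2, 3, 4]

def factor_coding(level):
    """Treatment (dummy) coding: level 1 is the reference category."""
    return [1 if level == k else 0 for k in (2, 3, 4)]

# Standard orthogonal polynomial contrasts for 4 equally spaced levels.
LINEAR    = [-3, -1, 1, 3]
QUADRATIC = [1, -1, -1, 1]
CUBIC     = [-1, 3, -3, 1]

def polynomial_coding(level):
    """Continuous-predictor coding via linear/quadratic/cubic contrasts."""
    i = LEVELS.index(level)
    return [LINEAR[i], QUADRATIC[i], CUBIC[i]]
```

Dropping the cubic (or cubic and quadratic) columns is what can sharpen pairwise comparisons of adjacent conditions, at the cost of a model that is harder to explain than the plain factor coding.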
  • Use of Screening Design for Multiple Quality Characteristics

    Authors: Jai-Hyun Byun (Gyeongsang National University)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Multiple quality characteristics, Design of Experiments, Screening design, Fractional factorial design, Simultaneous optimization
    Submitted at 13-Jun-2015 10:52 by Jai-Hyun Byun
    7-Sep-2015 10:40 Use of Screening Design for Multiple Quality Characteristics
For product or process design and development, it is common to optimize multiple responses (characteristics) based on experimental data. To determine optimal factor conditions, we need to design the experiment, obtain a proper model for each response, and optimize the multiple responses simultaneously. There are several techniques and many research results on optimizing multiple responses simultaneously once the experimental data are available. However, the experimental design issue for optimizing multiple responses has not yet been discussed. This paper proposes some ideas on how to plan a screening design when multiple responses are to be optimized. A case study is also presented to show the validity of the approach.
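The abstract does not give a concrete design; as a hedged example only, a standard eight-run 2^(4-1) screening design (generator D = ABC, resolution IV), on which several responses could be measured per run, can be built as:

```python
import itertools

def fractional_factorial_2_4_1():
    """Eight-run 2^(4-1) fractional factorial with generator D = ABC
    (resolution IV): the fourth factor is aliased with the ABC interaction."""
    runs = []
    for a, b, c in itertools.product((-1, 1), repeat=3):
        runs.append((a, b, c, a * b * c))
    return runs

design = fractional_factorial_2_4_1()
# Each run would then be measured on every response of interest, and the
# responses modelled and optimised simultaneously, as the abstract proposes.
```

The generator relation means a·b·c·d = +1 on every run, which is what the defining relation I = ABCD expresses.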