ENBIS-16 in Sheffield

11 – 15 September 2016; Sheffield
Abstract submission: 20 March – 4 July 2016

My abstracts


The following abstracts have been accepted for this event:

  • Extreme Value Analysis for Multivariate Process Monitoring

    Authors: Stijn E. Luca (KU Leuven), David A. Clifton (University of Oxford), Bart Vanrumste (KU Leuven)
    Primary area of focus / application: Process
    Secondary area of focus / application: Mining
    Keywords: Multivariate process monitoring, Extremes, Patterns, Health monitoring
    Submitted at 4-Jul-2016 10:20 by Stijn Luca
    Accepted
    14-Sep-2016 09:00 Extreme Value Analysis for Multivariate Process Monitoring
    Process monitoring is a diverse field that is applicable to the monitoring of machines, industrial production processes and health monitoring. Fault detection is a crucial part of process monitoring, where one wishes to identify faults or abnormalities that indicate a deviation from the normal state of the system.
    We treat a problem related to the detection of faults in multivariate processes. Most of the literature on multivariate process monitoring deals with point-wise approaches that evaluate individual points of the process. This work treats the more general problem of modelling patterns of measurements from a process. In particular, we consider the general problem of determining whether a given set of d-dimensional data points S={x_1,x_2,…,x_n} comes from a statistical distribution X modelling the “normal” state of a process.
    To this end, it is shown that models based on extreme value statistics can be used for multivariate process monitoring, enabling the fusion of different types of information hidden in low-density regions of data space. This is particularly beneficial when data from the faulty situation are so sparse that a model of the abnormal state is unavailable and other classification techniques are sub-optimal.
    The method is illustrated on a healthcare application in which epileptic patients are monitored during their daily routine. Treating epileptic seizures as extreme, or statistically rare, allows them to be detected in a set of acceleration data acquired from a wristband.
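    The extreme-value idea can be sketched in a few lines. This is a minimal illustration, not the authors' method: it fits a Gaussian to invented "normal" data and applies a min-over-n extreme-value argument to the lowest-density point of a test set, so a whole set of points is judged at once rather than point-wise.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "normal" training data: a 2-D Gaussian stands in for
    # in-control process measurements.
    train = rng.normal(size=(2000, 2))

    mu = train.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(train, rowvar=False))

    def log_density(x):
        """Gaussian log-density, up to an additive constant."""
        d = np.atleast_2d(x) - mu
        return -0.5 * np.einsum('ij,jk,ik->i', d, cov_inv, d)

    train_ld = log_density(train)

    def window_p_value(window):
        """P(min log-density of n in-control points <= observed min):
        an extreme-value statement about the least 'normal' point of the set."""
        n = len(window)
        F = np.mean(train_ld <= log_density(window).min())  # empirical tail prob.
        return 1.0 - (1.0 - F) ** n

    p_normal = window_p_value(rng.normal(size=(20, 2)))           # in-control set
    p_faulty = window_p_value(rng.normal(loc=4.0, size=(20, 2)))  # shifted set
    ```

    A small p-value flags the whole window as inconsistent with the normal state; note that no model of the faulty state is needed, mirroring the sparse-fault setting of the abstract.
    
    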
  • Field Failure Forecasting Using Warranty Claim Data

    Authors: Antonio Pievatolo (CNR-IMATI)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Reliability
    Keywords: Warranty claim data, Poisson-lognormal process, Particle learning, Claim forecasting
    Submitted at 4-Jul-2016 16:23 by Antonio Pievatolo
    Accepted
    13-Sep-2016 14:50 Field Failure Forecasting Using Warranty Claim Data
    Bayesian methods provide a natural framework for the analysis of data that become sequentially available, whether for descriptive, predictive or prescriptive purposes. We present a methodology for the prediction of future claims, based on a nonlinear dynamic model with Poisson observable counts and dynamic failure rates. An error-propagation approximation makes it possible to assign a meaningful covariance structure to the failure rates, resulting in a dynamic Poisson-lognormal model. An adaptation of the auxiliary particle filter similar to the particle learning algorithm is used for parameter learning and forecasting. A series of examples with real datasets shows that a small forecasting error can be obtained for claims with very different patterns.
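    The flavour of such a filter can be sketched with a plain bootstrap particle filter on simulated counts. This is a simplified stand-in, not the paper's algorithm: the random-walk standard deviation is fixed rather than learned (the particle-learning adaptation would also track static parameters), and the data are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated claim counts: the latent log failure rate follows a random walk,
    # a crude proxy for the dynamic Poisson-lognormal structure.
    T, sigma = 60, 0.1
    log_rate = np.log(20) + np.cumsum(rng.normal(0, sigma, T))
    counts = rng.poisson(np.exp(log_rate))

    # Bootstrap particle filter over the latent log-rate.
    N = 5000
    particles = np.full(N, np.log(20.0))
    forecasts = []
    for y in counts:
        particles = particles + rng.normal(0, sigma, N)  # propagate the state
        lam = np.exp(particles)
        forecasts.append(lam.mean())                     # one-step-ahead forecast
        logw = y * np.log(lam) - lam                     # Poisson log-likelihood
        w = np.exp(logw - logw.max())
        particles = particles[rng.choice(N, size=N, p=w / w.sum())]  # resample

    mae = float(np.mean(np.abs(np.array(forecasts) - counts)))
    ```

    Each forecast is made before the corresponding count is assimilated, so `mae` is a genuine out-of-sample one-step-ahead error.
    
    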
  • Reliability of Student Evaluations of Teaching

    Authors: Amalia Vanacore (University of Naples Federico II, Department of Industrial Engineering), Maria Sole Pellegrino (University of Naples Federico II, Department of Industrial Engineering)
    Primary area of focus / application: Quality
    Keywords: Student Evaluation of Teaching (SET), Inter-rater agreement, Inter-rater reliability, Intra-rater reliability
    Submitted at 4-Jul-2016 22:33 by Maria Sole Pellegrino
    Accepted
    13-Sep-2016 16:20 Reliability of Student Evaluations of Teaching
    Student Evaluations of Teaching (SETs) are the most common way to measure teaching quality in Higher Education (HE). They are assuming a strategic role in monitoring teaching quality and are becoming helpful in taking major formative academic decisions aimed at improving and shaping teaching quality for future courses, starting from the evidence provided by quality evaluations.
    In the context of HE, the choice of the student as primary rater of teaching quality has dominated worldwide for the past 40 years. Nevertheless, a review of the specialized literature shows that researchers widely debate whether SETs can be considered reliable measures for quality evaluation. Although some researchers believe that SETs are the best option for providing quantifiable and comparable measures, others point out that student ratings can be influenced by significant factors (such as the student's own classroom experience and class size) and thereby lose reliability. The majority of studies concerning SET reliability focus on the instruments and procedures adopted to collect students' evaluations, or on inter-rater reliability rather than intra-rater reliability.
    In this paper the main results of a reliability study on SETs provided for a university course are fully illustrated. The data sets were collected during three supervised experiments carried out on successive classes, each with more than 20 students and homogeneous in curriculum and instruction.
    SETs collected in each experiment have been analysed by measuring the reliability of each student (i.e. single rater), the reliability of the whole class (i.e. multiple raters) and the agreement among students. In particular, reliability has been measured as the coherence of the evaluations provided on two occasions using the same rating instrument (i.e. same items and same rating scale) or using partially different rating instruments (i.e. same items but different rating scales), whereas agreement has been measured for students' ratings collected under exactly the same conditions (i.e. same occasion, same items and same rating scales).
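    Chance-corrected agreement of the kind used in such reliability studies can be illustrated with Cohen's kappa on hypothetical test-retest ratings. The coefficient, item count and ratings below are invented for illustration; the paper's exact agreement measures and data may differ.

    ```python
    import numpy as np

    def cohen_kappa(r1, r2, categories):
        """Chance-corrected agreement between two sets of categorical ratings."""
        r1, r2 = np.asarray(r1), np.asarray(r2)
        po = np.mean(r1 == r2)                                   # observed agreement
        pe = sum(np.mean(r1 == c) * np.mean(r2 == c)             # expected by chance
                 for c in categories)
        return (po - pe) / (1 - pe)

    # One student re-rates the same 10 items on a 1-5 scale on a second occasion
    # (intra-rater reliability in the sense of the abstract).
    occ1 = [4, 5, 3, 4, 4, 5, 2, 3, 4, 5]
    occ2 = [4, 5, 3, 4, 3, 5, 2, 3, 4, 4]
    kappa = cohen_kappa(occ1, occ2, categories=range(1, 6))
    ```

    With 8 of 10 items rated identically, the observed agreement of 0.8 is corrected for the agreement expected by chance, yielding a kappa of roughly 0.72.
    
    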
  • Using Dynamic Bayesian Networks to Model Technical Risk Management Efficiency

    Authors: Anan Halabi (Department of Mathematics "G. Peano", University of Turin), Ron S. Kenett (Department of Mathematics "G. Peano", University of Turin), Laura Sacerdote (Department of Mathematics "G. Peano", University of Turin)
    Primary area of focus / application: Business
    Secondary area of focus / application: Modelling
    Keywords: Bayesian networks, Dynamic Bayesian network, Risk, Expert subjective assessment, What-if scenario
    Submitted at 4-Jul-2016 22:34 by Anan Halabi
    Accepted
    13-Sep-2016 14:30 Using Dynamic Bayesian Networks to Model Technical Risk Management Efficiency
    The objective of this paper is to develop a mathematical model for achieving risk management efficiency in product development. The reduction of overall product risks before product release to production can be used as an indicator of risk management efficiency. Acceptable risk targets vary according to factors such as industry type, organization characteristics and regulations. In general, the risk management process contains the following phases: identification of risks, analysis, control and feedback. We propose a mathematical approach using dynamic Bayesian networks to evaluate how product risks evolve over time during development. The properties of the derived model are evaluated using two validation methods: k-fold cross-validation and leave-one-out. Imputation methods such as multivariate normal imputation were used to deal with missing data. In addition, a sensitivity analysis is performed to assess the uncertainty embedded in the learned parameters of the dynamic Bayesian network.
    A further application of the model is to support decision makers in deciding whether to release a product to customers for Beta testing or to conduct additional accelerated activities in order to reduce the overall risk level before customer shipment. In addition, the model is used for prediction purposes and provides an estimate of the expected risk at time t+1 based on the level of risk at time t.
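    A what-if comparison of this kind can be sketched with a small discrete dynamic Bayesian network fragment. The states, actions and transition probabilities below are invented for illustration only; they are not taken from the paper, whose network is learned from data.

    ```python
    import numpy as np

    # Hypothetical two-slice DBN fragment: risk level in {low, medium, high},
    # with a transition matrix that depends on whether mitigation is applied.
    T_mitigate = np.array([[0.95, 0.05, 0.00],
                           [0.60, 0.35, 0.05],
                           [0.20, 0.50, 0.30]])
    T_none     = np.array([[0.80, 0.15, 0.05],
                           [0.10, 0.60, 0.30],
                           [0.05, 0.25, 0.70]])

    def predict(belief, actions):
        """Propagate a belief over the risk level through successive time slices."""
        b = np.asarray(belief, dtype=float)
        for a in actions:
            T = T_mitigate if a == "mitigate" else T_none
            b = b @ T          # P(risk at t+1) = sum_i P(risk_t=i) P(j|i, action)
        return b

    start = [0.1, 0.4, 0.5]                       # mostly medium/high risk at t
    with_m  = predict(start, ["mitigate"] * 3)    # what-if: mitigate 3 periods
    without = predict(start, ["none"] * 3)        # what-if: do nothing
    ```

    Comparing the predicted probability of the "high" state under the two action sequences is exactly the release-versus-more-testing trade-off described above, one slice of risk prediction at a time.
    
    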
  • Improve Efficiency in Problem Solving to Enhance Robustness in Low Volume Aircraft Component Production

    Authors: Sören Knuts (GKN Aerospace Sweden)
    Primary area of focus / application: Process
    Secondary area of focus / application: Six Sigma
    Keywords: Robustness, Problem solving, PPS/A3, Root cause analysis, SPS system
    Submitted at 4-Jul-2016 23:37 by Sören Knuts
    14-Sep-2016 09:40 Improve Efficiency in Problem Solving to Enhance Robustness in Low Volume Aircraft Component Production
    Low volume production means that only a small number of data points is available when optimizing a production process. It is therefore challenging for low-volume products to meet robustness requirements in production. Within GKN Aerospace a management-level initiative called “Road to Zero” has been taken to achieve a high robustness level in production. Since robustness cannot be taken for granted when production changes are introduced, an efficient problem solving process is of great importance and must be a natural part of how robustness is handled in production.
    Like many other manufacturing companies, GKN Aerospace is confronted with recurring internal problems in production. Recurring problems are waste: they consume production resources, energy, material and lead time, and they stress the personnel.

    This study draws conclusions from several studies performed within GKN, including interviews, a survey and a review of internal documents.

    The following improvement factors have been identified:
    1) The lack of understanding and description of the failure defect (within the failure report).
    2) The lack of problem characterization in the description asked for by the Practical Problem Solving template (within PPS/A3).
    3) Shortcomings in how the root cause analysis method is performed when breaking down and identifying the root cause (within PPS/A3).
    4) The ability to handle measurement data related to robustness problems in production, i.e. what is measured and reported (within the SPS system).
    5) A go/no-go approval of parts has to be replaced by controlling robustness in the production process at sub-process level with appropriate process data (within the SPS system).
    6) The need for good communication and a unified language within the problem solving team.
    7) Spare time for problem solving.
    8) Operators incorporated in the problem solving team and trained to understand the purpose of problem solving.
    9) A tiger team to assist and boost the ability to solve a problem.
    10) Management acting as a stakeholder of good problem solving and asking for 1)-9).

    By improving the efficiency of the problem solving process, GKN can reduce the number of recurring non-conformances and thereby enhance its competitiveness and margins in current business.
  • Phase I Distribution-Free Analysis of Multivariate Data: Current Approaches and New Proposals

    Authors: Giovanna Capizzi (Department of Statistical Sciences, University of Padua)
    Primary area of focus / application: Process
    Keywords: Change-points, Multivariate sign, Multivariate ranks, Nonparametric, Preliminary analysis, Retrospective analysis, Statistical process monitoring
    Submitted at 7-Jul-2016 20:03 by Giovanna Capizzi
    Accepted
    13-Sep-2016 12:10 Phase I Distribution-Free Analysis of Multivariate Data: Current Approaches and New Proposals
    Phase I distribution-free methods have received increasing attention in the recent statistical process monitoring literature. A general discussion and some contributions in the univariate framework can be found in Capizzi (2015), Jones-Farmer et al. (2009), Jones-Farmer and Champ (2010), Graham et al. (2010) and Capizzi and Masarotto (2013). Two distribution-free Phase I Shewhart control charts, based on Mahalanobis depth ranks and spatial signs, have recently been suggested for the analysis of subgrouped data from a multidimensional elliptical distribution. The need for distribution-free methods in Phase I applications will be illustrated. After briefly reviewing the existing nonparametric Phase I procedures for multivariate observations, two alternative Shewhart schemes, based on multivariate signed ranks and spatial ranks respectively, will be introduced and discussed. With the proposed approach, the assumption of an elliptical underlying distribution, often difficult to check before establishing process stability, can be avoided, even when the sample size is not large. Further, a multivariate generalization of the Recursive Segmentation and Permutation procedure suggested in Capizzi and Masarotto (2013) is introduced. This new Phase I control chart (i) can handle individual and subgrouped multivariate data; (ii) does not require that the in-control distribution belong to the elliptical family and can be used for the simultaneous monitoring of discrete count and continuous variables; (iii) is designed to efficiently detect isolated or single-step shifts as well as more complex out-of-control patterns commonly encountered in practical applications. The results of a simulation study will be presented to compare the different surveillance schemes in a variety of out-of-control scenarios.
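    The depth-rank idea behind such charts can be sketched generically. This is an illustration with simulated data, not the charts from the talk: squared Mahalanobis distances of the pooled Phase I sample are ranked, and each subgroup's rank sum is standardized with the without-replacement rank-sum moments; the actual procedures and control limits differ.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Phase I data: m subgroups of size n in d dimensions, pooled retrospectively.
    # The last subgroup is shifted to mimic an out-of-control condition.
    m, n, d = 20, 5, 3
    X = rng.normal(size=(m, n, d))
    X[-1] += 2.0

    pooled = X.reshape(-1, d)
    centered = pooled - pooled.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pooled, rowvar=False))
    dist = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)  # Mahalanobis^2

    # Depth ranks: a larger distance means a lower depth; chart subgroup rank sums.
    ranks = dist.argsort().argsort() + 1
    N = m * n
    W = ranks.reshape(m, n).sum(axis=1)
    mean_W = n * (N + 1) / 2                    # null mean of a rank sum
    var_W = n * (N + 1) * (N - n) / 12          # sampling without replacement
    Z = (W - mean_W) / np.sqrt(var_W)           # standardized chart statistic
    ```

    Because only ranks of the distances enter the statistic, its in-control distribution does not depend on the (elliptical) parent distribution, which is the distribution-free property the abstract exploits.
    
    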