ENBIS-16 in Sheffield

11–15 September 2016, Sheffield
Abstract submission: 20 March – 4 July 2016

My abstracts


The following abstracts have been accepted for this event:

  • Experiences of Classroom Response Systems in an Introductory Course for Engineers

    Authors: Jesper Rydén (Uppsala University)
    Primary area of focus / application: Education & Thinking
    Keywords: Introductory course, Engineering education, Classroom response systems, E-learning
    Submitted at 25-May-2016 11:55 by Jesper Rydén
    Accepted
    12-Sep-2016 10:40 Experiences of Classroom Response Systems in an Introductory Course for Engineers
    Classroom response systems (sometimes called clickers) can be useful when teaching large groups of students. Such devices offer several benefits, for instance maintaining students' attention during a lecture, or allowing the instructor to check students' understanding of concepts introduced earlier in the course or in the lecture itself. Obviously, the system needs to be tailored to the subject in question, for instance mathematical statistics. Examples will be given of problems and questions posed in an introductory course. Moreover, some participants of the course were located some 300 km away, and lectures were typically delivered using video-conference equipment; implications of this way of teaching will be discussed.
  • Automatic Splat Detection in HVOF Spraying Processes

    Authors: Dominik Kirchhoff (TU Dortmund University), Sonja Kuhnt (Dortmund University of Applied Sciences and Arts)
    Primary area of focus / application: Mining
    Keywords: Splats, Image processing, Circular clustering, HVOF spraying, Coating
    Submitted at 25-May-2016 14:31 by Dominik Kirchhoff
    Accepted
    13-Sep-2016 09:40 Automatic Splat Detection in HVOF Spraying Processes
    For the coating of surfaces, thermal spraying processes are becoming increasingly popular. An interesting technique is high velocity oxygen fuel (HVOF) spraying, in which a powder is fed into a jet, accelerated and heated by a mixture of oxygen and fuel, and finally deposited as a coating on a substrate in the form of small, approximately circular so-called splats.

    To gain more information about the influence of the parameter settings and in-flight particle properties, one can analyze scanning electron microscope (SEM) images of splats. Here, we propose an algorithm that automatically detects splats in such images. Previously, this was not trivially possible, since existing software for object detection often requires the objects to be very similar and both foreground and background to be homogeneous. Splats, however, vary in shape, size, and structure, and the images tend to be quite noisy.

    Our algorithm fits circles to the splats in an iterative manner, mainly using Rotational Difference Kernel Estimators (RDKE) to detect edges and circular clustering to find circles. We develop a measure to quantify the goodness of fit of a set of circles given manually assigned circles and conduct a small study in which we find that the algorithm works well, except for images with many overlapping splats. The algorithm drastically reduces the amount of time researchers spend on counting splats. As a next step, a shape classifier could be applied based on the found splat locations.
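A minimal sketch of the circle-fitting step only, not the authors' RDKE/circular-clustering pipeline: an algebraic least-squares (Kåsa) fit of one circle to a set of edge points. The synthetic edge points and all names below are illustrative assumptions.

```python
import numpy as np

def kasa_circle_fit(x, y):
    """Algebraic least-squares (Kasa) circle fit to edge points.
    Solves x^2 + y^2 + a*x + b*y + c = 0 for a, b, c, then recovers
    the centre (cx, cy) and radius r."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2, -b / 2
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, r

# Synthetic "splat edge": noisy points on a circle of radius 20 centred at (50, 40)
rng = np.random.default_rng(0)
t = rng.uniform(0, 2 * np.pi, 200)
x = 50 + 20 * np.cos(t) + rng.normal(0, 0.5, 200)
y = 40 + 20 * np.sin(t) + rng.normal(0, 0.5, 200)
cx, cy, r = kasa_circle_fit(x, y)
print(cx, cy, r)  # close to (50, 40) and 20
```

A full detector would repeat such fits iteratively over clustered edge points; this sketch only shows why a circle can be recovered from a noisy edge by linear least squares.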
  • Bayesian Networks or Regression: Which Model is more Useful in my Case?

    Authors: Jan-Willem Bikker (CQM)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Business
    Keywords: Bayesian networks, Probabilistic networks, Case study, Machine learning, Modeling
    Submitted at 25-May-2016 14:54 by Jan-Willem Bikker
    12-Sep-2016 10:00 Bayesian Networks or Regression: Which Model is more Useful in my Case?
    The Dutch consultancy firm CQM (Consultants in Quantitative Methods, ~35 consultants) supports many customers’ projects in R&D, but also in planning and supply chain. Naturally, we also look at trends in machine learning and data science. For one relatively new type of project, we tried two approaches: a statistical approach using regression/ANOVA-type models, and one using probabilistic networks or Bayesian networks, a machine learning technique popular in e.g. computer science. Both approaches can be applied to a project where a user’s overall rating of a product is linked to assessments of underlying aspects of the product. We use such models to identify the product’s most important aspects, while recognizing the limitations of an observational study. One difference between the two modelling approaches is the quantification of sampling uncertainty, which Bayesian networks lack and which we find important when reporting the results. Another is the type of effect that is estimated: regression focuses on the underlying truth, whereas Bayesian networks focus on prediction of future observations, which leads to different effect sizes. The latter can be adapted, and we then see that these two models (and a few others as well) give very similar results. This reassuring result is possibly typical of classical statistical methods versus machine learning methods. Finally, we also encountered a variant of this case study in which 95% of the predictor values are missing, posing some modelling challenges. The discussion includes possibilities of imputation, SEM with missing values, and an analysis with data in long format.
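To illustrate the sampling-uncertainty point on the regression side, a minimal sketch with hypothetical aspect names and effect sizes (ordinary least squares with classical standard errors; not CQM's actual models or data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Hypothetical aspect scores driving an overall product rating
taste = rng.normal(size=n)
price = rng.normal(size=n)
rating = 2.0 + 1.5 * taste - 0.5 * price + rng.normal(scale=1.0, size=n)

# OLS fit: regression reports both effect estimates and their sampling uncertainty
X = np.column_stack([np.ones(n), taste, price])
beta, *_ = np.linalg.lstsq(X, rating, rcond=None)
resid = rating - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])          # residual variance estimate
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))  # classical standard errors
for name, b, s in zip(["intercept", "taste", "price"], beta, se):
    print(f"{name}: {b:.2f} +/- {2 * s:.2f}")
```

The standard errors are exactly the sampling-uncertainty quantification that, per the abstract, a plain Bayesian-network prediction score does not report out of the box.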
  • Two-level Factorial and Fractional Factorial Designs in Blocks of Size Two

    Authors: Janet Godolphin (University of Surrey, UK)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Aliasing, Confounding, Design, Factorial, Orthogonal
    Submitted at 25-May-2016 15:39 by Janet Godolphin
    14-Sep-2016 09:40 Two-level Factorial and Fractional Factorial Designs in Blocks of Size Two
    Two-level factorial and fractional factorial designs are widely used in screening experiments to determine which factors have a significant effect on an industrial process. Practical constraints can make it necessary to arrange the runs in blocks of size two. For example, it might only be possible to conduct two runs in a fixed period of time, or a batch of raw material may only be sufficient for two runs.

    The construction of designs comprising replicates of two-level full factorials or fractional factorials, arranged in blocks of size two to estimate main effects and two-factor interactions, is investigated. A construction procedure is developed based on a new concept of Selected Treatment Combinations. This provides a straightforward method of design generation in which the number of replicates of full or fractional factorials is minimised. The approach is illustrated by examples.
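The Selected Treatment Combinations construction itself is not detailed in the abstract. As background, the sketch below shows the classical textbook device it builds on: blocking a single replicate of a 2^3 factorial into blocks of size two by confounding the AB and AC interaction contrasts (and hence BC as well) with blocks.

```python
from itertools import product

# Full 2^3 factorial in factors A, B, C, coded -1/+1
runs = list(product([-1, 1], repeat=3))

# Blocks of size two: runs sharing the same signs of A*B and A*C
# fall in the same block, so AB, AC (and their product BC) are
# confounded with blocks and cannot be estimated from this replicate.
blocks = {}
for a, b, c in runs:
    key = (a * b, a * c)
    blocks.setdefault(key, []).append((a, b, c))

for key, block in sorted(blocks.items()):
    print(key, block)  # 4 blocks of 2 runs each
```

In the classical scheme further replicates with different confounding patterns are needed to recover the lost interactions; minimising that number of replicates is what the talk addresses.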
  • On the Use of Simulators for Teaching Statistical Methods

    Authors: Marco P. Seabra dos Reis (Department of Chemical Engineering, University of Coimbra), Ron S. Kenett (KPA Group)
    Primary area of focus / application: Education & Thinking
    Keywords: Statistical education, Process simulators, Pedagogical roadmap, The quality ladder, Statistics curriculum
    Submitted at 25-May-2016 16:48 by Marco P. Seabra dos Reis
    Accepted
    13-Sep-2016 10:30 On the Use of Simulators for Teaching Statistical Methods
    Teaching and training activities benefit from establishing links between the concepts under analysis and the reality of people’s daily experiences and working backgrounds. These links foster students’/trainees’ engagement and provide an efficient route to the contextualized acquisition of skills and competences. Engagement is even higher when trainees are called upon to play an active role in the learning process. The benefits of active learning are well established and constitute a cornerstone of many successful training programs. This talk presents a structured overview of simulation platforms for teaching statistical concepts. We propose a hierarchical classification scheme for process simulators based on their inherent complexity, and describe training situations where each class offers valuable solutions.
  • Markovian and Non-Markovian Sensitivity Enhancing Transformations for Process Monitoring

    Authors: Tiago J. Rato (University of Coimbra), Marco P. Seabra dos Reis (University of Coimbra)
    Primary area of focus / application: Process
    Keywords: Principal component analysis, Sensitivity enhancing transformation, Fault detection and diagnosis, Markovian and Non-Markovian modelling, Causal network
    Submitted at 26-May-2016 14:51 by Tiago Rato
    12-Sep-2016 10:40 Markovian and Non-Markovian Sensitivity Enhancing Transformations for Process Monitoring
    Industrial processes typically present a high degree of correlation among variables, as a result of the natural interactions, control policies and redundant sensors existing in the processing units. This redundant information is useful for tracking process variability and for detecting abnormalities through the construction of adequate reference models, with principal component analysis (PCA) being a popular methodology in this context. However, PCA only defines the region of acceptable values and does not take into account the process’s causal structure, which is fundamental for fault diagnosis. Other modelling alternatives based on partial correlations can extract the inner causal network underlying the data, and this information can be further used to model the correlation, namely via a sensitivity enhancing transformation (SET). In its simplest form, the SET models each variable by regression on its causal parents, following a Markovian approach. Alternatively, non-Markovian modelling is considered here by making use of all ancestors, thus potentially improving the descriptive power and robustness of the transform.

    The transformed variables are then well suited for detecting mean changes, as they highlight deviations from expectation while reducing the smearing effect problem of PCA. Furthermore, as they are uncorrelated but still directly related to the original variables, fault diagnosis benefits significantly from such a transform. This is achieved through a procedure similar to the Mason, Tracy and Young (MTY) decomposition, but without the cumbersome computation of multiple contribution terms.
    Both SET modelling approaches were compared against PCA methodologies and shown to significantly improve monitoring performance in several case studies.
    The authors acknowledge financial support through Portuguese FCT project PTDC/QEQ-EPS/1323/2014 / Compete Project nb. 016658, co-financed by the Portuguese FCT and European Union’s FEDER through the program “COMPETE 2020”.
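As an illustration of the Markovian SET idea described above (replace each variable by its residual after regression on its causal parents), here is a minimal numpy sketch on a hypothetical three-variable causal chain; the network, coefficients and function name are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
# Hypothetical causal chain x1 -> x2 -> x3 (parent structure assumed known)
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=n)
x3 = 0.6 * x2 + rng.normal(scale=0.5, size=n)
X = np.column_stack([x1, x2, x3])
parents = {0: [], 1: [0], 2: [1]}  # Markovian SET: regress on causal parents only

def set_transform(X, parents):
    """Replace each variable by its OLS residual w.r.t. its causal parents;
    root variables (no parents) are kept as-is."""
    Z = np.empty_like(X)
    for j, pa in parents.items():
        if not pa:
            Z[:, j] = X[:, j]
            continue
        A = np.column_stack([X[:, pa], np.ones(len(X))])
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        Z[:, j] = X[:, j] - A @ coef
    return Z

Z = set_transform(X, parents)
# The transformed variables are (approximately) uncorrelated:
print(np.round(np.corrcoef(Z, rowvar=False), 2))
```

Because the residuals are decorrelated yet each still corresponds to one original variable, a mean shift in, say, x2 shows up in its own residual without smearing onto the others, which is the diagnostic advantage the abstract highlights.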