ENBIS-14 in Linz

21 – 25 September 2014; Johannes Kepler University, Linz, Austria
Abstract submission: 23 January – 22 June 2014

My abstracts

 

The following abstracts have been accepted for this event:

  • Nonparametric Estimation and Forecasting of Time Series with Deterministic Trend and Season and Traffic Fatalities in Germany

    Authors: Harry Haupt (University of Passau), Joachim Schnurbus (University of Passau)
    Primary area of focus / application: Economics
    Secondary area of focus / application: Modelling
    Keywords: Mixed kernel forecasting, Smoothing, Deterministic trend and season, Break points
    Submitted at 28-Apr-2014 18:32 by Joachim Schnurbus
    Accepted
    22-Sep-2014 11:05 Nonparametric Estimation and Forecasting of Time Series with Deterministic Trend and Season and Traffic Fatalities in Germany
A simple procedure for simultaneous smoothing in time series regressions with deterministic components, lagged endogenous, and exogenous variables is proposed. Using monthly data on traffic fatalities in Germany from January 1991 to December 2013, the forecasting performance is compared to several commonly used forecasting methods, such as ARIMA and innovation state space models, as well as naive forecasting approaches. Besides analyzing the forecasting performance using repeated hold-out samples, we compare the proposed methods in terms of identifying and adapting to breaks in the trend component that are not explained by political (traffic regulations), economic, or other (e.g. safety-related) factors.
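As a rough sketch of the repeated hold-out evaluation described above (all function names are illustrative, and the parametric trend-plus-season regression is a stand-in assumption; the paper itself uses a nonparametric mixed-kernel smoother):

```python
import numpy as np

def trend_season_forecast(y, horizon, period=12):
    """OLS fit of a linear trend plus seasonal dummies; a simple parametric
    stand-in for the paper's nonparametric mixed-kernel estimator."""
    n = len(y)
    t = np.arange(n)
    X = np.column_stack([np.ones(n), t] +
                        [(t % period == s).astype(float) for s in range(1, period)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    tf = np.arange(n, n + horizon)
    Xf = np.column_stack([np.ones(horizon), tf] +
                         [(tf % period == s).astype(float) for s in range(1, period)])
    return Xf @ beta

def naive_seasonal(y, horizon, period=12):
    """Naive benchmark: repeat the last observed seasonal cycle."""
    return np.array([y[len(y) - period + (h % period)] for h in range(horizon)])

def rolling_rmse(y, forecaster, horizon=12, n_folds=5, **kw):
    """Repeated hold-out evaluation: expanding estimation window,
    RMSE averaged over the hold-out folds."""
    errs = []
    for i in range(n_folds):
        cut = len(y) - horizon * (n_folds - i)
        fc = forecaster(y[:cut], horizon, **kw)
        errs.append(np.sqrt(np.mean((y[cut:cut + horizon] - fc) ** 2)))
    return float(np.mean(errs))
```

On a series with a genuine deterministic trend, the trend-plus-season regression should beat the naive seasonal benchmark in this comparison, since the naive forecast ignores the trend over the forecast horizon.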
  • An Inverse Problem to Simulate the Degradation Process of Semiconductor Devices

    Authors: Barbara Pedretscher (KAI Kompetenzzentrum für Automobil- und Industrieelektronik GmbH), Olivia Bluder (KAI Kompetenzzentrum für Automobil- und Industrieelektronik GmbH), Barbara Kaltenbacher (Alpen Adria Universität Klagenfurt), Kathrin Plankensteiner (KAI Kompetenzzentrum für Automobil- und Industrieelektronik GmbH)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Modelling
    Keywords: Semiconductor Reliability, Modeling Lifetime Data, State Space Model, Adjoint Method
    Submitted at 29-Apr-2014 09:53 by Barbara Pedretscher
    Accepted (view paper)
    24-Sep-2014 10:15 An Inverse Problem to Simulate the Degradation Process of Semiconductor Devices
In the automotive industry, reliability analysis of semiconductor devices is highly important to discover device weaknesses and to guarantee safe operation. Due to limited resources it is not possible to test all devices. Consequently, reliable models for device lifetime are necessary. Previous investigations show that commonly used acceleration models, such as the Arrhenius relationship, cannot capture the complex lifetime behavior and therefore provide insufficient forecasts. Hence, advanced models such as stochastic processes have to be considered for modeling degradation. In order to describe the trend of this degradation process, a state space model is proposed.

    The data contains input parameters of accelerated stress tests (current, voltage, pulse width, repetition time), lifetime data measured in Cycles to Failure (CTF) and special electrical measurements (ATE-measurements), which are available at pre-defined stress cycles. It is assumed that the mentioned ATE-measurements are indicators for the progress of the degradation.

    A state space model combines state variables with input and output data via a state differential equation and an observation equation. In this context CTFs, modeled by first passage times, and ATE-measurements belong to the output.
Since crack propagation and the implied temperature increase are the main causes of device failure, we model an interacting system of differential equations with two state variables that map the time behavior of these failure causes. Usually, crack propagation is modeled by the Paris law as a function of the stress range. Since wafer bow measurements on the metallization of interest indicate that the stress range is in the plastic regime, meaning that the stress does not change significantly with small changes in the temperature rise, the temperature-dependent strain is used for modeling instead. For this purpose, Hooke's law, which relates the stress range, the strain, and the modulus of elasticity, is used. Measurements show that the modulus of elasticity itself can also be modeled as temperature dependent.
The temperature rise in the device can be modeled by the input parameters of the lifetime test and a thermal coefficient reflecting the material properties. The input parameters do not change over the testing time, but the temperature rise in the device under test increases; hence the thermal coefficient is directly proportional to the temperature rise. Generally speaking, the coefficient mirrors the thermal connection of the material layers in the device. Since it needs to tend towards a saturation level, we use a logistic curve to model its behavior. Thus, the state variables are the degradation, modeled via crack propagation, and the thermal coefficient.
Due to the high number of unknown parameters compared to the available measurements, the parameter estimation is challenging. Under the assumption of normally distributed measurement errors, the log-likelihood function can be used as a cost function. For efficient minimization, the gradient of the log-likelihood function with respect to the parameters is needed. To avoid the expensive calculation of sensitivities, an adjoint method using Lagrange multipliers is applied.
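To make the last step concrete, here is a minimal, purely illustrative sketch (a scalar toy state equation, not the authors' model): with Gaussian measurement errors the negative log-likelihood reduces to a least-squares cost, and its gradient can be obtained with one backward adjoint sweep instead of forward sensitivity equations.

```python
import numpy as np

def simulate(theta, x0, y, h=0.1):
    """Forward pass for a toy discretized state equation
    x_{k+1} = x_k + h*theta*(1 - x_k)  (relaxation towards a saturation level)."""
    x = np.empty(len(y))
    x[0] = x0
    for k in range(len(y) - 1):
        x[k + 1] = x[k] + h * theta * (1.0 - x[k])
    return x

def cost_and_adjoint_grad(theta, x0, y, h=0.1):
    """Least-squares cost (Gaussian negative log-likelihood up to constants)
    and its gradient w.r.t. theta via the backward adjoint recursion
    lambda_k = r_k + (1 - h*theta) * lambda_{k+1}."""
    x = simulate(theta, x0, y, h)
    r = x - y                          # residuals
    J = 0.5 * np.sum(r ** 2)
    N = len(y) - 1
    lam = r[N]                         # terminal adjoint
    grad = 0.0
    for k in range(N - 1, -1, -1):
        grad += lam * h * (1.0 - x[k])          # lam equals lambda_{k+1} here
        lam = r[k] + (1.0 - h * theta) * lam    # step back to lambda_k
    return J, grad
```

The adjoint gradient agrees with a finite-difference approximation, but needs only one backward pass regardless of the number of parameters, which is the point of the method.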
  • Towards More Reliable Analysis of Nonregular Designs

    Authors: John Tyssedal (The Norwegian University of Science and Technology)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Factor based, Nonregular designs, Projection properties, Screening
    Submitted at 29-Apr-2014 11:51 by John Tyssedal
    Accepted
    23-Sep-2014 09:00 Towards More Reliable Analysis of Nonregular Designs
Two-level nonregular designs have been one of the main research topics within Design of Experiments over the last 20 years. Analyzing them has been considered rather challenging, and more a task for experts than for practitioners. Many methods have been proposed, both factor based and effect based, frequentist and Bayesian. Among the suggested methods are also penalized least squares methods such as the LASSO, SCAD, and the Dantzig selector. Most methods are demonstrated to work well on given examples, but their overall performance is still rather unclear. As a result, there seems to be a lot of confusion among practitioners about how to analyze data from such experiments, and highly disputable advice about which methods to use and which assumptions are needed for their analysis. In this talk we first explain what we mean by a reliable analysis and then demonstrate by examples how such an analysis can be performed.
  • Factor Screening in Non-Regular Two-Level Designs based on Projection Based Variable Selection

    Authors: Shahrukh Hussain (Department of Mathematics), John Sølve Tyssedal (Department of Mathematics)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Design and analysis of experiments
Keywords: Screening, Non-regular designs, PB design, Multiple determination
    Submitted at 29-Apr-2014 12:57 by Shahrukh Hussain
    Accepted
    23-Sep-2014 09:20 Factor Screening in Non-Regular Two-Level Designs Based on Projection Based Variable Selection
In this paper we focus on the problem of factor screening in non-regular two-level designs by gradually reducing the number of possible sets of active factors. Three methods are tested on a panel of models consisting of three or four active factors, with data generated using a 12-run PB design. All three methods do variable selection on projection models, but differ in the way terms are entered into a model. The dependence of the procedures on the amount of noise, the number of experimental factors, and the type of model is also investigated. It turns out that variable selection based on the change in the coefficient of multiple determination had the best overall performance. A real example is included to show how we suggest factor screening can be done in practice.
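A minimal sketch of the projection idea (the generator row is the standard 12-run Plackett-Burman one; the function names are illustrative, and ranking projections by plain R² is a simplification of the paper's change-in-R² procedure):

```python
import numpy as np
from itertools import combinations

def pb12():
    """12-run Plackett-Burman design: cyclic shifts of the standard
    generator row plus a final all-minus row."""
    g = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(g, k) for k in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

def projection_r2(D, y, cols):
    """R^2 of the projection model: intercept, main effects, and all
    two-factor interactions of the factors in `cols`."""
    X = [np.ones(len(y))]
    X += [D[:, c].astype(float) for c in cols]
    X += [(D[:, a] * D[:, b]).astype(float) for a, b in combinations(cols, 2)]
    X = np.column_stack(X)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ss_res = np.sum((y - X @ beta) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def screen(D, y, k=3):
    """Rank all k-factor projections by R^2; the top-scoring projection
    is the candidate set of active factors."""
    scores = {c: projection_r2(D, y, c) for c in combinations(range(D.shape[1]), k)}
    return max(scores, key=scores.get)
```

Because a 12-run PB design projects onto a rich model in any three factors, the true active set produces a near-perfect fit, while partially aliased wrong projections cannot absorb the remaining large effects.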
  • Statistical Literacy of Business Students

    Authors: Christine Duller (Johannes Kepler University Linz)
    Primary area of focus / application: Education & Thinking
Keywords: Statistical thinking, Teaching statistics, Education, Empirical study
    Submitted at 29-Apr-2014 15:10 by Christine Duller
    Accepted
    23-Sep-2014 16:00 Statistical Literacy of Business Students
This empirical study shows the difficulties students have in interpreting "statistical" diagrams and in very basic mathematics and statistics. The data were collected at the University of Linz, Austria, in March 2001, 2003, 2005, 2007 and 2013. The samples consist of participants in basic statistics courses, which are obligatory for all students in economics, business and social sciences. The questionnaire was made up of eleven questions dealing with mathematics or statistics; most of them were multiple-choice questions with six response options. The topics were the interpretation and calculation (without a calculator) of percentages and fractions, the interpretation of graphics, and the (rough) estimation of square roots and percentages.

The results were sobering: as some examples show, even the majority was not always right. The samples show that most students have serious difficulties in interpreting "statistical" diagrams, as well as problems with very basic mathematics.
  • Redefining Maintenance Events: A Study of Load-Haul Dump Machines

    Authors: Chris McCollin (Nottingham Trent University)
    Primary area of focus / application: Reliability
Keywords: Maintenance, Differential equations, Non-homogeneous Poisson processes, Load-haul dump machines
    Submitted at 29-Apr-2014 15:38 by Chris McCollin
    Accepted
    22-Sep-2014 12:15 Redefining Maintenance Events: A Study of Load-haul Dump Machines
The paper addresses a new definition of a maintenance event that accounts for the continuous nature of aging (as well as wear) and for maintenance rules for periodic servicing.
Differential equations are then developed to explain the operations and maintenance aspects of systems. These are compared with well-known differential equation structures. A comparison is made with existing non-homogeneous Poisson process models. The physical structure of systems event data is shown graphically, and the model obtained is applied to a well-known data set. Comments on the meaning of the structure, the interrelationship of the components within the system, and future work are discussed.