ENBIS Spring Meeting 2017

28 – 30 May 2017; Monastery of Schlägl in Upper Austria
Abstract submission: 11 November 2016 – 5 March 2017

The following abstracts have been accepted for this event:

  • Evaluating the Environment of a Solar Inverter with Power Electronic Control Data

    Authors: Bernhard Lanz (Fronius International GmbH)
    Primary area of focus / application: Reliability
    Keywords: Power electronics, Solar inverter, Reliability, Condition monitoring
    Submitted at 23-Feb-2017 12:31 by Bernhard Lanz
    Accepted
    30-May-2017 12:35 Evaluating the Environment of a Solar Inverter with Power Electronic Control Data
    Fronius solar power inverters are designed to withstand harsh environmental conditions in different climate zones all over the world. Since the systems are supposed to be maintenance-free, a sufficient robustness margin, mainly achieved through oversizing, is provided. Future power electronic systems enable a significant decrease in weight and size, so oversizing should be avoided. Functions like condition monitoring and regeneration are essential to fulfill system reliability targets at reduced size.
    The sensors integrated in the power module are used to control the power conversion. It is assumed that the sensor data from different systems can also be used to obtain information about the environment of the inverters. Statistical methods are applied to the sensor data in order to explore the environmental conditions of an inverter, to trigger regeneration, or to adjust condition monitoring functions.
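    The idea of inferring environmental conditions from control sensor data can be sketched as follows. This is a minimal illustration with simulated data; the thermal model, the thermal resistance value, and all signal parameters are assumptions for illustration, not Fronius's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hourly readings from one inverter's power module over a
# year: output power and heat-sink temperature (simulated stand-ins for
# the real control sensor data).
n = 24 * 365
power = rng.uniform(0, 5000, size=n)                           # W
ambient_true = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, n))  # deg C
r_th = 0.004                                                   # assumed K/W
temp_sink = ambient_true + r_th * power + rng.normal(0, 1, n)

# Subtract the load-dependent self-heating to recover an ambient-
# temperature estimate, then summarise the environment as percentiles
# of a mission profile.
ambient_est = temp_sink - r_th * power
profile = np.percentile(ambient_est, [5, 50, 95])
print("ambient temperature 5/50/95 percentiles (deg C):", np.round(profile, 1))
```

Such a mission profile, estimated per installed system, could then feed condition-monitoring thresholds or regeneration triggers.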
  • An Approach for Modelling System Reliability Based on Physics of Failure

    Authors: Franz Moser (Statistische Anwendungen - Joanneum Research), Ulrike Kleb (Statistische Anwendungen - Joanneum Research), Birgit Kornberger (Statistische Anwendungen - Joanneum Research), Michaela Dvorzak (Statistische Anwendungen - Joanneum Research)
    Primary area of focus / application: Reliability
    Keywords: System reliability, Physics of failure, Dynamic stress, Survival signature
    Submitted at 23-Feb-2017 17:54 by Franz Moser
    29-May-2017 16:35 An Approach for Modelling System Reliability Based on Physics of Failure
    Especially in safety-critical applications such as automotive, aviation, or medical systems, it is crucial to have an accurate reliability assessment of the embedded microelectronic devices. Moreover, the lifetime performance of a microelectronic system under the environmental stresses (e.g. voltage, current, temperature) it will be exposed to has to be well characterized.

    In the semiconductor industry it is common practice to perform lifetime tests of single defects under constant stress conditions. The tests are performed in isolation for each relevant failure mechanism (such as hot carrier degradation, gate oxide breakdown, or electromigration) to comply with technical standards. Furthermore, the possibilities for implementing realistic test settings, such as time-varying stress conditions, are limited. From the results of these tests, it is neither possible to derive the expected lifetime at system level nor to take dynamic stress into account.

    The main task of our comprehensive approach is to improve reliability predictions at system level, considering physics of failure and dynamic stress conditions. It comprises the following elements:

    (1) Definition of a system structure for a given reliability block diagram: First, the system components (vertices) and their interconnections (edges) have to be determined. Since a device fails as soon as any one type of failure occurs, the failure mechanisms can be regarded as a serial sub-system of the overall system. This step thus results in a serial-parallel system, with each component representing one failure mechanism, depicted as a graph.

    (2) Calculating the survival signature: The survival signature is a concept introduced for the reliability assessment of systems. It provides the probability that the system functions, given that a certain number of its components function. These probabilities characterize a given system structure and are independent of the survival functions of the components.

    (3) Physics of failure: This includes statistical modelling based on lifetime data from laboratory experiments (acceleration tests) under different stress conditions for every single failure mechanism. Aside from stress parameters, design parameters such as transistor length are part of the validated failure distribution models.

    (4) Dynamic stress: Electric variables in a microelectronic device are rarely constant. To account for this, a known approach was extended to handle dynamic voltages and currents. Based on stress profiles (waveforms) or clock cycles, which can be provided by circuit simulations, a time transformation factor containing the total dynamic stress information (the Equivalent Stress Weighted Factor, ESWF) is calculated. With this procedure, conventional (constant) stress models can still be used for lifetime prediction.

    (5) System survival probability: The system survival probability at any time can be estimated from the failure distribution models, appropriately corrected by the ESWFs, and the probabilities arising from the survival signature. A complete system survival function is generated by repeating this calculation for a sequence of defined time steps.

    The approach was implemented in R Shiny for demonstration purposes. An example of a simple inverter, serving as a test system, will be shown.
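    The interplay of steps (1), (2), and (5) can be sketched for a toy structure. The system layout, Weibull parameters, and ESWF values below are illustrative assumptions, not the authors' actual models; the ESWF enters only as a simple time-scaling factor.

```python
import itertools
import math

# Toy structure (assumed for illustration): the system works if both
# type-A components work, or if the single type-B component works.
def system_works(state):
    a1, a2, b = state
    return bool((a1 and a2) or b)

# Survival signature Phi(lA, lB): probability that the system functions,
# given that exactly lA of the 2 type-A and lB of the 1 type-B components
# function.  Components of one type are exchangeable, so we average over
# all equally likely states.
def survival_signature(lA, lB):
    working, total = 0, 0
    for a_up in itertools.combinations(range(2), lA):
        for b_up in itertools.combinations(range(1), lB):
            state = [int(i in a_up) for i in range(2)]
            state += [int(i in b_up) for i in range(1)]
            total += 1
            working += system_works(state)
    return working / total

# Weibull component survival; the eswf factor stands in for the
# dynamic-stress time transformation (Equivalent Stress Weighted Factor).
def weibull_surv(t, shape, scale, eswf=1.0):
    return math.exp(-((eswf * t / scale) ** shape))

# Step (5): combine the signature with the ESWF-corrected component
# survival probabilities to get the system survival probability at time t.
def system_survival(t):
    pA = weibull_surv(t, shape=2.0, scale=10.0, eswf=1.2)
    pB = weibull_surv(t, shape=1.5, scale=8.0, eswf=1.0)
    total = 0.0
    for lA in range(3):
        for lB in range(2):
            total += (survival_signature(lA, lB)
                      * math.comb(2, lA) * pA ** lA * (1 - pA) ** (2 - lA)
                      * math.comb(1, lB) * pB ** lB * (1 - pB) ** (1 - lB))
    return total

for t in (1.0, 5.0, 10.0):
    print(f"P(system survives t={t}): {system_survival(t):.3f}")
```

Evaluating `system_survival` over a grid of time steps yields the complete system survival function described in step (5).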
  • Statistical Analysis of Vehicle/Track Interaction Data for Improved Predictive Maintenance

    Authors: Gerald Trummer (Virtual Vehicle Research Center), Bernd Luber (Virtual Vehicle Research Center), Josef Fuchs (Virtual Vehicle Research Center), Luzia Burger-Ringer (Institute of Statistics, Graz University of Technology)
    Primary area of focus / application: Reliability
    Keywords: Predictive maintenance, Multiple regression, Vehicle/track interaction, Railways
    Submitted at 24-Feb-2017 08:50 by Gerald Trummer
    Accepted
    30-May-2017 11:55 Statistical Analysis of Vehicle/Track Interaction Data for Improved Predictive Maintenance
    Railway vehicles and tracks must be properly maintained to meet safety and comfort requirements. The objective of this work is to demonstrate a statistical methodology for identifying, in large sets of measurement data, railway track sections in which dynamic forces deviate from predicted values. The identified sections can then be analyzed in depth to identify possible causes in order to improve the quality of the prediction.
    We focus on the standard deviation of the dynamic vertical and lateral forces on wheelsets, which should be as low as possible from a maintenance point of view. These forces are linked to geometrical deviations and the track layout by multiple linear regression models obtained via stepwise procedures. Additional influences on the dynamic forces may then manifest themselves as deviations between measured and predicted values. Identifying such track sections is the focus of the described methodology.
    Our results indicate that large deviations between prediction and measurement may be attributed to a lack of explanatory variables in the model, to dynamic excitation of vehicles, or to changes in the track superstructure properties. Some of these events occur infrequently in the data (e.g. bridges, switches); hence they were not identified as significant variables, although they may occasionally lead to high dynamic vehicle forces.
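    The residual-based identification of anomalous track sections can be sketched as follows. Variable names, the model form, the injected anomalies, and the 3-sigma flagging rule are all assumptions for illustration, not the authors' actual data or procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-section data (hypothetical stand-ins for the measured
# quantities): track geometry deviation, curvature, and the standard
# deviation of the dynamic vertical wheelset force.
n = 200
geom_dev = rng.uniform(0, 3, n)          # mm
curvature = rng.uniform(0, 0.002, n)     # 1/m
force_sd = 2.0 + 4.0 * geom_dev + 800.0 * curvature + rng.normal(0, 1, n)
force_sd[::50] += 8.0                    # a few sections with an unmodelled cause

# Multiple linear regression via least squares, then flag sections whose
# residual exceeds three residual standard deviations.
X = np.column_stack([np.ones(n), geom_dev, curvature])
beta, *_ = np.linalg.lstsq(X, force_sd, rcond=None)
resid = force_sd - X @ beta
flagged = np.flatnonzero(np.abs(resid) > 3 * resid.std())
print("sections flagged for in-depth analysis:", flagged)
```

The flagged sections correspond to influences absent from the model, mirroring the bridges/switches example in the abstract.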
  • New Time Reduction Method for Burn-in of Semiconductor Devices

    Authors: Daniel Kurz (Department of Statistics, Alpen-Adria University of Klagenfurt), Horst Lewitschnig (Infineon Technologies Austria AG), Jürgen Pilz (Department of Statistics, Alpen-Adria University of Klagenfurt)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Quality
    Keywords: Binomial distribution, Burn-in time, Sampling plan, Weibull distribution, Zero defects
    Submitted at 24-Feb-2017 11:20 by Daniel Kurz
    30-May-2017 11:55 New Time Reduction Method for Burn-in of Semiconductor Devices
    In semiconductor manufacturing, a burn-in (BI) study is performed to demonstrate a sufficiently low failure probability in the early life of the produced devices. In order to reduce the BI time during a BI study, a random sample of devices is subjected to BI and inspected for early failures at predefined readout times.

    In this talk, we propose a new concept for assessing the BI time on the basis of the results from the BI study and a reference distribution for the lifetime of early failures. This involves deriving an upper bound for the total failure proportion from dependent binomial samples taken over different readout intervals. Furthermore, we provide suggestions on how to determine an appropriate reference curve for early failures. To avoid reducing the BI time too quickly, we consider a delaying factor based on the predictive probability of increasing the BI time.

    Finally, the proposed concept is applied to determine sampling plans for reducing the BI time under different reduction strategies. Moreover, we extend the presented method to derive the BI times of follower products with different chip sizes. In this way, we arrive at flexible BI times for follower products, which contributes substantially to reducing the BI effort.
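    The flavour of such zero-defect BI reasoning can be sketched in a much-simplified form: a zero-failure binomial bound combined with an assumed Weibull reference curve for early failures. The scaling rule and all parameter values are illustrative assumptions, not the authors' actual method, which handles dependent samples across readout intervals.

```python
import math

# Zero-failure binomial bound: with 0 failures observed among n devices,
# the (1 - alpha) upper confidence bound on the early-life failure
# probability is 1 - alpha**(1/n) (Clopper-Pearson special case).
def failure_prob_upper(n, alpha=0.1):
    return 1.0 - alpha ** (1.0 / n)

# Assumed Weibull reference curve for early failures; beta < 1 gives a
# decreasing failure rate, typical for early-life defects.
def frac_failed_by(t, beta=0.5, eta=100.0):
    return 1.0 - math.exp(-((t / eta) ** beta))

# Simplified scaling: failures that surface by readout time t represent
# only a fraction of all early failures, so divide the zero-failure
# bound by that fraction to bound the total early-failure proportion.
def effective_bound(n, t, alpha=0.1):
    return failure_prob_upper(n, alpha) / frac_failed_by(t)

n = 3000
for t in (6.0, 12.0, 24.0):   # hypothetical BI readout times in hours
    print(f"t={t:5.1f} h  demonstrated ppm bound: "
          f"{1e6 * effective_bound(n, t):.0f}")
```

Longer BI times demonstrate a tighter bound, which is the trade-off the proposed sampling plans optimise.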
  • Sparse Representation Based Approach for Change Detection

    Authors: Ofer Levi (The Open University)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Mining
    Keywords: Sparse representation, Change detection, Model selection, Least Squares, Fourier analysis, Wavelets
    Submitted at 26-Feb-2017 10:17 by Ofer Levi
    29-May-2017 16:15 Sparse Representation Based Approach for Change Detection
    Dynamic signals and time series often consist of mixtures of patterns and phenomena occurring at different times. Traditional signal processing and analysis methods are optimized to handle signals that contain a single class of patterns, such as the Fourier representation for pure harmonics or the wavelet representation for piecewise-smooth functions. In simplified, unrealistic scenarios, simple operations such as thresholding or filtering in the appropriate space can be very effective for extracting meaningful patterns from a noisy signal. However, using a single representation method usually yields mediocre results when applied to real-life signals. Matching Pursuit and Basis Pursuit use the idea of merging several different representation methods to create a so-called over-complete dictionary. These methods can decompose a signal into relatively few meaningful components by searching for the sparsest possible representation, given the right choice of dictionary.

    In this talk we will show how sparse representation methodologies can be used for the detection of a change point in dynamic signals and time series of certain types. A case study on the detection of local variations in real measured heartbeat signals will be presented, as well as possible applications to process control and maintenance.
  • Comparing Estimator Accuracy for the Weibull Distribution

    Authors: Bryan Dodson (SKF Group Six Sigma), Rene Klerx (SKF Group Six Sigma), Marco Capozzi (SKF Group Six Sigma)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Six Sigma
    Keywords: Weibull distribution, Reliability, Maximum Likelihood, Median rank regression
    Submitted at 26-Feb-2017 17:29 by Marco Capozzi
    30-May-2017 12:15 Comparing Estimator Accuracy for the Weibull Distribution
    It is generally accepted that there is a bias when estimating the shape parameter of the Weibull distribution; however, there is no general agreement on which estimation method is most accurate. In fact, there is little agreement regarding accuracy not only for the shape parameter, but also for the scale parameter and desired percentiles. We answer some of these questions by simulating a range of censoring cases with a variety of shape parameters. The estimation methods considered are maximum likelihood, regression against exact median rank values on the y-axis, and regression against exact median rank values on the x-axis. One hundred simulation trials are provided for each case.
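    Such a comparison can be sketched as follows for complete (uncensored) samples. The sample size, true shape parameter, number of trials, and the use of Benard's median-rank approximation (rather than exact median ranks) are illustrative choices, not the authors' study design.

```python
import numpy as np

rng = np.random.default_rng(3)

def weibull_mle_shape(x, lo=0.05, hi=50.0, iters=60):
    """Maximum-likelihood shape estimate via bisection on the profile
    score equation (complete samples, no censoring)."""
    logx = np.log(x)
    def score(b):
        xb = x ** b
        return (xb * logx).sum() / xb.sum() - 1.0 / b - logx.mean()
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def weibull_mrr_shape(x):
    """Median-rank regression: regress ln(-ln(1-F)) on ln(t) using
    Benard's approximation F_i = (i - 0.3)/(n + 0.4); slope = shape."""
    xs = np.sort(x)
    n = len(xs)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    y = np.log(-np.log(1.0 - F))
    return np.polyfit(np.log(xs), y, 1)[0]

# Small simulation: bias of both estimators for true shape = 2, n = 20.
true_shape, n, trials = 2.0, 20, 200
mle = [weibull_mle_shape(rng.weibull(true_shape, n)) for _ in range(trials)]
mrr = [weibull_mrr_shape(rng.weibull(true_shape, n)) for _ in range(trials)]
print(f"mean MLE shape estimate: {np.mean(mle):.2f}")
print(f"mean MRR shape estimate: {np.mean(mrr):.2f}")
```

Extending this skeleton to censored samples, multiple shape parameters, and x-axis versus y-axis regression reproduces the kind of study described in the abstract.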