ENBIS-14 in Linz

21 – 25 September 2014; Johannes Kepler University, Linz, Austria
Abstract submission: 23 January – 22 June 2014



The following abstracts have been accepted for this event:

  • Constrained On-Line Optimization Using Evolutionary Operation: A Case Study about Energy-Optimal Robot Control

    Authors: Koen Rutten (KU Leuven), Josse De Baerdemaeker (KU Leuven), Julian Stoev (Flanders Mechatronics & Technology Centre), Maarten Witters (Flanders Mechatronics & Technology Centre), Bart De Ketelaere (KU Leuven)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Process
    Keywords: EVOP, On-line optimization, Desirability, Steepest ascent, Case study
    Submitted at 30-Apr-2014 15:15 by Koen Rutten
    22-Sep-2014 14:30 Constrained On-Line Optimization Using Evolutionary Operation: A Case Study About Energy-Optimal Robot Control
    Optimization of full-scale processes during regular production is a challenge that is often encountered in practice, requiring specialized approaches that introduce only small perturbations so that production does not need to be interrupted. Based on a case study, we discuss the potential of methods derived from Evolutionary Operation (EVOP). The case study relates to a badminton robot which has to perform point-to-point motions within a fixed time interval, based on two operation modes: time-optimal motion, which ensures maximum precision but the highest energy consumption, and energy-optimal motion, which decreases the energy consumption but, as a trade-off, also lowers the precision. The current standard mode of operation is the energy-optimal mode, which is constructed from off-line optimization on simulations. An on-line Evolutionary Operation Steepest Ascent (EVOPSA) optimization was implemented to further reduce the energy consumption by fine-tuning the implemented energy-optimal mode. The constrained nature of the problem, where energy needs to be minimized subject to a time constraint, was transformed into an unconstrained single-objective optimization using Derringer desirability functions. Two important contributions were made: (1) the on-line optimization of the energy-optimal motion lowered the energy consumption by 4.7% while keeping the precision constant; (2) more stringent time constraints implemented in the desirability functions led to an operation mode with maximum precision and 51.7% less energy consumption than the current time-optimal motion.
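
    A minimal sketch of the desirability step described in this abstract, assuming hypothetical bounds for energy and motion time (the targets, weights and bounds of the actual study are not given here): each response is mapped to a smaller-is-better Derringer-Suich desirability and the two are combined by a geometric mean, so that the constrained problem becomes a single score to maximize.

    ```python
    import numpy as np

    def desirability_smaller_is_better(y, low, high, weight=1.0):
        """Derringer-Suich desirability for a response to be minimized:
        1 at or below `low`, 0 at or above `high`, smooth in between."""
        if y <= low:
            return 1.0
        if y >= high:
            return 0.0
        return ((high - y) / (high - low)) ** weight

    def overall_desirability(energy, motion_time,
                             energy_bounds=(10.0, 20.0),   # J, hypothetical
                             time_bounds=(0.50, 0.55)):    # s, hypothetical
        """Geometric mean of the individual desirabilities: maximizing this
        single score minimizes energy while keeping the motion time within
        its (possibly tightened) constraint."""
        d_energy = desirability_smaller_is_better(energy, *energy_bounds)
        d_time = desirability_smaller_is_better(motion_time, *time_bounds)
        return np.sqrt(d_energy * d_time)

    # Compare two candidate settings from an EVOP-style perturbation design.
    print(overall_desirability(energy=14.2, motion_time=0.52))
    print(overall_desirability(energy=12.8, motion_time=0.54))
    ```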
  • An Adapted Screening Concept to Detect Risk Devices

    Authors: Anja Zernig (KAI - Kompetenzzentrum für Automobil- und Industrieelektronik GmbH), Olivia Bluder (KAI - Kompetenzzentrum für Automobil- und Industrieelektronik GmbH), Jürgen Pilz (Alpen-Adria Universität Klagenfurt), Andre Kästner (Infineon Technologies Austria AG)
    Primary area of focus / application: Reliability
    Secondary area of focus / application: Quality
    Keywords: Reliability, Screening, Mavericks, ICA, Negentropy
    Submitted at 30-Apr-2014 17:20 by Anja Zernig
    22-Sep-2014 10:45 An Adapted Screening Concept to Detect Risk Devices
    Reliable performance of semiconductor devices is of paramount importance, as they are used in safety-relevant applications such as airbags. The production chain of semiconductor devices consists of hundreds of single steps interacting with each other. Permanent monitoring of system and device parameters keeps the process within secure specification limits. The finished devices are verified for their functionality, and damaged devices are scrapped immediately. Nevertheless, among the remaining devices there are still some with a high risk of failing early, since the lifetime of semiconductor devices follows the well-known bathtub curve. As a consequence, the most delicate period for a device is its early lifetime, where failures are very likely to occur. In fact, these risk devices, called Mavericks, are good devices, at least in the sense of electrical functionality, but unreliable in their long-term behavior. To identify Mavericks, statistical screening methods are applied to detect conspicuous patterns in the measurement data, even though the data are still inside the specification limits. In summary, statistical screening methods detect Mavericks so that they can be preventively scrapped to guarantee reliable performance.
    For Maverick detection on technologies with smaller structures, it became apparent that inspecting the measurement data themselves is insufficient. Moreover, investigating up to a thousand available measurements is counterproductive, as it leads to a huge number of wrongly rejected devices, and choosing a valuable subset of measurements depends on available expert knowledge. A remedy is provided by a data transformation that separates meaningful information from noise in the measurements. We propose to use Independent Component Analysis (ICA). Its concept is based on separating additive mixtures (the measurements) into independent random variables (the sources). As the mixing process is unknown, demixing the measurements is an optimization problem, resulting in an estimate of the independent sources.
    Henceforth, the sources instead of the measurement data are analyzed for conspicuous values. To further improve the screening method, a distinction is made between sources with a major and a minor contribution to the detection of Mavericks. Negentropy, which quantifies the deviation of a source from a Gaussian distribution, is applied to identify non-Gaussian values in the sources. These values are suspected to be the desired Mavericks. First evaluations provide promising results indicating that this approach leads to a reliable detection of Mavericks with a small number of misclassifications.
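
    A small illustration of the two ingredients described above, using scikit-learn's FastICA and the standard log-cosh approximation of negentropy; the data below are simulated stand-ins, and the number of sources and the 3-sigma screening limit are assumptions, not values from the study.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)

    # Toy stand-in for device measurement data: n devices x p parametric tests,
    # a linear mixture of a few latent sources plus noise (purely illustrative).
    n_devices, n_tests, n_sources = 500, 20, 5
    latent = rng.laplace(size=(n_devices, n_sources))
    mixing = rng.normal(size=(n_sources, n_tests))
    X = latent @ mixing + 0.1 * rng.normal(size=(n_devices, n_tests))

    # Unmix the measurements into estimated independent sources.
    ica = FastICA(n_components=n_sources, random_state=0)
    S = ica.fit_transform(X)                     # shape: (n_devices, n_sources)

    def negentropy_logcosh(s, n_ref=100_000):
        """Approximate negentropy J(s) ~ (E[G(s)] - E[G(nu)])^2 with
        G(u) = log cosh(u) and nu standard Gaussian."""
        s = (s - s.mean()) / s.std()
        g_s = np.mean(np.log(np.cosh(s)))
        g_gauss = np.mean(np.log(np.cosh(rng.normal(size=n_ref))))
        return (g_s - g_gauss) ** 2

    # Rank sources by negentropy: strongly non-Gaussian sources are the
    # candidates in which atypical (Maverick-like) devices show up.
    scores = np.array([negentropy_logcosh(S[:, k]) for k in range(S.shape[1])])
    top = np.argsort(scores)[::-1]
    print("sources ordered by negentropy:", top, scores[top])

    # Flag devices with extreme values in the most non-Gaussian source
    # (the 3-sigma rule here is only a placeholder for a real screening limit).
    s0 = (S[:, top[0]] - S[:, top[0]].mean()) / S[:, top[0]].std()
    print("flagged devices:", np.where(np.abs(s0) > 3)[0])
    ```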
  • Extreme Value Statistics and Robust Filtering for Hydrological Data

    Authors: Bernhard Spangl (BOKU - University of Natural Resources and Life Sciences, Vienna), Peter Ruckdeschel (Fraunhofer ITWM, Kaiserslautern)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: Extreme values, Robust filtering, EM algorithm, Hydrology, River discharge
    Submitted at 30-Apr-2014 22:16 by Bernhard Spangl
    23-Sep-2014 09:40 Extreme Value Statistics and Robust Filtering for Hydrological Data
    River discharge data feature extreme events with strong seasonal and regional variation. Modern research pools data collected at different times and locations to improve risk prediction at a certain location and time. Besides extreme value statistics, this calls for filtering techniques to remove seasonal and regional effects. Among the extreme events in such data, it is not clear which ones are recurrent, so non-robust approaches that attribute overly high influence to single extreme events could impair future risk predictions. This motivates the use of robust variants of filtering and extreme value procedures. We analyze daily average discharge time series of rivers from various sites in Germany, collected over at least the last 35 years. In the presentation, we limit ourselves to seasonal effects. For filtering, we fit a linear, time-invariant, time-discrete state space model to the data by means of a robust EM algorithm. By filtering we extract a detrended and deseasonalized series, to which we robustly fit a Generalized Pareto distribution.
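
    A rough sketch of the peaks-over-threshold step on a simulated discharge series. The crude day-of-year deseasonalization below is only a stand-in for the robust EM / state space filtering used by the authors, and all numbers are synthetic.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(1)

    # Toy daily discharge series: seasonal cycle plus heavy-tailed noise.
    n_years = 35
    doy = np.arange(n_years * 365) % 365
    season = 50 + 30 * np.sin(2 * np.pi * doy / 365)
    discharge = season + rng.gamma(shape=2.0, scale=5.0, size=doy.size)

    # Crude deseasonalization: subtract the day-of-year mean.
    seasonal_mean = np.array([discharge[doy == d].mean() for d in range(365)])
    residual = discharge - seasonal_mean[doy]

    # Peaks-over-threshold: fit a Generalized Pareto distribution to the
    # excesses over a high quantile of the deseasonalized series.
    u = np.quantile(residual, 0.95)
    excess = residual[residual > u] - u
    shape, loc, scale = genpareto.fit(excess, floc=0)
    print(f"threshold={u:.2f}, GPD shape={shape:.3f}, scale={scale:.3f}")

    # Illustrative 100-day return level: quantile of the fitted tail model.
    p_exceed = (residual > u).mean()
    ret_level = u + genpareto.ppf(1 - 1 / (100 * p_exceed), shape, loc=0, scale=scale)
    print(f"approx. 100-day return level of the residual series: {ret_level:.2f}")
    ```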
  • Signal Interpretation in Hotelling's T2C Control Chart for Compositional Data

    Authors: Marina Vives-Mestres (Universitat de Girona), Josep Daunis-i-Estadella (Universitat de Girona), Josep-Antoni Martín-Fernández (Universitat de Girona)
    Primary area of focus / application: Process
    Secondary area of focus / application: Quality
    Keywords: Compositional data, Log-ratio, Hotelling T2, Multivariate control chart, Signal interpretation
    Submitted at 1-May-2014 13:37 by Marina Vives-Mestres
    23-Sep-2014 14:40 Signal Interpretation in Hotelling's T2C Control Chart for Compositional Data
    One of the main goals of Statistical Process Control (SPC) is to detect special-cause variation. This is achieved by using a statistical control chart and interpreting the output: when a signal is detected, the causes of the anomaly must be identified in order to apply appropriate remedial measures.

    In industry it is common to measure process performance by a set of interrelated continuous variables. In those cases, a widely used charting statistic is Hotelling's T2, which accounts for both the univariate effects and the interrelationships between variables. However, multivariate process monitoring may mask the cause of the anomaly because of the dimensionality reduction from a p-dimensional vector to a one-dimensional statistic. One efficient procedure for interpreting signals in a T2 control chart is to decompose the T2 statistic into directly interpretable orthogonal components, known as the MYT decomposition [1].

    Vives-Mestres et al. [2] proposed the T2C control chart, which is suitable for monitoring compositional data (CoDa). Compositions are vectors of positive elements describing quantitatively the parts of some whole, which carry exclusively relative information between the parts. Their sample space is the simplex, and specific statistical methods are necessary because of its particular geometry. The T2C control chart is based on the log-ratio methodology and is fully consistent with the characteristics of CoDa. Broadly speaking, the T2C control chart is equivalent to applying the standard methodology to the compositions expressed in coordinates with respect to an orthonormal basis.

    The difficulty arises again when the causes of a signal in the T2C control chart have to be identified. Vives-Mestres et al. [3] dealt with the case of three-part compositions (p=3). In this work, we present a general method (p>3) for interpreting such signals based on the MYT decomposition. (A small numerical sketch of the T2 statistic in coordinates and its decomposition is given after the references below.)


    [1] Mason, R. L., N. D. Tracy, and J. C. Young (1995). Decomposition of T2 for multivariate control chart interpretation. Journal of Quality Technology, 27 (2), 99-108.

    [2] Vives-Mestres, M., J. Daunis-i-Estadella, and J. A. Martín-Fernández (2014). Individual T2 control chart for compositional data. Journal of Quality Technology, 46 (2), 127-139.

    [3] Vives-Mestres, M., J. Daunis-i-Estadella, and J. A. Martín-Fernández (2014). Out-of-control signals in three-part compositional T2 control chart. Quality and Reliability Engineering International, 30 (3), 337-346.
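
    The numerical sketch referred to above: a 3-part composition is expressed in coordinates with respect to an orthonormal basis (an ilr transformation), a T2 statistic is computed against Phase I estimates, and the statistic is split into an unconditional and a conditional term in the spirit of the MYT decomposition. This corresponds to the p=3 case treated in [3]; the general p>3 method of the talk is not reproduced here, and all data are simulated.

    ```python
    import numpy as np

    def ilr_3part(x):
        """Isometric log-ratio coordinates of a 3-part composition w.r.t. one
        particular orthonormal basis (other bases are equally valid)."""
        x = np.asarray(x, dtype=float)
        z1 = np.sqrt(1 / 2) * np.log(x[..., 0] / x[..., 1])
        z2 = np.sqrt(2 / 3) * np.log(np.sqrt(x[..., 0] * x[..., 1]) / x[..., 2])
        return np.stack([z1, z2], axis=-1)

    rng = np.random.default_rng(2)

    # Hypothetical Phase I data: 100 in-control 3-part compositions.
    raw = rng.dirichlet(alpha=[8.0, 5.0, 3.0], size=100)
    Z = ilr_3part(raw)
    m, S = Z.mean(axis=0), np.cov(Z, rowvar=False)

    def t2(z, m, S):
        d = z - m
        return float(d @ np.linalg.solve(S, d))

    # New observation to be checked against the chart.
    z_new = ilr_3part(np.array([0.40, 0.25, 0.35]))

    # MYT-style decomposition for p = 2 coordinates:
    # T2 = T2_1 (unconditional, first coordinate) + T2_{2.1} (second given first).
    t2_full = t2(z_new, m, S)
    t2_1 = (z_new[0] - m[0]) ** 2 / S[0, 0]
    b = S[0, 1] / S[0, 0]
    s_2_1 = S[1, 1] - S[0, 1] ** 2 / S[0, 0]
    t2_2given1 = (z_new[1] - (m[1] + b * (z_new[0] - m[0]))) ** 2 / s_2_1
    print(f"T2 = {t2_full:.3f} = {t2_1:.3f} + {t2_2given1:.3f}")
    ```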
  • More Precise Treatment Comparisons in User Studies by Modeling Baseline Effects

    Authors: Jan-Willem Bikker (Consultants in Quantitative Methods (CQM))
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Business
    Keywords: User study, Linear mixed model, Structural equation model, Heteroskedasticity, Baseline effect, Visual analogue scale
    Submitted at 1-May-2014 19:18 by Jan-Willem Bikker
    23-Sep-2014 15:20 More Precise Treatment Comparisons in User Studies by Modeling Baseline Effects
    The Dutch consultancy firm CQM supports many R&D projects, linking the project issues with modeling and statistics. In one project, a study was performed in which a certain treatment is applied to users, who give an assessment before (X) and after (Y) the treatment. This subjective assessment is made on a Visual Analogue Scale (VAS): the user draws a mark on a line segment between the two extreme assessments. In this study, three variants of the treatment are compared within subjects; each subject tested each treatment once. The research question is to compare the effects of the different treatments on the change in the measurement, X-Y. A standard analysis would be a two-way ANOVA with the factors treatment and subject. However, there is the suggestion of a baseline effect: higher X values tend to go together with larger reductions X-Y, partly due to the natural correlation between X and X-Y, but probably also due to the underlying nature of the phenomenon. Therefore, the default model was extended to include the baseline measurement X as a predictor as well. The added model terms are the main effect of X (often called a covariate) and the interaction of X with treatment, and it turns out that treatments can be compared effectively using this interaction. An additional benefit is that this model also opens up the possibility to compare treatment effects between different studies of this type, as the distribution of X tends to vary between such studies. This model is also applicable to between-subject studies, where it reverts to a model of heteroskedasticity, and power studies can be made. Lastly, an attempt is made to separate the natural correlation between the measured values of X and X-Y from the true underlying effect using structural equation models.
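
    A sketch of this kind of model fitted with statsmodels on simulated within-subject data (the study data and the exact model specification are not available from the abstract): the reduction X-Y is regressed on treatment, the baseline X and their interaction, with a random intercept per subject.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)

    # Simulated within-subject study (purely illustrative): 40 subjects,
    # each tests 3 treatments once; X is the VAS baseline, delta = X - Y is
    # the reduction, with a treatment-dependent baseline effect built in.
    n_subj, treatments = 40, ["A", "B", "C"]
    rows = []
    for subj in range(n_subj):
        subj_eff = rng.normal(0, 0.5)
        for trt, slope in zip(treatments, [0.2, 0.4, 0.6]):
            x = rng.uniform(20, 90)                       # baseline VAS score
            delta = 5 + slope * x + subj_eff + rng.normal(0, 5)
            rows.append({"subject": subj, "treatment": trt, "X": x, "delta": delta})
    df = pd.DataFrame(rows)

    # Linear mixed model: reduction ~ treatment * baseline, random subject intercept.
    # The treatment:X interaction carries the treatment comparison of interest.
    model = smf.mixedlm("delta ~ treatment * X", data=df, groups=df["subject"])
    fit = model.fit()
    print(fit.summary())
    ```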
  • Credit Portfolio Models from a Copula Perspective

    Authors: Matthias Fischer (Universität Erlangen-Nürnberg)
    Primary area of focus / application: Finance
    Keywords: CreditRisk+, CreditMetrics, CreditPortfolioView, Credit-Value-at-Risk
    Submitted at 2-May-2014 22:39 by Matthias Fischer
    22-Sep-2014 12:35 Credit Portfolio Models from a Copula Perspective
    For internal capital allocation and risk controlling, banks are allowed to use internal models to quantify the amount of economic capital required to cover unexpected losses in extreme situations. Since every model is based on certain assumptions, a regulatory requirement is to quantify the model risk associated with, e.g., distributional assumptions. In this talk, we analyze the implicit or explicit copulas of popular credit portfolio models as well as the model risk associated with the choice of a certain dependency structure by means of a hypothetical portfolio. Our analysis incorporates representatives from the elliptical, the generalized hyperbolic and the Archimedean copula classes, together with two implicit copulas, implicitly defined by the portfolio
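
    As a generic illustration of how the choice of dependency structure alone affects a credit-value-at-risk figure (not the models or portfolio of the talk), the sketch below compares a one-factor Gaussian copula with a Student-t copula that has the same correlation but stronger tail dependence.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)

    # Hypothetical homogeneous portfolio: 250 obligors, PD = 1%, unit exposure
    # and LGD, so the portfolio loss equals the number of defaults.
    n_obligors, pd_, n_sim, rho, nu = 250, 0.01, 20_000, 0.2, 4

    def simulated_losses(copula):
        """Simulate default counts under a one-factor Gaussian or Student-t
        copula with asset correlation rho (illustrative sketch only)."""
        factor = rng.normal(size=(n_sim, 1))
        idio = rng.normal(size=(n_sim, n_obligors))
        z = np.sqrt(rho) * factor + np.sqrt(1 - rho) * idio
        if copula == "gaussian":
            u = stats.norm.cdf(z)
        else:  # t copula: shared chi-square mixing variable per scenario
            w = rng.chisquare(nu, size=(n_sim, 1))
            u = stats.t.cdf(z / np.sqrt(w / nu), df=nu)
        return (u < pd_).sum(axis=1)

    # The same marginal PDs and correlation, but different tail dependence,
    # translate into different high quantiles of the loss distribution.
    for cop in ("gaussian", "t"):
        print(cop, "99.9% quantile of losses:", np.quantile(simulated_losses(cop), 0.999))
    ```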