ENBIS-14 in Linz

21 – 25 September 2014; Johannes Kepler University, Linz, Austria
Abstract submission: 23 January – 22 June 2014

My abstracts


The following abstracts have been accepted for this event:

  • A Panel-Based Stratified Cluster-Sampling Methodology for the Evaluation of Photovoltaic Field Data

    Authors: Evgenii Sovetkin (RWTH-Aachen)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Photovoltaic systems evaluation, Photovoltaic module defects, Stratified cluster-sampling, Linear model
    Submitted at 29-May-2014 21:52 by Evgenii Sovetkin
    Accepted
    24-Sep-2014 10:55 A Panel-Based Stratified Cluster-Sampling Methodology for the Evaluation of Photovoltaic Field Data
    The evaluation and monitoring of photovoltaic (PV) systems is of increasing importance in order to assess long-term quality and reliability. We propose a panel-based methodology in which the PV systems are chosen from a larger population by means of a (nearly) orthogonal design and field data are collected over several years by a cost-efficient stratified cluster-sampling approach. The stratification makes it possible to take additional factors into account, especially the manufacturer and known classifications of the modules with respect to known damages. Relying on a linear model estimated by weighted least squares with a nonparametrically estimated covariance matrix, we investigate the influence of several model parameters, such as the time-varying proportions of defective modules and the within-cluster correlation, on the power of significance testing. In addition, the effects on the estimation of the proportions of module defects are examined.
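    The weighted least-squares step of the abstract can be sketched as follows. This is a minimal illustration only, not the author's implementation: the panel layout, the cluster effect, and the per-cluster variance estimate used as weights are all invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical panel: n_clusters PV systems, m modules sampled per system
    n_clusters, m = 40, 5
    beta_true = np.array([2.0, -0.5])          # intercept, time trend

    time = np.repeat(np.linspace(0.0, 1.0, n_clusters), m)
    X = np.column_stack([np.ones(n_clusters * m), time])

    # Within-cluster correlation via a shared cluster effect
    cluster_eff = rng.normal(0.0, 0.3, n_clusters).repeat(m)
    y = X @ beta_true + cluster_eff + rng.normal(0.0, 0.2, n_clusters * m)

    # Step 1: OLS fit; residuals estimate the error structure nonparametrically
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = (y - X @ beta_ols).reshape(n_clusters, m)

    # Step 2: weight each observation by its inverse estimated cluster variance
    w = 1.0 / np.maximum(resid.var(axis=1, ddof=1), 1e-8).repeat(m)
    XtW = X.T * w
    beta_wls = np.linalg.solve(XtW @ X, XtW @ y)
    ```

    With clustered errors, the weighted fit downweights noisy clusters; the paper's approach additionally handles time-varying defect proportions, which this toy example omits.
    
    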
  • Efficiently and Reliably Populating the Entire Pareto Front for Multi-Criteria Optimal Design

    Authors: Timothy Robinson (University of Wyoming)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Response surface, Computer experiments, Multiple objectives, Optimization
    Submitted at 30-May-2014 01:15 by Timothy Robinson
    Accepted
    23-Sep-2014 16:40 Efficiently and Reliably Populating the Entire Pareto Front for Multi-Criteria Optimal Design
    When constructing an optimal experimental design, it is advantageous to find a design that is robust to the diverse purposes of the decision maker. Simultaneous optimization over multiple design criteria begins with a careful selection of the experimenter’s priorities and then moves to an examination of the inherent trade-offs that exist among the associated criteria. The trade-offs can be collectively represented by a set of solutions, popularly known as a Pareto-optimal set, whose image is called the Pareto front. While the Pareto front offers exciting possibilities and flexibility for the decision maker, both for examining design robustness to a variety of specified design criteria and for visualizing the implicit trade-offs among the criteria, existing algorithms for populating the Pareto front tend to be computationally expensive and/or unreliable. In this talk, we propose two efficient algorithms for populating the Pareto front, based on a hybrid of the traditional coordinate-exchange algorithm and multi-objective evolution. The capability of the two algorithms is illustrated using examples.
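    As a small illustration of the object these algorithms populate, the sketch below extracts the non-dominated (Pareto-optimal) subset from a set of candidate designs scored on two criteria to be minimized. The candidate scores are made up, and this brute-force filter is not the authors' hybrid algorithm.

    ```python
    import numpy as np

    def pareto_front(scores):
        """Return the non-dominated rows of `scores`.

        Each row holds the criterion values of one candidate design;
        all criteria are assumed to be minimized.
        """
        pts = np.asarray(scores, dtype=float)
        keep = []
        for i, p in enumerate(pts):
            dominated = any(
                np.all(q <= p) and np.any(q < p)   # q at least as good, strictly better somewhere
                for j, q in enumerate(pts) if j != i
            )
            if not dominated:
                keep.append(i)
        return pts[keep]

    # Three candidates scored on two hypothetical efficiency-loss criteria;
    # the third is dominated by the first and drops out of the front
    front = pareto_front([[1.0, 2.0], [2.0, 1.0], [2.0, 2.0]])
    ```

    Hybrid algorithms of the kind described in the talk generate candidates (e.g. by coordinate exchange and evolutionary moves) and apply such a dominance filter far more efficiently than this quadratic scan.
    
    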
  • The Role of Statistical Thinking for Solving the Main Problems Facing Humanity

    Authors: Vladimir Shper (Moscow Institute of Steel & Alloys)
    Primary area of focus / application: Education & Thinking
    Secondary area of focus / application: Business
    Keywords: Statistical, Thinking, Future, Of mankind, Variability
    Submitted at 30-May-2014 06:00 by Vladimir Shper
    Accepted
    23-Sep-2014 14:40 The Role of Statistical Thinking for Solving the Main Problems Facing Humanity
    The world is facing many complex and contradictory challenges. One of the greatest is the continuously expanding gap between the world's rate of change and the rate of change of human thinking. We believe that the main way to cure our thinking goes through changing the prevailing educational system. This change should include many facets, but one of the most important is closely connected with statistical thinking. We are convinced that mankind, one and all, should learn to think statistically, because the world we live in is not deterministic but probabilistic. This paper discusses both why we came to this conclusion and what we suggest doing in order to achieve the necessary changes.
  • Modelling and Controlling of a Thermal Spraying Process

    Authors: Nikolaus Rudak (TU Dortmund University), Sonja Kuhnt (Dortmund University of Applied Sciences and Arts)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Process
    Keywords: Thermal spraying process, Statistical modelling, Covariance modelling, Process optimization, Multicriterial optimization
    Submitted at 30-May-2014 10:06 by Nikolaus Rudak
    Accepted
    24-Sep-2014 10:55 Modelling and Controlling of a Thermal Spraying Process
    Thermal spraying technology can be employed to apply a coating to a surface, e.g. for wear or corrosion resistance. Spray material is fed in powder form into the spraying gun, where it is melted, and a gas stream accelerates the particles towards the substrate, on which they deposit as a coating. Fields of application for thermal spray coatings include, for example, the automotive industry (e.g. brake disk coatings) and the gas industry (e.g. heat exchangers or pipes). Modelling and controlling a thermal spraying process is an active field of research. Coating properties can only be measured by very time-consuming and expensive experiments, as the layer has to be destroyed. A common approach is to predict coating properties from the process parameters. Unfortunately, identical process parameters often result in different coating properties on different days due to uncontrollable effects. It is assumed that these influences are already observable in in-flight particle properties. We therefore aim at a prediction of coating properties that takes particle properties and the correlations among them into account (Rudak et al., 2012). We make use of models in which the covariance matrix is allowed to depend on covariates. The main challenge in covariance modelling is the positive-definiteness constraint. Existing approaches from the literature, which make use of methods such as Cholesky decompositions and kernel estimation, are compared by simulation studies and by application to the thermal spraying process. We then integrate these models into the JOP method (Kuhnt and Rudak, 2013) and optimize the thermal spraying process by minimizing a risk function for a sequence of cost matrices.

    References:

    Kuhnt, S. and Rudak, N. (2013), "Simultaneous Optimization of Multiple Responses with the R-Package JOP", Journal of Statistical Software, 54 (9).

    Rudak, N., Kuhnt, S., Hussong, B. and Tillmann, W. (2012), "On Different Strategies for the Prediction of Coating Properties in an HVOF Process", SFB 823 Discussion Paper 29/12, TU Dortmund University.
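    The Cholesky route to the positive-definiteness constraint mentioned in the abstract can be sketched as follows: the covariance matrix is parameterized through an unconstrained Cholesky factor whose entries depend on a covariate, so that Sigma(x) = L(x) L(x)^T is positive definite by construction. The parameterization, dimensions, and parameter values below are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    def covariance_at(x, diag_par, off_par):
        """Covariate-dependent covariance via its Cholesky factor.

        diag_par[i] = (a_i, b_i): L[i, i] = exp(a_i + b_i * x) > 0
        off_par[k]  = (c_k, d_k): below-diagonal entries, unconstrained.
        A positive diagonal makes L(x) L(x)^T positive definite for any x,
        so no constraint on the parameters themselves is needed.
        """
        d = len(diag_par)
        L = np.zeros((d, d))
        k = 0
        for i in range(d):
            L[i, i] = np.exp(diag_par[i][0] + diag_par[i][1] * x)
            for j in range(i):
                L[i, j] = off_par[k][0] + off_par[k][1] * x
                k += 1
        return L @ L.T

    # Illustrative 3x3 covariance of in-flight particle properties at covariate x
    diag_par = [(0.0, 0.2), (-0.1, 0.1), (0.3, -0.2)]
    off_par = [(0.5, 0.1), (-0.2, 0.0), (0.1, 0.3)]
    sigma = covariance_at(1.5, diag_par, off_par)
    ```

    In an actual fit, the entries of `diag_par` and `off_par` would be estimated from data, e.g. by maximum likelihood, while positive definiteness holds automatically throughout the optimization.
    
    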
  • A QbD Example from the Pharma Industry – Application of a Novel Approach to Modelling Dissolution Data, from a Split-Plot Factorial Design to Account for Both Process Parameters & Quality Attributes

    Authors: Kevin Lief (GlaxoSmithKline)
    Primary area of focus / application: Quality
    Secondary area of focus / application: Modelling
    Keywords: Split-plot factorial design, Critical quality attributes, Dissolution, QbD, Product quality
    Submitted at 30-May-2014 13:08 by Kevin Lief
    Accepted
    23-Sep-2014 10:55 A QbD Example from the Pharma Industry – Application of a Novel Approach to Modelling Dissolution Data, from a Split-Plot Factorial Design to Account for both Process Parameters & Quality Attributes
    The purpose of this paper is to employ Restricted Maximum Likelihood and Partial Least Squares methods (REML-PLS) to combine information from potentially critical quality attributes and design variables into a predictive dissolution model, when the data have been generated from a split-plot structure.

    Many pharmaceutical companies utilise split-plot designs to aid the development of their production processes. At the start of and throughout the process, many measurements of potentially critical quality attributes may be taken, without it necessarily being understood which of those measurements contain important information on the final product quality.

    A REML-PLS method is applied to investigate if variations in properties of these attributes, in the presence of a split-plot design, impact the final product quality of a solid oral dose, as measured by dissolution.
  • Modified Optimality Criteria for Models with Correlated Observations

    Authors: Juan M. Rodriguez-Diaz (University of Salamanca (Spain)), Maria J. Rivas-Lopez (University of Salamanca (Spain)), Sandra Martin-Chaves (University of Salamanca (Spain))
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Correlated observations, Optimality criteria, Restricted maximum likelihood estimator, Variance-covariance matrix
    Submitted at 30-May-2014 13:57 by Juan M. Rodriguez-Diaz
    Accepted
    22-Sep-2014 12:35 Modified Optimality Criteria for Models with Correlated Observations
    For the case of correlated observations, the variance of the estimators of the model parameters is no longer the covariance matrix of their asymptotic distribution. Optimal designs try to minimize this variance for a particular optimality criterion, and thus should take into account the correction with respect to the traditional approach. Some analytical results are derived for the general model and for the random-block model. The theory is applied to several examples, showing that the difference between optimal designs obtained using the corrected variance and those computed using the variance-covariance matrix of the asymptotic distribution may be important in some cases.
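    A sketch of the ingredient being corrected: under correlated errors with covariance V, the information matrix of a design X is X^T V^{-1} X rather than X^T X, and criteria such as D-optimality should be evaluated on the former. The compound-symmetry structure and the tiny design below are assumed examples, not taken from the paper.

    ```python
    import numpy as np

    def log_d_criterion(X, V):
        """log-determinant of the information matrix X^T V^{-1} X."""
        M = X.T @ np.linalg.solve(V, X)
        sign, logdet = np.linalg.slogdet(M)
        return logdet

    def compound_symmetry(n, rho):
        """Unit-variance covariance with constant pairwise correlation rho."""
        return (1.0 - rho) * np.eye(n) + rho * np.ones((n, n))

    # A simple design matrix: intercept plus a centred regressor
    x = np.array([-1.0, -0.5, 0.5, 1.0])
    X = np.column_stack([np.ones_like(x), x])

    uncorrelated = log_d_criterion(X, compound_symmetry(4, 0.0))
    correlated = log_d_criterion(X, compound_symmetry(4, 0.6))
    # The criterion values differ, so the ranking of candidate designs
    # can change once the correlation structure is acknowledged.
    ```

    With rho = 0 this reduces to the classical criterion on X^T X; for rho > 0 the two evaluations disagree, which is the effect the corrected criteria in the talk are designed to capture.
    
    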