ENBIS-14 in Linz

21 – 25 September 2014; Johannes Kepler University, Linz, Austria
Abstract submission: 23 January – 22 June 2014

The following abstracts have been accepted for this event:

  • On the Union-Intersection Solution to the Testing for Equivalence and Non-Inferiority

    Authors: Fortunato Pesarin (Dept. of Statistics, University of Padua, Italy), Luigi Salmaso (Dept. of Management and Engineering, University of Padua, Italy), Eleonora Carrozzo (Dept. of Management and Engineering, University of Padua, Italy)
    Primary area of focus / application: Quality
    Keywords: Dependent data, Nonparametric combination, Permutation tests, Union intersection tests
    Submitted at 11-Apr-2014 20:33 by Fortunato Pesarin
    Accepted
    23-Sep-2014 10:55 On the Union-Intersection Solution to the Testing for Equivalence and Non-Inferiority
    The notion of testing for equivalence and/or non-inferiority of two treatments is widely used in clinical trials, pharmaceutical experiments, bioequivalence studies, and quality control. Traditionally, the problem is approached within the intersection-union principle (IUP): the null hypothesis is stated as the set of effects lying outside a suitably established interval, and the alternative as the set of effects lying inside it. The non-inferiority problem is handled by simply setting one of the interval limits to infinity. The solutions provided in the literature are essentially based on likelihood techniques, which are rather difficult to handle, especially in multidimensional situations.
    The main goals of the present work are: (i) to go beyond the limitations of likelihood-based methods by working in a nonparametric setting within the permutation framework; (ii) to provide insight into these problems by observing that they can rationally be approached by two quite different principles, one based on the IUP and the other on Roy's union-intersection principle (UIP); and (iii) to provide a general multidimensional permutation solution.
    The IUP and UIP differ essentially in the role assigned to the alternative hypothesis on which the inferential focus is placed; thus they are not fully comparable. The UIP, which fits more closely the traditional way of testing, takes as the null hypothesis that the new effect lies within the equivalence interval. To the best of our knowledge, the UIP approach has not been discussed in the literature, mostly because no explicit likelihood-based solutions have been provided for it, except perhaps in very specific cases.
    By extending a result on two-sided testing within the permutation framework, in which the two arms of the alternative are tested separately, albeit simultaneously, and the two partial test statistics are suitably combined, we propose such a solution for the UIP approach. To extend the solution to the multidimensional case, we use two notions developed in Pesarin and Salmaso (Permutation Tests for Complex Data; Wiley, 2010): multi-aspect testing and the nonparametric combination of several dependent permutation tests. There, a given inferential problem is analyzed by an at most countable set of concurrent partial tests, each specialized to highlight one aspect of interest for the analysis, and the partial results are combined nonparametrically with respect to their usually unknown dependence structure. To obtain practical solutions, an algorithm for the UIP approach is presented, together with a simulation study evaluating its main properties.
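    The following sketch illustrates only the generic nonparametric combination (NPC) step named above, not the authors' UIP algorithm. Two partial permutation statistics (difference in means and difference in medians, an illustrative choice) are evaluated on the same permutations and combined with Fisher's rule, so their dependence is handled automatically by the joint permutation distribution. A minimal Python sketch:

        import numpy as np

        rng = np.random.default_rng(0)

        def npc_two_sample(x, y, n_perm=5000):
            """Minimal nonparametric combination (NPC) of two dependent
            partial permutation tests on the same two-sample data."""
            pooled = np.concatenate([x, y])
            n = len(x)

            def partial_stats(a, b):
                # Two illustrative partial statistics, one per "aspect".
                return np.array([a.mean() - b.mean(),
                                 np.median(a) - np.median(b)])

            obs = partial_stats(x, y)
            perm = np.empty((n_perm, 2))
            for i in range(n_perm):
                idx = rng.permutation(len(pooled))
                perm[i] = partial_stats(pooled[idx[:n]], pooled[idx[n:]])

            # Partial permutation p-values (large values significant),
            # computed for the observed data and for every permutation.
            all_stats = np.vstack([obs, perm])
            p_all = np.array([(all_stats >= s).mean(axis=0)
                              for s in all_stats])

            # Fisher combining function: T = -2 * sum(log p_k); the
            # combined p-value is the permutation p-value of T itself.
            T_all = -2 * np.log(p_all).sum(axis=1)
            return (T_all >= T_all[0]).mean()

        x = rng.normal(0.5, 1.0, 30)   # toy "treatment" sample
        y = rng.normal(0.0, 1.0, 30)   # toy "control" sample
        print(npc_two_sample(x, y))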
  • Representation of a 1-d Signal by a 0_1 Sequence and Similarity-Based Interpretation

    Authors: Petra Perner (Institute of Computer Vision and Applied Computer Sciences)
    Primary area of focus / application: Mining
    Secondary area of focus / application: Metrology & measurement systems analysis
    Keywords: 1-d signal analysis, Signal coding, Similarity measure, Signal interpretation
    Submitted at 17-Apr-2014 09:43 by Petra Perner
    Accepted
    23-Sep-2014 14:40 Representation of a 1-d Signal by a 0_1 Sequence and Similarity-Based Interpretation
    A 1-d signal can be approximated by line segments and represented by a 0_1 sequence: if a line goes upward it is coded by a zero, and if it goes downward it is coded by a one. The result is a 0_1 sequence representing the signal. This representation spares us from describing the signal by more complicated symbols such as peaks, valleys, and flat areas, which is much harder to do and requires statistical assumptions that are often not easy to judge.
    Once the signal is described by the 0_1 sequence, this sequence can be used to interpret the signal based on suitable syntactical similarity measures. We describe the similarity measures and give results on the interpretation accuracy for a 1-d application.
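    A minimal sketch of the coding step and of one simple syntactical similarity (fraction of matching positions, an illustrative choice; the measures in the paper may differ). For brevity the line-segment approximation is replaced by sample-to-sample differences:

        import numpy as np

        def updown_code(signal):
            """Code a 1-d signal as a 0_1 sequence: 0 where it goes
            upward, 1 where it goes downward (flat steps are coded as 0
            here; the paper first approximates the signal by lines)."""
            diffs = np.diff(np.asarray(signal, dtype=float))
            return (diffs < 0).astype(int)

        def matching_similarity(a, b):
            """One simple syntactical similarity between equal-length
            0_1 codes: the fraction of positions that agree."""
            return (np.asarray(a) == np.asarray(b)).mean()

        t = np.linspace(0, 4 * np.pi, 64)
        s1, s2 = np.sin(t), np.sin(t + 0.3)          # two toy signals
        print(matching_similarity(updown_code(s1), updown_code(s2)))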
  • Process Optimization of a Super-Finishing Machine through Experimental Design and Mixed Response Surface Models

    Authors: Rossella Berni (Department of Statistics, Informatics, Applications "G.Parenti"-University of Florence), Matteo Burbui (General Electric Oil & Gas Company)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Quality
    Keywords: Mixed Response Surface Model, Robust design optimization, Random effects, Manufacturing technology
    Submitted at 17-Apr-2014 14:22 by Rossella Berni
    Accepted
    24-Sep-2014 10:15 Process Optimization of a Super-Finishing Machine through Experimental Design and Mixed Response Surface Models
    In the literature, experimental design and Response Surface Methodology (RSM) (Myers et al., 2004) have played a relevant role since the 1980s and 1990s in setting up robust design processes; more specifically, methodological improvements relate to modeling, following the seminal contribution by Vining and Myers (1990), and to the optimization process (for further details see Del Castillo, 2007).
    Considering the modeling features, the initial dual response surface approach has been extended in order to improve: i) the compliance between the experimental design and the corresponding statistical model, such as the combined array structure (Myers et al., 1992); ii) the inclusion in the statistical model of random effects, in order to evaluate the relevance of sub-experimental factors or first-order interactions between control and noise variables (Khuri, 2006; Khuri and Mukhopadhyay, 2010).
    Furthermore, when considering the sources of variability, connecting RSM with the inclusion of random effects is a remarkable improvement in the study of these components, e.g. variance components, which can influence the main operative variables of the production process. In particular, searching for an optimal solution with random effects included in the fitted mixed response surface allows us to evaluate the error components linked to noise and/or sub-experimental factors.
    This presentation deals with process optimization for a centrifugal compressor (Berni and Burbui, 2013). More precisely, the technological problem concerns the reduction of the surface roughness of centrifugal compressor impellers through a new technology implemented by GE Oil & Gas, called super-finishing. The new technology is studied through statistical methods in order to minimize the final roughness according to the best set of levels for the abrasive component mixture and the process time. To this end, an experimental design is planned for three different materials, i.e. three types of steel, and mixed response surface models are applied. Mixed models allow us to estimate random effects, which are useful for better controlling the process variance in a robust design approach. Within this framework, one random effect is the initial roughness, measured for each impeller vane before the super-finishing process starts. Random effects are also included in the final optimization step. An illustrative model-fitting sketch follows the reference list below.

    -Berni, R., Burbui, M. (2013). Process optimization of a super-finishing machine through experimental design and mixed response surface models. Forthcoming in Quality Engineering.
    -Del Castillo, E. (2007). Process optimization. New York, NY: Springer-Verlag.
    -Khuri, A.I. (2006). Mixed response surface models with heterogeneous within-block error variances. Technometrics, 48: 206-218.
    -Khuri, A.I., Mukhopadhyay, S. (2010). Response surface methodology. WIREs Computational Statistics, 2: 128-149.
    -Myers, R.H., Montgomery, D.C., Vining, G.G., Borror, C.M., Kowalski, S.M. (2004). Response Surface Methodology: a retrospective and literature survey. Journal of Quality Technology, 36: 53-77.
    -Myers, R.H., Khuri, A.I., Vining, G. (1992). Response surface alternatives to the Taguchi robust parameter design approach. The American Statistician, 46: 131-139.
    -Vining, G.G., Myers, R.H. (1990). Combining Taguchi and response surface philosophies: a dual response approach. Journal of Quality Technology, 22: 38-45.
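    As a hedged illustration of the modeling idea only (synthetic data; the factor names time, abrasive and impeller are invented, not the paper's), a response surface with fixed quadratic terms and a random impeller intercept can be fitted with statsmodels:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)

        # Toy data: quadratic surface plus a per-impeller random effect
        # standing in for the initial roughness.
        rows = []
        for imp in range(10):                  # 10 impellers
            u = rng.normal(0, 0.5)             # random impeller effect
            for _ in range(6):                 # 6 runs per impeller
                time = rng.uniform(0, 2)
                abrasive = rng.uniform(0, 1)
                rough = (3 - 1.2 * time + 0.3 * time**2
                         - 0.8 * abrasive + u + rng.normal(0, 0.2))
                rows.append((imp, time, abrasive, rough))
        df = pd.DataFrame(rows, columns=["impeller", "time",
                                         "abrasive", "rough"])

        # Mixed response surface: fixed second-order terms, random
        # intercept grouped by impeller.
        model = smf.mixedlm("rough ~ time + I(time**2) + abrasive",
                            df, groups=df["impeller"])
        print(model.fit().summary())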
  • Tolerance Design with Excel

    Authors: Jonathan Smyth-Renshaw (Jonathan Smyth-Renshaw & Associates Ltd)
    Primary area of focus / application: Modelling
    Secondary area of focus / application: Modelling
    Keywords: Tolerance stack up, Worst case, Statistical tolerance (RSS), Modelling, Excel
    Submitted at 24-Apr-2014 21:48 by Jonathan Smyth-Renshaw
    Accepted
    23-Sep-2014 10:35 Tolerance Design with Excel
    During this session I would like to show how Excel can be used to undertake tolerance design calculations and then to model tolerance stack-ups.
    The introduction will detail the two approaches: worst case and statistical (root sum of squares, RSS). I will then focus on the statistical approach, using Excel to create a simple tolerance model.
    I would also like to examine the use of both asymmetric and symmetric tolerances, using both the normal distribution and a triangular distribution.
    In summary, Excel is a flexible simulation package which, if programmed correctly, can create powerful models for tolerance stack-up.
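    The two rules are easy to state: a worst-case stack adds the component tolerances linearly, while the statistical (RSS) stack adds them in quadrature. A compact sketch with toy tolerance values (the talk itself uses Excel; Python is used here only for brevity), including a Monte Carlo check with symmetric triangular component distributions:

        import math
        import random
        import statistics

        # Toy four-component stack: nominal dimensions and +/- tolerances.
        nominal = [25.0, 10.0, 15.0, 5.0]
        tols = [0.10, 0.05, 0.08, 0.12]

        worst_case = sum(tols)                     # tolerances add linearly
        rss = math.sqrt(sum(t * t for t in tols))  # root sum of squares

        print(f"worst case: +/-{worst_case:.3f}")
        print(f"RSS:        +/-{rss:.3f}")

        # Monte Carlo stack-up with triangular component distributions.
        rng = random.Random(0)
        stack = [sum(rng.triangular(n - t, n + t, n)
                     for n, t in zip(nominal, tols))
                 for _ in range(100_000)]
        print(f"simulated stack std dev: {statistics.stdev(stack):.4f}")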
  • A New Control Charting Procedure for Monitoring Poisson Observations

    Authors: Athanasios Rakitzis (University of Nantes), Philippe Castagliola (University of Nantes), Petros Maravelakis (University of Piraeus)
    Primary area of focus / application: Process
    Secondary area of focus / application: Modelling
    Keywords: Average run length, Control charts, Markov chain, Poisson distribution, Statistical process control
    Submitted at 25-Apr-2014 12:40 by Athanasios Rakitzis
    Accepted
    22-Sep-2014 11:25 A New Control Charting Procedure for Monitoring Poisson Observations
    In this work, we propose a new control charting procedure with memory, suitable for monitoring Poisson observations. The proposed scheme applies integer-valued weights to the most recent as well as to past observations, and the plotted statistic is itself a positive integer. The statistical design of the scheme requires determining a pair of positive integer-valued weights and a control limit (for the one-sided, upper or lower, schemes) or a pair of control limits (for the two-sided scheme). For the exact evaluation of the run length properties, as well as for the statistical design of the proposed chart, an appropriate Markov chain technique is used. The results of a numerical study reveal that the proposed scheme is very sensitive to small shifts in the mean of a Poisson process. Comparisons with other competing schemes are also given.
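    The paper's specific scheme is not reproduced here; the sketch below applies the same Markov chain run-length technique to a standard chart with integer parameters, the upper one-sided Poisson CUSUM of Brook and Evans: the transient states are 0, ..., h-1, the transition matrix Q is filled from the Poisson distribution, and the zero-start ARL is the first entry of (I - Q)^{-1} 1.

        import numpy as np
        from scipy.stats import poisson

        def poisson_cusum_arl(lam, k, h):
            """Zero-start ARL of the upper Poisson CUSUM
            S_t = max(0, S_{t-1} + X_t - k), signal when S_t >= h,
            via the Brook-Evans Markov chain (integer k and h)."""
            Q = np.zeros((h, h))
            for i in range(h):
                for j in range(h):
                    if j == 0:
                        # S_t = 0  <=>  X_t <= k - i
                        Q[i, j] = poisson.cdf(k - i, lam)
                    else:
                        # S_t = j  <=>  X_t = j - i + k
                        Q[i, j] = poisson.pmf(j - i + k, lam)
            # Expected times to absorption solve (I - Q) L = 1.
            L = np.linalg.solve(np.eye(h) - Q, np.ones(h))
            return L[0]

        print(poisson_cusum_arl(lam=4.0, k=5, h=5))  # in-control ARL
        print(poisson_cusum_arl(lam=6.0, k=5, h=5))  # shifted mean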
  • Communication and Adoption Dynamics in New Product Life Cycle: The Case of Apple iPod

    Authors: Mariangela Guidolin (University of Padua)
    Primary area of focus / application: Business
    Secondary area of focus / application: Modelling
    Keywords: New product life cycle, Business forecasting, Time-dependent market potential, Bass model, Guseo-Guidolin model, Nonlinear Least Squares, Early adopters
    Submitted at 25-Apr-2014 18:23 by Mariangela Guidolin
    The ability to forecast new product growth is especially important for innovative firms competing in the marketplace. Innovation diffusion models are used in business statistics and quantitative marketing to describe and forecast the growth dynamics of new products and technologies launched into markets. These models help in understanding the mechanisms that generate and enhance the penetration of an innovation into a socio-economic system. The most famous diffusion model is the Bass model (BM), which forms the basis of several extensions. One of these, the Guseo-Guidolin model (GGM), assumes a time-dependent market potential, generated by a communication process among consumers. This model has turned out to be a useful improvement on the Bass model in different industrial sectors (e.g. pharmaceuticals, energy). In this paper, we propose a new application of the GGM to the yearly sales data of the iPod, the portable media player designed and marketed by Apple Inc., highlighting the forecasting improvement obtained with respect to the simple Bass model. Moreover, we show that the GGM not only allows a separation between communication and adoption dynamics in the new product life cycle, but also permits a temporal allocation of these two phases, based on easy-to-compute location indices (mode, median, and mean). Interestingly, we see that in the case of the iPod the phase of adoption precedes that of communication, underlining the crucial role played by the group of early adopters in determining the success of the product.
    From a statistical point of view, the estimation of the GGM requires only sales data and is performed with standard Nonlinear Least Squares techniques through the Levenberg-Marquardt algorithm.
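    A hedged sketch of that estimation step for the baseline Bass model only (the GGM's time-dependent market potential is not reproduced here). scipy's curve_fit uses Levenberg-Marquardt by default for unconstrained problems, and the data below are synthetic, not actual iPod sales:

        import numpy as np
        from scipy.optimize import curve_fit

        def bass_cumulative(t, m, p, q):
            """Cumulative adoptions under the Bass model: market potential
            m, innovation coefficient p, imitation coefficient q."""
            e = np.exp(-(p + q) * t)
            return m * (1 - e) / (1 + (q / p) * e)

        # Synthetic yearly cumulative sales over ten years.
        t = np.arange(1, 11, dtype=float)
        y = (bass_cumulative(t, 100.0, 0.03, 0.4)
             + np.random.default_rng(2).normal(0, 1.0, t.size))

        # Default unconstrained method is Levenberg-Marquardt ('lm').
        (m, p, q), _ = curve_fit(bass_cumulative, t, y,
                                 p0=[80.0, 0.01, 0.3])
        print(m, p, q)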