ENBIS-11 in Coimbra

4 – 8 September 2011
Abstract submission: 1 January – 25 June 2011

My abstracts

The following abstracts have been accepted for this event:

  • Efficiency of Formal Analysis Methods of Unreplicated Factorials

    Authors: Bjarne Bergquist and Erik Vanhatalo
    Affiliation: Luleå University of Technology, Luleå, Sweden
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Design of Experiments, Unreplicated experiments, Screening designs, Formal Analysis, Simulation, Power calculation
    Submitted at 23-Jun-2011 12:29 by Bjarne Bergquist
    Accepted
    5-Sep-2011 12:50 Efficiency of Formal Analysis Methods of Unreplicated Factorials
    Unreplicated factorials are useful for exploring the experimental space at low cost, especially during the screening stages of an experimental campaign. Despite their lack of independent replicates, such designs are analyzable thanks to three principles: effect sparsity, effect hierarchy, and effect heredity. All three principles can be incorporated into the analysis through normal probability plotting. However, the evaluation of probability plots rests on the skill and judgement of the analyst, and two analysts may end up with widely differing results. Formal procedures using one or several of these principles have therefore been proposed to overcome the analyst bias and errors that stem from manually interpreting the plots. The formal methods include pseudo-variance estimation methods such as Lenth's method, as well as Bayesian methods. In this paper, the efficiency and power of some of the popular methods and some of the more advanced methods are compared through a simulation study.
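    A minimal sketch of Lenth's pseudo-standard-error test, one of the formal methods mentioned above, on simulated effects from an unreplicated 2^4 design; the effect sizes, error scale, and significance level are illustrative assumptions, not the settings of the reported simulation study:

        import numpy as np
        from scipy.stats import t

        def lenth_test(effects, alpha=0.05):
            """Flag effects whose magnitude exceeds Lenth's margin of error."""
            effects = np.asarray(effects, dtype=float)
            abs_eff = np.abs(effects)
            s0 = 1.5 * np.median(abs_eff)                       # initial robust scale
            pse = 1.5 * np.median(abs_eff[abs_eff < 2.5 * s0])  # pseudo standard error
            me = t.ppf(1 - alpha / 2, df=effects.size / 3.0) * pse  # margin of error
            return abs_eff > me, me

        # Toy run: 15 effect estimates from a 2^4 design, three truly active
        # (effect sparsity), with standard normal estimation error.
        rng = np.random.default_rng(1)
        true_effects = np.zeros(15)
        true_effects[:3] = [4.0, 3.0, 2.5]
        estimates = true_effects + rng.normal(scale=1.0, size=15)
        active, me = lenth_test(estimates)
        print(f"margin of error = {me:.2f}; active effects: {np.flatnonzero(active)}")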
  • Near-infrared chemical imaging & multivariate image analysis for quality improvement in the pharmaceutical industry

    Authors: J.M. Prats-Montalbán*, J.I. Jérez-Rozo**, R.J. Romañach**, A. Ferrer*
    Affiliation: * Universidad Politécnica de Valencia (Spain); ** University of Puerto Rico
    Primary area of focus / application: Process
    Keywords: hyperspectral imaging, multivariate image analysis, multivariate curve resolution, principal component analysis, process monitoring, quality improvement
    Submitted at 23-Jun-2011 12:40 by Alberto J. Ferrer-Riquelme
    Accepted
    6-Sep-2011 11:45 Near-infrared chemical imaging & multivariate image analysis for quality improvement in the pharmaceutical industry
    Near-infrared chemical imaging (NIR-CI) combines chemical spectroscopy information with spatial information about pharmaceutical formulations. NIR-CI creates an image that divides the sample into spatial pixels, each pixel enclosing a complete acquired NIR spectrum. The complete NIR hyperspectral image is thus a data cube with one plane (image) at each NIR wavelength.
    The advantage of the NIR-CI technique is that it is nondestructive (i.e., noninvasive), whereas traditional quality control tests require invasive techniques. Nevertheless, hyperspectral images yield a massive quantity of data, and extracting accurate information from them depends heavily on multivariate data processing and data mining strategies.
    This talk reports a two-stage methodology based on the application of multivariate chemometric models and multivariate image analysis (MIA) to analyze the chemical composition and the spatial relationships between the active pharmaceutical ingredient (API) and different excipients, for process monitoring and quality improvement of a pharmaceutical formulation.
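    A minimal sketch of the unfold-and-PCA step that underlies much multivariate image analysis; the cube dimensions and random pixel spectra below are placeholders for a real NIR-CI measurement of a formulation:

        import numpy as np

        def mia_pca_scores(cube, n_components=3):
            """Unfold an (ny, nx, n_wavelengths) cube, run PCA via SVD, and
            refold the scores into one image per component."""
            ny, nx, nw = cube.shape
            X = cube.reshape(ny * nx, nw)      # one NIR spectrum per pixel (row)
            Xc = X - X.mean(axis=0)            # mean-center over pixels
            U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
            scores = U[:, :n_components] * S[:n_components]
            loadings = Vt[:n_components]       # spectral signatures of components
            return scores.reshape(ny, nx, n_components), loadings

        # Placeholder cube: 64 x 64 pixels, 120 wavelength channels.
        cube = np.random.default_rng(0).random((64, 64, 120))
        score_maps, loadings = mia_pca_scores(cube)
        print(score_maps.shape, loadings.shape)   # (64, 64, 3) (3, 120)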
  • Application for Producing Product Quality Review Reports in a Pharmaceutical Company

    Authors: Heli Rita and Anni Liimatainen
    Affiliation: Orion Pharma
    Primary area of focus / application: Process
    Keywords: Pharmaceutical industry, Data warehouse, Reporting, Control charts, Capability indices
    Submitted at 23-Jun-2011 12:47 by Heli Rita
    Accepted
    6-Sep-2011 15:45 Application for Producing Product Quality Review Reports in a Pharmaceutical Company
    Product quality review (PQR) reports must be compiled annually for every product manufactured in a pharmaceutical company. Their purpose is to show that the manufacturing process is stable and capable of consistently producing products that fulfill their quality-related specifications. The reports are based on large amounts of data, which are usually stored in several locations and in diverse formats. Traditionally, gathering these data manually and editing them into reports has been a huge task, especially in companies with a large product portfolio.

    We have installed a product data warehouse that collects the data related to the manufactured batches in one integrated system. It gathers data from an ERP system and from several manufacturing and measurement systems. These data contain batch-related information on, for example, materials, process parameters, deviations, and the results of release and in-process tests. The data can be analyzed and reported for several purposes, e.g. for producing the PQR reports. End-users can work with the data through web-based applications made available on the intranet for easy use. The applications have been built using stored processes in SAS software.

    An application for efficiently producing PQR reports is presented. With this application, the reporting workload has shifted from manual data gathering to the actual evaluation of products and processes. The content, layout, and quality of the roughly 400 reports produced annually have become substantially more uniform. Control charts and process capability indices were implemented as part of the reporting in the application, which has generated requests for training in SPC methods.
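    A minimal sketch of the SPC quantities built into the reports: limits for an individuals control chart (with a moving-range estimate of sigma) and the Cpk capability index. The batch values and specification limits are invented, and the sketch uses Python although the application itself is built on SAS stored processes:

        import numpy as np

        def individuals_chart_limits(x):
            """Center line and 3-sigma limits, sigma estimated as MRbar/1.128."""
            x = np.asarray(x, dtype=float)
            sigma = np.mean(np.abs(np.diff(x))) / 1.128  # d2 for moving ranges of 2
            center = x.mean()
            return center, center - 3 * sigma, center + 3 * sigma

        def cpk(x, lsl, usl):
            """Capability index relative to specification limits [lsl, usl]."""
            x = np.asarray(x, dtype=float)
            mu, s = x.mean(), x.std(ddof=1)
            return min((usl - mu) / (3 * s), (mu - lsl) / (3 * s))

        # Invented assay results (% of label claim) for eight batches.
        assay = np.array([99.2, 100.1, 99.8, 100.4, 99.6, 100.0, 99.9, 100.3])
        cl, lcl, ucl = individuals_chart_limits(assay)
        print(f"CL={cl:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, Cpk={cpk(assay, 95, 105):.2f}")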
  • Are Multivariate Process Capability Indices Really Needed?

    Authors: Kerstin Vännman and Ingrid Tano
    Affiliation: Umeå University, Luleå University of Technology and University West, Sweden
    Primary area of focus / application: Process
    Keywords: Process capability index, Multivariate process capability index, Safety region plot, Multivariate normal distribution, Cpk, Confidence interval, Test
    Submitted at 23-Jun-2011 14:50 by Kerstin Vännman
    Accepted
    6-Sep-2011 15:25 Are Multivariate Process Capability Indices Really Needed?
    Process capability indices have been used for more than 20 years to characterize process performance when studying univariate quality characteristics. However, the quality of a process is often determined by several correlated quality characteristics. In such situations the quality characteristic is a vector, and multivariate process capability indices (MPCIs) have been developed to handle this case. Initially, the MPCIs were derived as theoretical quantities to define the process capability, and suitable point estimates were suggested. More recently, statistical theory for some of these estimated MPCIs has been derived, and confidence intervals or tests have been proposed. This means that conclusions about the process capability can be drawn, based on a random sample, at a given confidence or significance level.
    We review and compare some of these MPCIs and their corresponding confidence intervals or tests, which are based on the assumption that the quality characteristic vector follows a multivariate normal distribution. Examples using data from Swedish industry are presented, and some advantages and disadvantages of the studied MPCIs are brought up. Furthermore, we suggest that safety region plots for univariate process capability indices be used for process capability analysis also when the quality characteristic is a vector. With safety region plots, the process capability can be assessed at a stated significance level while information about the location and spread of the individual quality variables is given at the same time. Using results from a simulation study, we discuss whether the use of safety region plots makes MPCIs superfluous.
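    A minimal sketch of one simple MPCI construction under the multivariate normal assumption: estimate the proportion of conformance to a rectangular specification region by Monte Carlo and map it to the familiar Cp scale. The data and specification limits are invented, and this particular index is illustrative rather than necessarily one of those compared in the talk:

        import numpy as np
        from scipy.stats import norm

        def mpci_from_conformance(X, lsl, usl, n_mc=200_000, seed=0):
            """Cp-scale index from the estimated proportion of conformance."""
            X = np.asarray(X, dtype=float)
            mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
            sims = np.random.default_rng(seed).multivariate_normal(mu, cov, size=n_mc)
            p_conf = np.all((sims >= lsl) & (sims <= usl), axis=1).mean()
            return norm.ppf((1.0 + p_conf) / 2.0) / 3.0

        # Invented sample: 200 observations of two correlated quality characteristics.
        rng = np.random.default_rng(42)
        X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=200)
        print(f"estimated MPCI: {mpci_from_conformance(X, [-4.0, -4.0], [4.0, 4.0]):.2f}")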
  • Estimation of annual redemption rates of savings products

    Authors: Luís Maranhão, Margarida Brito, Maria do Carmo M. Guedes, Nilza Pinto
    Affiliation: AXA, DM-FCUP, DM-FCUP, DM-FCUP
    Primary area of focus / application: Process
    Keywords: modeling, generalized linear models, savings products, redemption rates
    Submitted at 23-Jun-2011 17:03 by Maria do Carmo Guedes
    Accepted
    We consider the problem of estimating the annual redemption rates of savings products, i.e. the percentage of the value of these products that is expected to have been withdrawn by the end of each year. The main objective is to derive an adequate model for predicting, at the beginning of each year, the expected redemption rate.
    The estimation of the rates used actual information about the type of product, the lifetime and value of the contract, the client's age, the redemption value, and the type of redemption (total or partial). Four years of data were used.
    Four different situations were distinguished: no redemption; partial redemption; total redemption with a penalty; and the contract value dropping to zero from one year to the next because of maturity, in which case there is no redemption.
    For the statistical modelling of the data, generalized linear models were used, from which the variables that influence the redemption rate were identified.
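    A minimal sketch of the modelling step: a binomial GLM (logit link) for the probability that a contract is redeemed within a year. The data frame and variable names below are synthetic stand-ins for the covariates listed above, not the study's actual fields:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Synthetic contract-year records.
        rng = np.random.default_rng(7)
        n = 1000
        df = pd.DataFrame({
            "age": rng.integers(25, 80, n),                    # client's age
            "log_value": np.log(rng.lognormal(9.0, 0.5, n)),   # log contract value
            "years_held": rng.integers(1, 15, n),              # lifetime of the contract
            "product_type": rng.choice(["A", "B"], n),
        })
        linpred = -2.0 + 0.02 * df["age"] - 0.10 * df["years_held"]
        df["redeemed"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-linpred))).astype(int)

        # Fit and inspect which covariates influence the redemption probability.
        model = smf.glm("redeemed ~ age + log_value + years_held + product_type",
                        data=df, family=sm.families.Binomial()).fit()
        print(model.summary())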
  • Quantification of sheet thickness from optical measurements

    Authors: Oliver Melsheimer, Joachim Kunert, Gerd Sebastiani, Simone Wenzel, Adrian Wilk
    Affiliation: TU Dortmund
    Primary area of focus / application: Quality
    Keywords: Forming process, Estimation of wall thickness, Clustering, Interpolation
    Submitted at 24-Jun-2011 11:21 by Adrian Wilk
    Accepted
    5-Sep-2011 16:25 Quantification of sheet thickness from optical measurements
    As part of a joint research project between mechanical engineers and statisticians, we try to optimize an incremental forming process that produces a funnel from a blank metal sheet. One criterion for the quality of a formed workpiece is achieving an even wall thickness: if the wall becomes too thin in a given area, a risk of failure arises.
    Measuring the wall thickness directly is time-consuming and requires extensive effort, because the whole three-dimensional contour of the workpiece has to be considered. As an alternative, we used an optical all-around scan to produce triangulated data of the workpiece surface. Using statistical methods such as cluster analysis and spatial interpolation, this surface data can be used to estimate the sheet thickness at any point of the workpiece with good precision.
    The poster presents a detailed description of our approach.
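    A minimal sketch of the thickness-estimation idea: separate the scanned surface points into the two shells of the wall with a simple cluster step, then take the distance from each point on one shell to the nearest point on the other as a raw local thickness estimate (which spatial interpolation would then smooth). The synthetic funnel below is a placeholder for real scan data:

        import numpy as np
        from scipy.spatial import cKDTree
        from sklearn.cluster import KMeans

        # Synthetic stand-in for the all-around scan: two conical shells of a
        # funnel, the wall thinning towards the wide end.
        rng = np.random.default_rng(3)
        n = 20_000
        def shell(offset):
            z = rng.uniform(0.0, 1.0, n)
            theta = rng.uniform(0.0, 2.0 * np.pi, n)
            r = (1.0 + z) - offset * (1.0 - 0.3 * z)
            return np.c_[r * np.cos(theta), r * np.sin(theta), z]
        points = np.vstack([shell(0.0), shell(0.05)])   # outer + inner surfaces

        # Cluster step (stand-in for the cluster analysis): split the shells on
        # the radial deviation from the nominal funnel shape.
        feature = np.hypot(points[:, 0], points[:, 1]) - (1.0 + points[:, 2])
        label = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(feature[:, None])
        outer, inner = points[label == label[0]], points[label != label[0]]

        # Raw local thickness: nearest-neighbour distance between the shells.
        dist, _ = cKDTree(inner).query(outer)
        print(f"median estimated wall thickness: {np.median(dist):.3f}")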