ENBIS8 in Athens
21 – 25 September 2008
Abstract submission: 14 March – 11 August 2008
The following abstracts have been accepted for this event:

The text deals with basic methods used in variability management.
We show that
• the effectiveness of variability management depends on the results achieved when performing the steps of the Shewhart-Deming PDCA Cycle;
• there is a finite set of methods that support effective operation of the PDCA Cycle and enable the organisation's knowledge potential to be used.
These methods are:
• Variability Analysis, which provides information on the measures and structure of variability in the system;
• the Problem Solving scheme, which provides information on the causes of variability and on possible ways to eliminate or reduce the influence of the main (root) causes;
• Quality Function Deployment, which transforms an idea for modifying the causes of variability into the details of the standards to be used in the system;
• Failure Mode and Effect Analysis, which assesses the risk of failure when defining and implementing a standard in the system;
• Value Analysis, which assesses the costs of implementing and using a standard.
Their common important features are the following:
• they can be used to control the effectiveness of establishing or modifying standards, showing that there need be no essential conflict between using standards and working creatively;
• they provide the system with measures that may be used to assess the results achieved in each step of the PDCA Cycle;
• they process information on a set of causes of variability that are potentially familiar to the people in the system;
• the results of their application can be verified by observing the overall results of using the standards;
• their effectiveness depends on how effectively the knowledge shared by the people using them is exploited.
The QM Principles formulated in the ISO 9000 standard define a management system environment ready for effective use of the above methods.

A Two-Sided Multivariate c Control Chart
Authors:
Paolo Cozzucoli
Affiliation: Department of Economics and Statistics, University of Calabria, Via P. Bucci, Cubo 0C, 87036 Rende (CS), Italy
Primary area of focus / application:
Submitted at 11-May-2008 08:49 by Paolo Cozzucoli
Accepted
22-Sep-2008 15:40 A Two-Sided Multivariate c Control Chart
In the modern industrial environment, operators are frequently interested in evaluating the quality of complex products. Supposing that the quality of the product depends on several quality characteristics, it is then appropriate to use multivariate quality control methodology. If the quality characteristics are defined on a nominal or ordinal scale, the corresponding process to monitor is known as a multivariate attribute process. In this paper, we assume that the operator is interested in monitoring a Multivariate Poisson Process; that is, the main purpose is to monitor simultaneously the number of nonconformities belonging to each of k ordered, distinct and not mutually exclusive defect categories. Specifically, defects of different types may be classified according to the degree of effect that they have on the performance of the product. Usually the process is said to be capable if the total number of defects, identified on the selected inspection unit, is very small and remains low, or declines, over time. In this case, because we have chosen to classify the defects into k ordered and not mutually exclusive classes, the overall number of nonconformities identified on the inspection unit depends on the k categories, which are not necessarily independent, and we are interested in evaluating over time the number of defects in each category as well as the overall (across the k categories) number of nonconformities. We may also use this approach to evaluate the overall demerit in the system. To achieve this goal, we propose i) a normalized index that can be used to evaluate the overall quality of the process in terms of the weighted sum of nonconformities identified in each class of defects, and ii) a two-sided Shewhart-type multivariate control chart with asymptotic probabilistic limits, useful for monitoring the overall number of nonconformities for each inspection unit.
In addition, we suggest a solution to the identification problem when an out-of-control signal occurs. The same sample statistic is used to define the normalized index and the multivariate c control chart.
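As a rough illustration of the ingredients of such a chart (the defect rates, demerit weights, normalizing constant and 3-sigma z value below are invented, and the authors' exact index and limits may differ), a weighted defect sum with asymptotic two-sided limits can be sketched as:

```python
import math

def chart_limits(lams, weights, z=3.0):
    """Asymptotic two-sided limits for the weighted sum of defect counts.

    Simplifying assumption: independent Poisson counts per category, so
    E[D] = sum(w_i * lam_i), Var[D] = sum(w_i^2 * lam_i); a normal
    approximation then gives mu -/+ z*sigma as probabilistic limits."""
    mu = sum(w * l for w, l in zip(weights, lams))
    sigma = math.sqrt(sum(w * w * l for w, l in zip(weights, lams)))
    return max(0.0, mu - z * sigma), mu + z * sigma

def normalized_index(counts, weights, max_weighted):
    """Normalized overall-quality index: weighted defect sum scaled by a
    chosen maximum acceptable weighted demerit (hypothetical constant)."""
    return sum(w * c for w, c in zip(weights, counts)) / max_weighted

lams = [2.0, 1.0, 0.5]   # in-control mean defects per unit, by category
weights = [1, 5, 10]     # demerit weights: minor -> major -> critical
lcl, ucl = chart_limits(lams, weights)

counts = [3, 1, 1]       # defect counts on one inspection unit
d = sum(w * c for w, c in zip(weights, counts))
idx = normalized_index(counts, weights, max_weighted=50.0)
print(lcl, ucl, d, lcl <= d <= ucl, idx)
```

With heavier weights on critical defects, a single critical nonconformity moves the statistic far more than several minor ones, which is the point of monitoring the overall demerit rather than a raw count.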

Consider a situation in process capability analysis where the smallest possible value of the studied quality characteristic is zero and only an upper specification limit exists. In this situation it is not uncommon that zero is also the best value to obtain. For instance, consider a surface polishing process where the surface should be as smooth as possible, or a gearwheel production process where the deviation from the surface of the gearwheel should be as small as possible. In such situations one is likely to find a skewed distribution with a long tail towards large values rather than a normal distribution for the studied quality characteristic. Hence Cpu, the process capability index most commonly used in industry today for an upper specification limit, is not suitable for studying process capability here.
For the situation described above it is often reasonable to assume that the underlying distribution is a Weibull distribution. Hence, we focus on process capability indices when the studied quality characteristic follows a highly skewed Weibull distribution and there is an upper specification limit with a pre-specified target value of 0. Under these conditions we consider a previously proposed class of capability indices, designed for skewed zero-bound distributions, and propose an efficient estimator of the index in the studied class. Using this estimated index and its asymptotic distribution, we suggest a decision rule for deeming a process capable at a given significance level. We present results from a simulation study, performed to investigate the true significance level when the sample size, n, is between 50 and 200. An example from Swedish industry will also be presented to illustrate the proposed ideas.
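To make the zero-bound, upper-spec setting concrete, here is a minimal quantile-based sketch (not the authors' index class, estimator or decision rule; the shape, scale and USL values are invented for illustration) of how capability can be judged for a skewed Weibull characteristic:

```python
import math

def weibull_quantile(p, shape, scale):
    """p-th quantile of a zero-bound Weibull(shape, scale) distribution."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def quantile_capability(usl, shape, scale, p=0.9973):
    """Quantile-based upper-spec capability index (illustrative): the USL
    divided by the p-th quantile of the fitted distribution. Values above
    1 indicate that nearly all output falls below the specification."""
    return usl / weibull_quantile(p, shape, scale)

# shape near 1 gives the long right tail typical of e.g. surface roughness
idx = quantile_capability(usl=10.0, shape=1.2, scale=2.0)
print(round(idx, 3))
```

In practice the shape and scale would be estimated from a sample, and the sampling variability of the estimate is exactly what makes a significance-level-based decision rule necessary.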

Bayesian control charts constitute a more flexible and effective approach to statistical process monitoring than conventional control charts such as Shewhart and CUSUM. Although the mathematical analysis for the economically optimal design and operation of Bayesian charts has been available for over a decade, the practical implementation of these charts remains minimal due to their computational complexity and requirements. However, continuing progress in computing technology and power is gradually allowing the design and implementation of Bayesian charts in practice. The objective of this presentation is to highlight these developments and provide convincing arguments in favour of a more widespread adoption of the Bayesian approach to statistical process control. A specific application of economically designed control charts in an actual manufacturing environment will serve to support and illustrate the arguments.
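A minimal sketch of the Bayesian monitoring idea, assuming normal observations with a known shift size and a geometric prior on the change time (all parameters and the signal threshold below are invented; an economically optimal design would derive the threshold from cost data rather than fix it ad hoc):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_update(p, x, mu0, mu1, sigma, q):
    """One Bayesian update: p is P(process out of control) before seeing x,
    q is the per-period probability that a shift occurs."""
    p_prior = p + (1 - p) * q                    # a shift may occur this period
    num = p_prior * normal_pdf(x, mu1, sigma)
    den = num + (1 - p_prior) * normal_pdf(x, mu0, sigma)
    return num / den

mu0, mu1, sigma, q, pstar = 0.0, 1.5, 1.0, 0.01, 0.9
p = 0.0
data = [0.1, -0.3, 0.2, 1.8, 2.1, 1.6]           # shift after the third point
signals = []
for t, x in enumerate(data, 1):
    p = posterior_update(p, x, mu0, mu1, sigma, q)
    signals.append((t, round(p, 3), p > pstar))  # signal once p exceeds pstar
print(signals)
```

The chart statistic is a probability, not a raw measurement, which is what gives the Bayesian approach its flexibility: prior knowledge, costs and irregular sampling all enter naturally through the update and the threshold.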

A Permutation Test for Effects in Unreplicated Factorial Designs
Authors:
Anna Skountzou, Christos Koukouvinos and Kalliopi Mylona
Affiliation: Department of Mathematics, National Technical University of Athens, Zografou 15773, Athens, Greece
Primary area of focus / application:
Submitted at 13-May-2008 14:13 by Anna Skountzou
Accepted
22-Sep-2008 10:35 A Permutation Test for Effects in Unreplicated Factorial Designs
The analysis of unreplicated factorial designs attracts considerable interest, since these designs enable us to estimate the factorial effects using contrasts while leaving no degrees of freedom to estimate the error variance. In order to examine the significance of the effects in unreplicated two-level factorial designs, we present a permutation testing procedure. In our method, the permutation estimates of the factorial effects are obtained by applying Ordinary Least Squares estimation to a set of inequivalent Hadamard matrices. A major reason for using a permutation test is that it requires no a priori assumptions about the underlying distribution of the data, nor does it impose any practical restriction on the number of potentially significant effects present. Moreover, a comparison study with previous works, performed through simulated experiments, is given, revealing the effectiveness of the proposed method. Finally, we present the application of our method to an example with real data.
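The distribution-free idea can be sketched with a generic response-permutation test on a toy 2^3 design (the paper's procedure instead obtains the permutation estimates via OLS over inequivalent Hadamard matrices; the data below are invented, with factor A made clearly active):

```python
import random

def effects(design, y):
    """Contrast (equivalently OLS) estimates of effects for a +/-1 design."""
    n = len(y)
    return [sum(design[i][j] * y[i] for i in range(n)) / (n / 2)
            for j in range(len(design[0]))]

# full 2^3 design, main-effect columns A, B, C coded -1/+1
design = [[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)]
y = [10.1, 9.8, 10.3, 10.0, 14.9, 15.2, 15.1, 14.8]

obs = effects(design, y)
obs_max = max(abs(e) for e in obs)

# permutation reference distribution for the maximum absolute effect:
# shuffle the responses, re-estimate, and see how often chance alone
# produces an effect as large as the one observed
random.seed(0)
B = 2000
count = sum(
    max(abs(e) for e in effects(design, random.sample(y, len(y)))) >= obs_max
    for _ in range(B))
p_value = (count + 1) / (B + 1)
print(obs, p_value)
```

Because the reference distribution is built from the data themselves, no normality assumption is needed; with only 8 runs, however, the discreteness of the permutation distribution limits how small the p-value can be.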

Supersaturated designs (SSDs) are a large class of factorial designs that can be used in screening experiments where the number of factors is large while experimentation is expensive. An SSD can save considerable cost when the number of factors is large and only a small number of runs is desired or available. We present a recent application of evolutionary algorithms to the construction of two-level cyclic structured supersaturated designs, together with a statistical analysis method for two-level E(s^2)-optimal supersaturated designs with a block orthogonal structure.