ENBIS: European Network for Business and Industrial Statistics
Overview of all Abstracts
The following PDF contains the abstract book as it will be handed out at the conference; it is provided here for browsing and later reference. All abstracts as PDF
The following abstracts have been accepted for this event:

Efficient Design in Conjoint Analysis and the Like
Authors: Rainer Schwabe
Primary area of focus / application:
Submitted at 7 Sep 2007 21:31
Accepted

Calibration of instruments using log-variance models
Authors: Diego Zappa, Massimiliano Pesaturo
Primary area of focus / application:
Submitted at 7 Sep 2007 22:42
Accepted

Selecting explanatory variables with the modified version of Bayesian Information Criterion
Authors: Malgorzata Bogdan (Purdue University, West Lafayette, IN, USA)
Primary area of focus / application:
Submitted at 8 Sep 2007 10:23
Accepted

Design of Experiments for Mean and Variance
Authors: Marta Emmett, Peter Goos, Eleanor Stillman
Primary area of focus / application:
Submitted at 8 Sep 2007 11:28
Accepted
mean of a single response variable under homoscedasticity. However, in many
practical applications the variance structure is not known and the variance, as
well as the mean, needs to be estimated. Estimating the mean and variance
simultaneously is particularly relevant in quality control experiments. The
first person to draw attention to the importance of reducing variability in
such experiments was Taguchi in the 1980s. Taguchi methods seek to design a
product or a process whose performance meets a specified target on average and
exhibits little variability. This variability may be a consequence of
environmental factors, controllable and uncontrollable factors during the
manufacturing process and component deterioration.
More recently, Atkinson & Cook (1995) and Vining & Schaub (1996) developed
optimal design theory for estimation of mean and variance functions
simultaneously. Both papers assume that the variance function is estimated
using the residuals of the regression function for the mean. However,
researchers often prefer using sample variances for quantifying and modelling
variation. This has the advantage that the responses of the variance function
do not depend on the specification of the mean function. If sample variances
are utilized, the optimal design approaches of Atkinson & Cook (1995) and
Vining & Schaub (1996) are no longer ideal. Therefore, building on the work of
Goos, Tack and Vandebroek (2001), we propose a new optimal design criterion for
the simultaneous estimation of mean and variance functions, where it is assumed
that sample variances are used for estimating the latter function.
References:
Atkinson, A.C. and Cook, R.D. (1995). D-optimum designs for heteroscedastic
linear models. Journal of the American Statistical Association, 90, 204-212.
Goos, P., Tack, L. and Vandebroek, M. (2001). Optimal designs for variance
function estimation using sample variances. Journal of Statistical Planning
and Inference, 92, 233-252.
Vining, G.G. and Schaub, D. (1996). Experimental designs for estimating both
mean and variance functions. Journal of Quality Technology, 28, 135-147.
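
The sample-variance approach described above can be sketched in a few lines. The following is a minimal, hypothetical illustration, not the authors' procedure: the design points, replicate count and log-linear variance model are all assumptions made for the example. Sample variances computed at replicated design points are regressed on the design variable, so the variance model does not depend on the specification of the mean function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-factor experiment: 5 design points, 10 replicates each
x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
n = 10

# Simulated responses with mean 1 + 2x and log-variance -1 + x (illustrative)
mu = 1.0 + 2.0 * x
sigma = np.exp(0.5 * (-1.0 + x))
y = mu[:, None] + sigma[:, None] * rng.standard_normal((x.size, n))

# Sample variances at each design point, computed without any mean model
s2 = y.var(axis=1, ddof=1)

# Fit log(s^2) = gamma0 + gamma1 * x by ordinary least squares
Z = np.column_stack([np.ones_like(x), x])
gamma, *_ = np.linalg.lstsq(Z, np.log(s2), rcond=None)
print(gamma)  # rough estimates of (-1, 1)
```

An optimal design criterion of the kind proposed would then choose the design points and the allocation of replicates so as to make such estimates as precise as possible.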

Improvement of a manufacturing process by integrated physical and numerical experiments: a case-study in the textile industry
Authors: Stefano Masala (1), Paola Pedone (2), Martina Sandigliano (1) and Daniele Romano (2)
Primary area of focus / application:
Submitted at 8 Sep 2007 11:49
Accepted
in the product development phase. It is easy to forecast that it will soon spread to less knowledge-intensive sectors as well.
However, although Design of Experiments and Computer Experiments provide sound methodologies for running experiments in
physical and numerical settings respectively, the integration between the two kinds of investigation is still in its infancy.
Yet in that case the sequential experimentation approach, introduced by George Box for physical experiments some fifty years
ago, would have an even wider scope.
The work describes the results of a research project which is currently taking place at Technova Srl, a medium size textile
firm in Sardinia (Italy). The company produces flocked yarn, a component which, after weaving, becomes a fabric for a wide
range of technical applications. Typical end products are coverings for seats and other components in car interiors. The
yarn is formed by finely cut fibers (flock) applied to an adhesive coated carrier thread by the electrostatic force. The
research focuses on the improvement of the manufacturing process. To this end, we exploit all kinds of information sources
available, from historical production data to physical experiments on pilot and production machines and experiments on
different process simulators. We show that the results obtained by this approach are well beyond the initial expectations of
the company in terms of enhanced product quality as well as process economy and flexibility.
Keywords: DoE, Computer experiments, Sequential experimentation, Flocking process, Quality improvement.
Affiliations:
(1) Technova Srl, Olbia, martina.sandigliano@novafloor.it
(2) University of Cagliari, Dept. of Mechanical Engineering, Cagliari, romano@dimeca.unica.it

An automotive experience in applying DoE to improve a process
Authors: Laura Ilzarbe, M. Tanco, M. Jesús Alvarez, E. Viles
Primary area of focus / application:
Submitted at 8 Sep 2007 13:38
Accepted
In this paper we present the application of design of experiments in a car manufacturing company to improve its technical knowledge of the laser welding process, and the positive impact that this research has already had on the number of defects observed.

Implementation of a quality plan for the monitoring of blood treatment process for the Belgian Red Cross
Authors: A. Guillet, B. Govaerts, A. Benoit
Primary area of focus / application:
Submitted at 8 Sep 2007 19:00
Accepted
First, we defined a sampling design for each measurement. Then, we carried out a brief descriptive analysis of the new data to check the adequacy of the plan. After several months, we created graphs to verify that the specification limits are respected, and we computed statistics on the collected data in order to compare results between the three sites, to determine whether the processes are under control and to evaluate the client-provider risk. Some of the graphs are automatically updated every day, whereas the statistics and the other graphs are compiled monthly into a report.
To realise this, we implemented the necessary tools in a statistical software package in such a way that every technician can use them. Thus, we chose to have the technicians encode the data directly in worksheets of the statistical software, formatted for the different products. Moreover, they took a short course adapted to their use of the software, so that they understand how to use it, particularly the macros, and how to read the outputs and detect problems in the quality of the products.
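
The monitoring described can be illustrated with a standard Shewhart X-bar chart, one common way of checking whether a process is under control against its limits. This is a minimal sketch with made-up data; the subgroup size, target values and constants are assumptions for the example, not details from the Red Cross plan.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily measurements of one characteristic at one site:
# 30 days, 4 samples per day
data = rng.normal(loc=10.0, scale=0.5, size=(30, 4))

xbar = data.mean(axis=1)       # daily subgroup means
s = data.std(axis=1, ddof=1)   # daily subgroup standard deviations

# X-bar chart limits estimated from the within-day variation;
# c4 corrects the bias of the sample standard deviation for n = 4
n = data.shape[1]
c4 = 0.9213
center = xbar.mean()
sigma_hat = s.mean() / c4
ucl = center + 3 * sigma_hat / np.sqrt(n)
lcl = center - 3 * sigma_hat / np.sqrt(n)

out_of_control = (xbar > ucl) | (xbar < lcl)
print(center, lcl, ucl, out_of_control.sum())
```

In a plan like the one above, such charts would be the daily-updated graphs, while the comparative statistics across the three sites would go into the monthly report.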

Mixing Krigings for Global Optimization
Authors: D. Ginsbourger, C. Helbert, L. Carraro (Ecole des Mines de Saint-Etienne)
Primary area of focus / application:
Submitted at 8 Sep 2007 20:49 by David Ginsbourger
Accepted
more time consuming. This makes it impossible to study simulators exhaustively, especially when the number of parameters grows
large. Consequently, Computer Experiments is an expanding field of study; application areas include crash-test studies,
reservoir forecasting, nuclear criticality, etc. We focus here on surrogate-based global optimization techniques for this kind of complex model.
Our starting point is the E.G.O. algorithm, which is based on a kriging metamodel. In the first part, we recall in detail how kriging
allows building sequential exploration strategies dedicated to global optimization. We point out some problems of kernel selection that are often skipped in the literature on kriging-based optimization.
In the second part, we introduce an extension of the EGO algorithm based on a mixture of kriging models (MKGO). We emphasize how
a mixture of kernels can lower the risk of misspecifying the kernel structure and its hyperparameters, especially when the estimation
sample is small. The proposed approach is illustrated with classical deterministic functions (Branin-Hoo, Goldstein-Price) and
compared with existing results.
We also present a study carried out on Gaussian processes, and examine the relations between the quality of covariance estimation
and the performance obtained in kriging-based optimization. We finally give a Bayesian interpretation of MKGO, and discuss the
ins and outs of choosing the prior distribution of the covariance hyperparameters.
This work was conducted within the framework of the DICE (Deep Inside Computer Experiments) Consortium between ARMINES, Renault, EDF,
IRSN, ONERA and TOTAL S.A.

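The expected-improvement criterion at the heart of the E.G.O. algorithm can be written down compactly. The sketch below is the generic textbook formulation for minimization, assuming the kriging predictor supplies a mean and a standard deviation at each candidate point; it is an illustration, not the authors' MKGO code.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sd, f_min):
    """EI for minimization, given kriging mean mu and std dev sd."""
    sd = np.maximum(sd, 1e-12)   # guard against zero predictive variance
    z = (f_min - mu) / sd
    return (f_min - mu) * norm.cdf(z) + sd * norm.pdf(z)

# Toy kriging predictions at three candidate points (illustrative values)
mu = np.array([0.0, 0.5, 1.0])
sd = np.array([0.1, 0.5, 1.0])
f_min = 0.2                      # current best observed value

ei = expected_improvement(mu, sd, f_min)
print(ei.argmax())  # -> 0: the low-mean point wins here
```

EGO evaluates the expensive simulator at the EI maximizer, updates the kriging model and repeats; a mixture of kriging models as in MKGO would combine such criteria across several candidate kernels.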
Metamodels from Computer Experiments
Authors: J.J.M. Rijpkema
Primary area of focus / application:
Submitted at 9 Sep 2007 14:40
Accepted
For the construction of metamodels there are a number of approaches available, such as Response Surface Methods (RSM), Kriging, Smoothing Splines and Neural Networks. They all estimate the response for a specific design on the basis of information from the full analysis of a limited number of training designs. However, they differ with respect to their underlying conceptual ideas, the calculation effort needed for training and the applicability to specific situations, such as largescale optimization problems or analysis models based on numerical simulations.
In this presentation I will focus on two approaches for metamodelling, namely RSM and Kriging. I will review and compare key concepts and present efficient experimental design strategies to train the models. Furthermore, I will discuss ways to enhance the model-building process by taking information on design sensitivities into account.
This may lead to a reduction in the actual number of full model analyses necessary for model training and estimation. It may be very effective, especially in situations where design sensitivities are easily available, as is the case for analysis models based on the Finite Element Method. To illustrate the use of RSM and Kriging, results from a numerical model study will be presented. They throw some light on the strengths and weaknesses of both approaches in practical applications.
Specifics: Related to the field of approximation models, computer simulation and engineering optimization. The preferred form of presentation is an oral presentation of about 20 minutes (including discussion).
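
As a concrete illustration of the RSM side of the comparison, the sketch below fits a full second-order polynomial metamodel to a toy "expensive" analysis model on a nine-point design; the simulator and the design are assumptions for the example. Because the toy simulator is itself quadratic, the metamodel reproduces it exactly.

```python
import numpy as np

# Toy analysis model standing in for an expensive simulation (assumed)
def simulator(x1, x2):
    return (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.1) ** 2

# Nine-point face-centred design on [-1, 1]^2
pts = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)
y = simulator(pts[:, 0], pts[:, 1])

# Full second-order model: 1, x1, x2, x1^2, x2^2, x1*x2
X = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def rsm(x1, x2):
    return beta @ np.array([1.0, x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

print(rsm(0.5, -0.5), simulator(0.5, -0.5))  # both 0.36
```

Kriging would instead interpolate the nine training responses, which becomes the more attractive choice once the true response is no longer a low-order polynomial.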

Variability of Electromagnetic Emissions
Authors: U. Kappel and J. Kunert
Primary area of focus / application:
Submitted at 9 Sep 2007 15:51
Accepted
The presentation discusses the results of a small study on the relative importance of several factors that might influence the electromagnetic emission of a car's subsystem consisting of a video, an audio, and a cell phone component. A fourth factor of interest was the design of the wiring harness connecting these components. All four factors were considered at 4 levels each. For the three components, the levels were chosen in such a way that we would expect increasingly fewer problems: no grounding of the component's case at all, poor grounding, good grounding and, as the fourth level, absence of the component. For the harness, we chose the four levels by selecting varying distances between the single wires and circuits. The experiment was done as a fractional factorial design with 16 runs.
The response was the average excess of the measured radiated emissions over the emissions limit, where the average was taken over all frequencies of interest.
Between any two of the 16 runs, the wiring was completely redone, even if the next run had the same level for this factor. This gives a measure of the variability caused by rebuilding the wiring, even if we try to rebuild the system in a completely identical way. To check the size of the pure measurement error, each run was measured 5 times, without any changes between the measurements.
The results seem to indicate that the pure measurement error was relatively small, while the variability caused by rebuilding the wiring was very important. Compared to this "rebuilding error" the effects of the four levels of video, of audio and of the planned variation of the wiring itself were negligible. However, we could show that no grounding or poor grounding of the cell phone component increased the average electromagnetic emissions significantly compared to the other two levels.
These findings could be reproduced in a confirmation experiment.
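
The split into pure measurement error and "rebuilding error" can be estimated with a one-way random-effects ANOVA on the 16 runs with 5 repeated measurements each. The sketch below uses simulated data with assumed variance magnitudes chosen to mimic the qualitative finding (rebuilding variability dominating); none of the numbers come from the study itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: 16 runs (wiring rebuilt between runs), each measured
# 5 times with no changes between the measurements
runs, reps = 16, 5
sigma_rebuild, sigma_meas = 2.0, 0.3   # assumed true standard deviations
run_effect = sigma_rebuild * rng.standard_normal(runs)
y = run_effect[:, None] + sigma_meas * rng.standard_normal((runs, reps))

# One-way random-effects ANOVA mean squares
grand = y.mean()
ms_within = ((y - y.mean(axis=1, keepdims=True)) ** 2).sum() / (runs * (reps - 1))
ms_between = reps * ((y.mean(axis=1) - grand) ** 2).sum() / (runs - 1)

var_meas = ms_within                                   # pure measurement error
var_rebuild = max((ms_between - ms_within) / reps, 0)  # rebuilding variance
print(var_meas, var_rebuild)
```

With the assumed magnitudes, the estimated rebuilding variance dwarfs the measurement variance, matching the pattern reported above.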