ENBIS-8 in Athens

21 – 25 September 2008
Abstract submission: 14 March – 11 August 2008

My abstracts


The following abstracts have been accepted for this event:

  • Design of fault diagnosis systems: comparing a method based on robust design with classical control engineering approaches

    Authors: Daniele Romano and Michel Kinnaert
    Affiliation: University of Cagliari, Cagliari, Italy; Université Libre de Bruxelles, Bruxelles, Belgium
    Primary area of focus / application:
    Submitted at 1-May-2008 00:17 by Daniele Romano
    Accepted (view paper)
    24-Sep-2008 09:00 Design of fault diagnosis systems: comparing a method based on robust design with classical control engineering approaches
    Fault Detection and Isolation (FDI) has been an important topic in control engineering since the 1970s. FDI systems provide an early diagnosis of failures or malfunctions that may occur in a dynamic system by processing signals obtained from on-line monitoring. Typical applications are in the process industry, transportation and telecommunications, where the supervised systems may be industrial plants and critical sub-systems of aircraft, trains, space shuttles and satellites. Prompt detection of a fault and early understanding of its cause (isolation) can prevent loss of product quality and unexpected plant shutdowns and, more importantly, protect the environment and people from harmful effects in case of failures during hazardous operations.
    The basic diagnostic mechanism is to compute a set of fault indicators and then assess their current values against reference values in order to make the diagnosis decision. The reference values are obtained from previously identified models of the system behaviour under both fault-free and faulty conditions (model-based FDI). As different sources of noise affect the system (random deviations from the nominal set-point, uncertainty in model parameters, variation of the fault level, sensor measurement error), the design of fault indicators can be regarded as a kind of robust design problem whose objectives are to minimise the rates of false alarms, missed detections and missed isolations. To solve this problem, the authors developed a method based on the Kullback divergence, used as a measure for achieving maximum decoupling among indicators of different faults.
    Using a fluid mixing process as a test case, we show that this approach is superior to two of the best-known methods in the control engineering literature: a model-based method (Parity Space) and a data-driven one (Fisher Discriminant Analysis). A minimal numerical sketch of the divergence criterion is given after the references below.

    Chow, E.Y. and Willsky, A.S. (1984). “Analytical Redundancy and the Design of Robust Failure Detection Systems”. IEEE Transactions on Automatic Control, AC-29(7), 603-614.
    Romano, D. and Kinnaert, M. (2006). “Robust Design of Fault Detection and Isolation Systems”. Quality and Reliability Engineering International, 22(5), 527-538.
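
    The Romano-Kinnaert design method itself is not reproduced here; purely as a minimal illustration of the kind of criterion involved, the Python sketch below computes the Kullback-Leibler divergence between two univariate Gaussian distributions that might describe one fault indicator under two different fault conditions (all means and variances are hypothetical).

        import numpy as np

        def kl_divergence_gaussian(mu0, var0, mu1, var1):
            # Kullback-Leibler divergence KL( N(mu0, var0) || N(mu1, var1) )
            return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

        # Hypothetical distributions of the same indicator under fault A and fault B.
        kl_ab = kl_divergence_gaussian(mu0=0.0, var0=1.0, mu1=2.5, var1=1.5)
        kl_ba = kl_divergence_gaussian(mu0=2.5, var0=1.5, mu1=0.0, var1=1.0)

        # The symmetrised value (Kullback's J divergence) is a natural separability
        # measure: the larger it is, the easier the two faults are to distinguish.
        print("J(A, B) =", kl_ab + kl_ba)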
  • A statistical framework for teaching improvement in higher education

    Authors: Stefano Barone & Eva Lo Franco
    Affiliation: University of Palermo, Italy
    Primary area of focus / application:
    Submitted at 1-May-2008 15:02 by Stefano Barone
    Accepted (view paper)
    24-Sep-2008 09:00 A statistical framework for teaching improvement in higher education
    The never-ending debate on quality involves every institution devoted to higher education. It is undeniable that, with massive and rapid globalisation causing student flows in every direction, and under the constraint of limited economic resources devoted to higher education, the performance and quality of any academic system must be constantly monitored and improved.
    Thus, whoever researches and teaches the scientific foundations of quality has both the right and the duty to voice an opinion and, first of all, to assure the quality of the processes for which he/she is responsible.
    In the recent past the authors started focusing on teaching as a key channel for the transmission of knowledge to an audience that constantly experiences a rapidly changing technological environment. We recently proposed a methodology named TESF: Teaching Experiments and Student Feedback. It is aimed at designing, monitoring and continuously improving (according to Deming's cycle) the quality of a university course. The TESF methodology is based on the concurrent adoption of Design of Experiments and the SERVQUAL model. The experiments are essentially “teaching experiments” performed by the teacher according to a predefined plan. The teacher is therefore the designer, the experimenter and part of the experimental unit. The other part of the experimental unit consists of a predefined sample of students attending the course (the student evaluators sample), whose feedback is carefully studied.
    A preliminary application of TESF has shown that one need not be an experienced statistician to apply this methodology, and the description of the model is kept at the most general level. In fact, the methodology, initially conceived for the academic environment, could easily be applied to any educational context. On the other hand, expert statisticians can clearly be stimulated by this approach, so that a scientific discussion can be opened and further substantial improvements gained.
    This presentation aims to give an overview of TESF, emphasising the statistical aspects involved and the delicate experimental and measurement issues. A specific focus will be an interesting upgrade concerning the analysis of the data (student feedback).
    Results from the application of the methodology in three consecutive editions of a Statistics course at the University of Palermo will be presented.
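
    As a minimal, hypothetical sketch of the statistical core of such a teaching experiment (the factors, levels and feedback scores below are invented and are not taken from the Palermo course), main effects can be estimated from a small two-level factorial plan whose response is the mean student feedback score:

        import numpy as np

        # Hypothetical 2x2 teaching experiment in coded -1/+1 levels:
        # A = lecture style (traditional vs. interactive)
        # B = supporting material (slides only vs. slides plus worked examples)
        A = np.array([-1,  1, -1,  1])
        B = np.array([-1, -1,  1,  1])
        y = np.array([6.2, 7.1, 6.8, 8.0])   # invented mean feedback scores

        effect_A  = y[A == 1].mean() - y[A == -1].mean()
        effect_B  = y[B == 1].mean() - y[B == -1].mean()
        effect_AB = y[A * B == 1].mean() - y[A * B == -1].mean()

        print("Main effect A :", effect_A)
        print("Main effect B :", effect_B)
        print("Interaction AB:", effect_AB)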
  • Design of experiments in simulation with application to services

    Authors: Shuki Dror, Tamar Gadrich and Maya Kaner
    Affiliation: ORT Braude College
    Primary area of focus / application:
    Submitted at 1-May-2008 19:41 by Shuki Dror
    23-Sep-2008 12:40 Design of experiments in simulation with application to services
    The role of experimental design is to improve performance by focusing on design details rather than on trial-and-error approaches. Design of experiments enables the analyst to define each system component and its levels, thus obviating the need to make arbitrary decisions about different components or factors (Sanchez, 2006).

    Several applications of design of experiments in simulation have been presented by different researchers. However, most of these applications focus on production systems that incorporate a few process factors, such as demand variability, processing-time variability and work-in-process (WIP) (Yucesan and de Groote, 2000).

    Service processes include a wide range of variegated factors such as activities, persons, tangibles and intangibles (Kaner and Karni, 2007). These factors and their interactions have to be incorporated into the design. We propose to integrate design of experiments with simulation for the design of new service processes and the improvement of existing ones.

    Generally, designing simulation experiments comprises three steps (Kleijnen et al., 2005): 1) developing a basic understanding of the problem for the simulation settings; 2) finding robust, not necessarily optimal, decisions; 3) comparing robust decisions in order to choose the best alternative. We apply these steps to service processes. We assert that there is a set of generic factors (e.g., customer arrival rate, number and skills of the servers/workers), generic performance measures (e.g., customer waiting time, number of customer complaints) and generic possible interactions between the factors (e.g., a steady customer waiting time could depend on the number of workers, but this could change if the customer arrival rate rises above a specific threshold). Our methodology enables the designer to define the simulation settings for a specific new process using a generic collection of service factors, their interactions and service performance measures. Business process improvement practices (Mansar and Reijers, 2007), such as activity resequencing, triage, parallelism, employee empowerment and flexible performer assignment, are incorporated in our methodology and allow the designer to improve an existing service process. In addition, our framework assists the designer in identifying the statistically significant factors and interactions for finding and comparing robust decisions.

    In summary, this paper describes a methodology, based on design of experiments and simulation, which enables a designer of service processes to cope with various factors and their statistically significant interactions. We illustrate the application of the methodology for the design of an order-handling process in a service system. A minimal sketch of such a factorial simulation experiment is given after the references below.

    1. Kaner, M. and Karni, R. (2007). Design of service systems using a knowledge-based approach. Knowledge and Process Management, 14(4): 260-274.
    2. Kleijnen, J.P.C., Sanchez, S.M., Lucas, T.W. and Cioppa, T.M. (2005). A User’s Guide to the Brave New World of Designing Simulation Experiments. INFORMS Journal on Computing, 17(3): 263-289.
    3. Mansar, L.M. and Reijers, H.A. (2007). Best practices in business process redesign: use and impact. Business Process Management Journal, 13(2): 193-213.
    4. Yucesan, E. and de Groote, X. (2000). Lead times, order release mechanisms, and customer service. European Journal of Operational Research, 120: 118-130.
    5. Sanchez, S.M. (2006). Work Smarter, Not Harder: Guidelines for Designing Simulation Experiments. In Proceedings of the 2006 Winter Simulation Conference, eds. Perrone, L.F., Wieland, F.P., Liu, J., Lawson, B.G., Nicol, D.M. and Fujimoto, R.M., 47-57.
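
    As a minimal sketch of this kind of factorial simulation experiment on a service process (the queue model, factor levels and replication counts are hypothetical and unrelated to the order-handling case study), the Python code below crosses two generic factors -- customer arrival rate and number of servers -- in a simple M/M/c simulation and reports the mean waiting time at each design point:

        import heapq
        import numpy as np

        rng = np.random.default_rng(2008)

        def mmc_mean_wait(arrival_rate, n_servers, n_customers=5000):
            # Simulate an FCFS M/M/c queue (mean service time 1.0) and
            # return the average customer waiting time.
            arrivals = np.cumsum(rng.exponential(1.0 / arrival_rate, n_customers))
            free_at = [0.0] * n_servers          # time at which each server becomes free
            heapq.heapify(free_at)
            waits = np.empty(n_customers)
            for i, t in enumerate(arrivals):
                earliest = heapq.heappop(free_at)    # first server to become available
                start = max(t, earliest)
                waits[i] = start - t
                heapq.heappush(free_at, start + rng.exponential(1.0))
            return waits.mean()

        # 2x2 factorial design: arrival rate (low/high) x number of servers (2/3),
        # with a few replications (different random streams) per design point.
        for rate in (1.2, 1.8):
            for servers in (2, 3):
                reps = [mmc_mean_wait(rate, servers) for _ in range(5)]
                print(f"rate={rate}, servers={servers}: mean wait = {np.mean(reps):.3f}")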
  • On control charting normal variance

    Authors: Sven Knoth
    Affiliation: Advanced Mask Technology Center Dresden, Germany
    Primary area of focus / application:
    Submitted at 4-May-2008 21:21 by Sven Knoth
    23-Sep-2008 16:10 On control charting normal variance
    There are several reasons to monitor the variance of a random characteristic. Besides ensuring that the initially designed control limits of a mean chart do not have to be re-tuned because of changes in the variance between samples, variance stability is important in its own right. For instance, it is important to monitor the repeatability of a gage, the uniformity of deviations from a golden profile, volatility or risk in finance, etc. All of these characteristics are typically modeled by variance (measures). In this talk, several control charts (Shewhart, EWMA, CUSUM, Shiryaev-Roberts) will be analyzed in terms of the zero-state and the steady-state ARL (average run length). Additionally, for EWMA charts the choice of the underlying statistic will be discussed.
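
    As a minimal sketch of one of the schemes mentioned above -- an upper one-sided EWMA chart applied to subgroup variances of a normal characteristic -- the Python code below uses a hypothetical smoothing constant and a simple three-sigma-style limit; in practice the limit (or the smoothing constant) would be calibrated to a target in-control ARL rather than fixed at 3.

        import numpy as np

        rng = np.random.default_rng(42)
        lam, n, sigma0, L = 0.1, 5, 1.0, 3.0   # smoothing constant, subgroup size, in-control sigma, limit factor

        # Simulated subgroups: in control for 30 subgroups, then sigma rises by 50%.
        sigmas = np.r_[np.full(30, sigma0), np.full(30, 1.5 * sigma0)]
        s2 = np.array([rng.normal(0.0, s, n).var(ddof=1) for s in sigmas])

        # Under normality Var(S^2) = 2*sigma0^4/(n-1); the asymptotic EWMA variance
        # shrinks this by the factor lam/(2-lam).
        ucl = sigma0**2 + L * np.sqrt(lam / (2 - lam) * 2 * sigma0**4 / (n - 1))

        z = sigma0**2                            # start the EWMA at the in-control target
        for i, v in enumerate(s2, start=1):
            z = lam * v + (1 - lam) * z
            if z > ucl:
                print(f"signal at subgroup {i}: EWMA = {z:.3f} > UCL = {ucl:.3f}")
                break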
  • Constructing Classification Trees via Data Mining and Design of Experiments Concepts

    Authors: Irad Ben-Gal, Eugene Kagan and Niv Shkolnik
    Affiliation: Dept. of Industrial Engineering, Tel-Aviv University
    Primary area of focus / application:
    Submitted at 4-May-2008 22:25 by Irad Ben-Gal
    23-Sep-2008 09:00 Constructing Classification Trees via Data Mining and Design of Experiments Concepts
    We present a generalized approach for the construction of classification trees. The proposed approach is based on both data mining and design-of-experiments concepts.
    In particular, we model the different partitions of the target variable and the attribute set as nodes in a graph and use information-theoretic measures to set the edge weights. We show that, by a proper selection of these measures, the proposed representation generalizes known construction algorithms for classification trees. Moreover, it enables a better balance of the tradeoff between construction complexity and classification optimality and often results in shorter trees. Examples will be given.
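
    As a minimal sketch of the information-theoretic ingredient (plain entropy and information gain for choosing a splitting attribute, i.e. the standard quantity behind classification-tree construction, not the authors' generalized graph formulation; the toy data are invented):

        import numpy as np
        from collections import Counter

        def entropy(labels):
            # Shannon entropy (in bits) of a sequence of class labels.
            counts = np.array(list(Counter(labels).values()), dtype=float)
            p = counts / counts.sum()
            return float(-(p * np.log2(p)).sum())

        def information_gain(attribute, labels):
            # Mutual information between a candidate splitting attribute and the target.
            cond = 0.0
            for value in set(attribute):
                subset = [y for x, y in zip(attribute, labels) if x == value]
                cond += len(subset) / len(labels) * entropy(subset)
            return entropy(labels) - cond

        # Toy data: predict customer churn from two invented attributes.
        contract = ["month", "month", "year", "year", "month", "year"]
        usage    = ["low",   "high",  "low",  "high", "low",   "low" ]
        churn    = ["yes",   "yes",   "no",   "no",   "yes",   "no"  ]

        # The attribute with the larger gain would be selected as the next split.
        print("gain(contract) =", information_gain(contract, churn))
        print("gain(usage)    =", information_gain(usage, churn))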
  • Analysis of Computer Experiments with Multiple Noise Sources

    Authors: Christian Dehlendorff, Murat Kulahci and Klaus Kaae Andersen
    Affiliation: DTU Informatics, Technical University of Denmark
    Primary area of focus / application:
    Submitted at 5-May-2008 11:03 by Christian Dehlendorff
    Accepted (view paper)
    23-Sep-2008 16:30 Analysis of Computer Experiments with Multiple Noise Sources
    With the current advances in computing technology, computer and simulation experiments are being used more and more frequently to study complex systems for which physical experimentation is usually not feasible. Such experiments are often considered to be free of experimental noise. For certain applications, such as health care, this is however not adequate, as variations in the operating conditions are an important concern for the decision makers. In this study a simulation model of an orthopaedic surgical unit is considered. The discrete event simulation (DES) model describes the individual patient’s flow through the unit and was developed in collaboration with the medical staff at Gentofte University Hospital in Copenhagen.
    The overall objective of this study is to minimize the patient’s waiting time from the time he/she enters the ward until exit. Multiple outcomes are considered simultaneously: both quality requirements and outcomes related to the study objective. Often the quality requirements and the objective are inversely related, requiring some sort of trade-off solution.

    The simulation model involves two sources of noise: variations in the environmental factors (uncontrollable factors in the physical system) and changes in the seed controlling the random number generation embedded in the simulation model. Methods are presented for evaluating and splitting the variation into separate components in a meta-model, i.e. a fixed effect corresponding to the controllable settings, a random effect corresponding to the environmental variables, and a residual term corresponding to the changes in seed. The resulting meta-model is used to minimize the waiting time while satisfying the quality requirements, since using the simulation model directly is computationally infeasible. The optimal settings are evaluated under different environmental settings to quantify the robustness of the solutions and to identify influential environmental factors.
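
    As a minimal sketch of such a variance-splitting meta-model (fitted here with statsmodels' linear mixed model on invented data; the single controllable factor, the scenario structure and all numbers are hypothetical and far simpler than the DES study), a fixed effect captures the controllable setting, a random intercept per environmental scenario captures the environmental variation, and the residual absorbs the seed-to-seed replication noise:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)

        # Invented data: 2 controllable staffing levels x 6 environmental scenarios
        # x 5 replications (random-number seeds); response = mean waiting time.
        env_effects = rng.normal(0.0, 3.0, size=6)       # scenario-to-scenario variation
        rows = []
        for staffing in (0, 1):                          # 0 = current, 1 = extra staff
            for env in range(6):
                for rep in range(5):                     # replications with new seeds
                    wait = 40.0 - 6.0 * staffing + env_effects[env] + rng.normal(0.0, 2.0)
                    rows.append({"wait": wait, "staffing": staffing, "env": env})
        df = pd.DataFrame(rows)

        # Meta-model: fixed effect of the controllable setting, random intercept per
        # environmental scenario, residual term for the replication (seed) noise.
        result = smf.mixedlm("wait ~ C(staffing)", df, groups=df["env"]).fit()
        print(result.summary())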