ENBIS-12 in Ljubljana

9 – 13 September 2012
Abstract submission: 15 January – 10 May 2012

My abstracts

The following abstracts have been accepted for this event:

  • An Improvement to the Sequential Simplex Optimization Algorithm

    Authors: Bryan Dodson (SKF), Paolo A Re (SKF), Rene Klerx (SKF), Markus Weidenbacher (SKF)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: evolutionary optimization, sequential simplex method
    Submitted at 20-Mar-2012 10:14 by Paolo Re
    Accepted (view paper)
    12-Sep-2012 09:50 An Improvement to the Sequential Simplex Optimization Algorithm
    The sequential simplex method has proven to be an efficient algorithm for Evolutionary Optimization (EVOP) experiments. This is particularly true when the number of factors is large, since the initial simplex only requires one more trial than the number of factors. When using the sequential simplex method in manufacturing, it is common to have a small initial simplex to avoid disrupting the manufacturing process. When using it in engineering and research, it is common to have a large initial simplex that collapses to the optimum condition. This paper presents specific rules for the two cases of a small initial simplex and a large initial simplex that improve efficiency. Comparisons are made with the classic variable step size sequential simplex method.
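
    To make the moves of a variable step size simplex concrete, here is a minimal sketch of the reflect/expand/contract logic on a toy two-factor response surface; the toy objective, the step-size multipliers and the fixed iteration count are illustrative assumptions, not the specific rules proposed in the paper.

    ```python
    # Minimal sketch of a variable step size sequential simplex (reflect, expand,
    # contract) on a toy two-factor response surface. All numbers here are
    # illustrative assumptions, not the rules proposed in the paper.
    import numpy as np

    def toy_response(x):
        # Hypothetical response to be minimised; optimum at (2, -1).
        return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

    def sequential_simplex(f, initial_simplex, iterations=50):
        simplex = [np.asarray(v, dtype=float) for v in initial_simplex]
        for _ in range(iterations):
            simplex.sort(key=f)                       # best vertex first, worst last
            best, worst = simplex[0], simplex[-1]
            centroid = np.mean(simplex[:-1], axis=0)  # centroid of all but the worst vertex
            reflected = centroid + (centroid - worst)
            if f(reflected) < f(best):                # very good: try a larger step (expansion)
                expanded = centroid + 2.0 * (centroid - worst)
                simplex[-1] = expanded if f(expanded) < f(reflected) else reflected
            elif f(reflected) < f(simplex[-2]):       # acceptable: keep the reflection
                simplex[-1] = reflected
            else:                                     # poor: take a smaller step (contraction)
                simplex[-1] = centroid + 0.5 * (worst - centroid)
        return min(simplex, key=f)

    # Initial simplex: one more trial than the number of factors (2 factors -> 3 vertices).
    start = [[0.0, 0.0], [0.5, 0.0], [0.0, 0.5]]
    print(sequential_simplex(toy_response, start))
    ```
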
  • Problem Definition Using 4W & 1H

    Authors: Jonathan Smyth-Renshaw (University of Liverpool), Iain Reid (University of Liverpool)
    Primary area of focus / application: Education & Thinking
    Keywords: 5W&1H, problem definition, Six Sigma, G8D, A3 thinking, brainstorming, 5 why
    Submitted at 2-Apr-2012 20:17 by Jonathan Smyth-Renshaw
    Accepted
    10-Sep-2012 11:00 Problem Definition Using 4W & 1H
    Purpose – The purpose of this paper is to explore the dynamics of Root Cause Analysis (RCA) in the context of Six Sigma and the applicability of the 5W+1H (What, Why, When, Where, Who, How) technique, which is used by many managers to understand a problem in order to define the root cause.

    Design/methodology/approach – The research integrates principles of a traditional literature review with a reflective inquiry of a practitioner.

    Findings – The 5W+1H methodology is insufficient for identifying the root cause, owing to the variations triggered by asking the question ‘why’. The paper demonstrates that some extraordinary RCA was achieved by redefining the approach of the 5W+1H methodology, as catastrophic failures were often the result of misinterpreting the ‘why’ question. Consequently, the paper identifies a new domain that can be added to traditional RCA and Six Sigma projects.

    Originality/value – The paper explores a different perspective on problem definition in RCA which does not require a predetermined standard. It provides a specific example and suggestions to help practitioners avoid expensive contingency plans while conducting RCA investigations using the refined 4W+1H approach. Questioning the principles of RCA through a process of reflective inquiry benefits both practitioners and academics.
  • Physical versus Statistical Assumptions in Modelling: A Study of Maintenance

    Authors: Chris McCollin (Nottingham Trent University), Shirley Coleman (University of Newcastle upon Tyne)
    Primary area of focus / application: Reliability
    Keywords: identifying physical structure, Exploratory Data Analysis, air-conditioners, submarine diesel generators, bus powertrains, ambassador limousines
    Submitted at 5-Apr-2012 12:59 by Chris McCollin
    Accepted
    11-Sep-2012 11:30 Physical versus Statistical Assumptions in Modelling: A Study of Maintenance
    We discuss physical assumptions versus statistical assumptions, illustrated by reliability data which, if modelled while ignoring the physical assumptions (which are more important), gives incorrect results. The importance of physical assumptions extends to each stage of physics, systems and service, identifying the modelling that is being done (or should be done), e.g. quantum mechanics and statistical physics, or service quality statistics. There may be very little in the way of physical principles of service delivery, but more may come to light. Jim Morrison pursued a lifelong mission to include awareness, understanding and utilisation of physical knowledge when applying statistical methods in his work on statistical engineering (Morrison, 2009). Bokov (2007) noted the benefit of combining heuristic (i.e. physical) models with data-dependent models to give a combined model which was much more accurate. Regarding service quality, Stella Cunliffe's obituary (The Times, 31 January 2012; RSS President 1975–77) noted that her greatest contribution to the smooth working of Guinness, where she was a statistician for 20 years, was the way she combined knowledge of human propensities with physical measurements of beer in barrels to produce more sensitive quality control.

    In a different way, data mining also seeks to include physical assumptions by acknowledging that they exist but may be unknown. Knowing that there can be many reasonable models (as Box says, all are wrong but some are useful), methods such as random forests and bagging produce an improved model by combining multiple alternative models, for example applying PCA to the coefficients of the alternative models to produce a superior compromise. The approach here is to use simple statistical models and plots to identify structure which can be explained in the context of the physical system under study.
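
    As a concrete illustration of the model-combination idea mentioned above, the sketch below bags a simple straight-line fit over bootstrap resamples and averages the fitted coefficients; the synthetic data and the plain averaging (used here in place of the PCA-based combination mentioned in the abstract) are illustrative assumptions, not the methods of the talk.

    ```python
    # Minimal sketch of bagging: fit one model per bootstrap resample and combine
    # them by averaging the fitted coefficients. The synthetic data and the choice
    # of a straight-line model are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=60)
    y = 2.0 * x + 1.0 + rng.normal(scale=3.0, size=60)   # noisy synthetic data

    coefs = []
    for _ in range(200):
        idx = rng.integers(0, x.size, size=x.size)       # bootstrap sample (with replacement)
        coefs.append(np.polyfit(x[idx], y[idx], deg=1))  # slope, intercept of one resampled fit
    bagged_slope, bagged_intercept = np.mean(coefs, axis=0)
    print(f"bagged fit: y = {bagged_slope:.2f} x + {bagged_intercept:.2f}")
    ```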

    Kansei Engineering seeks to bring awareness of emotional factors into the design process. It shows that there are considerations beyond the physical assumptions, and that you can get the physical attributes right but still miss the hidden spark which makes one person desire a product while another is not attracted to it at all.
    We note the progression from statistical assumptions, to physical assumptions, to emotional responses. There are parallels with the progression from Data to Information to Knowledge to Wisdom. The missing link at the beginning of both of these is system description. One of the main reasons why Six Sigma is so successful is that it encases statistical problem solving in definition, control and evaluation. Implementation issues and the need to include on-site and system knowledge are key to successful statistical modelling.
    SPC is a common thread through most of the recent quality/process improvement initiatives. SPC is more likely to be successful if it is combined with process knowledge. However, it is interesting that a main feature of SPC is that it is implemented to encourage people to “leave the process alone” and not use their process knowledge to try to improve the process. The resolution of this apparent contradiction lies in understanding variation: using process knowledge to diminish special causes whilst using SPC self-restraint to avoid tweaking and adding to the natural variation. SPC requires a compromise whereby physical assumptions and workplace know-how are included but transient emotional tweaks are prohibited. The approach taken here with each of the data sets studied draws on what SPC is very good at: identifying non-random structure, i.e. outlier detection (submarine diesel generators), event groupings (bus powertrains), stability in the mean (ambassador limousines) and trend and seasonality (aircraft air-conditioners).
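
    As a concrete example of using SPC to flag non-random structure, the sketch below builds an individuals control chart with moving-range limits and flags points that fall outside them; the data are synthetic placeholders, not the submarine, bus, limousine or air-conditioner data sets analysed in the talk.

    ```python
    # Minimal sketch of an individuals (X) control chart used for outlier
    # detection, e.g. on times between failures. The data below are synthetic
    # placeholders, not the data sets from the talk.
    import numpy as np

    times_between_failures = np.array([12., 15., 11., 14., 13., 40., 12., 16., 14., 13.])

    centre = times_between_failures.mean()
    moving_range = np.abs(np.diff(times_between_failures))
    sigma_hat = moving_range.mean() / 1.128          # d2 constant for a moving range of 2
    ucl = centre + 3.0 * sigma_hat
    lcl = max(centre - 3.0 * sigma_hat, 0.0)

    outliers = np.where((times_between_failures > ucl) | (times_between_failures < lcl))[0]
    print(f"Control limits: [{lcl:.1f}, {ucl:.1f}], flagged points: {outliers}")
    ```
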
  • Use of Statistics in Pharmaceutical Industry

    Authors: Alain Poncin (ProtAffin Biotechnologie)
    Primary area of focus / application: Process
    Keywords: Biopharmaceutical, Quality by Design, Modeling, Optimisation
    Submitted at 5-Apr-2012 14:16 by Alain Poncin
    Accepted (view paper)
    10-Sep-2012 10:40 Use of Statistics in Pharmaceutical Industry
    Since publication of “Pharmaceutical cGMP for the 21st Century – A Risk-Based Approach” by the FDA in 2004, statistics, including multivariate analysis and design of experiments, are expected to be used throughout the life cycle of a pharmaceutical product as part of “Quality by Design”.
    The list of guidelines published since that date is impressive and confirms the requirement for more “science” to provide patients with affordable, high-quality drugs.
    The goal of multivariate analysis and DoE is to define a “Design Space” in which the manufacturer will be allowed to move.
    Industry scepticism about increased regulatory latitude and the need to change has held back the introduction of Quality by Design and, except for some champions mainly in Big Pharma, these new concepts are still poorly used.
    I have been using “QbD” for more than 25 years as a competitive advantage to improve quality and efficiency. The intensive use of statistics has allowed me to develop robust production processes for more than 70 different proteins, now mainly in late clinical phases (Phase IV included). The modelling and optimisation of the refolding of a protein (14 potentially significant factors) will be used to illustrate the integrated use of DoE, ANN and other statistical tools such as Monte Carlo simulation, as sketched below.
    Our work is not only to carry out statistical analysis but also to train and empower staff to succeed in implementing Quality by Design.
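
    The sketch below illustrates the kind of workflow the abstract describes: a small two-level factorial design, an empirical model fitted to the runs, and a Monte Carlo simulation to see how variation around a chosen set point propagates to the response. The factor count, coefficients and noise levels are illustrative assumptions, not the refolding study itself.

    ```python
    # Minimal sketch of a DoE + Monte Carlo workflow: two-level factorial design,
    # least-squares model fit, then propagation of factor variation to the
    # response. All numbers are illustrative assumptions.
    import itertools
    import numpy as np

    rng = np.random.default_rng(1)

    # 2^3 full factorial design in coded units (-1/+1) for three hypothetical factors
    design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

    # Hypothetical measured yields for each run
    yields = (50 + 8 * design[:, 0] - 5 * design[:, 1] + 3 * design[:, 2]
              + rng.normal(scale=1.0, size=len(design)))

    # Fit a first-order model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares
    X = np.column_stack([np.ones(len(design)), design])
    coef, *_ = np.linalg.lstsq(X, yields, rcond=None)

    # Monte Carlo: propagate random variation around a set point through the model
    setpoint = np.array([1.0, -1.0, 1.0])
    draws = setpoint + rng.normal(scale=0.1, size=(10_000, 3))
    predicted = coef[0] + draws @ coef[1:]
    print(f"Predicted yield at set point: mean {predicted.mean():.1f}, sd {predicted.std():.2f}")
    ```
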
  • Short-term Prediction of Dangerous High-water Levels at a Dutch Storm Surge Barrier

    Authors: Alessandro Di Bucchianico (Eindhoven University of Technology), Krijn Saman (Dutch Ministry of Transport, Public Works and Water Management), Jan-Rolf Hendriks (Dutch Ministry of Transport, Public Works and Water Management)
    Primary area of focus / application: Modelling
    Keywords: short-term prediction, water level, oscillation, time series, trend
    Submitted at 5-Apr-2012 15:24 by Alessandro Di Bucchianico
    Accepted (view paper)
    10-Sep-2012 10:40 Short-term Prediction of Dangerous High-water Levels at a Dutch Storm Surge Barrier
    After the 1953 flood in the South-West Netherlands, which killed almost 2,000 people, the Dutch government initiated the Delta Plan, a large-scale protection system consisting of dikes and dams. An important part of the Delta Plan is the Eastern Scheldt Storm Surge Barrier, a bridge-like construction that can be closed when water levels become too high.
    When water levels are above 275 cm, a Decision Team is physically present at the barrier to close it at the right moment. In order to keep flooding risk at acceptable levels, the barrier closes automatically and takes over full control when the predicted water level reaches 300 cm. In order to avoid environmental damage, the barrier is not allowed to close at predicted water levels below 300 cm. It is thus important for the Decision Team to avoid unnecessary closures of the barrier as well as to avoid the barrier taking over full control. The Decision Team has access to several long-term water level predictions. As an addition to these existing predictions, we developed short-term prediction models. Our models are capable of handling the many oscillations in the water levels (due to wind, rain and the geometrical constraints of the Eastern Scheldt) and use information from several locations.
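
    A minimal sketch of a short-term predictor in this spirit is shown below: a local linear trend plus a tidal-like sinusoidal term fitted by least squares to recent observations and extrapolated one hour ahead. The sampling interval, oscillation period and synthetic water levels are illustrative assumptions, not the operational model used at the barrier.

    ```python
    # Minimal sketch of a short-term water-level predictor: local trend plus a
    # sinusoidal (tidal-like) oscillation fitted by least squares and
    # extrapolated a few steps ahead. The data and period are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(120.0)                      # e.g. one observation per 10 minutes
    period = 74.5                             # ~12.4 h tidal period in 10-minute steps
    level = (150 + 0.4 * t + 80 * np.sin(2 * np.pi * t / period)
             + rng.normal(scale=5, size=t.size))

    # Fit level ~ trend + sin/cos terms by ordinary least squares
    X = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, level, rcond=None)

    # Extrapolate the next 6 steps (one hour ahead)
    t_new = np.arange(t[-1] + 1, t[-1] + 7)
    X_new = np.column_stack([np.ones_like(t_new), t_new,
                             np.sin(2 * np.pi * t_new / period),
                             np.cos(2 * np.pi * t_new / period)])
    forecast = X_new @ coef
    print(np.round(forecast, 1))
    ```
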
  • Cause and Effect

    Authors: Roland Caulcutt (Caulcutt Associates)
    Primary area of focus / application: Education & Thinking
    Keywords: cause, effect, decisions, conscious, unconscious, brain
    Submitted at 5-Apr-2012 15:38 by Roland Caulcutt
    Accepted
    10-Sep-2012 10:40 Cause and Effect
    Einstein defined madness as “continuing to do what you have always done in the past, but expecting to get better results in the future”. It would be unwise to ignore these words if you wish to improve the performance of a business process. Improvement will not occur until after you change the process in some way.
    Of course, we don’t need Einstein to tell us that an inappropriate change could have a deleterious effect on business performance. So, it is desirable that the proposed changes should be based on a sound understanding of recent process performance. In particular, any predictions of future performance should be based on a cause-and-effect explanation of what has happened in the past.
    An avalanche of brain research has recently shed light on how people make judgements on causes and effects. This research suggests that a statistician is likely to use certain processes in the conscious brain whilst carefully assessing evidence before drawing conclusions, whereas a manager is likely to reach a conclusion much more quickly, using processes in the unconscious brain. The manager has neither the time nor the ability to behave like a statistician.
    A blackbelt, a greenbelt or any other improvement specialist has training in statistics and technical knowledge of the business process. How can we ensure that the training will help him/her to draw conclusions about cause and effect, using the conscious brain?