ENBIS-14 in Linz

21–25 September 2014; Johannes Kepler University, Linz, Austria
Abstract submission: 23 January – 22 June 2014


The following abstracts have been accepted for this event:

  • Design of Experiments for Scenario and Risk Assessment

    Authors: Winfried Theis (Shell Global Solutions International B.V.)
    Primary area of focus / application: Business
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: Design of Experiments, Simulation, Scenario, Risk assessment
    Submitted at 30-May-2014 20:11 by Winfried Theis
    Accepted
    23-Sep-2014 14:40 Design of Experiments for Scenario and Risk Assessment
    At Shell we have to make many decisions under uncertainty, as investment decisions have to be assessed over very long timeframes. A typical large investment decision, such as building extra equipment into a refinery, requires investments of millions of dollars and therefore has to hold for a long time, often decades of service. But even more short-term decisions are influenced by market fluctuations: buying a certain crude for a refinery, for instance, requires considering the potential products and their prices by the time the crude actually arrives at the refinery. For all these decisions Shell uses optimization and models of the assets. These models do not contain any random components, and until recently they were only checked against worst/best-case scenarios. Therefore, in recent years it was suggested to introduce Monte Carlo methods into the process. First tests showed that building in randomness correctly is tricky, and that users felt confused by the outputs and unsure whether they were missing crucial information.
    We have now created a tool that introduces statistical Design of Experiments into the process to drive the exploration of the complex models and to investigate the impact of the uncertainties on the stability of the optimal solutions. Essentially, the DuU (Decisions under Uncertainty) tool runs as a staged process: in the first phase, the most influential variables are determined by exploring the space with slightly modified screening designs or space-filling designs. In the second phase, the design space is reduced to the most important variables found, and space-filling designs are then used to create a number of scenarios. When the question concerns the investment decisions over time described above, several sorts of random processes can be applied; when there is no time component, simple random distributions are used.
    We will present the tool and show the insights gained from a practical example.
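
    A minimal sketch of such a two-phase exploration in Python (the toy asset model, the random two-level screening matrix, the design sizes, and the plain main-effects screening are illustrative assumptions, not the DuU tool's actual implementation):

        # Phase 1: screen many inputs with a two-level design; Phase 2:
        # space-fill over the few influential ones (illustrative sketch).
        import numpy as np
        from scipy.stats import qmc

        rng = np.random.default_rng(0)

        def asset_model(X):
            # Stand-in for the deterministic asset/optimization model (assumption).
            return 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * X[:, 2:].sum(axis=1)

        n_vars = 8
        # Two-level screening matrix (in practice a fractional factorial
        # or Plackett-Burman design would be used here).
        X_screen = rng.choice([-1.0, 1.0], size=(32, n_vars))
        effects = np.abs(X_screen.T @ asset_model(X_screen)) / 32  # main effects
        important = np.argsort(effects)[-2:]          # keep the strongest inputs

        # Phase 2: Latin hypercube over the reduced space to create scenarios.
        sampler = qmc.LatinHypercube(d=len(important), seed=0)
        scenarios = qmc.scale(sampler.random(n=100), [-1, -1], [1, 1])
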
  • SPC for Condition Monitoring Using Irregularly Sampled Data

    Authors: Bjarne Bergquist (Luleå University of Technology), Peter Söderholm (The Swedish Transport Administration)
    Primary area of focus / application: Reliability
    Keywords: Statistical process control (SPC), Irregular sampling, Interpolation, Linear assets, Maintenance
    Submitted at 30-May-2014 21:37 by Bjarne Bergquist
    Accepted
    22-Sep-2014 16:20 SPC for Condition Monitoring Using Irregularly Sampled Data
    Control charts have the potential to signal if processes or products are out of control. Hence, control charts can be used to monitor the condition of items and to signal the presence of faulty states and failure events that require maintenance actions. In the presentation, a case is discussed where control charts are used for condition assessment of railway tracks. The control charts aim to detect faulty states of the railway track and to give early warnings when the deterioration rate is high. The data are sampled by measurement wagons that pass the track sections a given number of times each year, but the measurement wagon's passages are irregular, resulting in sampling at varying intervals. Hence, one obstacle to using control charts in this application is the irregularity of the obtained data. Different approaches to analyzing the irregularly sampled data, such as ignoring the time differences or using interpolation methods such as kriging, splines, or nearest-neighbour interpolation, are studied and discussed.
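
    A minimal sketch of one of the discussed approaches in Python, assuming the irregular inspection times are first interpolated onto a regular grid before standard individuals-chart limits are applied (the data and chart constants are illustrative):

        # Interpolate irregularly sampled track-geometry data onto a regular
        # grid, then apply an individuals control chart (illustrative sketch).
        import numpy as np
        from scipy.interpolate import CubicSpline

        t = np.array([0.0, 1.2, 2.9, 4.1, 6.0, 8.5])  # irregular inspection times
        y = np.array([1.0, 1.1, 0.9, 1.3, 1.2, 1.8])  # track-geometry measure

        t_grid = np.arange(0.0, 8.6, 0.5)             # regular monitoring grid
        y_grid = CubicSpline(t, y)(t_grid)            # spline interpolation
        # Nearest-neighbour alternative:
        # y_grid = y[np.abs(t[:, None] - t_grid).argmin(axis=0)]

        mr = np.abs(np.diff(y_grid))                  # moving ranges
        sigma_hat = mr.mean() / 1.128                 # d2 constant for n = 2
        ucl = y_grid.mean() + 3 * sigma_hat
        lcl = y_grid.mean() - 3 * sigma_hat
        alarms = (y_grid > ucl) | (y_grid < lcl)
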
  • Process Capability Indices for Bounded Distributions: The Dry-Etching Semiconductor Case-Study

    Authors: Riccardo Borgoni (Bicocca University of Milan), Laura Deldossi (Cattolica University of Milan), Diego Zappa (Cattolica University of Milan)
    Primary area of focus / application: Quality
    Keywords: Capability indices, Rational subgroups, Spatial monitoring grid, Asymmetric tolerances
    Submitted at 30-May-2014 23:25 by Diego Zappa
    Accepted
    Capability indices are well-known tools for assessing the mean and dispersion performance of a process with respect to both a target and specification limits. A steady flow of contributions is available in the literature for both the univariate and the multivariate case.
    Most of these references refer to processes with two-sided specification limits, but in many applications only a one-sided specification limit is needed.
    That is the case for the so-called dry-etching phase in semiconductor manufacturing processes.
    Lovelace and Swain (2009) and Albing and Vännman (2009) have proposed appropriate generalizations of the most common indices: the former assume that data come from a (below-)truncated lognormal distribution, while the latter propose a parametrically more flexible index along with a significance test. The latter proposal is attractive because of its flexibility but suffers from some problems in the estimation of key parameters, while the former benefits from a relatively simple parametric assumption.
    We will mainly consider the first proposal, for two reasons. First, on the basis of a Monte Carlo simulation, the authors show that the proposal is fairly robust to departures from log-normality. Second, their proposal admits truncation and assigns a key role to rational subgroups.
    That is our case: since chipsets are built by appropriately splitting wafers into smaller pieces, process capability indices should reflect the precision of production at the wafer stage. Differences among wafers are reflected by averaging the precision of the production process, including specific factors that, for the purpose of the capability computation, are assumed absent within a single unit.
    The uniqueness of this case study is that, when data are filtered above a pre-fixed threshold, all wafers with at least one measurement below it cannot be used as rational subgroups and must be excluded from the computation of the capability indices.
    This strategy may cause the rejection of many wafers, depending on the value of the chosen threshold.
    To avoid this, we propose to compute sample statistics for each subset of a nine-point sample grid. For each possible threshold and each combination, boxplots are displayed, and capability is computed conditionally on the dimension of the sample grid by averaging the indices over the dimension of the rational subgroups used in the computations.
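
    A minimal sketch in Python of a one-sided (upper-specification) capability index under a lognormal assumption, with wafers filtered on a threshold and the index averaged over the retained rational subgroups (an illustration of the general idea, not the exact Lovelace-Swain estimator or the authors' grid procedure):

        # One-sided capability index on the log scale, per wafer, averaged
        # over wafers that survive the threshold filter (illustrative sketch).
        import numpy as np

        def cpu_lognormal(x, usl):
            # (ln USL - mean of ln x) / (3 * sd of ln x)
            z = np.log(x)
            return (np.log(usl) - z.mean()) / (3 * z.std(ddof=1))

        def capability(wafers, usl, threshold):
            # Discard wafers with any of their nine grid measurements at or
            # below the threshold (they cannot serve as rational subgroups).
            kept = [w for w in wafers if w.min() > threshold]
            return np.mean([cpu_lognormal(w, usl) for w in kept]), len(kept)

        rng = np.random.default_rng(1)
        wafers = [rng.lognormal(0.0, 0.2, size=9) for _ in range(50)]  # 9-point grid
        index, n_kept = capability(wafers, usl=3.0, threshold=0.5)
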
  • Run-to-Run Profile Monitoring on Sensor Temporal Data for Tool Condition Diagnosis

    Authors: Jakey Blue (École des Mines de Saint-Étienne), Jacques Pinaton (STMicroelectronics), Agnès Roussy (École des Mines de Saint-Étienne)
    Primary area of focus / application: Process
    Secondary area of focus / application: Reliability
    Keywords: Run-to-Run variation, Profile monitoring, SPC (Statistical Process Control), Tool condition hierarchy, Tool fault diagnosis
    Submitted at 31-May-2014 01:17 by Jakey Blue
    Accepted (view paper)
    22-Sep-2014 15:40 Run-to-Run Profile Monitoring on Sensor Temporal Data for Tool Condition Diagnosis
    In addition to the R&D advancement in semiconductor technologies, IC makers also have to invest huge capital in equipment procurement and in the implementation of IT systems to meet continuously growing demands. It is therefore critical to optimize tool utilization and the dynamic control plans. Nowadays, the common practice for tool condition monitoring is to implement a Fault Detection and Classification (FDC) system, from which hundreds of control charts are generated and monitored to detect shifts/drifts in the FDC data. Clearly, this is not an efficient approach.

    In 2012, a hierarchical tool condition monitoring scheme was proposed for efficient detection and diagnosis of tool faults [1]. The overall tool condition at the top of the hierarchy provides a general view of tool behavior. The hierarchy then enables an intuitive breakdown of an anomaly detected in the overall tool condition into the sensor groups. However, to make the hierarchy more applicable, the drill-down diagnosis should be extended to the sensor level [2]. Therefore, a temporal run-to-run variation measure at the sensor level is developed in this research.

    The idea of run-to-run variation is to capture the local changes (variances) between two consecutive wafers over time. Four critical steps are needed to complete it (a minimal sketch follows the list):
    1. process recipe classification;
    2. FDC temporal profiles synchronization;
    3. run-to-run variation calculation; and
    4. SPC monitoring on R2R variations.
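
    A minimal sketch of steps 2-4 in Python: synchronize variable-length sensor profiles to a common index, compute the R2R variation between consecutive wafers, and chart it (the linear resampling, the Euclidean distance, and the Shewhart-style limit are illustrative assumptions, not the authors' exact method):

        # Steps 2-4 in miniature (illustrative sketch of the general idea).
        import numpy as np

        def synchronize(profile, n_points=100):
            # Step 2: resample a temporal FDC profile onto a fixed-length index.
            src = np.linspace(0.0, 1.0, len(profile))
            return np.interp(np.linspace(0.0, 1.0, n_points), src, profile)

        def r2r_variation(profiles):
            # Step 3: distance between consecutive synchronized profiles.
            synced = np.array([synchronize(p) for p in profiles])
            return np.linalg.norm(np.diff(synced, axis=0), axis=1)

        rng = np.random.default_rng(2)
        lengths = rng.integers(80, 120, size=30)      # runs of varying duration
        profiles = [np.sin(np.linspace(0, 3, n)) + rng.normal(0, 0.05, n)
                    for n in lengths]

        v = r2r_variation(profiles)
        ucl = v.mean() + 3 * v.std(ddof=1)            # Step 4: chart the variation
        alarms = v > ucl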

    Through case validation with real data collected from our industrial partner, the R2R variation indeed enables correlating tool faults with the effectiveness of single sensors. By combining the proposed R2R variation with the tool condition hierarchy, tool condition monitoring becomes efficient and tool fault diagnosis can proceed systematically top-down.

    [1] J. Blue, D. Gleispach, A. Roussy, and P. Scheibelhofer, “Tool Condition Diagnosis With a Recipe-Independent Hierarchical Monitoring Scheme,” IEEE Transactions on Semiconductor Manufacturing, vol. 26, no. 1, pp. 82-91, 2013.
    [2] J. Moyne, N. Ward, and P. Hawkins, “Leveraging Advanced Process Control (APC) Technology in Developing Predictive Maintenance (PdM) Systems,” in Proceedings of 24th Advanced Semiconductor Manufacturing Conference (ASMC), Saratoga Springs, USA, May, 2012, pp. 221-226.
  • Makin’ ‘em Do DoE…

    Authors: Stefanie Feiler (AICOS Technologies AG), Philippe Solot (AICOS Technologies AG)
    Primary area of focus / application: Education & Thinking
    Secondary area of focus / application: Design and analysis of experiments
    Keywords: Education, Consulting, Statistical Design of Experiments (DoE), Quality, Psychology
    Submitted at 31-May-2014 01:38 by Stefanie Feiler
    Accepted (view paper)
    23-Sep-2014 16:40 Makin’ ‘em do DoE…
    Although DoE (statistical design of experiments) is the appropriate methodology for conducting experiments, there are still many experts in their fields who do not believe so.
    So what can you do if you want to promote the use of DoE? What are the psychological barriers to overcome? Which strategies can be used for convincing somebody to try out DoE? How to sell education in DoE and statistics?
    The aim of the talk is to provide a thorough overview of different approaches to advertising and implementing DoE – but we are particularly looking forward to a lively discussion where you can share your own experiences!

    Reference: The starting point of the talk is a discussion on this topic on LinkedIn, initiated by ENBIS member Phil Kay.
  • Customized Fetal Growth Modeling and Monitoring - A Statistical Process Control Approach

    Authors: Diamanta Benson-Karhi (The Open University of Israel), Haim Shore (Ben-Gurion University of the Negev), Maya Malamud (Ben-Gurion University of the Negev), Asher Bashiri (Soroka University Medical Center and Ben-Gurion University of the Negev)
    Primary area of focus / application: Process
    Secondary area of focus / application: Modelling
    Keywords: Customized fetal growth modeling and monitoring, Least absolute deviation, Nonlinear profiles, Response modeling methodology, Short runs, Statistical process control
    Submitted at 31-May-2014 09:49 by Diamanta Benson-Karhi
    Accepted
    23-Sep-2014 10:55 Customized Fetal Growth Modeling and Monitoring—A Statistical Process Control Approach
    A statistical process control (SPC)-based methodology is developed to detect early deviations of fetal biometry from expected normal growth. Using response modeling methodology (RMM), a fetal growth model is dynamically estimated and integrated into a regression-adjusted SPC scheme, based on a new median control chart and a control chart for residual variation. Hadlock's reference centiles are also integrated into the monitoring scheme. Longitudinal data from normal pregnancies and from pregnancies with adverse medical outcomes have been analyzed. Results show that a non-smooth growth trajectory, expressed in exceptionally large absolute deviations from predicted median values, is a good precursor of possible prenatal adverse outcomes.
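
    A minimal sketch in Python of the monitoring step only, assuming a predicted median growth curve is already available; the RMM model and the authors' median and residual-variation charts are replaced here by a generic individuals-type scheme on deviations from the predicted medians (the curve, data, and limits are illustrative):

        # Flag exceptionally large absolute deviations of observed biometry
        # from a predicted median growth curve (illustrative sketch).
        import numpy as np

        def ucl_from_baseline(baseline_dev, k=3.0):
            # Phase I: estimate the chart limit from in-control deviations.
            mr = np.abs(np.diff(baseline_dev))         # moving ranges
            sigma_hat = mr.mean() / 1.128              # d2 constant for n = 2
            return baseline_dev.mean() + k * sigma_hat

        ga = np.array([20.0, 24.0, 28.0, 32.0, 36.0])  # gestational age (weeks)
        predicted = 0.8 * ga - 3.0                     # stand-in median curve
        observed = predicted + np.array([0.1, -0.2, 0.1, 1.5, 0.2])
        dev = np.abs(observed - predicted)             # deviations from medians
        ucl = ucl_from_baseline(np.array([0.1, 0.2, 0.15, 0.1, 0.2]))
        flags = dev > ucl                              # flags the 1.5 deviation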