"Six Sigma: What's Beyond the Hype?"
According to the press, Six Sigma has been the most successful business improvement initiative in history. However, it is also undoubtedly the most hyped statistics-based improvement initiative in history. So the question remains: what's beyond the hype? This talk will briefly review the history of Six Sigma, including a frank discussion of what it has and hasn't accomplished. It will present the "real" Six Sigma, without the hype and fanfare, discussing both its unique advantages and its limitations. At its core, Six Sigma is neither a metric nor a collection of statistical tools, but rather an overall methodology and infrastructure for improvement. The session will discuss current trends in Six Sigma, including its incorporation into curricula at several leading academic institutions. Lastly, it will suggest potential future directions of the initiative, i.e., what might be the "next big thing" after Six Sigma.
Roger Hoerl leads the Applied Statistics Lab at GE Global Research, which consists of fourteen statisticians who support new product and service development in each of the GE businesses. He is a Certified Master Black Belt and previously served as Quality Leader for the GE Corporate Audit Staff. Dr. Hoerl has been named a Fellow of the American Statistical Association and the American Society for Quality, and has been elected to the International Statistical Institute. He has received the Brumbaugh and Hunter Awards from the American Society for Quality, and the Founders Award from the American Statistical Association. His introductory text Statistical Thinking: Improving Business Performance, co-authored with Ron Snee, was described by the journal Technometrics as "probably the most practical basic statistics textbook that has ever been written within a business context." He is coauthor of Leading Six Sigma: A Step-by-Step Guide Based on Experience With GE and Other Six Sigma Companies, and most recently wrote (also with Ron Snee) Six Sigma Beyond the Factory Floor: Deployment Strategies for Financial Services, Healthcare, and the Rest of the Real Economy, published by Financial Times/Prentice Hall in 2005.
"Modern Process Chemometrics"
Process chemometrics is a subdiscipline of chemometrics dealing with the analysis and modeling of process data. The goals of modeling process data can be manifold: process optimization / improvement, process monitoring, process control or process understanding. The approach in process chemometrics is usually multivariate. This is due to the characteristics of most processes: they generate multivariate dynamic data. Very popular statistical techniques in process chemometrics are those based on dimensionality reduction, e.g. principal component analysis, partial least squares regression and multiway analysis. Such methods have been shown to be powerful in analyzing process data and providing process understanding. Recently, due to the improvement of analytical chemistry methods, very sophisticated tools have become available for probing processes. For instance, metabolomics approaches can be used to provide fundamental understanding of biotechnological processes. This opens the way to rational process optimization. In this lecture, an overview will be given of several process chemometrics approaches based on the ideas of dimension reduction. This will be illustrated with examples from chemical, pharmaceutical and biotechnological processes.
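As a minimal illustration of the dimension-reduction idea the abstract describes — not the speaker's own implementation — the sketch below runs principal component analysis via the SVD on synthetic multivariate "process" data generated from two latent drivers (all data and dimensions are hypothetical):

```python
import numpy as np

# Synthetic process data: 200 time points, 10 measured variables,
# driven by only two underlying latent process factors plus noise.
rng = np.random.default_rng(0)
scores_true = rng.normal(size=(200, 2))      # two hidden process drivers
loadings_true = rng.normal(size=(2, 10))     # how drivers map to 10 sensors
X = scores_true @ loadings_true + 0.1 * rng.normal(size=(200, 10))

Xc = X - X.mean(axis=0)                      # mean-center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)              # variance captured per component

T = Xc @ Vt[:2].T                            # scores in the 2-D reduced space
print(f"first two components explain {explained[:2].sum():.1%} of the variance")
```

Because the ten variables are generated from two latent factors, almost all of the variance is captured by two components, which is exactly the situation in which score plots and reduced-space monitoring of process data are effective.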
From 1993 to 2004, Age K. Smilde was a full professor of Process Analysis and Chemometrics. During that period he worked closely with chemical engineers and statisticians in the field of process chemometrics. Since January 2004 he has been a part-time professor of biosystems data analysis at the Swammerdam Institute for Life Sciences at the University of Amsterdam. He is also programme manager of analytical information sciences at TNO Quality of Life in the Netherlands. He holds an MSc in econometrics / statistics and a PhD in analytical chemistry. His current research interest focuses on building methods for analyzing dynamic complex data from biological systems and processes to support systems biology research. Such methods should be able to incorporate nonlinear dynamics, multilevel structures (e.g. data from different biological compartments) and a priori biological information (grey models). He has published more than 130 peer-reviewed papers and was the Editor-Europe of the Journal of Chemometrics from 1994 to 2002. He chaired the 1996 Gordon Research Conference on Statistics in Chemistry & Chemical Engineering. Together with Rasmus Bro and Paul Geladi, he co-authored the book 'Multi-way Analysis' (Wiley; 2004).
Dr. Godfrey will talk during the Juran Session (Session 7.3 on Wednesday morning)
Dr. A. Blanton Godfrey (Blan) is Dean and Joseph D. Moore Professor of Textile & Apparel Technology & Management, College of Textiles, North Carolina State University. The College is the leading institution of its type in the world and produces over half the doctorates in its field in the United States. Prior to joining NC State on July 1, 2000, Blan was Chairman and Chief Executive Officer of Juran Institute, Inc., the leading international management consulting, research, and training organization focused on quality management and business excellence, a position he held for thirteen years. Prior to joining Juran Institute, Blan was Head of the Quality Theory and Technology Department of AT&T Bell Laboratories (now Lucent Technologies, Bell Labs Innovations). The department focused on applied research in the areas of quality management and technology, reliability and productivity. Blan joined Bell Labs in 1973 after receiving an MS and PhD in Statistics from Florida State University and a BS in Physics from Virginia Tech.
For nineteen years Blan was also an Adjunct Professor in Columbia University's School of Engineering and Applied Science where he taught graduate courses in quality management and control. He has also been a guest lecturer in clinical quality management at Harvard University.
Blan is a Fellow of the American Statistical Association, the American Society for Quality, the World Academy of Productivity Sciences, and the Royal Society of Arts, Manufactures and Commerce. He is an elected member of Sigma Xi, the New York Academy of Science and The Fiber Society, and an academician of the International Academy for Quality. He is also listed in Who's Who in America. He has published over 200 articles and book chapters and co-authored or co-edited five books, including Modern Methods for Quality Control and Improvement and Curing Health Care: New Strategies for Quality Improvement. The first edition of Modern Methods was named "Book of the Year" by the Institute of Industrial Engineers, and the second edition was published last December. He is the co-editor (with Dr. Joseph M. Juran) of Juran's Quality Handbook, Fifth Edition, published in March 1999. The Spanish edition of the handbook was published in 2000, and the Chinese edition in 2003. The Japanese edition and a paperback edition of Curing Health Care were published in 2002.
In 1992 the American Society for Quality Control presented Blan the Edwards Medal for his outstanding contributions to the science and practice of quality management. Blan received the Distinguished Graduate Award from Florida State University's College of Arts and Sciences in 1993. He was also named a "Grad Made Good" by the FSU student members of ODK, a campus leadership fraternity that each year selects three former students they admire for career accomplishments. Blan's research interests include statistical graphics, quality and productivity management, strategic deployment, mistake proofing, and applied statistics. In 2001 he became the founding editor of Six Sigma Forum Magazine, a new journal published by the American Society for Quality. In 2005 he was selected as the Deming Lecturer for the American Statistical Association's Annual Conference.
"Statistical Device Modeling and Advanced Process Control — A new concept to increase performance and robustness of electronic circuits"
The performance and yield of electronic circuits depend on the process variation of the individual semiconductor devices. To achieve robustness, the designer verifies the circuit performance under varying process conditions by stochastic simulation using two different sets of input parameters: worst case vectors or Monte Carlo models. A problem with traditional worst case vectors for circuit simulation is that they are constructed from the specification of production control parameters to maximise a certain device performance. Combining parameters while neglecting their true correlation leads to pessimistic values when used to predict the variation of a circuit. As a consequence, chip area is increased and performance decreased to meet unrealistic worst case conditions. A new approach constructs statistical corner models from the multidimensional statistics of process control parameters, reflecting the true correlation of parameters. The extreme values of the multivariate distribution are determined by non-parametric methods (location depth method) and correspond to production wafers (corner wafers). Simulation models are extracted for the corner wafers and implemented as part of a process design kit.

Whereas statistical corner models enable a quick verification of circuit robustness, Monte Carlo simulation is needed to consider local variations (device mismatch) and to optimise the yield of a circuit by automated simulation-based design centering. The accuracy depends on the quality of the Monte Carlo input parameters which describe the random distribution of device parameters. Whereas traditional approaches use a uniform distribution based on the specification of production control parameters, more advanced methods derive the process variation of device model parameters from the actual distribution of production control parameters by backward error propagation, including the proper correlation terms.
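The pessimism of worst case vectors that the abstract describes can be demonstrated with a small synthetic Monte Carlo sketch. Everything below is hypothetical (a toy "circuit metric" that is the sum of two strongly anticorrelated process parameters), not the speaker's actual models; it only shows why combining independent parameter extremes overstates circuit spread when parameters are correlated:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two process parameters with strong negative correlation (hypothetical values).
mean = np.array([1.0, 1.0])
cov = np.array([[0.04, -0.036],
                [-0.036, 0.04]])          # correlation coefficient of -0.9
params = rng.multivariate_normal(mean, cov, size=100_000)

# Toy circuit metric sensitive to the sum of both parameters.
metric = params.sum(axis=1)

# Traditional worst case: stack each parameter's independent +3-sigma extreme.
sigma = np.sqrt(np.diag(cov))
worst_case = (mean + 3 * sigma).sum()

# Realistic +3-sigma point of the correlated Monte Carlo distribution.
mc_3sigma = metric.mean() + 3 * metric.std()
print(f"worst case vector: {worst_case:.2f}, correlated MC 3-sigma: {mc_3sigma:.2f}")
```

Because the anticorrelation largely cancels in the sum, the Monte Carlo 3-sigma point sits far below the stacked worst case, illustrating how correlation-aware corner models avoid over-margining chip area and performance.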
To guarantee the validity of the statistical models with respect to the historical data set, all parameters used for the generation of statistical corner models and Monte Carlo models have to be controlled using SPC. In a step-wise approach, one-dimensional SPC has been implemented to control the hypercube spanning the range of relevant parameters, and in a second step multivariate SPC is used to verify the correlations. The evaluation of production data shows that the combination of accurate statistical device models and advanced process control enables robust designs with smaller chip size and increased device performance.
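The multivariate step can be sketched with a Hotelling T²-style statistic, a standard multivariate SPC tool (the abstract does not name the exact method, so this is an assumption for illustration, on synthetic data). It catches a point that passes each one-dimensional limit but violates the correlation structure — exactly the case the hypercube check alone would miss:

```python
import numpy as np

rng = np.random.default_rng(2)
# In-control reference data: two positively correlated control parameters (synthetic).
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
ref = rng.multivariate_normal([0.0, 0.0], cov, size=500)
mu = ref.mean(axis=0)
S_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def t_squared(x):
    """Hotelling T^2 distance of observation x from the reference distribution."""
    d = x - mu
    return float(d @ S_inv @ d)

# Both points are within +/- 2 sigma on each axis, so univariate SPC passes both.
broken_corr = np.array([2.0, -2.0])   # jointly unusual: violates the correlation
typical = np.array([2.0, 2.0])        # consistent with the correlation
print(f"T^2 broken: {t_squared(broken_corr):.1f}, T^2 typical: {t_squared(typical):.1f}")
```

The correlation-violating point gets a T² roughly an order of magnitude larger than the correlation-consistent one, which is the rationale for adding a multivariate check on top of the one-dimensional hypercube.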
Gerhard Rappitsch received his degree as graduate engineer in technical mathematics from the Technical University of Graz in 1991 and a Ph.D. degree in technical sciences from the same university in 1996. Since 1997 he has been with the research and development department of Austriamicrosystems AG, working in the areas of analog circuit simulation, statistical SPICE modeling of semiconductor devices and robust design. In 2004 he was appointed principal engineer and became responsible for strategic projects in the area of statistical simulation and DFM (Design for Manufacturability). In 2005 he started building up a new DFM section as technical manager, focusing on the development and implementation of new methods for increasing the production yield of electronic circuits by stochastic simulation, layout verification and advanced process control. During these activities he implemented statistical corner models enabling chip size reduction and performance improvement by realistic simulation-based prediction of process variation. Gerhard Rappitsch has published several papers in his focus areas, statistical modeling and robust design.
"The Business and Industrial Statistician: Past, Present and Future"
The role of the business and industrial statistician has changed radically during the half-century spanning my career. Technological advances have led to the "democratization of statistics." Statisticians can, and must, now play a broader and more proactive role than when we were absorbed principally with number-crunching. We need to be purveyors of statistical thinking and to be heavily involved in the sometimes neglected, but critical, task of helping ensure good data. I will elaborate on this journey, take a peek into the future, and comment on George Box's gigantic contributions along the way.
Gerald J. (Gerry) Hahn retired as Manager of Applied Statistics, GE Research and Development, in May 2001. At GE he led an 18-person group that provided leadership to GE businesses in applying statistical tools to meet their strategic goals, including Six Sigma implementation. Gerry joined GE in 1955 and assumed his managerial position in 1973. He received his Ph.D. in Operations Research & Statistics from Rensselaer Polytechnic Institute (RPI), and also holds degrees from the City College of New York, Columbia University, and Union College. He is the co-author of three books, has contributed to ten others, has published over 100 papers in refereed engineering and statistical journals, has given over 150 invited presentations, and has written regular columns for Chemtech and Quality Progress. He is a Fellow of the American Statistical Association (ASA) and the American Society for Quality (ASQ), and is an elected member of the International Statistical Institute. He is the recipient of 12 special ASA and ASQ awards (including ASQ's Distinguished Service Medal and the Deming, Hunter, Shewhart, Wilcoxon and Youden Prizes), and is a GE R&D Coolidge Fellow. He has been the keynote speaker at numerous conferences. His professional contributions include founding chair of the ASA Section on Quality and Productivity, chair of the ASA Committee on Fellows, Associate Editor of Technometrics, the Journal of Quality Technology, and Six Sigma Forum, and chair of the Gordon Research Conference on Statistics in Chemistry and Chemical Engineering. He is currently an adjunct professor at RPI, is completing a new book on his experiences as an industrial statistician, and serves as a consultant to various companies.