ENBIS-16 in Sheffield
11 – 15 September 2016; Sheffield
Abstract submission: 20 March – 4 July 2016
The following abstracts have been accepted for this event:

AQL Tables and Lean Thinking
Authors:
Jonathan Smyth-Renshaw (Jonathan Smyth-Renshaw & Associates Ltd)
Primary area of focus / application:
Process
Secondary area of focus / application:
Quality
Keywords: Deming kp method, AQL tables, Lean, Reduced sample size
Submitted at 23-Jun-2016 14:20 by Jonathan Smyth-Renshaw
Accepted
14-Sep-2016 10:10 AQL Tables and Lean Thinking
The use of AQL tables for inspection decisions is well known and understood. Deming's kp method, in contrast, has only two outcomes, 0% or 100% inspection, based on inspection cost and the defective rate. The principles of Lean, as developed by Toyota, are used widely across business and encourage small batch production to minimize waste. I believe this makes AQL tables redundant, and 100% inspection widespread in Lean processes. However, is inspection value added or not in a Lean process? Can the Deming kp method be adapted to provide a decision rule for inspection or no inspection based on the selling price of the final product/service? I will discuss these issues and propose a framework for inspection in Lean processes.
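Deming's all-or-none rule reduces to a single comparison: writing k1 for the cost of inspecting one item and k2 for the downstream cost of an escaped defective, 100% inspection pays off whenever the incoming fraction defective p exceeds the break-even point k1/k2. A minimal sketch of that rule (the function name and the cost figures are illustrative, not from the talk):

```python
def deming_all_or_none(p, k1, k2):
    """Deming's kp ("all-or-none") rule.

    p  : expected fraction defective of incoming items
    k1 : cost to inspect one item
    k2 : downstream cost when a defective item escapes inspection
    Returns the recommended inspection fraction: 0.0 or 1.0.
    """
    break_even = k1 / k2
    return 1.0 if p > break_even else 0.0

# Example: inspection costs 0.50 per item, an escaped defect costs 25.00,
# so the break-even defect rate is 0.50 / 25.00 = 2%.
print(deming_all_or_none(0.05, 0.50, 25.00))  # 1.0 -> inspect everything
print(deming_all_or_none(0.01, 0.50, 25.00))  # 0.0 -> inspect nothing
```

Under this rule there is no middle ground, which is why the abstract asks whether a sampling-based AQL table still has a role once Lean batches are small.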

Prediction Intervals under Prior Information
Authors:
Lennart Kann (Robert Bosch GmbH, Automotive Electronics), Rainer Göb (University of Würzburg)
Primary area of focus / application:
Other: Sampling session
Secondary area of focus / application:
Quality
Keywords: Prediction interval, Proportion nonconforming, Prior information, Generalised hypergeometric distribution
Submitted at 29-Jun-2016 16:39 by Rainer Göb
Accepted
14-Sep-2016 10:30 Prediction Intervals under Prior Information
Prediction intervals are widely used in industry, in particular prediction intervals for the number of nonconforming units. For the latter purpose, various intervals have been suggested in the literature. However, these approaches are restricted in usefulness and applicability. In particular, none of these approaches accounts for prior information on the underlying proportion nonconforming p. In industrial environments, this type of prior information is always available, usually restricting attention to very small values of the proportion nonconforming p. We present shortest length intervals for the number of nonconforming units under prior information expressed by a beta distribution of the proportion nonconforming p.
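Under a Beta(a, b) prior on p, the number of nonconforming units among m future items is marginally beta-binomial, and a shortest-length prediction set can be built by accumulating the most probable counts first. A pure-Python sketch of that construction (the greedy rule gives a contiguous shortest interval when the pmf is unimodal; the authors' exact method is not spelled out in the abstract):

```python
from math import exp, lgamma

def betabinom_pmf(k, m, a, b):
    """P(K = k): k nonconforming among m future units when the
    proportion nonconforming p has a Beta(a, b) prior."""
    log_choose = lgamma(m + 1) - lgamma(k + 1) - lgamma(m - k + 1)
    log_beta_ratio = (lgamma(k + a) + lgamma(m - k + b) - lgamma(m + a + b)
                      - (lgamma(a) + lgamma(b) - lgamma(a + b)))
    return exp(log_choose + log_beta_ratio)

def shortest_prediction_interval(m, a, b, level=0.95):
    """Greedy shortest-length prediction set: add counts in order of
    decreasing probability until the coverage target is met. For a
    unimodal pmf the chosen counts form a contiguous interval."""
    pmf = [betabinom_pmf(k, m, a, b) for k in range(m + 1)]
    order = sorted(range(m + 1), key=pmf.__getitem__, reverse=True)
    chosen, mass = [], 0.0
    for k in order:
        chosen.append(k)
        mass += pmf[k]
        if mass >= level:
            break
    return min(chosen), max(chosen)

# Example: 1000 future units, prior Beta(1, 199), i.e. prior mean p = 0.5%
lo, hi = shortest_prediction_interval(1000, 1.0, 199.0)
```

A concentrated prior on small p, as the abstract says is typical in industry, pulls the interval sharply toward zero compared with a prior-free construction.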

ISO PWI 28596: Two-Stage Sampling Plans for Auditing
Authors:
Jens Bischoff (University of Würzburg), Rainer Göb (University of Würzburg)
Primary area of focus / application:
Other: Sampling session
Secondary area of focus / application:
Quality
Keywords: Audit sampling, Two-stage sampling, Proportion nonconforming, Prior information, Confidence interval, Standardisation
Submitted at 29-Jun-2016 17:46 by Rainer Göb
Accepted
14-Sep-2016 10:50 ISO PWI 28596: Two-Stage Sampling Plans for Auditing
The recent international auditing standards ISA 500 and ISA 530 issued by the IAASB (International Auditing and Assurance Standards Board) admit only statistical sampling schemes for proper audit sampling. The present ISO (International Organization for Standardization) sampling standards cannot directly be adopted for auditing purposes: 1) The ISO standards are merely decision-oriented and provide no mechanism for parameter estimation, whereas auditors usually need an estimate of the parameters of auditing classes. 2) The ISO standards provide no mechanism to exploit prior information in the design of sampling plans, although in the auditing context relatively precise prior information is usually available. We present the structure of a two-stage sampling scheme for audit sampling for a proportion nonconforming which is currently discussed as the preliminary work item PWI 28596 at ISO.
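The operating characteristic of a generic two-stage attributes plan shows the shape of such a scheme: a small first sample can already accept or reject, and ambiguous lots go to a second sample. A sketch under a binomial (large-lot) model, with illustrative plan parameters; the actual prior-informed design discussed for PWI 28596 is not given in the abstract:

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def two_stage_accept_prob(p, n1, a1, r1, n2, a2):
    """Acceptance probability of a two-stage attributes plan under a
    binomial (large-lot) model, for lot proportion nonconforming p.

    Stage 1: draw n1 items; accept if d1 <= a1, reject if d1 >= r1.
    Otherwise stage 2: draw n2 more; accept if d1 + d2 <= a2.
    """
    # Acceptance decided on the first sample alone
    prob = sum(binom_pmf(d1, n1, p) for d1 in range(a1 + 1))
    # Undecided first samples that are accepted after the second sample
    for d1 in range(a1 + 1, r1):
        second = sum(binom_pmf(d2, n2, p) for d2 in range(a2 - d1 + 1))
        prob += binom_pmf(d1, n1, p) * second
    return prob

# Illustrative plan: n1=50, accept on 0-1, reject on 4+, else n2=50, c2=4
oc_good = two_stage_accept_prob(0.01, 50, 1, 4, 50, 4)
oc_bad = two_stage_accept_prob(0.10, 50, 1, 4, 50, 4)
```

Plotting this acceptance probability against p gives the plan's OC curve; prior information enters by choosing (n1, a1, r1, n2, a2) so the curve is steep where the prior puts the mass.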

When a multifactor experiment is carried out over a period of time, the responses may depend on a time trend. Unless the tests of the experiment are conducted in a proper order, the time trend has a negative impact on the precision of the estimates of the main effects, the interaction effects and the quadratic effects. A proper run order, called a trend-robust run order, minimizes the correlation between the effects' estimates and the time trend's linear, quadratic and cubic components. Finding a trend-robust run order is essentially a permutation problem. We develop a sequential approach based on integer programming to find a trend-robust run order for any given design. The sequential nature of our algorithm allows us to prioritize the trend robustness of the main-effect estimates. Most methods in the literature are tailored to specific designs and are not applicable to an arbitrary design. Additionally, little attention has been paid to trend-robust run orders of response surface designs, such as central composite designs, Box-Behnken designs and definitive screening designs. Our sequential algorithm succeeds in identifying trend-robust run orders for arbitrary factorial designs and response surface designs with two to six factors.
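The objective described above, correlation between effect columns and polynomial trend components, can be scored directly for any candidate run order. A small sketch in which exhaustive search stands in for the authors' integer-programming algorithm (feasible here only because the toy design has four runs):

```python
import itertools
import statistics

def trend_correlations(design, order):
    """Correlations between each factor column and the linear,
    quadratic and cubic components of a time trend, for the runs of
    `design` executed in the time order `order`."""
    n = len(order)
    t = [i - (n - 1) / 2 for i in range(n)]      # centred time index
    trends = [[ti**d for ti in t] for d in (1, 2, 3)]

    def corr(x, y):
        mx, my = statistics.mean(x), statistics.mean(y)
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        if sx == 0.0 or sy == 0.0:
            return 0.0
        return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

    columns = zip(*(design[i] for i in order))   # factor columns in time order
    return [[corr(list(col), tr) for tr in trends] for col in columns]

def best_order_exhaustive(design):
    """Brute-force search over all run orders; only usable for toy
    designs, which is where the integer-programming approach scales."""
    def worst_corr(order):
        return max(abs(c) for row in trend_correlations(design, order)
                   for c in row)
    return min(itertools.permutations(range(len(design))), key=worst_corr)

# 2^2 factorial design in standard order
design = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
```

In standard order the second factor's column is strongly correlated with the linear trend, which is exactly the precision loss a trend-robust run order removes.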

Cryptographically Secure Multiparty Evaluation of System Reliability
Authors:
Louis Aslett (University of Oxford)
Primary area of focus / application:
Other: Applied Statistics Meets Practical Statistics session
Keywords: Reliability theory, Encrypted statistics, System reliability, Survival signature, Privacy
Submitted at 1-Jul-2016 13:41 by Louis Aslett
Accepted
13-Sep-2016 10:50 Cryptographically Secure Multiparty Evaluation of System Reliability
The precise design of a system may be considered a trade secret which should be protected, whilst at the same time component manufacturers are sometimes reluctant to release full test data (perhaps providing only mean time to failure figures). In this situation it seems impractical both to produce an accurate reliability assessment and to satisfy all parties' privacy requirements. However, we present recent developments in cryptography which, when combined with the recently developed survival signature in reliability theory, allow almost total privacy to be maintained in a cryptographically strong manner in precisely this setting. Thus, the system designer does not have to reveal their trade-secret design, and the component manufacturer can retain component test data in-house.
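The survival signature is what separates the two parties' knowledge: with a single component type, system reliability is the sum over l of Phi(l), the probability the system works given exactly l components work (the designer's secret), times the probability that exactly l of the m components function (driven by the manufacturer's test data). A sketch of that computation in the clear, assuming i.i.d. components; the talk's contribution is evaluating this same sum under encryption so neither party reveals its input:

```python
from math import comb

def system_reliability(phi, m, r):
    """System reliability from the survival signature (one component
    type): sum over l of phi[l] * P(exactly l of m components work),
    with components assumed i.i.d. of reliability r.

    phi[l] encodes the system layout (the designer's secret);
    r comes from the manufacturer's component test data.
    """
    return sum(phi[l] * comb(m, l) * r**l * (1 - r)**(m - l)
               for l in range(m + 1))

# 2-out-of-3 system: functions iff at least two of three components do
phi_2oo3 = [0.0, 0.0, 1.0, 1.0]
print(system_reliability(phi_2oo3, 3, 0.9))  # 0.972
```

Because the expression is a polynomial in r with coefficients from Phi, it is the kind of computation homomorphic encryption schemes handle well.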

This paper presents three improvements that were developed by the authors when they considered applying ASTM E2709 in a pharmaceutical company.
The standard itself follows an indirect strategy of using the sample mean and standard deviation of a sample of observations with a normal distribution to obtain a lower bound of the probability of passing a single-stage or multiple-stage acceptance procedure. A direct strategy would use the sample mean and standard deviation to evaluate compliance with specifications. The methodology in the standard is to check whether the entire triangular confidence region for the population mean and standard deviation lies within the region of combinations of population mean and standard deviation for which the probability of passing the acceptance procedure is at least a minimum required level. For multiple-criterion and multiple-stage acceptance procedures, the probability of passing the acceptance procedure is calculated as a lower bound.
The three improvements involved simplifying the calculations, extending the acceptance procedure to include an additional type of criterion, and using an alternative method to calculate the acceptance probabilities. Adopting the latter improvement reduces costs in many cases.
The question of whether the indirect strategy is at all useful is also raised.
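The indirect strategy can be illustrated with a deliberately simplified single criterion: compute, for each (mu, sigma) combination, the probability of passing, and declare compliance only if that probability meets the required level over the whole confidence region. The criterion, limits and grid below are illustrative stand-ins, not those of ASTM E2709, whose multiple-stage criteria are considerably more involved:

```python
from math import erf, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def pass_prob(mu, sigma, n=10, low=85.0, high=115.0):
    """Probability that all n units fall inside [low, high] when unit
    values are Normal(mu, sigma): a simple single-stage criterion
    standing in for the standard's multiple-stage procedures."""
    p_one = norm_cdf((high - mu) / sigma) - norm_cdf((low - mu) / sigma)
    return p_one ** n

def indirect_check(region, required=0.95):
    """Indirect strategy: pass only if every (mu, sigma) combination
    in the confidence region for the population parameters has an
    acceptance probability of at least `required`."""
    return all(pass_prob(mu, sigma) >= required for mu, sigma in region)

# A coarse grid standing in for the triangular confidence region
region = [(mu, s) for mu in (98.0, 100.0, 102.0) for s in (1.0, 2.0, 3.0)]
```

The direct strategy the abstract contrasts this with would instead compare the sample mean and standard deviation against the specifications themselves, skipping the acceptance-probability detour.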