ENBIS-16 in Sheffield

11–15 September 2016, Sheffield
Abstract submission: 20 March – 4 July 2016

The following abstracts have been accepted for this event:

  • Democratic Big Data Analysis

    Authors: Shirley Coleman (ISRU, Newcastle University)
    Primary area of focus / application: Consulting
    Secondary area of focus / application: Business
    Keywords: Data, Analytics, Numeracy, Business, Visualisation, Insight, Monetisation
    Submitted at 29-Apr-2016 18:46 by Shirley Coleman
    Accepted
    13-Sep-2016 15:40 Democratic Big Data Analysis
    Waking up to the benefits of analysing their internal company data is a new challenge for small and medium-sized enterprises (SMEs). They can see the enormous potential in terms of offering new products based on visualisation and insight, but are not sure how to get started. Levels of confidence in handling data are worryingly low in many SMEs, due to factors as diverse as loose responsibility for collecting and checking data, poor numeracy skills, and secrecy and privacy fears about giving too much away, upsetting customers and losing competitive advantage. SMEs are vital to the EU economy but risk being left out of the great advances in big data analytics realised by large organisations. Government-supported knowledge transfer partnerships between UK universities and businesses are helping to bridge the gap, enabling companies to explore new data analytics opportunities and obtain the vital proof of concept needed to justify the resources required to develop their own expertise. This presentation describes funded data analytics developments in the shipping industry and the utilities sector, highlighting the importance of data quality and business readiness. Ethical and legal issues are discussed in the context of monetising data from an expert system in the service sector. The relevance of such funded projects is demonstrated in terms of changed business models in the companies involved and a welcome increase in data analytics capabilities.
  • On Sequential Experimentation Strategy

    Authors: Jai-Hyun Byun (Gyeongsang National University), Byung-Cheol Shin (Gyeongsang National University)
    Primary area of focus / application: Design and analysis of experiments
    Keywords: Sequential experimentation, Central composite design, Non-central composite design, Asymmetric composite design, Design efficiency
    Submitted at 1-May-2016 11:12 by Jai-Hyun Byun
    Accepted
    13-Sep-2016 10:30 On Sequential Experimentation Strategy
    Sequential experimentation is a very efficient way of conducting experiments for scientific discovery and for process and product design. In this presentation, several sequential experimentation approaches and their uses are presented, starting from the data of an initial factorial design. From the results of the initial factorial design, five follow-up approaches are suggested: steepest ascent (SA), central composite design (CCD), asymmetric composite design (ACD), overlapping non-central composite design (ONCD), and non-overlapping non-central composite design (NNCD). Analysis methods for ACD, ONCD, and NNCD are presented using the initial factorial and follow-up design data. Comparisons of the number of design points and of design efficiencies are given for CCD, ACD, ONCD, and NNCD. A flow chart for employing the sequential design approaches is given to help practitioners. The performance of these sequential approaches is evaluated in a simulation study based on a response surface test bed.
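
    To make the classical baseline concrete, the sketch below builds a central composite design by augmenting a 2^k factorial (in coded ±1 units) with axial and centre runs. It is a minimal illustration only: the function name, the rotatable choice of axial distance and the number of centre runs are assumptions, and the asymmetric and non-central variants discussed in the talk place the additional runs differently.

      import numpy as np
      from itertools import product

      def central_composite_design(k, alpha=None, n_center=4):
          """Augment a 2^k factorial (coded +/-1 units) with axial and centre runs.

          Minimal sketch of the textbook CCD construction; alpha defaults to the
          rotatable axial distance (2**k) ** 0.25.
          """
          if alpha is None:
              alpha = (2 ** k) ** 0.25
          factorial = np.array(list(product([-1.0, 1.0], repeat=k)))   # 2^k corner runs
          axial = np.zeros((2 * k, k))
          for i in range(k):
              axial[2 * i, i] = -alpha                                  # one axial pair per factor
              axial[2 * i + 1, i] = alpha
          center = np.zeros((n_center, k))                              # replicated centre runs
          return np.vstack([factorial, axial, center])

      if __name__ == "__main__":
          print(central_composite_design(k=2))    # 4 factorial + 4 axial + 4 centre runs
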
  • A Practitioner's Guide to Analyzing Reliability Experiments with Random Blocks and Subsampling

    Authors: Jennifer Kensler (Shell Global Solutions), Laura J. Freeman (Institute for Defense Analyses), G. Geoffrey Vining (Virginia Tech)
    Primary area of focus / application: Other: Invited: ASQ Journals
    Secondary area of focus / application: Reliability
    Keywords: Regression with lifetime data, Restrictions on randomization, Weibull distribution, Censored data
    Submitted at 2-May-2016 20:20 by Jennifer Kensler
    Accepted
    13-Sep-2016 11:40 A Practitioner's Guide to Analyzing Reliability Experiments with Random Blocks and Subsampling
    Reliability experiments provide important information regarding the life of a product, including how various factors affect product life. Current analyses of reliability data usually assume a completely randomized design. However, reliability experiments frequently contain subsampling, which represents a restriction on randomization. A typical experiment involves applying treatments to test stands, with several items placed on each test stand. In addition, raw materials used in experiments are often produced in batches, leading to a design involving blocks. This presentation proposes a method using Weibull regression for analyzing reliability experiments with random blocks and subsampling. An illustration of the method is provided.
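
    As a hedged illustration of the completely randomized baseline mentioned above, the sketch below fits a Weibull regression to right-censored lifetimes by direct maximum likelihood; the random block and subsampling effects that the proposed method adds are deliberately omitted, and the variable names and simulated data are purely illustrative.

      import numpy as np
      from scipy.optimize import minimize

      def weibull_neg_loglik(params, t, delta, X):
          """Negative log-likelihood of a Weibull regression with right censoring.

          t: observed times; delta: 1 = failure observed, 0 = censored;
          X: design matrix with intercept; params: [beta..., log(shape)].
          """
          beta, log_shape = params[:-1], params[-1]
          shape = np.exp(log_shape)
          scale = np.exp(X @ beta)                      # log-linear model for the Weibull scale
          z = (t / scale) ** shape
          log_f = np.log(shape) - np.log(scale) + (shape - 1.0) * (np.log(t) - np.log(scale)) - z
          log_S = -z                                    # log survival term for censored units
          return -np.sum(delta * log_f + (1.0 - delta) * log_S)

      # illustrative data: one two-level treatment factor, right censoring at time 3
      rng = np.random.default_rng(1)
      X = np.column_stack([np.ones(40), np.repeat([0.0, 1.0], 20)])
      t_true = rng.weibull(2.0, size=40) * np.exp(X @ np.array([1.0, 0.5]))
      t = np.minimum(t_true, 3.0)
      delta = (t_true <= 3.0).astype(float)

      fit = minimize(weibull_neg_loglik, x0=np.zeros(3), args=(t, delta, X))
      print(fit.x)                                      # [beta0, beta1, log(shape)]
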
  • How Geometric Properties of Black-Box Computer Models can Help to Provide Conservative Reliability Assessments in Time-Consuming Situations

    Authors: Nicolas Bousquet (EDF R&D)
    Primary area of focus / application: Design and analysis of experiments
    Secondary area of focus / application: Reliability
    Keywords: Accelerated Monte Carlo, Computer model, Geometric constraints, Failure probability / quantile, Deterministic bounds, Limit state surface, Support Vector Machines
    Submitted at 6-May-2016 15:06 by Nicolas Bousquet
    Computing the probability of undesirable, unobserved events is a common task in structural reliability engineering. When dealing with major risks that occur with low probability, the lack of observed failure data often requires the use of so-called computer models that reproduce the phenomenon of interest. Simulating their uncertain inputs, modelled as random variables, makes it possible to compute statistical estimators of a probability or a quantile. Usually these complex objects can be described as time-consuming black boxes. Exploring the input configurations that lead to failure, in a non-intrusive way, therefore requires running the model over a design of numerical experiments. Classical (quasi-)Monte Carlo designs cannot be used in practice to explore configurations linked to low probabilities, since they require prohibitively high computational budgets. Numerous sampling techniques have therefore been proposed in the literature to reduce the computational cost while preserving the precision of the estimates. Most of them are based on choosing the elements of the design sequentially, by maximizing an expected gain in information at each step.

    This talk will provide an overview of the most recent results obtained when geometric constraints, such as monotonicity or convexity, can be detected or assumed on the limit (failure) state surface. Deterministic (conservative) bounds on probabilities or quantiles can be produced and narrowed by successive runs; they become almost-sure bounds when the design is chosen stochastically, and particular attention is paid to the parallel production of consistent estimators. Furthermore, a consistent estimate of the failure state surface can be produced, at a known convergence rate, by a combination of constrained Support Vector Machines (SVM).

    Beyond presenting these recent theoretical results, the talk will show the good results obtained on several toy examples and real case studies from structural reliability engineering, as well as other possible applications in the broader field of applied decision theory.
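
    As a point of reference for the crude Monte Carlo baseline that the abstract argues is impractical at low probabilities, the sketch below estimates a failure probability by sampling a cheap stand-in for the black-box model. The limit state function, the standard normal inputs and all names are illustrative assumptions; the geometric bounds and constrained SVM estimators of the talk are not reproduced here.

      import numpy as np

      def limit_state(x):
          """Cheap stand-in for an expensive black-box computer model g(x).

          Failure is conventionally defined as g(x) <= 0; this quadratic form is
          only an illustrative surrogate, not a real structural model.
          """
          return 5.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

      def crude_mc_failure_probability(n, rng):
          """Crude Monte Carlo estimate of P(g(X) <= 0) with standard normal inputs."""
          x = rng.standard_normal((n, 2))
          p_hat = np.mean(limit_state(x) <= 0.0)
          se = np.sqrt(p_hat * (1.0 - p_hat) / n)   # relative error blows up as p_hat -> 0,
          return p_hat, se                          # hence the accelerated designs in the talk

      rng = np.random.default_rng(0)
      print(crude_mc_failure_probability(100_000, rng))
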
  • Compressing an Ensemble with Statistical Models: An Algorithm for Global 3D Spatio-Temporal Temperature

    Authors: Stefano Castruccio (Newcastle University)
    Primary area of focus / application: Other: ASQ session
    Keywords: Space-time statistics, Climate model, Big data, Compression
    Submitted at 7-May-2016 14:29 by Stefano Castruccio
    13-Sep-2016 12:10 Compressing an Ensemble with Statistical Models: An Algorithm for Global 3D Spatio-Temporal Temperature
    One of the main challenges when working with modern climate model ensembles is the increasingly large size of the data produced, and the consequent difficulty in storing large amounts of spatio-temporally resolved information. Many compression algorithms can be used to mitigate this problem, but since they are designed to compress generic scientific data sets, they do not account for the nature of climate model output and they compress only individual simulations. In this talk, I will propose a different, statistics-based approach that explicitly accounts for the space-time dependence of the data, applied to annual global three-dimensional temperature fields in an initial-condition ensemble. The set of estimated parameters is small (compared to the data size) and can be regarded as a summary of the essential structure of the ensemble output; it can therefore be used to reproduce the temperature fields in the ensemble almost instantaneously, with a substantial saving in storage and time. The statistical model exploits the gridded geometry of the data and parallelization across processors. It is therefore computationally convenient and makes it possible to fit a non-trivial model to a data set of one billion data points with a covariance matrix comprising 10^18 entries.
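
    To convey the idea of replacing raw output by the parameters of a statistical model, the toy sketch below fits an independent AR(1) model at every grid point of a synthetic (time, space) field and regenerates a surrogate field from the three stored parameters per location. It is a deliberately simplified stand-in, not the space-time model of the talk, which also captures dependence across locations; all names and the synthetic data are illustrative.

      import numpy as np

      def fit_ar1_per_location(field):
          """Fit an independent AR(1) model at each grid point of a (time, space) array.

          Returns per-location (mean, lag-1 coefficient, innovation sd); storing these
          three numbers per location, instead of the full series, is the toy analogue
          of keeping only the estimated parameters of a statistical model.
          """
          mu = field.mean(axis=0)
          anom = field - mu
          phi = (anom[1:] * anom[:-1]).sum(axis=0) / (anom[:-1] ** 2).sum(axis=0)
          sigma = (anom[1:] - phi * anom[:-1]).std(axis=0)
          return mu, phi, sigma

      def generate_surrogate(mu, phi, sigma, n_time, rng):
          """Instantaneously regenerate a statistically similar (time, space) field."""
          out = np.empty((n_time, mu.shape[0]))
          out[0] = mu + rng.standard_normal(mu.shape[0]) * sigma
          for t in range(1, n_time):
              out[t] = mu + phi * (out[t - 1] - mu) + rng.standard_normal(mu.shape[0]) * sigma
          return out

      rng = np.random.default_rng(0)
      raw = 288.0 + rng.standard_normal((240, 1000))            # synthetic ensemble member
      mu, phi, sigma = fit_ar1_per_location(raw)
      surrogate = generate_surrogate(mu, phi, sigma, 240, rng)
      print(raw.size, "raw values ->", mu.size + phi.size + sigma.size, "stored parameters")
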
  • Smart and Sustainable Cities: A Systemic Approach

    Authors: Alberto Pasanisi (EIFER), Andreas Koch (EIFER), Markus Peter (EIFER), Benoit Boutaud (EIFER)
    Primary area of focus / application: Other: smart cities
    Keywords: Smart city, Urban planning, Energy planning, Decision-aid
    Submitted at 19-May-2016 14:22 by Alberto Pasanisi
    14-Sep-2016 10:10 Smart and Sustainable Cities: A Systemic Approach
    Today, cities are facing a number of unprecedented challenges. In a worldwide context of growing urbanization (two thirds of human beings are expected to live in urban areas by 2050), there is an increasing need for sustainable planning that optimises capital expenditure, complies with ambitious environmental and climate-change goals, and meets the cross-disciplinary challenges of digitisation, the interweaving of planning sectors and the energy transition.
    Energy and water efficiency, appropriate waste management, local renewable energy sources, fast and reliable transportation, limited air pollution and noise, optimisation of synergies between industrial, tertiary and residential buildings, and the development of infrastructure are all pieces of a comprehensive strategy that should not be considered separately but within a systemic approach.
    On the other hand, citizens increasingly want to be involved in the ongoing transformations as "smart users" of their city rather than mere inhabitants. IT infrastructures make it possible to collect, exchange and analyse huge quantities of data, opening the way for new services and business opportunities.
    In this talk, through the analysis of a number of demonstrator projects, and with no claim to exhaustiveness, the main challenges, issues and opportunities are highlighted, with a particular focus on energy-related questions.