September 10 - 8:00 AM to 9:30 AM - Room: 308
Presented by Dr. Mark Andrew, SmartUQ Technology Steward
Experienced practitioners who construct complex simulation models of critical systems know that replicating real-world performance is challenging due to uncertainties found in both simulations and physical tests. This course will discuss the types of uncertainty in the context of representing them with a design of experiments, constructing surrogate models, and finally applying analytical methods to understand how sources of uncertainty affect a model's ability to replicate reality. It will also cover the broad applications these probabilistic techniques have in analyzing numerous forms of engineering systems, including Digital Threads and Digital Twins.
September 12 - 1:30 PM to 2:00 PM - Room: 304
Presented by Zack Graves, Application Engineer
This presentation illustrates both conceptual and practical applications of using Uncertainty Quantification (UQ) techniques to perform probabilistic analyses. The application of UQ techniques to the output of model-based engineering analyses is essential to providing decision-quality information at key decision points in an aerospace system’s life cycle. Approaches will be presented for the continued collection and application of UQ knowledge over each stage of a generalized life cycle framework covering system design, manufacture, and sustainment. This approach allows engineers to quantify and reduce uncertainties systematically and provides decision makers with probabilistic assessments of performance, risk, and cost that are essential to critical decisions. As an illustration, a series of probabilistic analyses performed as part of the initial design of a turbine blade will be used to demonstrate the utility of UQ in identifying program risks and improving design quality. The application of UQ concepts to life cycle management will be addressed, highlighting the benefits to decision makers of having actionable engineering information throughout a system’s life cycle.
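A probabilistic assessment of the kind described turns deterministic analysis outputs into risk estimates. The sketch below is an illustrative assumption, not the presenter's actual turbine-blade analysis: `blade_stress` is a hypothetical closed-form stand-in for a real stress analysis, and the operating-condition distributions and allowable limit are invented for the example.

```python
import random

random.seed(1)

def blade_stress(rpm, temp):
    # Hypothetical closed-form stand-in for a turbine-blade stress analysis
    return 1e-4 * rpm ** 2 + 0.3 * (temp - 800.0)

STRESS_LIMIT = 700.0  # hypothetical allowable stress

# Monte Carlo over uncertain operating conditions
trials = 100_000
exceed = 0
for _ in range(trials):
    rpm = random.gauss(2500, 100)   # uncertain shaft speed
    temp = random.gauss(850, 25)    # uncertain gas temperature
    if blade_stress(rpm, temp) > STRESS_LIMIT:
        exceed += 1

prob_exceed = exceed / trials  # probability of exceeding the allowable
```

Rather than a single pass/fail margin, the decision maker receives a probability of exceeding the allowable, which can be tracked and driven down as uncertainties are reduced over the life cycle.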
September 12 - 5:00 PM to 5:30 PM - Room: 308
Presented by Zack Graves, Application Engineer
Statistical calibration can reduce design cycle time and costs by ensuring that simulations are as close to reality as possible and by quantifying how close that really is. It optimizes the tuning of model parameters to improve simulation accuracy and estimates any remaining discrepancy, which is useful for model diagnosis and validation. Because model discrepancy is assumed to exist in this framework, it enables robust calibration even for inaccurate models. The presentation will introduce the concepts and advantages of statistical calibration and model validation. Using an airfoil CFD model case study, this presentation will walk through the step-by-step process of performing statistical calibration and quantifying the uncertainty of the final calibrated model. An analysis will be performed to compare results from a calibrated model and an uncalibrated model, including optimization and sensitivity analysis. The results will illustrate the importance of calibrating a model before drawing design conclusions.
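The core idea, tuning a parameter against physical test data while assuming the model cannot match reality exactly, can be sketched with a toy example. Everything here is a hypothetical assumption for illustration: the "test" data, the one-parameter model (which lacks an intercept by design, so model-form discrepancy is guaranteed), and the grid-search calibration standing in for the Bayesian machinery typically used in statistical calibration.

```python
import random
import statistics

random.seed(2)

# "Physical test" data from a hypothetical true process 2.1*x + 0.4,
# observed with a little measurement noise
xs = [i / 10 for i in range(1, 11)]
tests = [2.1 * x + 0.4 + random.gauss(0, 0.02) for x in xs]

def model(x, theta):
    # One-parameter simulation model; no intercept, so it cannot
    # reproduce reality exactly -- discrepancy is assumed to remain
    return theta * x

def sse(theta):
    # Sum of squared differences between model and test data
    return sum((model(x, theta) - y) ** 2 for x, y in zip(xs, tests))

# Calibrate theta by grid search (stand-in for a statistical calibration)
theta_hat = min((t / 100 for t in range(100, 500)), key=sse)

# Quantify the remaining discrepancy of the calibrated model
residuals = [y - model(x, theta_hat) for x, y in zip(xs, tests)]
bias = statistics.fmean(residuals)    # systematic offset left over
spread = statistics.stdev(residuals)  # structure the model cannot capture
```

Even after calibration, the residuals show a nonzero bias and a clear trend: that remaining discrepancy is exactly the quantity a statistical calibration framework estimates alongside the tuned parameters, and it is what tells the analyst how far the calibrated model can be trusted.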