A Digital Twin is a digital model of a unit: an individual physical asset or process. It is developed using data from multiple sources, combined into a single model capable of predicting that individual unit’s operation. The data sources typically include design simulations, system models, test data, manufacturing and maintenance records, and operational or IoT data.
Building an accurate digital twin starts during the product’s design and testing phases and requires engineering process data and simulations. These are combined with test data to form a product-specific digital model, an authoritative truth source, representing the most up-to-date predictions of performance and other properties of interest. To create a Digital Twin, this more general model is then updated with all relevant manufacturing and operational data for an individual unit.
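As a minimal sketch of this structure, the snippet below represents a product-level model specialized by the data recorded for one physical unit. The class and field names (ProductModel, DigitalTwin, as_built_offset, and so on) are hypothetical illustrations chosen for this example, not an actual digital twin schema.

```python
# Minimal sketch of a digital twin record: a shared, product-level model
# specialized by the data recorded for one physical unit.
# All class and field names here are illustrative, not a real schema.
from dataclasses import dataclass


@dataclass
class ProductModel:
    """Authoritative truth source built from design simulations and fleet test data."""
    nominal_efficiency: float       # e.g., predicted by design simulation
    degradation_per_1000h: float    # e.g., fitted to fleet test data

    def predict_efficiency(self, hours_in_service: float) -> float:
        return self.nominal_efficiency - self.degradation_per_1000h * hours_in_service / 1000.0


@dataclass
class DigitalTwin:
    """One physical unit: the product model adjusted by unit-specific data."""
    baseline: ProductModel
    serial_number: str
    as_built_offset: float = 0.0    # from manufacturing / acceptance-test records
    hours_in_service: float = 0.0   # kept current from operational / IoT feeds

    def predict_efficiency(self) -> float:
        return self.baseline.predict_efficiency(self.hours_in_service) + self.as_built_offset


fleet_model = ProductModel(nominal_efficiency=0.92, degradation_per_1000h=0.004)
unit = DigitalTwin(fleet_model, "SN-0117", as_built_offset=-0.006, hours_in_service=2500)
print(f"{unit.serial_number}: predicted efficiency {unit.predict_efficiency():.3f}")
```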
Using an authoritative truth source prior to deployment will make engineering design, prototyping, and testing for complex systems faster, cheaper, and more effective. Once a unit has been deployed and is providing physical data, a digital twin is initiated and used to help maintain the functional operations of the unit throughout its lifecycle. The digital twin allows new analytics for tasks like health monitoring and predictive maintenance, evaluation and optimization of ongoing operations, and predicting and monitoring a unit’s behavior with respect to a larger system such as an aircraft or power grid. The Digital Twin represents a paradigm shift in engineering analytics and promises to yield many benefits beyond conventional methods.
Many companies in industries such as aerospace, manufacturing, and medical devices are quickly adopting methodologies of the digital revolution to innovate and enhance their product design, development, and manufacturing processes. These methodologies have expanded metrology with responsive digital measurement devices, yielding a “data-rich” environment and advancing simulation capabilities. It is now possible to analyze complex multi-physics systems using data recorded from real systems in ways that were out of reach a decade ago. The Digital Twin has emerged as a way of using these novel capabilities to bring together physical and simulated information to deliver greater value from existing resources.
In order to provide real value, digital twins must be able to do three things: integrate data from different sources, operate fast enough to run analytics as needed (i.e., in real time or faster than real time), and provide a degree of confidence in the predictions they make. Confidence is particularly important for knowing how much to trust a digital twin when it is used in the decision-making or control process for expensive pieces of critical hardware.
For a digital twin to be a full representation of the physical unit, it must combine data from a variety of sources including high-fidelity simulations, systems models, historical records, test data, and machine telemetry. Each of these can have differing levels of uncertainty, fidelity, and coverage of operating conditions, yielding a patchwork of datasets that together cover all aspects of the system’s behavior.
SmartUQ helps by providing tools that can combine multiple pieces of information with different levels of trustworthiness into a unified prediction model. General behavioral trends may be taken from plentiful low-trust data, whether highly uncertain or low-fidelity, while the predictions are grounded in the more expensive, higher-trust data.
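One common way to realize this kind of fusion, shown here as an illustrative sketch rather than SmartUQ’s own algorithm, is to fit a trend model to the abundant low-fidelity data and then learn a correction from the sparse high-trust points. The low_fidelity and high_fidelity functions below are synthetic stand-ins for a cheap model and a trusted data source.

```python
# Illustrative multi-fidelity fusion: fit a trend to plentiful low-fidelity data,
# then learn a correction from a handful of high-trust (e.g., test) points.
# Generic sketch with synthetic stand-in functions, not SmartUQ's algorithm.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def low_fidelity(x):            # cheap model: right trend, but biased
    return np.sin(2 * np.pi * x) + 0.3

def high_fidelity(x):           # expensive truth source, only sparsely sampled
    return np.sin(2 * np.pi * x) + 0.1 * x

x_lo = rng.uniform(0, 1, 200).reshape(-1, 1)     # plentiful low-fidelity data
x_hi = np.linspace(0, 1, 6).reshape(-1, 1)       # a few trusted points

# Step 1: capture the general behavioral trend from the low-fidelity data.
trend = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-3), normalize_y=True)
trend.fit(x_lo, low_fidelity(x_lo).ravel())

# Step 2: ground the prediction by modeling the discrepancy at the high-trust points.
residual = high_fidelity(x_hi).ravel() - trend.predict(x_hi)
correction = GaussianProcessRegressor(kernel=RBF(0.3) + WhiteKernel(1e-4), normalize_y=True)
correction.fit(x_hi, residual)

x_new = np.array([[0.37]])
fused = trend.predict(x_new) + correction.predict(x_new)
print("fused prediction:", fused)
```

The key design choice in this pattern is that only the correction, not the full response, has to be learned from the scarce high-trust data.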
To deliver value in operation, a digital twin must make predictions in real time for high-value activities like performance monitoring, fault checking, real-time wear assessment, and virtual sensing. The speed requirements are even more dramatic if operational optimization or system uncertainty is being considered: the digital twin must operate much faster than real time in order to make predictions for a large number of potentially hazardous scenarios.
Both real-time and faster-than-real-time operation can present difficulties for digital twin architectures involving either physics-based simulation or predictions dependent on large data sets. This is particularly true when the twin is being operated on embedded or edge computing resources.
In order to meet these requirements, SmartUQ uses advanced emulators, or surrogate models, to quickly map the input-output relationships of slow-running pieces of the digital twin with a minimum amount of data. This process can reduce the amount of data and effort required to construct the digital twin, significantly reducing the cost of pieces like long-running simulations, while maintaining a high level of fidelity relative to the original data. Once the emulators are in place and their accuracy is verified, the digital twin is ready for use in the accelerated predictions necessary for control decisions on real hardware.
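The sketch below illustrates the emulator idea under simplified assumptions: the slow_simulation function is a synthetic stand-in for a long-running physics model, and the sample sizes and Gaussian process surrogate are illustrative choices rather than SmartUQ’s emulation technology.

```python
# Sketch of the emulator idea: run the expensive model only at a small set of
# designed input points, fit a fast surrogate, and verify it on held-out runs
# before using it for accelerated predictions. Names and sizes are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def slow_simulation(x):
    """Stand-in for a long-running physics simulation (two inputs, one output)."""
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(40, 2))        # small design of experiments
y_train = slow_simulation(X_train)               # the only expensive step

emulator = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.3, 0.3]),
                                    normalize_y=True)
emulator.fit(X_train, y_train)

# Verify accuracy on a handful of fresh simulation runs before trusting the emulator.
X_check = rng.uniform(0, 1, size=(10, 2))
err = np.abs(emulator.predict(X_check) - slow_simulation(X_check))
print("max abs emulation error:", err.max())

# Once verified, each prediction costs milliseconds instead of a full simulation run.
X_stream = rng.uniform(0, 1, size=(100_000, 2))  # e.g., incoming telemetry scenarios
y_fast, y_std = emulator.predict(X_stream, return_std=True)
```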
At the core of all digital twin applications there is a question of trust: how accurate are the predictions made by a digital twin, and where does that accuracy break down? No model can be completely faithful to the underlying physical system, so the questions are where the model discrepancy lies in the design or operating space, how much the model deviates from the physical system, and how a given analysis will be affected. Even with discrepancies significant enough to make it useless, or even dangerous, for high-value applications like real-time control or operational optimization, a digital twin can still be a valuable asset for tasks like design and maintenance analysis. Thus, it is crucial to characterize both the level of trustworthiness of the digital twin and the requirements of the application.
Uncertainty quantification connects the elements of the digital twin with quantitative metrics gauging model error, prediction accuracy, and validity across the system’s operating space. This is done through a series of calibration, validation, uncertainty propagation, and sensitivity analysis operations on the individual models and data sets that comprise the twin.
With SmartUQ, aspects of a model can be tuned to match a higher-fidelity model or test data using statistical or Bayesian calibration. The calibrated models can then be validated using probabilistic techniques to account for the variation, noise, and errors found in the physical data being used. In addition to improving and testing the model’s accuracy, probabilistic calibration and validation methods generate input and response probability distributions that are important for assessing the uncertainty of the fully assembled digital twin. Once the individual models are in place, the assembled digital twin contains the connections and component-level UQ information required for SmartUQ to perform UQ tasks such as uncertainty propagation and sensitivity analysis.
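As a simplified illustration of this workflow (not SmartUQ’s calibration machinery), the sketch below calibrates a single hypothetical wear-rate parameter against noisy test data using a grid-based Bayesian update, then propagates the resulting parameter uncertainty through the model to obtain a prediction with error bars.

```python
# Illustrative Bayesian calibration followed by uncertainty propagation.
# A single model parameter (a hypothetical wear-rate coefficient) is updated
# against noisy test data on a grid, and the posterior is then pushed through
# the model. This sketches the generic workflow, not SmartUQ's implementation.
import numpy as np

def capacity_model(time_hours, wear_rate):
    """Simplified degradation model: remaining capacity after a given service time."""
    return np.exp(-wear_rate * time_hours / 1000.0)

# Noisy test measurements of capacity at known service times.
t_test = np.array([500.0, 1500.0, 3000.0, 5000.0])
y_test = np.array([0.96, 0.88, 0.79, 0.66])
noise_sd = 0.02

# Grid-based Bayesian update with a uniform prior on the wear-rate coefficient.
wear_grid = np.linspace(0.01, 0.2, 500)
log_like = np.array([
    -0.5 * np.sum((y_test - capacity_model(t_test, w)) ** 2) / noise_sd ** 2
    for w in wear_grid
])
posterior = np.exp(log_like - log_like.max())
posterior /= posterior.sum()

# Propagate the calibrated uncertainty: sample the posterior and predict ahead.
rng = np.random.default_rng(2)
samples = rng.choice(wear_grid, size=5000, p=posterior)
capacity_at_8000h = capacity_model(8000.0, samples)
print(f"calibrated wear rate: {samples.mean():.3f} +/- {samples.std():.3f}")
print(f"predicted capacity at 8000 h: {capacity_at_8000h.mean():.3f} +/- {capacity_at_8000h.std():.3f}")
```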
SmartUQ’s engineering analytics tools play a significant role in combining data, accelerating prediction times, and providing confidence in digital twin prediction results. Our expertise and tools can help take digital twins from a collection of models to a fast, accurate, validated truth source. Contact us to discuss your digital twin project and requirements.