Research Highlight

Time Managed Virtualization for Simulating Systems of Systems

Image: A simulation testbed for building control systems. Credit: Computational Sciences and Engineering Division (CSED), ORNL.
A simulation testbed for building control systems: Every agent is a software system hosted by ORNL's time-managed virtualization simulation technology, where it interacts with simulated building physics models and RESTful web APIs hosted in the same environment. This approach allowed the operational control system to be tested with 100 homes for a full day of operation without requiring any physical buildings.

The Science
Researchers at ORNL have created a unique simulation technology that allows software systems to participate in slower-than-real-time simulation exercises without requiring recompilation of source code, relinking of object files, or modifications to operating system software. The software participates in the simulation while running on its native operating system and interacting with the simulated world through its native interfaces.
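To make the idea concrete, the sketch below shows one way a lockstep co-simulation loop between a simulated building physics model and an unmodified control agent reached over a web API might look. It is a minimal illustration only, not ORNL's implementation: the PhysicsModel class, the /step endpoint, and the field names are assumptions introduced for this example.

    import requests  # the agent's REST endpoint below is hypothetical

    class PhysicsModel:
        """Stand-in for a simulated building physics model."""
        def __init__(self):
            self.time = 0.0
            self.indoor_temp_c = 21.0

        def advance_to(self, t):
            # Integrate the building thermal model up to simulated time t (stub).
            self.time = t

    def run_cosimulation(end_time, step, agent_url):
        model = PhysicsModel()
        t = 0.0
        while t < end_time:
            t += step
            model.advance_to(t)  # 1. the simulated world advances first
            # 2. the unmodified control software sees the new sensor values and
            #    replies with its actuation decision through its native web API
            reply = requests.post(f"{agent_url}/step",
                                  json={"sim_time": t,
                                        "indoor_temp_c": model.indoor_temp_c},
                                  timeout=10)
            setpoint = reply.json().get("setpoint_c", model.indoor_temp_c)
            # 3. the decision is applied to the simulated world before the clock
            #    advances again, keeping software and physics in lockstep
            model.indoor_temp_c += 0.1 * (setpoint - model.indoor_temp_c)

Because neither side advances until the other has caught up, the exercise can run as slowly as the software under test requires, which is the property described above as slower-than-real-time operation.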

The Impact
For the first time, it is possible to assess the performance of complex, software-intensive systems without compromising the fidelity of the simulation models for physical assets and without building expensive simulation models of the software systems themselves. Because the software assets exist exactly as they do in the real world, but operate within a simulated physical environment, it becomes possible to perform system studies that would otherwise be precluded by high cost, risk of harm, or both; for example, studies of cyberattacks against critical infrastructure.

Funding: Missile Defense Agency

Publication for this work:
Ozmen, O.; Nutaro, J.; Starke, M.; Munk, J.; Roberts, L.; Kou, X.; Im, P.; Dong, J.; Li, F.; Kuruganti, T.; Zandi, H. Power Grid Simulation Testbed for Transactive Energy Management Systems. Sustainability 2020, 12, 4402. https://doi.org/10.3390/su12114402

Summary

As the growth of data sizes continues to outpace computational resources, there is a pressing need for data reduction techniques that can significantly reduce the amount of data and quantify the error incurred in compression. Compressing scientific data presents many challenges for reduction techniques, since the data often lives on non-uniform or unstructured meshes, comes from a high-dimensional space, and has many quantities of interest (QoIs) that must be preserved. To illustrate these challenges, we focus on data from a large-scale fusion code, XGC. XGC uses a particle-in-cell (PIC) technique that generates hundreds of petabytes (PB) of data per day across thousands of timesteps. XGC uses an unstructured mesh and must compute many QoIs from the raw data, f.
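As a concrete, simplified illustration of what computing a QoI from the raw data f means, the sketch below derives a density-like moment by integrating a discretized distribution function over its velocity-space axes at every mesh node. The array shape, grid spacings, and omitted physical normalizations are assumptions for this example; XGC's actual mesh and velocity-grid layout are not described in this summary.

    import numpy as np

    def density_from_f(f, dvpara, dvperp):
        """f has shape (n_mesh_nodes, n_vpara, n_vperp); returns one value per mesh node."""
        # Density-like QoI: sum f over the two velocity dimensions, scaled by the
        # velocity-space cell size (physical constants and Jacobians omitted).
        return f.sum(axis=(1, 2)) * dvpara * dvperp

    # Synthetic stand-in for one timestep of output
    f = np.random.rand(1000, 32, 16)   # 1000 mesh nodes, 32x16 velocity grid
    n = density_from_f(f, dvpara=0.1, dvperp=0.1)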

One critical aspect of the reduction is that the QoIs derived from the data (density, temperature, flux-surface-averaged momenta, etc.) must retain relatively high accuracy. We show that by compressing XGC data on the high-dimensional, non-uniform grid on which the data is defined, and by adaptively quantizing the decomposed coefficients based on the characteristics of the QoIs, the compression ratios obtained at various error tolerances using a multilevel compressor (MGARD) increase by more than a factor of ten. We then show how to mathematically guarantee that the accuracy of the QoIs computed from the reduced f is preserved during compression. Using MGARD's QoI error control theory, the error in the XGC density can be kept under a user-specified tolerance over 1,000 timesteps of simulation, whereas traditional error control on the data being reduced does not guarantee the accuracy of the QoIs.
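The distinction between controlling error on the raw data and controlling error on a derived QoI can be seen even with a toy compressor. In the sketch below, uniform quantization plays the role of a lossy reducer (it is not MGARD, and MGARD's QoI-aware error-control mathematics is not reproduced here); the pointwise error on f is bounded by construction, yet the error in the integrated QoI is not bounded by the same tolerance.

    import numpy as np

    def lossy_round_trip(f, abs_error_bound):
        """Toy reducer: uniform quantization keeps every value within abs_error_bound."""
        q = np.round(f / (2.0 * abs_error_bound))
        return q * (2.0 * abs_error_bound)

    def density(f, dv):
        return f.sum(axis=1) * dv      # simple QoI: a velocity-space moment of f

    f = np.random.rand(1000, 512)      # synthetic "distribution function"
    dv = 0.05
    f_hat = lossy_round_trip(f, abs_error_bound=1e-3)

    raw_err = np.max(np.abs(f - f_hat))                            # <= 1e-3 by construction
    qoi_err = np.max(np.abs(density(f, dv) - density(f_hat, dv)))  # not bounded by 1e-3
    # Per-point errors accumulate in the integral, so qoi_err can exceed the raw
    # tolerance; bounding the derived quantity directly is the role of QoI-aware
    # error control.
    print(raw_err, qoi_err)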