Research Horizons: Unraveling a Twister
University of Oklahoma researchers use supercomputing to detail the inner workings of tornadoes.
Anyone from Nebraska to Nashville knows a tornado when they see one. And, hopefully, they know to duck for cover. Tornadoes are among nature's most powerful weather weapons.
Just last month, in a span of 24 hours beginning on April 27th, the southeastern U.S. saw a rare outbreak of tornadoes that resulted in a combined 344 deaths, according to estimates by the National Weather Service and the National Oceanic and Atmospheric Administration. Not since 1936 have more people been killed in a two-day period.
Despite their prevalence in a region of the central United States known as "tornado alley," there is still much we don't know about these much-feared funnels from the sky.
For starters, gathering any sort of data from actual tornadoes is risky business, with chasers physically following storms into the heart of harm's way. These chasers might witness a handful of tornadoes a year, and their mobile radar systems measure only certain variables, such as wind velocity and precipitation intensity. To truly understand tornadoes, or to predict them, researchers need data that is currently unavailable, such as pressure readings and the storms' three-dimensional wind structure. For that, they need far more tornadoes than the atmosphere produces.
"I don't need three, I need three hundred," says Amy McGovern, an assistant professor in the School of Computer Science at the University of Oklahoma, located in the heart of tornado alley. McGovern is also the principal investigator of a project that is using the University of Tennessee's Kraken supercomputer to better understand, and hopefully one day predict, tornadoes.
In order to do that, McGovern's team uses data from on-the-ground observations and monitoring systems to create a set of variables that describe conditions which may, or may not, create a tornado.
The research is funded by the National Science Foundation's Faculty Early Career Development Program, which offers the NSF's most prestigious awards in support of junior faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research within the context of the mission of their organizations.
The roots of the tornado project go back five years. Back then, says McGovern, the team took observational data from a 20-year-old storm and tweaked a few environmental variables to create more than 250 simulated storms at 500-meter resolution, with 40 variables measured at every grid point.
The team quickly realized that a higher resolution was needed to achieve the accuracy they sought. Thanks to supercomputers such as Kraken, this enhanced resolution is now possible. Funded by the National Science Foundation and managed by the University of Tennessee's National Institute for Computational Sciences, Kraken is a Cray XT5 supercomputer housed at Oak Ridge National Laboratory.
McGovern's team is now generating 150 simulations of possible tornado-precursor storms at 75-meter resolution, with each simulation producing two to three storms and consuming 30 hours on 3,000 of Kraken's more than 112,000 cores. Of these, says McGovern, approximately 50 to 75 of the storms will produce tornadoes, supplying researchers with a sample large enough to unravel the mysteries of one of Mother Nature's most common terrors.
These simulations delve into the most complex players in tornadic storms, such as rotating updrafts (upward-moving currents of air that are tilted and rotating), downdrafts, vorticity (a measure of instantaneous spin), tilt (how much horizontal vorticity has tilted toward the vertical) and the various relationships among these factors. "The important thing," says McGovern, "is understanding how these variables interact." If a storm does in fact generate a tornado, the team begins the process of "relational" data mining. Whereas in the past these variables have been studied individually, McGovern's relational approach studies the relationships among them (more than 40 variables in all, 20 of which are examined intensely). In other words, the team is not looking at individual factors, but at how they change over space and time.
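To make the idea of relational mining concrete, here is a toy sketch (not the team's actual code, and using made-up fields) of the difference between examining one variable at a time and examining how two variables co-vary over space and time. The field names and the coupling between them are illustrative assumptions only.

```python
# Toy illustration of "relational" analysis: rather than thresholding one
# storm variable at a time, measure how pairs of simulated fields co-vary
# across the grid at each timestep. All values here are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical storm fields on a (time, y, x) grid.
t_steps, ny, nx = 8, 16, 16
updraft = rng.normal(size=(t_steps, ny, nx))                      # vertical wind
vorticity = 0.6 * updraft + rng.normal(size=(t_steps, ny, nx))    # partly coupled spin

def spatial_correlation(a, b):
    """Pearson correlation between two fields, computed separately per timestep."""
    a_flat = a.reshape(a.shape[0], -1)
    b_flat = b.reshape(b.shape[0], -1)
    a_c = a_flat - a_flat.mean(axis=1, keepdims=True)
    b_c = b_flat - b_flat.mean(axis=1, keepdims=True)
    num = (a_c * b_c).sum(axis=1)
    den = np.sqrt((a_c ** 2).sum(axis=1) * (b_c ** 2).sum(axis=1))
    return num / den

# One relationship value per timestep: a tiny "relational" feature that can be
# tracked through time, unlike a snapshot of either variable alone.
corr = spatial_correlation(updraft, vorticity)
print(corr.shape)
```

In a real analysis such pairwise (and higher-order) relationships would be computed among dozens of variables, which is what makes the search space so large.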
Data mining is necessary because each simulation generates approximately a terabyte of data, far too much information to investigate by traditional means. For example, while an updraft is just one of the variables being studied, the team will investigate all the variables inside the updraft, such as the pressure gradient and the tilt of the updraft itself. With simulations this complex, spanning multiple space and time scales, the amount of data would be unmanageable without supercomputers to quickly locate the important figures in a sea of numbers.
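The kind of reduction described above can be sketched in a few lines. This is a minimal, hypothetical example (invented grid sizes, thresholds and units, not the project's data): isolate the updraft region of one simulated timestep and boil the other fields inside it down to a few summary numbers.

```python
# Minimal sketch of mining inside an updraft region: keep only a handful of
# summary statistics instead of the full field. All values are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-timestep fields on a (z, y, x) grid.
nz, ny, nx = 10, 32, 32
w = rng.normal(loc=0.0, scale=5.0, size=(nz, ny, nx))             # vertical velocity (m/s)
pressure = 1000.0 + rng.normal(scale=2.0, size=(nz, ny, nx))      # pressure (hPa)

# Treat grid points with strong upward motion as "inside the updraft"
# (the 10 m/s cutoff is an arbitrary illustrative threshold).
updraft_mask = w > 10.0

# A few numbers replace millions of grid-point values.
summary = {
    "updraft_points": int(updraft_mask.sum()),
    "mean_w": float(w[updraft_mask].mean()),
    "mean_pressure": float(pressure[updraft_mask].mean()),
}
print(summary)
```

Scaled up to 40-plus variables, terabyte-sized grids and many timesteps, this filtering is the step that makes the data volume tractable.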
While the simulations are being performed on Kraken, the majority of the data mining is being performed on Nautilus, an SGI Altix UV 1000 system that serves as the centerpiece of UT's new Remote Data Analysis and Visualization Center, which is also located at Oak Ridge National Laboratory. Nautilus's unique architecture provides an excellent platform for relational data mining. "Nautilus is fabulous," says McGovern, adding that the system allowed her team to do three months of work in approximately 12 hours.
Overall, the team hopes their work will significantly reduce the false alarm rate for tornado warnings, currently about 75 percent, and increase warning lead time, currently around 12–14 minutes. "If we can change our understanding of how tornadoes form," says McGovern, "then, hopefully, that will lead to better prediction algorithms."
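For readers unfamiliar with the metric, the false alarm rate quoted above is simply the fraction of issued warnings that are not followed by a tornado. With hypothetical counts chosen only to illustrate the arithmetic:

```python
# Illustrative only: hypothetical warning counts, not real NWS statistics.
warnings_issued = 100
warnings_verified = 25   # warnings followed by an actual tornado

# False alarm rate = unverified warnings / total warnings issued.
false_alarm_rate = (warnings_issued - warnings_verified) / warnings_issued
print(false_alarm_rate)  # 0.75, i.e. 75 percent
```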
For example, if the team's simulations reveal that a certain set of storm conditions usually causes an F5 tornado (among the most powerful tornadoes), then perhaps observers on the ground could watch for those conditions in actual storms. Even if those conditions cause an F5 only half of the time, says McGovern, sounding a warning might save lives.
So far, the team has generated 30 of the planned 150 simulations. With Kraken's recent upgrade to 1.17 petaflops, the team should be able to forge ahead even faster than before. And tornadoes are just the tip of the iceberg for the mining algorithms McGovern's team has developed and employed: they could be used in other fields of science, from other forms of atmospheric turbulence across the U.S. to robotics.
For now the team will continue to analyze the enormous volumes of data from their tornado simulations, providing the scientific community with a new understanding of twisters and enabling enhanced predictive capability that could give everyone from Nashville to Nebraska a little more time to duck for cover.—Gregory Scott Jones.