September 12, 2018 – Ask the staff and researchers at the Department of Energy’s Oak Ridge National Laboratory how they first found their way to the lab, and you will get a remarkably varied list of answers. Some came as student interns, some grew up in the area and knew the lab by reputation, and others were drawn in by a news article or a compelling conference lecture from an employee.
For David Green, a computational physicist in the Theory and Modeling group of the Fusion and Materials for Nuclear Systems Division, it was a fateful Internet search that pointed him in this direction.
“I literally Googled ‘supercomputing’ and ‘fusion’ and Oak Ridge was on the top of the list,” David said. “ORNL is unique in having world-leading programs in both on-site, and direct interaction between the two enables research we might not have considered otherwise.”
David came to ORNL as a postdoc from Newcastle University in Australia. He studied physics during his undergraduate years and worked for a semester in a professor’s lab, which engaged him more than the typical lectures and got him interested in research.
After completing his graduate work in near-Earth space physics, studying the environment of spacecraft and predicting space weather, he decided on a change and went looking for a field that more fully engaged his curiosity.
“I wanted to change fields into something where it would be easy to explain why it was important,” David said. “It wasn’t a huge stretch from space plasma physics to fusion plasma physics, and fusion’s importance is simple to explain. It’s one of those world-changing things where everybody wins if we get it to work.”
It was at this time that he also became involved with supercomputing, so ORNL was a natural fit for his research interests. Today, David works on fusion plasma simulations, creating new tools to model and understand the physics at the heart of fusion devices.
“Fusion is effectively trying to replicate the sun in a bottle,” David said. “The plasma can be more than 10 million degrees, and directly measuring what’s happening in such an extreme environment is very challenging.”
If measuring everything that’s happening in a physical experiment on a fusion plasma is next to impossible, then the goal for David and other computational physicists is to build a virtual bottle, one that is customizable and rich enough in information to be an acceptable simulacrum.
“If you have the underlying equations of the fusion plasma, which we do, and a sufficiently powerful computer, which we also have, you’d just be able to solve those equations and run simulated experiments where you know all of the variables everywhere at every time,” he said. “Obviously there is more to it than that, but the idea of accurate simulations which produce huge amounts of data that can be mined for insight in the same way you would a physical experiment is a compelling approach.”
Traditionally, fusion problems have been too large to solve in full; they had to be simplified and shrunk to fit the capabilities of the computer, which limited what researchers could learn from them. With the rise of modern high-performance computing and a new generation of supercomputers like Summit, it is becoming possible to tackle fusion problems holistically, with more physics included and more complex information to extract.
“We’re getting to the point now where computers are becoming large enough to maybe fit the full problem, so we’re actually developing new tools that will hopefully allow us to solve it all together,” David said.
Building a new code to tackle the entire fusion plasma is no easy task, especially when you are designing it for the world’s most powerful supercomputer and the systems beyond that. Machines like Summit can be daunting in size, but the potential for discovery has researchers like David ready to investigate how to best utilize them and to advance computational physics in the process.
“What I find most fascinating about my work is probably the scale of the tools we have to bring to bear on the problems we’re studying,” he said. “The computational tools are growing at a rate that almost exceeds our ability to use them.”
Gone are the days of writing code for the present generation of supercomputers and expecting it to work unchanged on the system you’ll have in five years, David said. Science applications used to have longevity, with some simulation and modeling programs still in use 10, 20 or even 30 years after their invention. Now, though, the breakneck rate at which major computational resources are changing is driving computational physicists to reassess how they approach problem solving and how to adapt their field to keep up with the pace of innovation.
“The approach to creating simulation codes has changed a lot, even in the 10 years since I’ve been here. The scale is just so much larger,” David said. “And if we really want to get the most out of the new supercomputing platforms, we have to be able to adapt and change the way we do things even more.”
Summit, for instance, contains hardware specifically designed for artificial intelligence and machine learning applications. Researchers who use Summit the way they used previous supercomputers will still get an amazing new tool, David said, but those who can recast their problems in the language of machine learning make that tool even more powerful, opening up still more possibilities for the research that can be done.
“Figuring out how to reframe the problems from the approaches we’ve traditionally taken into those that suit the tools we have is exciting,” he said. “It presents a challenge that, if you’re successful, presents the opportunity to make progress or add in more physics and hopefully make the scientific discoveries we are searching for.”
Unlocking the potential of these powerful machines requires more expertise than can be found in one person or research group. Given the complex nature of fusion modeling, having both supercomputing and fusion experts in the same facility enables collaborations that would be difficult in other institutions. It also allows David to stay informed of other research programs at the lab, where collaboration often results in saving significant amounts of time and effort.
“What I tend to do now is try to educate myself on all the different tools and talk to more people across disciplines,” he said. “I’ll go talk to the mathematicians, the computer scientists, and I find the more I do that, the more I realize that’s how we are going to solve the really challenging problems. You need to bring together the tools that a wide range of expertise can provide.”
Once a team overcomes the “language gap” between fields and establishes a common understanding, it can reap the benefits that only collaborative work provides. Summit and the Exascale Computing Project represent major scientific investments, and whatever comes next promises to continue that pattern of growth and innovation, requiring those who work with these systems to broaden their scope as they expand into new scientific frontiers.
“The scale is much larger, so you have to think in teams. We are building teams of domain scientists, computer scientists, applied mathematicians and supercomputing hardware experts,” David said. “You need teams of people who can help you utilize the tools that will help you do the science you want to do.”
ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit http://science.energy.gov/.