Outside the high-performance computing, or HPC, community, exascale may seem more like fodder for science fiction than a powerful tool for scientific research. Yet, when seen through the lens of real-world applications, exascale computing goes from ethereal concept to tangible reality with exceptional benefits.
Wildfires have shaped the environment for millennia, but they are increasing in frequency, range and intensity in response to a hotter climate. The phenomenon is being incorporated into high-resolution simulations of the Earth’s climate by scientists at the Department of Energy’s Oak Ridge National Laboratory, with a mission to better understand and predict environmental change.
As extreme weather devastates communities worldwide, scientists are using modeling and simulation to understand how climate change impacts the frequency and intensity of these events. Although long-term climate projections and models are important, they are less helpful for short-term prediction of extreme weather that may rapidly displace thousands of people or require emergency aid.
With the world’s first exascale supercomputer now fully open for scientific business, researchers can thank the early users who helped get the machine up to speed.
To support the development of a revolutionary new open fan engine architecture for the future of flight, GE Aerospace has run simulations using the world’s fastest supercomputer capable of crunching data in excess of exascale speed, or more than a quintillion calculations per second.
Simulations performed on the Summit supercomputer at ORNL revealed new insights into the role of turbulence in mixing fluids and could open new possibilities for projecting climate change and studying fluid dynamics.
At the National Center for Computational Sciences, Ashley Barker enjoys one of the least complicated-sounding job titles at ORNL: section head of operations. But within that seemingly ordinary designation lurks a multitude of demanding roles as she oversees the complete user experience for NCCS computer systems.
A trio of new and improved cosmological simulation codes was unveiled in a series of presentations at the annual April Meeting of the American Physical Society in Minneapolis.
Oak Ridge National Laboratory, in partnership with the National Oceanic and Atmospheric Administration, is launching a new supercomputer dedicated to climate science research. The new system is the fifth supercomputer to be installed and run by the National Climate-Computing Research Center at ORNL.
ORNL’s next major computing achievement could open a new universe of scientific possibilities accelerated by the primal forces at the heart of matter and energy.