
Super construction at super scale: How ORNL built a new home for Frontier

Staff working on construction and facility updates in preparation for Frontier, the world’s first exascale supercomputer. Credit: Carlos Jones/ORNL, U.S. Dept. of Energy

Making room for the world’s first exascale supercomputer took some supersized renovations.

Frontier’s 74 cabinets cover more than 7,300 square feet in the Oak Ridge Leadership Computing Facility’s data center at the Department of Energy’s Oak Ridge National Laboratory. That footprint is roughly 1,700 square feet larger than the one occupied by its high-speed predecessor, Summit, and more than three times the size of the average American home, all for a machine that runs on 30 million watts of electricity.

Those numbers don’t include the infrastructure necessary to support Frontier, from the 90-plus miles of cable connecting its cabinets to the million pounds of cooling pipes that keep the machine from overheating to the 50 cabinets of Orion, the file system where users store data from the world-changing studies run on Frontier.

As if that weren’t enough of a challenge, renovations had to begin before Frontier was built, in the middle of a pandemic and global shutdown.

“Part of the ordeal of this kind of project is you’re building a facility for a machine that doesn’t yet exist,” said Bart Hammontree, an ORNL project manager who oversaw the renovations. “This isn’t a computer you pick out of a catalog where you can just look up the dimensions. It’s the first machine of its kind, and nobody knew at the beginning exactly what Frontier would look like.”

The work began with bulldozing existing structures and breaking up concrete for Frontier’s new cooling plant. A robotic hammer knocked down the tallest wall, starting 50 feet up and working its way down.

Once demolition ended, work began to build the new cooling towers and the elaborate pipe system that would carry water to the data center.

“We had over a million pounds of equipment and piping, so this was a real structural engineering challenge,” Hammontree said. “This is a system that’s running anywhere from 6,000 to 10,000 gallons a minute over the roof to the computer center to keep Frontier running. We had a structural steel frame called a jungle gym to support all the pipes, but it had to be built piece by piece in the right sequence, because those pipes won’t move once they’re in it.”

After the cooling plant and pipe system came the main chamber of the data center, Frontier’s new home. The supercomputer’s 74 cabinets each weigh in at 4 tons, heavier than two full-size pickups apiece and beyond the strength of most floors.

“In order to support that kind of weight, you basically have to transfer the load down to the bedrock below the building,” Hammontree said. “So we had to pour concrete pads that would sit on foundations straight down to the rock. This was inside an existing building, so we can’t just come in and start pouring. We had to lay these micropiles where we got a small drill rig in and poured concrete around each foundation column to transfer that weight.”

The construction team navigated their way around 75 years’ worth of existing pipes, electrical lines and other utilities as they laid the foundation, often with no original blueprints or other guides.

“We had to scan with ground-penetrating radar and electromagnetic wands, looking for those magnetic fields a live circuit would give off,” Hammontree said. “We’d lock out all the power in the area and sometimes still find a live signal. It took a lot of detective work, but we made safety the top priority and never took a shortcut.”

Another enemy — gravity — nearly shut down work. The team had finished installing the pipe system when a joint on one of the 24-inch plastic pipes burst and began pouring water during a pressure test.

“The whole building shook from that one,” Hammontree said. “That was a bad day. Nobody got hurt, though. We just got the roof wet. We were able to identify where the welding failed, and within three weeks we had the pipe full of water again and were back on schedule.”

Through the entire project, crews followed strict safety protocols to avoid spreading COVID-19. A single sick worker could infect enough crewmates to delay work by weeks.

“In the end, that was probably the biggest challenge of all,” Hammontree said. “We had daily meetings to reshuffle work plans and just keep everything going to finish on time. The great thing about an experienced team like ours is we don’t get surprised often, even during a situation like the pandemic. We see the problems, learn from them and apply those lessons as we go.”

The Oak Ridge Leadership Computing Facility is a DOE Office of Science user facility located at ORNL.

UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science. – Matt Lakin

Read more: Super speeds for super AI: Frontier sets new pace for artificial intelligence

Interactive News: The Journey to Frontier: The Story of How the Exascale Era Began

Documentary Video: The Journey to Frontier