Abstract—Computing centers such as NERSC and OLCF have traditionally focused on delivering computational capability that enables breakthrough innovation in a wide range of science domains. Accessing that computational power has required services and tools to move data between input, output, computation, and storage. A "pivot to data" is now occurring in HPC: data transfer tools and services that were previously peripheral are becoming integral to scientific workflows. Emerging requirements from high-bandwidth detectors, high-throughput screening techniques, highly concurrent simulations, an increased focus on uncertainty quantification, and an emerging open-data policy posture toward published research are among the data drivers shaping the networks, file systems, databases, and overall compute and data environment. In this paper we explain the pivot to data in HPC through user requirements and the changing resources provided by HPC centers, with a particular focus on data movement. For WAN data transfers, we present the results of a study of network performance between centers.