Abstract
Ferritic-martensitic (FM) steels for in-vessel components of proposed fusion reactors are expected to experience high levels of displacement damage and helium (He) generation by neutron transmutation. However, the synergistic effects of He on cavity swelling in FM steels are still not well understood. To gain fundamental insight into the effect of He on cavity swelling, high-purity Fe and Fe-10 wt.% Cr ferritic model alloys were irradiated with 8 MeV Ni ions and co-implanted with He ions at 400 to 550 °C up to 30 displacements per atom (dpa), with He implantation rates of 0.1, 10, and 50 appm He/dpa. The current study focuses on the behavior at 50 appm He/dpa and 500 °C, compared with that at 0.1 and 10 appm He/dpa. Irradiation-induced defects, including cavities, dislocation loops, and dislocation networks, were characterized using transmission electron microscopy (TEM). In the grain interior, a bimodal cavity size distribution was observed in the 10 and 50 appm He/dpa samples but not in the 0.1 appm He/dpa sample. Cavity swelling was maximized at an intermediate He implantation rate of ∼10 appm He/dpa for both ion-irradiated Fe and Fe-10Cr alloys. The cavity swelling behavior as a function of He implantation rate appears to be controlled by the He/dpa-dependent variation of cavity sink strengths. Treating small bubbles as biased sinks for interstitial absorption can significantly increase the ratio of biased to unbiased sink strengths (Q) and yields maximum cavity swelling when Q is close to one.
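A minimal rate-theory sketch, not taken from this work, illustrates why swelling peaks when the biased-to-unbiased sink-strength ratio Q approaches unity. Assuming steady-state, sink-dominated point-defect kinetics with a small net interstitial bias, and using generic symbols (damage rate $G$, bias $B$, total biased sink strength $k_b^{2}$ for dislocations plus small bubbles treated as biased, unbiased cavity sink strength $k_c^{2}$) that may differ from the notation used later in the paper, the net vacancy accumulation rate in cavities scales as
\[
\dot{S} \;\propto\; G\,B\,\frac{k_b^{2}\,k_c^{2}}{\left(k_b^{2}+k_c^{2}\right)^{2}}
\;=\; G\,B\,\frac{Q}{\left(1+Q\right)^{2}},
\qquad Q \equiv \frac{k_b^{2}}{k_c^{2}}.
\]
The factor $Q/(1+Q)^{2}$ vanishes in both limits $Q \to 0$ and $Q \to \infty$ and reaches its maximum at $Q = 1$, consistent with the observation that swelling is maximized when the sink strengths of biased and unbiased sinks are comparable.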