Abstract
The performance of the metallic fuel alloys U-10Mo and U-17Mo was examined using the MiniFuel test system in the High Flux Isotope Reactor. Disks approximately 0.8 mm thick were irradiated at target temperatures of 250°C, 350°C, 450°C, and 500°C to three different fission densities, up to a maximum of 6.8×10²⁰ cm⁻³. Fission rates decayed from 3–6×10¹³ cm⁻³s⁻¹ to 2–4×10¹³ cm⁻³s⁻¹ over the course of the longest (eight-cycle) irradiation as ²³⁵U was consumed and ²³⁹Pu accumulated. The actual irradiation temperature on the last day of irradiation was measured via dilatometry of the SiC passive thermometry recovered from the MiniFuel subcapsules and compared favorably with thermal calculations using as-built geometry and test conditions. Furthermore, average simulated temperatures were within 50°C of the target temperatures, except for the 500°C irradiation, for which the deviation was 62°C, 32°C, and 90°C in the two-, four-, and eight-cycle irradiations, respectively. Fission gas release (FGR) measurements showed no release above recoil for any U-17Mo fuel or for the U-10Mo fuel irradiated at 250°C. Significant (40%–80%) FGR was found for the medium- to highest-burnup U-10Mo samples irradiated at target temperatures of 350°C–500°C. Significant FGR correlated with sample thickness swelling, which reached 13%–35% for high-release samples and remained below 7% for all other (low-gas-release) samples.
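As a rough consistency check (not part of the paper), the quoted fission rates and the eight-cycle duration can be combined to reproduce the order of magnitude of the reported maximum fission density. The cycle length and the time-averaged fission rate used below are assumptions for illustration only; actual HFIR cycle lengths and the rate history vary.

```python
# Order-of-magnitude check (assumed values, not from the paper):
# - nominal HFIR cycle length of ~24 days (assumption)
# - time-averaged fission rate of ~4e13 fissions cm^-3 s^-1, chosen between
#   the quoted start (3-6e13) and end (2-4e13) values (assumption)

CYCLE_LENGTH_S = 24 * 24 * 3600    # assumed ~24-day cycle, in seconds
N_CYCLES = 8                       # longest irradiation in the study
AVG_FISSION_RATE = 4.0e13          # assumed average rate, fissions cm^-3 s^-1

fission_density = AVG_FISSION_RATE * CYCLE_LENGTH_S * N_CYCLES
print(f"{fission_density:.2e} fissions/cm^3")
# Same order of magnitude as the reported maximum of 6.8e20 cm^-3
```

Under these assumed inputs the estimate lands near the reported 6.8×10²⁰ cm⁻³ maximum, which is consistent with the stated rates decaying only modestly over the irradiation.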