Upcoming Training Courses
Training is provided by developers and expert users from the SCALE team. Courses provide a review of theory, description of capabilities and limitations of the software, and hands-on experience running problems of varying levels of complexity.
All attendees must be licensed users of SCALE 6.2, which is available from ORNL/RSICC, the OECD/NEA Data Bank in France, and the RIST/NUCIS in Japan. All currently scheduled SCALE Courses are described below.
Course Dates and Descriptions
The next SCALE training block at Oak Ridge National Laboratory will be held October 15 – November 9, 2018. The course schedule will be announced soon.
SCALE Training Course Extended Descriptions
SCALE Criticality Safety Calculations Course
This course provides instruction on the use of the KENO Monte Carlo codes for criticality safety calculations and is appropriate for beginning through advanced users. KENO V.a is a fast and easy-to-use code that allows users to build complex geometry models using basic geometrical bodies such as cuboids, spheres, cylinders, hemispheres, and hemicylinders. KENO-VI is a 3-D generalized geometry Monte Carlo code that allows for versatile modeling of complex geometries. Both versions of KENO provide convenient, efficient methods for modeling repeated and nested geometry configurations such as lattices. Both versions of KENO use ENDF/B-VII.0 or ENDF/B-VII.1 cross-section data distributed with SCALE to perform either continuous energy (CE) or multigroup (MG) calculations. KENO includes a 2D color plotting capability and produces easy-to-navigate HTML output. This class uses the Fulcrum user interface for interactive model setup, visualization, computation, and output review. The KENO3D tool is still used in SCALE 6.2 for 3-D visualization. Instruction is also provided on the SCALE material input and resonance self-shielding capabilities and Fulcrum capabilities for visualizing fluxes, reaction rates, and cross-section data.
SCALE Criticality Safety and Radiation Shielding Course
This course provides instruction on the use of the KENO-VI Monte Carlo code for criticality safety calculations and the MAVRIC (Monaco with Automated Variance Reduction using Importance Calculations) shielding sequence with 3-D automated variance reduction for deep-penetration problems. KENO-VI is a 3D eigenvalue Monte Carlo code for criticality safety and Monaco is a 3D fixed-source Monte Carlo code for shielding analysis. Both codes use the SCALE Standard Composition Library and the SCALE Generalized Geometry Package (SGGP), which allows for versatile modeling of complex geometries and provides convenient, efficient methods for modeling repeated and nested geometry configurations such as lattices. The MAVRIC sequence is based on the CADIS (Consistent Adjoint Driven Importance Sampling) methodology. For a given tally in a Monte Carlo calculation that the user wants to optimize, the CADIS method uses the result of an adjoint calculation from the Denovo 3D deterministic code to create both an importance map for weight windows and a biased source distribution. MAVRIC is completely automated: from a single user input, it creates the cross sections (forward and adjoint), computes the adjoint fluxes, creates the importance map and biased source, and then executes Monaco. An extension to the CADIS method using both forward and adjoint discrete ordinates calculations (FW-CADIS) is included in MAVRIC so that multiple point tallies or mesh tallies over large areas can be optimized (calculated with roughly the same relative uncertainty). Both KENO and Monaco use ENDF/B-VII.0 or ENDF/B-VII.1 cross-section data distributed with SCALE to perform continuous energy (CE) or multigroup (MG) calculations. Both codes can also be used with the Fulcrum consolidated SCALE user interface and KENO3D for interactive model setup, computation, output review, and 3-D visualization.
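The CADIS construction described above can be summarized compactly. In the standard notation of the open literature (not any particular SCALE input format), the adjoint flux φ† from the deterministic calculation serves as the importance function for a tally with response R:

```latex
% Response of interest, estimated from the adjoint flux and the true source q:
R = \int\!\!\int \phi^{\dagger}(\vec r, E)\, q(\vec r, E)\, dE\, d\vec r
% Biased source distribution sampled by the Monte Carlo code:
\hat q(\vec r, E) = \frac{\phi^{\dagger}(\vec r, E)\, q(\vec r, E)}{R}
% Weight-window target values, chosen consistent with the biased source:
\bar w(\vec r, E) = \frac{R}{\phi^{\dagger}(\vec r, E)}
```

Because the biased source and the weight windows are derived from the same adjoint solution, source particles are born with weights that already match their weight windows, which is the "consistent" part of CADIS.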
Instruction is also provided on the SCALE material input and resonance self-shielding capabilities and the data visualization capabilities within Fulcrum for visualizing fluxes, reaction rates, and cross-section data as well as mesh tallies. KENO-VI and MAVRIC can be applied together to perform an integrated criticality accident alarm system (CAAS) analysis.
SCALE/ORIGEN Standalone Fuel Depletion, Activation, and Source Term Analysis Course
This is a hands-on class that covers the use of ORIGEN for isotopic depletion, decay, decay heat, and radiation source term calculations. The course features the use of the Fulcrum consolidated SCALE graphical interface and Fulcrum plotting capabilities for displaying nuclear data and results. The class includes activation, spent fuel, and nuclear safeguards and security analysis problems. This class provides an introduction to the ORIGAMI tool for convenient characterization of spent nuclear fuel with radially and axially varying burnup. Advanced applications, including simulation of chemical processing with continuous feed and removal, are also covered.
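ORIGEN solves large coupled systems of transmutation equations for thousands of nuclides. As a minimal illustration of the underlying physics only (not ORIGEN's algorithm or data), the classic Bateman solution for a two-member decay chain, dN1/dt = -λ1·N1 and dN2/dt = λ1·N1 - λ2·N2, can be coded directly; the decay constants below are hypothetical:

```python
import math

def bateman_two(n0: float, lam1: float, lam2: float, t: float):
    """Analytic Bateman solution for a parent -> daughter decay chain.

    n0   : parent atoms at t = 0 (daughter starts at zero)
    lam1 : parent decay constant (per unit time)
    lam2 : daughter decay constant (per unit time), lam2 != lam1
    Returns (N1(t), N2(t)).
    """
    n1 = n0 * math.exp(-lam1 * t)
    n2 = (n0 * lam1 / (lam2 - lam1)
          * (math.exp(-lam1 * t) - math.exp(-lam2 * t)))
    return n1, n2

# Illustrative (made-up) decay constants: parent 0.1/h, daughter 0.5/h.
n1, n2 = bateman_two(1000.0, 0.1, 0.5, t=5.0)
```

An activity or decay-heat source term then follows by weighting each nuclide inventory with its decay constant and emission data, which is the kind of bookkeeping ORIGEN automates at scale.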
SCALE/Polaris Lattice Physics, Depletion, and Uncertainty Analysis
Polaris is a new 2-dimensional (2-D) lattice physics capability in the SCALE code system for LWR analysis. Polaris provides an easy-to-use input for defining lattice geometries, material compositions, and reactor state conditions. Other features of Polaris include a new resonance self-shielding implementation based on the novel embedded self-shielding method (ESSM), a new 2-D method of characteristics (MOC) neutron transport solver, and the integration of the ORIGEN depletion and decay solver for depleting material compositions. For the first three days of this five-day course, attendees will learn how to model typical PWR and BWR assemblies (VVER is currently not supported): develop geometry models, perform depletion simulations, set up branch and history calculations to generate few-group cross sections for full-core nodal diffusion analysis (.t16 file), and perform reflector calculations.
Sampler is a new uncertainty analysis capability in SCALE that propagates uncertainties in nuclear data and input parameters to estimate the resulting uncertainty in calculated responses for most codes and sequences within the SCALE code system. Using stochastic sampling to generate perturbed calculation models, Sampler can automate multiple runs (i.e., samples) of a calculation model and then post-process the outputs to quantify the uncertainty in user-selected quantities of interest. In the final two days of this course, attendees will learn how to use Sampler with Polaris to quantify the uncertainty in lattice physics quantities of interest (reactivity, nodal cross sections, isotopic inventories) from a broad range of input uncertainty sources (nuclear data, geometry, composition, and reactor condition).
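The stochastic-sampling workflow Sampler automates can be sketched generically: perturb each uncertain input according to its distribution, rerun the model per sample, and summarize the spread of the response. The toy response function and input uncertainties below are hypothetical stand-ins for a real Polaris run, used only to show the pattern:

```python
import random
import statistics

def sample_uq(model, nominal, rel_sigma, n_samples=500, seed=42):
    """Stochastic-sampling UQ sketch: apply normally distributed relative
    perturbations to each input, evaluate the model per sample, and return
    the sample mean and standard deviation of the response."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_samples):
        perturbed = {k: v * rng.gauss(1.0, rel_sigma.get(k, 0.0))
                     for k, v in nominal.items()}
        results.append(model(perturbed))
    return statistics.mean(results), statistics.stdev(results)

# Hypothetical linearized "lattice physics" response, standing in for a
# transport calculation; coefficients are made up for illustration.
def toy_keff(inputs):
    return (1.0 + 0.2 * (inputs["enrichment"] - 4.0)
                - 0.05 * (inputs["density"] - 10.4))

mean_k, sigma_k = sample_uq(
    toy_keff,
    nominal={"enrichment": 4.0, "density": 10.4},
    rel_sigma={"enrichment": 0.01, "density": 0.005},
)
```

In Sampler the "model" is a full SCALE sequence run per sample, and the perturbations to nuclear data are drawn consistently with the covariance library rather than independently as here.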
Additional topics for this course include overviews of several modeling scenarios for Polaris (control blade, IFBA, spacer grid, thermal expansion, and detector modeling); how to generate depleted material composition data files (.f71 file) for subsequent use in ORIGEN calculations; and how to utilize Sampler/Polaris outputs for uncertainty analysis for full-core nodal calculations.
No prior knowledge of SCALE is required.
SCALE/TRITON Lattice Physics and Depletion
SCALE supports a wide range of reactor physics analysis capabilities. SCALE reactor physics calculations couple neutron transport calculations with ORIGEN to simulate the time-dependent transmutation of various materials of interest. TRITON is SCALE’s modular reactor physics sequence for a wide variety of system types. Attendees of this course will learn how to use TRITON for depletion analysis. The TRITON training material is centered on using the NEWT 2-D transport module for 2-D depletion analysis and briefly touches on 3-D depletion analysis. The course will instruct users on the use of KENO in place of NEWT for 3-D Monte Carlo-based depletion; however, KENO is not covered in depth. Additional applications of TRITON are incorporated into the training, including the creation of ORIGEN libraries for rapid spent fuel characterization calculations, defining appropriate unit cell calculations of various reactor types for cross section processing, performing restart calculations, and performing uncertainty analysis of reactor physics calculations using Sampler.
SCALE Sensitivity and Uncertainty Analysis for Criticality Safety Assessment and Validation Course
Sensitivity and uncertainty analysis methods provide advanced techniques for code and data validation including the identification of appropriate experiments, detailed quantification of bias and bias uncertainty, identification of gaps in available experiments, and the design of new experiments. The Sampler sequence within SCALE provides a flexible tool for quantifying uncertainties due to manufacturing tolerances as well as composition and dimensional uncertainties in criticality safety assessments. This 5-day training class provides a foundation on sensitivity and uncertainty analysis and applies these methods to criticality safety validation applications, as well as instruction on the use of Sampler for uncertainty quantification. Topics covered include:
- The TSUNAMI sensitivity and uncertainty analysis techniques for determining the sensitivity of the k-eff eigenvalue to cross-section data using both multigroup and continuous-energy physics.
- SCALE's comprehensive cross section covariance data library, which is applied to these sensitivity coefficients to estimate the data-induced uncertainty in k-eff.
- The TSUNAMI-IP code, which determines the correlation between benchmark and application systems in terms of their shared sources of data-induced uncertainty.
- The USLSTATS trending analysis tool, which uses similarity coefficients from TSUNAMI-IP (among other parameters) to estimate the computational bias and bias uncertainty for design and licensing applications.
- The TSURFER data adjustment tool, which uses generalized linear least squares to adjust nuclear data parameters to minimize discrepancies between computed predictions and the results of integral experiments; these adjustments can then be used to estimate bias and bias uncertainty in design and licensing applications.
- The Sampler code for uncertainty assessment, which randomly samples nuclear data and/or system compositions and dimensions to quantify the uncertainty in system k-eff.
This course will cover the theoretical basis for these analysis techniques and will include hands-on exercises for attendees to familiarize themselves with these tools. It is recommended that attendees be familiar with the KENO Monte Carlo code or be experienced SCALE users, although these are not necessary prerequisites.
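The way the covariance library combines with the TSUNAMI sensitivity coefficients is the standard first-order "sandwich" rule, where S is the vector of relative sensitivity coefficients and C_αα is the relative covariance matrix of the nuclear data parameters α:

```latex
% First-order propagation of nuclear-data uncertainty to k-eff:
\sigma_k^2 = \mathbf{S}\,\mathbf{C}_{\alpha\alpha}\,\mathbf{S}^{T},
\qquad S_i = \frac{\alpha_i}{k}\,\frac{\partial k}{\partial \alpha_i}
```

The same sensitivity vectors underpin TSUNAMI-IP's similarity assessment: two systems whose sensitivity profiles overlap in the data that drive their uncertainty share the same potential sources of computational bias.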
Sensitivity/Uncertainty Analysis and Uncertainty Quantification in Reactor Physics Calculations
In this updated class, participants will learn to apply the sensitivity analysis (SA) and uncertainty quantification (UQ) capabilities in SCALE, focusing on two approaches: 1) perturbation theory-based TSUNAMI sequences to perform nuclear data SA and UQ for eigenvalues and reaction rates using 1-D, 2-D, and 3-D tools, including multigroup and new CE Monte Carlo capabilities; and 2) stochastic sampling-based UQ analysis using the new Sampler super-sequence to perform UQ for any computed parameter with respect to uncertainties in many input quantities, including nuclear data, dimensions, densities, and temperatures. Training will include workshop problems analyzing a variety of systems, including LWR (both UO2 and MOX fuel), HTGR, and fast systems.
Source Terms and Radiation Shielding for Spent Fuel Transportation and Storage Applications
One of the unique features of the SCALE code system is the flexibility of assembling different SCALE codes or sequences to solve complex problems. Transportation and storage of spent fuel require a computational tool set to characterize both the spent fuel source terms and the doses for containers used to transport or store the fuel. Spent fuel is a complex neutron and photon source that can be well characterized using the ORIGEN code in SCALE. Additionally, ORIGEN can be used to characterize the radioactive sources resulting from activation of non-fissile materials and components in a nuclear reactor, such as the pressure vessel. The variety of source terms generated with ORIGEN can be used for shielding analyses with the MAVRIC sequence. MAVRIC can estimate particle fluxes and dose rates outside of containers, to ensure that the safety requirements for transportation, storage and ultimate disposal of spent fuel or activated materials are met.
This one-week course will first cover the use of ORIGEN for isotopic depletion, decay, and radiation source term calculations; generation of ORIGEN activation libraries; and the use of the ORIGAMI tool for quick calculation of spent fuel sources. The next part of the course will focus on MAVRIC, including building complex 3D models (materials and compositions); using a connection to the ORIGEN libraries to model simple radioactive sources; importing complex ORIGEN sources; and calculating neutron fluxes to create ORIGEN activation libraries. Additionally, the advanced variance reduction tools for deep-penetration problems, CADIS and FW-CADIS, which are the foundation of MAVRIC, will be covered. This class uses the Fulcrum user interface for interactive model setup, visualization, computation, and output review.
Previous experience with the SCALE/KENO-VI geometry is required.
SCALE Computational Methods for Burnup Credit
This course describes the use of SCALE tools to meet the requirements of NRC Interim Staff Guidance 8 Rev. 3 for the use of actinide and fission product burnup credit. The course reviews the depletion capabilities of TRITON, details basic and advanced burnup credit criticality safety calculations with STARBUCS/KENO, and describes the validation requirements for k-eff and isotopic composition calculations, including uncertainty analysis. The course also introduces burnup credit applications of the ORIGAMI tool for convenient characterization of spent nuclear fuel with radially and axially varying burnup. Previous experience with SCALE is recommended.
Slide Rule for Nuclear Criticality Accident Response
Oak Ridge National Laboratory has developed rapid "in-hand" and electronic methods for estimating pertinent information needed to guide response team actions and help characterize some types of nuclear criticality accidents. The concept uses a series of sliding graphs that function similarly to a slide rule. The tool was developed on the premise that visual demonstration of trends (e.g., dose versus time or distance) is helpful to response personnel. The hand-held version provides rapid assessments of direct radiation approximations. The electronic version is useful for solving for parameters that depend on scenario-specific inputs such as variable shielding, distances, and anticipated radiation doses to personnel over time. The course includes a review of the technical bases for the Slide Rule as well as instruction in the appropriate use of the hand-held and electronic Slide Rules. The Slide Rule is documented in NUREG/CR-6504 (ORNL/TM-13322) and is available from RSICC as CCC-704.
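As a minimal, hypothetical illustration of the kind of dose-versus-distance trend the Slide Rule graphs display (not its actual data, curves, or method), an unshielded point source falls off with the inverse square of distance:

```python
def dose_rate_at(d_ref: float, r_ref: float, r: float) -> float:
    """Scale a reference dose rate d_ref, measured at distance r_ref,
    to distance r via the inverse-square law for an unshielded point
    source. Illustrative only: ignores shielding, buildup, and air
    attenuation, which the Slide Rule's underlying analyses account for."""
    return d_ref * (r_ref / r) ** 2

# Hypothetical example: 100 units/h at 1 m scales to 1 unit/h at 10 m.
far_dose = dose_rate_at(100.0, 1.0, 10.0)
```

Real accident-response estimates fold in fission yield, shielding, and time-dependent decay of the fission product field, which is why the documented Slide Rule bases, rather than a bare 1/r² scaling, should be used in practice.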