Research Highlight

Integrating Machine Learning with Microscope Control using INTERSECT

Achievement: A web-based GUI (Graphical User Interface) for INTERSECT has been created that allows a user to configure an experiment on an electron microscope, setting parameters such as the maximum number of steps for the machine learning algorithm to perform. The experiment is submitted from the GUI to the experiment controller microservice, which sends initial commands to a machine learning microservice and a Nion Swift microservice. The Nion Swift microservice issues commands to a Nion electron microscope or its digital twin. The resulting measurements are stored in a MinIO server and forwarded through the experiment controller to the machine learning microservice; endpoints that do not need to read the large raw data receive only a short identification number referencing it. The machine learning algorithm then selects the next point to measure based on maximal projected information gain and sends this new measurement request to the experiment controller. The cycle continues until the experiment is complete.
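The pass-by-reference pattern described above (large data lands in MinIO, messages carry only a short identifier) can be sketched as follows. This is a minimal illustration, not the INTERSECT SDK's actual API: the in-memory dictionary stands in for a MinIO bucket, and all function names are hypothetical.

```python
import uuid

# Stand-in for a MinIO bucket: maps short IDs to large payloads.
object_store = {}

def store_measurement(data: bytes) -> str:
    """Store a large measurement payload and return a short ID."""
    data_id = uuid.uuid4().hex
    object_store[data_id] = data
    return data_id

def build_message(destination: str, data_id: str) -> dict:
    """Messages carry only the short ID, never the raw data."""
    return {"to": destination, "payload_ref": data_id}

def fetch_measurement(message: dict) -> bytes:
    """Only endpoints that actually need the data dereference the ID."""
    return object_store[message["payload_ref"]]

# Usage: the microscope service stores a frame, the controller forwards
# just the ID, and the ML service fetches the full data when needed.
frame = b"\x00" * 1_000_000          # e.g. a large detector frame
msg = build_message("ml-service", store_measurement(frame))
assert fetch_measurement(msg) == frame
```

The benefit is that intermediaries such as the experiment controller route lightweight messages regardless of how large the underlying measurement is.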

Significance and Impact: The INTERSECT project aims to create software capable of managing scientific workflows end to end, from the specification of experiments to the communication between control machines and the physical devices that perform them. The INTERSECT SDK (software development kit) has now been successfully used to present a GUI over the web that sends an INTERSECT message to INTERSECT microservices, which calculate an action for the microscope to take. This action is sent to Nion Swift, a program for controlling Nion electron microscopes. The results of the microscope's measurements are returned to the machine learning algorithm, and the cycle repeats until the experiment is complete. This demonstrates INTERSECT's capacity to manage experiment control and communication with scientific devices, as well as to enable machine-learning-based control of instruments.
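The measure/decide/repeat cycle above can be sketched as a simple control loop. In the real system these are asynchronous INTERSECT messages between microservices; here, as a hedged illustration, plain function calls stand in for the Nion Swift and machine learning services, and both the measurement model and the selection policy are toy placeholders.

```python
def measure(point):
    """Stand-in for the Nion Swift microservice / digital twin."""
    x, y = point
    return x * x + y * y          # fake measurement value

def next_point(history):
    """Stand-in for the ML microservice's point selection.

    Toy policy (steps along the diagonal); the real service picks the
    point with maximal projected information gain.
    """
    n = len(history)
    return (n, n)

def run_experiment(max_steps: int):
    """Stand-in for the experiment controller's cycle."""
    history = []
    point = (0, 0)                # initial measurement location
    for _ in range(max_steps):
        value = measure(point)
        history.append((point, value))
        point = next_point(history)
    return history

results = run_experiment(max_steps=3)
```

The `max_steps` bound mirrors the maximum-step parameter a user sets in the web GUI before submitting the experiment.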

Research Details

  • A web GUI was created and deployed that allows users to create and configure experiments.
  • A series of INTERSECT services and microservices was created to manage communication for the experimental workflow.
  • A machine learning program was developed to choose the next step in a series of experimental measurements, efficiently characterizing electron microscope samples in a reasonably small number of measurements.
  • An adapter was created that allows INTERSECT messages to perform actions on an electron microscope, or a digital twin of one, through Nion Swift.
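The point-selection step in the details above can be illustrated with a simple acquisition rule. This sketch is not the project's actual information-gain model: as a stand-in for predictive uncertainty, it uses distance to the nearest already-measured point, so the candidate farthest from all previous measurements is treated as the most informative one to measure next.

```python
import math

def uncertainty(candidate, measured):
    """Uncertainty proxy: distance to the nearest measured point."""
    if not measured:
        return math.inf
    return min(math.dist(candidate, p) for p in measured)

def select_next_point(candidates, measured):
    """Pick the candidate with the largest projected gain (proxy)."""
    return max(candidates, key=lambda c: uncertainty(c, measured))

# Usage: on a 4x4 grid with two corners measured, the most uncertain
# points are the opposite corners.
grid = [(x, y) for x in range(4) for y in range(4)]
measured = [(0, 0), (3, 3)]
nxt = select_next_point(grid, measured)
```

A production version would replace the distance proxy with a model-based estimate of expected information gain, but the loop structure (score every candidate, measure the argmax) is the same.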

Sponsor/Funding: LDRD

PI and affiliation: Addi Malviya, Software Engineering Group, Computer Science and Mathematics Division, ORNL 

Team: Addi Malviya, M. Abraham, R. Archibald, B. Chance, L. Drane, J. Hetrick, S. Hitefield, M. McDonnell, J. McGaha, B. Mintz, C. Nguyen, K. Roccapriore, R. Smith, S. Yakubov, G. Watson, M. Wolf, M. Ziatdinov (ORNL)

Figure: A diagram of the components of INTERSECT. The Nion Swift adapter is an example of an Instrument Adapter (upper left). The web GUI (Graphical User Interface) falls under the User category (lower left). The rest of the system (the experiment controller, machine learning computation, and data storage) falls under the heading on the right.

Summary: A standardized system for running scientific workflows on a variety of instruments enables far more capable automated experiments. The INTERSECT project aims to create an SDK (Software Development Kit) that lets users easily build programs to operate a variety of instruments and access computational resources within a single framework, along with digital twins that can be used for testing before time on the actual hardware is required. INTERSECT has been used to create one such test case: a machine learning algorithm controls an electron microscope experiment by selecting the next location on the sample to measure. A web-based GUI allows users to define and start these experiments.