Machine learning for experimental design of ultrafast electron diffraction

Mohammad Shaaban, Sami El-Borgi, Aravind Krishnamoorthy*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Ultrafast electron diffraction (UED) experiments can extract insights into material behavior at ultrafast timescales but are limited by the manual analysis required to process several gigabytes of diffraction pattern data. The lack of real-time analysis prevents in situ tuning of experimental parameters toward desirable material dynamics or away from sample damage. We demonstrate that machine learning methods based on Convolutional Neural Networks (CNNs) trained on synthetic and experimental diffraction patterns can perform real-time analysis of diffraction data to resolve dynamical processes in a representative material and identify signs of material damage. Building on the ability of CNNs to learn compressed representations of diffraction patterns that map to distinct material dynamics, we construct Convolutional Variational Autoencoder models to track structural phase transformation in a model material system through the time trajectory of UED images in the low-dimensional latent space. Such models enable real-time steering of experimental parameters towards conditions that realize phase transformations or other desirable outcomes by mapping experimental conditions to distinct regions of the latent space. These examples show the ability of machine learning to design self-correcting diffraction experiments to optimize the use of large-scale user facilities.
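The latent-space tracking described in the abstract can be illustrated with a minimal sketch. All names, shapes, and the random linear "encoder" below are hypothetical stand-ins for the paper's trained convolutional encoder, which is not specified here; the sketch only shows the reparameterization step of a VAE and how a pump-probe time series of diffraction frames becomes a trajectory in a low-dimensional latent space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a trained convolutional encoder: a fixed
# random linear projection maps a flattened 64x64 diffraction pattern
# to the mean and log-variance of a 2-D latent Gaussian.
IMG, LATENT = 64 * 64, 2
W_mu = rng.normal(scale=0.01, size=(IMG, LATENT))
W_logvar = rng.normal(scale=0.01, size=(IMG, LATENT))

def encode(pattern):
    """Map one diffraction pattern to latent mean and log-variance."""
    x = pattern.ravel()
    return x @ W_mu, x @ W_logvar

def reparameterize(mu, logvar):
    """Standard VAE reparameterization: z = mu + sigma * eps."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

# Synthetic pump-probe time series: overall intensity decays with
# delay, loosely mimicking Debye-Waller-like peak suppression.
delays = np.linspace(0.0, 1.0, 10)
frames = [np.exp(-t) * rng.random((64, 64)) for t in delays]

# One latent point per time delay: the experiment's trajectory
# through latent space, which the paper uses to resolve dynamics.
trajectory = np.array([reparameterize(*encode(f)) for f in frames])
print(trajectory.shape)  # (10, 2)
```

In the actual workflow, distinct material dynamics would occupy distinct regions of this latent space, so the trajectory's drift can be monitored in real time to steer experimental parameters.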

Original language: English
Article number: 23059
Journal: Scientific Reports
Volume: 15
Issue number: 1
Publication status: Published - Dec 2025

Keywords

  • Experimental design
  • Machine learning
  • Self-supervised learning
  • Ultrafast electron diffraction
  • Variational autoencoder
