Electronic Structure Calculations for Energy Materials
Abstract
We use electronic structure calculations to understand, at the atomic level, the processes that determine how materials perform in energy applications. These include surface-catalyzed reactions and photon-to-chemical energy conversion. With density functional theory (DFT), as implemented in established codes such as VASP and GPAW, we can predict how different materials, surfaces, and molecules behave under relevant conditions. The simulations give access to key quantities such as reaction energetics, catalytic activity, and material stability, which are essential for discovering and improving new functional materials. Most of these calculations will run on large-scale CPU resources.
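A central quantity behind "reaction energetics" is the adsorption (or reaction) energy, obtained by combining total energies from separate DFT calculations. The sketch below illustrates only the bookkeeping; the function name and the numerical energies are placeholders, not real DFT output from VASP or GPAW.

```python
def adsorption_energy(e_slab_ads: float, e_slab: float, e_molecule: float) -> float:
    """E_ads = E(slab + adsorbate) - E(slab) - E(molecule).

    Negative values indicate exothermic (favourable) adsorption.
    All inputs are DFT total energies in the same units (e.g. eV).
    """
    return e_slab_ads - e_slab - e_molecule

# Placeholder total energies in eV (illustrative only, not computed values)
e_ads = adsorption_energy(e_slab_ads=-215.4, e_slab=-200.1, e_molecule=-14.2)
print(f"{e_ads:.2f} eV")  # -1.10 eV
```

In practice each of the three total energies comes from its own converged DFT calculation on the same footing (same functional, cutoff, and k-point sampling), which is what makes the difference physically meaningful.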
We will also use machine-learning methods to accelerate the search for new materials. Recent progress in generative artificial intelligence, in particular Wasserstein Autoencoders (WAEs) and Transformer models, makes it possible to generate new candidate materials and predict their properties from large datasets. WAEs can learn the complex links between composition, structure, and properties, which enables inverse materials design. Transformers excel at extracting structure–property relationships from large materials databases. Training these models is most efficient on Graphics Processing Units (GPUs), which are built for the large-scale parallel computations of modern machine learning.
By combining electronic structure calculations with machine-learning models, we obtain an integrated approach to materials discovery: AI models suggest new candidates, and first-principles calculations provide atomic-level understanding of them. To carry out this research efficiently, we plan to use both CPU and GPU resources on the Alvis system.
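The combined workflow can be sketched as a screening loop: a cheap ML surrogate ranks many candidates, and only the most promising ones go on to expensive first-principles validation. Everything below is a hypothetical stand-in; the scoring functions and candidate names are placeholders, not our actual models or materials.

```python
def surrogate_score(candidate: str) -> float:
    """Stand-in for a GPU-trained ML property predictor (dummy heuristic)."""
    return float(len(candidate))

def dft_validate(candidate: str) -> float:
    """Stand-in for a CPU-based first-principles (DFT) calculation."""
    return surrogate_score(candidate) * 0.9  # dummy "refined" value

def screen(candidates: list[str], top_k: int = 2) -> dict[str, float]:
    # Rank all candidates cheaply with the ML surrogate...
    ranked = sorted(candidates, key=surrogate_score, reverse=True)
    # ...then validate only the top_k with the expensive method.
    return {c: dft_validate(c) for c in ranked[:top_k]}

print(screen(["LiFePO4", "NaCl", "Cu2ZnSnS4"]))
```

The design point this illustrates is the division of labour in the proposal: GPU resources serve the fast surrogate that filters a large search space, while CPU resources are reserved for the few candidates that merit full DFT treatment.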