Summary: Researchers have developed a brain-inspired AI technique that uses neural networks to model the challenging quantum states of molecules that are crucial to technologies such as solar panels and photocatalysis.
This new approach provides a significant improvement in accuracy, allowing for better prediction of molecular behavior during energy transitions. By improving our understanding of molecular excited states, this research has the potential to revolutionize materials prototyping and chemical synthesis.
Key Facts:
The neural network modeled molecular excited states with unprecedented accuracy — achieving five times greater accuracy for complex molecules than traditional methods — which could enable new materials and chemical syntheses to be prototyped in computer simulation.
Source: Imperial College London
New research using neural networks, a type of brain-inspired AI, proposes a solution to the difficult challenge of modeling molecular states.
The study shows how the technique can help solve fundamental equations for complex molecular systems.
This could have practical applications in the future, allowing researchers to use computer simulations to prototype new materials or chemical syntheses before building them in the lab.
The researchers developed a new mathematical approach and used it in a neural network they called FermiNet (Fermionic Neural Network), the first example of deep learning being used to calculate the energies of atoms and molecules from basic principles accurately enough to be practical. Credit: Neuroscience News
The research, led by scientists from Imperial College London and Google DeepMind, was published today in the journal Science.
Excited molecules
The team investigated the problem of understanding how molecules transition into and out of “excited states”. When a molecule or material is stimulated by a large amount of energy, such as exposure to light or high temperatures, its electrons can jump into a temporary new configuration called an excited state.
The precise amount of energy absorbed and released during molecular state transitions creates unique fingerprints for different molecules and materials, which influence the performance of technologies ranging from solar panels and LEDs to semiconductors and photocatalysts, and also play key roles in biological processes that involve light, such as photosynthesis and vision.
However, this fingerprint is extremely difficult to model because excited electrons are quantum in nature, and their location within a molecule can never be certain but can only be expressed as a probability.
Dr David Pfau, lead researcher at Google DeepMind and Imperial College's Department of Physics, said: “Representing the state of a quantum system is extremely difficult – you need to assign probabilities to every possible configuration of electron positions.”
“The space of all possible configurations is huge. If you try to represent it as a grid with 100 points along each dimension, the number of possible electronic configurations of a silicon atom is greater than the number of atoms in the universe. This is exactly where we thought deep neural networks would be useful.”
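The scale Dr Pfau describes can be checked with back-of-the-envelope arithmetic. A minimal sketch in Python, assuming silicon's 14 electrons each need three spatial coordinates (an illustrative assumption, not a detail from the article):

```python
# Rough size of the configuration grid described above.
# Assumption: a silicon atom has 14 electrons, each located by 3 coordinates.
n_electrons = 14
n_dims = 3 * n_electrons           # 42 coordinates in total
grid_points = 100                  # 100 grid points along each dimension

n_configs = grid_points ** n_dims  # 100**42 = 10**84 grid cells

# Commonly cited order-of-magnitude estimate for atoms in the observable universe.
atoms_in_universe = 10 ** 80

print(n_configs > atoms_in_universe)  # prints: True
```

Even this modest 100-point grid already dwarfs the atom count of the observable universe, which is why a compact learned representation is attractive.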
Neural Networks
The researchers developed a new mathematical approach and used it in a neural network they called FermiNet (Fermionic Neural Network), which is the first time deep learning has been used to calculate the energy of atoms and molecules from fundamental principles and with enough accuracy to be practical.
The team tested the technique on a range of examples with promising results: for a small but complex molecule called the carbon dimer, they achieved a mean absolute error (MAE) of 4 meV (millielectronvolts, a tiny unit of energy) relative to experiment, close to five times more accurate than the traditional gold-standard method, which came within 20 meV.
Dr Pfau said: “We tested our method on some of the most challenging systems in computational chemistry, where two electrons are excited simultaneously, and found that it is within about 0.1 eV of the most demanding and complex calculations ever done.”
“Today, we are open-sourcing our latest research findings and hope that the research community will build on our methods to explore unexpected ways in which matter and light interact.”
About this Artificial Intelligence (AI) Research News
Author: Hayley Dunning
Source: Imperial College London
Contact: Hayley Dunning – Imperial College London
Image: The image is credited to Neuroscience News
Original research: Closed access.
“Accurate computation of quantum excited states using neural networks,” David Pfau et al., Science
Abstract
Accurate computation of quantum excited states using neural networks
Introduction
Understanding the physics of how matter interacts with light requires accurately modeling the electronic excited states of quantum systems, which underpin the operation of photocatalysts, fluorescent dyes, quantum dots, light-emitting diodes (LEDs), lasers, solar cells, and more.
Existing quantum chemical methods for excited states are much less accurate than methods for ground states, and in some cases may be qualitatively inaccurate or require prior knowledge targeted to the specific state. Neural networks combined with variational Monte Carlo (VMC) have achieved excellent accuracy for ground-state wave functions of a variety of systems, including spin models, molecules, and condensed matter systems.
VMC has been used to study excited states, but traditional approaches have limitations that make them difficult or impossible to use with neural networks, and they often have a large number of free parameters that need to be tuned to get good results.
Rationale
We combine the flexibility of neural networks with mathematical insight to transform the problem of finding the excited states of a system into one of finding the ground state of an extended system, which can be addressed with standard VMC. We call this approach natural excited states VMC (NES-VMC).
Linear independence of the excited states is automatically imposed through the functional form of the ansatz: the energies and other observables of each excited state are obtained by diagonalizing a matrix of Hamiltonian expectation values taken over the single-state ansätze, which can be accumulated at no extra cost.
Importantly, this approach has no free parameters to tune, and no penalty terms are required to enforce orthogonality. We assess the accuracy of our approach using two different neural network architectures: FermiNet and Psiformer.
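The diagonalization step described above can be sketched with a toy example. This is not the paper's code: the 2×2 matrices below are invented stand-ins for the Monte Carlo estimates of the Hamiltonian expectations and overlaps between two single-state ansätze.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical 2-state example. In NES-VMC these entries would be Monte Carlo
# estimates of <psi_i|H|psi_j> and <psi_i|psi_j>; here they are made up.
H = np.array([[-1.0,  0.1],
              [ 0.1, -0.3]])   # matrix of Hamiltonian expectation values
S = np.array([[1.0, 0.05],
              [0.05, 1.0]])    # overlap matrix: the states need not be orthogonal

# Generalized eigenvalue problem H c = E S c. The eigenvalues approximate the
# ground-state and first-excited-state energies; no penalty term is needed to
# keep the states apart.
energies, coeffs = eigh(H, S)

print(energies)  # sorted ascending: energies[0] is the ground-state estimate
```

Because `eigh` solves the generalized symmetric eigenproblem directly, near-degenerate or non-orthogonal states are handled by the same linear-algebra step rather than by extra tuning.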
Results
We demonstrate this approach on benchmark systems ranging from individual atoms to benzene-sized molecules: the accuracy of NES-VMC closely matches experimental results for first-row atoms, and for a range of small molecules, it produces highly accurate energies and oscillator strengths that rival the best existing theoretical estimates.
We calculated potential energy curves for the lowest excited states of the carbon dimer and identified the states across the full range of bond lengths by analyzing their symmetry and spin. The NES-VMC vertical excitation energies agreed with values obtained using the semistochastic heat-bath configuration interaction (SHCI) method to within chemical accuracy at all bond lengths, while the adiabatic excitation energies were on average within 4 meV of experimental values, a fourfold improvement over SHCI.
In the case of ethylene, NES-VMC accurately described the conical intersection of the twisted molecule, in excellent agreement with high-precision multireference configuration interaction (MR-CI) results. We also considered five challenging systems with low-lying double excitations, including several benzene-scale molecules.
On all systems where the best theoretical methods agree on the vertical excitation energies, Psiformer was within chemical accuracy for every state, including butadiene, where even the ordering of certain states had been debated for decades. For tetrazine and cyclopentadienone, where state-of-the-art calculations from a few years ago were known to be inaccurate, NES-VMC results closely matched recent sophisticated diffusion Monte Carlo (DMC) and complete-active-space third-order perturbation theory (CASPT3) calculations.
Finally, on the benzene molecule, NES-VMC combined with the Psiformer ansatz agreed significantly better with the theoretical best estimates than other methods, including a neural network ansatz trained with a penalty method, validating the mathematical correctness of our approach and demonstrating that neural networks can accurately represent molecular excited states at the current limits of computational methods.
Conclusion
NES-VMC is a parameter-free and mathematically sound variational principle for excited states, which, when combined with neural network ansätze, yields remarkable accuracy on a wide range of benchmark problems. The development of an accurate VMC approach for excited states of quantum systems opens up many possibilities and significantly expands the scope of applicability of neural network wave functions.
Although we considered only electronic excitations of molecular systems and neural network ansätze, NES-VMC is applicable to any quantum Hamiltonian and any ansatz, enabling accurate computational studies that can advance our understanding of vibronic coupling, optical band gaps, atomic physics, and other challenging problems.