TY - JOUR
T1 - Tensor-Reduced Atomic Density Representations
AU - Darby, James P.
AU - Kovács, Dávid P.
AU - Batatia, Ilyes
AU - Caro, Miguel A.
AU - Hart, Gus L.W.
AU - Ortner, Christoph
AU - Csányi, Gábor
N1 - Funding Information:
D. P. K. acknowledges support from AstraZeneca and the EPSRC. G. C. acknowledges discussion with Boris Kozinsky. J. P. D. and G. C. acknowledge support from the NOMAD Centre of Excellence, funded by the European Commission under Grant Agreement No. 951786. We used computational resources of the UK HPC service ARCHER2 via the UKCP consortium, funded by EPSRC Grant No. EP/P022596/1. C. O. acknowledges support of the Natural Sciences and Engineering Research Council [Discovery Grant No. IDGR019381] and the New Frontiers in Research Fund [Exploration Grant No. GR022937].
Publisher Copyright:
© 2023 authors. Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0/). Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
PY - 2023/7/13
Y1 - 2023/7/13
N2 - Density-based representations of atomic environments that are invariant under Euclidean symmetries have become a widely used tool in the machine learning of interatomic potentials, broader data-driven atomistic modeling, and the visualization and analysis of material datasets. The standard mechanism used to incorporate chemical element information is to create separate densities for each element and form tensor products between them. This leads to a steep scaling in the size of the representation as the number of elements increases. Graph neural networks, which do not explicitly use density representations, escape this scaling by mapping the chemical element information into a fixed dimensional space in a learnable way. By exploiting symmetry, we recast this approach as tensor factorization of the standard neighbor-density-based descriptors and, using a new notation, identify connections to existing compression algorithms. In doing so, we form a compact tensor-reduced representation of the local atomic environment whose size does not depend on the number of chemical elements, is systematically convergent, and therefore remains applicable to a wide range of data analysis and regression tasks.
AB - Density-based representations of atomic environments that are invariant under Euclidean symmetries have become a widely used tool in the machine learning of interatomic potentials, broader data-driven atomistic modeling, and the visualization and analysis of material datasets. The standard mechanism used to incorporate chemical element information is to create separate densities for each element and form tensor products between them. This leads to a steep scaling in the size of the representation as the number of elements increases. Graph neural networks, which do not explicitly use density representations, escape this scaling by mapping the chemical element information into a fixed dimensional space in a learnable way. By exploiting symmetry, we recast this approach as tensor factorization of the standard neighbor-density-based descriptors and, using a new notation, identify connections to existing compression algorithms. In doing so, we form a compact tensor-reduced representation of the local atomic environment whose size does not depend on the number of chemical elements, is systematically convergent, and therefore remains applicable to a wide range of data analysis and regression tasks.
UR - http://www.scopus.com/inward/record.url?scp=85164957493&partnerID=8YFLogxK
U2 - 10.1103/PhysRevLett.131.028001
DO - 10.1103/PhysRevLett.131.028001
M3 - Article
C2 - 37505943
AN - SCOPUS:85164957493
SN - 0031-9007
VL - 131
JO - Physical Review Letters
JF - Physical Review Letters
IS - 2
M1 - 028001
ER -