Abstract
We provide a definition and explicit expressions for n-body Gaussian process (GP) kernels that can learn any interatomic interaction occurring in a physical system, up to n-body contributions, for any value of n. The series is complete, as it can be shown that the "universal approximator" squared exponential kernel can be written as a sum of n-body kernels. These recipes enable the choice of the optimally efficient force model for each target system, as confirmed by extensive testing on various materials. We furthermore describe how the n-body kernels can be "mapped" onto equivalent representations that provide database-size-independent predictions and are thus crucially more efficient. We explicitly carry out this mapping procedure for the first nontrivial (3-body) kernel of the series, and show that this reproduces the GP-predicted forces with meV/Å accuracy while being orders of magnitude faster. These results open the way to using novel force models (here named "M-FFs") that are computationally as fast as their corresponding standard parametrised n-body force fields, while retaining the nonparametric character, the ease of training and validation, and the accuracy of the best recently proposed machine-learning potentials.
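To make the kernel notions in the abstract concrete, the following is a minimal illustrative sketch in NumPy, not the paper's actual kernel expressions: a squared exponential ("universal approximator") kernel on descriptor vectors, and a toy 2-body kernel that compares two local atomic environments via Gaussians on all pairs of neighbour distances. The function names, the descriptor choice, and the length-scale parameter `ell` are all hypothetical.

```python
import numpy as np

def squared_exponential(d1, d2, ell=1.0):
    """Squared exponential kernel between two descriptor vectors.

    Illustrative form k(d, d') = exp(-|d - d'|^2 / (2 ell^2));
    not the paper's exact n-body kernel expressions.
    """
    diff = np.asarray(d1, dtype=float) - np.asarray(d2, dtype=float)
    return float(np.exp(-np.dot(diff, diff) / (2.0 * ell**2)))

def two_body_kernel(env1, env2, ell=1.0):
    """Toy 2-body kernel between two local environments.

    Each environment is an (n_neighbours, 3) array of relative positions;
    the kernel sums Gaussians over all pairs of central-atom--neighbour
    distances, so it only 'sees' pairwise (2-body) information.
    """
    r1 = np.linalg.norm(np.asarray(env1, dtype=float), axis=1)
    r2 = np.linalg.norm(np.asarray(env2, dtype=float), axis=1)
    return float(sum(np.exp(-(a - b) ** 2 / (2.0 * ell**2))
                     for a in r1 for b in r2))
```

A kernel of this pairwise form is invariant under permutations and rotations of the neighbours, which is why it can only capture 2-body contributions; the paper's point is that the full squared exponential kernel decomposes into a complete series of such n-body terms.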
| Original language | English |
| --- | --- |
| Pages (from-to) | 184307-1–184307-12 |
| Journal | Physical Review B |
| Volume | 97 |
| Issue number | 184307 |
| Early online date | 24 May 2018 |
| DOIs | |
| Publication status | Published - May 2018 |