Boosting quantum machine learning models with multi-level combination technique: Pople diagrams revisited (1808.02799v2)

Published 8 Aug 2018 in physics.chem-ph

Abstract: Inspired by Pople diagrams popular in quantum chemistry, we introduce a hierarchical scheme, based on the multi-level combination (C) technique, to combine various levels of approximations made when calculating molecular energies within quantum chemistry. When combined with quantum machine learning (QML) models, the resulting CQML model is a generalized unified recursive kernel ridge regression which exploits correlations implicitly encoded in training data comprised of multiple levels in multiple dimensions. Here, we have investigated up to three dimensions: Chemical space, basis set, and electron correlation treatment. Numerical results have been obtained for atomization energies of a set of $\sim$7'000 organic molecules with up to 7 atoms (not counting hydrogens) containing CHONFClS, as well as for $\sim$6'000 constitutional isomers of C$7$H${10}$O$_2$. CQML learning curves for atomization energies suggest a dramatic reduction in necessary training samples calculated with the most accurate and costly method. In order to generate milli-second estimates of CCSD(T)/cc-pvdz atomization energies with prediction errors reaching chemical accuracy ($\sim$1 kcal/mol), the CQML model requires only $\sim$100 training instances at CCSD(T)/cc-pvdz level, rather than thousands within conventional QML, while more training molecules are required at lower levels. Our results suggest a possibly favourable trade-off between various hierarchical approximations whose computational cost scales differently with electron number.
