
Deep-learning density functionals for gradient descent optimization

Pilati, S
2022-01-01

Abstract

Machine-learned regression models represent a promising tool to implement accurate and computationally affordable energy-density functionals to solve quantum many-body problems via density functional theory. However, while they can easily be trained to accurately map ground-state density profiles to the corresponding energies, their functional derivatives often turn out to be too noisy, leading to instabilities in self-consistent iterations and in gradient-based searches of the ground-state density profile. We investigate how these instabilities occur when standard deep neural networks are adopted as regression models, and we show how to avoid them by using an ad hoc convolutional architecture featuring an interchannel averaging layer. The main testbed we consider is a realistic model for noninteracting atoms in optical speckle disorder. With the interchannel average, accurate and systematically improvable ground-state energies and density profiles are obtained via gradient-descent optimization, without instabilities or violations of the variational principle.
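To illustrate the approach summarized in the abstract, the following is a minimal sketch (Python/PyTorch, not the authors' code) of a one-dimensional convolutional energy functional with an interchannel averaging layer, together with a gradient-descent search for the ground-state density driven by the functional derivative obtained via backpropagation. All layer sizes, the Softplus activations, the clip-and-renormalize constraint handling, and the names EnergyFunctionalCNN and gradient_descent_density are illustrative assumptions; the architecture and constrained optimization used in the paper may differ in detail.

import torch
import torch.nn as nn

class EnergyFunctionalCNN(nn.Module):
    """Maps a discretized density profile n(x) to a scalar energy E[n].

    The feature maps are averaged over the channel dimension before the final
    readout; this is one plausible reading of the "interchannel averaging
    layer" mentioned in the abstract, with all hyperparameters chosen here
    only for illustration.
    """
    def __init__(self, channels=16, kernel_size=9, grid_points=256):
        super().__init__()
        pad = kernel_size // 2
        self.convs = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size, padding=pad), nn.Softplus(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad), nn.Softplus(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
        )
        self.readout = nn.Linear(grid_points, 1)

    def forward(self, density):               # density: (batch, grid_points)
        h = self.convs(density.unsqueeze(1))  # (batch, channels, grid_points)
        h = h.mean(dim=1)                     # interchannel average -> (batch, grid_points)
        return self.readout(h).squeeze(-1)    # energy per sample


def gradient_descent_density(model, n_init, n_particles, dx, steps=500, lr=1e-2):
    """Searches for the ground-state density by descending the learned functional.

    The discretized functional derivative dE/dn(x) is computed by backpropagation;
    after each step the density is clipped to be non-negative and rescaled so that
    sum(n) * dx equals the particle number (a simple constraint handling assumed
    here for illustration).
    """
    n = n_init.detach().clone()
    for _ in range(steps):
        n.requires_grad_(True)
        energy = model(n.unsqueeze(0)).sum()
        grad, = torch.autograd.grad(energy, n)      # functional derivative on the grid
        with torch.no_grad():
            n = n - lr * grad
            n = n.clamp(min=0.0)
            n = n * (n_particles / (n.sum() * dx))  # restore normalization
    return n.detach()

A typical usage would start from a uniform profile, e.g. n0 = torch.full((256,), n_particles / (256 * dx)), and monitor the predicted energy along the descent; the channel average is the ingredient the abstract credits with suppressing the noise in the learned functional derivative that otherwise destabilizes such gradient-based searches.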
Files in this record:
  • PhysRevE.106.045309.pdf
    Type: Published version (Versione Editoriale)
    License: NOT PUBLIC - private/restricted access (archive administrators only)
    Format: Adobe PDF, 1.06 MB
  • Deep-learning density functionals for gradient descent optimization _ Pre print.pdf
    Type: Pre-print
    License: DRM not defined (open access)
    Format: Adobe PDF, 723.9 kB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11581/468954
Citations
  • PMC: 0
  • Scopus: 2
  • Web of Science (ISI): 2