Stein variational gradient descent with learned direction

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

Recently, as a particle-based Bayesian inference method, Stein variational gradient descent (SVGD) has attracted much attention due to its strong approximation power and efficiency compared with traditional variational inference and Markov chain Monte Carlo sampling methods. However, the original SVGD method restricts the descent direction to a reproducing kernel Hilbert space (RKHS), which both limits its expressive power and makes it suffer from the curse of dimensionality. To address this issue, we propose to parameterize the descent direction with a flexible neural network. With this parameterization strategy, the descent direction can be optimized at each particle-update step, so that the target distribution can be better approximated. Experiments on both synthetic and real datasets demonstrate the effectiveness and efficiency of the proposed method, especially in high-dimensional settings, compared with the original SVGD.
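
The abstract describes replacing the RKHS-constrained SVGD update direction with a neural network that is optimized before each particle move. Below is a minimal, illustrative PyTorch sketch of that general idea; the network architecture, the Stein-type training objective with an L2 penalty, and all hyperparameters are assumptions made for illustration and do not reproduce the paper's exact method.

```python
# Sketch: SVGD-style sampling where the descent direction is a small neural
# network f_theta rather than the closed-form RKHS direction.
# At each outer iteration, f_theta is (re)fitted on the current particles to
# maximize a regularized Stein-type objective
#     E_x[ grad_x log p(x) . f(x) + div f(x) ] - lam * E_x ||f(x)||^2,
# and the particles are then moved along f_theta. Target, architecture, and
# hyperparameters are placeholders.
import torch
import torch.nn as nn

def log_prob(x):
    # Placeholder target: standard 2-D Gaussian (any differentiable log-density works).
    return -0.5 * (x ** 2).sum(dim=1)

dim = 2
net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))
particles = torch.randn(100, dim)

opt_net = torch.optim.Adam(net.parameters(), lr=1e-3)
step_size, lam = 0.1, 1.0

for it in range(200):
    # Inner loop: fit the direction network on the current particle set.
    for _ in range(5):
        x = particles.clone().requires_grad_(True)
        score = torch.autograd.grad(log_prob(x).sum(), x)[0]   # grad_x log p(x)
        f = net(x)
        # Divergence of f w.r.t. x via autograd, one coordinate at a time.
        div = sum(torch.autograd.grad(f[:, i].sum(), x, create_graph=True)[0][:, i]
                  for i in range(dim))
        stein = (score.detach() * f).sum(dim=1) + div
        loss = -(stein - lam * (f ** 2).sum(dim=1)).mean()
        opt_net.zero_grad()
        loss.backward()
        opt_net.step()

    # Particle update along the learned direction.
    with torch.no_grad():
        particles = particles + step_size * net(particles)
```

In the standard SVGD update the direction is fixed by the kernel; here the inner loop re-optimizes it at every step, which is what allows the direction to adapt to the particles, at the cost of the extra network training.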

Original language: English
Article number: 118975
Journal: Information Sciences
Volume: 637
DOIs
State: Published - Aug 2023

Keywords

  • Bayesian modeling
  • Neural network
  • Posterior
  • Stein discrepancy

