Amortized backward variational inference in nonlinear state-space models - LMBA-UBS
Preprint, Working Paper. Year: 2022

Amortized backward variational inference in nonlinear state-space models

Abstract

We consider the problem of state estimation in general state-space models using variational inference. For a generic variational family defined using the same backward decomposition as the actual joint smoothing distribution, we establish for the first time that, under mixing assumptions, the variational approximation of expectations of additive state functionals induces an error which grows at most linearly in the number of observations. This guarantee is consistent with the known upper bounds for the approximation of smoothing distributions using standard Monte Carlo methods. Moreover, we propose an amortized inference framework in which a neural network shared over all time steps outputs the parameters of the variational kernels. We also empirically study parametrizations which allow analytical marginalization of the variational distributions, and therefore lead to efficient smoothing algorithms. Significant improvements are achieved over state-of-the-art variational solutions, especially when the generative model depends on a strongly nonlinear and noninjective mixing function.
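The amortized scheme described in the abstract can be sketched as a single network, reused at every time step, that maps each observation (together with the next state, for the backward kernel) to the parameters of a Gaussian variational kernel. The following is an illustrative toy sketch in NumPy; the network architecture, the Gaussian kernel form, and all names (`SharedKernelNet`, `backward_sample`) are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

class SharedKernelNet:
    """Hypothetical shared network: maps (y_t, x_{t+1}) to the mean and
    log-std of a Gaussian backward variational kernel q(x_t | x_{t+1}, y_t).
    One weight set is reused at every time step -- this sharing is the
    amortization."""

    def __init__(self, dim_in, dim_hidden, dim_state):
        self.W1 = rng.normal(scale=0.1, size=(dim_hidden, dim_in))
        self.b1 = np.zeros(dim_hidden)
        self.W2 = rng.normal(scale=0.1, size=(2 * dim_state, dim_hidden))
        self.b2 = np.zeros(2 * dim_state)

    def __call__(self, inp):
        h = np.tanh(self.W1 @ inp + self.b1)
        out = self.W2 @ h + self.b2
        mean, log_std = np.split(out, 2)
        return mean, log_std

def backward_sample(net, ys, x_last):
    """Draw one trajectory by sampling the backward kernels from
    t = T-1 down to t = 0, starting from the final state x_last."""
    xs = [x_last]
    for y in reversed(ys[:-1]):
        mean, log_std = net(np.concatenate([y, xs[0]]))
        xs.insert(0, mean + np.exp(log_std) * rng.normal(size=mean.shape))
    return np.stack(xs)

dim_state, dim_obs, T = 2, 3, 10
net = SharedKernelNet(dim_obs + dim_state, 16, dim_state)
ys = rng.normal(size=(T, dim_obs))
traj = backward_sample(net, ys, x_last=np.zeros(dim_state))
print(traj.shape)  # prints (10, 2)
```

In a full implementation the kernel parameters would be trained by maximizing an ELBO, and, for suitable parametrizations (as studied in the paper), expectations of additive functionals can be marginalized analytically instead of sampled.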
Main file: backward_variational.pdf (609.07 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03683622 , version 1 (31-05-2022)
hal-03683622 , version 2 (24-01-2024)

Identifiers

Cite

Mathis Chagneux, Élisabeth Gassiat, Pierre Gloaguen, Sylvain Le Corff. Amortized backward variational inference in nonlinear state-space models. 2022. ⟨hal-03683622v1⟩