notes (index)

pyro

variational inference, probabilistic inference, computation

lectures

In 2020, Du Phan implemented Richard McElreath's *Statistical Rethinking* in Pyro

talk on Pyro by Chi Nhan Nguyen at an ML conference

technical talk with implementation details: "Pyro: Deep Probabilistic Programming", Uber Open Summit 2018

documentation

introduction

  • SVI Part I: An Introduction to Stochastic Variational Inference in Pyro

    • Setup

      • The different pieces of a `model()` are encoded via the mapping:

          1. observations ⟺ `pyro.sample` with the `obs` argument
          2. latent random variables ⟺ `pyro.sample`
          3. parameters ⟺ `pyro.param`
      • When random variables are specified in Pyro with the primitive statement `pyro.sample()`, the first argument denotes the name of the random variable. These names are used to align the random variables in the model and the guide.
    • Guide

      • The guide is a variational distribution
      • Just like the model, the guide is encoded as a stochastic function `guide()` that contains `pyro.sample` and `pyro.param` statements.
      • It does not contain observed data, since the guide needs to be a properly normalized distribution.
      • Pyro enforces that `model()` and `guide()` have the same call signature, i.e. both callables should take the same arguments.
    • ELBO

      • The gap between the ELBO and the log evidence is exactly the KL divergence between the guide and the posterior: log p(x) = ELBO + KL(q(z) ‖ p(z|x))
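This identity can be checked numerically on a tiny discrete example (stdlib only; the prior, likelihood, and guide values are made up for illustration):

```python
import math

# Verify log p(x) - ELBO = KL(q || p(z|x)) for a two-state latent z
# and a fixed observation x.
p_z = [0.3, 0.7]          # prior p(z)
p_x_given_z = [0.9, 0.2]  # likelihood p(x | z) at the observed x
q = [0.5, 0.5]            # an arbitrary guide q(z)

# evidence p(x) and posterior p(z | x)
p_x = sum(pz * lik for pz, lik in zip(p_z, p_x_given_z))
posterior = [pz * lik / p_x for pz, lik in zip(p_z, p_x_given_z)]

# ELBO = E_q[log p(x, z) - log q(z)]
elbo = sum(qz * (math.log(pz * lik) - math.log(qz))
           for qz, pz, lik in zip(q, p_z, p_x_given_z))

# KL(q || posterior)
kl = sum(qz * math.log(qz / post) for qz, post in zip(q, posterior))

gap = math.log(p_x) - elbo
assert abs(gap - kl) < 1e-12  # the gap IS the KL divergence
```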
    • Optimizers

      • Parameters may be created dynamically during the course of inference. In other words, the space we are optimizing over, parameterized by θ and ϕ, can grow and change dynamically.

advanced

examples

contributed

notes

stochastic variational inference in variational bayesian methods