A little interactive demo of a 2-layer generative neural net trained with SGVB

Instructions: Generate digits by wiggling the sliders. (Tip: if you don't notice a change in the image, try a different slider).


z:
x:

This is a little demo of a small generative model of MNIST digits, learned using the Stochastic Gradient Variational Bayes (SGVB) / Auto-Encoding Variational Bayes (AEVB) algorithm [1]. The sliders set the values of the latent variables z. The image represents the conditional distribution p(x|z), and is computed by a 2-layer neural network whose parameters are learned by maximizing a lower bound on the marginal likelihood p(x) using the SGVB algorithm.
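The generative half of such a model can be sketched as a small decoder network: the latent vector z (the slider values) is pushed through one hidden layer, and a sigmoid output layer produces per-pixel Bernoulli means that form the displayed image. The layer sizes, random parameters, and function names below are illustrative assumptions, not the demo's actual weights.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def decode(z, params):
    """Map a latent vector z to per-pixel Bernoulli means of p(x|z)
    using a 2-layer (one hidden layer) neural network."""
    W1, b1, W2, b2 = params
    h = np.tanh(z @ W1 + b1)       # hidden layer activations
    return sigmoid(h @ W2 + b2)    # pixel means in (0, 1)

# Hypothetical sizes: 2 latent dimensions (one per slider), 28x28 output.
rng = np.random.default_rng(0)
n_latent, n_hidden, n_pixels = 2, 64, 28 * 28
params = (rng.normal(0, 0.1, (n_latent, n_hidden)), np.zeros(n_hidden),
          rng.normal(0, 0.1, (n_hidden, n_pixels)), np.zeros(n_pixels))

z = np.array([0.5, -1.0])          # example slider settings
image = decode(z, params).reshape(28, 28)
```

In the trained model these weights would be the result of maximizing the SGVB lower bound; here they are random, so the output is noise rather than a digit, but moving z changes the image in the same way the sliders do.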

[1] D.P. Kingma and M. Welling (2013). Auto-Encoding Variational Bayes. arXiv:1312.6114.