Reparametrization Trick
Introduction

In generative models such as VAEs and diffusion models, we would like to learn the parameters of the distribution of some data in order to generate novel examples of that data. As such, our architecture needs to be stochastic (random) in nature, so that we may sample from the learned distribution to generate new data. This means that in between our layers there would be layer(s) that produce random values, and backpropagating through those random values doesn't make sense, nor is it feasible.
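The idea above can be sketched numerically. The snippet below is a minimal illustration (not a full VAE): instead of drawing z directly from N(mu, sigma^2), which blocks backpropagation, we draw fixed noise eps from N(0, 1) and compute z as a deterministic function of (mu, sigma), so its pathwise derivatives exist. The function name `reparam_sample` and the example loss E[z^2] are illustrative choices, not from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparam_sample(mu, sigma, eps):
    # Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, 1).
    # The randomness lives entirely in eps, so z is an ordinary
    # differentiable function of the parameters (mu, sigma).
    return mu + sigma * eps

mu, sigma = 2.0, 0.5
eps = rng.standard_normal(100_000)
z = reparam_sample(mu, sigma, eps)

# z still has the intended distribution N(mu, sigma^2)
print(z.mean(), z.std())

# Because z is a plain function of (mu, sigma), pathwise derivatives
# exist: dz/dmu = 1 and dz/dsigma = eps. As an example, the Monte Carlo
# gradient of the loss E[z^2] with respect to mu is mean(2 * z):
loss_grad_mu = np.mean(2 * z)
print(loss_grad_mu)
```

Had we sampled z with `rng.normal(mu, sigma)` instead, the draw itself would not be differentiable in mu and sigma, and no such gradient estimator would be available.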