Reparametrization Trick

Introduction In generative models, such as VAEs and diffusion models, we would like to learn the parameters of the distribution of some data so that we can generate novel examples of the data itself. Our architecture therefore needs to be stochastic (random) in nature so that we may sample from the learned distribution to generate new data. This means that in between our layers there would be layer(s) that produce random values, and backpropagating through those random values doesn't make sense, nor is it feasible....
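The core idea the post describes can be sketched in a few lines: instead of sampling $z \sim \mathcal{N}(\mu, \sigma^2)$ directly (which blocks gradients), draw noise from a fixed $\mathcal{N}(0, 1)$ and shift/scale it deterministically. A minimal NumPy sketch, with illustrative names (not code from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

def reparametrize(mu, log_var):
    # z ~ N(mu, sigma^2), but the randomness lives entirely in eps,
    # which is sampled from a fixed N(0, 1); mu and log_var enter only
    # through deterministic (hence differentiable) operations.
    eps = rng.standard_normal(mu.shape)
    sigma = np.exp(0.5 * log_var)
    return mu + sigma * eps

# Draw a sample with mean 2 and unit variance (log_var = 0).
z = reparametrize(np.full(4, 2.0), np.zeros(4))
```

In a real VAE, `mu` and `log_var` would be outputs of the encoder network, and gradients would flow through them while `eps` is treated as a constant input.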

Last Updated: July 10, 2023 · Created on: July 7, 2023 · 8 min · 1580 words · Abid

ELBO: Evidence Lower Bound

Motivation From Bayes' theorem we have: $ \displaystyle p_\theta(z\mid x) = \frac{p_\theta(x\mid z)~p_\theta(z)}{p_\theta(x)} $ The importance of this formula cannot be overstated, especially in the probabilistic models that are at the basis of deep learning, generative models in particular. However, in models that rely on this formulation, issues crop up regarding its analytical usability. The main issue is that for a good chunk of high-dimensional problems, the evidence $ p_\theta(x) $ is intractable....
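For a tiny discrete latent space, Bayes' rule is trivially computable, which makes it easy to see exactly which term becomes the bottleneck. A toy NumPy sketch with made-up numbers (not from the post): the evidence $p(x) = \sum_z p(x\mid z)\,p(z)$ is a cheap sum here, but the analogous sum/integral over a high-dimensional $z$ is what becomes intractable and motivates the ELBO.

```python
import numpy as np

p_z = np.array([0.5, 0.3, 0.2])          # prior p(z) over 3 latent states
p_x_given_z = np.array([0.9, 0.4, 0.1])  # likelihood p(x|z) for one observed x

# Evidence: marginalize the joint over z. With 3 states this is a sum of
# 3 terms; with a continuous high-dimensional z it is an intractable integral.
evidence = np.sum(p_x_given_z * p_z)      # p(x)

# Posterior via Bayes' rule, normalized by the evidence.
posterior = p_x_given_z * p_z / evidence  # p(z|x)
```

The posterior is only computable because `evidence` is; once that normalizer cannot be evaluated, the exact posterior is out of reach.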

Last Updated: July 11, 2023 · Created on: July 5, 2023 · 4 min · 768 words · Abid

Test Post

This post serves as the first post for testing purposes. It seems to be working pretty well so far as I'm typing it. My fingers hurt now :(

A math formula here:

$$ \displaystyle \int_a^b f(x)~dx = \lim_{n \to \infty} RS(f, a, b, n)\\ \int_a^b f(x)~dx = \lim_{n \to \infty} \frac{b-a}{n}\sum_{i=0}^{n-1}f\left(a+i\cdot \frac{b-a}{n}\right) $$

A sample of Python code:

```python
def sample_func(args):
    return args
```

A sample of C code here:

```c
typedef struct {
    int FirstVal;
} new_struct;

int main(void) {
    return 0;
}
```

A simple visualization here:

Last Updated: July 11, 2023 · Created on: July 5, 2023 · 1 min · 85 words · Abid