
## Variational Auto-encoder

This is an improved implementation of the paper *Stochastic Gradient VB and the Variational Auto-Encoder* by D. Kingma and Prof. Dr. M. Welling. This code uses ReLUs and the Adam optimizer instead of sigmoids and Adagrad; these changes make the network converge much faster.
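For concreteness, here is a minimal Torch7/Lua sketch of the choices described above: ReLU hidden layers and an `optim.adam` update. This is not the repository's `main.lua`; the layer sizes, learning rate, and placeholder minibatch are assumptions made only for illustration.

```lua
-- A minimal sketch (not the repository's main.lua): a VAE with ReLU hidden
-- layers trained with optim.adam. Layer sizes, batch, and learning rate
-- below are illustrative assumptions.
require 'torch'
require 'nn'
require 'optim'

local inputSize, hiddenSize, latentSize = 784, 400, 20

-- Encoder: ReLU hidden layer, then two linear heads for the mean and
-- log-variance of the Gaussian posterior q(z|x).
local encoder = nn.Sequential()
encoder:add(nn.Linear(inputSize, hiddenSize))
encoder:add(nn.ReLU())
local heads = nn.ConcatTable()
heads:add(nn.Linear(hiddenSize, latentSize))   -- mean
heads:add(nn.Linear(hiddenSize, latentSize))   -- log-variance
encoder:add(heads)

-- Decoder: ReLU hidden layer, sigmoid output for Bernoulli pixel probabilities.
local decoder = nn.Sequential()
decoder:add(nn.Linear(latentSize, hiddenSize))
decoder:add(nn.ReLU())
decoder:add(nn.Linear(hiddenSize, inputSize))
decoder:add(nn.Sigmoid())

-- Container used only to flatten both modules' parameters for optim.
local container = nn.Container()
container:add(encoder)
container:add(decoder)
local params, gradParams = container:getParameters()

local criterion = nn.BCECriterion()
criterion.sizeAverage = false                  -- sum over pixels, as in the lower bound

local batch = torch.rand(100, inputSize)       -- placeholder minibatch in [0, 1]
local adamConfig = {learningRate = 1e-3}       -- assumed learning rate

local function feval(x)
  if x ~= params then params:copy(x) end
  gradParams:zero()

  -- Encode, then sample z with the reparameterization trick.
  local stats = encoder:forward(batch)
  local mu, logvar = stats[1], stats[2]
  local std = torch.exp(logvar * 0.5)
  local eps = torch.Tensor(mu:size()):normal()
  local z = mu + torch.cmul(std, eps)

  -- Negative lower bound = reconstruction error + KL(q(z|x) || p(z)).
  local recon = decoder:forward(z)
  local reconLoss = criterion:forward(recon, batch)
  local klTerm = torch.exp(logvar):mul(-1):add(1):add(logvar):add(-1, torch.pow(mu, 2))
  local loss = reconLoss - 0.5 * torch.sum(klTerm)

  -- Backpropagate through the decoder, the sampling step, and the encoder.
  local dRecon = criterion:backward(recon, batch)
  local dz = decoder:backward(z, dRecon)
  local dMu = dz + mu                                    -- + dKL/dmu
  local dLogvar = torch.cmul(dz, eps):cmul(std):mul(0.5) -- dz/dlogvar = 0.5 * std * eps
  dLogvar:add(torch.exp(logvar):add(-1):mul(0.5))        -- + dKL/dlogvar
  encoder:backward(batch, {dMu, dLogvar})

  return loss, gradParams
end

-- One Adam step; a full training script loops this over minibatches and epochs.
optim.adam(feval, params, adamConfig)
```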

My other repository contains the Python (Theano) implementation; this version is based on Torch7.

To run the MNIST experiment:

```
th main.lua
```

Setting the `continuous` boolean to `true` makes the script run the Frey Faces experiment instead.
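A hedged sketch of that switch (exactly where `continuous` is defined inside `main.lua` is an assumption):

```lua
-- Somewhere in main.lua (exact location is an assumption):
local continuous = true   -- true: Frey Faces experiment, false: binary MNIST experiment
```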

The code is MIT licensed.

I gratefully reused MNIST downloading and reading code written by Rahul G. Krishnan.
