A discussion and tutorial on statistical whitening.
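As a taste of what the tutorial covers, here is a minimal sketch of whitening in NumPy. The ZCA formulation, the `eps` regularizer, and the random test data are illustrative assumptions, not necessarily the choices made in the post.

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    """Whiten rows of X (n_samples, n_features) so the features are
    decorrelated with unit variance. eps guards against tiny eigenvalues."""
    X = X - X.mean(axis=0)                      # center each feature
    cov = np.cov(X, rowvar=False)               # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # symmetric eigendecomposition
    # ZCA whitening matrix: U diag(1 / sqrt(lambda)) U^T
    W = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
    return X @ W

# Whitened data has (approximately) identity covariance:
X = np.random.randn(1000, 5) @ np.random.randn(5, 5)   # correlated toy data
print(np.round(np.cov(zca_whiten(X), rowvar=False), 2))
```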

Thoughts about the importance of data for machine learning and humans.

Generative probabilistic models, combined with deep neural networks, demonstrate the possibility of extracting meaningful abstract representations in an unsupervised way.

In this blog post, I'll show you how I implemented GoogLeNet in Keras and copied over the weights from Caffe. Then we'll classify some cats!
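For a sense of the classification step, here is a minimal sketch using a pretrained Keras model. GoogLeNet (Inception v1) itself is not bundled with Keras, so this sketch uses InceptionV3 from `keras.applications` as a stand-in, and `cat.jpg` is a hypothetical image path; the post's actual model is a custom implementation with weights ported from Caffe.

```python
# Minimal "classify some cats" sketch with a stand-in pretrained model.
import numpy as np
from tensorflow.keras.applications.inception_v3 import (
    InceptionV3, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

model = InceptionV3(weights="imagenet")        # pretrained ImageNet weights

img = image.load_img("cat.jpg", target_size=(299, 299))   # hypothetical path
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])     # e.g. tabby, tiger cat, Egyptian cat
```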

Backpropagation is a method for computing derivatives in artificial neural networks, allowing us to use gradient descent to train these models. Here, I walk through implementing backpropagation.
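As a preview of the walkthrough, here is a minimal sketch of backpropagation for a single-hidden-layer network with sigmoid activations and squared-error loss, trained on XOR. The architecture, loss, learning rate, and XOR task are illustrative assumptions, not necessarily what the post uses.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny network: 2 inputs -> 4 hidden units -> 1 output, squared-error loss.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((2, 4)), np.zeros(4)
W2, b2 = rng.standard_normal((4, 1)), np.zeros(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])           # XOR targets

lr = 1.0
for step in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)                     # hidden activations
    out = sigmoid(h @ W2 + b2)                   # network output
    # Backward pass: apply the chain rule layer by layer
    d_out = (out - y) * out * (1 - out)          # error at the output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)           # error propagated to the hidden layer
    # Gradient descent updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))   # should approach the XOR targets (exact values depend on the init)
```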