I am a Ph.D. student in the Computation and Neural Systems program at the California Institute of Technology. I am co-advised by Professors Pietro Perona and Yisong Yue. My focus is on deep learning, with the goal of extracting structure from data in unsupervised or semi-supervised settings.
Generative deep neural networks, combined with probabilistic models, have recently shown promising capabilities, hinting at the possibility of extracting meaningful abstract representations in unsupervised ways.
In this blog post, I'll show you how I implemented GoogLeNet in Keras and copied over the weights from Caffe. Then we'll classify some cats!
Backpropagation is a method for computing derivatives in artificial neural networks, allowing us to use gradient descent to train these models. Here, I walk through implementing backpropagation.
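As a taste of the kind of derivation the walkthrough covers, here is a minimal, hypothetical sketch (not the post's actual code): backpropagation computes the analytic gradient of a loss via the chain rule, and we can verify it against a finite-difference approximation.

```python
import numpy as np

np.random.seed(0)
x = np.random.randn(3)        # input vector
W = np.random.randn(2, 3)     # weights of a single linear layer
y = np.array([1.0, 0.0])      # target

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(W):
    # Squared-error loss of a linear layer followed by a sigmoid
    out = sigmoid(W @ x)
    return 0.5 * np.sum((out - y) ** 2)

# Analytic gradient via backpropagation (chain rule, layer by layer)
out = sigmoid(W @ x)
d_out = out - y                 # dL/d(out)
d_z = d_out * out * (1 - out)   # back through the sigmoid
d_W = np.outer(d_z, x)          # dL/dW for the linear layer

# Numerical gradient via central finite differences, for comparison
eps = 1e-6
num = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp = W.copy(); Wp[i, j] += eps
        Wm = W.copy(); Wm[i, j] -= eps
        num[i, j] = (loss(Wp) - loss(Wm)) / (2 * eps)

# The two gradients should agree to high precision
print(np.max(np.abs(d_W - num)))
```

This gradient check is a standard sanity test when implementing backpropagation by hand; the same pattern extends to deeper networks by chaining one such backward step per layer.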
A collection of my favorite blogs and podcasts on machine learning and deep learning.
A brief guide on how to get started with deep learning.