This was a quick and dirty test I did with some trial hours on the Google Machine Learning Cloud. I was curious how one of their Tensor Processing Units (TPUs) compares to a single Nvidia Titan X in terms of training time on an MNIST benchmark.
Zipline is a Pythonic algorithmic trading library for backtesting and trading quantitative strategies. TensorBoard is a visualization tool provided with the deep learning library TensorFlow. These two can be used together to create a dashboard that monitors and compares Zipline backtests. You don’t even need to know a thing about deep learning.
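The basic idea can be sketched without touching Zipline itself: flatten a backtest's per-day metrics into `(tag, step, value)` points, which is the shape TensorBoard scalar summaries expect. The metric names and the `to_scalar_points` helper below are illustrative assumptions, not Zipline or TensorFlow API:

```python
# Sketch: reshape backtest results into (tag, step, value) points, the form
# TensorBoard scalar summaries take. The metric names are typical Zipline
# performance columns, used here as assumptions, not a fixed API.
perf = {
    "portfolio_value": [100000.0, 100500.0, 100200.0],
    "sharpe": [0.0, 0.8, 0.6],
}

def to_scalar_points(perf, run_name="backtest_1"):
    """Flatten per-day metrics into TensorBoard-style scalar points."""
    points = []
    for tag, values in perf.items():
        for step, value in enumerate(values):
            # Prefixing the tag with the run name lets TensorBoard
            # group and compare multiple backtests side by side.
            points.append(("%s/%s" % (run_name, tag), step, value))
    return points

points = to_scalar_points(perf)
print(len(points))  # 6 points: 2 metrics x 3 days
```

Each point would then be written out with TensorFlow's summary machinery (e.g. a `tf.summary.FileWriter` in TF 1.x), and TensorBoard picks the event files up from the log directory automatically.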
I’ve added some code on GitHub for training deep convolutional neural networks to classify images in the Oxford 102 category flower dataset, using the lovely Caffe framework. Prototxt files for fine-tuning AlexNet and VGG_S models are included; they start from weights pretrained on the ILSVRC 2012 (ImageNet) data.
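The usual fine-tuning trick in Caffe is to rename the final classifier layer so its ImageNet weights are not copied over, resize it to the new class count, and give it a higher learning-rate multiplier than the pretrained layers. A minimal sketch of such a prototxt fragment (layer names and multipliers are illustrative, not necessarily what the repo's files use):

```
layer {
  name: "fc8_flowers"      # renamed so pretrained fc8 weights are skipped
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8_flowers"
  param { lr_mult: 10  decay_mult: 1 }   # learn the fresh layer faster
  param { lr_mult: 20  decay_mult: 0 }   # bias term, no weight decay
  inner_product_param {
    num_output: 102        # one output per flower category
  }
}
```

The earlier layers keep small (or zero) `lr_mult` values so the pretrained features are only gently adjusted.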
When using deep convolutional neural networks (CNNs) for image classification tasks, it’s common to apply several transformations to the images in order to augment the data and reduce overfitting. For example, images are often randomly cropped, mirrored, rotated, and blurred to artificially increase the number of training examples. It’s much more efficient to apply these transformations in real time during training than to store the extra transformed images on disk.
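A minimal sketch of the real-time version with NumPy: draw a random crop position and a coin flip for mirroring each time an image is requested, so no augmented copies ever touch the disk. The `augment` helper and its parameters are illustrative assumptions:

```python
import numpy as np

def augment(image, crop_size, rng=np.random):
    """Randomly crop and (half the time) mirror one H x W x C image."""
    h, w = image.shape[:2]
    top = rng.randint(0, h - crop_size + 1)   # random crop origin
    left = rng.randint(0, w - crop_size + 1)
    patch = image[top:top + crop_size, left:left + crop_size]
    if rng.rand() < 0.5:                      # mirror horizontally
        patch = patch[:, ::-1]
    return patch

img = np.arange(32 * 32 * 3).reshape(32, 32, 3)
out = augment(img, 24)
print(out.shape)  # (24, 24, 3)
```

Because the slicing and flip are views rather than copies, each augmented sample costs almost nothing, which is why doing this on the fly beats materializing transformed images.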