
Came back inspired and fascinated from the cinema the other day, having sat with one other lone cinema-goer watching AlphaGo.
AlphaGo is a deep learning program created by the company DeepMind to challenge the world champion Go player. Since the defeat of world chess champion Garry Kasparov by IBM's chess computer Deep Blue, the next mountain to climb was always Go. So far it had proved elusive, as the number of game permutations in Go makes chess look like noughts and crosses, and it was thought that Go might simply be too difficult for an AI program. If you get a chance you must see the documentary, as it tracks the development, first beating the European champion and then the world champion. Even more interesting is the reaction of the huge crowd watching the event.

If you have read any of my other posts you will know that I have been impressed by the gains that seem to be possible with the machine learning algorithm gradient boosting. This algorithm seems to be the de facto Kaggle competition winner at the moment. Kaggle, if you are not familiar, is a website where data science hobbyists and professionals take on submitted data sets and see who can produce the best machine learning solution. Inspired by Go I finally got around to checking out deep learning and was not surprised to find further gains. I tested three approaches on a simple data set consisting of just two features, namely horse age and days since last run. In all three cases I trained the models on two years of flat handicap data and tested them on one year of handicap data. Deep learning came out ahead of gradient boosting, which in turn beat random forests, in terms of profit and loss of the top rated.
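For anyone curious what such a comparison looks like in code, here is a minimal sketch. The file names and column names (horse_age, days_since_last_run, won) are placeholders of my own rather than the actual data set used above, and the model settings are deliberately kept simple.

# Sketch of comparing random forests, gradient boosting and a small neural network
# on a two-feature data set. File and column names are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from keras.models import Sequential
from keras.layers import Dense

# Hypothetical data: two seasons for training, one season for testing
train = pd.read_csv('flat_handicaps_train.csv')
test = pd.read_csv('flat_handicaps_test.csv')

features = ['horse_age', 'days_since_last_run']
X_train, y_train = train[features], train['won']
X_test = test[features]

# Random forest
rf = RandomForestClassifier(n_estimators=200, random_state=1)
rf.fit(X_train, y_train)

# Gradient boosting
gbm = GradientBoostingClassifier(n_estimators=200, random_state=1)
gbm.fit(X_train, y_train)

# A small feed-forward neural network via Keras
nn = Sequential()
nn.add(Dense(16, activation='relu', input_dim=len(features)))
nn.add(Dense(8, activation='relu'))
nn.add(Dense(1, activation='sigmoid'))
nn.compile(optimizer='adam', loss='binary_crossentropy')
nn.fit(X_train.values, y_train.values, epochs=20, batch_size=64, verbose=0)

# Each model now produces a win probability for every runner in the test year;
# ranking runners within each race by these probabilities gives a "top rated"
# selection whose profit and loss can then be compared across the three models.
rf_probs = rf.predict_proba(X_test)[:, 1]
gbm_probs = gbm.predict_proba(X_test)[:, 1]
nn_probs = nn.predict(X_test.values).ravel()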

If this topic would be of interest, perhaps as a hands-on tutorial, then please leave a comment below. In the meantime, probably the first thing you need to do if you want to get involved is to install TensorFlow and Keras. Keras is a front end built on top of TensorFlow that provides simplified access to deep learning. You will need Anaconda Python installed, which you should already have if you followed my earlier blog post on machine learning; see here

https://markatsmartersig.wordpress.com/2016/01/13/profitable-punting-with-python-1/

Installing TensorFlow and Keras

First you need to create a new environment for your Keras-based programs. Pull up a command box (type cmd in the Windows search box).

Assuming you have Anaconda installed, enter the following command (WordPress does not always render it clearly, but that is a double dash before name below)

conda create --name deeplearning python

You can change deeplearning to whatever you’d like to call the environment. You’ll be prompted to install various dependencies throughout this process—just agree each time.

Let’s now enter this newly created virtual environment. Enter the following command

activate deeplearning

The command prompt should now be flanked by the name of the environment in parentheses—this indicates you’re inside the new environment.

We now need to install in this new environment any libraries we may need, as they won't be accessible from the original root environment created when Anaconda was installed.

IPython and Jupyter are a must for those who rely on Jupyter notebooks for data science. Enter the following commands

conda install ipython
conda install jupyter

Pandas is the de facto library for exploratory analysis and data wrangling in Python. Enter the following command

conda install pandas

SciPy is an exhaustive package for scientific computing, but the namesake library itself is a dependency for Keras. Enter the following

conda install scipy

Seaborn is a high-level visualization library. Enter the following

conda install seaborn

Scikit-learn is the go-to library for machine learning tasks in Python outside of neural networks. Enter the following

conda install scikit-learn

We’re finally equipped to install the deep learning libraries, TensorFlow and Keras. Neither library is officially available via a conda package (yet) so we’ll need to install them with pip. One more thing: this step installs TensorFlow with CPU support only and not GPU support. Enter the following

pip install --upgrade tensorflow
pip install --upgrade keras
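
If you want a quick sanity check before leaving the command prompt, type python to start the interpreter and enter the lines below; if each prints a version number the installs succeeded (the exact numbers will vary).

import tensorflow; print(tensorflow.__version__)
import keras; print(keras.__version__)
exit()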

Check all is OK

Get Jupyter Notebook up and running by entering

jupyter notebook

Once you are in Jupyter, create a new notebook and simply enter

from keras.models import Sequential
from keras.layers import Dense

Now run the above cell and hopefully all will be OK
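
If the imports succeed, a slightly stronger check is to build and compile a tiny model in the next cell. The layer sizes below are arbitrary; the point is simply to confirm that Keras can drive TensorFlow end to end.

model = Sequential()                                   # an empty stack of layers
model.add(Dense(8, activation='relu', input_dim=2))    # hidden layer expecting two input features
model.add(Dense(1, activation='sigmoid'))              # a single output between 0 and 1
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()                                        # prints the layer structure if all is well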

Should you at any point wish to remove the new environment simply use the following command

conda remove --name deeplearning --all

That’s enough for now; if there is interest then we could perhaps explore the code in later sessions.