This post is about my introduction to Machine Intelligence, and an encouragement to people from other fields who are unfamiliar with it to give Machine Learning a try: the tooling is now mature enough that simple classification and prediction with Deep Learning is easy. So if you’re an ML/AI hotshot, my next post will be much more amusing than this one!

When I first started learning about Machine Intelligence in 2014, the idea of turning 0s and 1s into something approaching human intelligence through mathematical calculations felt like theory that only someone a lot smarter than me, with a PhD in Mathematics and Computer Science, could understand.

So, to try my luck, I went on to take the wonderful and classic Coursera course on ML by Andrew Ng. This course is really good IMHO if you want to build an intuition for the following:

- How linear regression and logistic regression work to predict/classify things.
- Why we use gradient descent to optimize parameters.
- What different types of ML models are out there and what their roles are.
- A basic understanding of Neural Networks.
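To make the gradient descent bullet concrete, here is a tiny sketch (my own toy example, not from the course) that fits a one-variable linear regression y = w·x + b by gradient descent; the data is generated from an assumed "true" line y = 2x + 1:

```python
# Toy gradient descent for linear regression y = w*x + b.
# Data generated from the assumed "true" line y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

w, b = 0.0, 0.0          # start from arbitrary parameters
lr = 0.05                # learning rate
for _ in range(2000):    # iterations
    n = len(xs)
    # Gradients of mean squared error with respect to w and b
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * dw         # step against the gradient
    b -= lr * db

print(round(w, 2), round(b, 2))  # converges towards w=2.0, b=1.0
```

The same loop, with the gradient of a different loss, is what optimizes logistic regression and (layer by layer, via backpropagation) neural networks too.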

Then, as an engineer who wants to get started building things quickly without getting too deep into the math behind them, I decided to take the fastai course **Practical Deep Learning For Coders, Part 1**.

fastai has a really great top-down approach: you start by creating a state-of-the-art model and then dive deep into the code and theory behind it. I am a really big fan of the top-down approach, which I have tried to apply in every aspect of my education, so it was an instant hit for me. fastai also gives you a really good API for doing transfer learning on a big NN architecture with a very minimal amount of code.

So, I started watching the course videos and went straight to Kaggle Datasets (a heaven for ML enthusiasts), where I downloaded the 10 Monkey Species dataset. I opened a fresh notebook with the fastai kernel, copied all the library import code, and started looking at the data:

We import the data and check what it looks like using Python’s plotting library.

The data is not in bad shape. You could pre-process it by resizing the images to a smaller size so that experiments run a lot faster, but I didn’t bother, as the dataset isn’t huge.
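As an aside, the simplest form of that resizing is plain downsampling — keep every n-th pixel. A toy sketch with a nested list standing in for a real pixel array (in the real notebook you would use fastai’s transforms or an image library instead):

```python
# A 4x4 "image" as a nested list of pixel values (toy stand-in for a real image).
img = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]

def downsample(image, factor=2):
    """Keep every `factor`-th pixel in both dimensions (naive nearest-neighbour resize)."""
    return [row[::factor] for row in image[::factor]]

small = downsample(img)
print(small)  # [[0, 2], [8, 10]] — a quarter of the pixels to push through the model
```

A quarter of the pixels means roughly a quarter of the compute per image, which is why smaller images make experiments so much faster.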

Now it’s time for the fastai magic. We build a learner using the fastai API:

learn = ConvLearner.pretrained(arch, data=data, precompute=True)

where `pretrained` means the architecture/model comes already trained on a large dataset and initialized with those weight matrices, except for the last layer, which is replaced with a new one sized for our particular dataset with 10 classes to predict. `precompute=True` tells fastai to compute the activations of the frozen pretrained layers once and cache them, so training that new last layer is very fast.
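To show the idea behind this setup — freeze a pretrained body, cache its activations once, and train only a small new head on them — here is a from-scratch toy (my own illustration, not fastai’s internals), with a hand-written "body" and a two-class logistic head:

```python
import math

# Pretend "pretrained body": a frozen feature extractor we never update.
def body(x):
    return [x, x * x]  # two fixed features per input

# Precompute activations once (this is the role precompute=True plays in fastai).
inputs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
labels = [1, 1, 1, 0, 0, 0]           # toy task: class 1 if x < 0
features = [body(x) for x in inputs]  # cached — the body is never run again

# Train only the new head (a logistic regression) on the cached features.
w, b = [0.0, 0.0], 0.0
lr = 0.5
for _ in range(500):
    for f, y in zip(features, labels):
        p = 1 / (1 + math.exp(-(w[0] * f[0] + w[1] * f[1] + b)))
        err = p - y
        w[0] -= lr * err * f[0]
        w[1] -= lr * err * f[1]
        b -= lr * err

preds = [int(1 / (1 + math.exp(-(w[0] * f[0] + w[1] * f[1] + b))) > 0.5) for f in features]
print(preds)  # the head alone separates the two classes
```

Because the body’s outputs never change, they only need to be computed once — which is exactly why training the head takes seconds rather than hours.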

Now, with the pretrained layers already knowing how to look at image data, we only need to teach the model what the 10 classes of monkeys look like. We train for 3 epochs using the pretrained weights:

learn.fit(1e-1, 3)

where 1e-1 (i.e., 10^-1) is the learning rate and 3 is the number of epochs.

Whaaaatttt! With just 3 epochs we are at 98.9% accuracy on predicting these classes. So, I went on to unfreeze the network and fine-tune the weights of all layers to get even better accuracy.

So, we achieve 100% accuracy on our test set in minutes, without much data. I assume it won’t be 100% on every photo of these species on the internet, but with a loss of 0.011 I think it has generalized pretty well.
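For reference, this is roughly what those two numbers mean: accuracy is the fraction of images whose true class got the highest predicted probability, and the loss is the average cross-entropy, i.e. −log of the probability assigned to the true class. A from-scratch sketch with made-up probabilities (not the notebook’s real outputs):

```python
import math

# Made-up predicted probability for the TRUE class of each validation image.
true_class_probs = [0.99, 0.98, 0.995, 0.97, 0.99]

# Accuracy: fraction of images where the true class won. With 10 classes,
# a probability above 0.5 guarantees the true class is the top prediction.
accuracy = sum(p > 0.5 for p in true_class_probs) / len(true_class_probs)

# Cross-entropy loss: -log of the probability of the true class, averaged.
loss = -sum(math.log(p) for p in true_class_probs) / len(true_class_probs)

print(accuracy, round(loss, 3))
```

This is why a tiny loss like 0.011 is reassuring even alongside 100% accuracy: it says the model is not just right, but right with very high confidence.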

I downloaded another image from the internet that is different from anything in our train/validation/test sets.

Notice that the training set doesn’t contain an image from this angle. So, let’s see what our model predicts for this image:
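Under the hood, the prediction for a single image comes from taking a softmax over the model’s 10 raw output scores and picking the largest. A quick sketch with invented numbers (placeholder class names and made-up scores, not the model’s real outputs):

```python
import math

classes = [f"species_{i}" for i in range(10)]  # placeholder class names
logits = [0.1, 2.3, -1.0, 0.5, 8.9, 0.0, -0.3, 1.2, 0.7, -2.0]  # made-up raw scores

# Softmax: exponentiate (shifted by the max for numerical stability), then
# normalize so the 10 scores become probabilities that sum to 1.
m = max(logits)
exps = [math.exp(z - m) for z in logits]
total = sum(exps)
probs = [e / total for e in exps]

pred = classes[probs.index(max(probs))]
print(pred, round(max(probs), 3))  # the model's best guess and its confidence
```

Because softmax is monotonic, the predicted class is simply the one with the largest raw score; the softmax only turns that score into a confidence.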

Next, I wanted to see what happens if I take a larger architecture (because there is no such thing as overkill in my dictionary) and train it the same way with precomputed activations.

We get 100% accuracy with 0.001 validation loss in just 5 epochs and 2.57 seconds 😮

You can check the full Jupyter notebook below:

Thank you for your time and I hope you’re now motivated to take this course too!

**Exercise for the reader: Try this notebook and pre-process the dataset to make the images smaller, speeding up the learning process.**

Source: Deep Learning on Medium