Transfer Learning: the time savior

Original article was published on Artificial Intelligence on Medium

The whole premise of artificial intelligence and deep learning is to imitate the human brain, and one of the most notable features of our brain is its inherent ability to transfer knowledge across tasks. In simple terms, this means using what you learnt in kindergarten, adding two numbers, to solve matrix addition in high-school mathematics.

The field of machine learning makes use of the same idea: a model that has already been trained on lots and lots of data can be reused as a starting point, adding to the accuracy of our own model.

Here is my code for the transfer learning project I have implemented.

Part 1: Prepare a dataset

I made use of OpenCV to capture real-time images of faces and used them as the training and test datasets: 100 images for training and 50 for testing. The dataset consists of images of my mother and me.

dataset structure

Libraries were imported, and basic code to capture images and save them was written inside a while loop that runs until 100 images are collected.
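The capture code itself was shared as an image, so here is a minimal sketch of what such a loop might look like. The directory layout, file-name pattern, and camera index 0 are my assumptions for illustration, not the author's exact code:

```python
import os

def frame_path(root, split, person, idx):
    # Hypothetical layout, e.g. dataset/train/me/img_007.jpg
    return os.path.join(root, split, person, f"img_{idx:03d}.jpg")

def capture_images(person, split="train", count=100, root="dataset"):
    import cv2  # imported here so frame_path stays usable without OpenCV installed
    os.makedirs(os.path.join(root, split, person), exist_ok=True)
    cap = cv2.VideoCapture(0)  # open the default webcam
    saved = 0
    while saved < count:
        ok, frame = cap.read()
        if not ok:  # camera unavailable or frame dropped
            break
        cv2.imwrite(frame_path(root, split, person, saved), frame)
        saved += 1
    cap.release()
    return saved
```

Running `capture_images("me", "train", 100)` and `capture_images("me", "test", 50)` (and the same for the second person) would produce a dataset like the one described above.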


This gave me a customized dataset.

Part 2: Use Transfer Learning

The VGG16 model was imported with pre-trained ImageNet weights, and all of its pre-trained layers were frozen; only the newly added output layers are trainable:
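A minimal sketch of this setup using tf.keras is shown below. The size of the dense head (256 units) and the two-class softmax output are my assumptions; the key transfer-learning step is freezing the convolutional base:

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.models import Model

def build_model(weights="imagenet", num_classes=2):
    # Load VGG16 without its original classifier head
    base = VGG16(weights=weights, include_top=False, input_shape=(224, 224, 3))
    for layer in base.layers:
        layer.trainable = False          # freeze all pre-trained layers
    x = Flatten()(base.output)
    x = Dense(256, activation="relu")(x)  # new trainable head (size is an assumption)
    out = Dense(num_classes, activation="softmax")(x)
    model = Model(inputs=base.input, outputs=out)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With the base frozen, only the small head is updated during training, which is what makes transfer learning so much faster than training VGG16 from scratch.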

The model is trained for 10 epochs, and early stopping is also applied to prevent overfitting.
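The early-stopping setup might look like the toy example below; the monitored metric (`val_loss`) and patience value are my assumptions, and a tiny synthetic model stands in for the VGG16 one so the snippet is self-contained:

```python
import numpy as np
from tensorflow.keras import Input, Sequential
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras.layers import Dense

# Tiny stand-in model and random data, just to demonstrate the callback
model = Sequential([Input(shape=(4,)), Dense(2, activation="softmax")])
model.compile(optimizer="adam", loss="categorical_crossentropy")

x = np.random.rand(20, 4)
y = np.eye(2)[np.random.randint(0, 2, 20)]

# Stop if validation loss fails to improve for 3 consecutive epochs
early = EarlyStopping(monitor="val_loss", patience=3, restore_best_weights=True)
history = model.fit(x, y, validation_split=0.5, epochs=10,
                    callbacks=[early], verbose=0)
```

In the real project the same callback would simply be passed to `model.fit` on the face dataset, capping training at 10 epochs while allowing an earlier stop.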

For prediction, an image from the test set was given to the model, and the prediction turned out to be correct.
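The prediction step can be sketched as follows; the helper name and the class labels are hypothetical, and the label order must match whatever ordering was used when the training data was loaded:

```python
import numpy as np
from tensorflow.keras.applications.vgg16 import preprocess_input
from tensorflow.keras.utils import img_to_array, load_img

def predict_person(model, img_path, class_names=("me", "mom")):
    # Load and resize the image to VGG16's expected 224x224 input
    img = load_img(img_path, target_size=(224, 224))
    # Apply the same VGG16 preprocessing used during training
    arr = preprocess_input(img_to_array(img)[np.newaxis, ...])
    probs = model.predict(arr)[0]
    return class_names[int(np.argmax(probs))]
```

Calling `predict_person(model, "dataset/test/me/img_003.jpg")` (with a hypothetical path) would return the predicted person's label.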

Here is the link to my github repo that contains the full code for the project!