Deep Learning with PyTorch

Original article was published on Deep Learning on Medium


Transfer Learning using ResNet34

Transfer learning is a research problem in machine learning that focuses on storing knowledge gained while solving one problem and applying it to a different but related problem.

I have used data augmentation and regularization techniques to create the new dataset.

Data augmentation is a strategy that enables practitioners to significantly increase the diversity of data available for training models, without actually collecting new data.

I have used a few transform functions to transform the dataset:

Resize() = Resizes the input image to the given size.

RandomHorizontalFlip() = Horizontally flips the given image at random.

RandomRotation() = Rotates the image by a random angle within the given range.

RandomErasing() = Randomly selects a rectangular region in the image and erases its pixels.

Created a dataset using these transforms.

To view the images in batches after the transforms are applied, I used the same show_batch() function. By calling this function we can see batches of the augmented images like this.

After that, I built IntelImageResnet() by extending IntelImageClassificationBase(), and used a pretrained ResNet34 model to train on this dataset.

Improving Fit Function

  1. Learning rate scheduling: the One Cycle learning rate policy, which starts with a low learning rate, gradually increases it batch by batch up to a given maximum for about 30% of the epochs, and then gradually decreases it back to a low value for the remaining epochs.
  2. Weight decay: a regularization technique that prevents the weights from becoming too large by adding an additional penalty term to the loss function.
  3. Gradient clipping: apart from limiting the layer weights and outputs, it is also helpful to limit the values of the gradients to a small range, to prevent undesirable changes in parameters due to large gradient values. This simple yet effective technique is called gradient clipping.
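The three improvements above can be combined into one training loop. This is a sketch, not the article's exact code: evaluation and logging are omitted, and the function assumes a model with a `training_step(batch)` method like the base class described earlier.

```python
import torch

def fit_one_cycle(epochs, max_lr, model, train_loader,
                  weight_decay=0, grad_clip=None, opt_func=torch.optim.SGD):
    """Training loop with one-cycle LR, weight decay, and gradient clipping."""
    # Weight decay is passed to the optimizer: it penalizes large weights
    optimizer = opt_func(model.parameters(), max_lr, weight_decay=weight_decay)
    # One-cycle policy: LR ramps up for ~30% of the steps, then anneals down
    sched = torch.optim.lr_scheduler.OneCycleLR(
        optimizer, max_lr, epochs=epochs,
        steps_per_epoch=len(train_loader), pct_start=0.3)
    for epoch in range(epochs):
        model.train()
        for batch in train_loader:
            loss = model.training_step(batch)
            loss.backward()
            if grad_clip is not None:
                # Clip gradients to keep each parameter update small
                torch.nn.utils.clip_grad_norm_(model.parameters(), grad_clip)
            optimizer.step()
            optimizer.zero_grad()
            sched.step()  # the one-cycle schedule advances every batch
```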

Created an object of this model and moved it to the GPU.
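A common sketch of the device-handling helpers for this step (the helper names are assumptions, not taken from the article):

```python
import torch

def get_default_device():
    """Pick the GPU if one is available, else fall back to the CPU."""
    return torch.device('cuda' if torch.cuda.is_available() else 'cpu')

def to_device(data, device):
    """Move a tensor (or a list/tuple of tensors) to the chosen device."""
    if isinstance(data, (list, tuple)):
        return [to_device(x, device) for x in data]
    return data.to(device, non_blocking=True)
```

The model object is then moved with something like `model2 = to_device(IntelImageResnet(), get_default_device())`.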

After that, I trained only the final layer of the model, which means the parameters of all layers except the final one are left unchanged.

model2.freeze()

Then I unfroze all the layers and retrained the model.

model2.unfreeze()

After training, I got an accuracy of 94%.

Let's plot the accuracy on the training dataset.

Let's also plot the training and validation loss.

Both the training loss and the validation loss are decreasing gradually.
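A sketch of the loss-plotting helper, assuming the training history is a list of per-epoch dicts with 'train_loss' and 'val_loss' keys (an assumption about the recorded format):

```python
import matplotlib.pyplot as plt

def plot_losses(history):
    """Plot training and validation loss against the epoch number."""
    train_losses = [x['train_loss'] for x in history]
    val_losses = [x['val_loss'] for x in history]
    plt.plot(train_losses, '-bx')
    plt.plot(val_losses, '-rx')
    plt.xlabel('epoch')
    plt.ylabel('loss')
    plt.legend(['Training', 'Validation'])
    plt.title('Loss vs. No. of epochs')
```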

Let's predict a few test images with this model.

So from the ResNet34 model I got an accuracy of 94%, and the model was able to predict the given images correctly.
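A minimal sketch of how a single test image can be classified; the `classes` list (the dataset's class names) and the helper name are assumptions:

```python
import torch

def predict_image(img, model, classes):
    """Return the predicted class label for a single image tensor."""
    model.eval()
    with torch.no_grad():
        xb = img.unsqueeze(0)        # add a batch dimension: C x H x W -> 1 x C x H x W
        preds = model(xb)            # forward pass
        _, idx = torch.max(preds, dim=1)
    return classes[idx.item()]
```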

The ResNet34 model (94% accuracy) outperforms the CNN (76%) and feed-forward neural network (49%) models.