Source: Deep Learning on Medium
Fortunately, my fiancé is creative enough to come up with ideas about things that look similar enough that a computer could help classify them. We are not botanists (or at least I’m not), but I agree that a flower classifier is a cool application of the new fastai vision library. A link to my web app is available at the top of the page for anyone who has pictures of flowers and wants to know, with 97% accuracy, what kind they are. If you ever find yourself lost in a forest and can’t remember whether Indian basket flowers grow facing north, fire up the flower classifier web app, snap a picture of the flower in front of you, and find out how to get out of your hairy situation (forget about Google Maps).
Data gathering — 80% of the work, 20% of the fun
The first step was to gather the names of all the different flower types for the classifier to learn. Since I can only name about five different types of flowers on a good day, I googled around for a list of the most common flower species in Texas and found a couple, courtesy of Texas A&M and others. I chose the top 12 flower types (adding in a few other well-known ones) and read them into Pandas: its HTML reader parsed the websites and turned the tables into DataFrames. I took a shortcut and deleted the column of scientific names, since a quick spot check in Google showed that the common names return more useful image results.
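The scraping step above can be sketched in a few lines. The post doesn’t give the exact page URL, so this uses an inline HTML snippet standing in for the Texas A&M list; pd.read_html works the same way when you pass it a live URL instead.

```python
import io
import pandas as pd

# Stand-in for the web page of common Texas flowers (the real URL
# isn't given in the post). pd.read_html parses every <table> it
# finds and returns a list of DataFrames.
html = """
<table>
  <tr><th>Common Name</th><th>Scientific Name</th></tr>
  <tr><td>Bluebonnet</td><td>Lupinus texensis</td></tr>
  <tr><td>Indian Blanket</td><td>Gaillardia pulchella</td></tr>
</table>
"""

tables = pd.read_html(io.StringIO(html))
flowers = tables[0]

# Drop the scientific names; common names give better image-search results
flowers = flowers.drop(columns=["Scientific Name"])
print(flowers["Common Name"].tolist())
```

The resulting list of common names is what gets fed to the Google Images downloader in the next step.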
Training — the 10 minute classifier
The advantage of the fastai library is its ease of use. Once I was done with lesson 2 (4 hours into the course, not including homework), I had the skills to train an image classifier in less than 10 minutes. My setup is the Juno app connected to a Jupyter Notebook server on my Paperspace Linux machine, all running on my iPad (https://link.medium.com/wNRjuJivXT). I loaded the data from the paths that I had Google Images download to, created a resnet50 model with an image size of 224 pixels, and created a learner object. Fastai version 1 now recommends an initial training method called “fit_one_cycle,” which I let run for 4 epochs. This initial training took less than 2 minutes and resulted in 89% accuracy. I then unfroze the earlier layers and used the learning rate finder to get a slice of new learning rates to train the entire model with (in this case, 1e-5 to 2e-4). I let the learner train for another 3 epochs and reached 92% accuracy.
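For reference, the two training stages described above look roughly like this in the fastai v1 API. This is a sketch, not the author’s exact notebook: the 'data/flowers' path and the one-subfolder-per-flower layout are assumptions, and it needs the downloaded images (and ideally a GPU) to actually run.

```python
from fastai.vision import *

# Assumed layout: data/flowers/<flower name>/*.jpg, one folder per class
path = 'data/flowers'
data = (ImageDataBunch
        .from_folder(path, valid_pct=0.2, size=224, ds_tfms=get_transforms())
        .normalize(imagenet_stats))

# Pretrained resnet50 with a new classification head
learn = cnn_learner(data, models.resnet50, metrics=accuracy)

# Stage 1: train only the new head (the post reports ~89% after 4 epochs)
learn.fit_one_cycle(4)

# Stage 2: unfreeze everything and fine-tune with discriminative learning
# rates read off the LR finder plot (1e-5 to 2e-4 in the post)
learn.unfreeze()
learn.lr_find()
learn.fit_one_cycle(3, max_lr=slice(1e-5, 2e-4))
```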
I ran the results through DatasetFormatter’s from_toplosses and used the ImageCleaner widget to find any images that shouldn’t belong in the training set; I ended up removing 15 images that were not of flowers. Once that was done, I read the “cleaned.csv” file into a new data bunch object, loaded the learner, unfroze the early layers, and trained for another 3 epochs. This resulted in 97% accuracy. I used the classification interpretation tool to plot the confusion matrix, which showed that the classifier only gets tripped up on 1 or 2 images per category. Probably better than what my brain can do. Good enough for me.
Production — only so I can show off
I wanted to put this trained classifier on the internet in order to have concrete evidence of my skills, and also to give people the opportunity to text me about all the images the classifier could not recognize (hint: the list of species is at the top of the web app)! The first step is to export the trained model using the learn.export() method, which saves a file called “export.pkl” in the same folder as my data path. The next step was to upload the model to Google Drive, which I did with Google Drive for Linux using the instructions linked at the end. At this point, with Google Chrome and Google Drive on my Linux server on my iPad, I felt like a true computer rebel. I turned link sharing on and saved the download link to the file. Then it was time to deploy on Render.
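The export/reload round trip looks roughly like this in fastai v1. It is a sketch: `learn` is the trained learner from the previous section, and 'data/flowers' and the image filename are placeholders.

```python
from fastai.vision import load_learner, open_image

# In the training notebook: serialize the model, classes, and transforms
# for inference. Writes export.pkl into the learner's data path.
learn.export()

# On the server: reload the model without any of the training data.
# 'data/flowers' stands in for wherever export.pkl ended up.
learn_inf = load_learner('data/flowers')
pred_class, pred_idx, probs = learn_inf.predict(open_image('some_flower.jpg'))
print(pred_class)
```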
Render is the perfect website for this kind of deployment. The founder, Anurag Goel, is very active on the fastai forums and even helped me get started. Fastai has a template on their GitHub that can be used to deploy to Render (link to instructions below). Pretty simple: I forked the GitHub repo for Jeremy’s grizzly bear vs. teddy bear classifier and updated the model URL in the code to point to the Google Drive link where I saved the exported model (export.pkl). I also updated the classes to the 12 types of flowers my model is trained to recognize. A note — I initially had trouble getting the repo to deploy on Render when I renamed my forked repo; when I kept the same name as the original repo, everything worked fine. The final touch was to sprinkle in a little HTML and CSS to make my web app look and read a little differently than the original. Once everything was updated on my GitHub, all I needed to do on Render was link the GitHub repo to my deployment server. The advantage of Render is that it does everything else for me. Once the project is live on Render, the final step is to show off the link to everyone by writing about it on Medium!
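In the fastai starter template, those two edits live near the top of the server file (app/server.py in the template at the time of writing). A sketch of the fragment to change — the Drive file id is a placeholder, and since the post never lists the 12 flower names, the class list here is illustrative only:

```python
# Point at your own exported model instead of the bear classifier's.
# The file id below is a placeholder for your shared Google Drive link.
export_file_url = 'https://drive.google.com/uc?export=download&id=YOUR_FILE_ID'
export_file_name = 'export.pkl'

# Replace the bear classes with your 12 flower types, in the same order
# the model was trained with (names here are placeholders)
classes = ['bluebonnet', 'indian blanket', ...]
```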
Links referenced in this post:
- List of Trees by common name, an alphabetical A–Z of tree species with botanical names in a searchable list (www.treenames.net)
- Python script to download hundreds of images from Google Images, ready to run (github.com)
- Tutorial on setting up Selenium with ChromeDriver on Ubuntu and LinuxMint systems (tecadmin.net)
- Render deployment starter repo, set up to use Jeremy’s bear classifier for testing an initial deployment (course.fast.ai)