Beginner’s Guide to TensorFlow 2.x for Deep Learning Applications

Original article was published by Orhan Gazi Yalçın on Artificial Intelligence on Medium

Well, we could compare TensorFlow and PyTorch for days, but this post is not about framework benchmarking. This post is about what you can achieve with TensorFlow.

What is TensorFlow?

TensorFlow is an end-to-end framework and platform designed to build and train machine learning models, especially deep learning models. It was developed by Google and released as an open-source platform in 2015.

The two programming languages with stable, official TensorFlow APIs are Python and C. In addition, developers can find limited-to-extensive TensorFlow support in C++, Java, JavaScript, Go, and Swift. Most developers end up using Python, since Python offers compelling data libraries such as NumPy, pandas, and Matplotlib.

Why Should We Use TensorFlow?

There are several advantages to using a powerful deep learning framework, and the non-exhaustive list below points out some of them:

  • Reduced time to build and train models;
  • Useful data processing tools;
  • Compatibility with other popular data libraries such as NumPy, matplotlib, and pandas;
  • A rich catalog of pre-trained models with TF Hub;
  • Tools to deploy trained models across different devices such as iOS, Android, Windows, macOS, and Web;
  • Great community support;
  • A skill in demand at tech companies.

A Brief History of TensorFlow

Currently, we are using the second major version of TensorFlow: TensorFlow 2.x. It took almost nine years to reach this level of maturity. However, I would say we are still in the early phase of the ultimate deep learning platform, because current trends indicate that deep learning processes will become much more streamlined in the future. Some claim that API-based practices will become the standard way of using deep learning and artificial neural networks. But let’s not get ahead of ourselves; first, let’s take a look at the history of the TensorFlow platform:

The TensorFlow team deliberately uses the term platform since its deep learning library is just a part of the whole technology.

2011–2016: The Infancy and Initial Developments

In 2011, Google Brain developed a proprietary machine learning library for internal Google use, called DistBelief. DistBelief was primarily used for Google’s core businesses, such as Google Search and Google Ads.

In 2015, to speed up advancements in artificial intelligence, Google decided to release TensorFlow as an open-source library, and the TensorFlow beta was released that year.

In 2016, Google announced Tensor Processing Units (TPUs). Tensors are the building blocks of TensorFlow applications, and, as the name suggests, TPUs are ASICs specially designed for deep learning operations.

Figure 4. Google’s Tensor Processing Units on Wikipedia
ASIC stands for application-specific integrated circuit. ASICs are customized for a particular use such as deep learning or cryptocurrency mining, rather than general-purpose use.

2017–2019: First Major Version and the Advancements in Cross-Platform Technologies

The Developments of 2017:

In February, TensorFlow 1.0 was released, setting a milestone. Before February 2017, TensorFlow was still in its 0.x.x versions, the initial development phase. In general, a 1.0.0 release defines a public API with stable production capability. Therefore, February 2017 was indeed a big milestone for TensorFlow.

Seeing the rapid advancements in mobile technologies, the TensorFlow team announced TensorFlow Lite, a library for machine learning on mobile devices, in May 2017.

Finally, in December 2017, Google introduced Kubeflow. Kubeflow is an open-source platform that allows the operation and deployment of TensorFlow models on Kubernetes. In other words, it is “the Machine Learning Toolkit for Kubernetes.”

The Developments of 2018:

In March, Google announced TensorFlow.js, which enables developers to implement and serve machine learning models using JavaScript.

In July 2018, Google announced the Edge TPU. The Edge TPU is Google’s purpose-built ASIC designed to run TensorFlow Lite machine learning (ML) models on small edge devices such as smartphones.

The Developments of 2019:

In January 2019, the TensorFlow team announced the official release date for TensorFlow 2.0.0: September 2019.

In May 2019, TensorFlow Graphics was announced to tackle issues related to graphics rendering and 3D modeling.

2019–2020: From September 2019 Onwards: TensorFlow 2.0+

In September 2019, the TensorFlow team released TensorFlow 2.0, the current major version, which streamlined away many of the inconveniences of building neural networks.

With version 2.0, TensorFlow finally embraced Keras as its official main high-level API for building, training, and evaluating neural networks.
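As a minimal sketch of what the Keras API looks like in practice (a toy classifier invented for illustration, not a model from this article):

```python
import tensorflow as tf
from tensorflow import keras

# A tiny fully connected classifier defined with the Keras Sequential API.
model = keras.Sequential([
    keras.Input(shape=(784,)),                      # e.g. flattened 28x28 images
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),   # 10 output classes
])

# One compile call wires up the optimizer, loss, and metrics.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.summary()
```

From here, training is a single `model.fit(x, y, epochs=...)` call and evaluation a single `model.evaluate(x, y)` call, which is exactly the streamlining the 2.0 release aimed for.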

TensorFlow 2.0 streamlined the data loading and processing tools and added new features.
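The centerpiece of those data tools is the `tf.data` API. A small sketch with made-up numbers shows the idea: build a dataset, transform it, and batch it in a few chained calls:

```python
import tensorflow as tf

# Build an input pipeline from an in-memory list,
# double each element, and group the results into batches of 3.
ds = tf.data.Dataset.from_tensor_slices([1, 2, 3, 4, 5, 6])
ds = ds.map(lambda x: x * 2).batch(3)

batches = [batch.numpy().tolist() for batch in ds]
print(batches)  # [[2, 4, 6], [8, 10, 12]]
```

The same pipeline pattern scales from toy lists to files on disk, with shuffling, prefetching, and parallel mapping added as extra chained calls.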

Eager execution was made the default option, replacing graph execution. This strategy was adopted because PyTorch had attracted many researchers with its eager execution.

With eager execution, TensorFlow calculates the values of tensors as they occur in your code.
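A quick illustration (toy values chosen for this post): in TensorFlow 2.x, operations return concrete results immediately, with no graph or session to manage.

```python
import tensorflow as tf

# Eager execution is on by default in TensorFlow 2.x.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.ones((2, 2))

c = tf.add(a, b)   # evaluated right away, no tf.Session required
print(c.numpy())   # a plain NumPy array: [[2. 3.] [4. 5.]]
```

In TensorFlow 1.x, the same addition would only have defined a graph node; getting the value required running it inside a session.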

As you can see, TensorFlow is much more than a deep learning library for Python. It is an end-to-end platform on which you can process your data, build and train machine learning models, and serve the trained models across different devices using different programming languages. Below you can see the current diagram of the TensorFlow platform:

Figure 5. The Current Diagram of the TensorFlow Platform (Figure by Author)

How Popular is TensorFlow?

As of 2020, the real competition is taking place between TensorFlow and PyTorch. Due to its maturity, extensive support in multiple programming languages, popularity in the job market, extensive community support, and supporting technologies, TensorFlow currently has the upper hand.

Figure 6. Deep Learning Framework Power Score 2018 (based on Jeff Hale’s Work) (Figure by Author)

In 2018, Jeff Hale developed a power ranking of the deep learning frameworks on the market. He weighted the mentions found in online job listings, relevant articles and blog posts, and activity on GitHub. Since 2018, PyTorch has gained momentum, and I believe it must have a higher score by now. Still, I believe TensorFlow has an edge over PyTorch due to its maturity.

I am Convinced! What’s Next?

If you have come this far, I hope you have already developed an understanding of what TensorFlow is and how you can benefit from it. If you are convinced to learn TensorFlow, in the next post, I will explain the topics below with actual code examples:

  • The very basics of TensorFlow: Eager Execution, Tensors, and Variables; and
  • Five major capabilities of TensorFlow 2.x that cover the entire deep learning pipeline operations.

You can follow my account and subscribe to my newsletter.

Final Notes

Over the years, TensorFlow has turned into a big platform covering nearly every need of machine learning experts, from end to end. There is still a long way to go, but we are far ahead of where we were ten years ago. Join the rise of this new technology and learn to implement your own deep learning models with TensorFlow’s help. Don’t miss out…

Finally, if you are interested in applied deep learning tutorials, check out some of my articles: