AI Economy will speed up innovation further



PART 1: Why does innovation in technology accelerate?
I rarely write on LinkedIn, but I have nevertheless summarised some trending mini-videos here that might explain why innovation keeps accelerating.

Trending Video #1 — Innovation in the technology sector is driving traditional businesses out
This example clearly shows how Apple went from near bankruptcy to becoming one of the most beloved companies in the world. The same goes for businesses like Google, Amazon, and Facebook, which didn’t even exist 20 years ago.


Future prospect: businesses that do not invest heavily in data-driven intelligence will not last the next decade.

Trending Video #2 — Genetically engineered cells learning to track and destroy cancer
One such example is this nanotech breakthrough in medical cancer research from Nanolive: https://nanolive.ch/applications.

Future prospect: science is helping us understand how the right medicines, or cells programmed for the task, can save us from one of mankind’s worst curses.

Trending Video #3 — A car learning to race using evolutionary algorithms
This example shows a car being taught to race using Deep Reinforcement Learning.


Future prospect: in the 1960s, driving was friendly and a privilege; today, dense cities and traffic have become a major issue. Such algorithms help us build self-driving vehicles that can do the job better than we can, without stress!

How does it accelerate?
Fast broadband internet and cheap commodity hardware are already available to individual researchers, and cheap HPC hardware soon will be. Engineers and managers, too, could soon have powerful computers under their desks!


PART 2: Why HPC is the basis for AI innovation
In this context, high-performance computing makes cities smarter, organizations more data-driven, and decision-making an easy process of sifting through yottabytes (meaning huge amounts of data).

Courtesy: HPCwire https://www.hpcwire.com/2018/01/18/new-blueprint-converging-hpc-big-data/

Technological progress is already happening at a very rapid rate, and with Artificial Intelligence and the architectures that come with it, it will go even faster!

PART 3: Will artificial intelligence increase this speed even further, and HOW?

It all began when Google open-sourced its machine learning library TensorFlow and introduced the Tensor Processing Unit (TPU).

Someday we’ll look back and call this the turning point of the AI Economy, or whatever the fancy term for it will be in 2030.

Facebook, of course, followed the same open-source route with PyTorch. Today we hear that Uber, Netflix, Tesla, and almost every rapidly growing business use some form of machine learning or deep learning.

Nvidia is clearly ahead with its GPUs, but the next wave of this movement will be characterized by two things:

  1. We will have tensor architectures everywhere.
  2. We will see the decline and slow death of 32-bit and 64-bit IEEE 754 floating-point architectures, in favor of reduced-precision formats (see the sketch below).
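
To make the precision point concrete, here is a minimal NumPy sketch of my own (bfloat16, the reduced-precision format popular on tensor hardware, is not in stock NumPy, so float16 stands in for it):

```python
import numpy as np

# 64-bit IEEE 754 resolves a tiny increment that 32-bit rounds away entirely
print(np.float64(1.0) + np.float64(1e-8))  # 1.00000001
print(np.float32(1.0) + np.float32(1e-8))  # 1.0

# half precision keeps only about 3 decimal digits, trading accuracy for
# speed and memory, a trade deep learning workloads tolerate remarkably well
print(np.float16(1.0) + np.float16(0.01))
```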

You might wonder: “Wait, what is a tensor?”

A tensor is simply a multi-dimensional array: a generalization of scalars (rank 0), vectors (rank 1), and matrices (rank 2) to any number of dimensions.
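
In NumPy terms (a toy illustration of my own), a tensor is what the library calls an n-dimensional array:

```python
import numpy as np

scalar = np.array(5.0)              # rank-0 tensor: a single number
vector = np.array([1.0, 2.0, 3.0])  # rank-1 tensor: shape (3,)
matrix = np.ones((3, 4))            # rank-2 tensor: shape (3, 4)
cube = np.zeros((2, 3, 4))          # rank-3 tensor: shape (2, 3, 4)

for t in (scalar, vector, matrix, cube):
    print(t.ndim, t.shape)
```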

Moving on…

[I apologize in advance for turning this fun storytelling into math, but it is crucial, stay with me]

  1. It’s nothing new; you were already using this in the 80s: tensor languages have actually been around for years. Programming languages like APL and Fortran used them in the past.
  2. Numerical computing optimization has been going on for a while: already in the 1950s, programmers knew how to make linear algebra faster by blocking data to fit the architecture. Matrix-matrix operations in particular run best when the matrices are tiled into submatrices and even sub-submatrices (see the sketch below).
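
Here is a minimal sketch of that blocking trick (the function name and tile size are mine, chosen for illustration): each tile is small enough to stay in fast memory, so data gets reused instead of refetched.

```python
import numpy as np

def blocked_matmul(a, b, block=64):
    """Multiply square matrices by tiling them into block x block pieces."""
    n = a.shape[0]
    c = np.zeros((n, n))
    for i in range(0, n, block):
        for j in range(0, n, block):
            for k in range(0, n, block):
                # accumulate the (i, j) tile from the (i, k) and (k, j) tiles
                c[i:i + block, j:j + block] += (
                    a[i:i + block, k:k + block] @ b[k:k + block, j:j + block]
                )
    return c

a = np.random.rand(256, 256)
b = np.random.rand(256, 256)
assert np.allclose(blocked_matmul(a, b), a @ b)  # same result as plain matmul
```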

Dot Product — Huh?

You may have heard this term from an engineering colleague or employee. If not, you have still done it in high-school math at some point: the dot product multiplies two vectors element by element and sums the results.
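
For example, the dot product of (1, 2, 3) and (4, 5, 6) is 1×4 + 2×5 + 3×6 = 32. Here it is in Python (a throwaway illustration of my own):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# multiply element by element, then sum the results
manual = sum(x * y for x, y in zip(a, b))
print(manual, np.dot(a, b))  # both print 32.0
```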

All those pictures and text corpora that your machine learning or deep learning engineers use for face recognition, securing devices from hacks, or analyzing traffic for self-driving cars boil down, somewhere, to matrix-matrix operations.
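
As a toy sketch of that idea (the shapes and names here are mine, just for illustration):

```python
import numpy as np

images = np.random.rand(32, 784)   # 32 flattened 28x28 pictures, one per row
weights = np.random.rand(784, 10)  # a dense layer scoring 10 possible classes

# one matrix-matrix product classifies the whole batch at once; every output
# entry is just the dot product of an image row with a weight column
scores = images @ weights
print(scores.shape)  # (32, 10)
```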

All right, before your head hurts, I’ll stop now!

AI will render many computing paradigms of the 1990s and 2000s obsolete, replacing them with modern models, architectures, and hardware solutions that will flood the market within the next 5–7 years.

Technologically, we will see more advanced linear algebra embedded directly in hardware, where massively parallel computation (which is what deep learning systems do best) combined with multi-level submatrix BLAS (Basic Linear Algebra Subprograms) will make matrix multiplications even faster.
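
You can already call such tuned BLAS routines today; for instance, SciPy exposes the underlying GEMM (general matrix multiply) kernel directly. A minimal sketch:

```python
import numpy as np
from scipy.linalg.blas import dgemm  # double-precision GEMM from the BLAS

a = np.random.rand(512, 512)
b = np.random.rand(512, 512)

# C = alpha * A @ B, dispatched to whatever optimized BLAS SciPy links against
c = dgemm(alpha=1.0, a=a, b=b)
assert np.allclose(c, a @ b)
```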


Ten to twenty years from now, we will look back at today’s computing infrastructures, data centers, desktop machines, and devices and smile, the way we now smile at outdated computing methods.

This is why the AI Economy will be massive: it will lead to an overhaul at an industrial scale, just as the Internet did 20 years ago.

My advice: be open-minded and think outside the box while you are building a career in data science. It will give you a competitive edge.

Bio: Shaik Sameeruddin. I help businesses drive growth using Analytics & Data Science | Public speaker | Uplifting students in tech and personal growth | Pursuing a B.Tech (3rd year) in Computer Science and Engineering (Specialisation in Data Analytics) at Vellore Institute of Technology (VIT).
