Original article can be found here (source): Artificial Intelligence on Medium
In the last few years, artificial intelligence deployments at companies around the world have changed. As enterprise-wide efforts came to dominate, Cloud Computing became an essential component of the AI evolution. As customers spend more time on their devices, businesses increasingly realize the need to bring essential computation onto the device to serve more customers. This is why the Edge Computing market will continue to accelerate over the next few years: it is forecast to become a $1.12 trillion market by 2023.
To prepare for this, large Cloud companies are offering Edge Computing services. Intel and Udacity just launched their program to train 1 million developers worldwide.
According to Gartner, 91% of today’s data is processed in centralized data centers, but by 2022, about 74% of all data will need analysis and action at the edge.
The Drivers of Edge Computing and Edge AI
Edge computing is a distributed computing paradigm that brings computation and data storage closer to the devices where data is produced. It originated with content delivery networks; now, companies use virtualization to extend those capabilities.
There’s a misconception that edge computing will replace the Cloud. On the contrary, it functions in conjunction with the Cloud.
Big data will continue to be processed in the Cloud. However, instant data that is generated by users, and relates only to those users, can be computed on the edge.
There are several drivers of Edge Computing and Edge AI.
Privacy — As consumers become more conscious of where their data is located, companies are designing apps in which personalization features are delivered only upon user authorization inside the app. This allows companies to deliver more AI-enabled personalized features while giving users visibility into how their data is collected.
Security — With increasingly distributed architectures being deployed and increasingly sensitive data stored in the Cloud, there is a movement toward multiple layers of encryption and more dynamic encryption mechanisms. With a growing variety of AI-enabled devices such as speakers, phones, tablets, and robots, edge nodes can determine the right security mechanism for each class of device.
Latency — The most obvious reason for tasks to be done on the edge is latency. As services become more distributed at both the network level and the device level, there are growing latency concerns when sending data across networks and devices.
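To make the latency argument concrete, here is a back-of-the-envelope sketch (not a benchmark) comparing the user-perceived response time of an inference request served from a distant cloud region versus a nearby edge node. All millisecond figures are hypothetical assumptions for illustration.

```python
# Illustrative latency budget: user-perceived response time is roughly
# one network round trip plus the time to run the model itself.

def response_time_ms(network_rtt_ms: float, inference_ms: float) -> float:
    """Total response time: network round trip plus inference."""
    return network_rtt_ms + inference_ms

# Hypothetical numbers: a cloud region ~80 ms away vs. an edge node ~5 ms away.
cloud_latency = response_time_ms(network_rtt_ms=80.0, inference_ms=20.0)
edge_latency = response_time_ms(network_rtt_ms=5.0, inference_ms=30.0)

print(f"cloud: {cloud_latency:.0f} ms, edge: {edge_latency:.0f} ms")
```

Note that even if the edge hardware runs the model more slowly (30 ms vs. 20 ms in this sketch), the much shorter network path still wins: 35 ms at the edge versus 100 ms via the cloud.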
Load Balancing — To increase end-to-end application resiliency on increasingly distributed systems, there need to be multiple load-balancing points. This gives rise to the idea of the Cloudlet, which resides on the edge, closer to the mobile device, to increase resiliency at the device level.
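A minimal sketch of what device-level load balancing across a nearby cloudlet and a fallback cloud region could look like. The endpoint names are hypothetical; a real client would discover endpoints and probe their health and latency rather than hard-code a list.

```python
from itertools import cycle

class EndpointBalancer:
    """Round-robin balancer over a list of endpoints (e.g. cloudlet + cloud)."""

    def __init__(self, endpoints):
        self.endpoints = list(endpoints)
        self._rr = cycle(self.endpoints)

    def pick(self, healthy):
        """Return the next endpoint in rotation, skipping unhealthy ones."""
        for _ in range(len(self.endpoints)):
            endpoint = next(self._rr)
            if healthy.get(endpoint, False):
                return endpoint
        raise RuntimeError("no healthy endpoint available")

balancer = EndpointBalancer(["cloudlet-local", "cloud-us-east"])
health = {"cloudlet-local": True, "cloud-us-east": True}
print(balancer.pick(health))   # rotates through healthy endpoints
health["cloudlet-local"] = False
print(balancer.pick(health))   # falls back to the cloud region
```

The point of the sketch is the fallback behavior: when the nearby cloudlet is unreachable, the device keeps working by routing to the cloud, which is exactly the resiliency argument made above.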
Getting Into The New Wave of Edge AI
Data scientists, machine learning engineers, front end developers, network ops, DevOps, IoT developers, and back end developers each already hold a piece of the knowledge necessary to work in this new Edge AI economy.
Concepts that were helpful for operating in the Big Data or Cloud Computing world can be readily applied in the Edge AI economy. The convergence of on-device programming from the IoT world and Cloud programming from the Big Data/AI world will allow everyone to unleash their creativity.
What does it take to build an Edge AI network to interact with personalized apps that you use on multiple devices while maintaining core intelligence on your Enterprise AI Cloud?
It takes the joined minds of data scientists, machine learning engineers, front end developers, back end developers, IoT developers, and others, who are used to working in their own lanes of specialization, to make this happen.
The challenge of succeeding in the Edge AI economy is to understand the direction of computing and architecture, and to build next-generation AI-enabled applications and devices that make full use of the AI and machine learning ecosystem.
Below are some resources and fun projects that will enable you to learn more about the new Edge AI economy.
This is a project-based program in which you earn a certificate at the end. It uses the Intel® OpenVINO™ toolkit. It’s a great program for IoT developers, as well as for data scientists and machine learning engineers who are having trouble finding a job or are between positions.
This is an amazing weekend project for any developer to learn all about Kubernetes. It teaches you what it takes to build a cluster in your own home.
Serverless architecture has received a lot of hype recently, in part because of edge computing. Understanding the architecture will help you grasp the concepts behind Edge AI training courses.
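The core idea behind serverless is small, stateless functions invoked per event, which is also why it maps well onto edge nodes. Below is a minimal, locally runnable sketch of that programming model; the event/context signature mirrors the common function-as-a-service handler shape (e.g. AWS Lambda's Python handler), but the event fields here are made-up examples.

```python
import json

def handler(event, context=None):
    """Stateless function: receives an event dict, returns a response dict."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Invoking the handler locally, the way a platform runtime would:
print(handler({"name": "edge"}))
```

Because the function holds no state between invocations, the platform (cloud or edge) is free to run it wherever capacity is available, which is the property that makes the model attractive for edge computing.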
For any developer who has never programmed in the IoT world, an IoT development course can ease you into it. For data scientists and machine learning engineers, taking these courses in conjunction with Edge AI courses will be helpful.
For IoT developers who have never built applications that use AI and machine learning, now is the time to learn about algorithms. You will be able to see the vision of more personalized features using AI and machine learning on instant data.