Artificial Intelligence: Beyond the Buzzword

Original article was published by Nick Myers on Artificial Intelligence on Medium



Artificial Intelligence. When you see this phrase, what images pop into your head? If I had to guess, it's probably something along the lines of a robot death machine that may want to kill you. Yes? No? When I ask this question in front of a live audience, people typically respond with something like "the HAL 9000" from Stanley Kubrick's 2001: A Space Odyssey or "Skynet" from The Terminator series. If references to either of these two films appeared in your mind's eye, know that you are not alone, and roughly 90% of other people probably thought of something similar.

The truth is that this image most of us hold, of intelligent computers wanting to kill us and destroy humanity, is one of the great fallacies about Artificial Intelligence that has been ingrained in our minds over the last few decades. I must say, the media and pop culture have done an excellent job of convincing most people that one day machines and computers will become intelligent enough to overtake humanity. However, this depiction of AI (as we know it today) could not be farther from reality.

Since beginning my AI journey a few years ago, I have come to the great realization that most people truly have no clue what AI is, how it works, and where the technology stands today. No matter what industry you currently work in, you have more than likely been inundated with the words "AI," "Machine Learning," and "Automation." In fact, these three words have become the ultimate buzzwords for startups and savvy marketers seeking to paint an advanced technological picture of their product or service, assuming that most of the people they are talking to have no solid understanding of what AI actually is. The sad fact of the matter is that these savvy marketers and startup founders are right. Most people know absolutely nothing about AI, let alone Blockchain, IoT, Cloud Computing, Big Data, and every other piece of emerging tech that has been transformed into a meaningless buzzword.

Helping you to understand “AI” beyond just the buzzword is exactly what I will be writing about in this article. By the time you have finished reading this (or even mindlessly skimming if you so choose) you will be more knowledgeable about AI than 99% of the people you know. Buckle up, because we’re about to hop into the DeLorean for a trip back to when the idea of AI as we know it today all began.

A Very (VERY) Brief History of AI

Believe it or not, the idea of intelligent machines or "automatons" has been around since ancient Greece. In fact, ancient civilizations had their craftsmen create moving statues that were treated as sacred. It was not until the Golden Age of Sci-Fi, between 1938 and 1946, that more modern-day interpretations of intelligent machines, robots, etc. were created by authors like Isaac Asimov and Doc Smith.

Immediately following the golden age, the birth of modern artificial intelligence began in the early to mid-1950s as the field of machine intelligence research (otherwise known as AI research) began to explode. It was during this period that Alan Turing published his now-famous paper "Computing Machinery and Intelligence," in which he developed the Turing Test among many other theories of computer intelligence. In 1956, one of the most prominent gatherings of AI researchers took place at Dartmouth College, where the term "Artificial Intelligence" was coined by John McCarthy, one of the founders of AI research.

In the years and decades that followed, the field of AI research ebbed and flowed as researchers developed new theories and new technologies, and as computing power continued to increase, with companies like Intel, IBM, Microsoft, and Apple making computing technology cheaper and easier to use for businesses and consumers alike. In the 2010s, the field of AI research entered a new renaissance as cloud computing became mainstream and the internet made it easy to access large amounts of data.

Narrow AI vs. AGI

AI (as we know it today) can be placed into two different categories: Narrow AI and Artificial General Intelligence. Narrow AI constitutes almost all of today's AI capabilities. With today's AI systems and the amount of computing power currently available, an AI can only do one thing incredibly well. For AI to do that one thing incredibly well, it requires a lot of computing power and access to a lot of data that can be used to train the system. Although we now have access to more data than at any point in human history, it is still relatively challenging to access large quantities of data and sort it appropriately so it can be given to an AI system to use.

AGI (Artificial General Intelligence) is ultimately where AI is headed and is what the entire field of AI research is working towards. Once AI can be placed into the category of AGI it will be just as intelligent (if not more) than you or I. It will be capable of collecting and interpreting data on its own and will be able to complete tasks without much oversight from a human operator.

Although we can create extremely intelligent Narrow AI systems, there is still a lot of debate within the community of AI researchers as to whether we will ever be able to create a true AGI system. Some AI researchers predict that we could have a functioning AGI by 2030, and others claim that we may never be able to fully create an autonomous AGI system. Even so, understanding these two foundational categories of AI should help to calm some of your fears about the HAL 9000 or Skynet suddenly rising to power. Long story short: they probably never will.

What Makes AI Tick?

When it comes to understanding how today's AI systems work, there are only four primary components/sub-technologies that you will want to understand:

Machine Learning

In a nutshell, Machine Learning (ML) is the study of computer algorithms that improve automatically over time through experience. Machine Learning allows a computer system to sift through massive amounts of data, learn from that data, and recognize specific patterns within that data that humans would otherwise miss. Machine Learning is one of the foundational components of AI, and without Machine Learning, AI as we know it today would not exist.
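To make "improving through experience" concrete, here is a toy sketch (not a production ML system): a model that learns the hidden rule y = 2x + 1 from example data, getting a little less wrong with every pass.

```python
# Toy machine-learning sketch: a model that improves with experience.
# It learns the rule y = 2x + 1 purely from (input, answer) examples.
data = [(x, 2 * x + 1) for x in range(10)]

w, b = 0.0, 0.0          # the model starts out knowing nothing
learning_rate = 0.01

for epoch in range(1000):            # each pass over the data is more "experience"
    for x, y in data:
        pred = w * x + b             # the model's current guess
        error = pred - y             # how wrong the guess was
        w -= learning_rate * error * x   # nudge the parameters to shrink the error
        b -= learning_rate * error

print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0: the learned pattern
```

No one told the program the rule; it recovered the pattern from the data alone, which is the essence of the idea.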

Natural Language Processing

Natural Language Processing (NLP) is a technology that allows a computer to interpret and understand human language. Voice assistants like Amazon Alexa and Google Assistant use NLP to understand what you say so the technology can fulfill a request like playing music, setting a timer, or answering a question.
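As a heavily simplified, hypothetical sketch (real assistants use far more sophisticated statistical models), here is the kind of intent-matching step an assistant performs once speech has been turned into text. The intent names and keywords below are made up for illustration.

```python
# Hypothetical intent matcher: map an utterance to the user's likely goal.
INTENTS = {
    "play_music": ["play", "music", "song"],
    "set_timer": ["timer", "remind", "minutes"],
    "answer_question": ["what", "who", "when", "where", "why", "how"],
}

def classify_intent(utterance: str) -> str:
    words = utterance.lower().split()
    # Score each intent by how many of its keywords appear in the utterance.
    scores = {intent: sum(w in words for w in keywords)
              for intent, keywords in INTENTS.items()}
    return max(scores, key=scores.get)

print(classify_intent("play some music by Queen"))     # play_music
print(classify_intent("set a timer for ten minutes"))  # set_timer
```

Once the intent is known, the assistant can route the request to the right skill, such as a music player or a clock.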

Neural Networks

You can think of a Neural Network like an artificial human brain. A neural network consists of a series of perceptrons, or "artificial neurons," that connect with one another to interpret and analyze information. As information moves across the neural network, the system can form connections between data points (much like the human brain does when learning something new) and is able to form assumptions and generate predictions from the data. A Neural Network can adapt to changing input, so the network generates the best possible results without the need to redesign the output criteria.
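A single artificial neuron is simpler than it sounds. The minimal sketch below trains one perceptron to learn the logical AND function, using the classic perceptron learning rule; real networks wire many of these together.

```python
import random

def step(x):
    # The neuron "fires" (outputs 1) only if its weighted input is high enough.
    return 1 if x >= 0 else 0

# Training data: two inputs and the desired output of logical AND.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = random.uniform(-1, 1)

for _ in range(100):  # repeatedly adjust the connections toward correct answers
    for (x1, x2), target in examples:
        output = step(weights[0] * x1 + weights[1] * x2 + bias)
        error = target - output
        weights[0] += 0.1 * error * x1
        weights[1] += 0.1 * error * x2
        bias += 0.1 * error

print([step(weights[0] * x1 + weights[1] * x2 + bias)
       for (x1, x2), _ in examples])  # [0, 0, 0, 1] — it learned AND
```

The "knowledge" lives entirely in the weights and bias, which is why training a network is really just adjusting numbers.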

Deep Learning

Deep Learning takes the concept of a neural network even further. A Deep Learning system consists of multiple layers of artificial neurons that can interpret information and make many connections between data points. Deep Learning is what powers most of today's advanced AI systems, including facial recognition and voice assistant technologies.

Why AI Matters and What It Means for You

With the technical jargon now behind us, the question remains: Why does AI matter, and what does it mean for you and for humanity? This is a tough question; there is no denying that. It is a question that I have become better at answering over time, but at the end of the day, any response is driven by pure speculation. However, I have settled on the answer that AI matters because one day (sooner rather than later) we will not be able to remember a world where AI and intelligent machines did not exist.

Over the next 10–20 years, the world is going to see an explosion of intelligent technology, unlike anything that we have experienced to date. AI will replace millions of jobs due to automation but will also create millions of new jobs that we cannot yet hope to predict. AI is going to force us to take a hard look at the current way our society functions and make us rethink our current forms of government and how our economies operate. AI is going to be fundamental in helping to solve some of the world’s biggest problems like climate change, overpopulation, and resource management. AI is going to help us form colonies on the moon, travel to Mars, and assist us with uncovering the mysteries of the cosmos. This is why AI matters, and we are just now beginning to scratch the surface of what AI can truly be capable of.

(For a more in-depth answer to this question I suggest that you visit our YouTube channel and watch my TEDx talk “AI: The Final Tool” where I share my thoughts on the future of AI and the impact that it will have on human society.)

What Next?

If you have made it this far, congratulations! This means that I kept things interesting enough for you to care and keep reading, while also helping you to understand the impact of one of the most important pieces of technology ever created in our lifetime. Now go forth, continue to learn, and the next time you hear someone talking about Artificial Intelligence, remember that there is more to AI than just the buzzword.

Stay hungry. Stay foolish.

Nick Myers is the Founder & CEO of RedFox AI based in Madison, WI. RedFox AI is a Voice technology company for innovative Biotech organizations that want their in-home testing efforts to be more frictionless, cost-effective, and successful. Nick is a TEDx and International Keynote Speaker and has been featured in publications such as PR Daily, In Business Madison, and the Journal of Digital and Social Media Marketing for his work with AI and voice technology.