Is it too late to adopt Artificial Intelligence?
Analysis shows that companies taking the first steps of AI adoption have a 20% advantage over their competitors in gaining market leadership
At the dawn of this millennium, Steven Spielberg released a science-fiction film called Artificial Intelligence (AI). The film was conceived by the great American director Stanley Kubrick, who was inspired by Brian Aldiss's 1969 short story “Supertoys Last All Summer Long”.
Before this, Hollywood had been toying with AI concepts for decades, long before the term AI was even coined! One of the first movies to feature a robot was Metropolis, released in 1927. What followed was a chain of blockbuster movies featuring AI, including “2001: A Space Odyssey”, “Blade Runner”, “The Terminator”, “The Matrix”, “I, Robot”, and many others. However, none of them pushed the term AI into collective use as much as Spielberg's Artificial Intelligence.
Since then, AI has not only found its way into our dictionaries, but it has also increasingly entered everyday use. The main reason it took so long to be noticed is that systems which use AI rarely acknowledge it! Have you ever seen a car manufacturer prominently advertise AI? Yet many standard cars have parking sensors, auto-parking features, lane-centring steering and traffic-sign recognition, to name a few. We can say the same for air conditioners, ovens, washing machines and most mundane household appliances. So we have all been using AI for quite a while without even knowing it!
But AI did not gain popularity just because of the movies. Something else happened in the first decade of this millennium.
First of all, AI is data-hungry and, as such, requires large volumes of information to operate well. Since the beginning of this millennium, the world has experienced a massive surge in data production, which is still ongoing. Computers, smartphones and wearable devices constantly capture video, photos, sound, 3D and text-based content. Social media has turned everyone into a content creator. Internet of Things (IoT) sensors are being plugged in all over the place, promptly reporting all sorts of things: who's ringing the doorbell, the state of the garage door, the temperature or the air quality inside the home. According to Forbes, 90% of all the data in the world was created during the past two years!
Second, just having data is not enough if the algorithms cannot handle it. Luckily, around 2006 a new breed of AI algorithms called Deep Learning (DL) was developed. These programs build upon Artificial Neural Networks (ANN), a family of techniques dating back to Frank Rosenblatt's 1958 perceptron and inspired by the internal workings of the human brain. DL algorithms can learn from massive data sets, and their accuracy is extremely high, in some cases even exceeding what a human can achieve. Most of the consumer applications in use today rely on such techniques.
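For readers curious about what “learning from data” looks like in practice, here is a minimal sketch in Python using the open-source scikit-learn library. It trains a small neural network to recognise handwritten digits; the data set, layer sizes and parameters are illustrative choices, not a prescription from the research cited in this article.

```python
# A minimal sketch of the idea behind neural-network learning:
# a small network that learns to recognise handwritten digits.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# A small labelled data set: 8x8 pixel images of handwritten digits.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=42
)

# A neural network with two hidden layers (sizes chosen for illustration).
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=42)
model.fit(X_train, y_train)  # "learning" = adjusting weights from examples

print(f"Accuracy on unseen digits: {model.score(X_test, y_test):.2%}")
```

Real DL systems use far deeper networks, specialised hardware and vastly larger data sets, but the principle, improving accuracy by adjusting internal weights against labelled examples, is the same.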
Third, since these algorithms process vast amounts of data, they need powerful computers. During the same period, processors originally designed to cater for high-end graphics became the processors of choice for DL algorithms. But even these were not enough to quench the thirst for processing power. Luckily, cloud computing was on the rise, allowing AI developers to rent cheap processing power hosted remotely around the globe.
So the right conditions suddenly came together around 2010, which is why AI took off so rapidly immediately afterwards and is today on everyone's lips. But the window of opportunity in any innovation is not perpetual and typically lasts around 15 years. If you do the math, to gain the most advantage out of AI, a company needs to take action by 2025. A survey conducted by the Massachusetts Institute of Technology (MIT) amongst 3,000 managers shows that one-third of companies still have little to no understanding of AI and will most likely miss this opportunity. Another 48% are piloting AI projects only to appear forward-looking, with no intention of ever deploying them. Only 20% of companies are pioneers: they have deep knowledge of AI and plan to integrate it into their processes.

This tallies very much with research conducted by MMC Ventures, a London-based investment firm. They looked at 2,830 AI companies based in Europe and discovered that 40% of them don't use any AI at all! Trying to keep up appearances rather than gaining real benefit from AI is rather tragic. Unfortunately, according to the Diffusion of Innovations theory developed by E.M. Rogers in 1962, this is a relatively common pattern in all innovations, and AI is no exception!
Based on the same survey, while almost 90% of companies believe that AI will offer them an advantage over their competitors, only 60% have an AI strategy in place, and of these, only 10% are using AI to generate financial benefits. If these companies do not act, the gap with the leading companies will widen further. The analysis also shows that companies taking the first steps of AI adoption have a 20% advantage over their competitors in gaining market leadership. The percentage rises to 40% for those who successfully implement AI in their systems, and it shoots up to more than 70% for those who revamp their processes using an intelligent combination of humans and machines. Their profits also increase fivefold compared to those who made only minor changes to a few business processes.
It's up to you to decide whether your company will reap the rewards or lament a missed opportunity in the coming years!