AI is everywhere in the media at the moment, but it’s all quite confusing. What does artificial intelligence actually mean? What’s the difference between a chatbot, AI interfaces, Siri and the science fiction future of virtual consciousness? Is there a difference? And what does AI mean for content creators?
AI – What Is It?
Some people say it will save humanity. Others say it could destroy us all. But, the truth is, most people don’t really know what AI is. According to tech research giant Gartner, 90% of enterprises are investigating chatbots, but only 4% have them up and running. Why? They don’t understand it.
And it’s not surprising. AI is used as an umbrella term for all sorts – from robotics to machine learning – so it’s hard to make sense of it all. Here’s how I look at it…
‘Intelligence’ is the ability to acquire and apply knowledge and skills. ‘Artificial intelligence’ is the ability of a computer system to perform tasks usually associated with intelligent beings, such as decision-making. Generally, psychologists do not characterise human intelligence by just one trait. Instead, a combination of many diverse abilities is considered. Research into AI has focused mainly on the following components of intelligence: learning, reasoning, problem-solving, perception, and using language.
Types Of Intelligence
This is where it gets more complicated; there are many types of AI. Some are classified as ‘weak’ or ‘narrow’ AI, e.g., virtual personal assistants (think Siri) and others as ‘strong’ AI – those systems capable of finding a solution when presented with an unfamiliar task.
Another classification system from Arend Hintze divides AI into four categories, from reactive machines to conscious machines. (Hintze knows what he’s talking about – the Michigan State University assistant professor in Integrative Biology and Computer Science and Engineering has developed an artificial intelligence system for video games that adapts to the player’s behaviour through Darwinian evolution.) It’s a comprehensive categorisation which even takes into account systems that don’t yet exist, but there’s a lot to digest.
To put it simply: AI is complex, and researching it tends to raise more questions than it answers, which is why it’s easy to end up in a muddle. Here are a few of the basics covered…
Chatbots
What a buzzword! But what is a chatbot? The lines are getting blurry, because the media uses this word to describe both simple and intelligent bots.
A ‘bot’ (short for robot) is an automated program that runs over the internet. Chatbots first became popular in the 1990s with the rise of online chat rooms. These bots are scripts that look for certain text patterns in messages submitted by chat room participants and respond with automated actions. Chatbots have evolved since then.
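To make that concrete, here’s a minimal sketch (in Python, with invented patterns and replies) of the kind of pattern-matching script those early chat room bots relied on:

```python
import re

# A minimal sketch of a 1990s-style chat bot: it scans incoming messages
# for text patterns and replies with a canned response.
# The patterns and replies below are invented for illustration.
RULES = [
    (re.compile(r"\bopening hours?\b", re.I), "We're open 9am-5pm, Monday to Friday."),
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye! Thanks for chatting."),
]

def reply(message: str) -> str:
    """Return the first matching canned response, or a fallback."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Sorry, I didn't understand that. Could you rephrase?"

print(reply("What are your opening hours?"))  # matches the first rule
print(reply("Tell me about your returns policy"))  # no rule matches, so the bot falls back
```

The limitation is obvious: if the user’s wording doesn’t match a pattern the script already knows, the bot is stuck – which is exactly the gap more intelligent systems try to close.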
Now, chatbots are great for linear, single-dimensional support, like customer service, and most of us have experienced bots online: “click here and talk to Tom!” But you’re not talking to a person, you’re talking to a bot. According to Gartner, chatbots will power 85% of all customer service interactions by the year 2020. It’s very possible that soon we will have more conversations with bots than with our spouse.
Virtual Assistants
Think Siri and Amazon Alexa. While a chatbot is built for a specific purpose, such as selling you something, an intelligent assistant is built to serve you, and can do far more – from telling jokes and providing stock market updates, to setting timers and playing your favourite tune.
AI And Bots
Both chatbots and virtual assistants are more intelligent than a simple bot. A bot can only follow its script, while chatbots and virtual assistants have more scope to interpret a command. When they are supported by artificial intelligence, the fundamental difference is the Natural Language Processing (NLP) capability, which acts as the brain of the AI bot. That brain keeps evolving based on the data it gathers and analyses, so it can make smarter decisions. It can also understand the meaning of what was said or typed, and it can draw on information from other sources, such as a CRM or real-time insights.
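As a rough illustration of what that NLP layer adds (a toy example, not a production pipeline – the intents, keywords and scoring here are invented, and real systems use far richer language models): instead of matching exact scripted phrases, an AI-backed bot can score the user’s words against known intents and pick the most likely one.

```python
import re

# Hypothetical intents and the keywords that hint at them.
INTENTS = {
    "check_order": {"order", "delivery", "parcel", "tracking", "arrive"},
    "set_timer":   {"timer", "remind", "minutes", "alarm"},
    "tell_joke":   {"joke", "funny", "laugh"},
}

def detect_intent(utterance: str) -> str:
    """Pick the intent whose keywords best overlap with the user's words."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    scores = {intent: len(words & keywords) for intent, keywords in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(detect_intent("Where is my parcel, it was meant to arrive today?"))  # -> check_order
print(detect_intent("Set a timer for ten minutes please"))                 # -> set_timer
```

Once the intent is known, the bot can look up the relevant record (in a CRM, say) and respond with something useful rather than a canned line.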
Take, for instance, IBM’s Watson: when answering a question, it compiles information by looking at thousands of pieces of text, and combining this information with an ability to recognise patterns. Importantly, it also weighs the evidence found in those patterns. Watson can come to conclusions in much the same way that people do.
The Future
What if AI could move beyond transactional interactions and could recognise human emotion? If we can teach machines to think, how long will it be before we can teach them to feel, and to interact in a meaningful way?
While caring, emotional bots might seem like an idea pulled from a Steven Spielberg movie, the reality is much nearer than we think. Take Replika for example – a chatbot that learns to mimic how humans speak in order to converse. There is also Affectiva, a company spun out of MIT’s Media Lab, whose technology can detect human vocal and facial expressions, using data from millions of videos and recordings of people across cultures. Another example is Pepper, a humanoid robot who first came on the scene in 2016 – a ‘genuine day-to-day companion, whose number one quality is his ability to perceive emotions’.
What If We Added VR To AI – What Happens Then?
The combination of VR and AI offers a big opportunity for brands. Their integration will provide a new range of experiences and opportunities that respond in a more human way. Suddenly, that virtual environment becomes more intelligent and, importantly, more personal, providing copious opportunities for brand marketers to use this technology to make stronger connections with consumers. This is the next generation of media. Taking a 3-D virtual world and combining it with a smart system that can replicate a human – it’s a whole new level of engagement.
Today, companies like Retinad build heat maps that display exactly where a person’s gaze has landed and how long it has lingered inside a virtual environment. In the future, AI-powered apps will use that information to individually tailor the experience. For example, the AI in the app will know how the user has been moving, how well they are doing at tasks, what the user has been interacting with most, and what has caught the user’s subconscious gaze the most. It will then adapt the experience into an entirely bespoke narrative, completely personalised to the individual.
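As a rough sketch of how such a gaze heat map could be built – note that the sample format here (viewing angles plus dwell time) is an assumption for illustration, not Retinad’s actual pipeline:

```python
from collections import defaultdict

def build_heatmap(gaze_samples, cell_size_deg=10):
    """Bucket gaze samples into grid cells and sum how long each cell was viewed.

    Each sample is assumed to be (yaw_degrees, pitch_degrees, dwell_seconds).
    """
    heatmap = defaultdict(float)
    for yaw, pitch, dwell_seconds in gaze_samples:
        cell = (int(yaw // cell_size_deg), int(pitch // cell_size_deg))
        heatmap[cell] += dwell_seconds
    return heatmap

# Hypothetical samples: two glances at roughly the same spot, one elsewhere.
samples = [(42.0, -5.0, 1.2), (44.5, -3.0, 2.8), (130.0, 10.0, 0.4)]
hot_spots = sorted(build_heatmap(samples).items(), key=lambda kv: kv[1], reverse=True)
print(hot_spots[0])  # the grid cell the user's gaze lingered on longest
```

An adaptive app could then feed the hottest cells back into its logic – for instance, steering the narrative towards whatever the user keeps looking at.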
The combination of VR and AI is only now becoming possible thanks to developments in AI, particularly machine learning, that enable real-time image and speech recognition; the increased availability and reduced cost of local processing and storage; and expanding network bandwidth, which allows richer data streams and makes AI available in the cloud.
AI and VR are moving to the cloud. The potential to run simulations faster-than-real-time in the cloud promises to unlock a more connected future.
Immersive Tech, AI And Retail
The combination of immersive tech and AI offers exciting possibilities for retail. Imagine walking into the Burberry flagship store on Regent Street and being approached by a holographic Iris Law, the face of the brand. You could ask virtual Iris questions and, using natural language processing, it would understand what you’re saying and respond as the real Iris might. Combining an AR interface with an in-store virtual assistant such as Macy’s On Call, powered by IBM Watson and Satisfi, is likely to be the next step for retail marketers.
The possibilities are endless and exciting.
But, no, AI will not take over the world… well, at least not yet.