For Artificial Intelligence To Go Mainstream It Has To Be Real


Remember the very early days of interactive voice response (IVR)? From its inception in the 1980s to the early 2000s, automated call systems went from decoding touch-tone signals to speech recognition and, eventually, a universal standard for voice dialog between humans and computers — a standard that allowed call centers across industries to scale the technology. However, as virtual assistants (such as Siri and Alexa) came into existence, customer expectations surpassed the level of service IVR could provide, and call centers have been forced to adapt.

Many call centers have transitioned to virtual assistants that can provide a better service experience than traditional IVR ever could. Although great strides have been made, and automated systems are arguably less frustrating than they used to be, have you ever really had a positive experience interacting with your cable, utility or mobile provider’s service bots?

Today’s overall use of artificial intelligence (AI) still falls within the non-sentient machine intelligence phase of AI’s full potential. And I would argue we are even in the infancy stage of this realm, having barely scratched the surface of what’s possible.

Since AI can refer to a breadth of technologies, it’s important to clarify that the points in this article, and its references to AI, are specific to chatbots.

I like to compare the state of today’s AI to that of the first automobile. When it comes to interacting with machines, we’ve managed to abandon the technological equivalents of the horse and buggy, but we’re lacking the standard parts, sophisticated highways and seamless user experience needed for AI to really go mainstream. For instance, even though today’s chatbots can be somewhat helpful ("Hey Alexa, tell me the weather"), you always know when you’re interacting with a bot. The experience is limited and disjointed: you can get from point A to point B, but no further and nothing more complex. There is no standard communication protocol, so bots from different providers can’t collaborate to provide a holistic experience for users. These constraints are holding AI back from achieving scale and becoming as integral to our lives as cars, highways and traffic laws.

The ultimate goal of chatbots is to delight customers with an experience so human-like that they don’t realize they’re not speaking to an actual person. Chatbots should offer users assistance that helps solve real problems and saves them time. For instance, it’s nice that I can ask Siri to call my parents, but wouldn’t it be great if I could ask Siri what to buy my mom for her birthday, then have the bot determine what she would want, relay the best option to me, and then handle purchase and delivery?

There are a lot of steps that need to be taken to get from where we are today to the human-like chatbots of tomorrow, including but not limited to:

1. Establishing A Standard Communication Platform

The lack of a common platform to communicate across bots is one of the biggest limitations holding back the virtual assistant user experience. Right now, you can’t ask Amazon’s Alexa to ask Google Home or Apple’s Siri to perform a task. If I can ask my friend Peter to ask our friend Chris to meet us for dinner, Siri should be able to ask Alexa to make a purchase on Amazon Prime or tell Google Home to turn down my Nest thermostat.
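To make the idea concrete, here is a minimal sketch of what a vendor-neutral request envelope for bot-to-bot delegation might look like. No such standard exists today; every field name and intent string below is a made-up assumption, not any vendor’s actual API.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical envelope for a cross-assistant request: one assistant
# delegates a task to another over a shared, vendor-neutral schema.
@dataclass
class BotRequest:
    source: str    # assistant issuing the request (illustrative name)
    target: str    # assistant expected to fulfill it
    intent: str    # standardized intent identifier
    payload: dict  # intent-specific parameters

def encode(request: BotRequest) -> str:
    """Serialize a request so any compliant assistant could parse it."""
    return json.dumps(asdict(request))

def decode(raw: str) -> BotRequest:
    """Reconstruct a request on the receiving assistant's side."""
    return BotRequest(**json.loads(raw))

# Example: a voice assistant asks a smart-home assistant to adjust a thermostat.
msg = BotRequest(source="assistant-a", target="assistant-b",
                 intent="thermostat.set", payload={"temperature_f": 68})
roundtrip = decode(encode(msg))
```

The point of the sketch is the shared schema, not the transport: once every assistant can produce and parse the same envelope, Siri asking Alexa to act becomes a routing problem rather than an integration project.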

2. Creating A Network Of Bots

Once a universal platform is established, the next step in simulating the complexity of the human experience is forming a network of bots that will work together to solve increasingly difficult problems. Bots will be able to collaborate and combine varying specialties to understand, anticipate and solve for any task the user requires of them.

Take, for instance, the example of my mom’s birthday gift. I would simply interact with my virtual assistant of choice and a chain of bots would take it from there. Perhaps one bot is trained to know my contacts and their relation to me. It passes that information to the bot that understands my mom’s buying habits, has access to her Amazon wish lists and can predict a desirable gift for her. That data is then passed to a bot that can find the best price for the item and automate purchase and delivery. The entire experience is automated, expedited and presented to me as though I am interacting with one very helpful personal assistant.
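The bot chain above can be sketched as a simple pipeline, with each specialty as its own function and a coordinator piping one bot’s output into the next. All of the bots, names and data here are invented for illustration; real implementations would call live services.

```python
# Illustrative sketch of the birthday-gift bot chain described above.

def contacts_bot(query: str) -> dict:
    """Resolves 'my mom' in the request to a known contact."""
    return {"name": "Mom", "relation": "mother"}

def gift_bot(contact: dict) -> dict:
    """Predicts a desirable gift from (hypothetical) wish-list data."""
    return {"recipient": contact["name"], "item": "gardening tool set"}

def purchase_bot(gift: dict) -> dict:
    """Finds a price and schedules purchase and delivery."""
    return {**gift, "price_usd": 39.99, "status": "ordered"}

def assistant(request: str) -> dict:
    # The coordinator: the user sees one helpful assistant, while a
    # chain of specialized bots does the work behind the scenes.
    result = request
    for bot in (contacts_bot, gift_bot, purchase_bot):
        result = bot(result)
    return result

order = assistant("buy my mom a birthday gift")
```

The design choice worth noting is that no single bot understands the whole task; the coordinator only needs each bot’s output to be a valid input for the next.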

3. Thinking Outside The Text App Box

Historically, apps have been designed for the basic visual experience. From the first website to today’s mobile app, design has been focused on the user’s visual interaction with simple text and graphics. AI is adding another dimension to user design with chatbot and voice technology. Just like companies have started to design for a mobile-first experience, the future of AI will also call for voice-first and bot-first design.
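One way to picture bot-first design is to model a response once and render it per channel, rather than designing for a screen and bolting voice on afterward. This is a minimal sketch of that separation; the data and renderers are assumptions for illustration.

```python
from dataclasses import dataclass

# Model the reply once, independent of how it will be presented.
@dataclass
class WeatherReply:
    city: str
    temp_f: int
    condition: str

def render_text(r: WeatherReply) -> str:
    # Compact, glanceable output for a screen.
    return f"{r.city}: {r.temp_f}°F, {r.condition}"

def render_voice(r: WeatherReply) -> str:
    # Conversational phrasing suited to a voice-first channel.
    return f"It's {r.temp_f} degrees and {r.condition.lower()} in {r.city}."

reply = WeatherReply(city="Seattle", temp_f=58, condition="Cloudy")
```

Treating the channel renderer as the last step, instead of the starting point, is the shift from visual-first to bot-first design.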

Although we are not yet living in a world where AI is a key part of people’s lives, eventually we will be. Tech giants such as Google are quickly paving the way. Google Assistant’s latest feature, Duplex, demonstrated the right mix of language, inflection and crutch words to successfully book restaurant reservations and hair appointments without the people on the other end of the line realizing they were speaking with a bot.

Further, the recent announcement of the Data Transfer Project between Google, Microsoft and others could be the first step toward the standard communication platform needed to truly scale AI.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.
