We’re in the midst of a fundamental shift in how people interact with computers. What started as an impressive novelty has become a trillion-dollar opportunity and has triggered an arms race among technology’s largest companies to be the go-to solution for voice search. The conversational user interface provides the kind of direct engagement with computers that we’ve dreamed of since the first episode of Star Trek more than 50 years ago.
Now, conversational user interface (UI) is more than just another notion of the future in popular science fiction. It is driving the slow disappearance and replacement of physical tools and applications — from the traditional graphical user interface to menu bars, trackpads, mobile device buttons and more. This raises the question: how will tools be used in a post-voice-computing environment, and how will data be received and processed as the technology advances?
As a thought leader in this space and the owner of two tech companies, one an enterprise software company in natural language processing and the other a big-data solutions provider, I have watched this gradual shift firsthand and seen the influence conversational interfaces now have on the industry. As a business owner, here’s what you need to know.
Conversational interface is more than just voice search on an Echo device or mobile phone. It’s about fundamentally overhauling and replacing the old way of doing things. Early communications, which we still use in most instances, required a syntax-specific series of commands to be entered into a computer interface. You had to click, type or tap on the items you wanted to activate and then enter the required sequence of data to get the computer to do what you wanted.
This is not how people think, though. Think back to what it was like to search on Google ten years ago. You couldn’t necessarily ask a question and get a response. You had to creatively combine the keywords that Google could effectively interpret and track as a query. Today’s search engine has been trained for more than 20 years on that search data — learning to recognize human speech patterns and the intent behind a question. The result is a more conversational interface. Today, Google says that 20% of searches are now voice-based, and Gartner estimates that by 2020, 30% of all web sessions will be done without a screen.
This is best seen in the wave of voice-interfaces and chatbots making their way into homes and mobile devices. Websites are using artificial intelligence-powered chatbots to replicate human conversations. Voice assistants are answering questions in the tone and syntax in which they are asked. Computers are responding to human speech with comparable human speech, and it’s changing not only how we use computers, but how we think of them.
How Conversational Interface Tools Are Used
There are several mass-market examples of conversational interface from the tech giants. Google’s Home devices, Amazon’s Echo line, Microsoft’s integration of Cortana into Windows, and Siri in iOS and macOS are all front-line conversational interfaces. Facebook M plays a similar role in social media as a powerful chatbot that is now being used for money transfers, meeting scheduling, ordering food and a wide range of customer service inquiries.
So, what is it about these voice assistants and conversational interfaces that has companies so intrigued and actively embedding them in their customer journey? There are several reasons, including:
• Platform agnostic: A conversational interface can be implemented across any device, with or without visual components.
• Natural language: Users can speak as they naturally would, and the voice data these assistants collect can be used to improve their performance over time.
• Reduced friction: Even on the most expensive, fastest smart devices on the market, the user must stop and input data to perform a query. Voice-activation is frictionless. Someone can check the weather while making lunches or check for traffic while putting on their coat. For a business, offering this kind of frictionless experience means being a more immediate part of someone’s life.
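To make the "platform agnostic" point concrete, here is a minimal sketch of how a single conversational handler can serve any channel, whether the input arrives as a voice transcript, a website chat message or an SMS. The intents, keywords and canned responses below are illustrative assumptions, not a production natural language processing pipeline.

```python
# Minimal sketch: one intent handler serving any channel (voice, chat, SMS).
# Intents, keywords and responses are illustrative assumptions only.

INTENTS = {
    "weather": ["weather", "rain", "forecast"],
    "traffic": ["traffic", "commute", "road"],
}

RESPONSES = {
    "weather": "Today looks clear with a high of 72.",
    "traffic": "Traffic on your usual route is light.",
    "fallback": "Sorry, I didn't catch that.",
}

def classify(utterance: str) -> str:
    """Map a free-form utterance to an intent via simple keyword matching."""
    words = utterance.lower().split()
    for intent, keywords in INTENTS.items():
        if any(keyword in words for keyword in keywords):
            return intent
    return "fallback"

def respond(utterance: str) -> str:
    """The same handler works regardless of how the utterance arrived."""
    return RESPONSES[classify(utterance)]

print(respond("Any rain expected today?"))
```

Because the handler only consumes text, the visual layer (or lack of one) is irrelevant: a smart speaker passes in a speech-to-text transcript, a website passes in typed chat, and both get the same answer.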
While we’re still in the early stages of adoption, these technologies are becoming an increasingly important part of marketing, customer success and sales strategies in the enterprise. From personalized messages via Facebook M to interactive voice experiences on one of the many voice assistants on the market, businesses are rushing to invest heavily in the next wave of UI.
Business owners should keep an eye on how other enterprises are leveraging this technology in order to stay ahead. For example, installing a chatbot directly on your business’s website can help answer customer questions quickly and efficiently while cutting down on staff hours and costs. No one wants to wait on hold to get simple questions answered, and no business owner wants to staff a huge customer service division for simple tasks. However, customer service representatives should be on hand to take over the conversation as needed.
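The chatbot-plus-human-backup pattern described above can be sketched in a few lines. The FAQ entries and the handoff signal here are hypothetical; a real deployment would route unhandled conversations into a live-agent queue.

```python
# Minimal sketch: an FAQ chatbot that escalates to a human representative
# when it has no answer. FAQ entries and handoff wiring are assumptions.

FAQ = {
    "what are your hours": "We're open 9am-5pm, Monday to Friday.",
    "where are you located": "123 Main Street, Springfield.",
    "how do i reset my password": "Use the 'Forgot password' link on the login page.",
}

def answer(question: str) -> tuple:
    """Return (reply, handled). handled=False signals a human handoff."""
    key = question.lower().strip(" ?!.")
    if key in FAQ:
        return FAQ[key], True
    # No match: hand the conversation to a live representative.
    return "Let me connect you with a representative.", False

reply, handled = answer("What are your hours?")
print(reply)
```

The bot absorbs the repetitive questions, while the `handled` flag preserves the escape hatch the paragraph calls for: a representative steps in the moment the bot is out of its depth.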
How Businesses Using Voice Are Evolving
In 2017, Canalys estimated that more than 33 million smart speakers were sold and that more than 50 million would be sold in 2018. If those estimates hold, tens of millions of these devices are in U.S. homes, and people use them for dozens of routine tasks, including reading the news, setting alarms and checking the time.
This is changing, though, as voice assistants become more adept at complicated queries and managing smart home systems. As the artificial intelligence becomes more sophisticated, it will be able to anticipate needs in new and complex ways. Some businesses are already experimenting with voice UI for customer returns, for example, and others make it easy to order products and services. You can request an Uber with Alexa or order lunch. Voice UI also allows businesses to reach an entirely new audience that is unable to use a traditional GUI on mobile devices.
The future of computing is conversational. As devices continue to penetrate homes, and as the AI powering them becomes more powerful, new opportunities will arise for businesses that shift their research and development into non-visual user interfaces.