Last week, NVIDIA showcased a massive improvement in conversational artificial intelligence, one with broad implications, both good and bad. In effect, it has dropped the time it takes a computer to understand and respond to the spoken word to milliseconds, which will allow AVR (automated voice response) solutions to become far more conversational and to emulate real people far better.
I was briefed last year on an IBM system based on Watson that, even with the delay, increased the close rate on computer-generated telephone sales by something like 3X. One monitored caller was so convinced the system was a person that he tried to flirt with it.
As I mentioned, this is both good and bad; let's talk about both this week.
A Huge Jump in AI AVR Value
A lot of companies and governments deploy AVR systems, to the annoyance of most of us. These systems have historically been slow, have had severe problems understanding speech, and are so tightly scripted that you generally must provide the response they expect to get the answer you need. The experience is typically just a bit more fun than a root canal, and even I, who started working with early versions of these systems in the 1980s, have often wanted to yell at them.
With AI, however, you can dramatically improve the system's ability to understand natural language and respond more intelligently, as the Watson example shows. But AI introduced latency which, much like the delay when talking to someone overseas on an analog line, creates a different kind of problem: the customer and the system talk over each other, which reduces close rates, customer satisfaction and the quality of the interaction.
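Why latency matters here can be made concrete with a simple budget calculation: a voice pipeline strings together speech recognition, language understanding and speech synthesis, and if the sum of those stage delays exceeds the short gap people expect between conversational turns, the customer starts talking before the system answers. The stage names and numbers below are illustrative assumptions, not measurements of NVIDIA's or anyone else's system:

```python
# Hypothetical per-stage latencies (milliseconds) for a voice-response
# pipeline; these figures are invented for illustration only.
PIPELINE_MS = {
    "speech_recognition": 120,
    "language_understanding": 250,
    "speech_synthesis": 150,
}

# Rough budget for a natural-feeling gap between conversational turns
# (an assumed target, on the order of a few hundred milliseconds).
TURN_BUDGET_MS = 200


def total_latency(stages):
    """Sum per-stage latencies to get end-to-end response time."""
    return sum(stages.values())


def feels_natural(stages, budget=TURN_BUDGET_MS):
    """True if the pipeline responds within the conversational budget."""
    return total_latency(stages) <= budget


if __name__ == "__main__":
    ms = total_latency(PIPELINE_MS)
    print(f"end-to-end: {ms} ms, natural: {feels_natural(PIPELINE_MS)}")
```

With these made-up numbers the pipeline takes 520 ms and blows the budget; cutting every stage to tens of milliseconds, as the showcased technology aims to do, is what keeps the exchange from feeling like an overseas analog call.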
NVIDIA has been aggressive about improving AI capability, and by effectively removing the latency, systems based on this technology should significantly outperform systems without it, because conversations will feel local rather than long-distance and customers will be more engaged.
The result, once deployed, should be a dramatic increase in the effective use of AI-driven AVR technology and an even more rapid replacement of call centers by technology. You may think this is bad for jobs, but call centers typically have incredibly high turnover, low employee morale and, as a result, uneven quality.
The other big advantage of an AI system using ML (machine learning) or especially DL (deep learning) is that it can know things about the customer that allow it to tailor the offer specifically to that customer. Not only are close rates higher, but customer annoyance is far lower, because the customer actually gets what he or she wants. In effect, more win-wins for both the firm and the customer.
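The tailoring idea above can be sketched in a few lines: score each available offer against what is known about the customer and lead with the best match. The profile, offers and scoring rule here are invented for illustration; a real deployment would use a trained ML model over actual customer history rather than hand-set weights:

```python
# Hypothetical customer-interest profile (0.0 = no interest, 1.0 = strong).
CUSTOMER = {"travel": 0.9, "electronics": 0.1, "insurance": 0.3}

# Hypothetical offers, each tagged with the interests it appeals to.
OFFERS = {
    "discount_flight": {"travel": 1.0},
    "new_phone": {"electronics": 1.0},
    "home_policy": {"insurance": 0.8, "travel": 0.2},
}


def score(profile, offer):
    """Weighted match between a customer's interests and an offer."""
    return sum(profile.get(k, 0.0) * w for k, w in offer.items())


def best_offer(profile, offers):
    """Pick the offer this customer is most likely to actually want."""
    return max(offers, key=lambda name: score(profile, offers[name]))


print(best_offer(CUSTOMER, OFFERS))  # the travel offer wins for this profile
```

The point is the win-win the article describes: the system opens with what the customer is likeliest to want, rather than working through a fixed script.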
Now for the Bad News
The bad news is that legitimate companies aren't the only ones that buy AVR solutions. Most of the scam systems are so bad that you can generally tell they are a scam right off the bat: the call from a phony "Microsoft service person," the one from a Social Security office that isn't, or the call from the (insert three-letter agency here) that is trying to scam you. But with this technology, scams can also become far more effective. We will need to be far more careful about scam calls, because they will increasingly sound real and be far more capable of fooling us.
Fortunately, the scammers seem to be running about two generations behind in terms of technology, so we have several years to get ready for this problem. But you may want to chat with your kids and older relatives about it before you find that most of their savings, and perhaps your own, have been turned into cash cards sent to a scammer. Remember that the only folks who ask for cash cards are scammers.
We already know that AI-driven AVRs significantly enhance the customer experience and increase close rates. With the technology NVIDIA showcased last week, that will improve further. Many call centers, which are unfortunately connected with what looks like employee abuse, will close and be replaced by these systems.
We do have to be aware that these systems can be used to scam us. Even if a caller seems to know you and sounds like a real person, he or she may not be one; we must raise our scam-alert senses to cover far more realistic callers.
Rob Enderle is a principal at Enderle Group. He is a nationally recognized analyst and a longtime contributor to QuinStreet publications and Pund-IT.