Call Centers, IVR Systems Lack Testing, Monitoring
Most companies do nothing or call manually to discover whether their customer service systems are delivering superior service, according to an Empirix-sponsored LinkedIn survey of more than 1,000 international technical professionals.
The survey found that 20 percent of companies adding new technology to their customer contact centers either wait for customers to complain, pray, or pay attention only if it's a major upgrade. The majority (62 percent) said they test upgrades manually by having employees randomly evaluate different aspects of performance. Only 18 percent use automated testing, which provides the most consistent and reliable intelligence.
Conducted in early summer, the survey asked managers and executives across multiple industries how they assure high-quality customer experiences in call centers as they enhance telephony, information systems and interactive voice response (IVR) systems.
The survey also covered contact center monitoring and included a detailed breakdown of respondents. Their responses indicated that few companies invest in technology to proactively identify problems with customer service systems, or to pre-empt them in the first place.
"The survey identified a huge disconnect between investing in customer service technology and knowing how to realize the maximum return on that investment," Tim Moynihan, vice president of marketing at Empirix, said in a statement. "Companies are committing significant sums to purchasing the hardware and software, but they're not viewing quality assurance as part of the equation. There seems to be a lot of talk about superior customer service, but this survey shows an inconsistent approach to testing and monitoring these systems."
The results were more encouraging for contact center monitoring, with 31 percent of companies investing in monitoring technology to keep their customer service systems running smoothly. Still, the largest percentage (45 percent) use manual methods. In addition, most companies (68 percent) never test the voice quality in their contact centers.
"Manual testing at least shows that the company is trying. But it's not a good use of people's time, and it's usually a one-shot deal. It doesn't provide a broad view of when and how problems might occur, especially when systems are subjected to a realistic number of users," Moynihan said. "Automated testing can validate the system thoroughly, from end to end, but few companies use it."
The report also noted that systems that drop calls, deliver the wrong information to customer service representatives, route calls incorrectly or provide poor voice quality force reps to spend more time resolving each issue, adding unnecessary cost to each interaction. Meanwhile, lack of predeployment testing and ongoing monitoring can reduce the returns on companies' investments in customer service technology.