AXA Financial Inc. dodged a big, bad bullet early this year when, soon after launching a new customer Web portal, the financial services company got a call from an irate customer who couldn't register on the site.
AXA officials traced the problem to incomplete customer data coming from one of the company's dozens of legacy systems. Information about 600 of AXA's best financial services customers had been entered into the portal's database minus street addresses. Without them, those customers also would be locked out when they tried to access their account information online.
Before anyone else was inconvenienced, AXA officials fixed the problem and launched a broader investigation. Finding other examples of incomplete or inaccurate data, AXA officials decided to head off future full-fledged online disasters by making formal, enterprisewide data standards a key part of AXA's CRM (customer relationship management) strategy.
“Our CRM applications could only be as good as our data,” said Jennifer Schuppert, director of strategic data technology at AXA Client Solutions, in New York. “We knew from the beginning we had to make data quality a part of [CRM] if we had any hope of success.”
After years of neglecting the problem or investing millions of dollars and hundreds of man-hours to manually clean up and rework data in a stopgap fashion, many companies, like AXA, are finally getting serious about data quality. And for many of them, CRM is the driver. Why? Because, unlike with earlier generations of back-office systems where inconsistent data merely drove up costs, in e-business, where you're dealing directly with customers, miscues resulting from faulty or incomplete data can drive away business in an instant, experts say.
It's not surprising, then, that data quality is starting to command the attention of top management instead of being passed off to IT. Many e-businesses today are putting board-approved data management and quality strategies in place, according to a global data management survey of 600 CIOs and IT directors conducted by PricewaterhouseCoopers.
“Data quality issues rank in the top 25 percent of inhibitors to successful CRM,” said John Benge, a partner in PricewaterhouseCoopers' data management practice, based in Florham Park, N.J. “So companies are making data quality a boardroom issue these days. Three years ago, it was, but only for a very small percentage.”
But just because companies have recognized the problem doesn't mean there's a quick and easy fix. For established companies with lots of legacy systems, the task of implementing a comprehensive data quality regime can be herculean, taking upward of 18 months and requiring significant investments in new packaged or home-grown data quality tools. It can also require considerable time spent garnering consensus on and then instituting new, cross-enterprise business processes and data standards.
The latter issue, experts say, can be the most daunting. The difficulty lies in achieving consistency in how customer information is rendered in different source systems across the various pieces of a business. That process alone entails targeting the systems containing key customer information, auditing them to see what kind of information is stored and in what format, then creating common business processes across a company to ensure that the same customer data is stored in the same fashion, regardless of the system.
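To make that concrete, here is a minimal sketch, assuming two invented source systems and field names (not any company's actual schema), of what mapping records into a single agreed-upon customer format looks like:

    # A minimal sketch of mapping customer records from two hypothetical source
    # systems into one common format. Field names and layouts are invented.

    CANONICAL_FIELDS = ["customer_id", "full_name", "street_address", "city", "postal_code"]

    # Each source system stores the same facts under different field names.
    SOURCE_MAPPINGS = {
        "policy_system": {"cust_no": "customer_id", "name": "full_name",
                          "addr1": "street_address", "city": "city", "zip": "postal_code"},
        "fund_system": {"account_id": "customer_id", "holder_name": "full_name",
                        "street": "street_address", "city_name": "city", "postcode": "postal_code"},
    }

    def to_canonical(record, source):
        """Rename a source record's fields to the canonical schema and flag gaps."""
        mapping = SOURCE_MAPPINGS[source]
        canonical = {target: record.get(source_field, "").strip()
                     for source_field, target in mapping.items()}
        # Anything required but empty gets reported instead of silently loaded.
        missing = [f for f in CANONICAL_FIELDS if not canonical.get(f)]
        return canonical, missing

    record, missing = to_canonical({"cust_no": "1001", "name": "Jon Smith",
                                    "addr1": "", "city": "New York", "zip": "10001"},
                                   "policy_system")
    print(missing)  # ['street_address'] -- the kind of gap that locked AXA customers out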
At Lucent Technologies Inc., although the mandate to refocus the company's organizational structure and marketing thrust on customers came directly from CEO Henry Schacht, cultivating support for common customer information data standards among top executives across different business functions such as manufacturing and sales was the tricky part. Lucent officials wanted a single, common customer master file that could be used by all lines of business. But to get there, all of Lucent's businesses had to cooperate.
“This requires a significant change in behavior,” said Bruce Ellington, director of information services for the Murray Hill, N.J., company, which delivers communications technology and services to service providers for use in their networking products. “New disciplines must be adopted, and there has to be agreement on a common approach.”
To hammer out a companywide commitment to common data standards, Lucent recently created the Process Management Architecture group, a steering committee made up of six executive-level representatives from groups such as sales and marketing and supply chain. The group, which meets monthly or bimonthly, decides on everything from creating customer coding standards to establishing a unified set of processes and policies for managing end-to-end customer relationships.
Six months into what Lucent officials fully expect will be an 18-month process, the company is making headway on the customer master file. The idea is to make information from any customer interaction—whether through sales reps, the call center or the Web—available in a common format so it can be fed, with full integrity, into new analytical or other customer-focused systems, Ellington said. From there, data mart technology can be applied to analyze macro trends such as customer retention or even specific customer buying patterns. The CRM initiative employs sales automation software from Siebel Systems Inc., enterprise resource planning applications from Oracle Corp. and Teradata database technology from NCR Corp.
AXA is also in the thick of planning a standardized customer file. Because of the Web portal data near-disaster, Schuppert's group didn't have to cajole the company's business units to support a data quality project. Schuppert said the project was launched from the business side by a strong sponsor in customer service.
Like most companies pursuing a customer-centric course, AXA's objective is to have a 360-degree view of all interactions with its clients across different businesses, product lines and touch points—for instance, call center interactions or Web-based transactions. But simple inaccuracies such as misspelled names or inadequate address fields detract from its ability to achieve that total picture.
For example, if an AXA mutual fund customer is stored in one customer database as John B. Smith but also holds a life insurance policy with The Equitable Life Assurance Society (another AXA business), where he is known as Jon Smith, it's not certain that AXA will recognize the two as the same individual. Schuppert's group is banking that the standardized customer file will address these concerns.
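The matching problem can be illustrated with a toy example; commercial matching engines are far more sophisticated, but the idea, under a crude similarity rule, looks something like this:

    from difflib import SequenceMatcher

    def normalize(name):
        """Lowercase, drop punctuation and middle initials, for comparison only."""
        parts = [p.strip(".").lower() for p in name.split()]
        parts = [p for p in parts if len(p) > 1]  # drop middle initials such as "B."
        return " ".join(parts)

    def likely_same_customer(name_a, name_b, threshold=0.85):
        """Return True if two rendered names probably refer to one person."""
        return SequenceMatcher(None, normalize(name_a), normalize(name_b)).ratio() >= threshold

    print(likely_same_customer("John B. Smith", "Jon Smith"))  # True under this crude rule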
With the right stakeholders engaged, AXA has moved on to the next stage of data quality management: performing a formal analysis or audit of existing legacy customer systems to get an idea of what customer data is out there, how consistent or complete it is, and what should be integrated into the central customer file. To do that, the company is using an automated data-profiling tool from Evoke Software Corp., of San Francisco. The tool analyzes the databases, outlining their content, structure and quality and producing a normalized data model.
“We have to go through all the customer systems and understand the data in there before we can consolidate them into a new customer information file,” Schuppert said.
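By way of illustration only (Evoke's product does far more), a toy profile of the kind such an audit produces, counting how complete and how consistent each column is:

    # A toy per-column data profile; the records and fields are invented.

    def profile(rows):
        """For each column, report how many rows are populated and how many distinct values exist."""
        report = {}
        for col in rows[0].keys():
            values = [r[col] for r in rows if r.get(col) not in (None, "")]
            report[col] = {"populated": len(values), "total": len(rows),
                           "distinct": len(set(values))}
        return report

    customers = [
        {"name": "John B. Smith", "street": "10 Main St", "state": "NY"},
        {"name": "Jon Smith", "street": "", "state": "ny"},
        {"name": "A. Jones", "street": "22 Oak Ave", "state": "N.Y."},
    ]
    for col, stats in profile(customers).items():
        print(col, stats)
    # 'street' shows 2 of 3 populated; 'state' shows three spellings of one value --
    # exactly the completeness and consistency problems an audit is meant to surface.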
Experts caution, however, that there's work to be done even before the actual data audit. One of the biggest mistakes companies make with CRM is trying to cleanse and integrate all customer information, regardless of relevance, into a master customer file. “Unnecessary use of data is one of the biggest black holes companies throw money into with CRM,” said Barton Goldenberg, president of IMS Inc., a CRM consultancy in Bethesda, Md. “The question most companies don't ask, and should, is where does customer data reside; then figure out which data is relevant and how to pull it together in a clean way.”
Once the appropriate data sources have been identified, cleansing efforts can begin. Companies can either assign staff to manually pore through databases and make changes or, more likely, enlist one of the automated tools from companies such as Vality Technology Inc., Firstlogic Inc., Trillium Software Inc. and Innovative Systems Inc.
These tools perform a variety of functions, including automatically correcting and filling in missing data, eliminating duplications, and identifying and matching households and other relationships. Many of these tools have been recently upgraded, adding real-time capabilities so they can check and fix data even as it flows over the Web to and from customers.
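Householding, one of those functions, can be sketched with a crude, illustrative rule (not any vendor's actual algorithm): group records that appear to share an address.

    from collections import defaultdict

    def household_key(record):
        """Crude key: normalized street address plus postal code. Illustrative only."""
        street = " ".join(record["street"].lower().replace(".", "").split())
        return (street, record["postal_code"])

    def group_households(records):
        """Group customer records that appear to live at the same address."""
        households = defaultdict(list)
        for rec in records:
            households[household_key(rec)].append(rec["name"])
        return dict(households)

    records = [
        {"name": "Jon Smith", "street": "10 Main St.", "postal_code": "10001"},
        {"name": "Jane Smith", "street": "10  Main St", "postal_code": "10001"},
        {"name": "A. Jones", "street": "22 Oak Ave", "postal_code": "10002"},
    ]
    print(group_households(records))
    # The two Smiths collapse into one household despite the punctuation and spacing differences.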
Royal Bank of Canada, in Toronto, Canada's largest bank, has enlisted Vality's Integrity data re-engineering product, among other tools and practices, to ensure it gets a consolidated and consistent view of its customers.
Vality is used to match customer information within the various data sources running the bank's 11 lines of business. With a complete and accurate view of customers' relationships with it, the bank can run queries to determine, for example, who between the ages of 35 and 45 bought mortgages in Ontario over the last six months to help it market additional services such as equity lines or home improvement loans.
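Once records are linked and consistently coded, that kind of segmentation query becomes straightforward; a sketch over invented data:

    from datetime import date, timedelta

    # Invented records standing in for the consolidated customer view; the field
    # names and dates are illustrative only.
    customers = [
        {"name": "R. Tremblay", "age": 41, "province": "ON",
         "product": "mortgage", "opened": date(2000, 11, 3)},
        {"name": "S. Chan", "age": 29, "province": "ON",
         "product": "mortgage", "opened": date(2000, 9, 18)},
        {"name": "M. Roy", "age": 38, "province": "QC",
         "product": "mortgage", "opened": date(2000, 12, 1)},
    ]

    as_of = date(2001, 3, 1)
    window_start = as_of - timedelta(days=182)   # roughly the last six months

    prospects = [c for c in customers
                 if c["product"] == "mortgage"
                 and c["province"] == "ON"
                 and 35 <= c["age"] <= 45
                 and c["opened"] >= window_start]

    for p in prospects:
        print(p["name"])   # candidates for an equity-line or home-improvement offer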
“With the move to customer segmentation and CRM, this issue of standardization of data or linking data to know who is doing what at all touch points of the bank becomes critical,” said Mohammad Rifaie, senior manager of information resource management and data warehousing for Royal Bank.
But it's not just click-and-mortar companies that need to worry about data quality for CRM. Although pure Web-based businesses typically don't have data in multiple legacy systems to clean up and make consistent, they do need to manage data from multiple sources, and they do need to worry about inconsistent data from Web-based customers.
Take SciQuest.com Inc., a business-to-business marketplace for the life sciences industry, which has been around for less than four years. The marketplace has to give customers access to different product catalogs that may represent similar products in different ways. It also has to account for customers' sometimes-creative spelling as they search for products. To handle all that, the Research Triangle Park, N.C., company last quarter began using Vality's Beacon search and matching engine (formerly called eSearch), which, on the fly, corrects data inconsistencies.
“Our catalog of scientific products consists of complicated words and terms, and customers frequently spelled something a particular way and couldn't find a product,” said Rob Fusillo, SciQuest's CIO. “When analyzing our click-stream data, we became aware that users were searching for stuff we had in the catalog but weren't completing a sale because they couldn't find it.”
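The concept behind such an engine, though not Beacon's actual workings, can be shown with a simple fuzzy lookup over a hypothetical catalog:

    from difflib import get_close_matches

    # A simplified, hypothetical catalog lookup that tolerates misspelled queries.
    CATALOG = ["acetonitrile", "centrifuge tubes", "erlenmeyer flask",
               "pipette tips", "polymerase"]

    def search(query, catalog=CATALOG):
        """Return catalog entries close to the query, even if it is misspelled."""
        return get_close_matches(query.lower().strip(), catalog, n=3, cutoff=0.6)

    print(search("erlenmyer flask"))   # ['erlenmeyer flask']
    print(search("polymarase"))        # ['polymerase']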
Similarly, computer and electronics e-tailer Outpost.com has been forced to build a detailed data architecture that cleans and organizes the customer and click-stream data from 100,000 daily visits that the company uses to analyze purchasing trends and decide on site design issues. Cleansing is critically important, Outpost officials said, because much of the customer data is entered online by consumers themselves. Therefore, spelling and formatting errors are common.
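A rough sketch of the kind of normalization rules such a pipeline applies to self-entered data (the rules and fields here are invented for illustration, not Outpost's own):

    import re

    def clean_customer_record(raw):
        """Apply a few of the kinds of normalization rules a cleansing pipeline uses.
        The specific rules here are invented for illustration."""
        cleaned = {}
        cleaned["name"] = " ".join(raw.get("name", "").split()).title()
        cleaned["email"] = raw.get("email", "").strip().lower()
        # Keep only the digits of a U.S. ZIP code, then truncate to five characters.
        zip_digits = re.sub(r"\D", "", raw.get("zip", ""))
        cleaned["zip"] = zip_digits[:5] if zip_digits else ""
        return cleaned

    print(clean_customer_record({"name": "  jane   DOE ",
                                 "email": "Jane@Example.COM ",
                                 "zip": "06757-1234"}))
    # {'name': 'Jane Doe', 'email': 'jane@example.com', 'zip': '06757'}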
“The data we collect is the majority of what we use to drive and guide what we do on a daily basis,” said Raymond Karrenbauer, chief technology officer of the Kent, Conn., company, which was acquired last month by PC Connection Inc., a direct marketer of PC products in Merrimack, N.H. “Absolute quality is the most important element,” Karrenbauer added.
So important, in fact, that Outpost decided to forgo off-the-shelf cleansing tools in favor of building its own data quality environment. The reason: Outpost was concerned that commercial tools couldn't keep up with the growing flow of customer and click-stream data. It took a four-person team three months to build the new architecture and even longer to introduce data cleansing techniques and data quality practices to the organization.
Now that the system has been deployed for eight months, it's paying off by helping Outpost refine site design and business processes. For example, Outpost recently removed a pop-up help screen that it had added to its checkout process when data clearly showed customers were not responding.
Because that customer data has been cleaned and checked for consistency, Outpost shouldn't have to worry about dodging any bullets. And, in the current dicey dot-com environment, that's a very good thing.
Said Karrenbauer: “As a pure-play e-tailer, we don't have the luxury of screwing up.”