Usability testing, tools help prevent site flaws, reveal secrets to Web success.
Shoppers at Staples Inc.'s retail Web site, Staples.com, were clearly annoyed. More and more of them early last year began abandoning the site's registration process midstream, leaving behind online shopping carts of unpurchased office supplies and concerned Staples officials. Something, clearly, had to be done.
While other companies facing a similar challenge might have rushed to the nearest Web design guru for a site overhaul using the latest design trends, Staples chose a different approach. The company went directly to its customers to expose problems with the site that developers had originally missed. Members of the company's usability team combined their own expert review of the site with formal usability tests involving consumers to pinpoint the roadblocks in the site's registration section.
After Staples revamped the section in April last year based on the testing results, the number of shoppers dropping out of the registration process decreased by 53 percent, said Colin Hynes, Staples' director of usability, in Framingham, Mass. The office supplies and equipment retailer has been following a similar formula, and seeing similar results, since building an internal usability group two and a half years ago to improve the user-friendliness of all its Web sites and applications.
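To see what a 53 percent cut in registration dropouts can mean in completed sign-ups, consider a quick back-of-the-envelope sketch. The 53 percent figure comes from Hynes; the traffic volume and baseline abandonment rate below are purely hypothetical, not Staples' actual numbers:

```python
# Illustrative arithmetic only: the 53% reduction is from the article,
# but the traffic volume and baseline abandonment rate are hypothetical.
visitors_starting_registration = 10_000
baseline_abandon_rate = 0.40                                # hypothetical pre-redesign rate
improved_abandon_rate = baseline_abandon_rate * (1 - 0.53)  # 53% fewer dropouts

baseline_completed = visitors_starting_registration * (1 - baseline_abandon_rate)
improved_completed = visitors_starting_registration * (1 - improved_abandon_rate)

print(f"Abandonment: {baseline_abandon_rate:.0%} -> {improved_abandon_rate:.1%}")
print(f"Completed registrations: {baseline_completed:.0f} -> {improved_completed:.0f}")
```

Under these assumed figures, the same redesign turns roughly 6,000 completed registrations into about 8,100, which is why even a single fixed bottleneck can move revenue noticeably.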
"It's really about the customer being a co-developer of our sites," Hynes said. "It's about being close to customers and not assuming that we're so smart that we can predict what customers will say and do but backing it up with solid scientific research."
As online shopping matures into a major retail channel and as Web applications proliferate inside enterprises and among business partners, users of online systems are becoming less and less tolerant of poorly designed sites. As a result, experts say, enterprises that want to cash in on the second coming of e-business need to take Web site usability testing as seriously as Staples does. That means making usability testing an ongoing process, not something ignored until the end of a site's development. Fortunately, a new generation of usability testing services and tools that can automate part of the job of getting into users' heads and spotting common site design defects is springing up to help. Many of those services aren't cheap, and they aren't always quick. Usability testing by an outside lab on a complex site can take weeks, for example, and cost upward of $30,000.
Still, say experts, continual usability testing can pay off big by helping companies avoid common design problems that turn away customers. Problems such as broken site links, hidden information and poor search results are top reasons why visitors abandon Web sites (see chart, Page 48). Improving usability can help boost revenue by making the process of finding products and purchasing them easier. It can also cut support costs by decreasing help desk calls and e-mails and can improve overall customer satisfaction, experts say.
"If you don't do usability testing before you launch, I promise you'll be doing usability testing after you launch [and] in a public forum, and it's painful," said Joel Wecksell, an analyst at Gartner Inc., in Stamford, Conn.
Usability test labs such as Human Factors International Inc., of Fairfield, Iowa; Austin Usability Inc., of Austin, Texas; and Optavia Corp., of Madison, Wis., focus on having individual users conduct specific tasks, such as buying an item on an e-commerce site or finding a piece of information on a content site. Testers measure such factors as users' success rates in completing a task. They also allow site developers and design experts to watch users, often through one-way mirrors or videotapes of sessions, and gain insight into the sections of a site that seem to frustrate users most.
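Because lab sessions involve only a dozen or so users, a raw success rate can overstate how precisely a problem has been measured. The article doesn't describe any particular statistical treatment, but one common way to summarize a small-sample task-success rate is a Wilson score interval; the session figures below (9 of 12 users completing a task) are hypothetical:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96):
    """Approximate 95% Wilson score interval for a task-success proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Hypothetical lab session: 9 of 12 users completed the checkout task.
low, high = wilson_interval(9, 12)
print(f"Success rate: {9/12:.0%}, 95% interval: {low:.0%} to {high:.0%}")
```

With 12 users, a 75 percent observed success rate is still consistent with anything from roughly half the users to nearly all of them succeeding, which is why labs pair the numbers with direct observation rather than relying on the rate alone.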
Such testing techniques work best during the development of a new or redesigned site and should be part of a mixture of tests used at various stages of development, experts say (see story, Page 46).
"Usability testing is only one way to talk to customers," said Randy Souza, an analyst at Forrester Research Inc., in Boston. "You should start communicating with customers before application development work is done."
KPMG Consulting Inc. has followed that advice in creating a statewide government portal, Texas Online, for the state of Texas. The result has been happier users and fewer help desk calls. Texas Online launched in August 2000 and today includes information and applications such as online license renewals from 20 state agencies and the cities of Dallas and Houston. No major application is added to the site without first undergoing multiple approaches to getting user feedback, said Gary Miglicco, a managing director of KPMG, in Austin.
The first step is to hold focus group meetings to understand user needs and try out paper mock-ups that describe the general organization and structure of a new section of the site. Once a live prototype is available, the consulting company gathers users for formal one-on-one usability lab testing, often working with outside labs such as Austin Usability.
KPMG followed this approach for the development of a driver's license renewal application, conducting four weeks of testing before its May 2001 launch. Changes made in layout and graphics during the course of the tests helped cut calls to the portal's help desk by two-thirds, Miglicco said.
Traditional lab testing isn't the only or always the best option for understanding customers' preferences. Travelocity.com Inc., of Fort Worth, Texas, uses a combination of outsourced regional usability labs as well as a hosted online testing software tool from Vividence Corp. to test changes to its travel site. Each approach has its benefits and pitfalls. Traditional lab tests provide visual feedback, such as facial expressions, and are most helpful in pinpointing problems in large-scale site redesigns. Lab-based tests, however, are limited to about a dozen users and can take a week or more to complete, said Elizabeth Cole, vice president of customer experience at Travelocity.com.
Online testing, in which users' responses to design changes are recorded remotely, on the other hand, allows for large sample sizes of hundreds of users and can be completed within a day or two. Online tests, however, provide no visual feedback and are best for judging specific sections or design issues on a site. Online testing is generally available on an annual subscription basis and, say vendors, ranges in cost from $30,000 to $500,000 per year, depending on the number and size of online tests conducted. An average customer, according to officials at Vividence, in San Mateo, Calif., spends about $200,000 per year for between five and 10 tests.
A single usability lab test typically costs between $15,000 and $30,000 with 12 to 15 users, according to Forrester.
When Travelocity.com last summer decided to redesign its home page, Cole used Vividences online testing tool because it would provide the company with a larger sample of customer opinions on the most critical page on the site. Travelocity.com started in August by asking 200 consumers which of three home page designs they preferred. The results were overwhelming, with users picking one design over others by a 2-1 margin.
With a basic design set, Cole in November engaged a different group of 200 customers for more detailed tests, again using online testing. She compared the new home page with the old one and with competitors' sites. One part of the test drilled down into whether customers would accept a major change in the quick travel search feature on the home page. The search area contains six tabs representing different forms of travel services, such as airlines, cars and hotels. Travelocity.com tested a feature that, once a user clicks on a tab, not only performs the requested search but also changes the content of the entire home page to reflect the search category selected.
Fifty-seven percent of those tested preferred this kind of home page personalization. As a result of the favorable feedback, Travelocity.com was planning to add the dynamic content to its home page by the end of this week, Cole said.
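The article doesn't say how Travelocity.com judged that result, but a simple normal-approximation test, a sketch here rather than the company's actual method, shows why a 57 percent split among 200 respondents is a real but borderline signal against an even 50-50 preference:

```python
import math

n = 200            # respondents in the online test (from the article)
p_observed = 0.57  # share preferring the dynamic home page (from the article)
p_null = 0.5       # assumed no-preference baseline

# One-sample z statistic for a proportion, normal approximation
z = (p_observed - p_null) / math.sqrt(p_null * (1 - p_null) / n)
print(f"z = {z:.2f}")
```

The statistic lands at about 1.98, just past the conventional 1.96 threshold for 95 percent confidence, which illustrates one advantage of online testing the article notes: large samples let relatively small preference differences rise above noise.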
While usability testing is crucial during Web development, user opinion shouldn't be forgotten once a new site or redesigned feature launches. Only by monitoring behavior and analyzing user feedback can companies continually make usability changes that can significantly improve site performance, experts say.
Larger enterprises with lots of sites to monitor can save money on that kind of continual testing by creating an in-house usability group. That's exactly what Staples did. Its seven-person usability group includes a combination of information architects who are experts in developing and reviewing a site's structure, usability engineers who develop user tests and analyze behavior, and a recruiter who finds participants for user studies.
The group supports various Staples departments that sponsor new site developments or redesigns, including marketing, merchandising and IS, and then works closely with Web designers and content writers to make the changes. Besides the main Staples.com site, the group tests and tweaks designs on the company's Canadian site, BusinessDepot.com, and subsidiary Quill Corp.'s Quill.com business-to-business office supply site.
The Staples group does more than just run lab tests. Among other things, it conducts field tests, observing users in their work environments. The group also uses an in-house-developed online survey tool to get customer feedback once site changes are live.
Staples decided to move usability testing competency in-house because of the volume of testing it does. In the 20 working days of last month, for instance, the usability group conducted formal lab tests on 13 days, Hynes said.
"What you gain first is cost, especially in the world of usability [where] the consultants can be rather pricey," Hynes said. "To be able to have those people in-house, to be able to live and work in [the] environment and to really understand where people are coming from makes them a lot more valuable."
For Staples, Hynes estimates that using internal usability testing capabilities costs at least 25 percent less than outsourcing. Setting up an internal lab costs about $25,000, not including annual salaries for staff, said Matthew Berk, an analyst at Jupiter Media Metrix Inc., in New York.
"They are not alternatives but are things you [can] do alongside usability testing," Gartner's Wecksell said.
By using WebCriteria, retailer Camping World Inc. was able to pinpoint problems in the checkout process on its online shopping site Campingworld.com. In November last year, Camping World shifted its checkout process so that shipping costs were calculated upfront and not near the end of the process. In the first three weeks after the change, the site boosted sales by $60,000 and has seen an overall increase in the percentage of people who complete a purchase, said David Scifres, vice president of Internet services at parent company Affinity Group Inc., based in Ventura, Calif.
Even though most companies have focused their usability efforts on consumer and e-commerce sites, internal applications shouldn't be ignored. Poorly designed user interfaces are a major reason why self-service human resources applications such as benefits sign-ups on intranets are struggling to yield cost savings, according to a recent study by Towers Perrin, a management and HR consultancy in New York.
Staples, for one, is beginning to realize the importance of usability across the board, not just on consumer Web sites. Hynes said he expects his usability group to focus more on internal applications, such as intranets, this year.
"We're basically the usability team for the company," Hynes said. "Whether the user is somebody in corporate finance or whether it's somebody using the Internet, [we want to be] providing a great experience for them as well and making them more productive."
As an online reporter for eWEEK.com, Matt Hicks covers the fast-changing developments in Internet technologies. His coverage includes the growing field of Web conferencing software and services. With eight years as a business and technology journalist, Matt has gained insight into the market strategies of IT vendors as well as the needs of enterprise IT managers. He joined Ziff Davis in 1999 as a staff writer for the former Strategies section of eWEEK, where he wrote in-depth features about corporate strategies for e-business and enterprise software. In 2002, he moved to the News department at the magazine as a senior writer specializing in coverage of database software and enterprise networking. Later that year Matt started a yearlong fellowship in Washington, D.C., after being awarded an American Political Science Association Congressional Fellowship for Journalists. As a fellow, he spent nine months working on policy issues, including technology policy, for a Member of the U.S. House of Representatives. He rejoined Ziff Davis in August 2003 as a reporter dedicated to online coverage for eWEEK.com. Along with Web conferencing, he follows search engines, Web browsers, speech technology and the Internet domain-naming system.