Users Want Freedom to Be Wrong

By Peter Coffee  |  Posted 2004-10-25

System designers should understand that users won't let dumb machines decide what risks to take for them.

People resent involuntary risks. In studies of the unconscious trade-off that's made between acceptable risk and reward, it appears that an imposed risk seems roughly a thousand times worse than the same risk assumed by choice. If information systems designers don't know this, or don't care, they will build systems that people dislike and that don't yield expected returns.

For example, a system might be built with a safety feature that prevents a user from taking some action if a dangerous condition exists. If an accident would cost, on average, $100,000, but an automatic delay costs only $1,000, it would be logical to set that feature's threshold at a risk of about 1 percent.
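A quick way to check that figure: the delay pays for itself whenever the risk multiplied by the cost of an accident exceeds the cost of the delay. Here is a minimal sketch of that break-even arithmetic, in Python (the dollar figures are the illustrative ones above; the function name is mine):

    # Trip the safety feature when expected accident cost exceeds delay cost,
    # i.e., when risk * accident_cost > delay_cost.
    def breakeven_risk(accident_cost, delay_cost):
        """Risk level at which an automatic delay pays for itself."""
        return delay_cost / accident_cost

    print(breakeven_risk(accident_cost=100000, delay_cost=1000))  # 0.01, i.e., 1 percent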

People are more complicated. The risk that a user's work will be delayed, perhaps with financial consequence, perhaps with mere inconvenience, is a risk that's being imposed by the machine. The risk that a user might take by overriding that safety feature, perhaps by using a special "test bench" input code that somehow manages to spread through word of mouth, is voluntary, and a thousand times more tolerable.

Users might readily, if unconsciously, assume a 10 percent risk or even more, even though the long-term result is undesirable, possibly even fatal, to themselves and others.
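To put rough numbers on that asymmetry, here is a back-of-the-envelope sketch using the thousand-to-one weighting cited above (the weighting function itself is purely illustrative):

    # Subjective size of a risk, per the rough thousand-to-one factor above.
    IMPOSED_MULTIPLIER = 1000

    def felt_risk(actual_risk, voluntary):
        """How large a risk feels, depending on whether it was chosen."""
        return actual_risk if voluntary else actual_risk * IMPOSED_MULTIPLIER

    print(felt_risk(0.10, voluntary=True))   # 0.1: a 10 percent risk, chosen freely
    print(felt_risk(0.01, voluntary=False))  # 10.0: a 1 percent risk, imposed

By that accounting, a 10 percent risk taken by choice feels two orders of magnitude smaller than a 1 percent risk imposed by the machine, which is why the override code spreads.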

Examples abound: People try to sneak across railroad crossings, they disable warning alarms, they ignore needed maintenance on their cars. My point here is that people's use of information systems is becoming more imposed by the environment and less assumed by choice. People don't sit down at a terminal and voluntarily engage with a system but, rather, live in an information services environment that has no visible exit signs. It therefore seems likely that we'll see many more such examples of irrational but predictable behavior in the realm of enterprise IT.

One obvious example is the reaction to automatic software updates. If I look over a list of available updates, such as the one that's offered to me by Microsoft's Windows Update site or the Software Update interface of Apple's Macintosh OS X, I get to participate in the choice of which downloads I'll accept. I can make my own decision about how many places I'll check for information on possible bugs in a service pack; I can make my own risk analysis in deciding whether I'll do an update on the day that it's released or wait a few days to see if problem reports arise.

If updates are presented as necessary, or even made automatic and mandatory, the time that's required and the consumption of network bandwidth will be seen as imposed rather than voluntary costs. The delay of a user's work that's required to do system restarts is imposed. The chance that the updated system won't work as expected, or might not work at all, is subconsciously multiplied by a thousand in the mental estimate of nuisance suffered.

The predictable result, over time, is that systems will be less secure and more likely to make avoidable errors.

It's one thing for this kind of uncertainty to afflict an open-architecture device such as a Windows PC. People understand that it's impossible to test every plausible combination of hardware, application and user behavior. The risk comes with their freedom of choice, and feels almost voluntary.

It's harder to forgive such problems in an update that comes from Apple, which has much more control of its technology stack, but not total control, as shown by the data-loss problems on some FireWire drives after the update to OS X 10.3.

What's appalling is seeing these problems in the most contained hardware domains: for example, this month's firmware 1.0.4 update for Canon's EOS 20D camera. Heaven help the user who installed it with a lens actually attached to the camera at the time. Among the possible resulting malfunctions was the inability to turn on the camera. Oops. Try to get those users to make future updates in a timely manner.

There are only two ways to get an IT user to follow the One Right Path: Make it an easy choice, or make it a low-risk requirement. Choose wisely.


Peter Coffee is Director of Platform Research at, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.
