System designers should understand that users won't let dumb machines decide what risks to take for them.
People resent involuntary risks. In studies of the unconscious trade-off that's made between acceptable risk and reward, an imposed risk appears roughly a thousand times worse than the same risk assumed by choice. If information systems designers don't know this, or don't care, they will build systems that people dislike and that don't yield the expected returns.
For example, a system might be built with a safety feature that prevents a user from taking some action when a dangerous condition exists. If an accident would cost, on average, $100,000, but an automatic delay costs only $1,000, it would be logical to set that feature's threshold at a risk of about 1 percent.
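The break-even arithmetic behind that threshold can be sketched in a few lines of Python; the dollar figures are the column's illustrative numbers, not real data.

```python
# A minimal sketch of the break-even arithmetic above, using the
# column's illustrative figures.
accident_cost = 100_000  # average cost if the accident occurs
delay_cost = 1_000       # cost of one automatic safety delay

# An automatic delay pays for itself whenever the expected accident
# cost exceeds the delay cost: p * accident_cost > delay_cost.
break_even_risk = delay_cost / accident_cost
print(f"Break-even risk threshold: {break_even_risk:.0%}")
```

Below a 1 percent chance of the accident, the delay costs more than the risk it averts; above it, the delay is the rational choice.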
People are more complicated. The risk that a user's work will be delayed, perhaps with financial consequence or even with mere inconvenience, is a risk that's being imposed by the machine. The risk that a user might take by overriding that safety feature, perhaps by using a special "test bench" input code that somehow manages to spread through word of mouth, is voluntary and a thousand times more tolerable.
Users might readily, if unconsciously, assume a 10 percent risk or even more, even though the long-term result is undesirable, possibly even fatal, to themselves and others.
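The asymmetry in those numbers can be made concrete with a rough sketch, assuming a flat thousand-fold penalty on risks that feel imposed rather than chosen.

```python
# A rough sketch of the imposed-vs-voluntary heuristic described
# above, assuming a flat 1,000x penalty on imposed risks.
IMPOSED_PENALTY = 1_000  # the column's "roughly a thousand times worse"

def perceived_risk(actual_risk: float, imposed: bool) -> float:
    """How bad a given risk feels to the user, per the heuristic."""
    return actual_risk * (IMPOSED_PENALTY if imposed else 1)

# A machine-imposed 1 percent risk feels far worse than a 10 percent
# risk the same user takes on voluntarily.
print(perceived_risk(0.01, imposed=True))    # imposed 1% risk
print(perceived_risk(0.10, imposed=False))   # voluntary 10% risk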
Examples abound: People try to sneak across railroad crossings, they disable warning alarms, they ignore needed maintenance on their cars. My point here is that people's use of information systems is becoming more imposed by the environment and less assumed by choice. People don't sit down at a terminal and voluntarily engage with a system; rather, they live in an information services environment that has no visible exit signs. It therefore seems likely that we'll see many more such examples of irrational but predictable behavior in the realm of enterprise IT.
One obvious example is the reaction to automatic software updates. If I look over a list of available updates, such as the one that's offered to me by Microsoft's Windows Update site or the Software Update interface of Apple's Macintosh OS X, I get to participate in the choice of which downloads I'll accept. I can make my own decision about how many places I'll check for information on possible bugs in a service pack; I can do my own risk analysis in deciding whether I'll apply an update on the day it's released or wait a few days to see if problem reports arise.
If updates are presented as necessary, or even made automatic and mandatory, the time required and the network bandwidth consumed will be seen as imposed rather than voluntary costs. The delay of a user's work that's required for system restarts is imposed. The chance that the updated system won't work as expected, or might not work at all, is subconsciously multiplied by a thousand in the mental estimate of nuisance suffered.
The predictable result, over time, is that systems will be less secure and more likely to make avoidable errors.
It's one thing for this kind of uncertainty to afflict an open-architecture device such as a Windows PC. People understand that it's impossible to test every plausible combination of hardware, application and user behavior. The risk comes with their freedom of choice, and feels almost voluntary.
It's harder to forgive such problems in an update that comes from Apple, which has much more control of its technology stack, but not total control, as shown by the data-loss problems on some FireWire drives after the update to OS X 10.3.
What's appalling is seeing these problems in the most contained hardware domains: for example, this month's firmware 1.0.4 update for Canon's EOS 20D camera. Heaven help the user who installed it with a lens actually attached to the camera at the time. Among the possible resulting malfunctions was the inability to turn on the camera. Oops. Try to get those users to make future updates in a timely manner.
There are only two ways to get an IT user to follow the One Right Path: Make it an easy choice, or make it a low-risk requirement. Choose wisely.
Technology Editor Peter Coffee can be reached at firstname.lastname@example.org.