If you happened to be monitoring conversations in my home, you'd often hear the phrase "lying robot." We don't have humanoid machines doing housework or taking part in conversation, but when our dishwasher says that it will be done in 20 minutes, and 10 minutes later it says 19 minutes, we know it's actually going to take much longer than that. Or when an incoming e-mail message gets sorted to the top of the list for weeks, until it gets archived, we know it's probably afflicted with an incorrect date stamp.
"Lying robot" is the shorthand that my wife and I routinely use to say, "Don't believe this dumb box."
Machines tell lies all the time, although it's usually not their fault. The root of the falsehood is usually a faulty sensor, a naïve algorithm or a fragile data structure that sacrifices error-detection or error-correction capabilities in the interest of saving a few bits.
Ironically, the source of our "lying robot" phrase is a short story by Larry Niven—"Neutron Star"—in which a machine is telling the truth but is only measuring a narrow slice of its environment. Eventually, the story's spacecraft pilot figures out what's going on, but only by thinking outside the box—or outside the hull, to be precise. Either way, the machine wasn't helping the user understand his situation.
The subject of machines that lie to us—or that tell the truth as they know it, but get it wrong—came to mind when I read June's report by our colleagues at Baseline magazine on the subject of stock-option backdating.
Theoretically, compensation in the form of stock options represents a gamble on the part of the people who receive them: If they do their best to make the share price rise, their options will be worth more. If an option is backdated, though, to a date when the share price was at a low point, the element of risk is removed or greatly reduced.
If everyone knows what's going on, then this is merely a transfer of wealth from the shareholders to the option recipients. Whether this should happen is a question of corporate governance and shareholder rights, not an IT matter. This becomes an infrastructure issue when the question arises of whether systems are sufficiently skeptical about key data that their users provide. In principle, it's possible to build systems that don't take the word of any single person for fact; in practice, we too often build systems that will let too many individuals supply or revise the system's notions of truth.
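One way to build a system that doesn't take any single person's word for fact is a dual-control rule: a change to a sensitive value takes effect only when a second, different person approves it. The sketch below illustrates that principle in Python; the class, method names and dates are my own assumptions, not any particular vendor's API.

```python
# Minimal sketch of dual control: no single individual can both
# propose and approve a change to a sensitive field.
class DualControlField:
    def __init__(self, value):
        self.value = value
        self._pending = None  # (proposed_value, proposer) awaiting approval

    def propose(self, new_value, proposer):
        """Record a proposed change; it does not take effect yet."""
        self._pending = (new_value, proposer)

    def approve(self, approver):
        """A second, different person makes the pending change stick."""
        if self._pending is None:
            raise ValueError("nothing to approve")
        new_value, proposer = self._pending
        if approver == proposer:
            raise PermissionError("proposer cannot approve their own change")
        self.value = new_value
        self._pending = None


grant_date = DualControlField("2006-06-15")
grant_date.propose("2005-10-03", proposer="cfo")
try:
    grant_date.approve("cfo")        # self-approval is refused
except PermissionError:
    pass
grant_date.approve("controller")     # a second person completes the change
```

The point of the design is that the system itself enforces the skepticism: the proposer's identity is captured with the proposal, so the approval path cannot be collapsed into one person.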
I find myself thinking that an IT system is increasingly the means by which an enterprise reports to itself, and that information systems would therefore do well to live by the standards of Journalism 101: asking who, what, when, where and why.
Who? Without reliable authentication of user identity, and without good systems for accurately mapping identities to dynamically changing roles, no supposedly secure approach to systems is anything more than a matter of going through the motions.
What? We might be talking about proven approaches to database maintenance or about proper disciplines for RFID (radio-frequency identification) systems that don't make it trivial to make one thing appear to be another. Either way, integrity matters.
When? As indicated by the option-backdating issue, we must have reliable authority for date-time stamping of transactions. I might make the case that no system should ever have user input for a date or time field—that every such value should flow from some other process in a manner that provides reliable provenance.
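The "no user input for date fields" rule can be made concrete in code: the record type simply offers no way for a caller to supply its timestamp, which instead flows from the system clock at creation time. This Python sketch assumes its own names (GrantRecord, recorded_at); it illustrates the principle, not any real compensation system.

```python
# Minimal sketch: the grant date comes from the clock, not the keyboard.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class GrantRecord:
    """An option-grant record whose timestamp the user cannot supply."""
    recipient: str
    shares: int
    # Assigned by the system clock when the record is created;
    # init=False means no constructor argument can override it.
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc), init=False
    )


grant = GrantRecord(recipient="jdoe", shares=1000)
# grant.recorded_at now carries a UTC timestamp with system provenance;
# backdating would require tampering with the stored record itself,
# which a proper audit trail can catch.
```

Because the dataclass is frozen and the field is excluded from the constructor, any attempt to pass or later reassign recorded_at raises an error rather than silently accepting a user-chosen date.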
Where? Between RFID, GPS, cell phone tower triangulation and Wi-Fi signal strength mapping, we have a great many ways to know where data or actions are originating in our networks. We should use that information: There should be a reasonable match between an entitys location and the privileges that it seeks.
Why? There's a difference between having repair privileges to fix things and having edit privileges to change them arbitrarily. There should be no such thing as a universal super-user who can make any change and alter any resulting log.
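One established way to deny even a super-user the power to quietly alter a log is a hash-chained audit trail: each entry's hash covers the previous entry's hash, so editing any past record breaks every hash that follows it. The sketch below, with names of my own choosing, illustrates the technique; a production design would also need to protect the chain's head.

```python
# Minimal sketch of a tamper-evident, append-only audit log.
import hashlib
import json


def append_entry(log, actor, action):
    """Append an entry whose hash chains to the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = {"actor": actor, "action": action, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    log.append({**body, "hash": digest})


def verify(log):
    """Recompute the chain; any silent edit to a past entry fails it."""
    prev = "0" * 64
    for entry in log:
        body = {k: entry[k] for k in ("actor", "action", "prev")}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev"] != prev or entry["hash"] != digest:
            return False
        prev = entry["hash"]
    return True


log = []
append_entry(log, "admin", "reset password for jdoe")
append_entry(log, "jdoe", "viewed payroll report")
assert verify(log)                          # untouched chain checks out

log[0]["action"] = "nothing to see here"    # a "super-user" edit
assert not verify(log)                      # the alteration is detectable
```

The super-user can still change the data, but not invisibly: the chain makes the alteration self-announcing, which is exactly the separation between repair and arbitrary edit.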
It's not enough to be accurate. Systems should be truthful as well.
Technology Editor Peter Coffee can be reached at firstname.lastname@example.org.