I wrote last week about the failure of conventional security thinking in dealing with information security threats. As I said in that column, you can't control traffic in information assets by merely inspecting messages—their contents are too easy to disguise. Even if you do hold on to data, you can't be sure that what you have today is what you had in your hands yesterday: Bits are too easy to alter using methods that leave no trace.
Infosec, I argued, therefore requires a focus on relationship issues: who created data, and who has rights to access, update and exchange it. That's quite a different proposition from the physical security model of locking down the storehouse and inspecting whatever tries to leave, but it seems that much of what's spent on infosec goes toward a misguided imitation of physical methods.
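To make that concrete, here is a minimal sketch of relationship-centered protection, using nothing more than Python's standard hmac and hashlib modules. The key and the sample record are hypothetical; the point is that the recipient verifies who produced the data, and any traceless alteration of the bits breaks that verification.

    import hmac
    import hashlib

    # Hypothetical key shared between the data's creator and its authorized
    # consumers: the "relationship" that does the real security work here.
    SECRET_KEY = b"per-relationship-secret"

    def sign(data: bytes) -> str:
        """The creator attaches a tag tying the data to this relationship."""
        return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

    def verify(data: bytes, tag: str) -> bool:
        """The recipient checks the tag, not the content. Any alteration
        of the bits, however traceless, breaks the tag."""
        return hmac.compare_digest(sign(data), tag)

    record = b"quarterly-forecast-v3"
    tag = sign(record)
    assert verify(record, tag)             # untouched data verifies
    assert not verify(record + b"!", tag)  # any tampering is detected

A full system would use public-key signatures to bind data to an individual creator rather than to a shared secret, but the shape of the check is the same.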
The image that came to mind as I wrote last week's column was the “cargo cult.” In the years after World War II, South Pacific islanders were seen to imitate the behaviors of the armies that had come and gone during that conflict. They built landing strips, constructed mock aircraft out of straw, built structures resembling control towers and even carved wooden replicas of controllers' headphones and other communication equipment. Despite these faithful imitations, though, no airplanes filled with food and clothing appeared to unload their riches.
Neither do other technologies respond to those who merely imitate their form while failing to understand and respect their substance. You can build a massive vault to hold your information—with armed guards at the door—but when the bad guys have steganography algorithms and hacking tools in their arsenals, the guards don't know what to look for.
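Here is how little those guards have to go on. What follows is a bare-bones sketch, in Python, of the least-significant-bit embedding that steganography tools perform; a byte buffer stands in for image pixel data, and the carrier looks unchanged to any inspector at the door.

    def hide(carrier: bytearray, secret: bytes) -> bytearray:
        """Embed each bit of `secret` in the low bit of successive carrier
        bytes. In a real image, these single-bit changes are invisible."""
        out = bytearray(carrier)
        for i, byte in enumerate(secret):
            for b in range(8):
                pos = i * 8 + b
                out[pos] = (out[pos] & 0xFE) | ((byte >> b) & 1)
        return out

    def reveal(carrier: bytes, length: int) -> bytes:
        """Recover `length` hidden bytes by reading the low bits back out."""
        secret = bytearray(length)
        for i in range(length):
            for b in range(8):
                secret[i] |= (carrier[i * 8 + b] & 1) << b
        return bytes(secret)

    image = bytearray(range(256)) * 4      # stand-in for pixel data
    stego = hide(image, b"exfiltrated")
    assert reveal(stego, 11) == b"exfiltrated"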
In checking details of where and when the cargo cults were observed, I discovered that others have applied this same label to other flawed IT practices. The famous “Jargon File” is mirrored in many places all over the Web, notably at hacker and essayist Eric Raymond's www.catb.org/~esr/jargon site. That compendium defines “cargo cult programming” as “a style of (incompetent) programming dominated by ritual inclusion of code or program structures that serve no real purpose.”
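What that definition describes might look, in a hypothetical Python fragment, like this: code carried forward as ritual rather than reason.

    import gc
    import time

    def transform(record: str) -> str:
        return record.strip().upper()

    def process(records: list[str]) -> list[str]:
        results = [transform(r) for r in records]
        # Ritual inclusion: some long-departed author "fixed" a long-gone
        # bug with these two lines, and every copy of this function since
        # has faithfully carried them along. They serve no purpose today.
        gc.collect()
        time.sleep(0.1)
        return results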
When I read that entry, little alarm bells rang. I found myself thinking about times when I've needed to reverse-engineer code written by someone else—or even my own code that I hadn't touched in several years. I don't think I've ever actually thrown up my hands and said, “I don't know why that's there or what it does, but I'd better keep it.” On the other hand, I know there are many people who have to deal with similar situations in much more complex systems, in a contemporary IT environment of too few people and relentless time-to-market pressures. I can understand how the temptation might arise.
One has to wonder how much code winds up encysted in our systems, no longer doing anything useful and perhaps even creating security loopholes by its presence.
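A hypothetical sketch of what such a loophole can look like: a testing shortcut that outlived its purpose and now waits for anyone who reads the source.

    def check_credentials(user: str, password: str) -> bool:
        # Stand-in for a lookup against a real credential store.
        return (user, password) == ("alice", "correct-horse")

    def authenticate(user: str, password: str) -> bool:
        # Encysted code: a bypass added for load testing in some forgotten
        # release. It does nothing useful anymore, except hand a way in to
        # any attacker who finds it.
        if user == "qa_bypass":
            return True
        return check_credentials(user, password)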
When code is reused, and especially when code is acquired from outsourced teams or incorporated via Web services technologies, there's a real opportunity for cargo cult practices to take hold. Source code may follow unfamiliar naming conventions, and design documents and internal memos may be written in unfamiliar languages, or in a language we know by people who don't speak it very well. We may not even have the source—we may have only WSDL (Web Services Description Language) or some other interface definition to guide us.
The wooden headphones may bear fancy names like “design patterns,” but they're still an indicator that we may be building systems that look like those that have worked before—instead of designing from deep understanding toward solutions that meet new needs.
“The first principle is that you must not fool yourself,” said the late physicist Richard Feynman in the 1974 Caltech commencement address that's often considered the origin of the “cargo cult” phrase, at least as used by coders. That's a good principle. Reusing code that we don't understand and reusing familiar methods merely because we do understand them are both behaviors we should guard against.
Technology Editor Peter Coffee can be reached at peter_coffee@ziffdavis.com.