At last week's O'Reilly Emerging Technology Conference in San Diego, two of the most concrete sessions dealt with some of the threats (whether merely commercial, or actually dangerous or criminal) that face forthcoming products and services.
Monday offered a half-day tutorial on reverse engineering by Andrew Huang, author of "Hacking the Xbox," whose work I profiled in a column last year. Huang's own Web site is moving, and the link from that eWEEK column may not be valid if you read this a month or so from now, but you should still be able to find that material at his new URL.
Huang's hardware presentation included his comment, "I don't know why people are so nervous about giving out schematic diagrams of circuits: with practice, you can read a circuit board like a book." I spoke with Huang later about the further possibility of tools that can decompile a circuit diagram into a functional description, recognizing common elements such as registers or counters. He said that those functions are not provided in the otherwise excellent (and free) design tools that he demonstrated in his session, including an impressive suite from Xilinx. He suggested, though, that such capabilities are very likely at work in design firms, and that the various measures for protecting hardware designs against reverse engineering are at best only slightly ahead of their countermeasures.
In the meantime, it's clear that hardware hacking has moved far beyond the Heathkits that I was building thirty years ago: FPGAs are already sampling in the 90-nanometer process technology that IBM is using so well in its PowerPC 970 processors to set new standards of high performance with low power consumption. "Soft hardware" can perform today at speeds that invite the creation of custom designs for even low-volume applications.
Tuesday's Emerging Technology Conference agenda included an alarming session, titled "Evolving the Bad Guy," presented by Eric Bonabeau of Icosystem Corporation, on the use of evolutionary computation to create novel attacks on complex systems.
I was reminded of a comment by Kernighan and Plauger, in their book "The Elements of Programming Style," that I mentioned in a previous column on the need to re-read our own code: "If you're as clever as you can be when you write it, how will you ever debug it?" Bonabeau put the problem in even starker terms when he concluded with the observation that "Our brains are producing systems that are so much more complex than our brains can understand; wanting to test them with our brains is going to leave a lot of weaknesses and loopholes."
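The core idea behind evolutionary attack generation is simple, even if Bonabeau's actual tools are not: treat candidate inputs as a population, score them on how close they come to breaking the target, and let selection, crossover and mutation do the searching that no human tester would think to do. The toy sketch below (not from the talk; the "fragile system," its failure condition and all parameter choices are my own invented illustration) shows how even a few lines of genetic search can stumble onto an input region a designer never anticipated:

```python
import random

def fragile_system(x):
    # Toy stand-in for a "complex system": it misbehaves (returns True)
    # only on a narrow class of inputs its author never anticipated.
    return x > 900 and x % 7 == 0

def fitness(x):
    # Reward candidates that trigger the failure outright, with a mild
    # gradient (larger x) so the search drifts toward the failure region.
    return (1000 if fragile_system(x) else 0) + x

def evolve(pop_size=30, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [rng.randrange(1024) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # selection: keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = (a + b) // 2                # crossover: blend two parents
            if rng.random() < 0.3:              # mutation: flip one random bit
                child ^= 1 << rng.randrange(10)
            children.append(child % 1024)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best input found:", best, "-> failure?", fragile_system(best))
```

The unsettling point is that nothing in `evolve()` knows anything about the modulo-7 condition: the search discovers weaknesses purely from observed behavior, which is exactly why Bonabeau argues such tools will find loopholes that brain-powered testing misses.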
When systems go out into the world, they face both legitimate users and highly skilled attackers. We shouldn't unduly punish the former in our efforts to deter the latter, but we should think ahead about building systems that don't extend an open invitation to abuse.