Law enforcement agencies see a growing share of communications moving from traditional channels to the Internet. In their effort to respond with new collection and monitoring measures that balance security with the expectation of privacy, these agencies face the same dilemma that marked the punishment of the mythical Tantalus, who was condemned to hang from a branch above a lake whose waters receded whenever he bent to drink. Bureau chieftains can't bring themselves to ignore the Internet's potential trove of valuable traffic, but it will evade every effort they make to collect it.
The first and most brutal obstacle is sheer volume. Measurements made earlier this year by Lawrence Roberts, chief technology officer at Caspian Networks Inc., show an Internet traffic growth rate of 300 percent per year; even a pessimistic forecast by McKinsey & Co. and J.P. Morgan Chase & Co. predicts continued growth through 2005 at annual rates tapering off toward 60 percent, a rate that still exceeds tenfold growth every five years. Anything faster will outpace the Moore's Law growth rate (doubling every 18 months) that has long characterized computer price/performance trends, implying that watchers will have to upgrade their systems at state-of-the-art rates merely to stay abreast of the flow.
Compounding the problem is the growing quantity of information per bit. When digital channels carry ASCII text, indexing and searching algorithms can perform astonishing feats of association and retrieval (as evidenced in any Google search of HTML pages). Classical cryptanalysis relies heavily on redundancy in the written word. As traffic tends toward images, however, users adopt redundancy-slashing compression schemes that take advantage of the processing power abundant on personal systems. In the aggregate, it's a formidable challenge to find any remaining patterns in all that data, whether criminals or terrorists use steganography to bury text in an image file or simply take a picture of a handwritten message (and perhaps flip it upside down and backward before transmission).
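To make concrete how little an image-borne message disturbs its carrier, here is a minimal sketch of least-significant-bit steganography, the simplest of the techniques the paragraph alludes to. The byte-array "image" and the function names are illustrative assumptions, not drawn from any particular tool:

```python
# A sketch of least-significant-bit (LSB) steganography: each bit of the
# hidden message overwrites the low bit of one raw pixel byte, changing
# each pixel's value by at most 1 -- visually imperceptible.

def embed(pixels: bytearray, message: bytes) -> bytearray:
    """Store each bit of `message` in the low bit of successive pixel bytes."""
    out = bytearray(pixels)
    # Flatten the message into bits, least-significant bit of each byte first.
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(out):
        raise ValueError("image too small for message")
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # overwrite only the low bit
    return out

def extract(pixels: bytearray, length: int) -> bytes:
    """Reassemble `length` hidden bytes from the low bits of the pixel stream."""
    msg = bytearray()
    for b in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[b * 8 + i] & 1) << i
        msg.append(value)
    return bytes(msg)

# A 256-byte "image" of mid-gray pixels quietly carries a 12-byte message:
cover = bytearray([128] * 256)
stego = embed(cover, b"meet at noon")
recovered = extract(stego, 12)
```

Real traffic compounds the detection problem further: once such an image passes through a lossy compressor like JPEG, even this trivial scheme must be adapted to survive, and a scanner looking for it has still less to grip.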
Even innocent-looking text messages, attracting no notice from keyword scanning technologies, can readily bear concealed meanings using "commercial code" methods pioneered in the early days of the telegraph (when many operators handled every message, one letter at a time). One 19th-century code, for example, readily reduced a 64-word summary of flour market conditions to the nine-word message, "Bad came aft keen dark ache lain fault adopt." These same methods can also be easily used with voice communication, whether the circuit that's traveled uses analog wiggles or digital bits to represent that speech: The portable power of today's handheld computers makes these methods far more practical than when they required laborious manual transcription.
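The mechanism of such a commercial code is nothing more than a shared lookup table, as this sketch shows. The entries below are hypothetical examples for illustration, not the actual mappings of the 19th-century flour-market code quoted above:

```python
# Sketch of a telegraph-era "commercial code": each innocuous dictionary
# word stands in for an entire prearranged phrase. These codebook entries
# are invented for illustration; the real code's mappings are not known here.

CODEBOOK = {
    "bad":  "flour market opened weak",
    "came": "prices fell two cents",
    "aft":  "demand light from bakers",
}
REVERSE = {phrase: word for word, phrase in CODEBOOK.items()}

def encode(phrases):
    """Compress a list of prearranged phrases into a short word sequence."""
    return " ".join(REVERSE[p] for p in phrases)

def decode(message):
    """Expand a coded message back into its full phrases."""
    return [CODEBOOK[w] for w in message.lower().split()]

wire = encode(["flour market opened weak", "prices fell two cents"])
# The wire message reads as ordinary English words ("bad came"), so a
# keyword scanner sees nothing suspicious to flag.
```

The point for any scanning system is that the coded message contains no statistical or lexical trace of its true content; only possession of the shared table recovers the meaning.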
Yes, such "code books" must be securely distributed to their users, but this is a trivial task for participants in a criminal conspiracy. It is only the legitimate e-business that needs a way of holding secure conversations among previously unknown participants on public networks, and it is only these buyers and sellers who are really inconvenienced by any measures that place backdoors in strong encryption schemes.
Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.