Opinion: These are the errors of the ISP market, and its 15-year failure to protect its customers. Is a secure ISP network science fiction?
The role of ISPs in security is one of the great neglected topics in our industry, and a favorite subject of mine since before I started focusing on security.
Back, I believe, in 1999, I wrote an article predicting (because it made perfect sense) that the future of security for consumers was through the ISP.
Anti-virus, anti-spam, perhaps even network security measures like firewalls could be implemented by the ISP.
Of course this wouldn't preclude the need for client-side protection, but just imagine if ISPs had been offering serious security for the last few years.
Would you be willing to spend extra money on an ISP that offered real network anti-virus and other aggressive security features? Imagine them offering a "safe" network: "We help you to keep your systems clean and we keep troublemakers off our network." Obviously the ISPs don't think so, because absolutely nobody does it.
When you think about it, many cutting-edge enterprise network security features could be applied to an ISP, up to and including NAC.
But this week Trend Micro is releasing its ICSS (InterCloud Security Service), a first step toward helping ISPs and some other large network providers, like universities, to make their networks safer.
ICSS replaces the existing recursive DNS in the network and uses that position to monitor activity looking for suspicious acts, especially those indicative of botnets.
Trend claims it can detect compromised systems in near-real time and remediate them, removing the infection.
This capability is impressive, but not surprising coming from Trend Micro and its vast experience on corporate networks.
Some of the behaviors they look for are relatively obvious: Any ISP client computer that does a large number of MX lookups on the DNS in a short period of time is probably a spam bot.
Even better, companies like Trend Micro have good maps of the big botnet C&Cs (command and control networks).
If a system on the network makes requests to one of these C&Cs, it has basically dropped its pants, and you know it's a bot.
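The two behaviors above can be expressed as simple heuristics over the DNS query stream. Here is a minimal sketch, assuming illustrative thresholds and a hypothetical C&C blocklist; none of this reflects Trend Micro's actual implementation.

```python
import time
from collections import defaultdict, deque

# Illustrative thresholds -- a real service would tune these empirically.
MX_LOOKUP_LIMIT = 100      # MX queries tolerated per client per window
MX_WINDOW_SECONDS = 60.0

# Hypothetical blocklist of known botnet C&C hostnames.
KNOWN_CNC_DOMAINS = {"evil-cnc.example.com", "bot-controller.example.net"}

class BotnetHeuristics:
    def __init__(self):
        # Per-client timestamps of recent MX lookups.
        self._mx_times = defaultdict(deque)

    def observe_query(self, client_ip, qname, qtype, now=None):
        """Return a list of reasons this query looks bot-like (empty if clean)."""
        now = time.monotonic() if now is None else now
        reasons = []

        # Heuristic 1: a burst of MX lookups suggests a spam bot
        # doing its own mail delivery.
        if qtype == "MX":
            times = self._mx_times[client_ip]
            times.append(now)
            while times and now - times[0] > MX_WINDOW_SECONDS:
                times.popleft()
            if len(times) > MX_LOOKUP_LIMIT:
                reasons.append("excessive MX lookups (likely spam bot)")

        # Heuristic 2: resolving a known C&C hostname is near-certain
        # evidence of infection.
        if qname.lower().rstrip(".") in KNOWN_CNC_DOMAINS:
            reasons.append("lookup of known botnet C&C domain")

        return reasons
```

A monitor sitting in the recursive-DNS position sees every query, so it can run checks like these per client IP with very little state.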
But would you really want your ISP (or your university) remediating your computer?
That's a very complicated question; I'll take the coward's way out and declare it to be a "policy issue." Clearly some users wouldn't mind this at all, just as some would raise cries of "Big Brother."
Personally, I went Republican on this issue quite a while ago and wouldn't mind ISPs blocking client systems that exhibit behaviors that are well-understood to be indicative of bots.
There's a good way to do it and a bad way.
In the good way the ISP a) pre-publishes and notifies customers of the criteria for blocking; b) notifies the customer when they are blocked, including details on what their computers did to get them blocked and instructions on how they can remediate; c) includes a reference # and a support phone number to call.
But ISPs have been miserable failures at fixing compromised systems on their networks, including large botnets.
Part of the problem is that they don't want to do all the work involved; part is that they don't want to offend customers by inconveniencing them just because their computer is a bot sending out spam; and a big part is that they don't have great tools for fixing the problems. This is what Trend says it is trying to address.
For instance, Trend tells me of one ISP in France that has 500,000 compromised client systems on its network.
It is currently plowing through four to five a day, meaning that we'll be colonizing Alpha Centauri by the time they're done. Clearly, the current methods are inadequate.
Sitting in the position of the DNS, a service like ICSS can perform some helpful tricks: if it detects a command or request to a botnet C&C, ICSS could spoof the result.
It could tell the system to perform an attack on 127.0.0.1 (the loopback address). Or it could redirect the user's browser to Trend's Housecall site, where the system could be scanned and cleaned, or to some other site with a similar purpose.
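The spoofing trick described above amounts to a resolver that answers honestly for ordinary names but lies for known C&C hosts. A minimal sketch, with hypothetical domain names and a documentation-range IP standing in for a cleanup site (not Trend's actual design):

```python
# Hypothetical sinkhole logic for a resolver in the ICSS position.
# All names and addresses below are illustrative.

KNOWN_CNC_DOMAINS = {"evil-cnc.example.com"}

LOOPBACK = "127.0.0.1"            # point the bot back at itself
CLEANUP_PORTAL_IP = "192.0.2.10"  # stand-in for a scan-and-clean site

def resolve(qname, qtype, upstream_lookup):
    """Answer a DNS query, spoofing the result for known C&C hosts.

    upstream_lookup(qname, qtype) performs the real recursive resolution.
    """
    name = qname.lower().rstrip(".")
    if name in KNOWN_CNC_DOMAINS:
        # Option 1: hand back the loopback address so the bot's
        # C&C traffic goes nowhere.
        return LOOPBACK
        # Option 2 (alternative): return CLEANUP_PORTAL_IP so the next
        # HTTP request lands on a page offering to scan the machine.
    return upstream_lookup(qname, qtype)
```

The key design point is that the resolver, not the client, decides what answer the infected machine sees, which is exactly the leverage the DNS position provides.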
It's not hard to see how malware could be designed to work around this specific technique, for example by changing the system's default DNS server to an external, compromised one.
But that isn't the point. The point is to provide tools that give large network operators a practical way to make their networks more secure. I know I'd pay more for that.
Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983. He can be reached at firstname.lastname@example.org.
Check out eWEEK.com's Security Center for the latest security news, reviews and analysis. And for insights on security coverage around the Web, take a look at eWEEK.com Security Center Editor Larry Seltzer's Weblog.
Larry Seltzer has been writing software for and English about computers ever since, much to his own amazement, he graduated from the University of Pennsylvania in 1983.
He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at Desktop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.
For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs) developing product tests and managing contract testing for the computer industry, governments and publication.
In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.
Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.