Google Looks into the Exploit Thing

By Larry Seltzer  |  Posted 2007-05-17

Opinion: The company has a unique perspective on the problem of Web-based exploits, but is it a useful one?

I was excited when I began reading a recent research paper by a group of people at Google on Web-based malware. As the paper says, Google has an interesting position with respect to the problem. But it doesn't seem to me that their research adds much that's helpful. Of course, they need to get up to speed on this problem, as they are likely to be involved in it whether they like it or not. As a researcher recently showed, it's easy to abuse Google's Adwords program to spread malware. It's not just academic: Another recent Adwords scam tried to trade on the good name of the Better Business Bureau. See this video for a demo.

Google is already in the business of scanning anything and everything on the Internet. Why not look for malware? It's been well understood in the anti-malware community for a while that direct malware distribution has generally moved out of the SMTP stream and onto Web sites. E-mail is still heavily involved, but instead of attaching malicious executables, we see messages such as "Click here for hot stock tips and dirty pictures."

Go to the advertised Web site and there are a number of ways they try to trick you into running the malware. They might just say, "Click here and run the program." They may say that to see the content you need to install a "codec"—there are many malicious fake codecs out there. Or they may try a "drive-by download," which generally relies on a browser vulnerability to run a program and infect the system without requiring user intervention. There are dozens of scams and one born every minute. The Google paper goes through much of this background.

Since they do the scanning anyway, Google decided to try to look for pages that attempt to attack user systems. They did a first filtering pass on a scan of "several billion URLs" and then an "in-depth analysis" of 4.5 million, resulting in 450,000 URLs that they say were successfully launching drive-by downloads and another 700,000 that looked dirty, but didn't work as well.
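The shape of that methodology is a funnel: a cheap first pass flags candidate pages out of the billions crawled, and only the small surviving set gets the expensive in-depth treatment, presumably by loading each candidate in an instrumented browser and watching what happens. Here is a minimal Python sketch of that funnel shape; the heuristics and the `analyze_in_sandbox` stub are my own illustrative assumptions, not Google's actual criteria.

```python
# A sketch of the two-stage funnel the paper describes: a cheap static
# filter over a huge URL corpus, then expensive dynamic analysis of the
# survivors. All heuristics here are illustrative placeholders.
import re

# Crude static signals that a page *might* host a drive-by download.
SUSPICIOUS_PATTERNS = [
    re.compile(r"unescape\s*\(", re.IGNORECASE),        # obfuscated JavaScript
    re.compile(r"<iframe[^>]*(width|height)=[\"']?0", re.IGNORECASE),  # hidden iframe
    re.compile(r"ActiveXObject\s*\(", re.IGNORECASE),   # scripted ActiveX use
]

def first_pass(page_source: str) -> bool:
    """Cheap filter: keep only pages showing at least one suspicious signal."""
    return any(p.search(page_source) for p in SUSPICIOUS_PATTERNS)

def analyze_in_sandbox(url: str) -> bool:
    """Expensive check (stub): stands in for loading the URL in an
    instrumented browser and watching for new processes or files."""
    return False  # placeholder; real analysis needs a sandboxed browser

def scan(corpus):
    """corpus: iterable of (url, page_source) pairs."""
    candidates = [url for url, src in corpus if first_pass(src)]
    # Only the small candidate set pays the cost of dynamic analysis.
    return [url for url in candidates if analyze_in_sandbox(url)]
```

The point of the two stages is cost: the pattern pass is nearly free per page, so the expensive sandbox run is reserved for the tiny fraction that looks suspicious.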

The paper speaks generally in these terms, that of the number of URLs found, but this is something of a mirage. It doesn't bear much resemblance to a list of the sites that are actually infecting people. Other companies do research that shows more useful information. Exploit Prevention Labs makes a line of products called LinkScanner to secure Web browsing; the company tracks exploits encountered by its users and reports aggregate data. It just reported the top 5 most-reported Web exploits for April:
  1. Link to known exploit site (27.42 percent of all attacks)—Not an exploit per se, this is simply an attempt to link the visitor to a known exploit-hosting site. There are several such sites, and it is the aggregation effect rather than the actual potential for damage that has pushed it to the top of the list.
  2. Modified MDAC (23.92 percent of all attacks)—MDAC refers to a creative method of using certain ActiveX controls in a context Microsoft did not originally intend. An ActiveX control is instantiated inside a Web script that allows files to be written to disk and executed; a sketch of what detecting that pattern might look like follows this list.
  3. ANI (11.9 percent of all attacks)—Originally discovered and used by a group of Chinese hackers, the exploit takes advantage of Windows' handling of animated cursor (.ani) files. It infects fully patched Windows XP SP2 machines running IE 6 or 7.
  4. Q406 Roll-up package (9.33 percent of all attacks)—Comprising up to a dozen exploits including Setslice, VML, XML and IE COM CreateObject Code, the package is usually heavily encrypted.
  5. WebAttacker 2.0 (9.1 percent of all attacks)—A new pre-package of current exploits, WebAttacker 2.0 uses similar distribution methods to earlier WebAttacker output. Hackers can purchase the package on underground markets and use it just like commercial software.
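The MDAC entry is worth pausing on, because it shows the pattern most of these kits share: a script instantiates an object the page has no business using, then leans on it to write a file to disk and run it. Below is a hedged Python sketch of a static check for that shape; the ProgID and keyword lists are my own illustrative guesses, not LinkScanner's or anyone's production signatures.

```python
# Illustrative static check for MDAC-style abuse: a script that both
# instantiates a data-access ActiveX object and shows file-write/execute
# behavior. The lists below are assumptions for illustration only.

# ProgIDs and calls historically associated with MDAC-style drive-bys.
DATA_ACCESS_PROGIDS = ("adodb.stream", "adodb.connection", "microsoft.xmlhttp")
EXECUTION_HINTS = ("savetofile", "shellexecute", "wscript.shell", ".run(")

def looks_like_mdac_abuse(script_text: str) -> bool:
    """Flag scripts that create a data-access object *and* write or run
    files, the shape of the MDAC drive-by described above."""
    text = script_text.lower()
    creates_object = any(progid in text for progid in DATA_ACCESS_PROGIDS)
    writes_or_runs = any(hint in text for hint in EXECUTION_HINTS)
    return creates_object and writes_or_runs

# A stripped-down fragment of the kind such exploit kits generate:
sample = 'var s = new ActiveXObject("ADODB.Stream"); s.SaveToFile("x.exe", 2);'
print(looks_like_mdac_abuse(sample))  # True
```

Real kits, of course, obfuscate exactly these strings, which is why the roll-up packages above are "usually heavily encrypted" and why a static check like this can only ever be a first pass.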

I'm also bothered by their emphasis on the number of pages found as a metric; it's what you'd expect from Google, of course, given its focus on crawling Web sites, but it's too easy to imagine that most of the sites they found never see a real human click.

Out of all this research come dog-bites-man conclusions: Malware is spread across many URLs to increase its chances of distribution, and it changes its binary pattern frequently to evade detection. Thank goodness we have Google to point this out.
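The second of those conclusions, obvious as it is, is at least easy to demonstrate: a scanner keyed to an exact file hash is defeated by changing a single byte, which is why these kits repack their payloads so frequently. A tiny Python illustration, with a made-up stand-in payload:

```python
# Why frequent repacking defeats exact-hash signatures: flipping a single
# byte of a payload yields a completely different digest.
import hashlib

payload = bytearray(b"MZ... pretend this is a malicious executable ...")
print(hashlib.sha256(payload).hexdigest())

payload[10] ^= 0xFF  # a one-byte mutation, as a repacker might introduce
print(hashlib.sha256(payload).hexdigest())  # bears no resemblance to the first
```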

I didn't need to read the paper to know that Google isn't in the anti-malware business, but reading it confirmed that they don't understand malware. Perhaps they have the attitude that they're Google and they scan everything so they know everything, but they're utter novices when it comes to the anti-malware business.

That's OK; it's an academic paper exercising a new perspective on a problem without preconceptions. Sometimes these things just don't work out.

Security Center Editor Larry Seltzer has worked in and written about the computer industry since 1983. Check out eWEEK.com's Security Center for the latest security news, reviews and analysis. And for insights on security coverage around the Web, take a look at Security Center Editor Larry Seltzer's blog, Cheap Hack.

Larry Seltzer has been writing software for and English about computers ever since, much to his own amazement, he graduated from the University of Pennsylvania in 1983.

He was one of the authors of NPL and NPL-R, fourth-generation languages for microcomputers by the now-defunct DeskTop Software Corporation. (Larry is sad to find absolutely no hits on any of these products on Google.) His work at Desktop Software included programming the UCSD p-System, a virtual machine-based operating system with portable binaries that pre-dated Java by more than 10 years.

For several years, he wrote corporate software for Mathematica Policy Research (they're still in business!) and Chase Econometrics (not so lucky) before being forcibly thrown into the consulting market. He bummed around the Philadelphia consulting and contract-programming scenes for a year or two before taking a job at NSTL (National Software Testing Labs) developing product tests and managing contract testing for the computer industry, governments and publications.

In 1991 Larry moved to Massachusetts to become Technical Director of PC Week Labs (now eWeek Labs). He moved within Ziff Davis to New York in 1994 to run testing at Windows Sources. In 1995, he became Technical Director for Internet product testing at PC Magazine and stayed there till 1998.

Since then, he has been writing for numerous other publications, including Fortune Small Business, Windows 2000 Magazine (now Windows and .NET Magazine), ZDNet and Sam Whitmore's Media Survey.
 
 
 
 
 
 
 
