The first step in securing the enterprise is assessment: knowing where and what your vulnerabilities are, from both technical and social standpoints.
Security risks in enterprise IT systems have many technical elements. The magnitude of IT risk is largely determined, however, by nontechnical factors, including business relationships and IT users' attitudes. IT vulnerability assessment therefore demands a multidisciplinary approach—especially since risk analysis shapes (or distorts) every subsequent aspect of an IT security process.
As eWEEK Labs begins this five-part Special Series on the IT security cycle, we focus our exploration of vulnerability assessment on the need for IT professionals to learn new ways of thinking. In an environment that is not merely complex but actually hostile, IT architects must move beyond technical correctness into the less well-mapped territories of acceptable risk, social engineering and even the beginnings of game theory.
Essential Threat: Data Loss
The problem with an umbrella phrase such as "IT security" is that it can mean almost anything, from the physical security of an IT installation to the abstract mathematics of encryption.
The loss or corruption of data is the essential threat that must be identified and contained. Like any other asset, information must be inventoried with an eye toward both its value to the business and its cost of being replaced; unlike other assets, however, information costs little to duplicate and to store at different locations using technologies with different failure modes.
Risk assessment processes should therefore evaluate data in terms of how frequently it is needed, how much it costs the organization not to have it available on demand, how dangerous it would be to place copies of that data in other hands and how long it is practical to wait for a backup copy to be retrieved.
This analysis will enable a rational trade-off among the affordable simplicity but slow response of off-site tape storage, the high cost of a remote but full-speed "hot site" disk farm or the (literally) massive impact of a fortified "vault" to protect a single primary site—with minimal reliance on the discretion of third parties.
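The trade-off among those storage approaches can be sketched as a simple scoring exercise. In the following Python sketch, the storage options, cost figures and exposure ratings are invented assumptions for illustration, not recommendations:

```python
# Illustrative sketch: ranking storage options against the assessment
# criteria above. All names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class StorageOption:
    name: str
    annual_cost: float         # dollars per year to operate
    recovery_hours: float      # time to get the data back on demand
    third_party_exposure: int  # 0 (none) to 3 (high) reliance on outsiders

@dataclass
class DataClass:
    name: str
    outage_cost_per_hour: float  # cost of not having the data available
    max_exposure: int            # tolerable third-party handling

def acceptable_options(data, options):
    """Rank options by total cost of one recovery event, dropping any
    option that places copies in too many other hands."""
    ranked = []
    for opt in options:
        if opt.third_party_exposure > data.max_exposure:
            continue  # too dangerous to place copies of this data there
        total = opt.annual_cost + opt.recovery_hours * data.outage_cost_per_hour
        ranked.append((opt.name, total))
    return sorted(ranked, key=lambda pair: pair[1])

options = [
    StorageOption("off-site tape", 5_000, 24, 1),
    StorageOption("remote hot site", 120_000, 0.5, 2),
    StorageOption("fortified on-site vault", 300_000, 1, 0),
]
payroll = DataClass("payroll", outage_cost_per_hour=2_000, max_exposure=1)
print(acceptable_options(payroll, options))
```

Note that the hot site drops out entirely for this hypothetical data class: no price advantage matters if the exposure to third parties exceeds what the business will tolerate.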
Unlike other assets, information can be stolen without being lost. It's not enough, therefore, to ensure that data remains available to those who are authorized to use it. Data access must also be denied to others, not just in the course of transactions but also during archive storage and even after disposal.
In many cases, the question that must be asked during the risk assessment process is: How long is the shelf life of a particular class of data? Transactional data that's only valuable for minutes, or hours, is adequately protected by low-cost Triple DES (Data Encryption Standard) encryption, but data that needs to be protected for years may need far more elaborate and costly protective measures.
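The shelf-life question can be reduced to a rough decision rule. The tier names and thresholds in this Python sketch are assumptions for illustration only, not a standard:

```python
# Hypothetical mapping from data shelf life to a protection tier.
# Thresholds and tier descriptions are illustrative assumptions.

def protection_tier(shelf_life_hours: float) -> str:
    """Suggest a protection level based on how long the data stays sensitive."""
    if shelf_life_hours <= 24:
        return "standard low-cost symmetric encryption (Triple DES class)"
    if shelf_life_hours <= 24 * 365:
        return "stronger cipher with managed key rotation"
    return "long-term controls: strong cipher, protected key storage, re-encryption plan"

print(protection_tier(2))             # transactional data, valuable for hours
print(protection_tier(24 * 365 * 7))  # records that must stay sealed for years
```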
The second part of IT is technology, more replaceable than data but also worth protecting. Physical security for stationary IT assets should be complemented by anti-theft hardware, data protection tools and asset recovery services for portable computers.
Risk assessment actions should include identification of which personnel actually need to carry which types of data into the field. Guidelines can be promulgated for storing certain types of data only on well-protected servers, rather than portable hard disks, or for archiving unused but still-sensitive files to secured removable-media vaults.
What makes IT worth what it costs is its productivity gain over nonautomated systems; that gain is the dividend that flows from investment in packaged and custom software.
Every aspect of software availability must be scrutinized and addressed. Specific risk assessment steps include the identification of all software and hardware elements—perhaps including license files or authentication tokens—that need to be present for a particular application to be usable, followed by preparation of contingency plans for any disruption of those resources.
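One way to operationalize that enumeration is a simple availability scan over each application's required elements. In this Python sketch, the application names and file paths are invented for illustration:

```python
# Hypothetical inventory check: each application lists every element
# (binaries, license files, authentication tokens) it needs in order
# to be usable, and a scan flags missing resources for contingency
# planning. Application names and paths are invented.
import os

app_dependencies = {
    "payables": ["/opt/payables/bin/payables", "/etc/licenses/payables.lic"],
    "reporting": ["/opt/reporting/bin/report", "/etc/tokens/report.tok"],
}

def missing_resources(deps, exists=os.path.exists):
    """Return, per application, the required elements the given check
    cannot find; an injectable predicate makes the scan testable."""
    gaps = {}
    for app, paths in deps.items():
        absent = [p for p in paths if not exists(p)]
        if absent:
            gaps[app] = absent
    return gaps

for app, absent in missing_resources(app_dependencies).items():
    print(f"{app}: contingency plan needed for {absent}")
```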
Software Soft Spots
Unlike other industrial machinery, software is easily and invisibly modified to serve the private ends of self-interested employees. The organization chart may be as revealing as the flow chart in identifying risk: Where a single employee has sole charge of any application, a tempting opportunity may arise.
Assessment steps to contain this risk are familiar to any bank auditor: Applications that deal with financial or other assets should be analyzed to ensure that more than one employee would need to be involved in any attempt to divert those resources for private gain. Even vacation schedules should be examined to ensure that fraudulent manipulation of systems cannot be indefinitely concealed.
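The dual-control rule that auditors apply can also be expressed directly in application logic. This minimal Python sketch (with invented employee IDs and a hypothetical Transfer structure) shows the idea:

```python
# Minimal sketch of dual control: a transfer of assets is released
# only after at least two distinct employees have approved it.
from dataclasses import dataclass, field

@dataclass
class Transfer:
    amount: float
    approvals: set = field(default_factory=set)

    def approve(self, employee_id: str) -> None:
        self.approvals.add(employee_id)

    def may_release(self) -> bool:
        """Require two different approvers before assets move."""
        return len(self.approvals) >= 2

t = Transfer(amount=25_000)
t.approve("emp-104")
t.approve("emp-104")   # the same employee approving twice does not count
assert not t.may_release()
t.approve("emp-381")
assert t.may_release()
```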
Though it may seem like a distant memory, a Y2K readiness effort may have developed a methodology for enumerating enterprise hardware, applications and data sets. This inventory should be kept current and should be continually cross-checked for currency of updates or patches using sources such as the SANS Institute, the Computer Emergency Response Team and SecurityFocus Bugtraq (see Web resources list, www.eweek.com/links).
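The cross-check of that inventory against advisories can be partly automated. In this Python sketch, the hosts, package names and advisory floor versions are all invented; in practice the advisory data would be drawn from sources such as those above:

```python
# Illustrative cross-check of an asset inventory against patch
# advisories. Hosts, packages and versions are invented examples.

inventory = {
    "web-01": {"httpd": "2.4.1"},
    "db-01": {"dbms": "9.0"},
}

# advisory floor: package -> minimum safe version, as comparable tuples
advisories = {"httpd": (2, 4, 5), "dbms": (9, 0)}

def parse(version: str) -> tuple:
    """Turn a dotted version string into a tuple for ordered comparison."""
    return tuple(int(part) for part in version.split("."))

def out_of_date(inv, adv):
    """List (host, package) pairs running below the advisory floor."""
    findings = []
    for host, packages in inv.items():
        for pkg, ver in packages.items():
            if pkg in adv and parse(ver) < adv[pkg]:
                findings.append((host, pkg))
    return findings

print(out_of_date(inventory, advisories))
```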
Automated tools can scan networks, servers and individual PCs for common security loopholes. HFNetChk, developed by Shavlik Technologies LLC (www.shavlik.com), compares the patch status of Windows NT 4.0 and Windows 2000 installations (as well as Microsoft Corp.'s Internet Information Services, SQL Server and Internet Explorer) against a database maintained by Microsoft.
A free command-line version of HFNetChk is available via the URL in the Web resources list, and Linux sites can find many automated update tools at rpm.redhat.com/software.html.
More-complex vulnerabilities in networks, devices and databases can be detected with tool sets such as Safesuite from Internet Security Systems Inc. (www.iss.net); other network intrusion opportunities may surface under the scrutiny of the open-source Firewalk (www.packetfactory.net/Projects/Firewalk).
The burden of constant surveillance provides fertile ground for service-based offerings such as Qualys Inc.'s QualysGuard (www.qualys.com). Full-scale outside security audits are the next logical step, given the economies of leveraging sophisticated expertise across many organizations' sites. eWEEK Labs compared the approaches of several major security auditors earlier this year. (See eWEEK Labs' eValuation at www.eweek.com/article/0,3658,s%253D710%2526a%253D10784,00.asp.)
Individual users can also participate in assessment using online scanners such as Gibson Research Corp.'s ShieldsUp (www.grc.com) or Symantec Corp.'s Internet Security diagnostics (www.symantec.com/securitycheck), both of which perform batteries of tests and provide illustrated tutorials on eliminating the vulnerabilities they find. Network Associates Inc.'s McAfee Asap (www.mcafeeasap.com) also offers a broad range of tools and resources.
IT managers may find that users become more security conscious when they are encouraged to find vulnerabilities themselves, but some threat-mitigation measures can disrupt PC functions if incorrectly performed. Managers should not encourage users to venture beyond their competence in attempting to modify system configurations or install software patches.
Using anti-virus software has become an act of basic due diligence; failure to install and maintain anti-virus tools might jeopardize recovery in future claims against a business continuity insurance policy.
Managers should verify that anti-virus tools are installed, that signature update intervals are monitored and that anti-virus vendors' own updates are also monitored and installed to prevent the tools from being turned against themselves—for example, through perversion of auto-update features into entry points for attack.
Remember, though, that the enemy is lost productivity, not viruses as such. This means that virus hoaxes can be almost as much of a problem as the real thing. Authoritative hoax lists, useful for dispelling users' fears and maintaining the credibility of actual threat warnings, include those maintained by Symantec and McAfee.com Corp.
Managers should also monitor e-mail for sudden bursts of activity that might reveal either an actual worm attack or a panic of hoax-inspired warning messages.
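Such burst monitoring can start with something as simple as comparing each hour's outbound message count against a trailing average. The window size and threshold factor in this Python sketch are assumptions to be tuned, not standards:

```python
# Simple sketch of e-mail burst detection: flag any hour whose message
# count far exceeds the trailing mean. Window and factor are assumed
# values to be tuned for a real mail volume.

def burst_hours(hourly_counts, window=24, factor=5.0):
    """Return indices of hours whose count exceeds factor x the
    trailing mean of the previous `window` hours."""
    flagged = []
    for i in range(window, len(hourly_counts)):
        baseline = sum(hourly_counts[i - window:i]) / window
        if baseline > 0 and hourly_counts[i] > factor * baseline:
            flagged.append(i)
    return flagged

counts = [40] * 24 + [38, 41, 900, 42]  # a sudden worm- or hoax-driven spike
print(burst_hours(counts))
```

A real deployment would draw the counts from mail-server logs, but even this crude rule catches the step change that distinguishes a worm outbreak, or a storm of hoax-inspired warnings, from normal traffic.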
Long(er) Arm of Law
The value of your own information assets should not be the only consideration in deciding whether to use such tools. Failure to secure one's IT installation might lead to liability if that facility becomes enrolled in a distributed attack against some other site.
IT managers should discuss with corporate risk management professionals the extent of an organizations network interactions with suppliers and customers and should participate in drafting appropriate agreements that limit liability for consequential damage not directly caused by the organizations own actions. Some IT security case law does exist and should not be ignored.
When private systems are made accessible via public networks, it's the obligation of the system's owner to post a "no trespassing" sign—an opening banner with a statement that access is limited to authorized users.
This and every other interface between enterprise IT systems and any user, internal or (especially) external, should be examined for appropriate advisory notice. Literally centuries of legal doctrine support the rights of property owners—but only when those rights are explicitly asserted.
Most IT risks, sad to say, can't be addressed by merely staking one's claim: In fact, they require continual re-examination as they reappear in new guise. It's therefore essential to think of IT security as a continuing cycle.
Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.