Eight Reasons Why Your Server Security Is Insufficient

By Chris Preimesberger  |  Posted 2015-08-05

Conventional interconnected IT environments—whether virtualized, cloud-enabled or neither—leave organizations more vulnerable to data breaches than ever before. With growing numbers of mobile users and virtual workloads, more application programming interface (API) integrations, rich partner and cloud interconnections, and rapid application adoption, it is no longer possible to rely on zone-based perimeter security, and attack surfaces keep expanding. The armored-car approach—bullet-proofing the central server/networking/storage complex—simply isn't doing the job, and even newer techniques such as micro-segmentation, which divides a network into smaller zones and makes security adaptive and multilayered, remain unproven. What's a data center manager to do? Objective self-evaluation is necessary. eWEEK, drawing on its own archives, research from Forrester Research and industry insight from Skyport Systems, examines in this slide show the most common reasons why servers, and the data they hold, are still as vulnerable as ever.

There Are Generally Too Many Parts

Market fragmentation has produced myriad specialized security tools that are used in combination to protect server infrastructure, but they were never designed to work together. Given the operational complexity and cost of assembling, integrating, operating and updating such a collection, IT is forced to take a minimalist approach and cherry-pick among the tools. The result is deployments that are incomplete or unmanageable.

Patching Servers Is Way Too Slow

Scanning for and identifying operating system or application vulnerabilities is nearly instantaneous; verifying a fix and rolling patches out across a fleet of servers is not. Organizations have to rely on patches to defend their systems, yet there is no feasible way to close that gap.
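
To see why this race is so lopsided, consider a minimal sketch in Python. The inventory and advisory data are entirely hypothetical; the point is that matching installed versions against known-bad ones takes moments, while the remediation it triggers does not.

    # Hypothetical inventory: host -> {package: installed version}
    inventory = {
        "web-01": {"openssl": "1.0.1f", "nginx": "1.4.6"},
        "web-02": {"openssl": "1.0.1g", "nginx": "1.4.6"},
        "db-01":  {"openssl": "1.0.1f", "postgresql": "9.3.3"},
    }

    # Hypothetical advisory feed: package -> known-vulnerable versions
    advisories = {
        "openssl": {"1.0.1f"},
        "postgresql": {"9.3.3"},
    }

    def find_vulnerable(inventory, advisories):
        """Return (host, package, version) triples that match an advisory."""
        hits = []
        for host, packages in inventory.items():
            for pkg, version in packages.items():
                if version in advisories.get(pkg, set()):
                    hits.append((host, pkg, version))
        return hits

    for host, pkg, version in find_vulnerable(inventory, advisories):
        print(f"{host}: {pkg} {version} is vulnerable; schedule a patch window")

The scan itself finishes almost immediately; the change approvals, testing and reboots that follow are the slow part.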

Verifying the Integrity of Platforms Is Difficult

It is difficult to verify that only the intended software and firmware are running on a system, especially as conditions change through maintenance, upgrades and component swaps. Rootkits and other advanced malware are sophisticated and built to evade detection. Traditional host-based measurements assume that parts of the underlying operating system and firmware can be trusted, which is frequently not the case.
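
The following is a minimal sketch of host-based measurement using Python's standard hashlib; the file paths and baseline digests are placeholders, and, as noted above, the check is only as trustworthy as the operating system and tooling that run it.

    import hashlib
    from pathlib import Path

    # Hypothetical baseline of known-good digests recorded when the host was built
    baseline = {
        "/usr/sbin/sshd": "placeholder-sha256-digest-recorded-at-build-time",
        "/usr/bin/sudo": "placeholder-sha256-digest-recorded-at-build-time",
    }

    def sha256_of(path):
        """Hash a file in chunks so large binaries do not exhaust memory."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    for path, expected in baseline.items():
        if not Path(path).is_file():
            print(f"MISSING   {path}")
        elif sha256_of(path) != expected:
            print(f"MODIFIED  {path}")
        else:
            print(f"OK        {path}")

Hardware-rooted measurement, such as a TPM-backed boot chain, exists precisely because a purely software check like this can be subverted by the rootkits described above.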

Limited Visibility of Malware Inside Networks

Network-based measures offer only limited visibility into the communications and malicious actions that malware carries out once it is inside the network, and they are usually complex to operate.
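
As an illustration of the east-west visibility that is usually missing, here is a minimal sketch that inspects flow records for internal-to-internal connections on unexpected ports. The addresses, ports and expected-traffic list are hypothetical; a real deployment would read flow-log exports rather than a hard-coded list.

    import ipaddress

    INTERNAL = ipaddress.ip_network("10.0.0.0/8")
    EXPECTED_PORTS = {443, 5432}            # hypothetical sanctioned east-west traffic

    # Hypothetical flow records: (source, destination, destination port)
    flows = [
        ("10.0.1.10", "10.0.2.20", 5432),   # web tier -> database, expected
        ("10.0.2.20", "10.0.1.11", 445),    # database -> web tier over SMB, suspicious
        ("10.0.1.10", "8.8.8.8", 53),       # outbound DNS, not east-west
    ]

    def is_internal(address):
        return ipaddress.ip_address(address) in INTERNAL

    for src, dst, port in flows:
        if is_internal(src) and is_internal(dst) and port not in EXPECTED_PORTS:
            print(f"suspicious east-west flow: {src} -> {dst}:{port}")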

Applications Are Not Secured on an Individual Basis

Each application has its own trust boundaries, and none can inherently trust its network neighbors. Establishing a per-application security perimeter with granular policy controls, along with the ability to update protections before an application patch is available, is a must. Hardening applications this way reduces exposure and produces contextually relevant audit information.
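
A minimal sketch of what a per-application policy could look like follows; the application names, peers and ports are hypothetical. Each application carries its own allow list, and unknown applications are denied by default.

    from dataclasses import dataclass, field

    @dataclass
    class AppPolicy:
        name: str
        allowed_peers: set = field(default_factory=set)   # callers permitted to connect
        allowed_ports: set = field(default_factory=set)   # ports the app may expose

    policies = {
        "billing-api": AppPolicy("billing-api", {"web-frontend"}, {8443}),
        "reporting":   AppPolicy("reporting", {"billing-api"}, {8080}),
    }

    def is_allowed(app, peer, port):
        policy = policies.get(app)
        if policy is None:
            return False                                   # default deny
        return peer in policy.allowed_peers and port in policy.allowed_ports

    for app, peer, port in [("billing-api", "web-frontend", 8443),
                            ("reporting", "web-frontend", 8080)]:
        verdict = "allow" if is_allowed(app, peer, port) else "deny"
        print(f"{verdict}: {peer} -> {app}:{port}")

Because the policy is keyed by application rather than by network zone, a denied request can be logged with the application name attached, which is what makes the audit information contextually relevant.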

Servers Are Vulnerable to Lateral Attacks

Organizations rely on firewalls to limit who and what can communicate with their applications and servers. But when a server behind the firewall is compromised, there is little to stop it from probing and attacking neighboring systems. Effective protection against lateral movement is hard to achieve because it is difficult to implement, audit, maintain and scale.
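
One common mitigation is a default-deny policy on every host, not just at the perimeter. The sketch below generates, but does not apply, iptables rules for a hypothetical database server that should accept connections only from the web tier; the subnet and port are illustrative.

    ALLOWED_SOURCES = ["10.0.1.0/24"]   # hypothetical web-tier subnet
    SERVICE_PORT = 5432                 # hypothetical database port

    rules = [
        "iptables -P INPUT DROP",                                         # default deny
        "iptables -A INPUT -i lo -j ACCEPT",                              # allow loopback
        "iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT",
    ]
    for cidr in ALLOWED_SOURCES:
        rules.append(f"iptables -A INPUT -p tcp -s {cidr} --dport {SERVICE_PORT} -j ACCEPT")

    # Print rather than execute so the sketch has no side effects
    for rule in rules:
        print(rule)

Micro-segmentation products aim to automate exactly this kind of per-workload policy so that a compromised neighbor cannot roam freely.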

Insecure Lights-Out Management

When servers are not managed locally, secure remote management across the entire software stack is essential yet hard to lock down. Lock-down reaches into the BIOS, drivers, firmware, hypervisor and any security software as part of regular patch and upgrade procedures. Configuring the system so that only approved administrators and tools can perform remote management, without opening it up to control by attackers, is a challenge.
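
A minimal sketch of one narrow slice of that lock-down: auditing whether management-plane ports are reachable from anywhere other than an approved administrative subnet. The rule data is hypothetical; port 22 is SSH and port 623 is the standard IPMI/RMCP port used by many baseboard management controllers.

    import ipaddress

    MGMT_NET = ipaddress.ip_network("10.10.0.0/24")   # hypothetical admin subnet
    MGMT_PORTS = {22, 623}                            # SSH and IPMI/RMCP

    # Hypothetical firewall allow rules: (source CIDR, destination port)
    allow_rules = [
        ("10.10.0.0/24", 22),
        ("0.0.0.0/0", 623),   # IPMI reachable from anywhere: a misconfiguration
    ]

    for cidr, port in allow_rules:
        if port in MGMT_PORTS and not ipaddress.ip_network(cidr).subnet_of(MGMT_NET):
            print(f"management port {port} exposed to {cidr}; restrict to {MGMT_NET}")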

Unauthorized Access in Hostile Locations

Uncontrolled access to a system is possible when the location where the server resides is not physically secure, or when confidence in the local staff's motivations is lacking. Attackers with physical access to the server or its environment can install malware through USB and console ports, take administrative control of the device, or plant attack and snooping tools.