Traditional Capacity Management Approaches

 
 
By Imad Mouline  |  Posted 2010-10-08

Within this context, many traditional approaches to capacity management are now outdated. First and foremost, traditional approaches often manage capacity based on utilization instead of managing it based on the response times that users experience when going through critical workflows on a Website. For example, an organization may determine that a particular Web server in its data center is operating at 50 percent CPU utilization and can therefore handle more load. However, this approach leaves a blind spot where performance is concerned because utilization is not linearly related to system performance and does not convey the point at which response times begin to slow down.
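The nonlinear relationship between utilization and response time can be illustrated with the classic M/M/1 queueing approximation (an illustration chosen here, not a model the article itself specifies), in which mean response time is the service time divided by one minus utilization. It shows why a server at 50 percent utilization can look comfortable while being much closer to a response-time cliff than the utilization number suggests.

```python
# Illustrative sketch (an assumption, not from the article): the M/M/1
# queueing model R = S / (1 - rho) shows why utilization alone hides
# response-time risk.

def mm1_response_time(service_time_s: float, utilization: float) -> float:
    """Mean response time for an M/M/1 queue with service time S and
    utilization rho: R = S / (1 - rho)."""
    if not 0.0 <= utilization < 1.0:
        raise ValueError("utilization must be in [0, 1)")
    return service_time_s / (1.0 - utilization)

# With a 100 ms service time, response time merely doubles at 50%
# utilization but grows without bound as utilization approaches 100%.
for rho in (0.50, 0.80, 0.95, 0.99):
    print(f"{rho:.0%} utilization -> {mm1_response_time(0.1, rho):.2f} s")
```

Under this model, the step from 50 percent to 95 percent utilization multiplies response time tenfold, which is precisely the kind of degradation a utilization-only view fails to convey.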

In addition, traditional approaches often test application performance from inside the data center, relying on synthetic traffic generated from one's own servers in order to gauge the speed and availability of Websites and applications under various load sizes. The issue with this approach is that users don't live in the data center. They live at the outer edges of the Internet and their experiences are subject to an extremely wide range of performance-impacting variables beyond the data center (including not just third-party service providers but also ISPs, carriers, content delivery networks and other elements).
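Measuring from the user's side of the network, rather than from inside the data center, can be as simple as timing a full page fetch from an external vantage point. The sketch below is a hypothetical minimal example (the URL is a placeholder, and real monitoring services measure far more, such as DNS, connect, and render times):

```python
# Hypothetical sketch: time an HTTP fetch from a client's vantage point,
# capturing everything between request and full response body. The URL
# passed in is a placeholder, not one named by the article.
import time
import urllib.request

def measure_response_time(url: str, timeout: float = 10.0) -> float:
    """Return wall-clock seconds to fetch the complete response body."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include transfer time, not just time to first byte
    return time.monotonic() - start
```

Run from a real user location, this captures the ISP, carrier, CDN, and third-party delays the article describes; run from inside the data center, it sees almost none of them.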

A composite application is more than the sum of its components; it is the whole that emerges when all the components described earlier, which often have interdependencies, work in concert. If one component slows down, the components that depend on it cannot be invoked in a timely manner, and the result is a cascading degradation of application performance.

So testing only from inside the data center never reveals how a user ultimately sees and experiences an application. Ironically, it is this user experience that ultimately determines the success of a Web initiative. Internal, data center-focused approaches therefore prevent capacity managers from identifying capacity needs beyond the firewall, which they must do to ensure seamless execution and delivery of composite applications.




 
 
 
 
Imad Mouline is CTO of Gomez and a veteran of software architecture, research and development. He is a recognized expert in Web application development, testing and performance management, with expertise spanning Web 2.0, cloud computing, Web browsers, Web application architecture and infrastructure, and software as a service. Prior to Gomez, Imad was CTO at S1 Corp. He can be reached at imouline@gomez.com.
 
 
 
 
 
 
 
