Improving performance and availability

By Peter Melerud  |  Posted 2008-11-18

While these steps are important for cost savings and environmental reasons, data center operators are more focused on the performance and availability of networks, applications and equipment than on energy efficiency. Improving network efficiency serves a twofold purpose: it reduces costs and improves performance, with the added benefit of being green.

SMBs often mistakenly purchase too many servers and run them at very low utilization in order to have redundancy. A better way to maximize server efficiency is to consolidate and optimize them, because a server consumes energy whether it is running at full capacity or at only 10 percent. While power reduction matters for cost savings and environmental responsibility, performance is also a vital part of the equation.

Minimizing costs while maximizing performance

For an SMB, the choice between maximizing performance and lowering cost has become easier in recent years; performance no longer has to suffer in order to minimize costs. E-commerce is now central to many SMBs: 32 percent of them sell goods online, and these businesses need to handle the traffic connecting customers and suppliers. SMBs in a wide range of vertical markets also use intranets to share information among employees and extranets to link to their suppliers.

As the information being accessed and shared becomes more complex and bandwidth-intensive, it becomes more important that network and application infrastructure performance be optimized. Site availability is key to a successful e-commerce site, and it depends on ample bandwidth, memory, storage, redundancy, failover, load balancing and persistence.
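To make the failover piece of that list concrete, here is a minimal Python sketch of how a front end might check backend health and route requests only to servers that respond. The addresses, port and timeout are illustrative assumptions, not details of any particular product.

```python
# Minimal sketch (assumptions, not production code): health checks plus
# failover, so requests are routed only to backends that answer.
# Backend addresses and the timeout are placeholder values.
import random
import socket

BACKENDS = [("10.0.0.11", 8080), ("10.0.0.12", 8080), ("10.0.0.13", 8080)]

def is_healthy(host, port, timeout=1.0):
    """Treat a backend as healthy if it accepts a TCP connection in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_backend():
    """Pick one of the currently healthy backends; failover skips the rest."""
    pool = [b for b in BACKENDS if is_healthy(*b)]
    if not pool:
        raise RuntimeError("no healthy backends available")
    return random.choice(pool)

if __name__ == "__main__":
    try:
        print("next request goes to:", pick_backend())
    except RuntimeError as err:
        print("site unavailable:", err)
```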

Optimizing server efficiency

Deploying an Application Delivery Controller (ADC) has become a strategic means of achieving efficient Web site, Internet and intranet connectivity, backed by performance and energy conservation. An ADC distributes user requests across multiple servers within a server farm at a data center, enabling flexible and cost-efficient scaling of application performance. An ADC can measure elements such as the number of concurrent connections, memory utilization and much more. It also acts as a traffic cop between servers and users, directing traffic and thereby accelerating response times. ADCs improve a site's performance and reliability, and they can offload and accelerate SSL traffic.
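As a rough illustration of the scheduling decision an ADC makes, the hypothetical Python snippet below sends the next request to the server with the fewest active connections, one of the metrics mentioned above. The server names and counters are invented for the example; a real ADC weighs several such metrics continuously.

```python
# Hypothetical sketch of a least-connections scheduling decision,
# one of the metrics an ADC can weigh when distributing requests.
# Server names and connection counts are invented for illustration.

servers = {
    "web-1": {"active_connections": 42, "memory_utilization": 0.61},
    "web-2": {"active_connections": 17, "memory_utilization": 0.48},
    "web-3": {"active_connections": 30, "memory_utilization": 0.75},
}

def choose_server(pool):
    """Send the next request to the server with the fewest active connections."""
    return min(pool, key=lambda name: pool[name]["active_connections"])

target = choose_server(servers)
print(f"next request -> {target}")          # web-2 in this example
servers[target]["active_connections"] += 1  # record the new connection
```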

SSL offload/acceleration can dramatically decrease the number of physical servers needed to provide encrypted access to applications. Since SSL processing places a large burden on server CPUs, offloading and accelerating SSL at the ADC frees the servers from this compute-intensive work. SSL offload can therefore help further consolidate the number of servers needed to maintain optimum application performance. Placing SSL acceleration on the ADC rather than on the server improves the server's ability to deliver application requests, which in turn allows a site to handle more business transactions and to handle them faster. When servers operate more efficiently, the data center uses less power.
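A very reduced way to picture SSL offload: the TLS handshake and encryption terminate on the front-end device, and the request is passed to the backend over a plain connection, so the application server spends no CPU on cryptography. The standard-library Python sketch below is an assumption-laden illustration of that split (certificate paths and addresses are placeholders, and it handles one request per connection), not a depiction of how any specific ADC implements it.

```python
# Illustrative sketch of SSL termination ("offload") at a front-end proxy:
# TLS is decrypted here, and the backend only ever sees plaintext,
# so it spends no CPU on the handshake or encryption.
# Certificate paths and addresses are placeholder assumptions.
import socket
import ssl

BACKEND = ("10.0.0.11", 8080)   # plain HTTP application server
LISTEN = ("0.0.0.0", 443)       # TLS-facing side

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="proxy.crt", keyfile="proxy.key")

with socket.create_server(LISTEN) as listener:
    with context.wrap_socket(listener, server_side=True) as tls_listener:
        while True:
            client, _ = tls_listener.accept()    # TLS handshake for this connection
            with client:
                request = client.recv(65536)     # decrypted request bytes
                with socket.create_connection(BACKEND) as backend:
                    backend.sendall(request)     # forwarded in the clear
                    client.sendall(backend.recv(65536))  # response re-encrypted
```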

Peter Melerud is VP of Product Management at KEMP Technologies. Peter has over 20 years of experience designing, building and managing data centers for large corporations and financial institutions, as well as for small and medium-sized businesses. His broad technology expertise covers data center server and network communications infrastructure, enterprise business intelligence, data management, content security and compliance technologies. At KEMP Technologies, Peter is responsible for product management and business development of application delivery and load-balancing solutions for the small and medium enterprise (SME) infrastructure market. He can be reached at
