Google Releases Tools for Comparing Cloud Performance Across Vendors

By Jaikumar Vijayan  |  Posted 2015-02-11

PerfKit Benchmarker will enable apples-to-apples comparison of cloud offerings, Google claims.

Google has released a new open-source toolset for developers struggling to measure the relative performance of the Google Cloud Platform against cloud offerings from other vendors.

The company's new PerfKit Benchmarker offers what Google claims is a way for organizations to do an apples-to-apples comparison of application performance on cloud platforms from different service providers.

It allows companies to compare metrics like application throughput, performance variances, latency and overhead, and the time it takes for an organization to provision resources in the cloud. The benchmarking framework is based on input from other cloud providers, analysts and cloud performance experts in academia, Google said in a blog post Feb. 11.

Among the more than 30 organizations that contributed to the effort are Cisco, Rackspace, Broadcom, Canonical, Red Hat and CloudHarmony. In addition, researchers from MIT and Stanford University will contribute to the ongoing effort to keep the benchmarks updated and relevant.

"PerfKit is unique because it measures the end to end time to provision resources in the cloud, in addition to reporting on the most standard metrics of peak performance," Google said.

According to an official description, PerfKit Benchmarker is designed to operate via vendor-provided command-line tools. "It instantiates VMs on the Cloud provider of your choice, automatically installs benchmarks, and runs the workloads without user interaction," according to the description.
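That workflow can be sketched as follows. The flag names mirror the project's public README, but the machine types are placeholders and a real run requires credentials for the target cloud, so the provider commands are shown as comments rather than executed:

```shell
# Illustrative PerfKit Benchmarker session (flag names follow the project's
# README; machine types and cloud credentials are placeholders).
#
# Fetch the open-source tool:
#   git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker
#
# Run the same iperf network benchmark on two providers, changing only the
# --cloud flag, which is the apples-to-apples comparison Google describes:
#   ./pkb.py --cloud=GCP --benchmarks=iperf --machine_type=n1-standard-1
#   ./pkb.py --cloud=AWS --benchmarks=iperf --machine_type=m3.medium
echo "same workload, one flag changed per cloud"
```

Because the tool provisions the VMs, installs the benchmark and tears everything down itself, the elapsed time it reports includes the end-to-end provisioning cost that Google says distinguishes PerfKit from peak-performance benchmarks.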

Google will constantly update the benchmarks to make sure they measure the latest workloads so organizations can make more informed decisions about their infrastructure requirements, the company noted in its blog.

"We'll adapt PerfKit to keep it current," Google noted. "It already includes several well-known benchmarks, and covers common cloud workloads that can be executed across multiple cloud providers."

In addition to the benchmarking toolkit, Google has released a dashboarding and performance analysis tool called PerfKit Explorer to help developers interpret benchmark results. The data visualization tool ships with dashboards prepopulated with data from Google's internal network performance tests, so developers can explore it before loading any data of their own.

Google has made the source code for PerfKit Benchmarker and PerfKit Explorer available under open-source Apache 2 licensing terms.

The new benchmarking tools represent an effort by Google to make cloud performance comparisons easier. Infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) offerings have grown rapidly in recent years, and because each provider touts application performance using its own metrics, straightforward comparisons between cloud services have been hard to make.

According to a Google spokeswoman, the new tools will specifically allow organizations to compare virtual machines (VMs) and object storage across clouds and physical hardware. The company plans to expand the toolset to cover more IaaS, PaaS and software-as-a-service (SaaS) offerings in the future.

"It automates a number of workloads like Cassandra, Aerospike, Hadoop, SpecCPU and micro-benchmarks like iperf and netperf," she said.

CloudHarmony, one of the companies that contributed to Google's PerfKit, highlighted some of these difficulties in a blog post last year. "Because of the diverse deployment options and dissimilar features of different services, formulating relevant and fair comparisons is challenging to say the least," CloudHarmony founder Jason Read wrote. "In fact, we've come to the conclusion that there is no perfect way to do it."

Among the challenges Read highlighted in his report are finding virtual machine instances and services that are truly comparable across vendors, choosing appropriate workloads for testing and using the right benchmarks.

"With a lack of standardization in the IaaS industry, providers freely use unique terminology to describe their VM resource allocations," Cloud Spectator previously noted after performing cloud performance testing. "As the market quickly saturates with IaaS providers, the decision-making complexity of choosing the right provider evolves as well."
