Google is experimenting with a new method to drastically speed up the transfer of large datasets on the company’s cloud storage platform. The process, called Online Cloud Import for Google Cloud Storage, is now in limited preview mode for selected customers to use and test as the system is refined.
The new data importing tool was announced by Lamia Youseff, a Google product manager, in a June 19 post on the Google Cloud Platform Blog. “Our customers tell us that importing multi-terabyte datasets, such as log files and media libraries, from other cloud-based 3rd party storage providers can be slow and expensive. Today, we’re introducing a limited preview of Online Cloud Import for Google Cloud Storage which makes it faster, easier and cheaper to import online data to your Cloud Storage buckets (standard or regional in U.S. or Europe) using Google’s high performance network.”
Interested customers can sign up to participate in the limited preview, wrote Youseff. “Online Cloud Import provides a number of options that make data transfers and synchronization easier. Not only can you create a one-time backup/transfer of your data to your Cloud Storage bucket, but you can also set a periodic scheduled synchronization of your 3rd party storage to Google Cloud Storage.”
Users can also configure their desired synchronization using a set of advanced filters based on file creation dates, filename patterns and the times of day they prefer to import data, wrote Youseff. Once the data is synchronized, users will receive an email notification with complete details about the process.
The Online Cloud Import feature complements both gsutil and Cloud Storage Offline Disk Import, she wrote. “Cloud Storage Offline Disk Import is suitable if you have hundreds of terabytes of offline data—disk-upload centers are available in Switzerland, Japan, India and the United States. If you’d like command-line control, and are dealing with less than 10TB, we suggest using gsutil (now with rsync) to transfer your data.”
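For the sub-10TB command-line path Youseff describes, a transfer might look like the following sketch. It assumes gsutil is installed and authenticated against your project; the local directory and bucket name are hypothetical placeholders.

```shell
# One-time copy of a local media library into a Cloud Storage bucket
# ("my-example-bucket" is a hypothetical placeholder name).
gsutil -m cp -r ./media-library gs://my-example-bucket/media-library

# On later runs, use rsync to transfer only what changed:
# -m parallelizes the transfer, -r recurses into subdirectories,
# and -n performs a dry run that lists the operations without copying.
gsutil -m rsync -r -n ./media-library gs://my-example-bucket/media-library
```

Dropping the `-n` flag performs the actual synchronization once the dry-run output looks right.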
Google frequently adds new services and tweaks its Google Cloud Platform for users and developers.
In April 2014, Google announced the availability of its Google Cloud Platform services in the Asia Pacific region as it moved to expand the reach of its cloud services to more developers around the world.
Earlier in April, Google unveiled lower pricing for Google Cloud Platform users through “Sustained Use Discounts,” which the company made available to users who run large projects on virtual machines. Under the new pricing scheme, users save more as they use more virtual machines in the Google Cloud.
In March 2014, Google introduced a new Google APIs Client Library for .NET and improved documentation for using third-party Puppet, Chef, Salt and Ansible configuration-management tools, according to an eWEEK report. The new Google APIs Client Library for .NET is an open-source effort, distributed through NuGet, that lets developers building on the Microsoft .NET Framework integrate their desktop or Windows Phone applications with Google’s services. The library covers more than 50 Google APIs for Windows developers.
Also released in March was a new Google paper, “Compute Engine Management with Puppet, Chef, Salt, and Ansible,” which provides guidance for Google Cloud Platform developers who want to use those configuration-management tools.
In October 2013, Google replaced its old Google API Console with a new, expanded and redesigned Google Cloud Console to help developers organize and use the more than 60 APIs offered by Google.
Earlier in October, the company released several technical papers to help cloud developers learn more about the development tools it offers through its Google Compute Engine services. The papers, “Overview of Google Compute Engine for Cloud Developers” and “Building High Availability Applications on Google Compute Engine,” offer insights and details about how the platform can be used and developed for business users.