Several new features in the Google Cloud Storage environment aim to make it easier for developers to manage, access and upload data into the cloud.
The new capabilities, including automatic deletion policies, regional buckets and faster uploads, were revealed in a July 22 post on the Google Cloud Platform Blog by Brian Dorsey, a Google developer programs engineer.
“With a tiny bit of up-front configuration, you can take advantage of these improvements with no changes to your application code—and we know that one thing better than improving your app is improving your app transparently,” wrote Dorsey.
The new Object Lifecycle Management feature allows Cloud Storage to delete objects automatically based on conditions the developer sets, according to Dorsey. “For example, you could configure a bucket so objects older than 365 days are deleted, or only keep the three most recent versions of objects in a versioned bucket. Once you have configured Lifecycle Management, the expected expiration time will be added to object metadata when possible, and all operations are logged in the access log.”
Developers can also use Object Lifecycle Management alongside Object Versioning to limit the number of older versions of objects that are retained, he wrote. “This can help keep your apps cost-efficient while maintaining a level of protection against accidental data loss due to user application bugs or manual user errors.”
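Dorsey’s two examples map onto the lifecycle configuration format that Cloud Storage accepts. As a rough sketch (the bucket name is a placeholder), a policy combining both rules could be written and applied with gsutil like this:

```shell
# Sketch of an Object Lifecycle Management policy (bucket name is a
# placeholder). Two rules: delete objects older than 365 days, and in a
# versioned bucket keep only the three most recent versions of each object.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 365}},
    {"action": {"type": "Delete"}, "condition": {"numNewerVersions": 3}}
  ]
}
EOF

# Applying it requires gsutil and credentials for a bucket you own:
# gsutil lifecycle set lifecycle.json gs://example-bucket
```

Once set, the policy runs server-side; no application code needs to issue the deletes.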
The Regional Buckets feature allows developers to co-locate Durable Reduced Availability data in the same region as their Google Compute Engine instances to improve performance, wrote Dorsey. “Since Cloud Storage buckets and Compute Engine instances within a region share the same network fabric, this can reduce latency and increase bandwidth to your virtual machines, and may be particularly appropriate for data-intensive computations.”
Developers will still have ultimate control over which data centers are used, he wrote. “You can still specify the less-granular United States or European data center locations if you’d like your data spread over multiple regions, which may be a better fit for content distribution use cases.”
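The choice between regional and multi-region placement comes down to the location flag at bucket-creation time. A sketch with assumed bucket names (creating buckets requires gsutil and a billing-enabled project, so the commands are shown commented out):

```shell
# Regional: a Durable Reduced Availability bucket co-located with
# Compute Engine instances in one region.
#
#   gsutil mb -c DRA -l US-CENTRAL1 gs://example-dra-bucket
#
# Multi-region: the less-granular United States location, spreading data
# over multiple regions for content-distribution use cases.
#
#   gsutil mb -l US gs://example-multiregion-bucket
```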
The upload improvements arrive in gsutil version 3.34, the tool’s latest release, which now automatically uploads large objects in parallel for higher throughput, wrote Dorsey. “Achieving maximum TCP throughput on most networks requires multiple connections, and this makes it easy and automatic. The support is built using Composite Objects.”
More details about temporary objects can be found in the accompanying Parallel Composite Uploads documentation, wrote Dorsey. “To get started, simply use ‘gsutil cp’ [command] as usual. Large files are automatically uploaded in parallel.”
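The parallel behavior is governed by a boto configuration threshold: files larger than the threshold are split into components, uploaded concurrently, and composed server-side. A sketch with assumed file and bucket names (written to a local example file here rather than the real `~/.boto`; the threshold value is illustrative):

```shell
# Sketch: the boto config option that controls when gsutil switches to
# parallel composite uploads.
cat > boto.example <<'EOF'
[GSUtil]
parallel_composite_upload_threshold = 150M
EOF

# With the threshold configured, an ordinary copy of a large file is
# uploaded in parallel automatically (requires gsutil and credentials):
# gsutil cp large-video.mp4 gs://example-bucket
```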
Earlier in July, Google invited developers to participate in its new “Build Day” program for its Cloud Platform. The participating developers, who will be selected by Google after responding to a questionnaire, are being asked to help improve the Cloud Platform by offering their ideas and insights at one of several in-person, hands-on sessions over the next several weeks. The study involves developing an application using Google Cloud Platform services.
In June, Google unveiled a new Cloud Playground environment where developers can quickly try out ideas on a whim, without having to set up a local development environment, in a space that keeps coding experiments away from the production infrastructure. The Cloud Playground is slated as a place where application developers can try out all kinds of things, from sample code to viewing how production APIs will behave, in a safe, controlled place without having to manage the testing environment, according to Google. The new Cloud Playground is presently limited to supporting Python 2.7 App Engine apps.
The Cloud Playground is an open-source project that includes mimic, a regular Python App Engine app that serves as a development server, and bliss, a trivial browser-based code editor that lets users edit code in the mimic virtual file system, wrote Google developer advocate Fred Sauer.
Earlier in June, Google opened its Google Maps Engine API to developers so they can build consumer and business applications that incorporate the features and flexibility of Google Maps. By using the Maps API, developers can now use Google’s cloud infrastructure to add their data on top of a Google Map and share that custom mash-up with consumers, employees or other users. The maps can then be shared internally by companies or organizations or be published on the Web.
Google also recently created a new Mobile Backend Starter that lets developers focus on building and selling their apps by automating the back end of app development. The Mobile Backend Starter works with Google App Engine and was first announced at the Google I/O 2013 Developers Conference, where it was the topic of the “From Nothing to Nirvana in Minutes: Cloud Backend for Your Android Application” presentation.
In January, Google announced that it was moving its Google Cloud Platform (GCP) over to the GitHub collaborative development environment to make it easier for software developers to contribute to and continue the evolution of GCP. The GCP program has been growing since Google unveiled a new partner program in July 2012 to help business clients discover all of Google’s available cloud services. GitHub is a rapidly growing collaborative software development platform for public and private code sharing and hosting.