When Google announced Assistant, its digital assistant technology, earlier this year, CEO Sundar Pichai described it as critical to the company’s effort to enable a more conversation-based, artificial-intelligence-centric computing model in the coming years.
This week, Google took a step toward executing on that vision by opening the digital assistant’s development platform to third-party application developers. The goal is to allow more services to be accessed directly through Assistant than are currently supported.
Assistant is essentially Google’s version of digital assistants like Amazon’s Alexa, Microsoft’s Cortana and Apple’s Siri in that it allows users to engage with applications and services using ordinary conversation. As with other digital assistants, people can do everything from conducting web searches to making airplane or restaurant reservations using Google Assistant.
Google has said that it will integrate Assistant into nearly all of its hardware products over the next few years, including tablets, smartphones and its Google Home voice-activated speaker.
With this week’s announcement, Google has opened Actions on Google, the development platform for Google Assistant, to third-party developers.
Google previewed Actions at a developer conference in October. At the time, the company described Actions as an open developer platform that would let anyone build apps for Google Assistant. Developers can use the platform to build conversational interfaces so their applications and services are accessible via Assistant.
The platform lets developers add what Google describes as either Direct Actions or Conversation Actions to their applications. Developers of home automation or food ordering apps, for instance, can use Direct Actions to write interfaces that let users control lights or order food directly through Google Assistant.
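Conceptually, a Direct Action is a single-shot command: one utterance maps to one fulfillment call, with no follow-up dialog. The sketch below illustrates that idea in Python; the function and intent names (`handle_direct_action`, `lights.on`, `food.order`) are hypothetical, not part of the actual Actions on Google API.

```python
# Hypothetical sketch of Direct Action dispatch: a single spoken command
# is resolved to one handler and answered immediately, with no dialog.
# Intent names and parameters here are illustrative only.

def handle_direct_action(intent: str, params: dict) -> str:
    """Map a one-shot command to a response, no follow-up questions."""
    if intent == "lights.on":
        room = params.get("room", "living room")
        return f"Turning on the lights in the {room}."
    if intent == "food.order":
        item = params.get("item", "a pizza")
        return f"Ordering {item} now."
    return "Sorry, I can't do that yet."

print(handle_direct_action("lights.on", {"room": "kitchen"}))
```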
Conversation Actions, on the other hand, are designed to give developers a way to build an interface in which users can carry on a two-way dialog with their application through Assistant. For example, an individual might use Google Assistant to summon a ride-sharing service.
The ride-sharing app would use Assistant to initiate a two-way conversation to gather details such as the type of vehicle required, the number of people traveling, the destination and the pickup location.
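The back-and-forth described above amounts to slot filling: the app keeps asking questions until every detail of the ride request is captured. Here is a minimal Python sketch of that dialog loop; the slot names, prompts and functions are hypothetical illustrations, not Google’s actual Conversation Actions API.

```python
# Hypothetical sketch of a Conversation Action's multi-turn dialog.
# Each turn stores the user's answer, then either asks the next
# unanswered question or confirms the completed ride request.

RIDE_SLOTS = {
    "vehicle": "What type of vehicle do you need?",
    "passengers": "How many people are traveling?",
    "destination": "Where are you headed?",
    "pickup": "Where should the driver pick you up?",
}

def next_prompt(state):
    """Return the next question to ask, or None when all slots are filled."""
    for slot, prompt in RIDE_SLOTS.items():
        if slot not in state:
            return prompt
    return None

def handle_turn(state, slot, answer):
    """Record one answer, then ask the next question or confirm the booking."""
    state[slot] = answer
    prompt = next_prompt(state)
    if prompt:
        return prompt
    return (f"Request complete: {state['vehicle']}, "
            f"{state['passengers']} passengers, pickup at {state['pickup']}, "
            f"destination {state['destination']}.")

# Example dialog, driven turn by turn:
state = {}
print(handle_turn(state, "vehicle", "SUV"))        # asks about passengers
print(handle_turn(state, "passengers", "3"))       # asks for destination
print(handle_turn(state, "destination", "the airport"))
print(handle_turn(state, "pickup", "Main Street")) # confirms the request
```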
Starting this week, developers can build Conversation Actions for Google Home using the development platform for Assistant, said Jason Douglas, director for Actions on Google, in a blog post.
To ease development, Google has made available several conversational interaction tools that developers can use to build interfaces for their applications, Douglas said.
Google’s development platform also supports analytics tools such as VoiceLabs and DashBot that developers can use when integrating their products with Assistant. In addition, Google has compiled a set of samples and voice user interface resources that developers can refer to when building interfaces for their applications and services, Douglas said.
Initially, Actions on Google will be available only for integration with Google Home, but Google will soon release Actions for Pixel smartphones and for the Google Allo smart messaging app. Also in the cards are Actions for purchase and booking applications, he added.