Google has announced general availability of Quick Access, a Google Drive feature that until now has been available only to G Suite customers on Android.
The feature uses machine learning to predict the files a user is likely to look for in Drive and presents them before the individual has typed anything into the search field.
Quick Access is designed to reduce the amount of time people spend searching for files and documents stored in Drive, Google software engineer Sandeep Tata said in a blog post.
It uses deep neural networks to detect patterns in the user's activity and in other data points, such as meetings on their Calendar or interactions with colleagues, and makes an educated guess about which files they will need in Drive. Quick Access then presents those documents or files on the user's Google Drive home screen, sparing the user the need to search for them, Tata said.
“For example, if you have a Calendar entry for a meeting with a coworker in the next few minutes, Quick Access might predict that the presentation you’ve been working on with that coworker is more relevant compared to your monthly budget spreadsheet or the photos you uploaded last week,” he said.
Similarly, if a user regularly updates a particular spreadsheet every weekend, Quick Access will present that spreadsheet at the top of the Drive home screen on weekends, he said.
Traditionally, machine-learning approaches have required domain experts to recognize patterns and train models that make predictions from disparate data points. For Quick Access, Google's approach is to use deep neural networks that learn from aggregated user activity. “By using deep neural networks we were able to develop accurate predictive models with simpler features and less feature engineering effort,” Tata said.
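To make that idea concrete, here is a minimal sketch, in Python with NumPy, of how a small neural network could score a handful of Drive files from simple activity signals. Everything in it is hypothetical: the feature names, file names and weights are invented for illustration, and a production system like Quick Access would learn its parameters from aggregated user activity rather than use hand-set values.

```python
import numpy as np

# Hypothetical per-file features; the real signals Google uses are not public.
# Each vector: [overlaps_upcoming_meeting, shared_with_meeting_attendee,
#               hours_since_last_edit, edited_on_same_weekday_before]
candidates = {
    "Q3 planning deck":      np.array([1.0, 1.0,   3.0, 0.0]),
    "Monthly budget sheet":  np.array([0.0, 0.0,  70.0, 0.0]),
    "Weekend tracker sheet": np.array([0.0, 0.0, 160.0, 1.0]),
}

# Placeholder parameters for a tiny two-layer network. In a real system these
# would be learned from activity logs, not hand-set as they are here.
W1 = np.array([[1.2, 0.8, -0.02, 0.9],
               [0.5, 0.4, -0.01, 1.1]])
b1 = np.array([0.1, 0.0])
W2 = np.array([1.5, 1.0])
b2 = -0.5

def relevance(x: np.ndarray) -> float:
    """Score one file: a hidden ReLU layer, then a sigmoid output in [0, 1]."""
    h = np.maximum(0.0, W1 @ x + b1)              # hidden layer with ReLU
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))   # probability-like score

# Rank candidate files by predicted relevance, highest first.
ranked = sorted(candidates.items(), key=lambda kv: relevance(kv[1]), reverse=True)
for name, feats in ranked:
    print(f"{relevance(feats):.2f}  {name}")
```

The point of the sketch is the shape of the approach: raw activity signals go in, a learned network maps them to a relevance score, and the highest-scoring files are surfaced on the home screen.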
Google’s research has shown that Quick Access takes users to the documents they are looking for in roughly half the time it would take them to find those documents by searching Drive manually, Tata said.
Google introduced Quick Access last September for Android users of its G Suite cloud application suite. Starting this week, the feature is also generally available to Google Drive users on the web and on iOS devices.
Quick Access is an example of how Google plans to use machine learning and artificial intelligence to make its products and services smarter and easier to use.
Google CEO Sundar Pichai has said the company eventually wants to reach a point where people can interact with its products and services in a highly personalized manner using conversational speech and commands.
Some of Google’s most popular services already use machine learning extensively. The company’s language translation service, for instance, uses machine learning for speech recognition and to translate spoken and written words from one language to another. Similarly, image recognition underpins YouTube thumbnails and the search features in Google Photos, letting people find photos by label, by subject and by other attributes.
Google’s Smart Reply feature in its Inbox email client is another example. The feature uses machine intelligence to determine whether an email can be answered with a short response, and then composes a few suggested replies from which the user can choose one to send.
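As a rough illustration of that two-stage idea, the toy sketch below first decides whether a short reply is appropriate at all and then ranks a small pool of canned responses. The heuristics and the reply pool are made up for illustration; the actual Smart Reply feature relies on neural sequence models trained on real mail, which this does not attempt to reproduce.

```python
# Toy two-stage sketch: (1) triage whether a short reply fits, (2) suggest a few.
REPLY_POOL = [
    "Sounds good, thanks!",
    "Yes, that works for me.",
    "Sorry, I can't make it.",
    "Let me check and get back to you.",
]

def short_reply_applies(email_body: str) -> bool:
    """Stage 1 (triage): here, only short question-like emails qualify."""
    return len(email_body.split()) < 60 and "?" in email_body

def suggest_replies(email_body: str, k: int = 3) -> list[str]:
    """Stage 2: rank canned replies by naive word overlap with the email."""
    email_words = set(email_body.lower().split())
    scored = sorted(
        REPLY_POOL,
        key=lambda reply: len(email_words & set(reply.lower().split())),
        reverse=True,
    )
    return scored[:k]

email = "Can you make the 3pm sync tomorrow?"
if short_reply_applies(email):
    print(suggest_replies(email))
```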