Google’s experimental self-driving cars have been out on the streets since 2010, but they are now getting one of their toughest assignments: learning how to operate amid dense traffic, crowds of pedestrians and other hazards of inner-city driving.
That’s the word from Google, which explained the project’s new focus on urban driving in an April 28 post on the Google Official Blog by Chris Urmson, director of the self-driving car project.
“A mile of city driving is much more complex than a mile of freeway driving, with hundreds of different objects moving according to different rules of the road in a small area,” wrote Urmson. “We’ve improved our software so it can detect hundreds of distinct objects simultaneously—pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn. A self-driving vehicle can pay attention to all of these things in a way that a human physically can’t—and it never gets tired or distracted.”
The work continues to teach the self-driving cars what they need to know ahead of any future expansion of the program. City traffic is far different from the freeway driving the cars have handled so far, wrote Urmson.
For example, city driving includes having to deal with jaywalking pedestrians, vehicles pulling out of hidden driveways, and double-parked delivery trucks blocking lanes and driver sight lines. “At a busy time of day, a typical city street can leave even experienced drivers sweaty-palmed and irritable,” he wrote. “We all dream of a world in which city centers are freed of congestion from cars circling for parking and have fewer intersections made dangerous by distracted drivers. That’s why over the last year we’ve shifted the focus of the Google self-driving car project onto mastering city street driving.”
Since Google’s last update on its self-driving car program in August 2012, the company has logged thousands of miles on the streets of Mountain View, Calif., where it is headquartered, wrote Urmson. And the lessons learned so far about city driving are not what one might expect.
“As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer,” he wrote. “As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it). We still have lots of problems to solve, including teaching the car to drive more streets in Mountain View before we tackle another town, but thousands of situations on city streets that would have stumped us two years ago can now be navigated autonomously.”
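Urmson’s description of building “software models of what to expect” suggests assigning probabilities to the likely and unlikely behaviors of road users. The sketch below is only an illustration of that general idea, not Google’s actual software: the behavior categories, probability values and the simple stopping-distance check are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical prior probabilities for how a tracked car behaves at a red light.
# The categories and values are illustrative only.
BEHAVIOR_PRIORS = {
    "stop_at_red_light": 0.97,   # the likely case
    "run_red_light": 0.03,       # the unlikely, but still modeled, case
}

@dataclass
class TrackedVehicle:
    track_id: int
    distance_to_light_m: float
    speed_mps: float

def predicted_behaviors(vehicle: TrackedVehicle) -> dict:
    """Return a probability for each modeled behavior of a tracked vehicle.

    A real planner would condition on far richer state (deceleration,
    signal timing, lane geometry); here we only nudge the priors based
    on whether the vehicle appears able to brake in time.
    """
    probs = dict(BEHAVIOR_PRIORS)
    # Rough stopping check: can the car stop before the light at ~3 m/s^2?
    stopping_distance = vehicle.speed_mps ** 2 / (2 * 3.0)
    if stopping_distance > vehicle.distance_to_light_m:
        # Unlikely to stop in time: shift probability toward running the light.
        probs["stop_at_red_light"] = 0.60
        probs["run_red_light"] = 0.40
    return probs

if __name__ == "__main__":
    car = TrackedVehicle(track_id=7, distance_to_light_m=12.0, speed_mps=15.0)
    print(predicted_behaviors(car))
```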
So far, the Google self-driving vehicle fleet has logged nearly 700,000 autonomous miles, “and with every passing mile we’re growing more optimistic that we’re heading toward an achievable goal—a vehicle that operates fully without human intervention,” wrote Urmson.
When Google started its self-driving vehicle project in 2010, it was seen as ground-breaking research into saving people’s lives, reducing time spent driving and easing pollution by curbing carbon emissions. The project began with Toyota Prius hybrid vehicles operated by trained drivers on roads and highways across California, and has since expanded to other vehicles.
Before an automated car takes to the road, Google sends out a driver to map the route and road conditions, logging lane markers and traffic signs to become familiar with the terrain, according to an earlier eWEEK report. This road information is relayed to software in Google’s data centers. Armed with this intelligence, the automated hybrid cars use video cameras, radar sensors and a laser range finder, along with detailed maps, to “see” other traffic. The cars stop at stop signs and traffic lights completely on their own. A trained safety operator sits in the driver’s seat to take the wheel in case the software malfunctions, while a Google software engineer rides in the passenger seat to monitor the car’s software.
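That description—camera, radar and lidar detections combined against a pre-built map—is essentially sensor fusion. The following is a minimal, hypothetical sketch of the idea; the data structures, map contents and the keep-highest-confidence rule are assumptions for illustration, not how Google’s system actually works (a production stack would use probabilistic multi-sensor tracking).

```python
from dataclasses import dataclass

# Hypothetical prior-map entry: the kind of information a mapping drive
# might record ahead of time (positions are illustrative).
PRIOR_MAP = {
    "stop_signs": [(120.0, 45.0)],
    "lane_markers": ["solid_white_right", "dashed_yellow_center"],
}

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    label: str         # e.g. "pedestrian", "vehicle"
    position: tuple    # (x, y) in metres, vehicle frame
    confidence: float

def fuse_detections(detections: list[Detection]) -> list[Detection]:
    """Keep the highest-confidence detection per (label, rounded position).

    This simply illustrates merging overlapping reports from different
    sensors into a single object list for the planner.
    """
    best: dict = {}
    for det in detections:
        key = (det.label, round(det.position[0]), round(det.position[1]))
        if key not in best or det.confidence > best[key].confidence:
            best[key] = det
    return list(best.values())

if __name__ == "__main__":
    frame = [
        Detection("camera", "pedestrian", (8.2, 1.1), 0.83),
        Detection("lidar", "pedestrian", (8.0, 1.0), 0.95),
        Detection("radar", "vehicle", (30.5, -2.0), 0.70),
    ]
    for obj in fuse_detections(frame):
        print(obj.sensor, obj.label, obj.position, obj.confidence)
```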
Google remains active in robotics research beyond the self-driving vehicle project. In December 2013, the company quietly acquired seven robotics companies as it expanded its pursuit of technology innovations, as it has with self-driving cars, Google Glass and other initiatives.