Privacy Forum Shows Regulators Challenges Posed by Technology
He said that a number of products, including the Echo and some smart televisions, are always listening, which concerns some people. Others, such as Hello Barbie, aren't, but may capture intimate conversations anyway. Some items have been in the news recently because of their apparent ability to capture anything said in their presence, Polonetsky said. But he noted that in many cases the threat is actually smaller than the level of fear. In the case of Hello Barbie, for example, users have to press a button before the toy can listen, all conversations are available for parents to inspect online, and anything can be deleted.

But some devices that gather such data may not be so obvious. "We designed a notice for companies doing location analytics," Polonetsky explained. He said that this way, people in an area where basic cell phone information was being gathered remotely would know that it was in use.

Of course, all of the gadgets on display were there primarily to let the regulators and managers know what they were. In the background is a more serious purpose: to demonstrate just how much data can be collected under fairly normal circumstances and what these devices do with that data, whether sending it back to the cloud for analysis or keeping it locally. Managing all of that data, and keeping tabs on how it flows and is used, is a primary responsibility of organizations such as the FTC and the EU's Data Protection office.

While currently very little of the data is being used for anything beyond conversations with children or commands to play music, some of it is, and Polonetsky believes that in some special cases it should be put to greater use. For example, Polonetsky is critical of Hello Barbie, but not for the information that it stores in the cloud. He said that the toy should be able to respond to statements like "Help, I'm being kidnapped," or to other similarly dire emergencies.
But he's also concerned about what's done with the data that's shared in unguarded moments, or with data that's stored somewhere in the cloud without sufficient protection. The idea, he said, is to give those managers and regulators a chance to learn just how complex a task they face in regulating data privacy. In other words, the goal of the FPF is to help them have some idea of what they're doing before they do it. That's always a worthwhile goal.
In regard to the computer that was trying to decide the characteristics of the attendees, he said that the point was to demonstrate that there's a difference between facial recognition and facial detection. The computer at the meeting was doing facial detection: noting that a face was present and estimating characteristics, without identifying anyone. Facial recognition, which matches a face to a specific individual, is what the folks on those television crime shows would like you to think is possible.