WASHINGTON—At first the meeting at the Future of Privacy Forum headquarters downtown looked like any other office soiree here, one populated by Millennials and hipsters trying to impress each other with their real or perceived level of official access.
But a second look changed that impression. Right behind the check-in desk was a massive flat screen television showing an image of each person as they arrived.
Superimposed on each person’s image was a computer’s guess as to our demographics. Were we young or old? Male or female? I was identified as a woman under 30. A friend who works for USA Today was also identified as a woman, but to his annoyance, the computer decided he was over 40.
Of course, the gender-mangling computer was really there for another purpose: to show how far machines had come in determining the sort of person in their viewfinder. The point is that marketers could gauge the makeup of a crowd at an event to see whether they were attracting the audience they hoped for.
Across the room from the ubiquitous beer-and-wine bar was a table with several aerial drones. Next to that, a man was wearing an Oculus VR headset. There was also an Amazon Echo in the room, which I commanded to play some Mozart. There were radio frequency beacons and perhaps the creepiest interactive device ever seen: a Hello Barbie toy.
Clustered around each display were groups of people from technology companies and government agencies. There was someone from the Federal Communications Commission, which is currently wrestling with privacy rules, as well as a representative of the Department of Homeland Security. Talking in a corner were former Federal Trade Commission Commissioner Julie Brill and European Data Protection Supervisor Giovanni Buttarelli. Clearly this was more than just a bunch of hipsters.
Jules Polonetsky, executive director and co-chair of the Future of Privacy Forum, explained that his organization had invited these people, who are responsible for managing or regulating technology with a privacy impact, to spend some time with the technology they’re making rules for. “It’s surprising how few have had a chance to interact with these tech items,” he said.
The FPF is a think tank that “seeks to advance responsible data practices,” according to its mission statement. The organization does this through educational outreach, academic research and hands-on evaluations of tech products that affect personal privacy.
Polonetsky said the group does this by obtaining versions of the products it’s evaluating and trying them out, to learn not just how they work but also what information they gather and what they do with that information once it’s collected.
He said that a number of products, including the Echo and some smart televisions, are always listening, which concerns some people. Others, such as Hello Barbie, aren’t, but they may capture intimate conversations anyway.
Some items have been in the news recently because of their apparent ability to capture anything that’s said in their presence, Polonetsky said. But he noted that in many cases, the threat is actually less than the level of fear.
For example, he noted that in the case of Hello Barbie, users have to press a button before the toy can listen, that all conversations are available for parents to inspect online, and that anything can be deleted.
But some devices that gather such data may not be so obvious. “We designed a notice for companies doing location analytics,” Polonetsky explained. That way, he said, people in an area where basic cell phone information was being gathered remotely would know the technology was in use.
In regard to the computer that was trying to guess the characteristics of the attendees, he said the point was to demonstrate that there’s a difference between facial recognition and facial detection. The computer at the meeting was doing facial detection, which merely finds a face and estimates attributes such as age and gender. Facial recognition, which identifies a specific individual, is what the folks on those television crime shows would like you to think is possible.
Of course, all of the gadgets on display were there primarily to let the regulators and managers get acquainted with them. But behind that is a more serious purpose: to demonstrate just how much data can be collected under fairly normal circumstances and what these devices do with that data, whether sending it back to the cloud for analysis or keeping it locally.
Managing all of that data, and keeping tabs on how it flows and how it is used, is a primary responsibility of organizations such as the FTC and the EU’s data protection office. While very little of the data is currently used for anything beyond conversations with children or commands to play music, some of it is, and Polonetsky believes that in certain special cases it should be put to greater use.
Polonetsky is critical of Hello Barbie, for example, but not over the information it stores in the cloud. He said the toy should be able to respond to statements like, “Help, I’m being kidnapped,” or to other similarly dire emergencies. But he’s also concerned about what’s done with data shared in unguarded moments, or data stored somewhere in the cloud without sufficient protection.
The idea, he said, is to give those managers and regulators a chance to learn just how complex a task they face in regulating data privacy issues. In other words, the goal of the FPF is to help them have some idea of what they’re doing before they do it. That’s always a worthwhile goal.