In the 11 or so days since the story broke that data on 50 million Facebook users was exposed through a Cambridge Analytica researcher’s app, executives up and down the Facebook food chain took to the media to explain what happened, though perhaps later than some would have liked: CEO Mark Zuckerberg and COO Sheryl Sandberg on a five-day-late apology tour, VP of global marketing Carolyn Everson with a talking-points memo.
One voice, however, has been suspiciously quiet: that of chief privacy officer Erin Egan. No longer.
In a blog post today, Egan and deputy general counsel Ashlie Beringer announced several changes to data settings that the company believes will help users better understand what data is being collected and give them more control over the data they put into the Facebook engine.
“Last week showed how much more work we need to do to enforce our policies and help people understand how Facebook works and the choices they have over their data,” Egan and Beringer wrote. “We’ve heard loud and clear that privacy settings and other important tools are too hard to find and that we must do more to keep people informed.”
In a nod to how it views itself as a mobile destination, the company completely redesigned its settings on mobile, stating in the post, “Instead of having settings spread across nearly 20 different screens, they’re now accessible from a single place. We’ve also cleaned up outdated settings so it’s clear what information can and can’t be shared with apps.”
A lot of this thinking, according to sources, was informed by workshops on designing for privacy that the company has held with regulators and privacy experts over the past few months. One fact the company is homing in on: legal notices, those dreadful terms and conditions nobody reads but readily accepts, are an eyesore written in language that’s difficult to parse. By changing how the settings look, the company believes, it can help people learn more about what data is being shared.
Facebook also introduced a set of tools it calls “Privacy Shortcuts,” essentially a “menu where you can control your data in just a few taps with clearer explanations of how our controls work. The experience is now clearer, more visual, and easy-to-find.” Again, this is a more design-centric tool. Users can now add multiple layers of protection, such as two-factor authentication, control the ads they see, and manage who sees their profile and what they post. All of this is accessed via a single dashboard, a tool the company calls “Access Your Information.”
At the IAPP conference in Washington, D.C., where chief privacy officers from all kinds of companies gathered to talk about, well, privacy, Facebook’s deputy CPO, Rob Sherman, said the company has been investing in improved privacy experiences for a few months. “The things that we’re announcing today,” he said, “are a part of that, but I think it’s part of a broader reflection of the feedback we’re getting. … People are more sophisticated and individualized with what happens with their data, and this is something we want to respond to.”
Of course, all of this doesn’t quiet the voices asking why users’ data was shared in the first place. Facebook’s argument seemed to have been this: The point of Facebook is to personalize your experience, and data control and privacy should be sound and logical. It’s not like we autopopulate your profile with personal interests. That’s on you, buddy. And a lot of people use Facebook to log in to other apps. So, it’s not really our problem if you can’t keep track of what data you’re sending out into the world.
There’s a Silicon Valley logic to all of this. Blame the user, not the platform.
The tools introduced today, the company believes, mark a shift in that thinking. Instead of a user saying, “I didn’t want my data to be used that way,” Facebook is saying, “We want to check with you to make sure you’re good with that, and, by the way, you can easily change it if you want.”
On the one hand, technology moves faster than policy or the law. On the other, people have less time to adapt to new technology than they did with previous disruptors like TV and radio. The industry tends to focus on developing new products and features, not on how to innovate within privacy design. Companies like Facebook have to offer privacy controls while also educating people as they keep pace with newer technologies.
Egan and Beringer wrote in their post: “It’s also our responsibility to tell you how we collect and use your data in language that’s detailed, but also easy to understand. In the coming weeks, we’ll be proposing updates to Facebook’s terms of service that include our commitments to people. We’ll also update our data policy to better spell out what data we collect and how we use it. These updates are about transparency—not about gaining new rights to collect, use, or share data.”
The talk has begun, and now the answer has to come in action. It’s time.