Why Facebook fears the EU's new privacy rules
(Bloomberg View) -- Facebook Chief Executive Officer Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg have apologized (again and again) for the company's handling of user data.
The best indication that they aren't actually sorry, however, is the social network's intention to change its terms of service to put all non-European users under the jurisdiction of its U.S. headquarters rather than the international headquarters in Dublin, Ireland. That means users in Africa, Asia, Australasia, and Latin America won't be covered by the European Union's General Data Protection Regulation, which goes into effect on May 25. The U.K. may also get a carve-out after Brexit.
Facebook's admission of the planned change comes immediately after the company effectively promised to apply GDPR protections to the entire world.
"Today we're introducing new privacy experiences for everyone on Facebook as part of the EU's General Data Protection Regulation (GDPR), including updates to our terms and data policy," the company wrote in a blog post on Wednesday. "Everyone — no matter where they live — will be asked to review important information about how Facebook uses data and make choices about their privacy on Facebook."
But once non-European users' agreements are no longer with Facebook Ireland, now responsible for all of the company's activities outside North America, they won't be able to hold the company legally responsible for GDPR violations. In effect, they'll be subject to toothless U.S. privacy laws.
Under the GDPR, companies can be fined up to 4 percent of their annual global revenue for not having sufficient customer consent to process data or ignoring the "privacy by design" principle that states customers' privacy rights must be handled as a core feature of the product, not an afterthought.
In Facebook's case, that's about $1.6 billion based on 2017 revenue. It's natural for the company to try to limit its exposure to that kind of punishment, but the move undermines its narrative of contrition and its professed commitment to privacy.
It's worth taking stock of what the GDPR requires. Perhaps most importantly, the regulation demands a granular approach to asking for consent to process personal data. Consent must be requested explicitly, in clear and plain language, for each separate data collection practice. Consent must also be "as easy to withdraw as to give," and use of the service shouldn't be conditional on a customer consenting to the collection of personal data that isn't directly necessary for the service itself, only for its monetization, as in Facebook's case.
On all these points, Facebook currently fails. The "Privacy Settings and Tools" section of a user's profile doesn't ask for consent to any kind of data collection. Nor does the Data Policy contain any links to consent forms for particular types of data harvesting. Some of these forms are hidden in the "Ads" section of the profile, where most people wouldn't look for them, and even there, I'm not asked directly to agree to give up my data.
For example, Facebook informs me that I've been accurately placed in the advertising category "Returned from travels 1 week ago" -- but I have no idea how it knows that, since I haven't posted anything on Facebook from my most recent trips or explicitly agreed anywhere to provide that information to advertisers. I may have clicked to approve some long, incomprehensible legal document at some point to give Facebook access to my location data, but that won't wash in Europe starting May 25. All I can do about it now is delete the ad category, but that won't stop Facebook from continuing to collect the information.
In its most recent post, Facebook uses elliptical language to promise to ask users whether they want to let it "use data from partners" to target advertising. If it took the GDPR seriously, it would use plainer language: "For years, we have been collecting data about your browsing and app use outside Facebook. We use the data to place you in categories advertisers can select when buying our ads. May we continue or would you like us to stop?" That would comply with the clarity requirement and with the GDPR provision that users can object at any time to the use of their data.
Of course, as Facebook knows, only the most carefree user will grant it the right to blanket surveillance of their digital activities. Facebook doesn't want to risk a refusal, which may be one reason the personal data file Facebook allows us to download doesn't actually include web logs -- just the "ad interests" derived from them.
Zuckerberg had to correct the record on that in his congressional testimony after stating several times that the file contained all the information Facebook possessed about a user. That omission isn't strictly compliant with the GDPR, which gives users the right to obtain all the personal data a company holds about them.
The GDPR gives users the right to have their information erased if consent for its collection is withdrawn. As an ordinary user, I have no idea how to do that through Facebook's interface. In late March, Facebook promised to fix this ("It's time to make our privacy tools easier to find," the company wrote, as if there were also a time to make them difficult to find). But no changes have been rolled out yet. Facebook's intention to move most of its users out of EU jurisdiction shows it doesn't consider full compliance with the GDPR desirable, so the changes most of its users see will probably not be 100 percent GDPR-compliant.
That's a big mistake in the wake of the Cambridge Analytica scandal, which raised public awareness of Facebook's problems with personal data handling. Pivotal Research Group analyst Brian Wieser, a long-time Facebook bear, wrote in a note released on Thursday that brand marketers will scrutinize their advertising investment in Facebook more carefully, with a skeptical eye on Facebook's claims of precise targeting, now that it's in doubt whether users provide much of the data knowingly and willingly.
To quote Facebook itself, "it's time" for the company to come clean about the data it has collected for which it doesn't have user consent under the GDPR, and to start systematically informing advertisers and investors about the number of users who have refused to provide such data.
So far, Facebook hasn't even provided accurate information about the number of fake accounts in its user base. The Pivotal Research note, for example, asserts that there were 287.4 million false and duplicate accounts among Facebook's reported 2.1 billion users. In 2017, according to the note, the reported user base grew by 269 million accounts, but 142 million of them -- almost 53 percent -- were fakes and duplicates. Add all the people who will opt out of providing data when (if ever) they are asked clearly about it, and Facebook's ability to sell targeted ads may be severely impaired.
If Facebook actually complies with the GDPR, its business performance in Europe will be an indicator of how the whole company can perform if required to stop misleading users and customers about practices central to its business model. Investors should follow it closely: Privacy rules will inevitably be tightened outside Europe someday, too.