Facebook Gets Its First Real Privacy Penalty - From Apple


Apple's Privacy Warning to Facebook: We Can Break You

Jeremy Kirk • February 1, 2019

Apple's New York store on Fifth Avenue (Photo: Celsim Junior via Flickr/CC)

Facebook is like a bank now: It has transitioned into the category of too big to fail.


Sure, the company's lawyers will be kept busy for years, and regulators continue to look into how the company should be held accountable for the Cambridge Analytica scandal. Fines will eventually be issued, and Facebook will pay them (see: Report: Federal Trade Commission Weighs Facebook Fine).

Don't hold out for a user revolution that catalyzes meaningful reform. As others have pointed out, Facebook holds your family and friends as digital hostages. Deleting an account is a protest that sounds like this: "Click." Mark Zuckerberg isn't listening.

But the latest transgression uncovered by TechCrunch raises a couple of very interesting issues. Apple actually handed Facebook its first-ever meaningful punishment, one that was shockingly effective at raising attention. And Facebook, despite the Cambridge Analytica scandal, clearly isn't undertaking serious reform.

Got Root?

Since 2016, Facebook's market research partners have distributed an app called Facebook Research. The app flew under the radar because it went only to a closed audience, one that ranged from users as young as 13 up to 35-year-olds.

The app was powerful because it installed a root certificate, which allowed Facebook to inspect internet traffic that would normally be protected by SSL/TLS. It also monitored how someone used their phone, including apps, email, messaging and more. That kind of data is crucial for Facebook to keep tabs on the competitive landscape.
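
To make the mechanics concrete, here is a minimal Python sketch - not Facebook's code; the hostname and the proxy CA file are hypothetical - of why a user-installed root certificate matters: a device that trusts an interception proxy's root CA will accept the certificates that proxy forges for any site, letting the proxy read traffic TLS would otherwise protect.

```python
import socket
import ssl

# Hypothetical destination; the point is the trust configuration, not the host.
HOST = "example.com"
PORT = 443

# Context 1: trusts only the platform's built-in certificate authorities.
# A man-in-the-middle proxy's forged certificate fails verification here.
default_ctx = ssl.create_default_context()

# Context 2: additionally trusts a hypothetical interception proxy's root CA.
# Once that CA is trusted, the proxy can mint a certificate for any hostname
# and decrypt the traffic it relays - the capability a root certificate grants.
intercept_ctx = ssl.create_default_context()
intercept_ctx.load_verify_locations(cafile="proxy-root-ca.pem")  # hypothetical file

def handshake(ctx: ssl.SSLContext) -> None:
    """Attempt a TLS handshake and print the subject of the peer's certificate."""
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            print(tls.getpeercert()["subject"])

# Behind an interception proxy, handshake(default_ctx) raises
# ssl.SSLCertVerificationError, while handshake(intercept_ctx) succeeds.
handshake(default_ctx)
```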

Because the app wasn't offered in the App Store, Facebook signed it with an enterprise digital certificate. That kind of certificate allows apps to be "sideloaded" - installed without going through Apple's normal approval process that ferrets out suspicious apps. Apple says Facebook violated its rules because those certificates are only supposed to be used for internal, employee-only apps.
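
For the curious, here is a rough Python sketch of how an enterprise-signed iOS app can be spotted. It assumes you have unzipped an .ipa and can read its embedded.mobileprovision file (a CMS-signed plist); the bundle path below is hypothetical.

```python
import plistlib
from pathlib import Path

def read_provisioning_profile(path: str) -> dict:
    """Pull the XML plist out of a CMS-wrapped .mobileprovision blob."""
    raw = Path(path).read_bytes()
    start = raw.index(b"<?xml")
    end = raw.index(b"</plist>") + len(b"</plist>")
    return plistlib.loads(raw[start:end])

# Hypothetical path inside an unzipped .ipa bundle.
profile = read_provisioning_profile("Payload/Research.app/embedded.mobileprovision")

# ProvisionsAllDevices is the tell-tale of an enterprise (in-house) profile:
# the app can install on any device without going through App Store review.
print("Team:", profile.get("TeamName"))
print("Enterprise-distributed:", profile.get("ProvisionsAllDevices", False))
print("Expires:", profile.get("ExpirationDate"))
```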

Facebook Research was technically the same app as one called Onavo Protect. Last August, Facebook pulled Onavo Protect from the App Store after Apple began enforcing rules that apps shouldn't collect data beyond what they need to function.

Here's the first transgression: Facebook continued to distribute Facebook Research anyway. Keep in mind this shenanigan came post-Cambridge Analytica, when the company should have been closely scrutinizing areas where it may have stretched the bounds.

There's also the issue of consent. Facebook says it obtained consent from parents before their kids installed Facebook Research. TechCrunch described the wording in some of the agreements. But the reward probably discouraged any close inspection: The payoff for installing the app was up to $20 per month, plus additional fees for successful referrals. Here's money!

Whether adults actually understand what tech companies are doing with their data is at the heart of new laws such as the EU's General Data Protection Regulation. Facebook may have obtained parental consent, but do the parents understand what's going on here? How many parents understand a root certificate?

Breaking Bad

When Apple revoked Facebook's enterprise certificate, the move also broke all of Facebook's other internal employee apps. The side effect was likely unintentional, because just one certificate signed all of the apps. But it still had a far greater impact than any regulator could match for a privacy-related issue. The two companies, however, have been working to restore Facebook's ability to use its internal apps.
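
To see why one revocation cascades, here's a toy Python sketch - hypothetical app names, not Apple's actual trust infrastructure: every app records which certificate signed it, so invalidating that one certificate invalidates them all.

```python
# Toy model: each app is trusted only while its signing certificate is trusted.
SIGNED_BY = {
    "Facebook Research": "FB-ENTERPRISE-CERT",
    "Internal Lunch Menu": "FB-ENTERPRISE-CERT",     # hypothetical internal app
    "Campus Shuttle Tracker": "FB-ENTERPRISE-CERT",  # hypothetical internal app
}

revoked_certs: set[str] = set()

def app_still_runs(app: str) -> bool:
    """An app keeps running only while its signing certificate remains trusted."""
    return SIGNED_BY[app] not in revoked_certs

# Apple pulls the single enterprise certificate...
revoked_certs.add("FB-ENTERPRISE-CERT")

# ...and every app signed with it stops working, not just the offender.
for app in SIGNED_BY:
    print(f"{app}: {'OK' if app_still_runs(app) else 'broken'}")
```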

Could Apple take this further and use its power in the mobile OS market to bring Facebook in line with evolving privacy wisdom? It's an idea floated in a column by Kevin Roose in The New York Times on Thursday. Apple could boot all of Facebook's apps - Instagram and WhatsApp included - with a few digital certificate revocations, a power no regulator has.

There are all kinds of obvious problems with this, of course. A multibillion-dollar tech company shouldn't be taking up the slack for governments that are failing to protect consumers' privacy rights. Apple can take a strong privacy stance because it has little stake in the personal data trade. That could change, of course, depending on how Apple's business interests shift.

But the peer pressure from Apple over privacy is undoubtedly positive. Apple's message to Facebook came through loud and clear: We can break you.