Data Protection: Instagram for Under-13s: Is This Facebook's Child Privacy Fight?

Child privacy and Instagram for under-13s | AWS data processing agreements | Facebook "GDPR bypass" | BIPA under threat | Germany v. Ireland DPA spat

Here are this week’s musings on data protection, privacy, and big tech shenanigans.

Instagram for Under-13s: Facebook’s “Fight” With Child Privacy Law?

Facebook is planning a social media network for young kids. Will it succeed? Probably.

Mark Zuckerberg once said that the Children's Online Privacy Protection Act (COPPA) was a "fight" Facebook would "take on at some point."

This week, we learned Facebook is planning an Instagram for under-13s. If this is the fight, Facebook will probably win.

Why? Why would Facebook do this?

Instagram is currently unavailable for under-13s because they have special legal protections, meaning that it’s harder for businesses to collect their data and to target them with ads.

In the U.S., the main children’s privacy law is COPPA, a federal law passed in 1998—before Facebook, YouTube, or Instagram existed.

Since 2013, COPPA has required websites to get parental consent before tracking under-13s with cookies.

Does COPPA have teeth?

Some COPPA settlements seem big at first—like the Federal Trade Commission’s $170 million settlement with Google in 2019. But bear in mind that Google's turnover was $160 billion that year.

Google allegedly tracked users under 13 on YouTube. After the settlement, YouTube basically shifted COPPA liability to content creators (among other measures).

TikTok (previously Musical.ly) also settled with the FTC for $5.7 million under COPPA in 2019.

Whether carelessly or willfully, apps and sites are repeatedly allowing kids to sign up without parental consent.

What about outside the U.S.?

Outside the U.S., things could be even more complicated for Facebook’s new venture.

I covered two ongoing U.K. child privacy cases last year alleging platforms had violated the GDPR's child privacy rules.

The first was against YouTube, aiming for an ambitious $2.5 billion in damages. This had to do with how YouTube processes kids’ data to make recommendations. The second, which is at a very early stage, was against TikTok, led by a 12-year-old girl.

Recently, we’ve seen some EU DPAs coming down hard on social media apps under the GDPR’s child privacy rules. See Italy’s recent action against TikTok, for example.

How will Facebook avoid problems like this?

Setting up a separate platform for youngsters might be a way for Facebook to avoid claims under child privacy law.

For example, Tiny Instagram could require parental consent at sign-up. Or it might avoid collecting certain types of data. Or it could—imagine this—avoid targeting ads based on users’ personal information.

Or, Facebook could just take the risk, pay any resulting fines, and find a way of complying with enforcement notices while still turning a profit.

Cases like those explored above could be a headache for Facebook. But I think it'll be fine—it always is.

Doctolib Ruling: Does Schrems II Now Apply to Inter-EU Transfers?

French court ruling says vaccine-booking platform’s contract with Amazon is lawful. But this case isn’t as clear-cut as it seems.

I’m drawing my analysis here from the IAPP’s summary of the case.

Here’s the background:

The French Conseil d’Etat looked at a data processing agreement between Doctolib, whose platform is used for booking vaccinations, and AWS Sarl, a Luxembourg-based subsidiary of Amazon Web Services.

Doctolib used AWS to process health data. The claimants asked the court to suspend transfers of personal data between Doctolib and AWS.

Who cares?

The case was significant, in part, because so many EU data controllers use AWS—or another EU-based subsidiary of a U.S. company—as a data processor. The case could have invalidated a lot of data processing agreements and caused a lot of companies serious issues.

Why would the agreement with AWS have been a problem?

It all comes back to Schrems II. Because AWS Sarl is a subsidiary of Amazon, the claimants argued that the personal data in its care was at risk of interception under U.S. laws.

Even though AWS Sarl is storing the data in the EU?

Yes—U.S. surveillance laws like FISA 702 and EO 12333 affect certain U.S. companies even when operating overseas. This means that AWS Sarl might be obliged to submit personal data to U.S. intelligence services.

So Schrems II applies despite there being no third-country transfer of personal data?

In a sense, yes. Controllers always need to take appropriate safeguards when disclosing personal data to a processor—and the risk of access by intelligence services is a relevant consideration.

The Conseil d’Etat examined the data processing agreement between Doctolib and AWS Sarl to determine whether there were sufficient safeguards to protect the personal data—just as the CJEU examined the safeguards provided by Privacy Shield in Schrems II.

What did the court decide?

The Conseil d’Etat said that the data processing agreement was valid and that it would not suspend transfers between Doctolib and AWS Sarl.

Phew! So all processing agreements with AWS Sarl are safe?

No. This case has been reported in these terms, but in my view, this isn’t the right takeaway.

The court found that the transfers to AWS Sarl were valid because of the safeguards Doctolib and AWS Sarl had put in place. These included:

  • AWS was contractually bound to challenge access requests by foreign authorities (NB: I’m not sure such challenges would make much difference).

  • The data was encrypted and the key was held by a “trusted third-party” in France.

  • There was a relatively short retention period of three months.

There are a couple of points here that might be open to challenge. The court also found that the data about vaccinations was not “health data” under the GDPR. I was surprised by this part.

Controllers should consider whether their processors and subprocessors are subject to interception under surveillance law—regardless of where their servers are actually based.

The point of the international transfer provisions is to safeguard personal data, and it’s worth thinking about any transfer or processing agreements—third-country or otherwise—in these fundamental terms.

(This, of course, is not legal advice.)

Could the US Lose Its Best Privacy Law?

Illinois lawmakers are trying to undermine the Biometric Information Privacy Act (BIPA). This is one of the few U.S. privacy laws providing Americans with real privacy protection.

What is America’s best privacy law?

Maybe it’s not objectively the best, but my favourite U.S. privacy law is Illinois’ Biometric Information Privacy Act (BIPA), which passed way back in 2008. BIPA is one of the most powerful—albeit limited—privacy laws in the U.S.

BIPA requires businesses to provide notice and obtain consent before collecting biometric information from consumers, including facial recognition data, fingerprints, and voiceprints.

Sounds reasonable?

Not according to a series of Illinois bills that have attempted to weaken the law, apparently under the guise of helping “small businesses”.

Various bills proposed by Illinois legislators have attempted to remove BIPA’s private right of action, narrow its scope, change its definition of “biometric information”, or chip away at its consent requirements.

Why do you like BIPA so much, anyway?

BIPA has resulted in some high-profile cases and settlements, not least the $650 million Facebook class-action settlement from earlier this month.

In February I spoke to the American Civil Liberties Union (ACLU) of Illinois, which is suing facial recognition company Clearview AI under BIPA.

Clearview’s business model involves hoovering up social media photos (including, most likely, yours) without notice or consent, extracting unique biometric data about the subjects’ faces, sorting them into a searchable database, and selling access to that database to police and, until recently, private companies.

It's just one state law. If it gets repealed, what's the big deal?

The stitching-together of America’s patchwork of privacy laws has been one of the big stories of 2021 so far. But the country still lacks meaningful, rights-based privacy protection for much of its population.

Other than BIPA, there is no effective U.S. law prohibiting companies like Facebook and Clearview from gathering biometric information without consent (although that may change soon).

This could change when Virginia’s Consumer Data Protection Act (CDPA) comes into force. However, this law has no private right of action, so there will be less incentive for businesses to comply with it.

Repealing or amending BIPA would be a huge step backward for U.S. privacy law, at a time when it seems to be moving forward faster than ever.

Facebook “GDPR Consent Bypass” Hits Austrian Supreme Court

Facebook is "bypassing GDPR consent”, according to a case brought to the Austrian Superior Court by GDPR final boss Max "Schrems II" Schrems.

What does “bypassing the GDPR” mean?

Here’s the background of this case:

  • Under the Data Protection Directive, Facebook relied on the legal basis of "consent" for cookies.

  • The GDPR passed in 2016, with a higher consent standard. Consent now had to be obtained via an “unambiguous,” “clear, affirmative action.”

  • Facebook's consent request was no longer valid. What would the company do? Ask for consent in a valid way? Stop undertaking activities that require consent (such as using tracking cookies)?

  • No—on the day the GDPR came into force, in May 2018, Facebook copied its consent request into its terms. The social media platform now said it was relying on the legal basis of "contract”, not consent.

What are the requirements for relying on the legal basis of “contract”?

The lawful basis of “contract” is for when you need to process personal data to perform your obligations under a contract with the data subject.

If you order a product from Amazon, Amazon needs your address to send it to you—and Amazon can rely on “contract” to collect and use your address for this purpose.

Facebook said it "needed" cookies to enable its business model to operate. After all, the social media giant can’t fulfill its obligations under the Facebook Terms of Service if it goes out of business—right?

Facebook also needed cookies to provide personalised ads (as “promised” in Facebook’s terms), and to enable the user to use Facebook for free.

So… Is that a valid reliance on “contract”?

Not according to the European Data Protection Board (EDPB).

The EDPB says activities that are "necessary for the performance of a contract" do not include "activities (that) are not necessary for the individual services requested by the data subject, but rather necessary for the controller’s wider business model."

I covered this case after it was heard by the Viennese Superior Court. The court’s decision seemed odd to many people—like me—who spend a lot of time submerged in data protection.

There were other factors at play here—local contract law, for one. But it seems likely that the Austrian Supreme Court will refer the case to the Court of Justice of the European Union (CJEU), which may well uphold Schrems’ arguments.

So what if the CJEU says Facebook must get “proper” consent? What will Facebook do?

Here’s one possibility.

The ePrivacy Regulation will come into force soon. Under the current version, controllers will be allowed to make access to services contingent on consent to cookies—as long as they offer an alternative service that doesn’t involve cookies, for which they can charge a fee.

In my view, this provision creates two tiers of consent in the EU (which I think is problematic).

Facebook is unlikely to be happy about the idea of offering users a genuinely free choice over its use of cookies.

But what if Facebook could offer a paid alternative, safe in the knowledge that most people would continue to use the free version?

A paid version of Facebook? Unlikely. But it’s a possibility.

Irish DPA Criticised (Again) Over GDPR Enforcement

The Irish DPA has been criticised by the German federal data regulator. Fair enough?

Here’s the background:

The Irish Times has reported on a letter from Germany’s federal data protection regulator (BfDI), Ulrich Kelber, criticising the Irish Data Protection Commission (DPC).

This letter reiterated what many observers have been saying about the Irish DPC for some time. As home to most big tech companies’ EU headquarters, Ireland has earned a reputation as a haven from GDPR enforcement.

Is the Irish DPC’s reputation fair?

Think of it this way. As lead supervisory authority to Facebook and Google, the DPC’s job is to ensure these companies comply with data protection law.

Under the one-stop-shop procedure, DPAs have to forward complaints about these firms to the DPC except in certain specific circumstances.

Despite this, Ireland has never concluded an investigation into Google, Facebook, or any of their subsidiaries. But other DPAs have—even within the very narrow set of circumstances under which they have been permitted to do so.

Here's a list of every EU DPA that has fined each firm since 2018.

Google:

  • France (under both the GDPR and the ePrivacy Directive)
  • Belgium
  • Sweden
  • Hungary
  (Not Ireland)

Facebook:

  • Hamburg
  • UK (Data Protection Directive)
  • France (ePrivacy Directive)
  (Not Ireland)

So what enforcement action has the Irish DPA taken against big tech firms?

Just one penalty, against Twitter for €450,000, after it failed to properly notify the DPC of a data breach.

This isn't a large fine—around 0.1% of Twitter's turnover. But the Irish DPA originally proposed an even smaller penalty, of between €135,000 and €275,000.

This small penalty was seen as too lenient by other EU DPAs. They disputed it under the first-ever use of the GDPR’s Article 65 procedure. Several DPAs recommended multi-million euro fines.

On the other hand, Ireland is reportedly due to impose a €50 million fine on WhatsApp later this year (but this hasn’t been officially confirmed yet).

This is a complex issue, and the GDPR isn't all about fines. There’s also some question as to whether the German regulator was right to criticise the DPC in this way. But the letter does highlight a bottleneck in GDPR enforcement.

Recommended Reading

Your Face Is Not Your Own—Kashmir Hill—New York Times

This long-read on surveillance firm Clearview AI is well worth your time. Hill discusses the company’s history, personnel, and legal defenses.

Google and the Age of Privacy Theater—Gilad Edelman—Wired

A good overview of Google’s FLoC proposals with some interesting technical questions.

Facebook’s SEER AI Carries Privacy Risks—Me—Digital Privacy News

I’m not too proud to plug one of my own articles this week, about Facebook’s SEER AI and the implications for privacy. I spoke to some great sources who shared some excellent insights.