Meta turned a blind eye to kids on its platforms for years, unredacted lawsuit alleges

A recently unredacted version of the multi-state lawsuit against Meta alleges a troubling pattern of deceit and minimization in how the company handles children under the age of 13 on its platforms. Internal documents appear to show that the company’s approach to this nominally forbidden demographic is far more laissez-faire than it has publicly admitted.

The lawsuit, filed last month, alleges a broad spectrum of harmful practices by the company relating to the health and well-being of the young people who use it. From body image to bullying, privacy invasion to engagement maximization, everything said to be bad about social media is laid at Meta’s door – perhaps rightly so, but the breadth of the complaint also gives the impression of a lack of focus.

In one respect at least, however, the documentation obtained by the attorneys general of 42 states is quite specific, “and it is damning,” as California AG Rob Bonta put it. That would be paragraphs 642 through 835, which largely document violations of the Children’s Online Privacy Protection Act, or COPPA. This law places certain restrictions on how young people can be handled online, limiting data collection and requiring things like parental consent for various actions, though many tech companies seem to treat it as more of a suggestion than a requirement.

You know it’s bad news for a company when it asks for pages and pages of redactions:

Image Credits: TechCrunch / 42 AGs

This recently happened to Amazon as well, and it turned out the company was trying to hide the existence of a price-hiking algorithm that skimmed billions from consumers. But it is somehow worse when what is being redacted is a pattern of COPPA violations.

“We are very confident in our COPPA allegations. Meta is knowingly taking steps to harm children, and lying about it,” AG Bonta told TechCrunch in an interview. “In the unredacted complaint we see that Meta knows that these social media platforms are used by millions of kids under 13, and that it unlawfully collects their personal information. It shows a consistent pattern where Meta says one thing in its public-facing comments to Congress and other regulators, while internally it says something else.”

The lawsuit argues that “Meta did not obtain—or even attempt to obtain—verifiable parental consent before collecting children’s personal information on Instagram and Facebook… Instagram and Facebook target and successfully enroll children as users.”

In essence, while identifying children’s accounts created in violation of a platform’s rules is certainly a difficult problem, the suit alleges that Meta chose for years to turn a blind eye rather than institute stricter rules that would necessarily affect its user numbers.

Here are some of the most damning parts of the suit. While some of these allegations relate to practices from years ago, remember that Meta (formerly Facebook) has publicly said for a decade that it does not allow children on its platforms and is working diligently to identify and expel them.

Meta has internally tracked and documented under-13s, or U13s, in its audience breakdowns over the years, as charts reproduced in the filing show. In 2018, for example, it charted 20 percent of 12-year-olds on Instagram using it daily. And this was not in a presentation about how to remove them – it related to market penetration. Another chart shows Meta’s “knowledge that 20-60% of 11- to 13-year-old users in particular birth cohorts actively used Instagram on at least a monthly basis.”

A recently unredacted chart shows Meta closely tracking under-13 users.

It is difficult to square this with the public position that users of this age are not welcome. And it isn’t as if leadership was unaware.

In the same year, 2018, CEO Mark Zuckerberg received a report that there were approximately 4 million people under 13 on Instagram in 2015, which amounted to roughly a third of all 10-12-year-olds in the US, they estimated. Those numbers are obviously dated, but they are still remarkable. Meta has never, to our knowledge, admitted to having such a large number and proportion of under-13 users on its platforms.

Not on the outside, at least. Internally, the numbers seem well documented. For example, as the lawsuit states:

Meta possesses data from 2020 indicating that, of the 3,989 children surveyed, 31% of child respondents aged 6-9 and 44% of child respondents aged 10-12 used Facebook.

It’s hard to extrapolate from the 2015 and 2020 numbers to today (and, as the evidence presented here shows, those figures are almost certainly not the whole story), but Bonta noted that the large numbers are cited for effect, not as a legal threshold.

“The basic principle remains that their social media platforms are used by millions of children under 13. Whether it’s 30 percent, or 20 or 10 percent … any child, it’s illegal,” he said. “Whenever they did this, it was against the law at the time. And we’re not confident that they’ve changed their ways.”

An internal presentation called “2017 Teens Strategic Focus” appears to specifically target children under 13, noting that children are using tablets by the age of 3 or 4, and that “social identity is an unmet need Ages 5-11.” One stated goal, according to the lawsuit, was specifically to “grow (Monthly Active People), (Daily Active People) and time spent among U13 kids.”

It’s important to note here that while Meta does not allow accounts to be operated by people under 13, there are plenty of ways it can legally and safely interact with that demographic. Some kids just want to watch videos from Spongebob Official, and that’s fine. But Meta must secure verifiable parental consent and limit the ways it collects and uses their data.

But the unredacted passages suggest that these under-13 users are not of the legal and safe type. Reports of underage accounts are reportedly automatically ignored, and Meta “continues to collect the child’s personal information if there are no photos associated with the account.” Of 402,000 reports of accounts owned by users under 13 in 2021, fewer than 164,000 were disabled. And these actions reportedly did not carry across platforms, meaning an Instagram account that is disabled is not flagged on associated or linked Facebook or other accounts.

Zuckerberg testified to Congress in March of 2021 that “if we detect someone might be under the age of 13, even if they lied, we kick them off.” (And “they lie about it a TON,” one research director is quoted as saying elsewhere.) But documents from the following month cited in the lawsuit show that “Age verification (for under 13) has a huge backlog and demand continues to outpace supply” due to a “lack of (staffing) capacity.” How big a backlog? At times, the lawsuit says, on the order of millions of accounts.

A potential smoking gun can be found in a series of anecdotes about Meta researchers gingerly avoiding the possibility of inadvertently confirming the under-13 cohort in their work.

One wrote in 2018: “We just want to make sure to be sensitive about a couple of Instagram-specific items. For example, will the survey go to under 13s? Since everyone has to be at least 13 before they create an account, we want to be careful about sharing findings that come back and indicate under 13s are being bullied on the platform.”

In 2021, another researcher, studying “child-adult sexually related content/behavior/interaction” (!), said they had “excluded younger children (10-12 yos) in this research” even though there are “definitely kids this age on IG,” because of worries “about the risks of disclosure since they aren’t supposed to be on IG at all.”

Also in 2021, Meta instructed a third-party research company conducting a survey of preteens to remove any information indicating that a survey subject was on Instagram, so that the company could not be said to have knowledge of under-13 users.

Later that year, outside researchers provided Meta with information that “of children aged 9-12, 45% use Facebook and 40% use Instagram every day.”

During a 2021 internal study of youth on social media, researchers first asked parents if their children were on Meta platforms and removed them from the study if so. But one researcher asked, “What happens to kids who get through the screener and then say they are on IG during the interviews?” Instagram’s head of public policy, Karina Newton, responded, “We’re not collecting usernames, right?” In other words: nothing happens.

As the lawsuit says:

Although Meta learns of specific children on Instagram through interviews with those children, Meta takes the position that it still lacks actual knowledge that it is collecting personal information from under-13 users because it does not collect usernames while conducting these interviews. In this way, Meta goes to great lengths to avoid meaningful compliance with COPPA, looking for loopholes to excuse its knowledge of users under the age of 13 and to maintain their presence on the platform.

Other complaints in the high-profile case rest on softer ground, such as the argument that use of the platforms contributes to poor body image and that Meta has failed to take appropriate measures. That may prove difficult to make actionable. But the COPPA claims are far more cut and dried.

“We have evidence that parents sent them notes about their kids being on their platform, and they took no action. I mean, what else do you need? It shouldn’t even have to come to that,” said Bonta.

“These social media platforms can do anything they want,” he continued. “They can operate with a different algorithm, they can have plastic surgery filters or not have them, they can give you alerts in the middle of the night or during school, or not. They choose to do things that maximize the frequency of use of their platforms by children, and the duration of that use. They could end all this now if they wanted to; they could easily keep those under 13 from accessing their platform. But they don’t.”

You can read the mostly unredacted complaint here.

TechCrunch has reached out to Meta for comment on the lawsuit and some of the specific allegations, and will update this post when we hear back.
