Meta has received more than 1.1 million reports of users under the age of 13 on its Instagram platform since early 2019, but it has “deactivated only a fraction” of those accounts, according to a recently unsealed legal complaint against the company filed by attorneys general from 33 states.
Instead, the social media giant “regularly continued to collect” children’s personal information, like their location and email address, without parental permission, in violation of a federal children’s privacy law, according to court records. Meta could face hundreds of millions of dollars or more in civil penalties if the states prove the allegations.
“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed,” the complaint states, “and zealously protected from disclosure to the public.”
The privacy charges are part of a larger federal lawsuit filed last month by California, Colorado and 31 other states in the U.S. District Court for the Northern District of California. The lawsuit accuses Meta of unfairly trapping young people on its Instagram and Facebook platforms while concealing internal studies demonstrating harm to users. And it seeks to force Meta to stop using certain features that states say have harmed young users.
But much of the evidence cited by states was obscured by redactions in the original filing.
The unsealed complaint, filed Wednesday evening, provides new details about the states’ lawsuit. Using excerpts from internal emails, employee discussions and company presentations, the complaint says Instagram “coveted and pursued” underage users for years, even as the company “failed” to comply with the children’s privacy law.
The unsealed filing says Meta “continually failed” to make effective age-checking systems a priority and instead used approaches that enabled users under 13 to lie about their age to set up Instagram accounts. It also accused Meta executives of publicly stating in congressional testimony that the company’s age-checking process was effective and that the company removed underage accounts when it became aware of them, even as the executives knew there were millions of underage users on Instagram.
“Tweens want access to Instagram, and they lie about their age to get it now,” Adam Mosseri, the head of Instagram, said in an internal company chat in November 2021, according to the court filing.

In Senate testimony the following month, Mr. Mosseri said: “If a child is under the age of 13, they are not permitted on Instagram.”
In a statement released Saturday, Meta said it had spent a decade working to make online experiences safe and age-appropriate for teenagers and that the states’ complaint “mischaracterizes our work using selective quotes and cherry-picked documents.”
The statement also noted that Instagram’s terms of service prohibit users under the age of 13 in the United States. And it said the company had “measures in place to remove these accounts when we identify them.”
The company added that verifying people’s ages was a “complex” challenge for online services, especially for younger users who may not have school IDs or driver’s licenses. Meta said it would like to see federal legislation requiring “app stores to get parents’ approval whenever their teens under 16 download apps,” rather than having young people or their parents supply personal information like birth dates to many different apps.
The privacy charges in the case are based on a 1998 federal law, the Children’s Online Privacy Protection Act, which requires online services with content aimed at children to obtain verifiable permission from a parent before collecting personal details, such as names, email addresses or selfies, from users under 13. Fines for violating the law can run to more than $50,000 per violation.
The lawsuit argues that Meta chose not to build systems to effectively detect and exclude underage users because it viewed children as a crucial demographic, the next generation of users, that the company needed to capture to ensure continued growth.
Meta had numerous indicators of underage users, according to Wednesday’s filing. An internal company chart displayed in the unsealed document, for example, showed how Meta tracked the percentage of 11- and 12-year-olds who used Instagram daily, the complaint says.
Meta also knew of accounts belonging to specific underage Instagram users through the company’s reporting channels. But it “automatically” ignored certain reports of users under 13 and allowed them to continue using their accounts, the complaint says, as long as the accounts did not contain a user biography or photos.
In one case in 2019, Meta employees discussed in emails why the company had not deleted four accounts belonging to a 12-year-old, despite requests and “complaints from the girl’s mother stating that her daughter was 12,” according to the complaint. The employees concluded that the accounts were “ignored,” in part because Meta representatives “couldn’t tell for sure the user was underage,” the legal filing says.
This is not the first time the social media giant has faced allegations of privacy violations. In 2019, the company agreed to pay a record $5 billion and alter its data practices to settle Federal Trade Commission charges that it had deceived users about their ability to control their privacy.
It may be easier for the states to pursue Meta over children’s privacy violations than to prove that the company fostered compulsive social media use, a relatively new phenomenon, among young people. Since 2019, the FTC has successfully brought similar children’s privacy complaints against tech giants, including Google and its YouTube platform, Amazon, Microsoft and Epic Games, the creator of Fortnite.