Meta Platforms, the parent company of Facebook and Instagram, has received more than 1.1 million reports of users under the age of 13 since early 2019—a fact it largely failed to act on and “zealously” sought to keep from the public, the attorneys general of 33 states alleged in a newly unredacted complaint.

Further, Meta “routinely continued to collect children’s data”—including their emails and locations—“without parental consent” in violation of the law, according to the complaint.

Dozens of U.S. states filed suit against Meta last month, accusing the social media giant of getting America’s youth addicted to social media and lying to the public about the potential dangers of its platform in violation of state and federal law.

“Meta has harnessed powerful and unprecedented technologies to entice, engage, and ultimately ensnare youth and teens. Its motive is profit, and in seeking to maximize its financial gains, Meta has repeatedly misled the public about the substantial dangers of its Social Media Platforms,” prosecutors alleged.

An updated filing this week unredacted large portions of the complaint against the company, detailing evidence collected by the states.

Internal documents and presentations obtained by prosecutors showed the extent to which the company studied the hold it has over young users.

“Teens are insatiable when it comes to ‘feel good’ dopamine effects,” and “IG has a pretty good hold on the serendipitous aspect of discovery through our Explore surface, recommendations and social graph. And every time one of our teen users finds something unexpected their brains deliver them a dopamine hit,” one internal document read.

Former Facebook Vice President of Analytics Alex Schultz in 2020 said in an internal email thread that “fundamentally I believe that we have abused the notifications channel as a company,” according to the filing.

Meta quantified the profit it can garner per 13-year-old user—internally dubbed the user’s “Lifetime Value”—in 2018: roughly $270 per child.

“[t]his number is core to making decisions about your business,” an internal Meta email from September 2018 read.

Obtaining the location data of teenagers was important for Meta and its advertising operations, with one goal to “get teens to share their location with us so we can leverage that data for awesome product experiences and also analytics around high schools,” an internal 2017 email said.

Around 70 percent of teenage girls in the United States “may see ‘too much’ sensitive content,” internal researchers at Meta concluded, per the filing. Meta also concluded internally that 13.5 percent of teen girls believe the platform worsens thoughts of suicide, and 17 percent believe it worsens eating issues.

That same year, employees weighed commissioning a report to study bullying on Instagram. Fiona Brown, Instagram’s communications director for well-being and community initiatives, internally expressed worry that the survey would uncover instances of children under 13 experiencing bullying on the platform, according to the filing.

Another document from that year plainly reads: “We do very little to keep U13s off our platform.”