A New Mexico complaint against Mark Zuckerberg’s Meta has been unredacted to reveal horrifying claims of sexual harassment against children on Facebook and Instagram. One internal presentation estimated that 100,000 children a day are targeted by pedophiles on Zuckerberg’s platforms, including by receiving “pictures of adult genitalia.”
Digital Content Next CEO Jason Kint noted in an X/Twitter thread that with “most redactions just now removed in the New Mexico attorney general’s complaint vs Instagram and Facebook,” the filing is “even more shocking, infuriating and stomach turning” than people may have realized. The mostly unredacted complaint is available here.
woah, most redactions just now removed in the New Mexico attorney general’s complaint vs instagram and facebook. it is even more shocking, infuriating and stomach turning to see what has now been lifted (in yellow).
“a 2021 presentation estimated 100,000 children per day…” /1 pic.twitter.com/k8vJGfeZvQ
— Jason Kint (@jason_kint) January 21, 2024
In one highlighted section, the complaint claims that “100,000 children per day received online sexual harassment, such as pictures of adult genitalia.”
The complaint goes on to state the following:
One internal document shows Meta scrambling in 2020 to address an Apple executive whose 12-year-old was solicited on the platform, noting, “this is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store,” asking whether there was a timeline for when “we’ll stop adults from messaging minors on IG Direct,” and noting that if they did not address other accounts with “sugardaddy” — “they will reply with 100 more accounts if we’re not able to take them down.”
“A May 2018 presentation to Meta’s Audit Committee confirms this fact: ‘While user-provided data indicates a decline in usage among young users, this age data is unavailable because a disproportionate number of our young users register with an inaccurate age,’” the suit continues.
“Two years later, a January 2020 presentation entitled, ‘Succeeding in US Messaging’ demonstrated both the depth of Meta’s knowledge regarding usage of Messenger by children and its ambitions to leverage that usage to further entice younger generations to use its products,” the complaint reads.
The complaint goes on to say Meta is aware that its platforms are popular with children as young as age six:
One of Meta’s “End Game” goals was to “become the primary kid messaging app in the US by 2022.” The document confirmed that “in the US — Messenger is popular… with Kids (13% primacy, US 510).” Meta’s knowledge that its platforms were used by and “popular with” children as young as 6-years-old makes its failures to protect minors against CSAM and solicitation all the more egregious.
so much of these allegations point toward internal knowledge during harm. And there still are redactions we don’t get to see. I can’t imagine how bad still sealed items like this may be. /3 pic.twitter.com/nQM5MMSLnD
— Jason Kint (@jason_kint) January 21, 2024
Elsewhere in the thread, Kint highlights an internal conversation cited in the complaint, in which one Meta employee states that nothing is being done to counter child grooming on the company’s social media platforms, adding, “I’d argue we’re making it worse.”
“What specifically are we doing for child grooming,” an employee asks, to which a second employee replies, “Somewhere between zero and negligible. Child safety is an explicit non-goal this half. I’d argue we’re making it worse with Interop, but that’s a can of worms.”
“somewhere between zero and negligible”
This internal quote was in an AP report on a prior version of this complaint which was sealed this weekend.
On Instagram vs Facebook….
“prevalence of ‘sex talk’ to minors is 38X on Instagram Direct vs Facebook Messenger…”
/5 pic.twitter.com/kMMqdw57K8
— Jason Kint (@jason_kint) January 21, 2024
Moreover, the complaint alleges that Meta’s “People You May Know” (PYMK) feature contributed to 75 percent of all inappropriate adult-minor contact and “had a direct link to trafficking,” prompting employees to ask, “How on earth have we not just turned off PYMK between adults and children?”
Another apparent internal conversation showed an employee saying, “teenage self harm and suicide are so difficult to explain publicly that our current response looks convoluted and evasive… The fact that we have age limits which are unenforced (unenforceable?) and that there are, as I understand it, important differences in the stringency of our policies on IG vs Blue App [Facebook] makes it difficult to claim we are doing all we can.”
After that, someone else chimed in, asking whether Meta could improve its policies or whether it was a question of enforcement, adding, “We can definitely say that we need to improve our enforcement or our policies.”
This is interesting. As Facebook and Instagram apparently leaned into interoperability its branding and infrastructure after risks of structural and behavioral remedies from germany/EU and the US, it creates risk with inconsistent enforcement policies as noted here. /7 pic.twitter.com/wSoPrmfBWl
— Jason Kint (@jason_kint) January 21, 2024
Another chart highlighted in the complaint appears to show that Instagram’s own research found that 13 to 15-year-old users were more likely than average to be exposed to adult nudity and sexual activity.
This chart. So according to Instagram’s own internal research, 13-15 year olds were more likely than avg users to be exposed to adult nudity and sexual activity? Then see last column showing data they publicly release which may mislead/suppress as to the potential harms. /8 pic.twitter.com/WmEL1MUthh
— Jason Kint (@jason_kint) January 21, 2024
You can follow Alana Mastrangelo on Facebook and X/Twitter at @ARmastrangelo, and on Instagram.