SAN FRANCISCO (AP) — Newly unredacted documents from New Mexico's lawsuit against Meta underscore the company's "historical reluctance" to keep children safe on its platforms, the complaint says.
New Mexico's Attorney General Raúl Torrez sued Facebook and Instagram, saying the company failed to protect young users from exposure to child sexual abuse material and allowed adults to solicit explicit imagery from them.
In the passages freshly unredacted from the lawsuit Wednesday, internal employee messages and presentations from 2020 and 2021 show Meta was aware of issues such as adult strangers being able to contact children on Instagram, the sexualization of minors on that platform, and the dangers of its "people you may know" feature that recommends connections between adults and children. But Meta dragged its feet when it came to addressing the issues, the passages show.
Instagram, for instance, began restricting adults' ability to message minors in 2021. One internal document referenced in the lawsuit shows Meta "scrambling in 2020 to address an Apple executive whose 12-year-old was solicited on the platform, noting 'this is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store.'" According to the complaint, Meta "knew that adults soliciting minors was a problem on the platform, and was willing to treat it as an urgent problem when it had to."
In a July 2020 document titled "Child Safety — State of Play (7/20)," Meta listed "immediate product vulnerabilities" that could harm children, including the difficulty of reporting disappearing videos, and confirmed that safeguards available on Facebook were not always present on Instagram. At the time, Meta's reasoning was that it did not want to block parents and older relatives on Facebook from reaching out to their younger relatives, according to the complaint. The report's author called the reasoning "less than compelling" and said Meta sacrificed children's safety for a "big growth bet." By March 2021, though, Instagram was restricting people over 19 from messaging minors.
In a July 2020 internal chat, meanwhile, one employee asked, "What specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?" The response from another employee was, "Somewhere between zero and negligible. Child safety is an explicit non-goal this half" (likely meaning half-year), according to the lawsuit.
In a statement, Meta said it wants teens to have safe, age-appropriate experiences online and has spent "a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective quotes and cherry-picked documents."
Instagram also failed to address the issue of inappropriate comments under posts by minors, the complaint says. That's something former Meta engineering director Arturo Béjar testified about. Béjar, known for his expertise on curbing online harassment, recounted his own daughter's troubling experiences with Instagram.
"I appear before you today as a dad with firsthand experience of a child who received unwanted sexual advances on Instagram," he told a panel of U.S. senators in November. "She and her friends began having awful experiences, including repeated unwanted sexual advances, harassment."
A March 2021 child safety presentation noted that Meta is "underinvested in minor sexualization on (Instagram), notable on sexualized comments on content posted by minors. Not only is this a terrible experience for creators and bystanders, it's also a vector for bad actors to identify and connect with one another." The documents underscore the social media giant's "historical reluctance to institute appropriate safeguards on Instagram," the lawsuit says, even when those safeguards were available on Facebook.
Meta said it uses sophisticated technology, hires child safety experts, reports content to the National Center for Missing & Exploited Children, and shares information and tools with other companies and law enforcement, including state attorneys general, to help root out predators.
Meta, which is based in Menlo Park, California, has been updating its safeguards and tools for younger users as lawmakers pressure it on child safety, though critics say it has not done enough. Last week, the company announced it will hide inappropriate content from teenagers' accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.
New Mexico's complaint follows a lawsuit filed in October by 33 states that claim Meta is harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.
"For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation," Torrez said in a statement. "While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta's internal data and presentations show the problem is severe and pervasive."
Meta CEO Mark Zuckerberg, along with the CEOs of Snap, Discord, TikTok and X, formerly Twitter, is scheduled to testify before the U.S. Senate on child safety at the end of January.