Internal warnings that Meta’s encryption rollout would severely limit child exploitation detection went unheeded as the company proceeded with plans to encrypt Facebook Messenger and Instagram direct messages, newly released court documents reveal.
“We are about to do a bad thing as a company. This is so irresponsible,” wrote Monika Bickert, Meta’s head of content policy, in a March 2019 internal chat exchange as CEO Mark Zuckerberg prepared to announce default end-to-end encryption for Messenger and Instagram.
The documents, filed in a New Mexico state court case, show senior safety executives expressed grave concerns that encryption would eliminate their ability to proactively detect and report child exploitation cases.
A February 2019 briefing document estimated that if Messenger had been encrypted the previous year, Meta’s reports of child nudity and sexual exploitation imagery to the National Center for Missing and Exploited Children would have dropped from 18.4 million to 6.4 million, a 65% reduction. A later update warned the company would have been “unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings.”
Despite these internal warnings, Meta announced its encryption plan in 2019 and completed the rollout across Facebook Messenger and Instagram direct messages by 2023. The documents show Bickert accusing the company of making "gross misstatements of our ability to conduct safety operations" even as it publicly promoted encryption on privacy grounds.
"I'm not very invested in helping him sell this, I must say," Bickert wrote of Zuckerberg's efforts to promote encryption. "With end-to-end encryption, there is no way to find the terror attack planning or child exploitation" and proactively refer those cases to law enforcement.
Antigone Davis, Meta’s Global Head of Safety, highlighted specific risks in a 2019 email: “FB allows pedophiles to find each other and kids via social graph with easy transition to Messenger.” She contrasted this with WhatsApp, noting that “WA does not make it easy to make social connections, meaning making Messenger e2ee will be far, far worse than anything we have seen/gotten a glimpse of on WA.”
The internal communications emerged as part of a lawsuit brought by New Mexico Attorney General Raúl Torrez alleging Meta allowed predators unfettered access to underage users on its platforms. The trial began earlier this month in Santa Fe and marks the first case of its kind against Meta to reach a jury.
“The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats,” Meta spokesperson Andy Stone told Reuters.
Under the current system implemented in 2023, messages are encrypted by default but users can still report objectionable conversations for review and possible referral to law enforcement. Meta also introduced special protections for underage users designed to prevent adults from initiating contact with minors they do not know.
The New Mexico case specifically accuses Meta of misrepresenting the safety implications of its encryption plan while internally acknowledging the risks it posed for child protection efforts.