Facebook and Instagram have been profiting from placing corporate adverts from companies such as Walmart and Match Group next to content potentially promoting child sexual exploitation, a legal filing alleges.

The accusation is the latest in an explosive lawsuit initiated in December by Raúl Torrez, the New Mexico attorney general, against Meta claiming the company “enabled adults to find, message and groom minors” for sexual exploitation. The suit follows a Guardian investigation in April, which revealed that the tech giant is struggling to prevent people from using its platforms to buy and sell children for sex.

“New evidence indicates Meta officials are duping corporate advertisers and permitting sponsored content to appear alongside deeply disturbing images and videos that clearly violate Meta’s promised standards,” said Torrez in a statement. “Mr Zuckerberg and Meta are refusing to be honest and transparent about what is taking place on Meta’s platforms.”

The legal complaint, reviewed by the Guardian, quotes correspondence among Meta, Walmart and Match, the owner of the dating apps Tinder and Hinge. The advertisers objected to their material being placed next to graphic and potentially illegal content, according to the filing.

In early November, Match notified Meta that ads for its dating apps had appeared alongside “disturbing” content on Reels, short videos posted by users on Facebook and Instagram, according to the complaint. The company allegedly said it believed some of the content in the Reels “is clearly promoting illegal and exploitative businesses” and included provocative images of young girls. Match adverts were also featured in a Facebook Group titled “Only women are slaughtered”, which showed graphic films of women being murdered, according to the complaint.

“We need to quickly figure out how we stop this from happening on your platforms,” a representative for Match wrote, according to one email quoted in the filing.

When Meta failed to address these concerns, Match’s CEO, Bernard Kim, allegedly wrote directly to Meta’s chief executive, Mark Zuckerberg, stating: “Meta is placing ads adjacent to offensive, obscene – and potentially illegal – content, including sexualization of minors and gender-based violence.” Kim’s letter also highlighted that Match spent millions of dollars on advertising on Meta but “our ads are being serviced to your users viewing violent and predatory content”. Zuckerberg did not respond to the letter, the complaint says.

Meanwhile, in October, Walmart emailed Meta with concerns that the tech giant’s “level of attention/consideration” to brand safety issues “has disappeared”.

Meta confirmed that Walmart advertisements were being displayed on unapproved channels, responding that “there is some minimal exposure to placements that you’ve not approved”, according to the complaint. Walmart’s marketing representatives allegedly continued to question why their adverts were running next to illicit content, eventually becoming so frustrated that one called Meta’s response to the problem “unacceptable”.

“Candidly, we were disappointed that your team seemed more focused on getting a press statement right than on addressing this problem,” one representative wrote, according to the complaint.

Torrez wrote in the complaint: “The experiences of Match and Walmart are emblematic of a larger problem … Meta’s claims regarding the content of its platforms are false and that its tools are ineffective.”

In response to the filing, Walmart said in a statement: “We take brand safety issues extremely seriously, and protecting our customers and communities will always be a top priority.”

Meta and Match did not respond to requests for comment by press time.

The filing also contains new evidence that child predators are allegedly able to find victims through Instagram. It includes excerpts of users allegedly discussing how to lure minors into engaging with them, highlighting the absence of controls to prevent unknown adults from messaging minors on the social network. A former Instagram employee testified in November before Congress that his own daughter had received unwanted online advances and that, when he notified senior Meta leadership, he was ignored.

Before filing the lawsuit, investigators at the New Mexico attorney general’s office conducted their own investigation into child sexual exploitation taking place on Meta’s platforms. According to the lawsuit, they “found numerous posts and accounts on Instagram that depicted or promoted choking, slapping, tying up, engaging in sex with, and otherwise abusing little girls”.

According to the filing, investigators reported all of the images and videos they found to Meta, but the company removed only half of them.

“Investigators found that content that was removed frequently reappeared or that Meta recommended alternative, equally problematic content to users – demonstrating both that Meta is capable of identifying this content but incapable of effectively dealing with it,” the lawsuit states.