Meta Platforms Inc. META, the parent company of Instagram and Facebook, is grappling with child-safety issues on its platforms.

Despite efforts, including a task force established in June, Meta struggles to curb a network of pedophile accounts. 


Instagram’s algorithms reportedly connected accounts involved in underage sex content creation and trading.  

Meta’s recommendation systems continue to promote such content, according to recent tests cited by the Wall Street Journal and the Canadian Centre for Child Protection.

While Meta has removed problematic hashtags and accounts, its efforts have been inconsistent, with new variations of banned content emerging, WSJ notes. 

The issue extends to Facebook Groups, where large groups sexualize children, and Meta’s algorithms often suggest similar groups.

A Meta spokesman told the WSJ that the company has hidden numerous groups and disabled accounts, but acknowledged that progress has been slower than desired.

The company is enhancing tools to limit algorithmic connections among pedophiles and targeting forums that attract them. 

However, broader cost cuts have led to layoffs in safety staff, including child-safety specialists.

Previous WSJ reports indicated that Instagram’s algorithm also recommended inappropriate content, including sexualized videos, to specific user profiles. 

Price Action: META shares traded lower by 0.54% at $325.29 premarket at last check Friday.

Disclaimer: This content was partially produced with the help of AI tools and was reviewed and published by Benzinga editors.