
Lawsuit alleges that more than 100,000 children receive sexually explicit content on Facebook and Instagram

The lawsuit contends that Meta's internal findings were brought to the attention of executives years ago but were dismissed

Meta, the parent company of Facebook and Instagram, is under scrutiny as a lawsuit filed by New Mexico alleges that the platforms recommend sexual content to underage users and promote underage accounts to predatory adults. Recently disclosed documents from 2021 reveal a startling statistic: an estimated 100,000 minors receive explicit content, including photos of adult genitalia, every day. The lawsuit contends that Meta’s internal findings were brought to the attention of executives years ago but were dismissed.

Algorithmic Concerns: Internal documents from 2021 cited in the lawsuit show Meta employees reporting that the “People You May Know” (PYMK) algorithm, which recommends new connections to users, was known to link child users with potential predators. Employees expressed distress, with one estimating that the algorithm contributed up to 75 percent of inappropriate adult-minor contact. Despite these concerns, Meta allegedly rejected suggestions to adjust the algorithm.

Employee Concerns: Comments appended to the report include employees asking why PYMK had not been turned off between adults and children, and expressing deep concern about the situation. The lawsuit claims that Meta executives were aware of these issues but failed to take necessary action.

Meta’s Response: Meta has not directly commented on the newly unsealed documents but has accused New Mexico of mischaracterizing its work through selective quotes. The company has defended itself by describing child predators as “determined criminals” and asserting that it has invested significantly in enforcement and safety tools for young users and their parents.

Internal Communication on Instagram: An internal email from 2020 highlighted that the prevalence of “sex talk” directed at minors on Instagram was 38 times greater than on Facebook Messenger in the US. The email urged the company to implement more safeguards on the platform. A November 2020 presentation titled “Child Safety: State of Play” criticized Instagram for having “minimal child safety protections” and immature policies regarding “minor sexualization” and trafficking.

Wider Legal Implications: The New Mexico lawsuit is not an isolated case; in October, more than 40 other US states sued Meta, alleging that the company misled the public about the dangers its platforms pose to young people, underscoring the broader legal battle the company now faces.

Meta’s Countermeasures: In response to growing concerns, Meta announced plans to automatically restrict harmful content on teen Instagram and Facebook accounts, including videos and posts related to self-harm, graphic violence, and eating disorders. The move is seen as an attempt to address criticism and improve child safety on its platforms.
