2:46 PM, 21 October 2024 PST

Meta Platforms to Enhance Content Controls for Teens

TECHNOLOGY

Meta Platforms, the parent company of Facebook and Instagram, has announced expanded content controls for teenagers on its platforms in response to global regulatory concerns about child safety. All teen accounts will be placed in the most restrictive content settings, and Instagram will restrict additional search terms to shield young users from potentially harmful content related to self-harm, suicide, and eating disorders.

The changes, set to roll out in the coming weeks, aim to create a more age-appropriate online experience for teenagers. Meta faces regulatory scrutiny in the U.S. and Europe over allegations that its apps fuel youth mental health problems and addiction. In October, attorneys general from 33 U.S. states sued the company, accusing it of misleading the public about the dangers of its platforms.

The European Commission has also requested information on how Meta protects children from illegal and harmful content. The changes are part of Meta's effort to address concerns raised by regulators and public figures, including a former employee who testified before the U.S. Senate about the company's awareness of harassment on its platforms.

Despite the changes, critics such as Arturo Bejar, the former employee who testified, argue that the company's response is inadequate and lacks mechanisms for teens to easily report unwanted content. Meanwhile, competitive pressure has intensified: young users are increasingly favoring TikTok over Facebook and Instagram.
