Meta has announced plans for a new safety tool aimed at preventing children from sending and receiving nude images, even in encrypted chats. The optional feature, which is also expected to be available to adults on Instagram and Facebook, is designed to protect users, particularly women and teenagers, from unsolicited explicit content. The move follows criticism of Meta’s decision to encrypt Messenger chats by default, which raised concerns about the potential impact on detecting child abuse.
The company emphasizes that the new tool is intended solely to safeguard users and discourage the sharing of inappropriate content. The step follows warnings from police chiefs that explicit image-sharing is contributing to a rise in sexual offenses committed by children.
Meta’s decision to encrypt Facebook Messenger chats by default has drawn backlash from authorities, who argue that it hampers efforts to identify and report child abuse material. The company maintains, however, that the new feature is not client-side scanning: it uses machine learning to identify nudity, and the analysis runs entirely on the user’s device to preserve privacy. Meta argues that using machine learning for child abuse detection across its vast user base could lead to errors and false reports.
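Meta has not published implementation details, but the behavior it describes, classifying images locally and obscuring suspected nudity before display, can be sketched in a few lines. The following is a minimal illustration, not Meta’s code: the `nudity_score` heuristic, the threshold, and the blur radius are all hypothetical stand-ins for a real on-device model.

```python
from PIL import Image, ImageFilter

FLAG_THRESHOLD = 0.8  # hypothetical confidence cutoff

def nudity_score(image: Image.Image) -> float:
    # Stand-in for the on-device classifier: a crude skin-tone
    # pixel fraction. A production system would run a
    # mobile-optimized neural network here instead; the key point
    # is that inference happens locally, so no image data leaves
    # the device.
    rgb = image.convert("RGB")
    pixels = list(rgb.getdata())
    skin = sum(
        1 for r, g, b in pixels
        if r > 95 and g > 40 and b > 20 and r > g and r > b
    )
    return skin / max(len(pixels), 1)

def prepare_incoming_image(image: Image.Image) -> tuple[Image.Image, bool]:
    # Blur flagged images before display; the recipient can still
    # choose to reveal the original. Nothing is sent or reported
    # off-device, which is what distinguishes this design from
    # client-side scanning.
    if nudity_score(image) >= FLAG_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=30)), True
    return image, False
```

In this sketch, a messaging client would call `prepare_incoming_image` just before rendering an attachment; a real deployment would replace the heuristic with a trained model shipped inside the app.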
Alongside this feature, Meta says it has introduced more than 30 child-safety tools. For instance, minors on Instagram and Messenger will by default be unable to receive messages from people they do not follow. Parental supervision tools have also been enhanced: parents can now deny teenagers’ requests to change default safety settings, giving them greater oversight of their teens’ online experience.