The European Union is probing Meta, the parent company of Facebook and Instagram, over concerns about its platforms’ addictive nature, particularly among children. This investigation mirrors earlier inquiries into TikTok.
Regulators will focus on the addictive effects of Meta-owned platforms and how young users are exposed to content related to depression or unrealistic body images. They’ll also assess the effectiveness of measures to prevent children under 13 from accessing these services.
Thierry Breton, the EU’s internal market commissioner leading the investigation, doubts that Meta is meeting its obligations under the Digital Services Act (DSA), which aims to safeguard young Europeans’ physical and mental well-being online.
Meta spokesperson Kirstin MacLeod highlighted the company’s efforts to protect young users, mentioning the development of over 50 tools and policies. She acknowledged the industry-wide challenge and expressed Meta’s commitment to collaborating with the European Commission.
Although the investigations into Meta and TikTok are separate, they address similar concerns, given overlaps in the platforms’ functionality and market dynamics, according to a commission spokesperson.
The impact of social media on children’s mental health has sparked significant concern. The Digital Services Act, which took effect last year, aims to safeguard users’ fundamental rights online and has prompted ongoing investigations into platforms including AliExpress, Facebook, Instagram, TikTok, TikTok Lite, and X. Platforms found in violation could face fines of up to 6 percent of their global annual revenue.
TikTok recently suspended its points-for-views reward system on TikTok Lite amid mounting scrutiny. Breton emphasized the importance of protecting children from social media’s potentially harmful effects, stressing that they shouldn’t be used as experimental subjects.