Prime Highlights:
- The European Parliament’s Renew Europe group is urging the EU to take stronger action against the harmful effects of social media on young people’s mental health.
- Proposed measures include child-safe defaults, age verification, and stricter regulation of addictive algorithms.
Key Facts:
- In 2022, 96% of adolescents in the EU used social media daily, with 37% spending more than three hours online.
- The EU plans to use existing legislation, such as the Digital Services Act (DSA) and GDPR, and may introduce additional measures through the proposed Digital Fairness Act.
Key Background:
The Renew Europe group in the European Parliament is urging the European Commission to take stronger action against the harmful effects of social media on young people’s mental health. In a recent paper, the group suggested using existing EU rules, such as the Digital Services Act (DSA) and the GDPR, to reduce the negative impact of heavy social media use.
Commission data from 2022 showed that 96% of adolescents in the EU used social media daily, with more than one-third spending over three hours online.
MEP Veronika Ostrihoňová said that children are being shaped by algorithms that grab attention and influence feelings, calling it a real concern for many families that the EU must address.
The Renew Europe initiative coincides with growing awareness in Brussels about the risks of excessive social media use. Just last week, EU digital ministers signed the Jutland Declaration, emphasising the importance of protecting children online. European Commission President Ursula von der Leyen has also highlighted the issue, calling social media use among children “profit-seeking and harmfully addictive” and stressing that “parents, not algorithms, should be raising our children.”
Renew Europe is advocating clear rules to tackle addictive algorithms, including mandatory child-safe defaults such as disabling autoplay videos, limiting night-time notifications, blocking screenshots of children’s content, and removing harmful filters.
The group is also pushing for an EU-wide age verification system built on the European Digital Identity Wallet, as well as exploring standardized minimum or tiered age limits for social media access.
Enforcement of the rules would fall to EU countries, while the Commission updates its child protection guidelines, and further measures may follow through the proposed Digital Fairness Act. The EU aims to treat social media addiction as a health issue and hold platforms accountable.