EU Parliament Calls for Stricter Social Media Age Limits to Protect Minors – 26 November 2025

The European Parliament recommends new social media age limits to safeguard minors, proposing parental consent rules and stronger platform safety measures.

Raja Awais Ali

11/26/2025 · 2 min read

On 26 November 2025, the European Parliament adopted a resolution calling for stricter age limits on social media platforms to protect minors from online harm. Although the resolution is not legally binding, it signals a strong political push for EU-wide rules on children’s access to digital platforms.

Under the proposed framework, children under 16 could not use social media platforms, video-sharing apps, online games, or AI chatbots without verified parental consent. The resolution further recommends a complete ban on social media use for children under 13. Lawmakers argue that the digital environment has become increasingly unsafe for minors, who face addictive design features, harmful content, and risks to their mental health.

Recent studies cited by the European Parliament reveal alarming trends. Approximately 97% of minors across Europe go online every day, and around 78% of 13- to 17-year-olds check their devices at least hourly. Moreover, nearly one-quarter of adolescents are reported to engage in “problematic or dysfunctional” social media use, a pattern associated with anxiety, poor sleep, emotional distress, and reduced concentration.

The resolution also criticizes the manipulative design strategies many digital platforms employ. Lawmakers urge social media companies to restrict or remove features such as infinite scrolling, autoplay videos, engagement-optimized recommendation algorithms, gamified rewards, loot boxes, and other mechanisms that push minors toward excessive time online. According to the Parliament, these features are deliberately designed to maximize user attention, making them especially harmful to developing minds.

Furthermore, the Parliament called for strict penalties, including heavy fines or platform restrictions, for companies that repeatedly fail to comply with child-safety rules. It also suggested that senior executives of such companies could be held personally accountable if their platforms endanger minors by violating safety obligations.

While the resolution does not enforce immediate legal changes, it plays a crucial role in shaping future EU legislation. The European Union is already a global leader in digital governance through major regulations such as the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR). The latest resolution is expected to influence upcoming policy proposals from the European Commission and could lead to legally binding age-verification standards across all EU member states.

The issue extends beyond Europe, as governments, educators, and parents around the world are increasingly concerned about the impact of social media on children. The rapid expansion of digital technology has provided young people with unprecedented access to information, but it has also exposed them to privacy risks, cyberbullying, addictive behavior, and harmful online communities. The EU’s latest initiative highlights a growing global consensus that children require stronger digital protection.

The European Parliament’s 26 November 2025 resolution makes one point clear: establishing age limits and safe digital environments for minors is no longer optional; it is an urgent necessity. Stronger regulation would help safeguard minors’ mental health, promote responsible digital habits, and build a safer online future for the next generation.