Greece Is Banning Social Media for Under-15s. The World Is Watching.

Greece’s social media ban for children under 15 takes effect on January 1, 2027, following a formal announcement by Prime Minister Kyriakos Mitsotakis on April 9, 2026. The ban covers platforms designed around addictive engagement features, including Facebook, Instagram, TikTok, Snapchat, and their equivalents, and applies to all children born from 2012 onward. Non-compliant platforms face fines reaching up to 6% of global annual turnover under the EU Digital Services Act. Greece’s parliament is expected to enact the enabling legislation by mid-2026.

In a February 2026 ALCO poll, conducted before the legislation was formally announced, 80% of surveyed Greeks approved of the ban.

What Is Actually Being Banned and Why

Greece’s social media ban for under-15s targets platforms that promote “endless scrolling,” the infinite scroll design feature that removes natural stopping points from a user’s browsing session, engineered to extend session duration and increase engagement metrics. The ban is not a general restriction on internet access or digital communication. It is specifically directed at the design architecture of social media platforms that Greek government research, alongside a growing international body of evidence, links to anxiety, sleep disruption, and addictive use patterns in children and teenagers.

Prime Minister Mitsotakis addressed teenagers directly in his announcement video, posted on TikTok: “I know that some of you are going to be angry. Our aim is not to keep you away from technology but to combat addiction to certain applications that harm your innocence and your freedom. Science is clear: when a child is in front of screens for hours, their brain does not rest.”

The Greek government has already implemented 2 preceding measures in this direction, prohibiting mobile phones in schools and launching parental control platforms to limit teenagers’ screen time. The January 2027 ban represents the third and most legally binding step in that progression.

The Legal Framework Enforcing the Ban

Greece’s social media age restriction operates within 3 intersecting legal frameworks: national, European, and international, each providing a distinct enforcement mechanism.

1. EU Digital Services Act (DSA) 

The DSA is the European Union’s primary regulatory instrument governing how large online platforms operate within EU member states. Platforms designated as Very Large Online Platforms (VLOPs), a category that includes the flagship services of Meta, TikTok, Alphabet, and Snap, face fines of up to 6% of global annual turnover for non-compliance with DSA obligations. Greece’s Digital Governance Minister, Dimitris Papastergiou, confirmed that enforcement of the under-15 ban will apply these DSA fine structures directly to non-compliant platforms. For Meta, whose 2025 global revenue exceeded $160 billion, the 6% cap represents potential fine exposure of more than $9.6 billion per violation cycle.
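The fine exposure arithmetic is straightforward. A minimal sketch using the figures above (roughly $160 billion in 2025 revenue for Meta, and the DSA’s 6% cap):

```python
# Sketch of the DSA fine exposure arithmetic described above.
# Figures are the article's: ~$160B 2025 revenue, 6% DSA penalty cap.
DSA_FINE_CAP = 0.06  # maximum fine: 6% of global annual turnover


def dsa_max_fine(global_annual_turnover: float) -> float:
    """Maximum DSA fine for a VLOP, as a fraction of global turnover."""
    return DSA_FINE_CAP * global_annual_turnover


meta_2025_revenue = 160e9  # USD, per the article
print(f"${dsa_max_fine(meta_2025_revenue) / 1e9:.1f}B")  # ≈ $9.6B
```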

2. EU General Data Protection Regulation (GDPR) 

GDPR Article 8 establishes that processing personal data of children under 16 requires verifiable parental consent across EU member states, with member states permitted to lower that threshold to 13. Greece’s proposed “Digital Age of Majority” at 15 aligns with and extends this existing GDPR child protection architecture, applying it specifically to social media platform access rather than data processing alone. GDPR violations carry separate fine structures of up to 4% of global annual turnover or €20 million, whichever is higher.
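The GDPR penalty rule differs from the DSA’s flat percentage cap: the ceiling is the higher of 4% of global annual turnover or €20 million, so the flat floor dominates for small firms and the percentage dominates for large ones. A minimal sketch of that rule:

```python
# Sketch of the GDPR penalty cap quoted above: the HIGHER of
# 4% of global annual turnover or a flat EUR 20 million.
GDPR_PCT_CAP = 0.04
GDPR_FLAT_CAP_EUR = 20_000_000


def gdpr_max_fine(global_annual_turnover_eur: float) -> float:
    """Maximum GDPR fine: whichever cap is higher for this firm."""
    return max(GDPR_PCT_CAP * global_annual_turnover_eur, GDPR_FLAT_CAP_EUR)


# Small firm: 4% would be EUR 4M, so the EUR 20M floor applies.
print(gdpr_max_fine(100_000_000))
# Large firm: 4% of EUR 150B dwarfs the floor.
print(gdpr_max_fine(150e9))
```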

3. UN Convention on the Rights of the Child (UNCRC) 

The UNCRC, ratified by 196 countries, making it the most widely adopted human rights treaty in history, establishes in Articles 17 and 19 that states must protect children from information and material injurious to their well-being and from all forms of harm to their mental health. Greece, Australia, the UK, France, and other nations pursuing social media age restrictions frame their legislative justifications explicitly within UNCRC obligations. This international treaty foundation provides the rights-based legal argument that national bans are not arbitrary restrictions but compliance with binding international child protection commitments.

| Legal Instrument | Jurisdiction | Applicable Provision | Maximum Penalty |
| --- | --- | --- | --- |
| EU Digital Services Act | EU member states | VLOP non-compliance | 6% of global annual turnover |
| EU GDPR Article 8 | EU member states | Child data processing without consent | 4% of global turnover or €20M |
| UN Convention on the Rights of the Child | 196 countries | Articles 17 & 19: child protection from harmful content | State obligation (no direct corporate penalty) |
| Australia Online Safety Act 2021 | Australia | Age assurance for social media platforms | AUD $50M per violation |

Greece Is Not the First and Is Pressuring the EU to Act

Australia became the first country in the world to implement a comprehensive social media ban for minors when it restricted access for all users under 16 in December 2025. The Australian ban covers TikTok, Instagram, YouTube, Snapchat, X, Facebook, and equivalent platforms. Meta, Snapchat, and TikTok publicly stated they believed the Australian ban would not protect young people, and then committed to complying with it anyway. Meta subsequently reported removing access to approximately 550,000 accounts belonging to users believed to be under 16 in Australia.

Greece’s ban at 15 follows Australia’s lead while setting a different age threshold and operating within a more complex regulatory environment. An EU member state cannot adopt national digital regulation that conflicts with EU-level law without risking legal challenge. State Minister Akis Skertsos addressed this constraint directly: “National legislation is linked and influenced to a large extent by EU legislation. Unless we have an EU legislative framework, national legislation alone will be ineffective.”

This is precisely why Mitsotakis wrote separately to European Commission President Ursula von der Leyen on April 9, calling for the following 3 EU-wide actions:

  1. Establishing a unified “Digital Age of Majority” at 15 across all member states.
  2. Mandating age verification and biannual re-verification for all platforms operating in the EU.
  3. Creating a harmonized enforcement and penalty framework, with a deadline of end-2026 for implementation.

The United Kingdom, Malaysia, France, Denmark, and Poland are each either considering or actively legislating comparable social media age restrictions.

The Age Verification Problem Nobody Has Solved Yet

Greece’s ban carries one significant implementation gap that the Australian experience has already exposed: social media platforms currently lack reliable, privacy-preserving mechanisms to verify user ages at scale. Under existing EU law, Greece cannot yet compel platforms to verify user ages. The government’s current position recommends that platforms use age verification mechanisms already established under EU and Greek frameworks, a recommendation rather than a mandate, while urging parents to support enforcement directly.

Australia’s non-profit OpenAge Initiative launched AgeKeys in 2025, described as the first interoperable, privacy-preserving set of global age verification signals, designed to help platforms identify and remove underage accounts without collecting excessive personal data. AgeKeys represents the kind of technical infrastructure that national bans require to function at enforcement scale. Without equivalent tools embedded in platform architecture, age bans depend on user self-reporting and platform goodwill, neither of which reliably excludes 14-year-olds from Instagram.
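The core idea behind such signals can be sketched in a few lines. This is a hypothetical illustration, not the AgeKeys API: the token format, shared key, and function names below are invented. The point is that a trusted issuer signs a minimal “holder is over N” claim, so the platform can check the age threshold without ever seeing a birth date or identity document.

```python
# Hypothetical sketch of a privacy-preserving age signal, in the spirit
# of tools like AgeKeys. NOT a real API: token format, key handling, and
# names are invented for illustration. Real systems would use PKI rather
# than a shared secret.
import hashlib
import hmac

ISSUER_KEY = b"demo-shared-secret"  # placeholder signing key


def issue_age_token(user_id: str, over_age: int) -> str:
    """Issuer side: sign a minimal 'over N' claim for this user."""
    claim = f"{user_id}:over{over_age}"
    sig = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return f"{claim}:{sig}"


def platform_allows(token: str, required_age: int = 15) -> bool:
    """Platform side: verify the signature and the age threshold only."""
    claim, _, sig = token.rpartition(":")
    expected = hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    asserted = int(claim.rsplit("over", 1)[1])
    return asserted >= required_age
```

The platform never learns the user’s actual age, only whether a trusted party vouches that the threshold is met; that separation is what makes the approach privacy-preserving.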

Digital Governance Minister Papastergiou confirmed that from January 1, 2027, platforms that do not demonstrate the ability to restrict underage users will face DSA-structured fines. The compliance burden falls on the platforms, not on children or parents, to navigate technical workarounds.

Our Honest Opinion

Well, it is a step in the right direction on a genuinely hard problem.

Greece’s under-15 social media ban is directionally correct and legally grounded in 3 overlapping frameworks that give it real enforcement teeth. Voluntary platform commitments, by contrast, proved insufficient to prevent systematic harm to child users, as the California jury verdict of March 25, 2026, demonstrated. A Los Angeles jury finding Meta and Google liable for designing addictive platforms, and a Greek prime minister announcing a legislative ban, are 2 branches of the same conclusion: the social media industry’s self-regulatory era is ending.

The honest difficulty is technical. Age verification at the scale of billions of global users, without creating surveillance infrastructure or requiring children to submit biometric data to access the internet, is an unsolved engineering and policy problem. Australia is 4 months into its ban. Greece is 9 months from its implementation deadline. The EU has been asked to deliver a unified framework by the end of 2026. AgeKeys and comparable tools are promising. None of them is deployed at the scale any of these bans requires.

The legislation is arriving faster than the infrastructure to enforce it. That gap is the next problem, and solving it determines whether these bans protect children or simply displace the problem onto less visible platforms with weaker compliance incentives.

Platform regulation, child safety legislation, and the legal frameworks reshaping how social media companies operate globally are covered at The IT Horizon. Subscribe to our newsletter. We track every law, verdict, and policy shift that affects how technology companies are held accountable.

