
Debate on Social Media Ban in Germany Ends Inconclusively

Age-Based Restrictions on TikTok and Similar Platforms


The European Union is moving to better protect minors on social media, with individual Member States setting and enforcing minimum age limits for platform use. This push is underpinned by the Digital Services Act (DSA), which came into effect in mid-2025 [1][3].

The DSA requires large platforms such as TikTok, Instagram, and YouTube to implement stringent age-verification and safety measures for minors [1][3]. Rules vary by Member State, with some countries leading the way nationally: France has proposed barring users under 15 from TikTok and Instagram without parental consent, while Denmark, Italy, Spain, Ireland, and Germany are exploring or implementing mandatory age verification or minimum-age restrictions [1][5].

From December 2025, EU-wide practice is expected to align with a ban on users under 16 creating accounts on "age-restricted" platforms, including Facebook, Instagram, TikTok, Snapchat, X, and YouTube [2]. The shift aims to raise the digital age of majority and better protect child users across the bloc.

The DSA mandates privacy-respecting, legally enforceable age-verification systems by July 25, 2025 [1][3]. It also sets out broad child-protection standards, including default private profiles for minors, safer content recommendations, limits on manipulative monetization, and restrictions on features that encourage addictive behavior or exploitation. Although the DSA sets a baseline, the General Data Protection Regulation (GDPR) allows Member States to set age limits anywhere between 13 and 16, leading to varying national thresholds and enforcement across the EU [4].

Greece, France, Spain, Denmark, Cyprus, and Slovenia are meanwhile pushing the European Commission to establish a unified EU-wide age-verification system and agree on a common digital age of majority, with the aim of harmonizing protections and minimum-age rules in the near future [5].

Australia has gone further, implementing a strict ban on social media use by anyone under 16 [6]. Age verification there is already mandatory, but the rules are poorly enforced and easily circumvented.

In Germany, politician Cem Özdemir has called for a ban on TikTok and similar platforms for users under 16, while Education Minister Karin Prien has announced that an expert commission will examine stricter rules for social networks in the coming weeks [7]. A new study by the Leopoldina suggests a link between social media use and eating disorders, adding to concerns about the platforms' impact on young users [8].

A YouGov survey shows that 70% of Germans support an age limit on social media, though most respondents were over 55 and no minors were polled [9]. In Australia, social media providers face fines of up to €31 million if they fail to implement age controls [10].

In conclusion, although no single EU-wide minimum age for social media use is yet legally fixed, certain large platforms must block users under 16 by default from late 2025. Member States are increasingly introducing stricter, sometimes higher age limits that require parental consent or ban younger minors outright, with the DSA enforcing robust age-verification and safety standards overall [1][2][5].

