EU Digital Single Market Governance: More Public Intervention, More Responsibility

By Sebastian Heidebrecht

In the 1990s, European policymakers argued that “the creation of the information society in Europe should be entrusted to the private sector and to market forces” (European Council, 1994). In this regard, the European Union’s (EU) market integration has often been portrayed as nothing more than a Trojan horse of neoliberalism. EU leaders’ more recent rhetoric indicates policy change, including French President Emmanuel Macron’s aim to strengthen strategic autonomy, former German Chancellor Angela Merkel’s embrace of the term digital sovereignty, Internal Market Commissioner Thierry Breton’s call to boost Europe’s digital technology capabilities, and European Commission President Ursula von der Leyen’s announcement that she would lead a geopolitical Commission.

In a recent article on the EU’s governance of its digital single market (which can be found here), I show that the EU’s approach has indeed shifted from a market-liberal tradition towards more public intervention. Research on digital policy has shown that the EU has aimed for a regulatory approach to internet governance that lies somewhere between the United States’ laissez-faire approach and China’s state-controlled model. To establish such a distinct path, the Commission used a dual approach, combining policies that tend to be market-correcting, as in the area of consumer protection, with ones that tend to be market-making, such as eliminating obstacles that cause market fragmentation.

Challenges of the digital economy require a regulatory response

Over the last decade, public scandals and problematic business practices necessitated policy change. For example, the revelations by Edward Snowden in 2013 highlighted the problematic practices of many big US technology companies and intelligence services. The involvement of the private consulting firm Cambridge Analytica in the 2016 US election campaign, and Russian interference in that same election, were seen as examples of political interference that could also threaten the 2019 European elections.

In this context, EU policymakers changed their rhetoric, stressing the need to take back control in the digital sphere and calling for digital sovereignty. The EU’s digital sovereignty discourse in cybersecurity presents foreign companies as security threats rather than partners. In trade policy, the EU has moved towards open strategic autonomy and away from neoliberalism. The same change is occurring in key areas of the EU’s digital single market, such as data protection and digital services regulation.

Less market-liberal, more public-interventionist EU governance

These substantial changes can be demonstrated by tracing the historical evolution of digital single market governance. As the internet was commercialised in the 1990s, the EU pursued a pronounced market-liberal agenda. The famous Bangemann report promoted “a market-driven revolution” to achieve European competitiveness. In this period, two important pillars of the digital single market were designed: the 1995 Data Protection Directive and the 2000 E-Commerce Directive. Data protection was also geared towards preventing fragmentation of the single market, while e-commerce regulation kept intervention in online business models weak and established limited liability rules for service providers hosting online content in order to promote online activity. Both laws shaped EU governance in the field for more than two decades.

Around twenty years later, in its 2015 Digital Single Market Strategy, the Juncker Commission promised to undertake a comprehensive assessment of the social and economic effects of the digital economy, and of large platforms in particular. On a rhetorical level, the Commissioner for the Digital Economy, Günther Oettinger, sought to regain the “digital sovereignty” that the EU had forfeited and to reassert digital independence. During that time, public salience shocks in the context of the Snowden revelations led to stronger EU data protection rules in the form of the General Data Protection Regulation (GDPR). While the GDPR, with rules such as the right to be forgotten, is a step towards more public-interventionist governance, it also retained the more market-liberal country of origin principle, under which digital companies are supervised by the national authorities of the Member State in which they are established.

A 2016 Commission report on the effects of the platform economy identified the need to increase transparency (which was addressed in a 2019 regulation) and, in particular, challenges posed by large platform companies in the areas of content moderation and fair competition. In response, on 15 December 2020, the Commission proposed a digital services package consisting of two regulations. The Digital Services Act (DSA) deals predominantly with content moderation and introduces rules that intervene in platform business practices, such as a prohibition of misleading tricks that manipulate users (‘dark patterns’) and a prohibition of behavioural advertising targeted at minors. The Digital Markets Act (DMA) aims to ensure fair competition. It defines a full catalogue of do’s and don’ts for very large online platforms designated as gatekeepers, such as allowing business users to access the data that they generate and ensuring that users can uninstall any pre-installed software or app. By addressing market structure and not only low prices, the DMA’s “ex-ante” powers resemble economic regulation more than competition policy in some respects.

More power brings more responsibility

The interventionist digital services package appears to learn from the weaknesses of the GDPR. Because most big tech companies have their European headquarters in Ireland, the market-liberal country of origin principle leaves the Irish data protection authority in charge of many cases; a 2021 report found that 98 percent of major cases remained unresolved. The package therefore moves policing powers over very large online platforms to the European Commission. For the first time beyond competition policy, the Commission can now impose substantial fines: up to six percent of a company’s global revenue under the DSA, two percentage points higher than the maximum fine in the area of data protection, and up to 20 percent in the event of repeated infringements of the DMA.

Yet it is one thing to have more powers and another to use them responsibly. Under both the GDPR and the DSA, supervisors must carry out their tasks “with complete independence, […] and [without taking] instructions from any other public authority or any private party” (Article 50 DSA and Article 52 GDPR). But the Commission is also a political body, and so it has to prove that it can make difficult choices. Enforcing the DSA, for example, might require balancing objectives such as preventing online harm and protecting freedom of expression online. Thus, more public intervention powers might require more democratic scrutiny.


Sebastian Heidebrecht is a post-doctoral researcher (Universitätsassistent) at the University of Vienna’s Department of Political Science, Centre for European Integration Research (EIF). In his work, he examines how European politics shape digital policies, how recent crises (war, inflation, pandemic) affect the stability of the euro area, and how the EU responds to recent advances in digital finance.

Find Sebastian Heidebrecht’s academic profile here. The Centre for European Integration Research (EIF) is on Twitter here.