Economy

The case for a European Union digital enforcement authority

The European Union’s digital rulebook could be better enforced by delegating some of the European Commission’s powers to an independent agency.

The European Commission plays the primary role in enforcing the European Union’s digital rules, but is under pressure to relax enforcement to avoid retaliation from the United States. In this context, it risks succumbing and undermining its own authority, or resisting so strongly that it over-uses its enforcement powers to unfairly penalise foreign competitors, driven by the false belief that this would help the EU become less dependent on foreign technology. The risk of upward or downward enforcement bias is problematic because it weakens the effectiveness of the EU’s regulatory framework and distorts market competition dynamics.

A country’s ability to set, preserve and enforce its own rules is perhaps the clearest expression of its autonomy. This Policy Brief asks whether a structural change in the institutional setup would improve the enforcement of the EU’s digital rulebook. We ask whether the European Commission should delegate to an independent agency the digital-enforcement powers given to it by the Digital Markets Act (DMA), Digital Services Act (DSA) and Artificial Intelligence (AI) Act. Independent EU agencies with significant enforcement powers, including the power to impose financial sanctions on private entities, already exist. An EU digital authority could follow this template.

We ask whether the challenge in outsourcing digital enforcement is outweighed by the expected increase in enforcement accuracy. Furthermore, we assess whether the establishment of an EU digital authority is technically, politically and legally feasible. We find that the case for an independent authority is not equally compelling for each of the three regulations examined: a structural separation of enforcement may be currently unsuitable for the DMA and the AI Act, while it is advisable for the DSA. We thus recommend the establishment of an independent EU agency to enforce the DSA, and provide an outline of its possible structure.

This Policy Brief benefited from helpful discussions within Bruegel. Many thanks in particular to Griffith Couser, Stephen Gardner, Bertin Martens, Paul Richter and Nicolas Véron for their helpful comments.

1 Addressing institutional inertia

European Union regulation of digital markets is in its relative infancy. The Digital Markets Act (DMA, Regulation (EU) 2022/1925) and the Digital Services Act (DSA, Regulation (EU) 2022/2065) only came into force in 2022. Other laws, such as the Artificial Intelligence Act (AI Act, Regulation (EU) 2024/1689), will not be fully enforceable until August 2026 at the earliest. The EU’s digital rulebook, however, may already have been outpaced by markets and a political environment that have shifted in unanticipated ways since the legislation was passed, reducing its expected effectiveness.

This is not the first instance of a mismatch between evolving environments and public institutions failing to adapt. The 2008 global financial crisis was largely caused by the inability of the United States’ Securities and Exchange Commission and the US Federal Reserve to update their oversight powers to anticipate and mitigate risk in emerging derivatives markets (FCIC, 2011). Similarly, the slow public response to the COVID-19 pandemic in 2020 was rooted in the rigid protocols of public health institutions, which proved inadequate for addressing the global emergency (Schiff and Mallinson, 2023). 

In European digital markets, the risk of institutional inertia may be compounded by the European Commission’s role in enforcing the rules. The Commission is unlikely to be an effective market watchdog1. The EU treaties established it as a technocratic body focused primarily on economic integration within the single market. Insulation from political control was seen as a source of legitimacy (Moravcsik, 2002). But the Commission is no longer solely technocratic; it has developed into a political entity (Haroche, 2023) that pursues ambitious goals in areas including climate and technology, and deploys trade and defence policies to advance the EU’s geopolitical ambitions. 

In digital markets, the Commission is under pressure to relax enforcement against US companies because of fear of US retaliation2. The Commission has a history of yielding to trade threats. For example, in July 2021, the European Commission suspended work on a potential EU-wide digital levy following US pressure and US threats of retaliatory tariffs against EU countries that had implemented their own national digital taxes3. In 2024, the Commission removed sovereignty requirements from a proposed cloud service certification scheme after the US threatened to exclude European companies from US government procurement markets4. All these instances have undermined the Commission’s credibility as a digital-rules enforcer.

However, the Commission also risks over-enforcing EU digital regulations by yielding to domestic protectionist pressure instead. It could be tempted to use its enforcement powers to unfairly penalise foreign competitors, driven by the notion that this would help the EU become less dependent on foreign technology5. The DMA is intended to expand competition in digital markets and to directly ensure ‘fairness’ in market outcomes. Fairness, however, is a fuzzy concept, making it sometimes difficult to determine whether DMA measures are intended to protect businesses or consumers (Andriychuk, 2023). DMA enforcement may thus lend itself to being twisted for protectionist purposes. Meanwhile, the DSA and AI Act contain measures guarding against large platforms or AI providers potentially creating systemic risks. These risks, however, are not yet well-defined in practice; this vagueness could be used to pursue political objectives.

The risk of upward or downward enforcement bias could undermine the effectiveness of the EU regulatory framework, increase business uncertainty, reduce competitiveness and investment incentives and diminish strategic autonomy6. Based on this premise, this paper asks whether a structural change in the institutional setup could improve enforcement of the EU’s digital rules. Specifically, we ask whether the European Commission should delegate its enforcement powers under the DMA, DSA and AI Act to an independent EU agency. 

This paper focuses narrowly on specific, incremental improvements to the current regulatory framework. It does not address other specific but equally relevant questions, such as whether the current distribution of EU digital enforcement power between the supranational and national levels should be refined. Nor does it address the broader question regarding the adequacy of the current EU regulatory framework. 

Independent EU agencies with significant enforcement powers, including the power to impose pecuniary sanctions on private entities, already exist. The European Securities and Markets Authority (ESMA) has direct supervisory authority over critical market infrastructure and actors, such as credit rating agencies and trade repositories (Véron, 2025). Established in 2024, the Anti-Money Laundering Authority (AMLA) will directly supervise 40 complex, cross-border financial groups. Both ESMA and AMLA can bypass national regulators to oversee companies directly, conduct their own investigations and, most importantly, impose heavy financial penalties. The two agencies are legally accountable only to the Court of Justice of the EU (CJEU).

This paper evaluates whether the establishment of an independent digital agency modelled on ESMA or AMLA is warranted. That would be the case if two premises are demonstrated to be true: first, that such an institutional change increases enforcement effectiveness; second, that it is legally and technically feasible. In the paper, we show that these two conditions, for now, are unlikely to be met simultaneously under the DMA or the AI Act, whereas they are likely to be satisfied under the DSA. 

The paper is organised as follows. Section 2 explains the background of the Commission’s enforcement powers. Section 3 evaluates the first premise: the desirability and effectiveness of an independent digital agency. Section 4 evaluates the second premise: the legal and technical feasibility of such an agency. Section 5 compares the merits of an independent authority across the DMA, DSA, and AI Act. Finally, section 6 concludes by proposing a blueprint for establishing a separate agency to assume DSA enforcement duties currently sitting with the Commission.

2 The European Commission as enforcer of the EU’s digital rulebook

The EU’s digital rulebook comprises numerous EU regulations directly applicable to digital markets7. In this paper, we focus on the specific subset of regulations that include a significant enforcement role for the European Commission: the DMA, DSA and AI Act (Table 1). Other regulations that could be considered part of the EU’s digital rulebook do not contain a significant enforcement role for the Commission, often relying entirely on national authorities to carry out key enforcement duties, including conducting investigations and imposing sanctions for infringements8.

In addition to the DMA, DSA and the AI Act, the Commission exerts significant enforcement powers in the digital economy through competition policy: antitrust enforcement and merger control9. This paper, however, does not address the enforcement of EU competition law, as any such institutional change would entail amending the Treaty on the Functioning of the EU (TFEU), posing an overwhelming political challenge (amending the EU Treaty requires complex procedures and the unanimous agreement of all EU countries).

Table 1: Enforcement of the DMA, DSA and AI Act

| Instrument | What it regulates | Enforcement by the European Commission | Enforcement role of member states | Other enforcement bodies |
|---|---|---|---|---|
| Digital Markets Act | Core platform services by ‘gatekeepers’ (search, social networks, OS, app stores). | Commission is sole public enforcer; designates gatekeepers; investigates; fines up to 10-20% of global turnover. | National authorities assist investigations; national courts handle private enforcement. | High-Level Group for the DMA (coordination). |
| Digital Services Act | Intermediary services and online platforms; illegal content; transparency; systemic risks. | Commission supervises VLOPs/VLOSEs; audits risk mitigation, data access. Fines up to 6% of global turnover. | Member states designate Digital Services Coordinators (DSCs) as national enforcers. | European Board for Digital Services; national DSCs. |
| AI Act | All AI systems; bans; high-risk rules; transparency; general-purpose AI and systemic-risk models. | AI Office supervises general-purpose and systemic-risk models; fines up to 3% of global turnover. | National AI competent authorities and market surveillance authorities oversee high-risk systems. | European AI Board; Scientific Panel; Advisory Forum; notified bodies. |

Source: Bruegel.

The DMA aims to ensure fairness and contestability in digital markets. It establishes a set of obligations for ‘gatekeepers’: large companies enjoying substantial, entrenched power in EU markets and providing critical ‘core’ digital services such as online search, social networks, operating systems and app stores. The Commission is the sole DMA enforcer, investigating potential infringements, requesting information and conducting inspections. It can impose sanctions of up to 10 percent of a violator’s total worldwide annual turnover (up to 20 percent for repeated violations). National authorities may support the Commission in investigating potential DMA infringements. However, only the Commission can impose sanctions and remedies.

The DSA establishes an EU regulatory framework for online intermediaries to enhance transparency, accountability and user protection and to reduce the spread of illegal and harmful content online. Unlike the DMA, the DSA’s enforcement powers are shared between the Commission and member-state authorities. The Commission has exclusive competence over ‘very large online platforms’ (VLOPs) and ‘very large online search engines’ (VLOSEs) that have more than 45 million users in the EU.

Obligations for VLOPs and VLOSEs most notably include taking down illegal online content; identifying and analysing systemic risks arising from the spread of harmful content on their services and implementing adequate mitigation procedures; and transparency measures, such as maintaining a public database of the advertisements published on their services or enabling vetted researchers to access their internal data to monitor systemic risks. The Commission may impose penalties of up to six percent of a violator’s global annual turnover. National authorities, meanwhile, enforce obligations on smaller online intermediaries that do not meet the ‘very large’ threshold. The obligations enforced by national authorities primarily concern illegal online content.

Finally, the AI Act lays out a comprehensive framework governing the development, market placement, deployment and use of AI systems within the EU. The AI Act is modelled as a ‘product safety’ regulation. It primarily targets providers (ie developers), deployers (ie users) and distributors of AI systems with a tiered obligation structure linked to the risks these systems pose. An AI system is considered high risk if it is part of a product’s safety component or used in areas where the risk of harm is deemed high, such as employment, education or critical infrastructure10. Similarly to the DSA, enforcement is shared between member states and the Commission. The former generally supervise providers, deployers and distributors of high-risk AI systems; the latter enforces measures specifically targeting general-purpose AI (GPAI)11. Providers of GPAI must, for example, ensure that their services comply with EU copyright laws or put in place measures to mitigate systemic risk when present. If they fail to do so, the Commission can impose fines of up to three percent of their global annual turnover.

There is as yet no evidence that the Commission has incorrectly enforced the three regulations. Under the AI Act, GPAI obligations are not yet enforceable, though some are expected to become enforceable as early as August 2026. Regarding the DMA and the DSA, the rules enforced by the Commission became applicable only as of May 2023 (for the DMA) and August 2023 (for the DSA); given the time required to investigate, there have been only a few infringement decisions so far. DMA infringement decisions have been taken against Apple (DMA.100109) and against Meta (DMA.100055) in March 202512. A DSA infringement decision was taken against X in December 202513. Even if these companies appeal to the EU General Court, which could provide an opportunity to assess the accuracy of the Commission’s decisions, it will take time before a judicial review of the three decisions is conducted.

Even when a regulation has been in place for a long time, it is hard to find evidence of improper enforcement because enforcement can also be distorted by inaction, which is difficult to quantify. It is difficult to prove that the Commission should have opened an investigation when it did not, or that it failed to sanction a company that it should have, given that the relevant information is often available only to the parties involved14. Thus, there is little factual evidence so far that the available enforcement tools need to be improved.

A case can, however, be made that the risk of distortion has increased. External and internal political pressure on the Commission is high. The US has openly demanded concessions on digital regulation in return for EU steel tariff relief15. Commission Executive Vice-President Teresa Ribera openly accused the US of blackmailing the Commission through trade talks to prevent the enforcement of the EU’s digital rulebook16. French President Emmanuel Macron blamed the Commission for being reluctant to tackle large US tech companies17.

Increased political interference in enforcement is associated with a greater risk of distortion (Laffont and Tirole, 1991; Laffont and Tirole, 1993; Laffont, 1999). For instance, in the banking sector, Lambert (2019) found that regulators are 44.7 percent less likely to initiate enforcement actions against banks that lobby them. Similarly, concerning the enforcement of competition policy, Fidrmuc et al (2018) found that the probability of a merger being challenged or blocked is negatively correlated with the acquirers’ lobbying activity before the merger announcement. Mehta et al (2020) showed that mergers in the political districts of powerful US congressional members who serve on committees with antitrust oversight receive relatively favourable antitrust review outcomes.

The perception that there is a high risk of enforcement distortion in the current institutional setting is widespread among stakeholders. In response to a call for evidence during the first review of the DMA, the Commission reported that “several respondents expressed concerns about insufficiently robust enforcement” and that there were “calls for preserving the DMA’s political independence, […] by ensuring that its application remains insulated from broader political considerations”18. Reports have documented enforcement gaps in the DSA, with vetted researchers accusing platforms of implementing delaying tactics to prevent them from accessing their data (Scott et al, 2025). In November 2025, the Socialists & Democrats group in the European Parliament reportedly requested the establishment of a formal inquiry committee into a possible failure by the Commission to enforce the DSA19.

3 Independent regulatory agencies in practice

If the risk of digital regulatory distortion is high, the question is whether delegating enforcement to an independent agency would mitigate it. Insulation from political influence is a primary reason why governments establish independent regulatory agencies. Gilardi (2002) provided empirical evidence that, for western European governments, mandating independence is often a strategic decision to shield enforcement from the political cycle and to enhance the credibility of regulatory actions. Sadeh and Rubinson (2024) listed additional benefits of independence: increased expertise because regulatory agencies are more likely than governments to employ staff with the necessary technical skills (Koop and Hanretty, 2018); and protection against political uncertainty because their working plans are not supposed to change when the government does (Ruffing et al, 2024). 

Independent agencies can also serve as ‘lightning rods’, protecting elected officials from public blame when unpopular choices are made (Heinkelmann-Wild et al, 2023). An independent authority is not accountable to the electorate and is therefore more likely to make decisions that are positive for social welfare but may negatively affect specific constituencies (Maskin and Tirole, 2004). These advantages help explain why international bodies such as the EU, the Basel Committee, and the OECD have promoted independence from political interference as an institutional regulatory model (Koop and Jordana, 2022).

The theoretical connection between enforcement quality and independence has been thoroughly examined in the literature, but empirical evidence remains limited. 

Empirical analyses struggle with imperfect cross-country comparisons due to differences between institutional solutions and regulatory contexts. Koop and Jordana (2022) surveyed the existing literature, finding that studies focusing on utility regulations (such as electricity or telecoms) or financial services tended to identify either a significant positive effect on regulatory outcome quality or no significant effect at all. Koop and Hanretty (2018) analysed competition authorities in 30 OECD countries, finding that formal independence has a positive and highly significant effect on agencies’ ranking in terms of the overall perceived quality of their enforcement and regulatory activity. They also found a positive and significant relationship between an agency’s size and the quality of its work, pointing to scale efficiency effects. Ma (2010) found that substantive independence has a positive effect on antitrust and merger enforcement. However, when independence is granted only on paper, and governments retain some political control over the agency, that effect vanishes – a finding confirmed by Guidi (2015).

Independence would be particularly welcome in the face of external pressure on trade policy, as in the enforcement of the EU’s digital rulebook (see Box 1 for a game-theory analysis). This is especially relevant when the preferences of the principal (in this case, the European Commission, potentially appeasing the US) cannot be aligned with those of the agent (the newly established regulatory authority, which is concerned only with proper enforcement)20.

In conclusion, gaining substantial independence from political control reduces the risk of over- or under-enforcement; if the Commission were to delegate its enforcement powers to a digital authority, this would greatly improve the quality of enforcement of EU digital rules.

Box 1: A game-theory analysis of how regulatory independence can help with foreign interference

To help visualise how an independent EU digital authority would respond to US pressure, consider the following game in reduced form. There are two players: the EU and the US. The EU can either enforce the rules or refrain from doing so. The US can either impose a trade restriction, such as applying an import tariff, or refrain from restricting trade when the EU’s digital rulebook is applied to a US company. Formally, the action space for the EU is S_EU = {E, NE} and for the US is S_US = {T, NT}, where E stands for ‘enforcement’, NE for ‘non-enforcement’, T for ‘trade restriction’ and NT for ‘no trade restriction’. The EU benefits from enforcing its digital rules (it gains B), the US loses L when rules are enforced against US companies, and B > L > 0 (this assumption implies that regulatory action generates value for the economy, for example by enhancing competition, and that this value is higher than the loss experienced by US companies hit by EU enforcement). If the US retaliates, it triggers a trade war, leaving both players worse off than under the status quo: the EU and the US lose D_EU and D_US, respectively. D_EU > B ensures that, for the EU, the loss from a trade war is not offset by the benefit of enforcement. Table 2 summarises the payoffs across all four possible scenarios.

Table 2: Reduced-form enforcement game payoff matrix

|                     | US tariff (T)         | US no tariff (NT) |
|---------------------|-----------------------|-------------------|
| EU enforce (E)      | (B − D_EU, −L − D_US) | (B, −L)           |
| EU not enforce (NE) | (−D_EU, −D_US)        | (0, 0)            |

Source: Bruegel.

Under normal circumstances, with both players behaving rationally, the (E, NT) equilibrium emerges, as enforcement and no trade restriction are the dominant strategies of the EU and the US, respectively. No matter what the other player does, it is always better for each to play those strategies; (E, NT) is, therefore, a Nash equilibrium: each player’s action is a best response to the action chosen by the other player, and neither has an incentive to deviate from it.

Let us then assume that the US credibly commits to retaliating whenever EU enforcement occurs, regardless of whether the short-term consequences are self-defeating21. If that were the case, only two equilibria would be possible, (E, T) and (NE, NT); the two players would then converge to the latter, where neither experiences a loss. Conversely, let us consider a third scenario in which the EU is committed to enforcing its digital regulations because it would no longer be within the Commission’s power to yield to US pressure (in Table 2, this effectively eliminates the bottom row). The US may still threaten to impose a tariff; however, since the EU is no longer able to refrain from enforcement, the US threat would be irrelevant. It would become increasingly untenable for the US to maintain retaliation, as the game would stabilise at the equilibrium where the US receives its worst payoff: −L − D_US.

This does not mean that the US would refrain from making threats. In January 2026, the US State Department threatened the United Kingdom with retaliation in response to the UK’s independent telecom and digital regulator (OFCOM) launching an investigation into X’s alleged lack of compliance with the UK Online Safety Act22. However, the UK government has no control over OFCOM’s investigative and fining powers. Therefore, the US threats are empty: enacting them in the long term would be counterproductive and dynamically unsustainable. Hence, the original Nash equilibrium (E, NT) would likely emerge.
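The equilibrium logic of Box 1 can be checked mechanically. The sketch below enumerates the pure-strategy Nash equilibria of the reduced-form game; the numerical parameter values are assumptions chosen only to satisfy the orderings B > L > 0 and D_EU > B, and the function names are ours:

```python
# Reduced-form EU-US enforcement game from Box 1.
# Parameter values are illustrative assumptions; only the orderings
# B > L > 0 and D_EU > B from the text matter for the result.
from itertools import product

B, L, D_EU, D_US = 3.0, 1.0, 4.0, 2.0

EU_ACTIONS = ("E", "NE")   # enforce / not enforce
US_ACTIONS = ("T", "NT")   # tariff / no tariff

def payoffs(eu, us):
    """Return (EU payoff, US payoff) for an action profile."""
    eu_pay = (B if eu == "E" else 0.0) - (D_EU if us == "T" else 0.0)
    us_pay = (-L if eu == "E" else 0.0) - (D_US if us == "T" else 0.0)
    return eu_pay, us_pay

def nash_equilibria():
    """Pure-strategy Nash equilibria: no player gains from a unilateral deviation."""
    eqs = []
    for eu, us in product(EU_ACTIONS, US_ACTIONS):
        eu_pay, us_pay = payoffs(eu, us)
        eu_ok = all(payoffs(d, us)[0] <= eu_pay for d in EU_ACTIONS)
        us_ok = all(payoffs(eu, d)[1] <= us_pay for d in US_ACTIONS)
        if eu_ok and us_ok:
            eqs.append((eu, us))
    return eqs

print(nash_equilibria())  # [('E', 'NT')]: enforcement with no trade restriction
```

For any parameters satisfying the stated orderings, the enumeration confirms the unique pure-strategy equilibrium (E, NT) described in the text; removing the EU’s non-enforcement option, as an independent agency would, leaves the US with no profitable retaliation.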

The outsourcing of enforcement power by governments involves trade-offs. Ideally, the designated agency should be simultaneously free from political pressure when exercising its authority to investigate infringements and prosecute violations, and accountable for the exercise of its powers and expenditure of public resources. Institutional design choices influence the extent to which these two often-conflicting goals are met (Kovacic and Mariniello, 2016).

Establishing an independent EU digital agency would require defining its relationship with the European Parliament, the European Council and the European Commission. Various measures can strengthen the agency’s ability to resist pressure from political decisionmakers on which cases to pursue and how they are resolved. Granting agency leaders fixed-term appointments and prohibiting their removal from office – except for good cause – is a common approach. For example, ESMA’s chairperson and executive directors are appointed by the Board of Supervisors for five-year terms (renewable once), subject to European Parliament approval23. They may be removed only for serious misconduct or if they no longer meet the conditions necessary for the performance of their duties (eg becoming permanently incapacitated).

In AMLA, the primary decisionmaker sanctioning regulatory breaches and setting pecuniary sanctions is the Executive Board, which includes AMLA’s chairperson and five independent, full-time members. The chairperson is appointed through a procedure involving the Commission, European Parliament and the Council of the EU24; she can only be removed by the Council under the same conditions as those for removing ESMA’s leadership. 

Funding can also provide autonomy if the agency can secure resources without relying on institutional approval. Allowing the agency to collect and retain user fees protects the agency from political interference in the form of budgetary pressure. Both ESMA and AMLA, for example, collect fees from stakeholders that are proportional to the expected cost of regulatory supervision; AMLA’s fees are based on banks’ size and risk profile. 

Enforcement of the EU’s digital rulebook, however, is not uniformly funded. The DMA provides for no fees from gatekeepers. Similarly, the AI Act does not require GPAI providers to pay fees; it leaves to member states the decision on whether fees can be imposed on larger AI providers, for example for access to regulatory sandboxes. Under the DSA, the Commission levies an annual supervisory fee on VLOPs and VLOSEs, which is proportional to the number of their users in the EU and capped at 0.05 percent of their global annual net income.
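To illustrate the DSA supervisory-fee mechanics described above, a minimal sketch: a total supervisory budget is split across designated providers in proportion to their EU user numbers, with each provider’s fee capped at 0.05 percent of its global annual net income. The provider names and all figures are hypothetical, and the pro-rata allocation is a simplifying assumption about the Commission’s methodology:

```python
# Illustrative sketch of the DSA supervisory-fee logic: fees proportional
# to EU user numbers, each capped at 0.05% of global annual net income.
# All providers and figures below are hypothetical.

CAP_RATE = 0.0005  # 0.05 percent of global annual net income

def allocate_fees(total_budget, providers):
    """Split total_budget proportionally to EU users, applying the cap.
    `providers` maps name -> (eu_users, global_annual_net_income)."""
    total_users = sum(users for users, _ in providers.values())
    fees = {}
    for name, (users, income) in providers.items():
        pro_rata = total_budget * users / total_users
        fees[name] = min(pro_rata, CAP_RATE * income)
    return fees

# Hypothetical VLOPs: (EU users, global annual net income in EUR)
providers = {
    "PlatformA": (100e6, 20_000e6),  # cap binds: 0.05% of 20bn = 10m
    "PlatformB": (50e6, 1_000e6),    # cap binds harder: 0.05% of 1bn = 0.5m
}
print(allocate_fees(30e6, providers))
```

The example shows why the cap matters: a user-proportional levy alone would charge PlatformB EUR 10 million, but the income-based ceiling reduces its fee to EUR 0.5 million.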

However, an agency that is entirely autonomous can become detached from the policy decisions that influence the regulatory process. For instance, relevant enforcement units within the Commission are usually involved in the drafting of regulatory proposals. Bringing their expertise from concrete enforcement cases, they can contribute significantly to shaping the legislative text. A separate autonomous agency may, instead, be sidelined and lose influence over legislative proposals that affect markets more significantly than any enforcement actions the agency might pursue (Kovacic and Mariniello, 2016).

Accountability can be maintained by subjecting agency decisions to judicial review, ensuring that the agency operates within the bounds of its authority. ESMA and AMLA, for example, are both subject to the CJEU’s ultimate review. The CJEU can annul decisions or adjust imposed fines upward or downward. Companies can likewise ultimately appeal to the CJEU against Commission decisions based on the DMA, DSA or AI Act.

4 Outsourcing digital enforcement as a feasible option

Is the establishment of a separate digital enforcement agency a realistic option? A new agency can be established through a ‘staff spinoff’ from the European Commission. For example, the European Chemicals Agency was set up this way, moving experts from the Commission’s Joint Research Centre to Finland, where the agency is based (JRC, 2007). AMLA is being set up by the AMLA Task Force, which is based within the Commission’s Directorate General for Financial Stability, Financial Services, and Capital Markets Union (DG FISMA). The Task Force is taking all the preparatory steps for setting up the agency, including selecting facilities and recruitment. AMLA is expected to be fully operational approximately four years from when it was established on 26 June 202425.

The transfer of enforcement powers from the Commission to a new EU digital agency would require legislative amendments. During the legislative process leading to the adoption of the DSA, the European Parliament adopted a resolution explicitly requesting that the Commission assess the feasibility of appointing an existing authority or establishing a new one to carry out enforcement tasks26. This did not happen for the DSA, but such a process was enshrined in the text of the AI Act, which mandates an assessment in 2029 “with regard to the structure of enforcement and the possible need for a Union agency to resolve any identified shortcomings” (AI Act, Article 112). Compared to the European Parliament, EU governments have been more reluctant to establish independent EU enforcement agencies. Yet member states have also been increasingly vocal about the need for stronger enforcement of the EU’s digital rulebook. For example, France has criticised the Commission for its alleged inaction27, and ministers from 13 EU countries, at a meeting in October 2025, discussed creating a single EU digital regulator28.

If sufficient political will were available for institutional restructuring, the final obstacle would be the EU Treaty. In 1958, the Court of Justice of the European Communities (the precursor to today’s CJEU) interpreted the Treaty establishing the European Coal and Steel Community as implying strict limits on the Commission’s ability to delegate regulatory enforcement powers to external agencies. This interpretation is referred to as the ‘Meroni doctrine’, from the case that originated it29. Accordingly, the Commission cannot delegate discretionary powers with a wide breadth of policy implications to external bodies unless the scope is strictly defined and closely supervised by the delegating authority. An independent agency must be a highly technical body that executes clearly prescribed tasks without independent political responsibility.

In principle, the Meroni doctrine seems to significantly undermine the prospects of establishing an independent EU digital agency. However, over the years, Meroni’s constraints have been progressively relaxed. De facto, agencies such as the European Supervisory Authorities have exercised substantial powers in setting technical standards. De jure, in the 2014 ESMA short-selling case30, the CJEU confirmed Meroni but significantly relaxed its restrictions (Simoncini, 2025). The Court stated that delegated enforcement powers may include some discretion, provided there are objective criteria and specific conditions subject to judicial review. This is especially true when specific technical expertise is required to achieve a regulatory goal. The establishment of AMLA in 2024 confirmed that significant delegation of powers is considered compatible with EU law: AMLA selects high-risk institutions that require monitoring, supervises them and, when necessary, issues binding decisions and pecuniary fines.

Regarding the EU’s digital rulebook, the Commission exercises its discretionary powers most prominently by defining who is subject to its enforcement actions: it designates DMA gatekeepers, the DSA’s very large online platforms (VLOPs) and very large online search engines (VLOSEs), and the AI Act’s GPAI models31. Moreover, the Commission must assess whether companies are effectively implementing adequate measures to mitigate systemic risks, such as the spread of harmful online content or large-scale incidents arising from the use of large language models. Such an assessment is complex and may involve some subjectivity. By contrast, powers that are considered more technical include running investigations, requesting information, auditing, testing and inspecting, identifying non-compliance or infringements and monitoring ex-post compliance. The imposition of fines and remedies could also be considered technical, if discretion in sanctioning power is limited by well-defined criteria to ensure objectivity when an infringement is proved.

Ultimately, the extent to which the enforcement of the EU digital rulebook can be transferred from the Commission to an external agency will depend on whether the new agency’s decisions can be taken within a framework of bounded discretion (permissible under Meroni), or could become precedent-setting to the extent that they resemble rulemaking (in breach of Meroni).

5 Comparing the DMA, DSA and AI Act

The case for outsourcing enforcement powers from the Commission is not equally compelling across the three regulations examined: currently, there is no strong case for an independent digital agency enforcing the DMA or the AI Act. Conversely, we argue that the enforcement of the DSA would improve significantly if delegated to such an agency.

For the DMA, establishing an independent agency is likely to lower the risk of over- or under-enforcement. However, this would come at a high price. The DMA overlaps strongly with the EU’s competition policy framework: the regulation itself was shaped based on the Commission’s past antitrust experience (Caffarra and Scott Morton, 2021). Some obligations for gatekeepers, such as informing the Commission about the intention to acquire another company, are intended to leverage complementarities with the competition framework – the EU merger regulation, for example (Mariniello, 2025).

Thus, DMA enforcement benefits greatly from synergies between Commission directorates-general (mainly DG-CONNECT and DG-COMP, in this case). Those synergies would be lost if the Commission were to stop enforcing the DMA. A structural separation between competition enforcement and DMA enforcement could, moreover, lessen the ability to coordinate respective investigations under different laws and increase the likelihood that the same company would be charged twice for the same infringement32. Finally, a DMA-enforcing agency would be most exposed to legal challenges under the Meroni doctrine, precisely because the DMA enforcement powers are reminiscent of those conferred on the Commission by Article 105 of the TFEU.

For the AI Act, the risk of running afoul of the Meroni doctrine is lower, because the AI Act is primarily conceived as a product regulation and the enforcer’s role is largely to ensure that AI systems are properly developed and comply with regulatory standards (a function that involves little discretion). This also applies to GPAIs supervised by the Commission, with providers of GPAIs having to keep information up to date on how their models are trained33. However, the regulatory framework for AI is not yet sufficiently stable: AI is evolving rapidly in ways that legislators struggle to anticipate. For example, the Commission’s AI Act proposal, released in April 2021, did not include the provisions on GPAIs; those were rushed into the text by the Council and Parliament after the release of ChatGPT in November 202234. The legislators had not foreseen the need for specific provisions targeting GPAIs.

It is equally telling that the Commission proposed amending the AI Act with its Digital Omnibus proposal in November 2025, before the AI Act had even entered into full force (European Commission, 2025). In such a dynamic context, in which the legislative process is still in flux, the benefit of outsourcing enforcement to an independent agency seems outweighed by the drawback of removing critical expertise from within the Commission35. This, of course, does not preclude the possibility that delegating enforcement powers under the AI Act may become desirable in the future, as the regulatory framework is expected to become more stable. 

For the DSA, the case for an independent EU authority is strongest. Among the DMA, AI Act and DSA, the DSA is subject to the greatest internal and external political pressure, particularly because of its links to freedom of speech. In July 2025, for example, the US House Judiciary Committee released a staff report focused on the DSA, titled ‘The Foreign Censorship Threat: How the European Union’s Digital Services Act Compels Global Censorship and Infringes on American Free Speech’36. Opposition to the DSA’s alleged potential to censor free speech also comes from within Europe. In January 2026, Polish President Karol Nawrocki vetoed the national law implementing the DSA, warning that its enforcement framework risked enabling “administrative censorship”37. This suggests that the risk of enforcement distortion is highest with the DSA and, therefore, that creating an independent enforcement authority would yield significant benefits by minimising that risk.

The drawbacks of outsourcing enforcement, such as the risk of jeopardising the expertise needed to draft legislation, are limited: DSA obligations are not expected to change anytime soon. The DSA’s text is sufficiently broad to provide the necessary legal basis to tackle harmful online content, and significant discretion is given when assessing obligations, such as the measures VLOPs and VLOSEs should put in place to mitigate systemic risks. These obligations, however, are progressively specified through Commission guidance. For example, the Commission releases specific guidelines, such as those on election integrity38 or on protecting minors from addictive design39, that help qualify the risks that platforms must mitigate. It promotes non-binding codes of conduct, such as the one on disinformation40, that show how the risk of spreading disinformation can be alleviated (for example, by avoiding advertising next to disinformation). And it uses delegated acts, such as the Delegated Regulation on Auditing (C(2023) 6807), which directs auditors on how to test claims by VLOPs and VLOSEs. In other words, the Commission is circumscribing the framework within which enforcement operates, limiting its discretion.

6 Policy recommendations

Based on the above discussion, we conclude that the establishment of an independent EU agency to enforce the DSA should be prioritised. The DSA agency should be established in line with the AMLA model. In practice, this means:

Similarly to AMLA’s direct supervision of selected high-risk financial institutions, the DSA agency should directly supervise DSA-designated VLOPs and VLOSEs. The Commission could retain the power of designation. However, the DSA agency would assume day-to-day supervision and inspection powers over these entities, ensuring a neutral, technical application of the law, free from political interference.

The Commission and the DSA agency would share the authority to define the criteria for assessing whether risk-mitigation measures have been implemented. The Commission could have the final word on broader principles, while the DSA agency would define the specific technical metrics and audit standards required of industry to show compliance. However, the DSA agency would have full, autonomous power to request information, inspect and audit platforms. It should be empowered to conduct investigations independently and to ultimately make decisions on DSA infringements. The DSA agency should be endowed with the authority to impose administrative fines, establish remedies and monitor ex-post compliance.

Mimicking AMLA, the DSA agency’s governance structure could comprise a General Board, an Executive Board, a chair, an executive director and an Administrative Board of Review with functions, composition and dismissal protections as described in section 3. The General Board would include the heads of national authorities responsible for enforcing the DSA at national level. The DSA agency’s decisions could ultimately be subject to appeal to the CJEU.

To supervise 40 European financial groups, AMLA is expected to employ slightly more than 400 staff members, half of whom will perform direct supervision tasks, organised into joint supervisory teams that include staff from national authorities. Taking that as a rough reference, a back-of-the-envelope calculation suggests that the DSA agency could need 200-250 staff members to supervise the currently designated 21 VLOPs and VLOSEs41. The DSA agency could initially be established through a spin-off from the Commission’s dedicated DSA unit within DG-CONNECT and the European Centre for Algorithmic Transparency (ECAT, currently within the Commission’s Joint Research Centre)42. Similarly to AMLA, the DSA agency’s staff could be supported by seconded national staff joining joint supervisory teams.
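The back-of-the-envelope calculation above can be made explicit with a simple linear scaling, assuming (as the AMLA figures in the text suggest) a roughly constant staff-to-supervised-entity ratio; the rounded numbers here are illustrative, not an official headcount:

```python
# Staffing estimate for a DSA agency, scaled linearly from AMLA's plans.
# Figures from the text: AMLA expects ~400 staff to supervise 40 groups.

AMLA_STAFF = 400        # approximate total planned AMLA headcount
AMLA_SUPERVISED = 40    # European financial groups under direct supervision
DSA_DESIGNATED = 21     # currently designated VLOPs and VLOSEs

staff_per_entity = AMLA_STAFF / AMLA_SUPERVISED   # ~10 staff per supervised entity
dsa_estimate = DSA_DESIGNATED * staff_per_entity  # ~210 staff

print(round(dsa_estimate))  # → 210, within the 200-250 range cited above
```

The midpoint of about 210 staff is consistent with the 200-250 range in the text; the true need would depend on how supervision-intensive platform oversight proves relative to financial supervision.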

The need to shore up DSA enforcement against political interference is compelling and urgent. Transferring enforcement powers away from the Commission would require the DSA regulation to be amended. The agency’s practical setup would then require a multi-year transition. Using AMLA as a realistic benchmark, establishing the DSA agency would likely follow a three- to four-year trajectory. A quicker approach could also be considered: the DSA envisages an advisory body, the European Board for Digital Services (EBDS), composed of national DSA enforcers and chaired by the Commission. The new agency could evolve from the EBDS, provided that its institutional structure is transformed radically along the lines of what we propose for the DSA agency, to ensure its full independence from the Commission.

Source: Bruegel
