The European Union needs more than the digital omnibus to make digital services competitive

The European Commission’s plans to streamline digital rules introduce new inconsistencies and fall far short of transformative reform.

The European Commission’s digital and artificial intelligence omnibus proposals – two draft laws issued on 19 November that would amend several European Union laws – respond to exhortations to accelerate EU productivity growth, in particular in digital services and AI, by reducing regulatory compliance costs. The goal of the proposed omnibus laws is regulatory simplification and streamlining.

Regulatory consolidation around the Data Act

The digital omnibus (European Commission, 2025a) would prune the complex jungle of EU data regulations by repealing the Platform-to-Business and Free Flow of Data regulations (Regulations (EU) 2019/1150 and (EU) 2018/1807), the Open Data Directive (Directive (EU) 2019/1024) and the Data Governance Act (Regulation (EU) 2022/868). Parts of these regulations will be consolidated in another law, the 2023 Data Act (Regulation (EU) 2023/2854), which sets out rules on data access and use by companies, and which has applied since 12 September 2025.

These repealed regulations have a rather marginal impact on data markets and consolidation looks like a good idea to overcome incoherence in EU data regulations (Martens, 2023). However, the Data Act has many shortcomings not addressed by the digital omnibus. It applies to ‘product’ data generated by connected physical devices and was meant to facilitate access to and reuse of data and boost EU data markets. But the fuzzily defined product data category has fragmented data markets and created legal uncertainty. All data resides on hardware. How can data be distinguished according to the type of hardware on which it resides? Applying the Data Act to all non-personal data, not just product data, would have greatly simplified the regulatory landscape. 

Moreover, the Data Act gives industry data holders control over access to data, at the expense of users and other data service providers. Industry can determine the price and conditions for data sharing with third parties. Users who paid for a device, including the data that it generates, must pay again to the device manufacturer when they want to share this data with third parties (though smartphones are exempt from this provision). Other anti-competitive provisions in the Data Act, such as the prohibition on companies using data obtained from other companies to develop competing products or services, remain in place. These provisions defeat the purpose of the Act.

The Commission claims that the reluctance of firms to share their data for AI training constitutes a data market failure that hinders AI development in the EU. It wants to overcome that by setting up data labs and common European data spaces. But the market failure is unlikely to be overcome if participation in data labs and spaces remains voluntary, ie market-based, as in the omnibus proposals. The Commission argues that privacy-enhancing computing infrastructure in data labs will facilitate data pooling. That may be true. But there are already many private providers of this type of service. Why should taxpayer money be spent on this? Firms want to know who can access their data, for what purpose and how costs and benefits are shared. That requires a complex data-governance mechanism.  

The Commission wants to use the experience from data spaces initiatives in health, manufacturing and agriculture in other sectors. The European Health Data Space Regulation (EHDS, Regulation (EU) 2025/327) makes data sharing by medical service providers mandatory and free of charge for users and data recipients. There is no incentive problem because health data is a by-product of medical services. That maximises the social value of health data for society as a whole.

However, plans for a Common European Agricultural Data Space are taking a different track (AgriDataSpace, 2024) and would give farmers exclusive rights to decide on the price and conditions for sharing their data – contradicting the Data Act, which gives these rights to farm-machine producers, and failing to exploit the social value of farm data.

This reveals a fundamental problem in EU data regulation: lack of clarity around the allocation of data access rights in digital settings where multiple parties – service providers and users, all of whom want some sort of access rights – contribute to co-generating data on platforms.

Some regulations recognise this, such as the EU general data protection regulation (GDPR; Regulation (EU) 2016/679) and the EHDS. They allocate partial rights to all parties. But others, such as the Data Act and the Agricultural Data Space, ignore the issue and grant exclusive rights to industry data holders. This is bound to create obstacles in data markets. The EU must leverage data as an efficient production factor (Martens, 2025): more competitive data markets, wider (re)use and pooling of data. The digital omnibus has thus missed an opportunity to straighten out the EU’s zigzagging approach to this.

Data protection changes

The digital omnibus addresses a longstanding problem with the GDPR: consumer fatigue with GDPR privacy consent notices that pop up everywhere while surfing the web, and which are irritating rather than useful. Consumers click whatever option is available to get rid of pop-up banners as quickly as possible (Farronato et al, 2025). Data collectors exploit this to nudge consumers towards accepting data sharing as the easiest way out.

To its credit, the digital omnibus proposes browser extensions that can automate this task and reduce consumer costs, for example by making opting out of data sharing the default setting for all webpages. Moreover, the omnibus introduces the possibility of making such extensions mandatory. That would overcome consumers’ reluctance to spend time on managing their data rights. But the omnibus excludes media services from this provision, presumably to preserve their ability to collect consumer data and use this to generate more advertising revenues. The omnibus argues that media are important in democratic societies and that their financial sustainability should therefore be ensured. This will create a loophole: media websites will collect advertising data, thereby distorting the online advertising market.

Meanwhile, more fundamental issues around consent for data collection and processing remain unanswered. Under the GDPR, consumers can accept or reject use of their data for specific purposes but have no way to know if acceptance or rejection benefits them. Empirical evidence on the negative economic impact of the GDPR on the EU (Johnson, 2023) shows that it reduces digital services investment in the EU, has contributed to the concentration of user data in big-tech firms at the expense of smaller players, and has increased prices for consumers. Less sharing of user data makes markets less transparent and less competitive. Individual privacy protection seems to stand at odds with economic welfare for society at large (Chen et al, 2025). It is an urgent task for economists to understand this better.

Other digital omnibus proposed changes to the GDPR include the revision of the definition of ‘personal data’ to exclude information when the data processor is unlikely to have the means to identify the individual. Such proposals have stirred up considerable controversy.

Another controversial but also overdue proposal in the digital omnibus is to confirm that publicly available personal data on social media websites can be used to train AI models. This would be treated as ‘legitimate use’ under the GDPR, although subject to strict conditions including opt-out rights for data subjects.

On this, the digital omnibus draws a line under diverging interpretations among 27 national data protection authorities, which have delayed the deployment in the EU of several advanced AI models for nearly a year. This is not a case of regulatory compliance costs holding back productivity growth, but of regulatory uncertainty caused by fragmented authority in the EU. Access to publicly available social media data for AI model training would greatly expand the supply of training data, which is already falling short of the volumes required by the most advanced models (Sevilla et al, 2024). This is especially important for small language communities in the EU that do not have sufficiently large corpora of text data to train AI models.

The AI Act (Regulation (EU) 2024/1689)

Alongside the digital omnibus, the Commission issued an AI omnibus that would change the EU AI Act (European Commission, 2025c).

The main change is the postponement of AI Act obligations for high-risk AI systems in areas including critical infrastructure, education and law enforcement. This would buy time to design harmonised technical standards for the assessment of these obligations. Without these standards, AI developers face legal uncertainty about what to do about high-risk applications. This provision responds to growing pressures to ‘stop the clock’ on the implementation of the AI Act. It does not affect the deployment of general-purpose AI models, unless they are applied in high-risk areas.

However, rather than simplifying implementation and reducing compliance costs, the AI omnibus adds new sources of uncertainty for AI investors. It promises that the Commission will issue more guidance to facilitate the interpretation of, and compliance with, the AI Act – on top of the roughly two dozen implementing acts and guidelines already triggered by the original AI Act. There are several problems with such guidelines. First, their development and approval will take several years, resulting in protracted regulatory uncertainty for AI investors, similar to the GDPR-related uncertainties that held back the introduction of advanced AI models in the EU market. Second, they are developed by industry working parties that are not always representative. Third, they become de facto legal standards, beyond the control of EU lawmakers. They contribute to the picture of the AI Act as a rolling regulatory plan, piling up layers of additional regulation and widening, rather than reducing, the regulatory uncertainty that inhibits investment.

More efficient ways need to be found of dealing with these problems, which are inherent to a fast-evolving technology like AI. Precautionary measures for ChatGPT-like generative-AI models were added by the European Parliament to the AI Act. The Act is already running behind the AI technology curve, for example with respect to agentic AI systems, which are not covered. It makes little sense for the Commission and the European AI Office to develop guidelines that will be outdated by the time they are approved, for a technology that evolves so fast.

Excessive precaution should be abandoned and AI developers and deployers given more room to experiment with AI technologies and applications. Firms that bring lousy AI models to the market will be punished by the market, long before regulators can punish them – as has been seen regularly since the launch of the first ChatGPT model.

In this respect, the AI omnibus contains an intriguing provision for an “EU-level AI regulatory sandbox which the AI Office will set up”. This provision should be broadened to a general provision for sandboxing AI model development and deployment in the EU. This would enable model developers to experiment with many features of AI models without immediately hitting regulatory constraints. It would mark a major shift in EU AI regulation from a precautionary to a pro-innovation stance, and could become the kernel of a single EU AI regime. That would be a truly productivity-enhancing EU AI regime.

Despite the good intention to fast-track regulatory changes, the AI omnibus risks slowing down AI deployment in the EU by prolonging regulatory uncertainty. It will take at least a year before the omnibus progresses through the EU decision-making process, and another year or more for the additional guidelines to be completed. Nobody knows today where AI technology, models and services will stand several years down the road. But what is clear is that the situation will be very different from today, and today’s regulatory regimes might be irrelevant by then.

Source: Bruegel
