UK – The DUA Act: Highlights of a modest reform to the UK’s data protection laws

The Data (Use and Access) Act 2025 (DUA Act) marks the end of a long haul reforming the UK’s data protection framework. The previous Government’s 2021 consultation, Data: A new direction, proposed ambitious changes. However, the more radical amendments have been dropped, leaving a set of modest and largely technical reforms.

The retreat is the result of three key factors. First, the Government’s aims have shifted over time. These reforms were originally intended to deliver the Brexit dividend and usher in “a golden age of growth and innovation” but appeared to include many changes for change’s sake. These presentational changes were dropped as Brexit receded into the distance. Second, there was significant pressure to avoid unnecessary divergence from the EU because of the risk to EU-UK dataflows.

Third, and most importantly, there was little appetite for broader reform. The more radical ideas in the initial consultation received a lukewarm response from business. Most businesses wanted to avoid unnecessary change and to maintain a compliance approach consistent with that in the EU.

Implications for UK businesses

While the overall effect is modest, the 265-page DUA Act still makes a large number of detailed changes to three key data protection instruments: the UK GDPR, the Data Protection Act 2018 and the Privacy and Electronic Communications Regulations 2003 (PECR). Most changes involve minor technical recalibrations to the legislative framework, and there is not space to list them all. However, the following should have the most impact for UK businesses.

Complaints mechanism: There will be a new duty on controllers to implement a complaints mechanism for data protection issues. The intention is to reduce the workload on the Information Commissioner by making businesses deal with problems at source.

Businesses will need to offer a complaint form to data subjects. They must acknowledge receipt of complaints, make suitable enquiries and then inform the data subject of the outcome. In other words, this is a process-driven obligation. There is little to prevent a business from simply rejecting the complaint once that process is complete.
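Purely by way of illustration, that process can be captured in a short sketch. Everything below – the class, field names and outcomes – is an assumption made for the example, not something prescribed by the Act:

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Outcome(Enum):
    # Illustrative outcomes only: the Act prescribes a process, not results.
    UPHELD = "upheld"
    REJECTED = "rejected"


@dataclass
class Complaint:
    data_subject: str
    details: str
    received: date
    acknowledged: bool = False
    enquiries: list[str] = field(default_factory=list)
    outcome: Outcome | None = None

    def acknowledge(self) -> None:
        # Step 1: acknowledge receipt of the complaint.
        self.acknowledged = True

    def record_enquiry(self, note: str) -> None:
        # Step 2: make suitable enquiries, keeping a record of them.
        self.enquiries.append(note)

    def close(self, outcome: Outcome) -> None:
        # Step 3: inform the data subject of the outcome. Nothing in the
        # process prevents that outcome being a rejection.
        if not (self.acknowledged and self.enquiries):
            raise ValueError("acknowledge and investigate before closing")
        self.outcome = outcome
```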

Subject access requests: There are minor changes to the rules governing an individual’s right to access their personal data via a data subject access request (DSAR). These requests are a fundamental feature of the UK’s data protection framework, but can be extremely burdensome. Providing data subjects with access to personal data contained in emails (and other unstructured electronic data) inevitably requires a manual review that will often be expensive and time-consuming.

The Act makes it clear that controllers need only conduct a “reasonable and proportionate” search in response to a DSAR. Controllers can also ask the data subject to identify exactly what data they want to access, and stop the clock while they wait for a response. These changes codify existing case law and guidance from the Information Commissioner. However, including them on the face of the Act will encourage controllers to be more robust when dealing with excessive or unreasonable DSARs.
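The stop-the-clock mechanic can be illustrated with a minimal sketch, assuming the standard one-month UK GDPR response period (approximated here as 30 days); the dates and function name are invented for the example:

```python
from datetime import date, timedelta

# The UK GDPR requires a response within one month of receipt,
# approximated here as 30 days for simplicity. The DUA Act lets the
# controller stop this clock while waiting for the data subject to
# clarify what data they actually want.
RESPONSE_PERIOD = timedelta(days=30)


def dsar_deadline(received: date, clock_stopped: timedelta) -> date:
    """Return the response deadline, extended by any stopped-clock time."""
    return received + RESPONSE_PERIOD + clock_stopped


# Example: a DSAR received on 1 July; the controller asks the data
# subject to identify the data sought and waits 10 days for a reply.
print(dsar_deadline(date(2025, 7, 1), timedelta(days=10)))  # 2025-08-10
```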

Finally, while most of the changes in the DUA Act will be brought into force over the next few months, the “reasonable and proportionate” limitation is unusual. It has retrospective effect and will be treated as having come into force on 1 January 2024.

New cookie rules: There are three key changes to the “cookie” rules in PECR. First, the scope of these rules will be extended to any form of online tracking (such as device fingerprinting and email tracking pixels). Second, while these techniques can generally only be used if the data subject consents, the DUA Act expands the situations in which consent is not needed to include website analytics, website appearance and emergency assistance. Finally, there is a significant increase in sanctions, with breaches attracting fines of up to £17.5m or 4% of annual worldwide turnover (whichever is higher), bringing PECR into line with the UK GDPR.
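As a rough illustration of how that “higher of” cap operates, assuming the UK GDPR formula applies (the function name and figures are invented for the example):

```python
def max_pecr_fine(annual_turnover_gbp: float) -> float:
    # Higher of £17.5m or 4% of annual worldwide turnover.
    return max(17_500_000, 0.04 * annual_turnover_gbp)


# A business with £1bn of turnover would face a cap of £40m,
# since 4% of turnover exceeds the £17.5m floor.
print(max_pecr_fine(1_000_000_000))  # 40000000.0
```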

These changes are unlikely to eliminate one of the worst manifestations of UK data protection laws – endless and annoying cookie pop-ups. However, the Information Commissioner has now issued a consultation on whether it should take a “risk-based” approach to some types of Adtech that might permit some advertising models without the need for consent or pop-ups.

Scientific research: The DUA Act will encourage scientific research. It includes a more permissive definition of what constitutes “scientific research”, encompassing anything that can “reasonably be described as scientific whether publicly or privately funded”. It will also be much easier to repurpose personal data for research purposes. This includes allowing data subjects to give a relatively open-ended consent to the use of their personal data in both current and unspecified future scientific research.

These technical changes have been broadly welcomed as preventing inappropriate encroachment by the UK GDPR on legitimate research activities.

Automated decisions – “Computer says no”: There is also more flexibility for automated decision-making. Currently there is a general restriction on significant decisions being made about individuals using automated means unless narrow exceptions apply – i.e. where the individual gives explicit consent, or the decision is necessary for a contract with the individual or is authorised by law.

Despite the fact that the GDPR is now approaching its 10th anniversary, this is still a developing area of law. On the one hand, the use of automated decision-making is increasing, particularly in recruitment, where robo-CV sifting and automated interviews are now the norm. On the other, these provisions are being weaponised by privacy activists in the EU to try to outlaw a variety of business models, including credit reference agencies.

The changes in the DUA Act are an attempt to allow greater space for innovation, whilst still protecting data subjects. The DUA Act narrows the general restriction on the use of automated decision-making so that it will only apply if the processing is based entirely or partly on “special category data” (information on health, political opinions, racial or ethnic origin, etc.). However, safeguards must be applied in all cases (regardless of whether the decision involves special category data). The data subject must be informed of the decision and be able to contest it, including by obtaining human intervention.
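The narrowed restriction and the universal safeguards can be summarised in a short sketch; the function names and wording are illustrative assumptions, not drawn from the Act:

```python
def adm_restriction_applies(uses_special_category_data: bool) -> bool:
    # Post-DUA Act, the general restriction bites only where the
    # processing is based entirely or partly on special category data.
    return uses_special_category_data


def required_safeguards() -> list[str]:
    # These safeguards apply to every significant automated decision,
    # whether or not special category data is involved.
    return [
        "inform the data subject of the decision",
        "allow the data subject to contest the decision",
        "provide human intervention on request",
    ]
```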

Permitted processing: The DUA Act provides that certain limited processing activities are automatically recognised as having a lawful basis – namely processing for national security, public security and defence, to respond to emergencies, for criminal investigations, to safeguard vulnerable individuals and to disclose personal data for public administration purposes.

The Act also recognises that controllers have a legitimate interest in three other activities, namely processing for direct marketing, intra-group transfers and network security. However, this simply repeats the existing position set out in the recitals to the UK GDPR. Moreover, these activities are not automatically permitted. It is still necessary to consider whether the processing is proportionate and whether the legitimate interest is outweighed by the impact on data subjects. In practice, this makes little difference to the law.

Risks to EU-UK dataflows

The DUA Act is notable for what it doesn’t include, just as much as for what it does. The Data: A new direction consultation suggested more radical reforms, including scrapping data protection officers and data protection impact assessments, giving the Government greater influence over the Information Commissioner and narrowing the definition of personal data. One of the reasons these reforms were dropped is the potential impact on EU-UK data transfers.

The EU GDPR contains strict restrictions on the transfer of personal data to third countries that do not have adequate data protection laws. Those restrictions have become more problematic over time.

For example, most transfers can only take place if the data exporter and data importer enter into a model contract known as the Standard Contractual Clauses. However, following the CJEU’s seminal decision in Schrems II (C-311/18), those clauses must be supplemented by a review of the laws and practices in the destination country – known as a “transfer impact assessment”. This is an expensive and uncertain exercise. It is also subject to considerable risks, as evidenced by the Irish regulator’s decision to fine Meta €1.2 billion, and the Dutch regulator’s decision to fine Uber €290 million, for transferring personal data to the US.

None of this currently matters for transfers to the UK as the EU Commission has decided the UK has adequate data protection laws. However, if the adequacy finding were removed, many EU businesses would likely pause or limit data transfers to the UK, or simply stop them altogether. Given the extensive trade links between the UK and the EU, this could create a significant economic dislocation.

The UK adequacy finding is due to be reviewed at the end of 2025. The UK Government has been liaising closely with the EU Commission and most commentators believe the steps taken to tone down the UK reforms mean EU-UK transfers are safe for the time being.

Enabling artificial intelligence?

The rise of artificial intelligence became a major issue during the passage of the bill, but the final text in the DUA Act says relatively little about it and leaves a number of questions unanswered.

During the final stages, the House of Lords proposed a series of amendments requiring AI providers to disclose information regarding the materials used to train their AI models, amid concerns about the lack of transparency and an imbalance between the creative and tech industries.

After multiple rounds of ping-pong between the Lords and Commons, a much more limited set of proposals was agreed. These oblige the Government to prepare an economic impact assessment for the proposals set out in its Copyright and Artificial Intelligence Consultation and to issue a report on the use of copyright works in AI systems. The lack of more specific provisions (or indeed any copyright provisions binding AI providers) is hardly surprising – it would have been unwise for the DUA Act to pre-empt the outcome of that consultation.

Beyond the copyright issues, there are a number of difficult and unresolved questions about the interaction between data protection laws and AI technology. These include whether it is lawful to train AI models on vast amounts of data scraped from the internet, how to comply with requests from individuals to erase their data from AI models and whether the probabilistic and creative output from these systems complies with the accuracy principle. The DUA Act is silent on these points.

The Information Commission and the “law of everything”

The Information Commissioner will be replaced by the Information Commission. While there is little difference in the name, this major organisational change could significantly alter the way the framework is applied in practice.

In more detail, the broad scope of the concepts of “processing” and “personal data” means almost everything any business does is subject to the UK GDPR. Helen Dixon, the former Irish Data Protection Commissioner, famously described the EU GDPR as the “law of everything”. The flexible, principle-based framework also rarely provides simple answers to the question of whether processing is permitted – rather, you are often left with a series of questions to consider: “is the processing fair?”, “is it transparent?”, “are the security measures appropriate?” and so on.

Put differently, this is a framework in which everyone is arguably in breach (to varying degrees) all of the time. This means the question of who decides what these broad principles mean, and where to target enforcement, is just as important as the black-letter text of the legislation. This is a pointed issue given the Information Commissioner is currently a corporation sole. In other words, the office of Information Commissioner is vested directly in a single individual, currently John Edwards.

This unusual constitutional structure gives the individual in that role an outsized influence within the organisation. This might mean strong and decisive direction but, in practice, some previous incumbents have indulged in haphazard decision-making. One might say UK data laws have “varied with the length of the Commissioner’s foot”.

Recognising the growing importance of data regulation, the DUA Act introduces a new constitutional structure. The Information Commission will be a body corporate with a board made up of a chair, non-executive members and executive members, including a CEO. This should help professionalise the Information Commission’s operations by ensuring better consensus-based decision making and more structured governance.

It is to be hoped, however, that this does not change things too much. Despite its faults, the Information Commissioner’s Office is widely respected (both in the EU and globally) as a capable and pragmatic regulator with a keen focus on preventing harms to individuals.

Conclusions

The Government’s consultation in 2021 promised “a bold new data regime: one that unleashes data’s power across the economy and society”. In the end, these modest and sensible reforms marginally increase the regulatory burden, whilst still leaving key questions about the regulation of AI and Adtech unanswered.

More problematic is the fact that it creates one of the most complex data protection frameworks in the world: a web of obligations in the very long and technical UK GDPR, the Data Protection Act 2018 and PECR, all as amended on Brexit and now re-amended by the DUA Act.

The old Data Protection Act 1998 was described as “a thicket” and “a cumbersome and inelegant piece of legislation”.[1] In the intervening 20 years, UK data protection laws have grown to resemble a jungle.


[1]    Campbell v MGN [2003] QB 633 at [72].