UK – Even a dog’s name can be personal data

The question of what constitutes personal data is fundamental to the scope and effect of data protection law, but continues to raise difficult questions, particularly on issues such as indirect identifiers, group identification and proposed reforms to UK law.

We consider the English court’s recent decision that a dog’s name is personal data and other recent UK cases that trace the fine line between personal and non-personal data.

Indirect identifiers – Bad dog

The question of indirect identifiers arose after Mr Williams made a freedom of information request to the Avon and Somerset police for the name of a police dog that mauled him at a rave on the outskirts of Bristol (Williams v The Information Commissioner [2022] UKFTT 403).

Freedom of information cases have long been fertile ground for decisions in the UK on the meaning of personal data. This is because personal data is exempt from disclosure where disclosure would breach the data protection principles. Consequently, whether or not information is personal data is a regular consideration for the First Tier Tribunal. Coupled with the streamlined appeals process offered by the Tribunal, this generates a large number of interesting judgments in the UK tracking the boundary between personal and non-personal data.

In this case, the first question for the Tribunal was whether the dog’s name was personal data. The Tribunal concluded, quite rightly, that it was. This was not because of some form of canine legal personality, but rather because a police dog is assigned to a single dog handler and an internet search of a police dog’s name readily reveals the handler’s identity.

However, that is not the end of the matter. To establish whether or not disclosure would breach the lawfulness principle (Article 5(1), UK GDPR), the Tribunal had to consider whether there was a “lawful basis” for the processing under Article 6, UK GDPR. 

In the understandable absence of consent from the dog handler, this meant that the Tribunal had to consider if the three-fold test for disclosure under the “legitimate interests” lawful basis (Art 6(1)(f), UK GDPR) had been satisfied, namely:

  • There must be a legitimate interest in disclosure.
  • The disclosure must be necessary for the purposes of the identified legitimate interest.  In this context, necessary means “reasonably necessary” not “absolutely necessary”. As part of the proportionality assessment inherent in the “necessity” test, the disclosure must cause the minimum of interference with the privacy rights of the data subject (i.e. the dog handler not the dog) in order to achieve that aim.
  • The balancing exercise between the legitimate interest and the interests of the data subjects must be satisfied. When carrying out this balancing exercise it is important to take account of the fact that disclosures under freedom of information legislation are disclosures to the world.

Applying these principles, the Tribunal found that there was clearly a legitimate interest – namely to ensure public scrutiny of police conduct, particularly in a situation where the dog had seriously injured a member of the public.  However, the Tribunal found that, as the attack on Mr Williams was to be investigated by the Independent Office for Police Conduct, the disclosure was not necessary for the purposes of the identified legitimate interest. 

In coming to this conclusion, the Tribunal was clear that the additional (indirect) disclosure of the dog handler’s identity was not the minimum interference with the privacy rights of the data subject and not therefore necessary to ensure public scrutiny of police conduct.  The Tribunal also had regard to the fact that further information regarding the incident would make its way into the public following the conclusion of the investigation into the incident.

As the disclosure did not pass the “necessity test” the Tribunal did not have to undertake the balancing test between the legitimate interest and the rights of the data subject.

The Avon and Somerset Police Dogs’ Twitter account (@ASPoliceDogs) reveals that they have a range of fierce looking dogs including Fonzo, Bali, Dax, Jabba and Athos. Whether any of them is the bad dog at the centre of this case remains a mystery… 

Group identification – The Operation Sheridan press release

While the existence of indirect identifiers as personal data is well established (e.g. the UK courts have also concluded that a number plate[1] and postcode[2] can each qualify as personal data), more difficult issues arise when information relates to a group of individuals or only allows the identity of an individual to be inferred.

These points were addressed in a case brought by Mr Driver, a well-known figure in Lancashire politics.[3] He became a suspect in a corruption investigation into a number of local politicians known as Operation Sheridan.

There were numerous press stories about the investigation between 2016 and 2018. As a result, “there was plenty of material in the public domain linking [Mr Driver] to Operation Sheridan and stating, more or less in terms, that the police file [had] been sent to the CPS in August 2018” so that the Crown Prosecution Service (CPS) could decide whether charges should be brought.

Around 10 months later, in response to a request from a member of the public, an official from the CPS responded by email saying:

 “A charging file has been referred from the Operation Sheridan investigation team to the CPS for consideration”.

Mr Driver sued the CPS, claiming the email was a breach of data protection law, a misuse of private information and a breach of the Human Rights Act 1998. As the CPS is a law enforcement body, the data protection claim was made under Part 3 of the Data Protection Act 2018 (which transposed the EU Law Enforcement Directive) but the relevant provisions are much the same.

The key question for the data protection claim was whether the email from the CPS – despite not naming Mr Driver and relating to a group of individuals – had revealed his personal data. The judge concluded that there was “no doubt” that it had.  There were only eight suspects in Operation Sheridan so the email indirectly identified him as one of a small group of people in relation to whom a file had been sent to the CPS for a charging decision.

Moreover, even though the information related to a group of people, the fact that he was part of that group and so could be charged was a “significant life event”. Therefore, the information in the email related to him under either the “biographical approach” or the “obviously about” test under English law.[4]

Having found that the email had revealed personal data, the court concluded that disclosure was a breach of the Data Protection Act 2018. While the CPS had a legitimate reason to maintain public confidence in the investigation and prosecution of crime, it had failed to show that the email about Mr Driver had been necessary for these purposes.

As a result, Mr Driver was awarded £250 in compensation. The claims for misuse of private information and breach of the Human Rights Act failed.

Wider questions about associations of groups and individuals

The Driver case leads on to broader questions about whether information about a group of people constitutes personal data. This is likely to turn on:

  • first, whether the size and nature of the group means information about the group is too diffuse to “relate” to the members of the group; and
  • second, whether any individual is an identifiable member of the group.

There is little case law on the first question so it is easiest to consider an example – is the statement “All members of the Spurs Supporters’ Club are terrible and dishonest” personal data?[5] The statement clearly relates to an identified group of individuals and, on the face of it, appears to be “biographically” significant. However, given the large number of individuals and the nature of the slur, it is simply too diffuse and can hardly be said to seriously apply to each and every member or any specific member.

There is more authority on the second question. NHS Business Authority v ICO & Spivack [2021] UKUT 192 relates to a freedom of information request for information about dispensaries which had prescribed the cannabis-based medication Stiripentol. The NHS Business Authority refused to provide some of the data for dispensaries where fewer than five items had been prescribed. The NHS argued this was because of the risk the patients could be identified as a result of that information, combined with other information.

In particular, given the information would be released to the public, the NHS concluded that a patient could be identified by a motivated inquirer, e.g. someone wanting to market Stiripentol, journalists, family members or the like, based on the specific additional information they either already knew or could obtain.
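The suppression practice described above is a standard statistical disclosure control technique. The following sketch (not drawn from the judgment; all names are illustrative) shows the logic of withholding "low cell count" figures before release, using the threshold of five that the NHS Business Authority applied:

```python
# Illustrative sketch of low-cell-count suppression, a common
# statistical disclosure control. The threshold of 5 mirrors the
# figure used by the NHS Business Authority; the function and
# variable names here are our own, not from the case.

SUPPRESSION_THRESHOLD = 5

def suppress_low_counts(counts, threshold=SUPPRESSION_THRESHOLD):
    """Replace counts below the threshold with None (withheld)."""
    return {
        dispensary: (count if count >= threshold else None)
        for dispensary, count in counts.items()
    }

# Hypothetical prescribing figures per dispensary.
items_dispensed = {"Dispensary A": 12, "Dispensary B": 3, "Dispensary C": 7}
released = suppress_low_counts(items_dispensed)
# Dispensary B's figure falls below the threshold and is withheld;
# the other counts are released unchanged.
```

The rationale is that a figure such as "3 items dispensed" narrows the pool of possible patients far more than a larger count, increasing the risk that a motivated inquirer could combine it with other information to identify someone.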

The Upper Tribunal disagreed, finding there must be actual identification of the individuals for the information to be personal data. Identifying a pool that contains, or may contain, a person covered by the data is not sufficient. Saying that it is reasonably likely that someone is covered by the data is not sufficient. Accordingly, in this case, statistics indicating very small numbers of people in a particular category (or “low cell count” information) withheld by the NHS were not personal data.

The judgment is controversial. It does not appear to be consistent with the leading cases in this area, Breyer (C-582/14) and R v Information Commissioner [2011] EWHC 1430, though both decisions were considered in the judgment. However, it is still a clear illustration of the difficulties in determining whether “group information” is personal data, and shows that there is no fixed threshold below which a very low count statistic becomes personal data.

Legislative reforms – The new Bill

The UK is also proposing to confront some of these difficulties through its proposed Data Protection and Digital Information (No 2) Bill. That Bill will insert a new section 3A into the Data Protection Act 2018 that states an individual is “identifiable” where:

  • the relevant controller or processor can itself use reasonable means to identify them at the time of processing. The “reasonable means” must be assessed by reference to the time, effort and costs involved in identifying the individual and the technology and other resources available to the controller or processor, or
  • the controller or processor knows, or ought to know, another person will, or is likely to, obtain the information and that other person can by reasonable means identify the individual. The “likely to obtain” test includes those likely to obtain the data through hacking.

This is a laudable attempt by the UK Government to bring clarity to this definition, particularly in confirming that you only need to consider the “reasonable means” available to those holding, or likely to hold, the data. However, given the infinite variety of types of information and processing situations, it seems unlikely this will prevent difficult questions arising in the future.


While the logic in these judgments is – in general – understandable, it arguably overextends the boundaries of data protection law. The GDPR’s very broad principles mean it already is the “law of everything”[6], with significant implications for anything anyone does with personal data. Applying this regime to information that is only very loosely connected with an individual risks creating an unworkable regime.

There is no quick solution to this issue but these decisions illustrate two long-standing principles on the meaning of personal data:

  1. No bright line test: While some information obviously is (or is not) personal data, the middle ground is riddled with uncertainty. The problem of applying a binary categorisation to an infinite variety of facts is not unique to data protection law – another example is the idea-expression dichotomy in copyright, in relation to which Judge Learned Hand said: “…nobody has ever been able to fix that boundary and nobody ever can”.[7] However, the question of whether information is personal data is a day-to-day concern for many businesses and the answer often requires a messy fact-intensive assessment with no clear conclusion.
  2. The internet makes anonymisation hard: You must assess all means reasonably likely to be used to identify an individual to determine if information is personal data. That could include a wide range of information sources but will always include the internet. Trying to assess whether there is, or is not, sufficient information on the internet to identify someone is a fraught exercise and any conclusion that information has been anonymised should be approached with caution. After all, even dogs’ names can be personal data.


A longer version of this article addressing developments in both the UK and the EU will appear in the November edition of Global Privacy Law Review.

[1]    DVLA, 17 October 2017, Decision Notice FS50689632.

[2]    Roy Benford v ICO and DEFRA, EA2007/009.

[3]    Driver v Crown Prosecution Service [2022] EWHC 2500 (KB). 

[4]    For example, Durant v Financial Services Authority [2003] EWCA Civ 1746 [21-31].

[5]    One of the authors is an Arsenal supporter but would like to clarify that this is an entirely spurious assertion.

[6]    “One of the problems with GDPR is that it has become the law of everything, and that it’s drawing [data protection authorities], who are not elected officials, into making an awful lot of decisions that impact societies and individuals, which go well beyond data processing.” Helen Dixon, Irish Data Protection Commissioner, May 2020.

[7]    Nichols v Universal Pictures and Peter Pan Fabrics, 45 F.2d 119 (2d Cir. 1930).