Data protection and assisted lifeguard technology

Jan 26, 2023

GUIDANCE

This article explores the data protection implications of assisted lifeguard technologies (ALT), sometimes called drowning detection systems. The systems under examination in this article use overhead and/or underwater cameras and computer algorithms to detect potential drowning incidents based on behaviours the system has been trained to recognise. At present, these computer vision systems are supported by machine learning; no deep learning system has yet been brought to market.

 

What is personal data? 
Article 4(1) of the GDPR defines personal data as any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.

In this case, the personal data are the images captured of pool users which enable them to be identified. It is worth saying that not all images captured of pool users would constitute personal data: only those from which a person can be identified, directly or indirectly, fall within the definition.

 

Do ALT have a lawful basis for capturing and storing data? 

Article 5(1)(a) of the GDPR says that personal data shall be processed lawfully, fairly and transparently. The lawful basis most likely to apply is 'vital interests' (Article 6(1)(d)), although there may be some debate about whether ALT is 'necessary' to protect someone's life. 'Necessary' does not mean the processing has to be absolutely essential, but it must be more than just helpful and more than just standard practice: it must be a targeted and proportionate way of achieving a specific purpose. It is not enough to argue that processing is necessary because you have chosen to operate your business in a particular way. The question is whether the processing is objectively necessary for the stated purpose, not whether it is a necessary part of your chosen methods.

Whether an operator has a lawful basis for capturing and storing data using ALT systems depends on whether they do it fairly and transparently for a legitimate purpose. 

 

Do ALT collect and store data in a 'fair' way? 

Is it fair to capture, store, and use personal and sensitive data using ALT? Fairness means that you should only handle personal data in ways that people would reasonably expect and not use it in ways that have unjustified adverse effects on them. I would expect most pool users to understand that ALT involves being filmed and that footage is stored for a reasonable period before being securely destroyed. I am less sure that pool users understand that their non-biometric data is being used to improve the algorithm of a third-party ALT manufacturer. 

 

Do ALT collect and store data in a 'transparent' way?  

Transparent processing is about being clear, open and honest with people from the start about who you are and how and why you use their personal data. Again, I think the trickier area of this is when the data is being transmitted to a third party to improve a machine-learning algorithm. 

 

Do ALT collect and store data for a specified, explicit and 'legitimate purpose'? 

Article 5(1)(b) says personal data must be collected for a specified, explicit and legitimate purpose. As identified above, the legitimate purpose for capturing data using ALT systems is likely to be premised on the need to protect pool user safety, but the justification must go further: you must show that data capture is necessary to keep pool users safe, not merely helpful or standard practice. To demonstrate this, you will likely need to examine the available alternatives.

The argument for the storage of the data is less clear. There is certainly an argument in machine-learning ALT systems that data storage is essential to improving the algorithm and increasing the protection of other pool users. However, it is less clear that such storage remains fair and transparent unless it is disclosed in pool use terms and conditions. If the ALT system you select stores data for considerable periods in order to improve the quality of the algorithm, it would certainly be advisable to make that disclosure.

 

Do ALT capture 'sensitive' data? 

The category of sensitive data most likely to be relevant to ALT is biometric data. CCTV cameras collect and store both biometric and non-biometric images. In this context, a biometric image is one in which facial recognition technology is used to identify a person; a non-biometric image is one in which it is not. Facial recognition technology is a set of algorithms that work together to identify people in video or static images.

A photo is not automatically biometric data. According to the ISO/IEC 19794-5 face image standard, relevant questions to ask when evaluating whether a photo is 'biometric data' include:

  • Is the person in the photo looking towards the camera?
  • Are their eyes looking into the camera?
  • Is it a neutral facial expression (no smiling)?
  • Does it have a monochrome background (light grey or grey are best)?
  • Is the photo free from shadows on the face or in the background?

Some of the images captured by ALT systems may be capable of fulfilling the criteria for biometric images as set out by ISO/IEC 19794-5. The consequences of images captured by ALT systems constituting sensitive data are discussed below.
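As an illustration only, the checklist above can be expressed as a simple screening function. The class, field names, and the all-criteria-must-pass rule are assumptions for this sketch, not drawn from the standard itself:

```python
from dataclasses import dataclass


@dataclass
class PhotoAttributes:
    """Hypothetical attributes mirroring the checklist above."""
    facing_camera: bool          # person looking towards the camera
    eyes_on_camera: bool         # eyes looking into the camera
    neutral_expression: bool     # no smiling
    monochrome_background: bool  # light grey or grey background
    free_of_shadows: bool        # no shadows on face or background


def meets_biometric_criteria(photo: PhotoAttributes) -> bool:
    """Return True only if every checklist item is satisfied."""
    return all([
        photo.facing_camera,
        photo.eyes_on_camera,
        photo.neutral_expression,
        photo.monochrome_background,
        photo.free_of_shadows,
    ])


# A typical ALT frame: a swimmer is rarely facing the overhead camera
alt_frame = PhotoAttributes(False, False, True, False, False)
print(meets_biometric_criteria(alt_frame))  # prints False
```

This illustrates why most ALT footage is unlikely to qualify: a single failed criterion (here, the subject not facing the camera) takes the image outside the checklist.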

 

What if ALT capture 'sensitive' data?

Biometric data is sensitive data. To lawfully process special category data, you must identify a lawful basis under Article 6 of the UK GDPR, establish one of the ten conditions for processing under Article 9(2), and meet the relevant conditions in Schedule 1 to the DPA 2018.

Only two conditions are likely to be applicable. The first is explicit consent. It is worth noting that in Europe this condition cannot be relied upon for biometric images of an employee, as an employee is not deemed able to give consent freely to their employer. The second is reasons of substantial public interest, namely the 'protecting the public' condition in paragraph 11 of Schedule 1 to the DPA 2018, which itself requires explicit consent (s.11(1), Schedule 1, Part 1). Thus, consent is required by both routes.

Typically, ALT systems do not use facial recognition technology to identify a person within an image. Instead, the algorithms focus on body shape, size, and trajectory. Accordingly, most ALT systems do not capture sensitive data in the form of biometric data.

 

Do assisted lifeguard technologies store data? 

Most ALT systems store non-biometric data in the form of images. The length of time for which data is stored varies by system and operator but is typically at least one week. ALT with a machine-learning component typically store non-biometric images for considerably longer, which may bring an operator into conflict with their duty to use the data fairly.
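A retention rule of this kind can be sketched as a small policy check. The one-week figure reflects the typical minimum described above; the function and field names are hypothetical, and real systems would apply whatever period the operator has documented:

```python
from datetime import datetime, timedelta

# Assumed retention period: the one-week typical minimum described above.
# ML-enabled systems may retain images for considerably longer.
RETENTION = timedelta(days=7)


def is_due_for_deletion(captured_at: datetime, now: datetime,
                        retention: timedelta = RETENTION) -> bool:
    """True once a stored image has exceeded the retention period."""
    return now - captured_at > retention


now = datetime(2023, 1, 26)
print(is_due_for_deletion(datetime(2023, 1, 10), now))  # prints True
print(is_due_for_deletion(datetime(2023, 1, 25), now))  # prints False
```

The point of making the period an explicit, documented parameter is that an operator can then evidence a defined storage limitation policy rather than retaining footage indefinitely by default.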

 

In conclusion

ALT systems are likely to fall within the 'vital interests' gateway, provided their use is necessary, not merely helpful, to protect life. The specific purpose that justifies these systems' use is likely to be based on hazards that pose a risk to pool users' health and safety and cannot proportionately be overcome by alternative means. Operators are responsible for processing the data collected through ALT systems in the way a reasonable person would expect. It is less clear whether the reasonable user would expect their data to be used to improve a third party's machine-learning algorithm. It is therefore advisable to be upfront about this use in pool use terms and conditions.

 

Citation. Jacklin, D. 2023. Data protection and assisted lifeguard technology. Water Incident Research Hub, 26 January.