Is Big Brother watching you? Developments in Live Facial Recognition (LFR) systems over the last few years have led many businesses to explore its use for their own commercial purposes. Last month the ICO published a Commissioner’s Opinion examining the data protection issues surrounding the use of this new technology. It has never been so important to protect our rights and freedoms, and those of our families, as individual data subjects.
The report is based on 14 examples of actual or planned uses of LFR in public places gathered from ICO investigations. These examples, from private companies and public bodies, provide insight into how organisations are seeking to use LFR in public places and the key data protection compliance issues that have arisen.
We can all see the potential advantages of this type of software; it is already used for identity verification at passport control and on electronic devices. But as the use of the technology expands to real-time scenarios, such as scanning faces at mass events like a football match or a concert, there are greater risks to our privacy, rights and freedoms.
The report identifies new purposes considered by private and public organisations, such as marketing, targeted advertising and controlling access. Taking our children to a leisure complex, visiting a shopping centre or touring a city to see the sights could all result in our biometric data being collected and analysed with every step we take. The clothes we wear, how long we stare at an advert, the route we take along the street could all be recorded by an electronic billboard. That data could be used to display adverts tailored specifically to us (or our children) or to match our image against known shoplifters as we walk around the local shopping centre. And if the technology gets it wrong, because we humans can look alike (who doesn’t have a doppelganger?), we all need to be able to exercise our rights as data subjects.
Findings
The ICO found that companies often did not sufficiently consider ‘the necessity, proportionality and fairness of the use of LFR systems and failed to be sufficiently transparent’, a breach of the first principle of the UK GDPR, which requires that personal data be ‘processed lawfully, fairly and in a transparent manner in relation to individuals’.
They also found that organisations did not always do enough to demonstrate a fair balance between their own purposes and the interests, rights and freedoms of the public. UK data protection law requires a lawful basis for processing; where the controller relies on its ‘legitimate interests’, it must assess the impact of the processing on the interests and fundamental rights and freedoms of the data subject. Among the companies investigated, little had been done to assess the proportionality of the processing.
Here is a summary of key data protection issues reported by the ICO in the cases they investigated:
- The automatic collection of biometric data at speed and scale without clear justification
Controllers failed to justify that the automatic, indiscriminate processing of biometric data was necessary and proportionate. In most cases there was no evidence of a data protection by design and default approach being taken. Data protection impact assessments showed little consideration of the effectiveness of LFR in achieving the controller’s objective weighed against the potential impacts on data subjects.
- The lack of control for individuals and communities
LFR deployed in public places involves collecting the public’s biometric data without those individuals’ choice or control. Where biometric data is processed without the consent or engagement of the data subject, controllers need to justify the processing and ensure it is fair, necessary, proportionate and transparent.
- A lack of transparency
Many of the transparency measures were insufficient: inadequate signage, poor communications and a lack of information available in privacy notices. A lack of transparency can also affect the ability of individuals to exercise their data protection rights, such as the right of access, the right to erasure and the right to object.
- The technical effectiveness and statistical accuracy of LFR systems
If LFR systems are not sufficiently statistically accurate they may produce “false positives” (wrongly matching an innocent person against a watchlist) or “false negatives” (failing to flag someone who is on it). If the number of false results is high, this calls into question whether the LFR system is necessary or fair. Companies need to carry out due diligence on the products they purchase rather than relying on the information provided by their supplier.
- The potential for bias and discrimination
The potential for bias in complex AI systems is another risk highlighted in the ICO’s guidance. Several studies have shown that LFR is less accurate for some demographic groups, including women, minority ethnic groups and potentially disabled people. This raises serious ethical concerns and risks breaching the first principle that data be processed ‘lawfully, fairly and in a transparent manner in relation to the data subject’.
- The governance of watchlists
Watchlists are not always compiled and maintained in a lawful, fair and transparent way. Companies must be sure that data subjects are able to exercise their data protection rights in relation to watchlists, including the rights to be informed, to rectification, to erasure and to object. These rights also apply to any watchlist data shared with third parties. Companies should consider whether it is necessary to share watchlist data with other organisations, and where they rely on exemptions from data protection legislation this needs to be clearly justified.
- The governance of LFR escalation processes (ie what happens after someone is identified following an LFR match)
It is important that processes related to the use of LFR are clearly defined, including verification of the individual’s identity. Without a clearly defined escalation process to fulfil the organisation’s purpose, LFR systems may be difficult to justify.
- The processing of children’s and vulnerable adults’ data
The ICO reported that in many of its investigations, LFR was deployed in locations likely to be accessed by children and vulnerable adults, such as retail or public transport settings. Data protection law provides additional protections for children and adults who may be less able to understand the processing and exercise their data protection rights. Companies using LFR need to pay additional attention to the transparency, necessity and proportionality of the processing, particularly when children and vulnerable adults make up a significant proportion of those covered by the system.
Data protection is often regarded as an onerous, box-ticking exercise that stifles the creative flair of innovative companies. But unlike CCTV, LFR and its algorithms can automatically identify who you are and infer sensitive details about you. It can be used to instantly profile you to serve up personalised adverts or to match your image against known shoplifters as you do your weekly grocery shop.
Considering the multitude of CCTV cameras throughout the UK, most of us would agree that good governance is essential at this time. The prospect of combining LFR technology with social media platforms or other big data systems is terrifying, and we should not sleepwalk into allowing LFR to roll out in the way that CCTV has proliferated over the past decade. This report is a reality check on the importance of maintaining a balance between the use of technology and our rights and freedoms as individuals.
If you use software that records personal data and need help understanding your obligations as a Data Controller, we can help you. Contact us by clicking here.