Article 22 of the GDPR provides that “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
The now notorious algorithm that was applied to A level results in England looks a lot like a decision based on automated processing and definitely produced a significant effect on A level students. The Information Commissioner’s Office has become involved and has engaged with Ofqual to understand the circumstances of the processing.
In its defence Ofqual maintains that the algorithm (or standardisation model, to use its terminology) does not involve automated decision making, because teachers and exam board officials are involved in decisions on calculated grades. However, the standardisation model necessarily processes the data, and teachers' input comes before that processing takes place. There is also a difference between being involved in the sense of providing input and being involved in the sense of carrying out a human review of an automated decision.
This is what the right to object to decisions taken by automated means comes down to. The redress is to require human intervention in the decision-making process: even if the ultimate decision remains the same, a human should review it. Whether or not there was such a review of the output of the algorithm is a matter of fact, and it is doubtful whether anyone would want to step forward at this stage and claim to have reviewed and approved the algorithm's output. Arguably the right of individuals and schools to appeal against an award means that there is a route to object to the decision made by the algorithm.
There is also the Data Protection Principle of Lawfulness, Fairness and Transparency to consider. If, as has been claimed, some students were disadvantaged because of their backgrounds and the schools they attended, the processing would be inherently unfair. The fact that students from independent schools benefited most from the moderation process (more upward moderation to A and A* results, and less downward moderation of grades overall, compared to sixth form colleges) looks unfair and needs to be investigated further.
The government has now backtracked on the moderation process and A level grades will revert to teacher recommendations, but there remains the damage and distress suffered by the 2020 cohort of A level students. Damage might take the form of missing out on a university place or needing to defer entry for 12 months. No one questions that this has been a distressing time. The GDPR provides for compensation for such damage and distress and introduces class rights to data protection, meaning that groups of disadvantaged data subjects can bring a court action together.
This is an unprecedented time of financial support being pumped into the UK economy; nevertheless, the A level fiasco is only the latest in a chain of poor data protection implementation and potential legal action. Again we have to ask whether the government is taking data protection seriously, or whether it is viewed as just so much “red tape”. And how will these data protection failures play out in the EU, when we are dependent on an Adequacy decision to continue to transfer personal data to and from EU Member States? The ICO says it will continue to monitor the situation. Will that be enough?
Mandy Webster, Data Protection Consulting