Last week, EFF joined 30 civil society groups and academics in warning UK Home Secretary Yvette Cooper and Department for Science, Innovation & Technology Secretary Peter Kyle about the law enforcement risks contained within the draft Data Use and Access Bill (DUA Bill).
Clause 80 of the DUA Bill weakens crucial data protection safeguards for solely automated decisions made in the law enforcement context.
Under sections 49 and 50 of the Data Protection Act 2018, solely automated decisions are prohibited in the law enforcement context unless the decision is required or authorised by law. Clause 80 reverses this position, permitting such decisions in all scenarios unless the processing involves special category data.
In short, this would enable law enforcement to make automated decisions about people based on their socioeconomic status, regional or postcode data, inferred emotions, or even regional accents. This widens the already broad scope for bias, discrimination, and lack of transparency at the hands of law enforcement.
In its own Impact Assessment for the DUA Bill, the government acknowledged that “those with protected characteristics such as race, gender, and age are more likely to face discrimination from ADM due to historical biases in datasets.” Yet politicians in the UK have decided to push forward with this discriminatory and dangerous agenda regardless.
Further, given the already minimal transparency around automated decision-making, individuals affected in the law enforcement context would have few, if any, routes to redress.
The DUA Bill puts marginalised groups at risk of opaque, unfair and harmful automated decisions. Yvette Cooper and Peter Kyle must address the lack of safeguards governing law enforcement use of automated decision-making tools before time runs out.
The full letter can be found here.