The final stages of the Data (Use and Access) Bill were more eventful than expected. Interventions from figures such as Sir Elton John and Dua Lipa drew rare attention to the issue and harnessed growing public anxiety about artificial intelligence. Many people see the possibilities of AI, but are also concerned about its impact on what they value, from music to work.
As Parliament now looks ahead to future AI bills to address questions about the creative industries, it is worth pausing to consider what has already changed. Behind the high-profile debate, the UK has quietly undermined its core data protection rights.
The biggest changes concern how algorithms are used to make decisions about people's lives. Previous law gave individuals the right not to be subject to decisions made solely by automated systems. That safeguard has now been removed in most cases. People can face serious consequences, such as losing benefits or being denied services, without a clear right to human review.
The algorithmic state
If previous debates focused on the "database state", we now face the rise of the "algorithmic state". The government has moved further towards automated, opaque systems that make decisions with little transparency. When these systems get it wrong, the burden falls on individuals to work out what has happened and to challenge bad uses of the technology.
At the same time, the government has reduced individuals' rights to seek redress, turning this into a losing scenario for most of us. The Data (Use and Access) Bill also expands government powers to share and reuse personal data for law enforcement, national security and administrative purposes. Ministers will be able to define new legal bases for data processing through statutory instruments, without meaningful parliamentary scrutiny. This means that data provided to one public body can be accessed and reused by others.
Information ranging from smart meter readings to children's educational needs could be shared between departments for purposes written into future statutory instruments, approved in a 30-minute session behind the closed doors of a parliamentary committee.
Policing is another area that has received little attention. The bill removes the requirement for police to record the reasons why they access our records. This is despite scandals such as dozens of police officers accessing murder victim Sarah Everard's files without a legitimate policing purpose. Reducing police accountability further erodes trust in our institutions.
Data adequacy under threat
The government may argue that these changes support innovation and growth, but empowering bad government and big tech mostly puts pounds in the pockets of large technology companies in the US and China, not of the UK public. The DUA Bill (soon to be enacted) also threatens the UK's data adequacy agreement with the EU, which is essential for cross-border data flows for business and policing. European civil society groups have already raised concerns with the European Commission. The bill missed the opportunity to strengthen independent oversight and give the Information Commissioner's Office the tools it needs to protect the public. Instead, it marks a shift towards greater government control of regulatory functions, more politicisation of enforcement, and more cronyism.
After years of political promises to "take back control", the irony is clear: this law hands greater power to government and big tech while reducing individual rights. As people start connecting the dots between data, AI and control, this quiet change in the law may not stay quiet for long.
James Baker is Platform Power Programme Manager at Open Rights Group