UK intelligence agencies seek to weaken data protection safeguards
UK intelligence agencies are campaigning for the government to weaken surveillance laws, arguing that the current safeguards limit their ability to train AI models, which requires large volumes of personal data.
GCHQ, MI5, and MI6 have increasingly been using AI technologies to analyze datasets, including bulk personal datasets (BPDs), which can often contain sensitive information about people who are of no interest to the security services.
Currently, a judge has to approve the examination and retention of BPDs, a process that intelligence agencies have described as “disproportionately burdensome” when applied to “publicly available datasets, specifically those containing data in respect of which the subject has little or no reasonable expectation of privacy.”
BPDs retained by the UK Intelligence Community (UKIC) fall into six categories — law enforcement or intelligence, travel, communications, finance, population, and commercial — and are acquired through both overt and covert channels.
Following this year’s review of the Investigatory Powers Act by David Anderson, a senior barrister and a member of the House of Lords, intelligence agencies are now lobbying the government to replace these safeguards with a process of self-authorization.
“The bureaucratic processes around the use of BPDs impact on recruitment and retention of talent,” Anderson wrote in his findings, noting that his team had been told the UKIC was finding it difficult to retain data scientists because they were becoming “baffled and frustrated by what may strike them as pointless impediments… notably the need to spend months obtaining warrantry for standard open-source training data.”
To tackle this issue, Anderson proposed the introduction of a new category of BPD “containing data in respect of which there is assessed to be a low or no expectation of privacy,” such as news articles, academic papers, public and official records, audiobooks and podcasts, and content derived from online video sharing platforms.
Removing the need for a warrant to analyze what Anderson described as “low/no datasets” would significantly reduce the time needed to authorize the use of such BPDs.
However, he recommended that it should not be up to UKIC itself to decide which BPDs fall into this new category; instead, ministers and judges would be required to authorize and approve the allocation of a dataset to the newly proposed class.
Anderson also said that “low/no datasets” that UKIC wished to retain and examine would still be subject to the data protection requirements of the Data Protection Act, as well as to an additional authorization requirement with associated safeguards, one that would be unique to UKIC and not imposed on any other users of such datasets.
“It is essential that they adhere to strong ethical and oversight frameworks when they use AI techniques as part of their use of investigatory powers,” he wrote, adding that the ethics of AI is one of the most pressing of contemporary issues, extending far beyond the world of intelligence and policing.
When the Home Secretary announced the review in March 2023, Liberty and Privacy International, two of the UK’s largest civil liberties organizations, released statements strongly opposing any proposal that would weaken the existing safeguards around BPDs.
“Weakening safeguards would be an unjustifiable assault on already reduced rights, and the Home Secretary’s proposals would give even more power to the State to access sensitive data such as a person’s health records or confidential legal communications,” the statement from Liberty read. “We already know that the current so-called safeguards are totally ineffective in protecting our rights and holding those in power to account.”