DPAS Bulletin - 27th March
Welcome back to our monthly DPAS bulletin, where we cover the latest data protection news from all around the world.
What facial recognition technology usage warranted enforcement action? What new artificial intelligence law cleared the Utah legislature? And how did the EU Commission breach data protection rules through its use of Microsoft 365?
Read about all this and more in our latest DPAS Data Protection Bulletin.
Key Insights
ICO launches second chapter in generative AI consultation series
The Information Commissioner’s Office has released a second call for evidence as part of their consultation series on the use of generative AI. This second chapter sets out to shed light on how the purpose limitation principle should be applied at different stages in the generative AI lifecycle.
Since the generative AI model lifecycle comprises several stages, with these stages potentially having different purposes for processing personal data, it can be difficult to know how to apply the data protection principle of purpose limitation to this technology. Through input from experts and stakeholders, the ICO seeks to provide clarity and advice on this, so that people can be informed and responsible when utilising AI.
Read more about this here.
French start-up raises $73 million to enhance data privacy for blockchain and AI projects
Zama, a startup based in Paris, has raised $73 million to aid its development of ‘fully homomorphic encryption’ (FHE), technology that allows computations to be performed directly on encrypted data without decrypting it first. Chief Executive Officer of Zama, Rand Hindi, says that this will improve data privacy and confidentiality for new industries such as artificial intelligence (AI) and blockchain.
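To make the idea of computing on encrypted data more concrete, below is a minimal Python sketch of additive homomorphism. It is purely illustrative: the scheme, keys, and modulus are invented for the example and bear no relation to Zama’s actual FHE technology; fully homomorphic schemes also support multiplication, which is what enables arbitrary computation on encrypted values.

```python
# Toy illustration of additive homomorphism: ciphertexts can be combined
# without ever being decrypted. NOT real FHE - just a one-time-pad-style
# scheme over the integers mod N, chosen to make the idea concrete.
import secrets

N = 2**32  # modulus for the toy scheme (illustrative value only)

def encrypt(message: int, key: int) -> int:
    """'Encrypt' by adding a secret key modulo N."""
    return (message + key) % N

def decrypt(ciphertext: int, key: int) -> int:
    """Recover the message by subtracting the key modulo N."""
    return (ciphertext - key) % N

# Two parties encrypt their values with independent random keys.
k1, k2 = secrets.randbelow(N), secrets.randbelow(N)
c1 = encrypt(25, k1)
c2 = encrypt(17, k2)

# A third party adds the ciphertexts without ever seeing 25 or 17.
c_sum = (c1 + c2) % N

# Only a holder of both keys can decrypt the result of the computation.
assert decrypt(c_sum, (k1 + k2) % N) == 42
print("sum computed on encrypted data:", decrypt(c_sum, (k1 + k2) % N))
```

In a real deployment, the party performing the computation never holds the keys, which is the privacy benefit Zama is targeting for AI and blockchain workloads.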
Read more about this news here.
ICO launches call for views on “consent or pay” cookie model
In a move met with some controversy online, the ICO proposed a “consent or pay” model with regards to advertising cookies. The idea, in a nutshell, is that users have the choice either to accept all cookies or to pay a fee not to be tracked. Online users would therefore be able to use websites for free, but only if they gave their consent to their personal data being used for personalised advertising.
This proposal has received mixed responses, many of which question the validity of consent given when the alternative is paying a fee, suggesting that this is not, in fact, a free choice.
Within this update, the ICO also announced the success of their crackdown on websites not offering their users a free choice. They reported that, of the 53 organisations they wrote to in 2023, almost 80% have since made changes.
Read the ICO’s call for views here.
ICO investigates claims of Kate Middleton privacy breach
The Information Commissioner’s Office has received a report that staff at the hospital where Kate Middleton underwent surgery in January tried to access her private medical records. The claim followed conspiracy theories circulating online about Kate’s health, with some analysing a recently posted family photo for signs of editing and others speculating that a new video taken at Windsor Farm Shop actually features a body double in Kate’s place.
Kate later revealed in a video announcement that the secrecy surrounding her health was due to an unfortunate cancer diagnosis and an understandable wish to keep such matters private.
According to a claim from the Daily Mirror, at least one member of staff at the London Clinic was caught attempting to access the Princess of Wales’s notes, and Kensington Palace was assured by the hospital that an investigation into this security breach would be launched.
A spokesperson from the ICO has stated that they have received a breach report and will be assessing the information provided.
Read more about this story here.
Government Activity
Data Protection and Digital Information Bill enters committee stage in House of Lords
The Data Protection and Digital Information (DPDI) Bill officially entered committee stage in the House of Lords earlier this month, with committee hearings taking place on 20th, 25th, and 27th March.
A number of new amendments were put forward for discussion at these hearings, including:
- ensuring children’s data is included in the definition of sensitive personal data
- assigning data rights to a third party
- the use of personal data for direct marketing
Read more about this development here.
EU AI Act approved with overwhelming majority vote
On 13th March, the EU AI Act was approved by lawmakers by an overwhelming majority – with 523 votes in favour of the act, 46 against, and 49 abstentions.
This marks a significant step forward for the regulation of artificial intelligence, with the act set to reshape how AI systems are developed, marketed, and used across the EU.
Read more about the AI Act here.
Funding for new technology investments announced by UK government
On 6th March, the government announced the ‘Budget for Long Term Growth’, which included plans to provide the NHS with a day-to-day funding boost of £2.5 billion for 2024 and 2025, in addition to £3.4 billion to invest in new technology such as artificial intelligence. These plans aim to transform operations by replacing outdated systems and thereby improving productivity. The hope is that implementing new, innovative technology will remove roadblocks such as administrative staff spending unnecessary amounts of time on lengthy tasks.
Along with this investment, the government also plans to invest in new tech for services such as the police force – specifically, facial recognition and surveillance to aid criminal identification.
Read more about this here.
Cabinet Office announces Dr Nicola Byrne reappointed as National Data Guardian
The Cabinet Office announced on 4th March that Dr Nicola Byrne was to be reappointed as the National Data Guardian until March 2027. Dr Byrne will now serve this additional three-year term alongside her duties as a consultant psychiatrist for the NHS.
Dr Nicola Byrne stated the following:
“I am delighted to have the opportunity to continue my efforts in ensuring the highest ethical as well as legal standards for the use of health and social care data.
In this constantly evolving policy and regulatory landscape, my team, panel of advisors and I remain dedicated to promoting the safe and appropriate use of data to improve patient care. We are committed to protecting patient confidentiality and choice and ensuring that healthcare data is only used in ways that benefit the public.
Our ultimate goal is to build public trust in the use of their confidential data, so that it can be used to improve healthcare outcomes for everyone.”
Read more about this here.
Utah legislature passes artificial intelligence law
An artificial intelligence (AI) law was passed in the state of Utah on 28th February. The Artificial Intelligence Policy Act (Senate Bill 149) cleared the legislature with overwhelming majority votes in both the House and Senate.
The main focus of this bill is to ensure that businesses are held accountable if their use of generative AI deceives the user. The bill also introduces new transparency obligations, requiring certain professionals to disclose the involvement of AI technology or the use of AI-generated materials. For example, telemarketers will need to disclose the use of chatbots, if asked to do so.
Read more about this here.
Enforcement Action
ICO issues enforcement notice for unlawful facial recognition technology usage
The Information Commissioner’s Office issued enforcement notices to Serco Leisure, Serco Jersey, and seven associated trusts for unlawfully utilising facial recognition technology and fingerprint scanning as a means to monitor employee attendance. These trusts were found to have been illegally processing the biometric data of over 2,000 employees at 38 leisure facilities for this purpose.
Read more about this here.
ICO takes action against five public authorities for failing to meet FOI obligations
Five public authorities have found themselves on the receiving end of regulatory action from the ICO, as a result of continuously failing to meet their obligations under the Freedom of Information (FOI) Act.
The authorities in question are:
- Sussex Police
- South Yorkshire Police
- The Department for Education (DfE)
- Foreign, Commonwealth and Development Office (FCDO)
- Financial Ombudsman Service (FOS)
While the first two listed were issued with enforcement notices for their failings, the latter three were given practice recommendations to help them improve their operations and better comply with their obligations.
Read more about this here.
Home Office’s GPS pilot found to breach data protection law
The ICO issued an enforcement notice and a warning to the Home Office for not sufficiently assessing the privacy risks posed by their GPS pilot scheme. The purpose of this scheme was to test the effectiveness of electronic monitoring (via ankle tags) on asylum seekers, as an alternative to detention.
Tracking individuals’ locations is a highly intrusive activity, and organisations doing so must be able to provide sufficient justification for it. The Home Office failed to appropriately consider the impact this could have on people who are already vulnerable, or the measures that should be put in place to mitigate these risks.
Read more about this here.
Klarna fined for violating EU GDPR
Klarna, the payments group based in Sweden, was fined 7.5 million Swedish crowns for not providing its clients with sufficient information about how their personal data was being used, thereby violating the EU GDPR. After a lower court ruled last year that Klarna should pay 6 million crowns, Sweden’s Administrative Court of Appeal raised the amount back to the original penalty of 7.5 million crowns.
Read more about this here.
Chief Constable of West Midlands Police reprimanded
West Midlands Police has received a reprimand from the ICO for mistakenly linking and merging the records of two individuals on multiple occasions. The two individuals share similar personal data (both have the same name and date of birth), and as a result of the mix-up, numerous errors were made, such as officers visiting the wrong home and showing up at the school of the wrong person’s child.
Both of these individuals had been victims of crime; however, one was a suspect. By failing to make a clear distinction between the data of victims and suspects, WMP was breaching the Data Protection Act 2018. In addition, WMP failed to take appropriate steps to rectify the error quickly enough and did not prevent it from recurring, amounting to two further breaches of data protection law.
Read more about this incident here.
EU Commission breaches data protection rules with Microsoft 365 use
According to the European Data Protection Supervisor (EDPS), the EU’s independent supervisory authority, the European Commission breached data protection rules through its use of Microsoft 365. The Commission failed to ensure adequate safeguards for transferring data outside the EU or the European Economic Area (EEA), and also neglected to disclose the types of personal data being collected or the purpose of doing so.
Read more about this here.
GET IN TOUCH WITH US!
If you need any support in ensuring your organisation is complying with the relevant legislation, or require training in the areas of data protection and information security, get in contact with us.
Either call us on 0203 3013384, email us at info@dataprivacyadvisory.com, or visit our website at www.dataprivacyadvisory.com and fill out a contact form. Our dedicated team will get back to you as soon as possible.
The following courses are currently 25% off:
BCS Foundation Certificate in Information Security Management Principles (22nd April) – now £1,350
BCS Practitioner Certificate in Freedom of Information (25th April) – now £1,350
BCS Foundation Certificate in Information Security Management Principles (26th April) – now £1,537
Book yourself a place through our courses page today.