Data Protection Bulletin - 18 August 2023
Welcome to the latest edition of our Data Protection Bulletin, where we keep you informed of key insights, government regulatory activity, and enforcement actions in the world of data protection.
Here is our round-up of the most significant data protection developments in the UK and overseas in recent weeks. Visit our website for more news.
Key Insights
Several Major Organisations Suffer Data Breaches
Recent breaches at major UK organisations highlight the urgent need for robust data protection measures.
Police Service of Northern Ireland (PSNI) Data Leak:
On 8 August 2023, an accidental breach occurred when data was mistakenly published on the WhatDoTheyKnow website in response to a freedom of information request. This exposed details including the surnames, initials, ranks, locations, and departments of all PSNI employees.
Electoral Commission Faces Cyber-Attack:
The UK’s elections watchdog disclosed that it had been the victim of a complex cyber-attack, potentially affecting millions of voters. The breach, carried out by unidentified hostile actors with access to the commission’s systems since August 2021, targeted copies of the electoral registers. The attackers also infiltrated the commission’s email and control systems. The commission emphasised that personal data from the registers did not present a high risk on its own but, combined with other publicly available data, could be used to profile individuals. It has been implementing additional security measures, and the Information Commissioner’s Office (ICO) is urgently investigating.
Norfolk and Suffolk Constabularies Data Breach:
A technical issue resulted in the inadvertent sharing of the personal details of 1,230 abuse victims in Freedom of Information (FOI) responses. The data included descriptions of various offences, such as sexual and domestic assaults. Notably, this is the second time in nine months that Suffolk Police has mistakenly exposed victims’ details. The ICO has announced that it is investigating.
These incidents underline the imperative for stringent data protection protocols across all sectors and the need for a swift response when breaches occur. We have written a series of articles on what organisations can do to contain data breaches.
Government Regulatory Activity
ICO Commences Public Consultation on Biometric Data Processing
The Information Commissioner’s Office (ICO) has launched a public consultation on its draft guidance concerning biometric data and its associated technologies. This first phase covers the biometric data draft, which explains how data protection law applies to the use of biometric data in recognition systems. A second phase, centred on biometric classification and data protection, is planned for next year and will include a call for evidence. The consultation runs from 18 August to 20 October 2023.
The guidance’s primary objective is to explain how data protection law applies when biometric data is used in recognition systems, clarifying the existing rules and proposing good practice. It is aimed at organisations and vendors that are considering or already using biometric recognition systems, whether as data controllers or processors. Key topics include how the UK GDPR defines biometric data, the circumstances in which it is special category data, its use in biometric recognition systems, and the data protection obligations that follow.
You can review the draft guidance on the consultation page.
Switzerland Ushers in Modernised Data Protection Era with revFADP
Switzerland enters a new era of data protection with the revised Federal Data Protection Act (revFADP), effective from 1 September 2023. This substantial reform overhauls Swiss data protection legislation dating from 1992, strengthening consumer rights and closely aligning with the EU’s General Data Protection Regulation (GDPR), although the revFADP retains some unique nuances. The revised law is accompanied by two pivotal ordinances: the implementing ordinance to the Federal Data Protection Act and the updated Data Protection Certification Ordinance.
A cornerstone of the revFADP is its “risk-based” approach: organisations are expected to identify potential risks across the data management lifecycle and take proactive measures. Those already adhering to the GDPR will find the transition smoother, albeit with some Swiss-specific considerations. The law’s scope now mirrors the GDPR, covering the processing of personal data relating to natural persons and excluding legal entities. Notably, the definition of sensitive data has expanded to encompass genetic and biometric data, with explicit consent required for its processing. Breaches of the revFADP can result in fines of up to CHF 250,000.
India Passes New Data Protection Law
India has passed a pivotal data protection law that outlines how tech companies must manage user data. The Digital Personal Data Protection Act, 2023 permits certain user data to be transferred abroad and grants the government the power to obtain information from companies. The law follows India’s decision to withdraw a 2019 privacy bill that had deeply concerned major tech firms because of its strict restrictions on cross-border data flows. It carries hefty penalties for violations, with fines of up to 2.5 billion rupees (about $30 million).
With almost one billion internet users, India has undergone a massive digital transformation, and the passage of this legislation is of key importance for UK and EU businesses that have customers or service providers in the country. As they continue to engage with India, understanding and adapting to this changing regulatory landscape will be crucial.
ICO and CMA Warn Against Deceptive Website Designs
The Information Commissioner’s Office (ICO), in conjunction with the Competition and Markets Authority (CMA), is urging businesses to stop using deceptive website designs that push consumers into disclosing more personal data than they intend to. Such practices include confusing privacy controls, default settings that give users little control over their personal details, and bundling privacy options in a way that nudges users into divulging excess information. According to the ICO and CMA, these tactics not only jeopardise consumer rights but can also stifle competition in the market.
When users visit a website, these misleading designs prompt them to make swift decisions about their personal information, from sharing contact details in the hope of securing discounts to giving up control over targeted advertising by accepting cookies. A notable concern is diminished consumer control over cookies. In response, the ICO has begun assessing cookie banners on the UK’s most visited websites and is prepared to act against designs that adversely affect users.
For an overview of how to stay cookie compliant, we have this handy guide available: Cookie Compliance
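By way of illustration only, and not taken from the ICO or CMA guidance, the sketch below shows one way a website’s cookie consent state could be structured so that non-essential categories are off by default, rejecting is as easy as accepting, and no non-essential tags load before an explicit opt-in. All names and categories are hypothetical assumptions for the example.

```typescript
// Minimal sketch of a non-deceptive cookie consent configuration.
// The category names and functions are illustrative, not a real API.

type ConsentCategory = "strictlyNecessary" | "analytics" | "advertising";

type ConsentState = Record<ConsentCategory, boolean>;

// Non-essential categories default to "off": no pre-ticked boxes
// and no bundled options that nudge users towards over-sharing.
const defaultConsent: ConsentState = {
  strictlyNecessary: true, // exempt from consent, always on
  analytics: false,
  advertising: false,
};

// "Accept all" and "Reject all" are symmetric, single-click actions,
// so declining takes no more effort than agreeing.
function acceptAll(): ConsentState {
  return { strictlyNecessary: true, analytics: true, advertising: true };
}

function rejectAll(): ConsentState {
  return { ...defaultConsent };
}

// Non-essential scripts load only after an explicit opt-in.
function loadTagsFor(consent: ConsentState): void {
  if (consent.analytics) console.log("Loading analytics tags...");
  if (consent.advertising) console.log("Loading advertising tags...");
}

// A user who dismisses the banner without choosing is treated as not
// having consented to anything non-essential.
loadTagsFor(defaultConsent); // loads nothing beyond strictly necessary
```

The key point of the design is symmetry: rejecting takes the same single click as accepting, and the defaults disclose nothing, which is the opposite of the bundled, pre-selected patterns the regulators are warning against.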
Enforcement Actions
Norway To Fine Meta £78,000 a Day Over Non-compliance
Norway’s data protection authority recently announced its plan to impose a daily fine of 1 million kroner on the owner of two major social media platforms for defying a local ban on using personal data for targeted ads. The regulator had temporarily prohibited behavioural advertising on the platforms, citing concerns about intrusive surveillance and potential violations of users’ data protection rights and their freedom of expression and information. The agency is particularly concerned about the risk to vulnerable groups, such as the young and the elderly, and the potential misuse of sensitive personal information for advertising purposes.
The ban, made public in mid-July, granted the tech giant a brief window until early August to implement necessary corrective measures, which it failed to do. Following a separate European Court of Justice ruling, the company recently signalled its intention to seek user consent for targeted advertising across the European Union, European Economic Area, and Switzerland. However, despite this policy shift, the regulator underscores that unlawful data processing remains a concern.
ICO Takes Action Against Five Public Entities
The ICO has taken action against five public entities, including Liverpool City Council, the London Borough of Tower Hamlets, the Medicines & Healthcare Products Regulatory Agency, the Ministry of Defence, and the Environment Agency, citing their failure to meet the expected standards in handling Freedom of Information Act (FOI) requests. This comes a year after the ICO unveiled its FOI regulatory manual, detailing its strengthened approach to enforcing FOI compliance in line with its legal powers.
Specifically, the Ministry of Defence and the Environment Agency have received enforcement notices requiring them to address requests that have exceeded the statutory 20-working-day deadline; some of these outstanding requests are years old. The London Borough of Tower Hamlets, Liverpool City Council, and the Medicines & Healthcare Products Regulatory Agency have been issued practice recommendations. Tower Hamlets and Liverpool City Council are under scrutiny for persistent delays in responding to FOI requests, with Liverpool City Council facing particular criticism owing to a surge in related complaints and subsequent intervention notices. The Medicines & Healthcare Products Regulatory Agency has been flagged for non-compliance with the FOI Code of Practice.
ICO reprimands NHS Lanarkshire for sharing patient data via WhatsApp
The ICO has reprimanded NHS Lanarkshire over the unauthorised sharing of patient data via WhatsApp by its staff over a two-year period. Between April 2020 and April 2022, a WhatsApp group of 26 NHS Lanarkshire staff members shared patient information, such as names, contact details, and addresses, on more than 500 occasions. Images, videos, and screenshots containing clinical data were also shared. Although WhatsApp was initially used only for basic communication at the onset of the pandemic, it was not approved by NHS Lanarkshire for these purposes and was used without the organisation’s knowledge. A non-staff member was mistakenly added to the group, leading to the unintended disclosure of personal data.
Upon discovering the breach, NHS Lanarkshire reported it to the ICO. The subsequent investigation found that the organisation lacked clear guidelines, policies, and evaluation mechanisms for WhatsApp use. John Edwards, the Information Commissioner, highlighted the importance of patient data security and the need for healthcare organisations to maintain stringent data protection measures. The ICO made a number of recommendations to NHS Lanarkshire, including introducing a secure system for transferring clinical images, thoroughly evaluating apps for personal data risks before deployment, and improving internal procedures for handling such incidents. NHS Lanarkshire has six months to report on the measures taken following the reprimand.
How can we help you?
At DPAS, we provide Data Protection and Information Security Consultancy and Training internationally. We support businesses in achieving their organisational objectives by transforming data protection compliance from an obstacle into a value-added asset.
Take a look at our website to see what we can do for you.