DPAS Data Protection Bulletin – November 28 2024

Welcome back to our monthly DPAS bulletin, where we cover the latest data protection news from all around the world.

How are cybercriminals targeting shoppers around Black Friday? What smart devices are capturing more of your data than you thought? And how are data sharing issues affecting the NHS?

Read about all this and more in our latest DPAS Data Protection Bulletin.

ICO publishes response to new DUA Bill

The Information Commissioner’s Office (ICO) has released its response to the newly announced Data (Use and Access) Bill.

Overall, the ICO approves of the changes included in the Bill. Following its analysis of the various amendments, it states that they are “pragmatic and proportionate” to the UK regulatory landscape, “align well with the ICO’s enduring objectives”, and “strike a positive balance and should not present a risk to the UK’s adequacy status”. However, the ICO notes that some changes could benefit from more clarity, such as the use of the single term “Data Protection Test” to refer to both the test for adequacy and the test for appropriate safeguards, and the specifics of opportunities for human involvement in automated decision-making.

Read more about this here.

Journalist falsely labelled criminal by AI hallucinations

Martin Bernklau, a German journalist, was surprised to find that Microsoft’s AI tool, Copilot, seemed to think that he was a violent criminal.

When he typed his name into the tool, the information he received about himself had been confused with cases he’d written about. Bernklau wasn’t thrilled to be faced with AI hallucinatory outputs labelling him a drug dealer, a con-man preying on widowers, a psychiatric institution escapee, and a child molester – a crime which, according to Copilot, he had confessed to and felt remorse for. If that wasn’t enough, the tool also listed his real phone number and home address.

Read more about this here.

Which? finds that smart devices are collecting more personal data than necessary

According to a new study conducted by consumer group Which?, our kitchen appliances may be doing a little more than just heating our food.

Tests were carried out on a range of common household devices and concluded that various manufacturers may be crossing the line when it comes to their users’ privacy. For example, air fryers made by Xiaomi, Aigostar and Cosori were found to record audio on their users’ phones for no clear purpose, with those produced by the first two having sent data to servers in China, believed to be used for targeted advertising. Also found to have less-than-ideal privacy practices were companies like Samsung, whose smart TVs apparently request eight “risky” phone permissions.

Read more about this here.

ICO shares data protection considerations for using AI in recruitment

After recently auditing a number of providers and developers of AI tools for recruitment (and having found multiple areas for improvement), the ICO has published advice comprising questions that recruitment organisations should consider when procuring these tools to aid their processes.

These are:

  1. Have you completed a DPIA?
  2. What is your lawful basis for processing personal information?
  3. Have you documented responsibilities and set clear processing instructions? 
  4. Have you checked the provider has mitigated bias?
  5. Is the AI tool being used transparently?
  6. How will you limit unnecessary processing?

The ICO highlights the importance of using these AI tools responsibly, as the consequences of unlawful use may be the unfair exclusion of candidates from opportunities, or even the compromise of their privacy.

Read more about this here.

Coroners warn of NHS patients dying due to clinicians failing to access data

Coroners in England and Wales have issued warnings – 36 in total this year – that NHS patient information isn’t being adequately shared, with some patients dying due to clinicians’ inability to access important health data.

For example, a three-year-old boy tragically passed away from a streptococcal infection after contracting chicken pox. This could have been avoided if the NHS 111 advisor had been aware that the boy had Down’s syndrome, as in that case, the boy’s mother would have been advised to take him to hospital immediately. In another case, an A&E patient’s digital information was unavailable, leaving mental health staff unsure as to why she had been taken there. As a result, they discharged her, and she sadly took her own life the following day. Health secretary Wes Streeting has said that we “desperately need to modernise our health service” and make it “more efficient”. However, Labour’s plan to store each NHS patient’s health data in one place has raised concerns among privacy campaigners that it will pose a risk to patient confidentiality.

Read more about this here.

UK cybersecurity chief warns of AI being used to trick shoppers

According to Richard Horne, chief executive of GCHQ’s National Cyber Security Centre (NCSC), the festive period has become “prime time for cybercriminals”, who have been taking the opportunity to target unsuspecting buyers with increasingly advanced scams, “sometimes crafted using AI, making them harder to detect”.

During the last holiday period (between November 2023 and January 2024), over 16,000 reports of online shopping fraud were recorded, with an average loss of £695 per victim. In total, more than £11.5 million was lost to fraudsters during this time, and notably, 43% of the reports made to Action Fraud in this period mentioned a certain social media platform, with online marketplaces mentioned in almost 20%.

Read more about this here.

Bedford Borough Council denies housing interviews held in the open breach the law

Recently coming under scrutiny is Bedford Borough Council, which has been criticised for conducting “sensitive” housing interviews in an open-plan office rather than privately, behind closed doors. The issue, raised during a Housing Committee meeting in mid-November by a member of the public, concerned the council’s duty to ensure its services comply with data protection law by design and default, something these interview practices were considered to breach.

Head of Housing, Homelessness, and Customer Services, Anna Robbani, denies that the conduct of these interviews breaches the law, stating that “interview rooms are available upon request” and “the numbers of people in the waiting area are very controlled”. Robbani also mentions the existence of privacy screens and options for home visits. To make it clearer that interview rooms are available to everyone, Robbani says that she has “put forward with the team an action to put signage up in the Hub”.

Read more about this here.

Irish data protection authority to potentially become the national watchdog for AI

Due to an abundance of questions regarding the use of AI, Ireland’s data protection authority has made a request to the European Data Protection Board (EDPB) for details on how they can handle these queries. The Irish DPC now awaits the EDPB’s decision.

“That’ll then give us some further guidance on how to address these issues,” says data protection commissioner Dale Sunderland. “For example, does personal data continue to exist within the training model?”

Regarding the AI Act going forward, it’s now up to Ireland’s incoming government to make a decision on which authority will oversee national compliance.

Read more about this here.

ICO promotes responsible data sharing for International Fraud Awareness Week

For International Fraud Awareness Week, the ICO has shared advice on appropriate data sharing for organisations when it comes to preventing data-enabled scams.

The advice they provide includes:

  • Carry out a Data Protection Impact Assessment (DPIA)
  • Be clear about responsibilities
  • Set up data sharing agreements
  • Identify a lawful basis
  • Understand the type of information being shared
  • Comply with the data protection principles
  • Respect people’s rights

With this advice – aimed particularly at organisations such as banks and telecommunications providers – the ICO hopes to encourage “appropriate and confident data-sharing” in the fight against fraud.

Read more about this here.

Peter Kyle announces priorities for Ofcom in preparation for OSA enforcement

Technology Secretary Peter Kyle has set out his strategic priorities for online safety regulator Ofcom, in preparation for next year’s enforcement of the Online Safety Act (OSA).

The five key areas outlined as priorities are:

  1. Safety by design
  2. Transparency and accountability
  3. Agile regulation
  4. Inclusivity and resilience
  5. Technology and innovation

Kyle states that his specific plans include keeping up with the evolution of technology to effectively protect children, commencing research that will help build the evidence base needed to prioritise children’s online safety, and “baking safety into social media sites from the outset”.

Read more about this here.

UK Parliament debates police use of facial recognition

The use of facial recognition by the police was a topic of debate in Parliament recently, with MPs gathering to discuss the need to use the technology to aid law enforcement, but without compromising the confidence and privacy of the public.

Dame Diana Johnson MP, Policing Minister, stated that we must “think about how we protect the public from potential misuse of those technologies” and “consider how the application of the rules and regulations is scrutinised.” The lack of a legal framework for live facial recognition (LFR) was a cause for concern among the MPs, with Sian Berry (Green Party MP for Brighton Pavilion) calling the issue of this “unlawful area” an urgent matter. “The legal framework needs to be strengthened to ensure that the use of technology is transparent, accountable and subject to rigorous oversight,” claimed Iqbal Mohamed, Independent MP for Dewsbury and Batley.

According to Johnson, before the year ends, a series of roundtables will be held with a number of different groups, such as regulators and civil society groups, with funding continuing to be provided to police forces for LFR in the meantime.

Read more about this here.

Data (Use and Access) Bill undergoes second reading in House of Lords

The DUA Bill continues its journey through the House of Lords, with the second reading having occurred on 19th November.

Among the discussed issues during this reading were:

  • The robust monitoring and enforcement of codes of conduct, with penalties for breaches
  • The accountability framework in GDPR and how it should be strengthened
  • The softening of accountability measures, for organisations relying on automated decision making in particular
  • Simplified cookie consent rules to reduce unnecessary user friction
  • The weakening of privacy protections for vulnerable groups, like children

The Bill is now to enter the committee stage in early December.

Read more about this here.

ENGAGE, EDUCATE, EMPOWER 2025 – FREE CONFERENCE

In case you missed it, in February we’re bringing Engage, Educate, Empower back for 2025! This free data protection and information security conference is the perfect place to connect with new people, join the buzzing discussions about today’s challenges, and hear a range of perspectives on the pressing topics and issues surrounding the modern privacy world.

Our 2025 conference will follow the same theme as previous years’ Engage, Educate and Empower events, aiming to educate colleagues across the industry on topics in data protection, information security and AI. We have a host of industry experts ready to deliver engaging sessions aimed at educating DPOs from a range of private, public and third sector organisations.

Read more about this conference and book your free ticket here.

GET IN TOUCH WITH US!

If you need any support in ensuring your organisation is complying with the relevant legislation, or require training in the areas of data protection and information security, get in contact with us.

Either call us on 0203 3013384, email us at info@dataprivacyadvisory.com, or fill out a contact form. Our dedicated team will get back to you as soon as possible.
