DPAS Data Protection Bulletin – March 27 2025


Welcome back to our monthly DPAS bulletin, where we cover the latest data protection news from all around the world.

How have deepfake celebrity ads been used in recent scams? What recent ChatGPT hallucination resulted in Noyb’s second complaint against them? And what confidential group chat did the Trump administration accidentally add a journalist to?

Read about all this and more in our latest DPAS Data Protection Bulletin.

Georgia-based group scams thousands using deepfake celebrity ads

A group of scammers based in Tbilisi, Georgia, has scammed millions out of unsuspecting victims through the use of celebrity deepfakes.

The group took a total of $35 million (£27 million) from people across the UK, Europe, and Canada, tricking them with fake celebrity ads promoting fraudulent cryptocurrency and misleading investment schemes. Among those whose faces and voices were used in these fake ads, displayed across Google and Facebook, were Ben Fogle, Zoe Ball, and money-saving expert Martin Lewis. About a third of the money stolen was taken from UK victims.

Read more about this here.

ICO posts about their dedication to protecting children’s privacy

The Information Commissioner’s Office has recently shared an article outlining their dedication to protecting children’s privacy, particularly in digital online spaces.

Research conducted by the ICO has shown that 42% of parents in the UK feel they have “little to no control” over the data that social media and video platforms collect about their children. Furthermore, 23% of the public say that either they or their children have abandoned certain platforms due to concerns and uncertainties about what information of theirs is collected. To combat these issues, the ICO has made several efforts, such as contacting numerous organisations about strengthening their protections to better safeguard children’s privacy.

Read more about this here.

Information Commissioner gives keynote speech at IAPP event

John Edwards, Information Commissioner, gave a keynote speech at IAPP’s recent Data Protection Intensive UK 2025 event, in which he reiterated the importance of understanding how poor data protection practices can have a “ripple effect” and greatly impact people’s lives.

“Imagine a person fleeing a violent domestic relationship, only to have their new address accidentally shared with their abuser.

Or someone who has their HIV status and confidential medical information disclosed without their consent.

These are real cases. They happen. And we wanted to change the conversation around these mistakes being seen as ‘admin errors’, to help organisations see that when they mishandle personal information, it can have a ripple effect of damage and distress far beyond their understanding.”

In his speech, Edwards also outlined some of the ICO’s current plans and activities, such as:

  • Raising awareness of data protection
  • Protecting children on social media
  • Addressing AI and biometrics

The Commissioner concluded his speech with the key message: “Remember the person behind the data. Don’t try and push onto them the responsibility for your compliance. It’s not their job.”

Read more about this here.

ICO shares how their activities are aiding economic growth

The ICO has written about how they feel their approach to regulation has been supporting economic growth in the UK.

Their article states that by helping “tens of thousands” of businesses to improve their practices, the ICO has enabled them to “invest and innovate with confidence”. This greater confidence in the lawfulness and effectiveness of their practices has aided the flow of data, and according to the ICO, these efforts have unlocked an estimated £140 million of value for UK businesses.

Read more about this here.

Hearing between Apple and UK Government occurs behind closed doors

Following an order from the UK Government for Apple to grant it backdoor access to the Advanced Data Protection (ADP) service, the US firm launched an appeal with the Investigatory Powers Tribunal. Now, a recent hearing in this legal battle has taken place behind closed doors, with several UK media organisations (such as The Guardian and the BBC) being denied entry.

A group of US lawmakers have since called on the tribunal to make this hearing (as well as any further proceedings) public, and “remove the cloak of secrecy” around the initial order for access that Apple received from the UK Government.

Read more about this here.

Noyb files complaint against OpenAI for “child murderer” hallucination

Austrian privacy rights group Noyb has filed its second complaint against OpenAI, after a dangerous ChatGPT hallucination painted one user as the murderer of his own children.

Norwegian user Arve Hjalmar Holmen, searching to see what information ChatGPT would share about him, was shocked to see the AI claim that he had been convicted of murdering his own two sons, who it said were found dead in a pond, and sentenced to 21 years in prison. In response, Holmen said: “Some think that ‘there is no smoke without fire’. The fact that someone could read this output and believe it is true, is what scares me the most.”

Read more about this here.

Online Safety Act officially comes into force

17th March was a significant date for online platforms in the UK, as the Online Safety Act (OSA) officially came into force.

The Act requires platforms to take additional measures to protect online users against illegal content, with priority given to criminal content such as child sexual abuse material (CSAM). Ofcom is responsible for assessing compliance with the new Act, and warns that enforcement action awaits any organisation that fails to comply with its legal duties. Penalties can take the form of fines of up to 10% of global turnover or £18m – whichever is greater.
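As a rough illustration of how the “whichever is greater” penalty cap works (the function name and turnover figures below are our own, not Ofcom’s), the maximum fine is simply the larger of 10% of global turnover and £18m:

```python
# Hypothetical sketch of the Online Safety Act penalty cap:
# the maximum fine is the greater of 10% of global turnover and £18m.

def osa_max_fine(global_turnover_gbp: float) -> float:
    """Return the maximum possible OSA fine for a given global turnover."""
    return max(0.10 * global_turnover_gbp, 18_000_000.0)

# A firm with £500m global turnover faces a cap of £50m,
# while a firm with £20m turnover still faces the £18m floor.
print(osa_max_fine(500_000_000))  # 50000000.0
print(osa_max_fine(20_000_000))   # 18000000.0
```

In other words, the £18m figure acts as a floor for smaller organisations, while the 10% rule scales the cap up for large platforms.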

Read more about this here.

Amazon to discontinue Echo’s “do not send voice recordings to cloud” feature

Amazon has recently announced the impending discontinuation of a feature available on some of their Echo speakers that disables the sending of users’ voice recordings to the cloud. Users of the three particular devices that previously had this feature will no longer be able to have their requests processed locally, and can instead opt to have their recordings deleted immediately after being processed in the cloud.

Amazon has decided to end support for this feature because it is expanding Alexa’s generative AI capabilities, which require voice data for training.

Read more about this here.

Plans announced for Meta AI to roll out in the EU

Meta has revealed plans to launch its chatbot-like Meta AI features in the EU, despite concerns and challenges in recent months over how the tech giant processes its users’ public data for AI training.

For the time being, this will be limited to a feature baked into Facebook, Instagram, and WhatsApp, described as “intelligent chat”, where users can call upon Meta AI to answer their questions, functioning like an advanced search engine within a messaging app. Meta claimed in its announcement that this AI has “an advanced understanding of what you’re looking for”, but assures users that this does not mean it makes personalised suggestions based on their data.

Read more about this here.

Trump administration accidentally shares war plans with journalist via group chat

Jeffrey Goldberg, Editor-in-Chief of The Atlantic, was mistakenly added to a group chat in which members of the Trump administration were discussing their plans to launch airstrikes on Yemen.

This group chat, which included officials such as Vice President JD Vance, Defence Secretary Pete Hegseth, National Security Adviser Mike Waltz, and Director of National Intelligence Tulsi Gabbard, took place on the encrypted messaging service Signal. Goldberg had received a connection request from Waltz and was invited to the chat days later. When asked about the group chat, President Trump claimed to know “nothing about it”.

Read more about this here.

Have you ever felt at all stuck on which of the six lawful bases set out in the GDPR best applies to your processing activity?

Businesses process data all the time for a variety of purposes, from marketing to simply carrying out business as usual. But while your reason for using this data may be clear, what might give you pause is deciding which of the lawful bases is most appropriate for what you’re doing.

It can be difficult to know which one best applies to your situation. Is there legitimate interest? Did the subject give valid consent? What’s the answer?

Our panel will be discussing:

  • How to select the appropriate lawful basis
  • What each lawful basis means in simple terms
  • How different ones can apply

…and more!


Panel:

Rowenna Fielding (Miss IG Geek Ltd)

Tash Whitaker (Whitaker Solutions Ltd)


Register for free here.

GET IN TOUCH WITH US!

If you need any support in ensuring your organisation complies with the relevant legislation, or require training in the areas of data protection and information security, get in touch with us.

Either call us on 0203 3013384, email us at info@dataprivacyadvisory.com, or fill out a contact form. Our dedicated team will get back to you as soon as possible.
