“What do you actually do in data protection?”

If I had a pound for every time someone asked me ‘What do you actually do?’, I’d probably be retired. But here I am, writing this instead. I used to brush the question off with a sweeping statement that minimised what my role actually involved – usually something about law or legal compliance – and move the conversation on to someone’s ‘more interesting’ job. But I was early in my career and still getting a handle on the fundamentals; my focus at the time was learning the legislation, the environment, and the practicalities.

Fast forward a ‘few’ years, and I now find myself in a place where, actually, I would never describe my career as being about legal compliance. At some point a switch flicked and I realised my role was about advocating for rights, and about trying to educate others on the real-life implications of all that legal compliance ‘stuff’.

So, I am going to make a point of writing something that answers the ‘What do you actually do?’ question in a way that I hope resonates with more people than my old ‘legal compliance’ answer ever did – in a way that makes you see why I think this work is so important, and why it matters more than most people realise.

ARTIFICIAL INTELLIGENCE

I know what you’re thinking – another AI blog. But stick with me. 

It is impossible to escape the AI conversation these days. Even my nan was telling me a story not long ago about ‘AI Granny’, created by O2 to waste scammers’ time; she had seen a feature on it on the ITV show ‘This Morning’. My friends all have the ChatGPT app to hand, and every other client I speak to is asking how they can implement AI technology in their organisation.

Now, I am all for innovation, I am all for advancements, and I am all for utilising technology to streamline workflows, to access more knowledge than ever, and, frankly, to have a bit of fun. However, as with all things, not everybody is singing from the same hymn sheet.

I would say 80% of the people I speak to about AI, in any capacity, have no idea about the darker side of the technology being created, or about the risks it presents – specifically, the risk to us as individuals. I spend a lot of time talking about risks within businesses, and often don’t have the opportunity to talk about the more day-to-day risks that some of this technology introduces.

There is a plethora of technology available at the click of a button – websites to visit, apps to download. Whilst most of us can access them, they are not all designed to be utilised by ‘everyone’.

Those most at risk of this technology harming them? 

Women.

Let’s look at deepfakes for example… 

What is a deepfake?

A deepfake is a type of artificial intelligence generated media that manipulates or replaces someone’s likeness in videos, images, or audio recordings. While deepfakes can be used for entertainment, satire, or filmmaking, they have also been linked to misinformation, identity fraud, and other unethical uses. 
Effectively, they are videos (most commonly) that look ‘real’ but aren’t. You have probably seen the deepfakes of Joe Biden and Donald Trump, even if you didn’t realise it at the time.

The BBC also recently produced a drama called ‘The Capture’ that centres on the use of deepfakes. It’s worth a watch – but a TV programme that so many would view as a James Bond-esque, futuristic look at what technology ‘might’ do is actually a very current look at what technology CAN do.

The TV show has a political theme, but one of the most prevalent unethical uses of this technology is more wide-reaching: deepfake porn.

Yep, you read that right.

What is deepfake porn?

Well, exactly what it says on the tin: pornographic content that is created using deepfake technology. So now that fake video could feature you engaging in sexually explicit acts – nice, right? Just another thing to add to the list of ‘crap things on the internet’.

In 2023 an analysis of deepfakes online revealed that 98% of deepfakes are pornographic, and 99% of the victims are women.

You might be thinking that it’s a depraved joke reserved for celebrities, influencers, and those in the public eye. The likes of Taylor Swift, Jenna Ortega, Alexandria Ocasio-Cortez and Giorgia Meloni have indeed all been victims of this hideous ‘trend’, but there are thousands of ‘normal’ women who have had the same unfortunate experience – many of whom are likely unaware that there are falsified images of them circulating the dark corners of the internet.

This isn’t technology that is exclusive to the elusive ‘Dark Web’ or secret circles; it is only a Google search away from absolutely anyone. You do not need any technical expertise, or any knowledge of artificial intelligence, image manipulation or editing – you simply need to know how to use a web browser.

There are websites, and downloadable apps, that exist solely to ‘nudify’ women. You simply upload a photograph of someone and the technology does the rest, producing a photograph – or a video, if you choose – of the individual, undressed. You may be thinking, ‘well, you can upload an image of a man’. You’re right, you could, but the result would be a man’s face superimposed on a woman’s body – what a display of equality… sigh.

After creation, these images and videos find their way to dedicated forums and groups where users share their ‘art’, swap tips on improving outputs, and post their own lewd content demonstrating how they enjoy the hideous imagery they have created without any consent from their victims.

There are a multitude of reports from young girls who have been victims of this sort of abuse – some just teenagers. Many are unaware of the content until contacted by friends and acquaintances; just imagine getting that phone call, or opening that text. I can remember being around 18 when a local girl’s photographs were posted on Facebook following a break-up. There was more chatter of ‘have you seen this?’ than ‘is she okay?’ – and that is the problem.

Data from the UK-based Revenge Porn Helpline shows that image-based abuse using deepfakes has increased by more than 400% since 2017.

I find it hard to believe that we are living in a world where technology is being used to improve cancer survival rates, wildlife conservation, food waste reduction, and humanitarian aid, yet huge numbers of individuals would rather contribute their time and energy to creating life-damaging images of women.

The reality is that this is a data protection issue. An issue that my job role quite literally exists to prevent.

The good news

The good news is that there is a genuine appetite to stop it. 

The UK’s Online Safety Act 2023 amended the Sexual Offences Act 2003 to criminalise the sharing of intimate images that appear to depict another person without their consent, encompassing deepfake content. However, the production of the material remained a legal grey area, prompting further legislative proposals to close this gap.

In January 2025, the UK government announced plans to criminalise the creation and distribution of sexually explicit deepfakes without consent. This move aims to directly address the growing misuse of AI technology to produce realistic but fake intimate content, which has disproportionately targeted women and girls.

Following advocacy efforts, the government reconsidered an amendment to the Data (Use and Access) Bill (DUAB) that would have required victims to prove the perpetrator’s harmful intent in deepfake cases. This shift underscores the importance of consent-based laws that protect victims effectively.

There are some tremendous women working tirelessly in this space. For anyone interested in further reading, I encourage you to look at Clare McGlynn’s work on image-based abuse, specifically deepfake porn – her website provides an array of information and links to key resources and articles that dive deeper into this epidemic than my role will ever allow.

Although progress is being made, and recognition from key individuals is growing, we cannot afford complacency. The reality is that technology is advancing faster than legislation.

Conversations like this – acknowledging the risks, challenging the misuse of AI, and advocating for stronger protections – are crucial. The fight against AI-fuelled image-based abuse isn’t just about law; it’s about changing attitudes, demanding accountability, and ensuring that innovation serves us all in a safe and ethical way.

So, for those of you who want to know why I chose to stick with a career in data protection after years (and thousands of pounds) of university-level education in a different field, or who still want to know ‘what do you actually do?’ – this is why, and this is what. I get to be part of an industry that is contributing towards change in areas you would never imagine, areas that genuinely impact all of us on a level none of us could have predicted years ago.
