A parent's guide to deepfakes (2024)

Children and young people might struggle to recognise fake videos and audio, especially as the technology becomes more sophisticated. At the same time, the growing accessibility of AI tools means more people can create deepfakes, which widens the reach of potential harms.

False information and propaganda

People online can use deepfakes to:

  • spread false information;
  • undermine trust in public figures; or
  • manipulate conversations around politics and other important issues.

Children are still developing critical thinking skills, so they are particularly vulnerable to believing this kind of information.

Reputation damage

Some reports link deepfakes to revenge p*rn. In these cases, the perpetrator inserts the victim's likeness into compromising content, such as p*rnographic videos or imagery.

The perpetrator might then use these deepfakes to coerce victims, for example by demanding payment or real images in exchange for not sharing the deepfakes more widely. Learn more about this practice with our guide to sextortion.

Cyberbullying, harassment and abuse

Perpetrators might also use deepfakes to bully others by creating videos meant to mock, intimidate or embarrass them.

The nature of deepfakes might make the bullying more severe for the victim. It might even border on abusive behaviour. Learn what child-on-child abuse might look like.

A 2023 Internet Watch Foundation (IWF) report warned of increasing AI-generated child sexual abuse material (CSAM). They identified over 20,000 of these images posted to one dark web CSAM forum over a one-month period. They judged more than half of these as “most likely to be criminal.”

While this number doesn’t include deepfakes, the IWF says “realistic full-motion video content will become commonplace.” They also note that short AI-generated CSAM videos already exist. “These are only going to get more realistic and more widespread.”

Financial loss and scams

Some audio deepfakes or voice cloning scams cause victims to lose money. Public figures have also had their likeness used to promote scam investments.

One example involved YouTuber Mr Beast, who seemed to offer his followers new iPhones; in reality, it wasn't him at all. Because YouTube is popular among children and teens, deepfakes that mimic their favourite creators can leave them open to these scams.

Examples of deepfake scam videos

Video transcript

Well, let's stay with technology, because artificial intelligence is fuelling a boom in cybercrime. The cost is expected to hit 8 trillion dollars this year, more than the economy of Japan, according to one estimate by cybersecurity experts. The world's biggest YouTuber is among those who've had their video image manipulated by AI to promote a scam, and even BBC presenters are not immune.

Have a look at this: “British residents no longer need to work. That's the announcement made by our guest today, Elon Musk, who will unveil a new investment project while the connection is going on. I will tell you more about this project that opens new opportunities for British people to receive a return on investment. More than three billion dollars were invested in the new project, and it is already up and running at the moment.”

Strange, isn't it? Looks like me, sounds like me, you may say. It's kind of hard, isn't it, to get your head around this? And so I spoke to Stephanie Hair, who is, I should say, an independent technology expert.

Stephanie Hair, an independent technology expert, is interviewed about the difficulty of spotting deepfakes.

“Many people find it so difficult to spot these things,” Hair says. “And you're not the first, and I don't think you're going to be the last, unfortunately, because there's nothing really to stop this from happening. There's no regulation really to hold anybody to account. I'm not sure who you would get any joy from if you wanted to sue, for example.”

The presenter continues: “Now, we got onto Facebook and said you need to take this down, it is fake, and they have done that; Meta has done that since. However, there are plenty more deepfake videos out there pointing viewers to scams, and the worry is people are actually parting with their own money because they believe they're genuine. And this is the worry: how do people tell between what is real and what is not?”

“Honestly, I'm not sure about trying to tell from the sort of technical limitations, because we were able to see even with your video that there were certain things that were not quite right. I made it very clear that it wasn't legit. What you really want to be doing is asking yourself: if it sounds too good to be true, it probably is. If it seems a bit weird, it probably is. There is no such thing as a free lunch.”

“There certainly isn't, and I've tweeted, or shared, an article about this written by the BBC that gives you top tips on how to spot deepfake videos.”
