Top image: Marc Clarence Beraquit / RICE File Photo
There, I said it. It’s deplorable, it’s concerning, but the Sports School deepfake incident wasn’t a big shocker.
Again, it was disappointing and worrying that teenage boys were creating fake nudes of their schoolmates. But given the obscene Telegram channels that have popped up over the years, this seems almost predictable.
As an aunt to two young ladies aged 16 and 18, though, I can’t help feeling helpless. Sure, the school informed the victims’ parents, questioned the students involved, confiscated all their mobile phones and devices, and meted out “disciplinary actions”.
But how do we know we’ve truly stopped them from repeating their actions? Or that they really grasp the harm and damage they’ve done?
There’s little to stop them. A quick keyword search can lead them straight back to deepfake sites. In an age when you can ask ChatGPT anything, it’s scarily simple to churn out a deepfake. It’s free, requires no programming skills, and in less than 30 seconds, another woman could become a victim.
Realistically speaking, anyone with an online presence is a potential deepfake victim. So how do I even begin to warn my young nieces about dangers like deepfakes? You’d think talking about topics like sex and online harms would get easier with experience.
But as the thought continues to weigh on me, it’s worth asking ourselves: Why are these topics so hard to bring up when you’re the adult, the “protector”?
Is It Punishment Enough?
The shock of the Sports School incident might have made deepfakes the centre of local conversation for a while, but online deepfakes have been around since at least 2017.
It started with doctored explicit clips of celebrities; the one of ‘Gal Gadot’ having sex with her stepbrother comes to mind. As the technology spread, influencers found themselves falling victim. And now, private individuals (and minors!) have to reckon with the possibility of innocuous images of themselves being spliced onto nude bodies.
This incident is much, much more than just kids mucking around online. It’s insidious that minors are objectifying young girls. And it’s a troubling indicator of how they view women.
Like a good number of online commenters I saw, I was angered by the Sports School deepfake scandal. Some even called for the perpetrators to be expelled.
I’d argue, though, that beyond the kneejerk anger, it’s also crucial to ask: How did this happen in the first place? And how are we going to keep it from happening? Are these kids outliers? Or are they a symptom of a deeper rot festering unnoticed?
Yes, the school curriculum covers online dangers. It teaches kids to “stay alert”, look out for one another, and seek help from trusted adults.
Here’s a hard pill to swallow. Who do the kids turn to when the very lessons they’re taught in the classroom contradict their reality—that they’ve fallen victim to the people they spend time with and trust?
And is enough emphasis being given to deepfake technology’s role in the spread of non-consensual explicit images? To combat the rise of deepfakes, the Singapore government has set out to develop new detection tools, but these are largely aimed at preventing misinformation and scams.
I can’t help wondering: What about deepfake nudes? They’re understandably harder to detect as they can be circulated anonymously or in private chat groups. But perhaps more resources can be devoted to studying the phenomenon and ways to curb it. After all, it’s a new-age crisis that’s dangerous enough for South Korea to have acted on it.
In other words, there has to be more we can do aside from telling our young kids to be wary of online predators.
The Gender Divide
When I was younger, I resented being told what to wear. Why blame the victim? Why not tell the perverts not to stare?
Now that I’m older, I find myself wanting to tell my nieces to be wary of what they upload online.
The boys involved in the deepfake scandal were caught, but how many more wolves in sheep’s clothing are lurking out there who haven’t been? The fact is that in this case, and in the majority of non-consensual intimate imagery (NCII) cases overseas, women are the victims.
Sensity AI, a research company that has tracked online deepfake videos since December 2018, estimates that between 90 and 95 percent of deepfakes are non-consensual porn. And, surprise, surprise, 90 percent of this non-consensual porn is of women.
My nieces spend their time outside school simping over K-pop idols with their friends. They’re still embarrassed to talk about the cute boy in their class. You know, normal teen things. It boggles my mind that boys their age could be spending a ridiculous amount of time splicing nudes of their classmates instead of, say, playing Roblox or Minecraft.
I have to admit, I still see my nieces as girls. Telling them about the realities of womanhood almost feels like bursting their bubble and destroying their innocence.
But it’s a hard truth. Some still see women as objects of reproduction, caregiving, and sexual pleasure. Look no further than the people dehumanising women and manipulating their images for their own pleasure.
Adam Dodge, the founder of advocacy group EndTAB, says it’s a power play: “What a perfect tool for somebody seeking to exert power and control over a victim.”
In a perfect world, boys would know better than to create deepfakes. But the fact is that we don’t live in such a world.
My nieces’ lives are documented in detail on socials—new internship, new boyfriend, and, of course, an endless supply of selfies and #ootds. Nothing wrong with that. But I worry that their active social presence could leave them vulnerable to deepfake manipulation. And I don’t want to wait till something happens to them to have that conversation.
Still, topics like sex and harassment hang heavily in the air. When is the right time to ask, “What do you know about consent?”
Irony Of Overprotection
Part of it is denial: I’m not ready to accept that my nieces are old enough, that they’re responsible young adults now.
The other part is the instinct to protect them from the horrors the outside world can inflict on them. In my mind, they’re still ‘pure’ and ‘innocent’, and I want them to remain that way.
Somehow, I feel that opening the can of worms around themes like sexuality would end up tainting their minds or, worse, encouraging behaviours or ideas that should be avoided until they’re “age-appropriate”.
It’s this hesitation that prevents many adults—myself included—from having open, honest discussions. Then there’s a fear that a conversation about deepfakes might open the floodgates to other complicated questions or uncomfortable situations.
The irony of overprotection is that it usually ends in a paradox: kids who are simultaneously overexposed and underprepared.
No matter how much we lean into our misplaced desire to delay conversations about things like deepfakes, we have to accept that our kids will encounter them sooner or later.
Leaving them to their own devices means we’ll have no choice but to accept the uncomfortable truth: that kids are learning about these things from somewhere. The world is a scary place, no matter how old we are. Pretending that it will all be okay and sweeping realities under a proverbial rug isn’t doing them any favours.
This sports school incident has left a bad taste in my mouth. It’s also forced me to face the truth that there’s never a right time for hard conversations. I just wish it didn’t have to come so soon.