Singapore Man Rejects $8,000 Hacker Demand After Deepfake Porn Uses His Images

The Unwanted Deepfake: How Owen’s Privacy Was Invaded

Imagine you’re just scrolling on your phone, and before you know it, your face is in a porn video that never existed. That’s exactly what happened to 20‑year‑old Singaporean Owen, who found himself in the center of a deepfake crisis after a mysterious overseas call.

The Unexpected Call and the First Red Flag

  • Owen answered a call from a random UK number while half-asleep, and that brief call apparently gave a hacker a way into his phone.
  • Within seconds, the intruder scrambled his device, siphoning contact lists and media files.
  • That stolen data was used to graft Owen’s likeness onto a pornographic clip that was never shot with his consent.

One of Owen’s contacts, Ednes Lee, received the video on April 21 and immediately posted her disbelief on Facebook. The post read:

“BRB (Be right back), time to go wash my eyes. The person literally sent me the whole damn video to see if I recognise him. Oh dear god.”

Enter the “Hacker” – The Deepfake Mastermind

Days later, Owen’s police report revealed his side of the story. According to him:

“I was sleeping. I never opened my eyes then I answered already… I thought it’s my manager calling me.” He claimed that the brief interaction was enough to give the hacker a foothold.

The user, who identified himself in the message thread as Lori, threatened to release the footage unless Owen paid a hefty sum. In Chinese, Lori warned:

“It will only infuriate me and I will make your reputation suffer more. As long as you listen to us and pay a small fee, we will delete the footage and not bother you.”

The Chain Reaction

Owen’s account, along with his mention that friends had been caught in the same scheme, prompted a police investigation once he filed a formal report. The incident is a stark reminder of how deepfake technology can be weaponised.

Beyond the Gamer’s Meme: Deepfake’s Dark Side

While some folks delight in AI‑crafted memes, history shows more sinister uses:

  • In March 2019, fraudsters used an AI-cloned voice to trick a UK energy firm’s CEO into making a fraudulent transfer of US$243,000 (≈ S$336,000).
  • The deepfake that targeted Owen is another clear example of AI masquerading as reality.

Takeaway

Deepfake tech isn’t just a joke for gamers; it can be turned into a weapon against ordinary people. If you ever suspect your phone has been compromised, keep your software updated, be wary of calls from unknown numbers, and remember that your privacy is a right.

For more unsettling stories on how AI is blurring the line between truth and fiction, stay tuned.

Email for further details: [email protected]
