But I Didn’t Know: That Video Might Be Fake. The Harm Isn’t.

Deepfakes, grooming and consent: what Canberra families and schools can do now.

Last week, a Canberra teen told her mum a video of her was “going around.” It looked real. It wasn’t. That is the new reality: AI can fabricate faces, voices and chats that feel true enough to hurt.

But I didn’t know… how fast this is moving, how common it’s becoming, and how much our kids need us to talk about it early, without shame.

“Being tricked by AI is not a failure. It’s a feature of the tech.”

What is a deepfake, really?

A deepfake is media that has been synthetically created or altered to look and sound real. Using off-the-shelf tools, someone can map a person’s face onto another body, clone a voice from a short audio sample, or generate a fake chat or image that appears authentic. When these tools are used to create sexualised content without consent, it is a form of image-based abuse, even if no original “real” photo was shared. The harm is real because reputations, relationships and wellbeing are on the line.

The eSafety Commissioner’s resources for schools and families call this out clearly: online safety education must be evidence-based, proactive and victim-centred, with explicit attention to emerging harms like deepfakes and AI-assisted grooming.

Harm looks like this

  • Sextortion: An offender threatens to share a sexualised image (real or fabricated) unless money or more images are sent.

  • AI-assisted grooming: Fake profiles and AI-generated chats or voice notes make an adult sound like a peer.

  • Deepfake pornography: A teen’s school photo or social post is turned into a sexual image and circulated.

  • Social sabotage and bullying: “Nudify” apps used as a prank escalate into serious harassment and isolation.

The eSafety Commissioner has urged schools to take these incidents seriously and report criminal deepfakes to police. Reports note a sharp rise in cases where sexualised, AI-altered images are used for bullying and extortion, often sourced from ordinary school photos.

“Consent covers images, including AI-generated ones.”

Why this matters now

Deepfake sexual images are no longer fringe or “someday” issues. They are accessible, cheap and targeted, and they can be created from the most ordinary photos. National and international reporting points to a rapid rise in cases intersecting with bullying, extortion and grooming. That is why evidence-based education, not fear, is our best defence.

In Canberra, the scaffolding is here: curriculum integration, eSafety frameworks, national coordination and strong external partners. But the system is only as strong as our everyday habits: the way a student learns to pause, a teacher routinely rehearses reporting steps, and a parent keeps conversations shame-free.

What ACT schools are doing now

Online safety is embedded in learning. In the Australian Curriculum (Version 9), “Online safety” is a formal Curriculum Connection woven across year levels, learning areas and general capabilities. That means it’s not a one-off lesson; it is threaded through Health and Physical Education, Media Arts and more, including consent, protective behaviours and help-seeking.

Best practice guides teachers. The eSafety Commissioner’s Best Practice Framework for Online Safety Education provides a research-backed model for how schools design programs, develop staff capability and respond to incidents, with explicit coverage of emerging risks like AI-generated abuse.

National coordination is in place. Through the National Online Safety Education Council (NOSEC), education authorities and peak bodies coordinate approaches so schools aren’t tackling this alone.

External partners support classrooms. Schools can (and do) draw on trusted partners such as the AFP-led ThinkUKnow program and eSafety’s virtual classrooms and school hub. These bring specialist presenters, family resources and incident response guidance directly to school communities.

Child Safe Standards apply. In the ACT, every organisation working with children, including schools, must meet the ACT Child Safe Standards, which now have legislative force. These standards frame the leadership, culture, policies and day-to-day practice that keep students safer, online included.

Local guidance supports families. The ACT Education Directorate’s online safety pages gather practical tips on privacy, bullying and where to get help, signalling a whole-of-system approach alongside school-level action.

Bottom line: Canberra schools are not starting from zero. The building blocks exist, but they work best when teachers, students and families use them together.

Why brains fall for fakes (and why it isn’t your fault)

Deepfakes exploit how our brains process faces, voices and context. If an image carries social proof (“a friend sent it”), fits a believable setting (a school uniform, a familiar background) and is seen in a high-emotion moment (late at night, after a conflict), we are more likely to accept it as true. That’s why skills beat lectures: students need routine practice in pausing, checking and reporting, just like they practise road safety.

The 10-minute home plan

You don’t need to be a tech expert to create a digitally safe home. Try this short, shame-free routine any time: dinner, the drive to sport, or before bed.

1) Set the no-blame rule (1 minute).
Say: “If something online goes wrong, even if you clicked a link or sent a photo, you will not be in trouble for telling me. We solve it together.”

2) Spot-the-fake drill (4 minutes).
Pick a short clip or image (even a harmless filter example). Ask:

  • “What tells you this could be fake?”

  • “If this were about a friend, what would we do first?”

The goal is confidence, not perfection.

3) Lock in the report steps (3 minutes).
Practise the order: Pause → Screenshot → Block/Report → Tell a safe adult. Agree who the “safe adults” are (family, coach, school wellbeing staff).

4) Save your lifelines (2 minutes).
Add the school’s incident contact, the platform’s report page and the eSafety reporting portal to your phone bookmarks so you are not searching under stress. Schools and families can find these pathways through eSafety’s schools hub and reporting guidance.

Repeat this weekly for a month, then monthly. Confidence is a habit.

Scripts that make hard talks easier

  • What is AI? “It is a tool that can make fake images, videos or chats that feel real. Being fooled is common. It is not your fault.”

  • Consent language: “Consent applies to images, including ones made by AI. If it wasn’t your ‘yes’, it’s not ok.”

  • If a fake targets you: “We will not panic or pay. We will collect evidence and report. Your safety is our priority.”

  • If a friend is harmed: “We support them, we do not share the content, and we help them tell a safe adult.”

  • If you made a mistake: “Thanks for telling me. Let’s fix it together.”

What teachers can do this term (low-prep wins)

Warm-up: Open a lesson with a two-minute “True/Trick” prompt: show a non-sensitive altered image and ask students for three reasons it might be fake. Build vocabulary (“context clues”, “artefacts”, “reverse image search”).

Consent refresh: Integrate consent scripts into Health and Media Arts, explicitly including AI-generated content. This aligns with the Australian Curriculum and its general capabilities.

Reporting role-play: In pairs, practise reporting a post to a platform and drafting a message to a safe adult. Debrief feelings as well as steps.

Parent loop: Send home a one-page “run the drill” handout with the no-blame rule and the four steps. Point families to eSafety’s school hub and family resources.

Partner in: Book a ThinkUKnow session or use eSafety virtual classrooms to reinforce messages from a trusted external voice. thinkuknow.org.au

Measure confidence: Run a quick anonymous poll (“I know how to report; I know who to tell”) before and after the unit.

When an incident happens

1) Stabilise the person. Prioritise wellbeing and safety first. Remind them it is not their fault.

2) Preserve evidence. Capture URLs, usernames and timestamps. Do not forward harmful content.

3) Report promptly. Use platform tools and the eSafety reporting pathway. In suspected criminal matters (e.g. child exploitation material or extortion), involve police. Guidance from eSafety emphasises timely reporting and a victim-centred response.

4) Communicate wisely. Limit internal circulation of the content. Use need-to-know principles.

5) Follow up. Put a safety plan in place, monitor for retaliation or repeat harm, and connect to counselling. Where relevant, inform and support families.

Where to get help (Canberra + national)

  • School wellbeing team: Your first local support when a student is impacted.

  • eSafety schools hub and reporting portal: Guidance for educators, parents and young people, with clear reporting pathways.

  • ThinkUKnow (AFP-led): Presentations and resources for schools, families and carers on child online exploitation trends and how to respond.

  • ACT Child Safe Standards: Expectations for organisations working with children, including policies and cultures that support online safety.

  • ACT Education online safety page: Up-to-date local guidance and links for families.

Pull-quotes for social and carousels

“If it looks real but wasn’t consensual, it is abuse, full stop.”

“Pause, screenshot, report, tell a safe adult.”

“Small routines build big safety.”

A note to parents and carers

You don’t have to know every app. When kids see adults learning too, they feel less pressure to get it right on their own. In this space, we don’t need perfect answers; we just need to keep the conversation going.


Purpose Media CBR will keep tracking this space because protective education works best when it reflects the world young people actually live in.

  • Concerned about image-based abuse or AI-enabled grooming? Use the eSafety Commissioner’s reporting pathways.

  • Got a school program or family practice that works? We’d love to feature it. Email hello@purposemediacbr.au.

Because while AI may be complex, the need to keep our kids safe isn’t.
