© Purpose Media CBR. Powered by techBean. All Rights Reserved.
Sex Ed and AI: Are Canberra’s Classrooms Ready for Deepfakes and Digital Consent?

July 15, 2025

Artificial intelligence is shaping the world young people are growing up in, but is the classroom keeping pace?

National child protection organisation Act for Kids called for Australia’s sex education programs to be urgently updated. Their message was clear: the digital landscape has changed, and AI is creating new and dangerous avenues for sexual exploitation.

“Artificial intelligence is changing the way sexual exploitation and harm occurs,” said Dr Katrina Lines, CEO of Act for Kids.

“AI technology is increasingly being used to create deepfake pornography, which is fake sexual imagery that looks real, without the person’s consent. This is a form of image-based abuse and can cause significant harm to victims.”

These concerns aren’t hypothetical. They’re happening now. And the organisation believes education is our first line of defence.

“We must ensure that young people understand the risks associated with AI technology, how to protect themselves, and where to go for help if something goes wrong.”

What does that look like in practice?

To find out what’s already being done here in Canberra, Purpose Media CBR contacted Minister for Education Yvette Berry, seeking clarity on the ACT Education Directorate’s current position and actions.

In response, an ACT Government spokesperson confirmed that the Directorate is actively working to address the challenges and risks that come with artificial intelligence (AI), especially concerning deepfakes and the misuse of images, within our schools.

Education that meets the moment

According to the spokesperson, online safety is “explicitly integrated into the Australian Curriculum Version 9 as a curriculum connection across all year levels, Learning Areas, and General Capabilities.” This broad integration is designed to support schools in delivering relevant content throughout a student’s learning journey.

To ensure educators are not left behind, the Directorate confirmed that professional learning opportunities and aligned resources are made available for teachers. These materials, they noted, are consistent with the eSafety Commissioner’s Best Practice Framework for Online Safety Education, which specifically includes emerging issues like AI-generated grooming and deepfake pornography.

“Resources and professional learning are available to equip our educators with the knowledge and skills to understand and address the intersection of AI, consent, and image-based abuse, with curriculum support through the Directorate also available.”

This reflects a proactive position, but also a recognition that AI-related abuse is now part of the real-world safety toolkit young people need.

National collaboration and local delivery

The ACT Education Directorate also highlighted its commitment to regular review processes and collaboration through national partnerships.

“Through regular review processes and active participation in national collaborations, such as the National Online Safety Education Council (NOSEC), the ACT Education Directorate consistently updates its protective education materials.”

These updates are guided by research from the eSafety Commissioner and aligned to the ACT Child Safe Standards, ensuring that materials stay relevant to the fast-changing nature of digital harms.

In terms of delivery, schools aren’t expected to do it alone. The Directorate supports schools to access a range of evidence-based, trusted partners, including:

  • The AFP’s ThinkUKnow program

  • eSafety Commissioner’s virtual classrooms

  • Approved external online safety education providers who work directly with students

“These partnerships are crucial in strengthening our programs and providing comprehensive support for our students.”

Why it matters

The concerns raised by Act for Kids go beyond the curriculum; they touch on the lived digital experiences of young people.

They said, “Offenders can use AI to create fake profiles, conversations, and even videos to groom children and young people online. This can make it even harder for them to identify when they are being targeted.”

That difficulty in detecting harm is what makes these conversations urgent. The tools that enable manipulation, previously reserved for experts, are now widely available and easy to use.

Deepfake pornography and AI-enabled grooming are serious forms of sexual exploitation and abuse. They can have devastating impacts on victims, including mental health issues, social isolation, and even suicide.

For parents, educators, and policymakers, this raises a shared question: Are we equipping young people with the awareness and agency to stay safe, and to seek help when they need it?

The ACT Education Directorate believes it is.

“Through the combined and proactive efforts, the ACT education system is well-prepared to handle both the challenges and opportunities that AI presents, fostering a safe and informed learning environment for all students.”

It’s a strong response, and one that places Canberra among the jurisdictions taking early steps to respond to AI-related harms. But for the community, there is still room for conversation.

What does this look like in practice across schools? Are teachers feeling confident and resourced to lead these discussions? How can families complement school-based education at home?

What Parents Can Do at Home: Learning Together!

While schools are stepping up to respond to AI-related risks, parents and carers remain one of the most important sources of safety and support. And with AI moving faster than most of us can keep pace, it’s critical that families learn together rather than relying on children to absorb these lessons alone.

The truth is, many parents didn’t grow up navigating online spaces, let alone deepfakes or AI-generated scams. But you don’t need to be a tech expert to create a digitally safe home; you just need to stay curious, connected, and open.

Here are some simple, practical ways parents can complement classroom learning and support their children at home:

Talk about what AI is, without fear

Explain that AI can be used to create fake images, videos, or conversations that seem real. Let your child know that this technology isn’t inherently bad, but that it can be misused, and that being tricked by it is never their fault.

Create a shame-free zone for digital mistakes

Kids are far more likely to open up about something unsafe online if they know they won’t get in trouble. Let them know you’re a safe adult to talk to, even if something seems embarrassing, strange, or scary.

Stay updated with them

Use tools like the eSafety Parent Portal to learn about current risks, including deepfakes and grooming tactics. Watch an explainer video together and ask each other, “Could we spot a fake?”

Ask open questions

Try: “What would you do if someone sent you a weird video that looked real but felt off?” or “Do you know what sextortion is? Can we talk about what to do if something like that happens to a friend?”

Model critical thinking

If you see a clearly fake video or manipulated image online, talk about it out loud. Say things like, “Wow, this looks real. How can we tell it’s not?” Show them that even adults have to pause and question.

Because when kids see adults learning too, they feel less pressure to get it right on their own. In this rapidly evolving space, we don’t need perfect answers; we just need to keep the conversation going.

Purpose Media CBR will continue to follow this space, not just with curiosity, but with the belief that protective education is most effective when it reflects the world young people actually live in.

Because while AI may be complex, the need to keep our kids safe isn’t.

If you’re concerned about image-based abuse or AI-enabled grooming, support is available at www.esafety.gov.au/report.

If you’d like to share your school’s approach to this issue, or your own, we’d love to hear from you at hello@purposemediacbr.au.

Tagged: Education

