In a new press release, the Australian Association of Voice Actors (AAVA) is calling for mandatory labelling of AI-generated content, and for good reason.

AI labelling simply means letting people know when what they’re hearing or seeing has been created or altered by artificial intelligence. It’s about transparency and trust.
Teresa Lim, Vice President of AAVA, underlined the urgency, stressing the importance of protecting audiences from deception.
“We all deserve to choose if we want to engage with content that isn’t human-created,” she said, pointing to the recent use of an AI-generated radio announcer that misled listeners into believing a woman of Asian descent was on-air.
At the heart of AAVA’s campaign is a push for stronger protections: ensuring that AI content is clearly labelled, outlawing the creation of AI voice clones and deepfakes without explicit consent, and reinforcing copyright laws to safeguard the work of creatives.
For the voiceover community, such transparency helps preserve the integrity of human performance.
Follow @voiceverseng on Instagram and across other socials for more updates from the voiceover world.