A video of a man with dark curls, ochre markings and an unmistakable Australian drawl takes viewers through red-dirt country, talking excitedly about snakes, crocodiles and wedge-tailed eagles. But "Jalen", the so-called Bush Legend, is not real.
He is an AI-generated character created in New Zealand by South African content creator Keegan John Mason, and his growing popularity has raised serious ethical questions about “AI blackface” and digital cultural appropriation.
A non-existent viral “wildlife educator”
The Bush Legend has amassed a following of around 200,000 users across TikTok, Instagram and Facebook with videos peppered with "bro" that seem to celebrate Australia's wildlife and ecosystems. His clips are soundtracked by a didgeridoo beat and show him wandering through the bush, apparently searching for snakes and rare parrots.
But every pixel is manufactured. The videos are entirely AI-generated, including Jalen himself and the wildlife he "interacts" with.
Mason, who runs the Bush Legend: Wildlife Stories and Facts account, claims the page uses “AI-generated visuals to share wildlife stories for educational purposes.” He posted a message asking his viewers to subscribe to support his content creation journey.
Experts warn of ‘AI blackface’ and false representations
Kamilaroi scholar Dr. Tamika Worrell writes in The Conversation that the Bush Legend account is part of an alarming global trend she describes as "AI blackface."
In the piece, she writes that this trend is "forming a new type of cultural appropriation," allowing non-Indigenous creators to "generate Indigenous personas through AI based on stereotypical representations that fuse and appropriate cultures."
In comments to PEDESTRIAN.TV, Worrell added: "Indigenous likenesses have been stolen, and many in the audience know that this is not a real person."
On paper, the platforms say they are trying to make AI content easier to identify. TikTok's community guidelines currently require creators to clearly label synthetic media, including videos that are "completely generated or significantly edited by AI," and the app can automatically label or demote content that is not properly disclosed.
Meta, which owns Instagram and Facebook, has also introduced an “AI information” system and broader labeling rules to mark AI-generated images and videos with invisible watermarks and metadata, encouraging users to self-disclose when uploading synthetic content.
But because these systems rely so heavily on creators doing the right thing and on users spotting tiny labels, many people casually scrolling through their feeds may still assume characters like the Bush Legend are real.

A deeply worrying misuse of technology
"The ethical and cultural concerns of AI Blakface are not new. Indigenous peoples around the world have noted the many sinister ways AI technology is being used to put up a false cultural front and ultimately displace Indigenous peoples," Worrell told P.TV.
Dr. Terri Janke, a Wuthathi, Yadhaigana and Meriam lawyer and an expert on Indigenous culture and intellectual property, told the Guardian that while the video appears to be educational, it is "offensive and risks flattening the culture".
“It’s a very insidious theft in that it also comes with cultural harm,” Janke said. “Because of discrimination… there is an impact of stereotypes and negative thinking, and the impact is even more severe.”
Corey Tutt OAM, a Kamilaroi man and the founder of Deadly Science, told SBS NITV that the replication of cultural identities through AI is deeply concerning, particularly when it comes to consent.
"Even more troubling is the use of AI-generated images that resemble a deceased person, where the technology searches for and recreates a likeness," he told the broadcaster.
Creator’s response
As criticism mounted, Mason’s AI avatar, Jalen, said in a video: “I’m not here to represent any culture or group, and this channel is just about animal stories. If you don’t like this, don’t worry, just scroll down and move on.”

Worrell told P.TV that Bush Legend’s attempt to address the backlash and encourage people to “just scroll and move on” was “a dismissal without accountability for the harm this (content) is creating.”
Those commenting on Mason’s video were equally unimpressed. “If it’s just about animals, don’t use Aboriginal portraits. Don’t use yidaki/didgeridoo music. It’s clear the kind of cultural image you’re trying to push is unethical because it’s not authentic,” musician Keian wrote.
“This is an insult to indigenous people!! Instead of appropriating other peoples and cultures, use your own face, Keegan!!!” another person wrote.

A warning of what's to come
Toby Walsh, a professor of artificial intelligence at the University of New South Wales, told the Guardian that AI-generated media can easily absorb and reproduce racial or cultural bias. "They're going to inherit the biases of that training data," he said. "It's going to blur the boundaries between truth and falsehood."
For Indigenous scholars, the Bush Legend is more than a quirky technology tale or an algorithmic party trick. It's a glimpse of a future where culture is stolen, remixed and sold back to audiences with no connection to the people it was taken from.
The problem isn’t just how to spot a fake. It’s a matter of whether platforms, policymakers, and audiences are willing to draw the line when AI turns living culture into content.

