UNSW Business School has partnered with accounting giant PwC to develop a research-based solution to combat fraud, misinformation and AI-generated deepfakes.
As social media, AI and the creator economy continue to grow, PwC and UNSW Business School have developed a framework that uses financial collateral to enforce accountability and curb misinformation and deepfakes.
The organizations say the solution was created to hold content creators accountable for the accuracy of their claims.
The framework introduces “truthfulness bonds,” which transform publishing from a “risk-free activity to a financially responsible one” by requiring creators to stake funds guaranteeing the legitimacy of their content. Readers can dispute claims they suspect are inaccurate by staking counter-bonds, an arrangement intended to create a robust environment for the pursuit of truth.
The research team behind the framework consisted of former PwC Senior Associate and UNSW Distinguished Fellow Lucas Barbosa, UNSW Business School Associate Professors Sam Kershner and Eric Lim, PwC AI Partner Rob Kopel, and former PwC AI Lead Tom Pagram.
Barbosa said the research team published their findings in a paper demonstrating how financial incentives can align market forces with truth-seeking behavior.
“The impetus for the research came from conversations about advances in AI and the futility of trying to spot deepfakes in the future,” he said.
“We realized that detecting deepfakes is a fool’s errand, and that as these GenAI models become more intelligent, solutions focused on verifying authenticity remain robust where detection does not.”
The framework is motivated by research identifying a “growing crisis”: deepfake fraud was projected to quadruple globally in 2024, and cybercrime, including identity fraud that uses deepfake technology, is now the most reported type of fraud worldwide.
The truthfulness bond system uses a closed-loop mechanism in which content creators must stake collateral reflecting their confidence in the material. Lim said this financial commitment would deter frivolous disputes and ensure that all challengers face equal risk.
“If inaccuracies are proven, bonds forfeited on inaccurate content are used to fund rewards for accurate assessments,” he said.
“This mechanism is a way for independent content creators to enhance their reputation as the new gold standard in news dissemination, and a way for traditional news outlets to rebuild trust and build larger audiences on new social media platforms.”
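The paper itself does not publish an implementation, but the closed-loop flow described above — creators stake a bond, skeptical readers stake counter-bonds, and settlement recycles forfeited collateral into rewards for whichever side was right — can be sketched in a few lines. All class and method names below are hypothetical illustrations, not part of the researchers' framework:

```python
from dataclasses import dataclass, field


@dataclass
class Claim:
    creator: str
    bond: float  # collateral staked by the creator
    counter_bonds: dict = field(default_factory=dict)  # challenger -> stake


class TruthBondLedger:
    """Minimal sketch of a closed-loop truthfulness-bond mechanism."""

    def __init__(self):
        self.balances = {}
        self.claims = {}

    def deposit(self, party, amount):
        self.balances[party] = self.balances.get(party, 0.0) + amount

    def publish(self, claim_id, creator, bond):
        # Publishing is no longer risk-free: the creator locks collateral.
        assert self.balances.get(creator, 0.0) >= bond
        self.balances[creator] -= bond
        self.claims[claim_id] = Claim(creator, bond)

    def challenge(self, claim_id, challenger, stake):
        # A reader who suspects an inaccuracy stakes a counter-bond,
        # so challengers face the same financial risk as creators.
        assert self.balances.get(challenger, 0.0) >= stake
        self.balances[challenger] -= stake
        self.claims[claim_id].counter_bonds[challenger] = stake

    def settle(self, claim_id, claim_is_accurate):
        # Closed loop: forfeited stakes fund the winners' rewards,
        # so no external subsidy is needed.
        claim = self.claims.pop(claim_id)
        challenger_pot = sum(claim.counter_bonds.values())
        if claim_is_accurate:
            # Creator recovers the bond plus the challengers' stakes.
            self.deposit(claim.creator, claim.bond + challenger_pot)
        else:
            # Creator forfeits the bond; it is split among challengers
            # in proportion to the counter-bond each one staked.
            for challenger, stake in claim.counter_bonds.items():
                share = stake / challenger_pot
                self.deposit(challenger, stake + share * claim.bond)
```

In this toy version, a creator who publishes with a 100-unit bond and is successfully challenged by a reader staking 50 units loses the bond entirely, and the reader walks away with 150 units — the symmetric risk Lim describes.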
The researchers expect truthfulness bonds to make the biggest difference in the insurance industry, where requiring claimants to stake collateral vouching for the accuracy of their submissions would deter false claims.
“The economic model underlying truthfulness bonds incorporates multiple layers of incentives. Creators who stake larger bonds gain visibility and are incentivized to provide verified information,” Barbosa said.
“Challengers stand to earn rewards from successful disputes, which fosters a culture of accountability. This framework is designed to transform how we interact with information in the digital age and to restore trust and authenticity in an increasingly complex information ecosystem.”

