A potentially revolutionary productivity booster, generative artificial intelligence is a tool that can also backfire, leaving under-prepared employees confused and anxious. Executives need to consider AI policy and training carefully, both to reassure nervous staff members and to rein in those who leap ahead and experiment with AI in potentially damaging ways, says Shaun Davies, founder of the Sydney-based start-up AI Training Company and a former AI content specialist at Microsoft and Google. Based on large language models and now freely available, gen AI is upending workplace norms.
“If you don’t have a policy, and you’re not talking about how employees can and can’t use AI, you’re going to get some people who are just too scared to use it,” he says. “That also happens when you have a policy that’s completely dictated by the risk team, and it’s just ‘please do not use any of these tools’.”
At the other end of the spectrum are employees who act independently and forge their own path with AI, so-called “shadow AI”, he adds: “By not getting your finger on it early, you can find that this escapes and becomes quite a big problem down the track”.
Davies thinks employers need to specify how generative AI can be used in the workplace, to build trust with employees by setting firm boundaries and encouraging them to learn about its risks and its potential “safely and sensibly”.
“If you’re in an industry that’s more exposed to AI, it becomes more pressing to make sure that you are not just grounding people in the basics, but thinking about how you’re going to adapt to the disruption that’s going to come from the use of these tools,” he says.
Overall, the level of AI training in Australia remains low, he adds. A report by Jobs and Skills Australia released in August 2025 found that among ASX200 companies, 19 per cent mentioned AI training in 2023, a proportion that rose to 25 per cent in 2024.
“It is not always clear what training is being provided,” the report said, “but several companies flagged training focused on leadership development or executive training focused on AI strategy, ethics or management (18 per cent), with a smaller share allocating resources to training programs focused on how employees can use, operate or work with AI tools, platforms or applications in their daily work (12 per cent).”
Davies suggests first surveying staff to ask how they are using AI. The survey should be anonymous, he adds, so employees can admit to adding confidential company data to models, or taking sensitive information home to work on with AI.
Once executives understand the extent of AI usage and where staff think AI could usefully augment their work, they can make informed decisions about which AI products to introduce and how a company AI policy should be structured. “Too many people start without taking that basic step and instead just race headlong into AI, which may well result in a failed rollout,” Davies says.
Professor Mary-Anne Williams, founder of the Business AI Lab at the University of NSW, says organisations need to govern AI properly because employees’ unmanaged use of it can create intellectual property and privacy problems.
Williams believes AI tools need to be designed for purpose within organisations and managed for that purpose.
“This is not something people can learn on the side,” she says. “People can play and experiment on their own, but once they start using organisational data or using it to generate IP, they should be given training in building AI capability so they can use it safely and responsibly.”
Companies should do a deep dive and consider carefully how AI affects them now and into the future, which will differ according to the industry and the types of customers involved, Williams says. Employees would likely struggle to fully understand the potential of AI on their own, and its introduction should be “contextualised”, she adds, so its use aligns with an organisation’s strategy and values.
Williams notes that MIT research released in August 2025 found 95 per cent of generative AI pilots were failing, delivering zero value. Larger organisations will likely have more difficulty charting a clear AI course because they often struggle to implement change, and research has found it is important for executives to avoid rapidly replacing employees with AI, because that sort of strategy often backfires.
AI has been adopted perhaps more quickly than any other technology, Williams says, and executives need a solid understanding of its potential. AI courses are emerging across Australia, and Williams has embedded AI in numerous business innovation courses at the University of NSW.
“It’s so useful, it’s easy to use and it creates value,” she says, adding that understanding and skills are needed to capitalise on the tool. “Everybody needs to know, essentially, how the large language models work and how we can get the best out of them.”