AI · · 3 min read

AI-Powered Cyber Threats Are Targeting Australian Businesses β€” Here's What You Need to Know

51% of Australian organisations have already encountered AI-powered cyber threats. From deepfake scams to prompt injection attacks, the threat landscape is evolving fast.

The cybersecurity threat landscape has shifted fundamentally. Attackers are now using artificial intelligence not as a novelty, but as a core part of their toolkit β€” and the impact on Australian businesses is already measurable.

According to recent industry research, 51% of Australian organisations have encountered AI-powered cyber threats in the past year. Three-quarters of organisations report that threat volume has at least doubled, and 16% have seen it triple. Malicious email activity alone jumped 131% during 2025.

These aren't theoretical risks. They're operational realities that boards and leadership teams need to factor into their risk planning.

How attackers are using AI

The most immediate impact is in social engineering. AI-generated phishing emails are now sophisticated enough that traditional detection methods β€” looking for poor grammar, unusual sender addresses, or generic greetings β€” are far less reliable. Attackers can craft highly personalised messages at scale, using publicly available information to make their lures convincing.

Beyond email, deepfake technology has matured rapidly. Voice cloning and video deepfakes are being used for impersonation attacks, particularly targeting executives and finance teams. Industry reporting indicates that known deepfake scams have exceeded $25 million in losses over the past 12 months, and the trend is accelerating.

Automated malware campaigns are also evolving. WatchGuard's Q4 2025 threat landscape report for Australia recorded over 96,000 network attacks blocked against just 8,500 malware attacks β€” a shift that reflects attackers moving toward network-level exploitation. Unique endpoint malware rose by over 1,500% in the second half of 2025, and evasive malware detected over encrypted connections (TLS) increased by nearly 2,000%.

The emerging risk: prompt injection

For organisations integrating large language models into internal systems β€” whether for customer service, document processing, or data analysis β€” prompt injection represents a new class of vulnerability.

This attack vector exploits the way AI models interpret instructions. By embedding malicious instructions in documents, emails, or data that an AI system processes, attackers can potentially cause the model to leak sensitive information, execute unintended actions, or bypass security controls.

Data spillage is a related concern. Incidents are emerging where staff upload sensitive commercial or personal information to public-facing AI tools, creating exposure that traditional data loss prevention controls weren't designed to catch.

The cost to Australian businesses

The financial impact is escalating. Large Australian organisations experienced a 219% increase in average cyber crime costs during the most recent reporting period, reaching approximately $202,700 per incident. For businesses in healthcare, finance, and logistics β€” sectors with high digital dependency β€” the exposure is particularly acute.

Practical steps for your organisation

Review your AI acceptable use policy β€” or create one if you don't have it yet. Ensure your incident response plan accounts for AI-powered attacks including deepfake impersonation. Train your finance and executive teams on the specific tactics being used. And if you're deploying AI tools internally, assess them for prompt injection vulnerabilities before they go live.

Looking ahead

AI is changing both sides of the cybersecurity equation. Defenders are using it to detect anomalies and respond faster, but attackers have a head start in many areas. The organisations that fare best will be those that understand these threats specifically β€” not just as a category, but as operational risks that require concrete controls.

The days of treating AI security as a future concern are over. For Australian businesses, this is a present-tense problem that demands present-tense action.

Kaurna Acknowledgement

We acknowledge and pay our respects to the Kaurna people, the traditional custodians of the ancestral lands on which we work. We acknowledge the deep feelings of attachment and relationship of the Kaurna people to country and we respect and value their past, present and ongoing connection to the land and cultural beliefs.