By Tom Rogers AO, Distinguished Advisor | National Security College | ANU.
Reprinted with permission from ANU / Tom Rogers.
Why Australia needs whole-of-nation digital resilience
Australia invests heavily in traditional defence systems, rightly recognising that the first duty of government is to defend against physical threats. Yet some threats strike deeper, exploiting minds and moods rather than land. Democracy’s enemies thrive when citizens are divided and distracted, not resilient and informed. Steps are under way to build digital resilience, but efforts must be accelerated so that trust is guarded as carefully as territory.
For the first time, entire populations now form their political opinions inside a digital environment. They live in a world of instant and often distorted information, which collides with the ancient process by which democracy renews itself: elections. Elections are the keystone of democracy, the element that locks freedom’s arch into place. They rest on trust, a public asset sustained only when citizens believe the system works, and without which losers do not concede.
To uphold that bargain we must share some common beliefs, not about politics, but about democracy: that it is worth preserving, that conflict can be resolved without violence, and that elections are not corrupt. Disinformation strikes at those foundations. Artificial intelligence fuels disinformation, but it is only the newest spark in a long-burning fire. Deepfakes matter, but so do half-truths and false claims about election processes that already populate social media feeds. Citizens without tools to handle that terrain are exposed to every form of digital manipulation.
Research and lived experience show that censorship laws, fact-checking, and the false comfort of labelling are not enough. Heavy-handed bans erode trust because they deny adults the right to judge information independently. Instead, we need a layered defence: one that treats digital literacy as part of national security, is coordinated across government, education, and civil society, and connects Australia to the global network of democracies.
The urgency is clear, yet investment in digital literacy, civic awareness, and social trust remains limited and fragmented, though these are as vital to democracy’s survival as missiles are to its borders.
Democratic vulnerability is not abstract; it is already being probed. Foreign interference exploits diaspora links and dual-national identities, amplifying offshore narratives to fracture trust within Australia. The Australian Security Intelligence Organisation (ASIO) has warned that foreign regimes are targeting diaspora communities with influence campaigns.[i] Strengthening resilience means including diaspora communities within national communication and education programs, not treating them as peripheral audiences.
The evidence
The V-Dem Institute’s 2025 report counts 91 autocracies and 88 democracies worldwide. Nearly three-quarters of the world’s population now lives in autocracies, the highest share since 1978[ii]. International IDEA’s Global State of Democracy 2025 reinforces this picture, showing sustained democratic decline across multiple regions and institutions. In countries such as Venezuela and Tunisia, the ballot box has become a tool of authoritarian legitimacy. The post-Arab Spring trajectory in the Middle East shows how democratic openings can collapse into authoritarian consolidation, often under external influence from larger powers[iii].
Trust has fallen, even in established democracies, when disinformation collides with political tension. The Lowy Institute’s interactive analysis of democratic erosion illustrates how reinforcing loops like polarisation gradually hollow out democratic norms[iv]. The digital environment magnifies the danger, with algorithms that reward outrage and undermine social cohesion.
Australia is a rare outlier. While trust in institutions has eroded elsewhere, surveys consistently show the Australian Electoral Commission (AEC) as one of the most trusted federal agencies. Clear communication such as the AEC’s Stop and Consider campaign, targeted strategies, professional delivery, and impartiality have kept public confidence in elections high[v][vi]. That trust is strong but not permanent; it depends on sophisticated and transparent communication, and vigilance against digital threats.
Work is also under way across jurisdictions. At the Commonwealth level, the Electoral Integrity Assurance Taskforce has coordinated agencies during election periods since 2018, identifying and addressing issues quickly. States, territories, and local governments have expanded education campaigns, curriculum reforms, and community initiatives to build civic awareness and resilience. The Disinformation in the City: Response Playbook shows that local institutions can fight disinformation directly[vii]. Civil society adds energy, with programs such as Squiz Kids’ News Hounds reaching tens of thousands of students, teaching them to spot misinformation. The Australian Media Literacy Alliance connects researchers, educators, and journalists, strengthening civic education. Classroom civics and democracy education at primary and secondary levels also provide early resilience, shaping habits of trust and participation in future voters. These efforts form a base, and with stronger national coordination they could add up to much more.
Collectively, these efforts show how democratic resilience depends on social as well as institutional trust.
Global situation
International IDEA’s 2025 study on AI, elections, and the European Union (EU) sets out concrete governance challenges and responses. According to International IDEA, the EU has advanced new approaches to transparency and oversight in political communication, signalling a wider effort to address digital campaign risks. Implementation across member states is not uniform, but the direction of reform is consistent. In the United States (US), regulators have begun addressing AI-generated robocalls under existing law.
Some states are also tightening rules for political ads, though federal coverage remains patchy. This fragmented approach illustrates the limits of state-based regulation in a global information system.
The United Kingdom (UK) has expanded its imprint requirements to cover online campaign material, and while compliance is uneven, the direction is clear. Other democracies go further. Finland integrates media education in schools and coordinates national programs. Taiwan applies prebunking through civic partnerships and rapid fact-checking networks. Democracies differ in emphasis, but a layered defence is clear: bans on process harms, transparency through provenance and ad disclosure, and resilience through prebunking and literacy.
What works
Not all tools against digital manipulation deliver what they promise. Some help, others disappoint, and a few risk making the problem worse.
Labels help, but only at the margins. Research shows selective labelling creates an implied truth effect[viii]. When some items are tagged, people assume unlabelled content is accurate. Legislatures often reach for labels because they create the appearance of action, yet they do little to build public resilience. “Made with AI” tags also confuse audiences, since they can cover everything from minor edits to content created entirely by AI. Labels are most useful when paired with tamper-proof provenance metadata that shows how and where digital content was made.
Provenance is stronger. The C2PA standard embeds this metadata at creation, allowing users to verify content’s origin and integrity. Users can confirm what is real, not just flag what is fake. Adoption now spans cameras, editing tools, and online platforms, with early use in media and election contexts[ix].
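To make the idea concrete, the sketch below illustrates the logic of provenance verification: a signed record is bound to a content hash at creation, so any later edit is detectable. This is a conceptual toy only, not the C2PA API — real C2PA manifests use X.509 certificate chains and signed claim structures rather than the shared-secret HMAC used here, and the key, creator name, and function names are all illustrative assumptions.

```python
import hashlib
import hmac

# Hypothetical shared key for illustration; real C2PA signing uses PKI,
# not a shared secret.
SIGNING_KEY = b"publisher-secret"

def make_manifest(content: bytes, creator: str) -> dict:
    """Bind a provenance claim to the content's hash and sign it."""
    digest = hashlib.sha256(content).hexdigest()
    claim = f"{creator}:{digest}".encode()
    return {
        "creator": creator,
        "content_hash": digest,
        "signature": hmac.new(SIGNING_KEY, claim, hashlib.sha256).hexdigest(),
    }

def verify_manifest(content: bytes, manifest: dict) -> bool:
    """Recompute the hash and signature; any edit to the content fails."""
    digest = hashlib.sha256(content).hexdigest()
    if digest != manifest["content_hash"]:
        return False  # content was altered after signing
    claim = f"{manifest['creator']}:{digest}".encode()
    expected = hmac.new(SIGNING_KEY, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])

original = b"Polling places open 8am Saturday."
manifest = make_manifest(original, "example-publisher")

print(verify_manifest(original, manifest))              # True: intact
print(verify_manifest(b"Polling is postponed.", manifest))  # False: tampered
```

The design point the toy captures is the one that matters for policy: provenance lets a reader confirm what is authentic, rather than relying on platforms to flag what is fake after the fact.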
Broader digital literacy provides the foundation for all these tools. It equips citizens to question sources and understand the mechanics of online influence. Literacy programs that combine critical thinking, media analysis, and civic education strengthen the audience side of the information equation. Without that baseline, technical measures such as provenance or labelling achieve only limited effect.
Among literacy tools, prebunking has proven especially effective. Trials show that short videos explaining manipulation tactics, such as scapegoating or false dilemmas, improve people’s ability to spot and resist falsehoods[x]. The gains per person are modest, but they are cheap to achieve and durable. Importantly, unless prebunking is embedded in schools, community programs, and civic institutions, it will not become a lasting defence.
Although Australia’s existing Digital Literacy Skills Framework (DLSF) provides a national benchmark for foundational digital skills in education, its use appears to be uneven, and it has not been widely applied in civic-resilience contexts. Existing tools should connect directly to democratic resilience, not sit siloed within education policy. Several democracies now classify election systems as part of their critical infrastructure, linking information integrity directly to national resilience. National coordination should occur across multiple domains through four linked policy priorities.
Protect, prove, prepare, partner
These four priorities show where effort and resources should be directed if resilience is to be more than rhetoric. This framework aligns with the policy recommendations presented at the start of the paper and seeks to apply them as an actionable sequence.
Protect the process
Democracies should target their strongest interventions at the specific harms that strike directly at electoral integrity. False claims about polling dates or locations, impersonation of election officials, and AI-generated suppression calls must be prohibited and enforced. This is not about restricting opinion or debate, but about protecting the machinery of democracy from deception and sabotage. Many of these practices are already unlawful in Australia, yet electoral and communications laws should be reviewed regularly to ensure penalties, detection, and referral processes remain effective in the digital era.
Legislative and regulatory reform will remain part of the broader response, from online-safety laws to debates on algorithmic transparency and platform accountability. These are necessary but contested tools, shaped by politics, enforcement, and global coordination. This paper takes a different focus: laws do not create resilient citizens, knowledge, habits, and civic practice do. And those can be strengthened now, without waiting for regulation to catch up.
Prove the source
Citizens need reliable ways to verify what is real. That means embedding machine-readable provenance signals, such as those defined by the C2PA standard, in digital content[ix]. Public ad libraries, like those emerging under EU transparency reforms, ensure every political sponsor can be identified. Provenance also helps smaller actors, including independents and civil society groups, guard against fakes. Clear provenance and transparent ad rules reduce the scope for covert influence and restore accountability across platforms.
Prepare the public
Digital literacy, including prebunking, remains a low-cost, scalable, and durable defence[x]. MIT findings show falsehoods are 70 per cent more likely to be retweeted and spread six times faster than truths[xi]. This underscores the need for embedded education that builds public immunity to falsehoods, rather than campaign-based efforts. Embedding digital literacy within schools, community programs, and civic institutions through long-term coordination would sustain these initiatives without requiring large new spending. Governments have devoted vast sums to advertising campaigns over the past two decades. Reallocating a fraction of that money toward digital literacy would build what advertising rarely delivers: lasting public trust.
Partner internationally
Australia should deepen structured dialogues with leading democracies. Exchanges on digital literacy would let Australia adapt proven models while contributing its own strengths globally. Partnerships could extend to election security and protection against foreign interference. This may also give Australia a stronger voice in shaping emerging international norms on digital integrity, AI transparency, and election security. International IDEA’s 2025 study on the EU experience[iii] and ongoing work by the Australian National University (ANU) National Security College (NSC) both demonstrate how democratic resilience now sits within a shared global security agenda, connecting governance, trust, and information integrity.
Building these global links should be matched by coherence at home. A national framework for digital resilience would complement Australia’s Cyber Security Strategy 2030 and align with the Office of National Intelligence’s coordination role across the information and influence domains, both central elements of Australia’s existing security architecture.
To link these strands at home, government could establish a ‘National Digital Resilience Council’. The council would bring together agencies responsible for security, education, communications, and elections, along with academic and civil society partners. Its core function would be to map threats, align policy, and measure impact across portfolios. Some may baulk, but the model need not be complex or costly: it could start as a small, time-bound inter-agency forum using existing staff and networks rather than a new bureaucracy. What matters most is coherence and agility, ensuring Australia’s digital defences evolve as fast as the risks they address.
National security begins at the ballot box
Australia rightly invests in physical security, but defending democracy now requires citizens with the capacity to discern manipulation as much as it needs ships and soldiers. That battle is won by those citizens, and by institutions trusted to guard the process.
Elections are the cornerstone of democracy, and belief in their fairness is what keeps the structure standing. As such, the defence of that belief is a core task of national security policy.
This means national security begins at the ballot box, and protecting trust is the strongest, cheapest defence we can build. In a world where threats evolve faster than regulations, civic resilience through digital literacy is a cost-effective capability that scales.
About the author
Tom Rogers AO is a Distinguished Advisor at the NSC and the former Australian Electoral Commissioner, serving from 2014 to 2024. He is a member of the Advisory Board for the International Institute for Democracy and Electoral Assistance, and the Electoral Psychological Observatory at the London School of Economics.
About the series
Policy Options Papers offer concise evidence-based recommendations for policymakers on essential national security issues. Papers in this series are peer-reviewed by a combination of expert practitioners and scholars. Justin Burke is the series editor and Senior Policy Advisor at NSC.
About the College
NSC is a joint initiative of The Australian National University and the Commonwealth Government. NSC offers specialist graduate studies, professional and executive education, futures analysis, and a national platform for trusted and independent policy dialogue.
E national.security.college@anu.edu.au
W nsc.anu.edu.au
@NSC_ANU
@ANU National Security College
@nscanu.bsky.social
References
[i] Australian Security Intelligence Organisation (ASIO). (2025). Annual Threat Assessment 2025. Canberra: ASIO. Available at: https://www.oni.gov.au/news/asio-annual-threat-assessment-2025 (Accessed 3 November 2025).
[ii] V-Dem Institute. (2025). Democracy Report 2025: Autocratization Surges, Democracy at Risk. Gothenburg: University of Gothenburg. Available at: https://v-dem.net/documents/54/v-dem_dr_2025_lowres_v1.pdf and https://v-dem.net/publications/democracy-reports/ (Accessed 3 November 2025).
[iii] International IDEA. (2025). Safeguarding Democracy: EU Development at the Nexus of Elections, Information Integrity and AI and The Global State of Democracy 2025 (including Democracy Tracker country data for Venezuela and Tunisia). Stockholm: International IDEA. Available at: https://doi.org/10.31752/idea.2025.46 and https://www.idea.int/gsod (Accessed 3 November 2025).
[iv] Khalil, L., Woodrow, P., Paterson, J., & Kaufman, R. (2025). Understanding Democratic Erosion. Sydney: Lowy Institute. Available at: https://interactives.lowyinstitute.org/features/democratic-erosion/ (Accessed 3 November 2025).
[v] Australian Electoral Commission (AEC). (2022). Stop and Consider campaign. Canberra: AEC. Available at: https://www.aec.gov.au/elections/electoral-advertising/stopandconsider… (Accessed 3 November 2025).
[vi] Australian Public Service Commission (APSC). (2024). State of the Service Report 2023–24: Trust and Satisfaction in Australian Public Services. Canberra: APSC. Available at: https://www.apsc.gov.au/initiatives-and-programs/workforce-information/research-analysis-and-publications/state-service/state-service-report-2023-24/operating-context/australian-public-services-trust-and-satisfaction (Accessed 3 November 2025).
[vii] Trijsburg, I., Sullivan, H., Park, E., Bonotti, M., Costello, P., Nwokora, Z., Pejic, D., Peucker, M., & Ridge, W. (2024). Disinformation in the City: Response Playbook. University of Melbourne & German Marshall Fund. Available at: https://doi.org/10.26188/26866972 (Accessed 3 November 2025).
[viii] Pennycook, G., Bear, A., Collins, E., & Rand, D. G. (2020). The implied truth effect: Attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Management Science, 66(11), 4944–4957. Available at: https://doi.org/10.1287/mnsc.2019.3478 (Accessed 3 November 2025).
[ix] Coalition for Content Provenance and Authenticity (C2PA). (2024). C2PA Technical Specification v2.2. Available at: https://c2pa.org/specifications (Accessed 3 November 2025).
[x] Roozenbeek, J., van der Linden, S., Goldberg, B., Rathje, S., & Lewandowsky, S. (2022). Psychological inoculation improves resilience against misinformation on social media. Science Advances, 8(34), eabo6254. Available at: https://doi.org/10.1126/sciadv.abo6254 (Accessed 3 November 2025).
[xi] Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. Available at: https://doi.org/10.1126/science.aap9559 (Accessed 3 November 2025).