Across Australia’s VET and higher education sectors, there is a growing consensus that cyber security matters – but a stubborn gap between what leaders say in public and what they actually do in governance forums. Globally, research with 1,000 CEOs shows that 96 per cent view cyber security as critical for growth and stability, yet only a small minority regularly dedicate board time to it, and many still treat it as a technical issue that needs attention only when something goes wrong.
For RTOs, dual-sector providers and universities, this mindset is profoundly dangerous. Education providers sit on some of the most sensitive data in the economy, depend on fragile webs of third-party platforms, and operate around seasonal peaks, when even short outages cause disproportionate damage. Recent Australian breaches – including repeated incidents at Western Sydney University involving student management systems and cloud-based platforms – have illustrated that you can outsource software, but you cannot outsource accountability.
This article argues that cyber resilience in education is no longer a matter of firewalls and passwords. It is a leadership and governance imperative, tightly linked to the Standards for RTOs 2025, TEQSA’s expectations under the Higher Education Standards Framework, the Notifiable Data Breaches scheme and the Australian Government’s Essential Eight maturity model.
You will see how cyber risk intersects with compliance, why “episodic” attention is insufficient, and how boards can treat cyber in the same disciplined way they manage financial risk. The article offers practical examples for VET and higher education, a phased uplift roadmap, and a governance scorecard that boards can adopt now – before the next incident forces their hand.
1. The dangerous illusion: “We care about cyber, but IT has it covered”
Walk into almost any Australian VET or university boardroom, and you will hear senior leaders endorse cyber security in principle. They talk about reputational risk, student trust and the importance of protecting data. Yet when you study their governance rhythms, cyber often appears only as an occasional slide in a risk report or a crisis briefing after an incident.
Global research into “cyber-resilient CEOs” reveals just how common this disconnect is. In a survey of 1,000 CEOs from large organisations across multiple countries, 96 per cent agreed that cyber security is critical for organisational growth and stability, and three-quarters expressed concern about their own organisation’s ability to minimise damage from an attack. However, the same research shows that around 60 per cent do not build cyber security into strategies or products from the outset, more than 40 per cent see cyber as an issue that demands attention only from time to time, and only about 15 per cent hold dedicated board meetings on cyber security.
This pattern is not limited to the corporate sector. It shows up in education when boards delegate cyber entirely to IT committees, when budgets for core teaching and learning are rigorously debated, but cyber uplift is treated as discretionary, and when executive scorecards track enrolments and student satisfaction meticulously while cyber metrics are vague or absent. On paper, leaders understand the stakes. In practice, they behave as if cyber is a specialist technical project rather than a core element of institutional resilience.
This gap matters because attackers do not care whether your organisational chart says “IT owns cyber”. When ransomware encrypts assessment evidence, when a credentials breach exposes USIs and identity documents, or when a student management system hosted by a third party is compromised, the regulator will not ask whether your firewall rules were delegated properly. They will ask whether the governing body exercised effective oversight and whether risks were understood, planned for and managed across the enterprise.
2. Why Australian education is such an attractive target
The Australian cyber threat environment has become more hostile, not less. The Australian Signals Directorate’s 2023–24 Annual Cyber Threat Report notes that the Australian Cyber Security Centre responded to more than 1,100 cyber security incidents in that financial year and received over 87,000 cybercrime reports, an average of one every six minutes. Calls to the national cyber hotline rose by 12 per cent year-on-year, and around 11 per cent of incidents involved critical infrastructure.
Education may not always be listed formally as “critical infrastructure” in every jurisdiction, but in practice, it functions like it. Providers hold the identity documents, contact details, tax file numbers and demographic information of hundreds of thousands of learners, including international students. VET providers transmit detailed AVETMISS data to NCVER for funding and reporting, while universities hold research datasets that can be commercially and geopolitically valuable.
The sector is also heavily dependent on complex digital ecosystems. Student management systems, AVETMISS reporting tools, LMS platforms, cloud-based storage, online assessment tools, proctoring services, work-based learning platforms, virtual labs and CRM systems are often supplied, hosted and maintained by different vendors. A vulnerability in any one of these can compromise the whole chain.
Attackers understand the rhythms of education. They know when enrolments spike, when census dates fall and when exams are scheduled. They time extortion demands for moments when downtime feels intolerable. The ASD report highlights the continued prevalence of ransomware, business email compromise and exploitation of public-facing applications – all of which can be catastrophic in a context where continuity of teaching and access to records are essential.
Recent incidents at Western Sydney University illustrate the magnitude of the risk. Over a series of events, attackers were able to gain unauthorised access to cloud-hosted student management and storage platforms, leading to the exposure of data belonging to around 10,000 students and, in earlier incidents, the theft of large volumes of sensitive information via Microsoft 365 and other systems. The lesson is sobering: when a third-party system fails, it is the institution that faces media scrutiny, regulator questions and student anger. Vendors can help manage the technical fix, but they cannot absorb the loss of trust on your behalf.
In this environment, treating cyber as an occasional technical problem is not just naïve; it is negligent.
3. The regulatory web: from keyboard to boardroom
In Australia, cyber security for education providers is not only a matter of prudent risk management. It is deeply entangled with the legal and regulatory frameworks that determine whether an RTO or higher education provider is allowed to operate.
Under the Privacy Act 1988 and the Australian Privacy Principles, organisations that experience eligible data breaches must comply with the Notifiable Data Breaches scheme. OAIC data shows that notifications have been climbing, with the second half of 2024 recording the highest number of reported breaches in any six-month period since the scheme began, and education consistently appearing among the top sectors impacted. Malicious or criminal attacks remain the dominant source of breaches, but human error – misdirected emails, incorrect access controls, lost or stolen devices – continues to play a significant role.
For VET providers, the data picture is even more complex. The management of the Unique Student Identifier is governed by specific privacy and security obligations, and AVETMISS reporting creates regular flows of sensitive data from RTOs to government agencies. If these datasets are altered, intercepted or rendered unavailable, providers may find themselves unable to demonstrate accurate training activity or claim funding correctly.
The Standards for RTOs 2025, now in force for ASQA-regulated providers, embed governance, risk management and information integrity more clearly into the regulatory framework than ever before. Quality Area 4 and the associated Practice Guides emphasise that governing bodies must manage organisational risk in a systematic way, ensure accurate and transparent information and protect the integrity of nationally recognised training. A ransomware incident that destroys or locks assessment evidence, a breach that exposes confidential student support records or a system compromise that undermines the reliability of enrolment and completion data are not merely operational problems; they go directly to whether the provider meets key Standards.
In higher education, TEQSA’s “Compliance in focus: cyber security” and its consolidated cyber security resources make the link between cyber resilience and the Higher Education Standards Framework explicit. TEQSA has reported an unprecedented rise in notifications of cyber incidents in recent years and emphasises that controlling information is a core part of provider governance and information management obligations. A university that loses control over student records, research outputs or online assessment systems risks being found non-compliant with threshold standards on corporate governance, information management and academic integrity.
Sitting above both sectors are national cyber security expectations. The ACSC’s Essential Eight maturity model remains the benchmark for baseline mitigation, encouraging organisations to plan to a target maturity level across eight control areas, such as multi-factor authentication, application control, patching of applications and operating systems, and regular backups. For boards, asking “what is our Essential Eight maturity, and how is it changing over time?” is as basic as asking “what is our cash position?”
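To make that board question concrete, a maturity position can be tracked as a simple gap report against the target level. The sketch below is illustrative only: the eight mitigation strategy names follow the ACSC model, but the assessment figures and the `maturity_gaps` helper are hypothetical examples, not a real assessment or an official tool.

```python
# Illustrative sketch: tracking Essential Eight maturity for board reporting.
# The eight control areas follow the ACSC model; the assessment figures below
# are hypothetical examples, not a real self-assessment.

ESSENTIAL_EIGHT = [
    "application control",
    "patch applications",
    "configure Microsoft Office macro settings",
    "user application hardening",
    "restrict administrative privileges",
    "patch operating systems",
    "multi-factor authentication",
    "regular backups",
]

def maturity_gaps(current: dict[str, int], target: int) -> list[tuple[str, int]]:
    """Return (control, shortfall) pairs for controls below the target level (0-3)."""
    return [
        (control, target - current.get(control, 0))
        for control in ESSENTIAL_EIGHT
        if current.get(control, 0) < target
    ]

# Hypothetical self-assessment against a target of Maturity Level 2;
# controls not yet assessed default to level 0.
assessment = {
    "application control": 1,
    "patch applications": 2,
    "multi-factor authentication": 2,
    "regular backups": 1,
}
for control, shortfall in maturity_gaps(assessment, target=2):
    print(f"{control}: {shortfall} level(s) below target")
```

Reported quarter by quarter, a gap list like this gives a board the same kind of trend visibility it expects from a budget variance report.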
The message from regulators and the government is increasingly aligned. They do not expect providers to be unhackable. They do, however, expect providers to prepare, to detect quickly, to respond transparently and to design reasonable controls that reflect the sensitivity of the data they hold and the criticality of the services they provide. Pleading ignorance is no longer acceptable.
4. From “cyber project” to enterprise discipline
To move beyond slogans, education leaders need to abandon the old mental model of cyber as a project that IT “does” in the background. Cyber resilience should instead be treated as a continuous enterprise discipline, much like financial management or quality assurance.
Research on the small cohort of “cyber-resilient CEOs” suggests that these leaders behave differently from their peers. They embed security considerations into strategy from the outset, treat cyber risk as a shared responsibility across the executive, seek clear metrics on resilience and invest ahead of incidents rather than after them. They also reject the myth that implementing strong controls is more expensive than suffering an attack, recognising that the financial and reputational costs of major breaches can be far higher than the cost of prevention.
Translating this into VET and higher education means making at least four shifts.
First, cyber must become a standing item at the board and senior governance committee levels, not an annual update. Boards should expect regular, plain-language reporting on threat trends, recent incidents, Essential Eight maturity, third-party risk and preparedness for legal reporting obligations.
Second, security must be built into new initiatives from the start. In practice, this means that every new online course platform, digital micro-credential, work-integrated learning app or international partnership involving data exchange should pass a structured privacy and security impact assessment before contracts are signed. Retrofitting controls after a system is in production is invariably more expensive and less effective.
Third, accountability must be shared. The CIO or head of IT will continue to own the technical implementation, but the CFO, COO, academic leaders, registrar, HR and marketing functions all have distinct roles to play in funding resilience, planning for continuity, protecting assessment integrity, managing identity data, reducing insider risk and guarding the organisation’s reputation.
Fourth, cyber needs to be managed like financial risk. Boards set risk appetite and limits for operating deficits, liquidity and capital investments. They should also establish clear thresholds for acceptable cyber exposure, such as target Essential Eight maturity levels by specific dates, maximum tolerable downtime for key systems, and expectations for how quickly the provider must be able to identify and notify affected individuals in the event of a breach.
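The fourth shift becomes real when the appetite statement is written down as explicit, testable limits rather than sentiment. The sketch below shows one way to express such thresholds; every figure and field name is a hypothetical example of what a board might set, not a recommendation.

```python
# Illustrative sketch: board-set cyber risk appetite expressed as explicit
# thresholds, in the same spirit as financial limits. All figures are
# hypothetical examples, not recommendations.

RISK_APPETITE = {
    "essential_eight_target_level": 2,     # target maturity by an agreed date
    "max_downtime_hours_sms": 24,          # student management system outage tolerance
    "min_mfa_coverage_pct": 95,            # staff and student accounts
}

def breaches_of_appetite(observed: dict[str, float]) -> list[str]:
    """Return the appetite items the observed posture currently breaches."""
    breaches = []
    if observed["essential_eight_level"] < RISK_APPETITE["essential_eight_target_level"]:
        breaches.append("Essential Eight maturity below target")
    if observed["sms_downtime_hours"] > RISK_APPETITE["max_downtime_hours_sms"]:
        breaches.append("student management system downtime exceeds tolerance")
    if observed["mfa_coverage_pct"] < RISK_APPETITE["min_mfa_coverage_pct"]:
        breaches.append("MFA coverage below minimum")
    return breaches
```

The value is not the code itself but the discipline it forces: each threshold must be specific enough that management can be held to it, just as with liquidity limits.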
When these shifts occur, cyber stops being treated as a specialist “problem” and becomes part of how the institution thinks about its licence to operate.
5. A practical uplift roadmap: from fragile to resilient
Education providers rarely have the luxury of starting from scratch. Most have legacy systems, complex provider–vendor relationships and varying levels of internal capability. For this reason, it is helpful to think about cyber uplift as a staged program over several years rather than a single technology purchase.
In the first stage, which might cover the initial six to twelve months, the focus is on establishing a baseline and stabilising obvious weaknesses. Boards should approve an enterprise cyber risk appetite statement that sets top-level expectations. The organisation should conduct an Essential Eight self-assessment and identify gaps, then prioritise foundational controls such as multi-factor authentication for staff, students and contractors, restrictions on administrative privileges, basic patching discipline and resilient, tested backups that cannot be altered by ransomware. At the same time, the institution should harden identity and email systems – such as Microsoft 365 or Google Workspace – because these are the front doors that attackers most often use, particularly for business email compromise. An incident response plan, even if simple at first, should clearly define who makes decisions in a crisis, who speaks to regulators and who communicates with students and staff.
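The incident response plan described above can start as something as simple as a role matrix that pins down who decides, who speaks to regulators and who communicates with students and staff. The sketch below illustrates the idea; every role title is a hypothetical example and would differ by provider.

```python
# Illustrative sketch: a minimal incident response role matrix of the kind a
# first-pass plan should pin down. All role titles are hypothetical examples.

INCIDENT_ROLES = {
    "decision_authority": "Chief Operating Officer",
    "regulator_liaison": "Head of Compliance",        # OAIC / ASQA / TEQSA contact
    "student_staff_comms": "Director of Communications",
    "technical_lead": "IT Security Manager",
}

def who_handles(task: str) -> str:
    """Look up the accountable role for an incident task; fail loudly if unassigned."""
    try:
        return INCIDENT_ROLES[task]
    except KeyError:
        raise ValueError(f"No one assigned to '{task}' - fix the plan before an incident")
```

Failing loudly on an unassigned task mirrors the point of the planning exercise: gaps in accountability should surface in a tabletop exercise, not mid-crisis.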
In the second stage, over the following year, attention shifts towards integration and governance. Change and project management processes should be updated so that no digital initiative can progress without a documented security and privacy assessment. The provider should develop a register of critical third-party suppliers, identifying which underpin core services like student records, learning management, proctoring and AVETMISS reporting. Contracts should be reviewed for breach notification obligations, audit rights and independent assurance such as ISO 27001 or SOC 2 reports where appropriate. Detection capabilities should be improved, for example, by deploying endpoint detection and response tools and establishing some form of continuous monitoring, whether in-house or via a managed service.
In the third stage, beyond the 18 to 36 month horizon, the goal is to embed continuous improvement and cultural change. Evidence for patching, backup tests and incident handling should be collected automatically where possible, both to support internal governance and to satisfy regulators. The provider should regularly test its security posture and crisis response through phishing simulations, tabletop exercises and, where feasible, external penetration testing or red-teaming. Staff training should evolve from generic awareness modules to tailored programs that address the specific risks different roles face, such as invoice fraud targeting finance, identity fraud targeting admissions and data exfiltration risks for research teams.
This roadmap is not prescriptive. Small RTOs may need to scale it down and rely more heavily on external partners; large universities may need more sophisticated architectures. The principle, however, is universal: stabilise, integrate, then embed.
6. Sector-specific realities: VET is not the same as a university
While there are common themes across the sector, VET and higher education face distinct cyber challenges that should inform their governance and investment priorities.
In VET, the data pipeline to NCVER through AVETMISS submissions is mission-critical, as it underpins government funding and national statistics. Disruption or corruption of that pipeline can have immediate financial and compliance consequences. Many RTOs, particularly smaller providers, also rely on basic practices like receiving identity documents via email to create USIs or verify enrolment. This creates a high-risk environment where copies of passports, driver licences and Medicare cards are scattered through inboxes and shared folders. A more secure approach – such as using dedicated upload portals, strict access controls and clear retention and destruction procedures – is essential if providers are to live up to their privacy and security obligations.
VET providers also tend to have diverse delivery arrangements, including workplace-based training, third-party partnerships and online learning. This adds complexity to access management and device security, particularly where trainers and assessors use personal devices or share laptops across multiple sites. Clear policies, mobile device management and explicit expectations around data handling in workplace environments are therefore crucial.
In higher education, the risk profile is shaped heavily by research, scale and openness. Universities often hold large volumes of commercially sensitive and strategically valuable research data, sometimes in fields that attract foreign state interest. Guidance on foreign interference in universities, including from the Australian Government and allied jurisdictions, explicitly recognises cyber channels as a vector for influence and theft. Open campus networks, complex identity structures and extensive international collaboration all make robust identity governance and segmentation vital.
Universities are also deeply invested in online assessment and proctoring, which creates significant reliance on third-party providers. A failure in a proctoring platform during exam periods can rapidly become both a cyber incident and an academic integrity crisis. Governance arrangements should ensure that academic boards understand this dependency and that contingencies exist for critical assessment events.
Both sectors rely heavily on trust – from students, industry partners, governments and communities. A major breach does not just create operational headaches; it can erode enrolments, damage graduate employability narratives and invite harsher regulatory oversight.
7. Generative AI: sharper tools for both sides
Generative AI is reshaping the cyber landscape in ways that are directly relevant to education. From an attacker’s perspective, AI makes it easier to craft convincing phishing emails, impersonate officials, generate malware and create deepfake content that can be used to manipulate or extort. Reports from cyber agencies and industry commentators warn that AI-enabled social engineering is already making scams harder to detect, even by relatively tech-savvy staff.
From the defender’s perspective, AI is also being embedded into security tools to detect anomalous behaviour, flag unusual login patterns and correlate signals across large volumes of logs faster than human analysts could manage alone. Many modern email gateways, endpoint systems, and security information and event management platforms now rely on machine learning to distinguish normal activity from threats.
For VET and higher education leaders, the governance question is twofold. First, they must ensure that their institutions do not inadvertently leak sensitive information into public AI tools. This requires clear staff and student policies on acceptable AI use, explicit prohibitions against pasting confidential student data, research findings or proprietary content into external platforms, and training to reinforce these boundaries. Second, they should support their security teams in making thoughtful use of AI-powered defence tools, understanding that these are not silver bullets but can significantly improve detection and response when integrated into a broader strategy.
Crucially, AI in education is not only a cyber security issue; it is also an academic integrity challenge. TEQSA has highlighted the dual role of generative AI in enabling both legitimate learning support and potential cheating, which means that cyber, academic and teaching leaders must work together rather than treating these as separate issues.
8. The governance toolkit: questions that cut through
Many non-technical board members feel uncertain about how to interrogate cyber risk effectively. They worry about getting lost in jargon or second-guessing technical experts. However, the most useful questions are often quite simple and do not require deep technical knowledge.
A starting point is to ask whether cyber is being treated primarily as a cost centre or as an investment in continuity and trust. When cyber uplift proposals are presented, boards can ask how the proposed controls will protect enrolment revenue, minimise disruption to teaching, reduce regulatory exposure and preserve the provider’s reputation with students and partners.
Boards should also insist on clarity about the current posture. Instead of accepting vague reassurances that “we are working on the Essential Eight”, they can ask for the organisation’s current maturity level for each of the eight controls, the target levels, and the timeframe and funding required to close the gap.
Examining security-by-design is another powerful lever. Boards can request a short retrospective on the last several major digital initiatives or course-platform launches, asking whether privacy and security impact assessments were conducted before commitments were made and what issues were identified and addressed. Patterns in those answers will reveal whether security is genuinely integrated or simply patched on.
Third-party risk warrants direct attention. Leaders should be able to describe how many critical suppliers underpin core systems such as student management, AVETMISS reporting, LMS, payment processing and online assessment, and whether those suppliers have contractual obligations to notify the provider promptly if a breach occurs. The Western Sydney University case underscores that institutions cannot assume that their own risk falls away when a third-party platform is involved.
Backup and recovery questions are vital. It is not enough to know that backups exist; boards should ask when systems were last restored from backup in a test scenario, how long recovery took and whether there are offline or immutable backups that ransomware cannot encrypt.
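Those backup questions can be turned into a standing check over a register of restore drills. The sketch below is illustrative; the register fields and the 90-day staleness window are assumptions a provider would set for itself, not prescribed values.

```python
# Illustrative sketch: board-level backup questions expressed as checks over a
# hypothetical register of restore drills. Field names and the 90-day window
# are assumptions, not prescribed values.
from datetime import date, timedelta

def backup_concerns(drills: list[dict], max_age_days: int = 90) -> list[str]:
    """Flag systems whose last successful restore test is stale, or whose
    backups lack an offline/immutable copy that ransomware cannot encrypt."""
    concerns = []
    cutoff = date.today() - timedelta(days=max_age_days)
    for d in drills:
        if d["last_restore_test"] < cutoff:
            concerns.append(f"{d['system']}: restore not tested since {d['last_restore_test']}")
        if not d["immutable_copy"]:
            concerns.append(f"{d['system']}: no offline or immutable backup copy")
    return concerns
```

A board does not need to read the register itself; it needs assurance that an empty concerns list is produced, and evidenced, on a regular cycle.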
Finally, boards should probe legal and regulatory readiness. If a serious incident occurred today, could the provider identify affected individuals, notify OAIC within the required timeframes and demonstrate to ASQA or TEQSA how it is addressing the root causes?
Over time, these lines of questioning can be formalised into a simple cyber scorecard reported quarterly, covering posture (such as Essential Eight maturity and MFA coverage), exposure (status of critical datasets and vendors), threat activity (phishing simulation results and blocked attacks), resilience (time to detect and time to respond) and compliance indicators. Boards do not need to be security experts, but they must be able to see trends, ask follow-up questions and hold management accountable for improvement.
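A quarterly scorecard of this kind is, at heart, a small trend table: the same metrics each quarter, marked as improving or worsening. The sketch below illustrates the mechanics; the metric names loosely mirror the categories above, and all of them are hypothetical examples of what a provider might choose to track.

```python
# Illustrative sketch: a quarterly cyber scorecard as a simple trend table.
# Metric names and values are hypothetical examples of what a board might track.

SCORECARD_METRICS = ["e8_maturity_avg", "mfa_coverage_pct",
                     "phish_sim_click_rate_pct", "mean_hours_to_detect"]

# For these metrics, a falling number is good news.
LOWER_IS_BETTER = {"phish_sim_click_rate_pct", "mean_hours_to_detect"}

def quarter_trend(previous: dict, current: dict) -> dict[str, str]:
    """Mark each metric as 'improving', 'worsening' or 'flat' quarter-on-quarter."""
    trend = {}
    for m in SCORECARD_METRICS:
        delta = current[m] - previous[m]
        if delta == 0:
            trend[m] = "flat"
        elif (delta < 0) == (m in LOWER_IS_BETTER):
            trend[m] = "improving"
        else:
            trend[m] = "worsening"
    return trend
```

The point of the structure is exactly what the paragraph above asks of boards: not technical depth, but the ability to see direction of travel and ask why a line has turned the wrong way.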
9. Leadership in a “when, not if” world
The ASD’s latest threat report is blunt: the question is not whether organisations will face cyber incidents, but when. For Australian VET and higher education providers, the stakes of that “when” are unusually high. A well-timed ransomware attack can stop enrolments, disrupt exams, compromise research collaborations and expose the personal stories of students who have trusted the institution with their most sensitive information.
Managing this risk is not about promising that nothing will ever go wrong. It is about leading in a way that acknowledges the reality of modern threats and builds resilience into the daily fabric of operations. That means allocating real budget to cyber uplift, even when other demands are pressing. It means modelling good practice at the top – using multi-factor authentication, following secure processes, engaging thoughtfully with AI and taking training seriously. It means being prepared to talk openly with students, staff and regulators when incidents occur, rather than hiding behind technical language or downplaying impact.
Most of all, it means recognising that cyber security is now inseparable from educational quality. If learners cannot trust that their data will be protected, if regulators cannot trust that assessment evidence is genuine and reliably stored, and if employers cannot trust that qualifications have not been compromised by system failures or manipulation, the social licence of the entire sector is at risk.
Boards and executives who embrace this reality and manage cyber with the same discipline they apply to finances, academic quality and student wellbeing will not only reduce the probability and impact of incidents; they will differentiate their institutions in a competitive market. When students and industry partners ask, “Can we trust you with our data and our future?”, confident, evidence-based answers will increasingly become a decisive factor.
The path forward is clear, even if it is challenging. Elevate cyber from a technical afterthought to a standing governance priority. Treat the Essential Eight and related frameworks as minimum baselines rather than aspirational goals. Build security into every program, partnership and platform from day one. Plan for incidents, practise your response and learn from each near miss. In a world where attacks are constant, the institutions that will thrive are not the ones that claim to be invulnerable, but the ones that are prepared, transparent and relentlessly improving.
