Generative AI in Australian Universities: Embracing the Future or Stalling Progress?

Since the release of ChatGPT at the end of 2022, there has been widespread speculation about how generative AI would transform universities. With the potential to revolutionise teaching, learning, research, and administrative tasks, AI is a topic of heated debate. While there has been considerable research focused on students' use of AI and its implications for teaching and assessment, there has been limited large-scale research on how university staff in Australia are using generative AI in their day-to-day work.

A recent study surveyed more than 3,000 academic and professional staff at universities across Australia to understand how they are integrating AI into their roles. The findings paint a picture of uneven adoption, with some staff embracing AI while others remain hesitant or resistant. This article explores the key insights from the study, examining why AI adoption has been slow for some and what universities should consider as they navigate this technological revolution.


The Reality of AI Use in Australian Universities

The study revealed that 71% of university staff have used generative AI in their work. However, adoption rates vary depending on the staff category:

  • Academic staff: 75% of academic staff reported using AI, with the highest usage rates seen in fields like information technology, engineering, and management.
  • Professional staff: 69% of professional staff have adopted AI, particularly those in business development and learning support roles.
  • Sessional staff: 62% of sessional academics, who are employed on a temporary basis, reported using AI in their work.
  • Senior staff: Among senior executives, AI adoption is highest at 81%, reflecting a more forward-looking approach among university leadership.

Despite these numbers, nearly 30% of staff have not yet used AI, suggesting that while AI has made significant inroads, it is still in the early stages of adoption across Australian universities.


What AI Tools Are University Staff Using?

When asked about the AI tools they use, university staff reported a surprising diversity, with 216 different AI tools cited in the survey. The most commonly used tools were general-purpose AI applications, such as:

  • ChatGPT: The most popular AI tool, used by 88% of staff who have adopted AI.
  • Microsoft Copilot: Used by 37% of AI users.

In addition to these general tools, staff are also leveraging AI for specific tasks like image creation, coding, and literature searching. The range of applications demonstrates the versatility of AI in higher education.

However, the survey also highlighted that many staff are still in the exploratory phase. Around one-third of AI users have only experimented with a single tool, while a smaller group of about 4% have adopted ten or more tools in their work.

Common Uses of AI

AI is being used across a range of academic and administrative tasks, with literature searching, writing, and summarising information topping the list. Other popular uses include:

  • Course development: AI is assisting academic staff in designing and updating course content.
  • Teaching methods: AI is helping educators tailor their teaching strategies.
  • Assessment: AI tools are being employed to streamline assessment processes and develop innovative ways to evaluate student learning.

Why Are Some University Staff Not Using AI?

With nearly 30% of university staff still not engaging with AI, the study explored the reasons behind this hesitation. Some common themes emerged:

1. AI Is Not Seen as Relevant

Many staff, particularly those in non-technical roles, reported that AI does not seem relevant to their work. For example, one staff member noted that while they had explored AI tools like ChatGPT and Microsoft Copilot, they had not found an application for them in their day-to-day tasks.

2. Lack of Familiarity and Confidence

Another key barrier is a lack of familiarity with AI technology. Some staff admitted they didn’t feel confident enough to use AI, highlighting the need for more training and support. One academic respondent explained: “I don’t feel confident enough yet,” while others echoed this sentiment, citing the complexity of AI as a reason for not engaging with it.

3. Ethical Concerns

A significant number of respondents expressed ethical concerns about using AI in academic settings. These concerns include fears about AI undermining human activities like critical thinking, writing, and creativity—skills that many academics view as central to their professional identity. One academic noted, “I want to think things through myself rather than having a computer think for me,” reflecting a broader scepticism about AI’s role in intellectual work.

Others raised concerns about AI as a tool of plagiarism, particularly in creative industries where machine learning has been accused of using the creative works of others without permission. These ethical objections have contributed to the slow adoption of AI by some staff.


The Divide in Attitudes Toward AI

The study highlights a significant divide in attitudes towards AI among university staff. On the one hand, there are those who see AI as a valuable tool for enhancing productivity, streamlining tasks, and improving the learning experience for students. On the other hand, there are those who remain sceptical, either due to concerns about the ethical implications or because they feel AI technology doesn’t align with their work.

This divide suggests that universities need to take a balanced approach in responding to the rise of AI. While AI offers exciting possibilities for innovation in education, universities must also address the legitimate concerns of staff who are wary of adopting the technology.


How Should Universities Respond?

As universities stand at a crucial juncture with generative AI, they must carefully consider how to support staff in using the technology effectively while also addressing ethical concerns. The findings from this study suggest several key actions universities can take to foster a positive relationship with AI among staff.

1. Develop Clear Policies and Guidelines

One of the most consistent messages from staff is the need for clear, consistent policies and guidelines on AI use. Many staff members are unsure about how AI should be integrated into their work or what the ethical boundaries are. Universities should prioritise the development of AI policies that outline best practices for its use in teaching, research, and administration.

2. Provide Comprehensive Training

A lack of confidence in using AI is a significant barrier for many staff, particularly those in non-technical roles. To address this, universities must invest in comprehensive training programs that help staff understand how AI can be applied in their specific roles. Training should not only focus on technical skills but also on ethical considerations and the potential risks associated with AI use.

3. Secure and Invest in Reliable AI Tools

Another concern raised by staff is the security and reliability of AI tools. Universities should ensure that the AI tools they endorse are secure, compliant with privacy regulations, and designed to meet the specific needs of the higher education sector. Investing in trusted, purpose-built AI tools will help to alleviate concerns about data privacy and ethical misuse.

4. Foster Open Dialogue and Collaboration

The divide in attitudes toward AI underscores the importance of open dialogue within universities. Institutions should create spaces for staff to discuss their concerns, share experiences, and collaborate on AI-related projects. This can help bridge the gap between those who are enthusiastic about AI and those who are more hesitant, fostering a more inclusive approach to AI adoption.

5. Encourage Ethical AI Use

Finally, universities must emphasise ethical AI use as a core component of their AI strategy. This includes addressing concerns about plagiarism, intellectual property, and the potential for AI to devalue human skills like creativity and critical thinking. Universities should take a proactive approach to ensuring that AI is used to complement human activities rather than replace them.


Navigating the AI Revolution in Higher Education

Generative AI presents both opportunities and challenges for Australian universities. While many staff are already using AI to enhance their work, others remain cautious, citing ethical concerns, lack of familiarity, and scepticism about the relevance of AI to their roles.

For universities to fully embrace the potential of AI, they must take a balanced approach—one that promotes innovation while addressing concerns around ethics and trust. With clear policies, comprehensive training, and secure AI tools, universities can help staff navigate this new era of AI, ensuring that it serves as a tool for progress rather than a source of division.

As the use of AI in higher education continues to evolve, the choices universities make today will shape the future of learning, teaching, and research for years to come.
