John Price is one of Australia’s most respected consultants and trainers in the Vocational Education and Training (VET) sector, with more than 40 years of experience shaping skills development, educator capability, and quality assurance across the country. Known for his pragmatic approach to compliance, governance, and continuous improvement, John has become a trusted mentor, auditor, and keynote speaker across the Australian VET landscape.
Career Overview
John began his career as Head of the School of Printing and Graphic Arts before becoming a Professional Development Coordinator with TAFE Queensland, where he mentored emerging teachers through the Diploma of Technical and Further Education. In 1988, he co‑established Australia’s first high‑technology industry training centre for the printing and graphic arts sector in partnership with InPrint Limited—an initiative that set new standards for industry‑integrated training.
Since leaving TAFE, John has assisted hundreds of training organisations to achieve and maintain registration as Registered Training Organisations (RTOs) under successive regulatory frameworks, from the Australian Quality Training Framework (AQTF) to the Standards for RTOs. His consulting and auditing practice has supported more than 300 organisations in implementing continuous improvement and internal quality systems to strengthen compliance and performance.
Expertise and Contributions
John’s expertise spans:
- RTO registration, re‑registration, and audit preparation
- Internal auditing and compliance monitoring systems
- Validation and assessment design
- Risk management and governance for training executives
- Professional development for trainers and assessors
- Systematic implementation of ISO 9001 and AS ISO 21001 education quality management frameworks, aligned with ASQA requirements.
His belief in simplifying regulatory compliance through practical systems has influenced quality frameworks used across the nation. John’s workshops and keynotes—particularly on assessment validation, trainer and assessor currency, and engagement strategies—are highly regarded by practitioners at national conferences such as the Velg Training National VET Conference and other sector forums.
Philosophy and Impact
At the heart of John’s philosophy is a simple but enduring principle: compliance and quality are inseparable from good teaching and learning. His work promotes an evidence‑based, student‑centred approach to regulation—one that aligns organisational systems with the values of effective education.
As a lifelong educator, mentor, and quality innovator, John Price continues to shape the standards, systems, and people driving excellence in Australian VET.
What inspired your long-term commitment to vocational education and training in Australia over the past four decades?
This commitment actually started in Liverpool, UK, when I was 16 years old (1964). I had just commenced my apprenticeship in Graphic Arts. I was good at science at school; however, my Science teacher, Bill Bailey, at Liverpool College of Art, had this tremendous ability to engage with students by placing science into the context of our trade, and suddenly Physics and Chemistry came alive! I went home that afternoon after my first lesson and told my Mum, “I want to be a teacher someday!”
That opportunity came about on the 19th of January 1976 when I joined TAFE Queensland to teach in my trade area. The talent Bill demonstrated in 1964 wasn’t a result of any defined standards, such as we have today in Outcome Standard 1.1, but as a result of the passion and commitment he had to support students on their learning journey.
Looking back to your days at TAFE Queensland, what lessons from classroom training still inform your professional practice today?
Acknowledging that every student has the capacity to succeed, and how important it is to create a learning environment for each individual student to make it happen. I wasn’t aware of William Glasser’s motivational teachings back in 1976, particularly associated with his findings that all students want to be successful. Glasser described how activities that create fun, freedom, belonging and recognition enable this motivation to occur.
Importantly, however, he also made it clear that as teachers we need to provide an environment that allows these motivators to be demonstrated in a Valid way, otherwise they are so powerful that they can be demonstrated by students in Invalid ways.
Every day I try to follow the teachings of Glasser, and in all of my interactions both in and out of the classroom, I try to use techniques associated with Fun, Freedom, Belonging and Recognition in a Valid way that rewards people's success and encourages further success.
Over the years, how have you seen the expectations of trainers and assessors shift in response to regulatory and educational changes?
Many trainers and assessors’ expectations haven’t changed. When I consult with TAFE and other RTOs that have been around for a long time, I still see that commitment to getting it right for students, despite the increased administrative workload that has been introduced for educators.
What defining moment or initiative most shaped your approach to compliance and education governance?
I have been involved in the quality movement in Australia since 1987 and have installed many quality management systems based on the international standard ISO 9001 in industry, commerce and RTOs.
It was the installation of ISO 9001, using the ISO 9001 Education and Training handbook HB90.7-2000, into an RTO registered against the Australian Recognition Framework (ARF) that revealed the need for an approach that wasn’t just compliant (based on the ARF compliance standards) but also quality-focused (based on what the customer wants).
Integrating both approaches enabled compliance and good governance to co-exist, as it also reinforced leadership, accountability, communication and risk management, and developed a training quality culture. Here we are, some 25 years on, now seeing these requirements in the 2025 standards.
What an opportunity was missed 25 years ago: one that would not only have enhanced our total understanding of compliance and quality, but would also have placed Australia at the forefront by demonstrating that Australia’s education and training system was as good as anywhere else in the world!
So, in summary, there was an opportunity to integrate quality standards with legislative compliance 25 years ago – we missed it. Let’s not make that mistake again!
Governance and Leadership in RTOs
What distinguishes excellent RTO governance from simply compliant governance under the national standards?
The difference is always in implementation. It’s not the policies and procedures (‘the Talk’) that distinguish ‘excellence’ from ‘ordinary’; it’s ‘the Walk’, i.e. the actual implementation of governance practices, that can be sensed in an RTO that is there for the right reasons.
Those excellent RTOs have governance structures in place that enable them to clearly:
- establish and communicate their objectives and performance requirements
- communicate effectively to all staff their roles and accountabilities
- use metrics for both compliance and quality activities
- analyse data using tools that produce results that distinguish between causes and symptoms
- determine trends
- implement comprehensive action planning for continuous improvement
- demonstrate the effectiveness of improvements made; and
- recognise and reward achievements that support a quality training culture
In your experience auditing hundreds of RTOs, what are the most recurring weaknesses that leaders overlook?
The three main weaknesses that I observe are failing to:
- Communicate to all staff that Quality and Compliance is everyone’s business and not just that of the RTO Manager.
- Understand that their commitment not only needs to be demonstrated at audit (performance assessment) but also continuously evident through effective delegation and implementation of timely reporting systems. They must lead by example – every person’s eyes are on their behaviour – they need to make sure it’s positive, genuine and repeatable.
- Respond systemically across the RTO’s whole scope of operations, rather than focusing only on rectifying issues from the narrow sample of activities examined at audit.
How can RTO boards and CEOs create a culture of compliance that goes beyond risk management to genuine quality improvement?
Compliance standards and requirements have changed, and will keep changing. RTO boards and CEOs must understand that a genuine culture of compliance will motivate and engage people as these requirements change.
Change and Conflict go hand-in-hand! Consequently, implementing change through effective planning, communication, and senior management visibility provides support to develop the foundation of culture, i.e. trust.
Providing feedback needs to be seen as an effective and positive opportunity to encourage change, rather than as complaining.
Investing in these control systems and training staff to use them builds confidence that encourages all stakeholders to agree and move in the same direction ‘safely’.
What advice would you give new RTO CEOs about building compliant, sustainable institutions within five years of operation?
Realise that obtaining registration is the start of the journey and is comparatively easy compared to implementing and maintaining it. And consider at least the following points:
- Invest in your skills as well as those of your staff
- Keep up-to-date on regulatory issues and changes
- Develop an effective system of regular review and communicate the results in a timely manner
- Adopt the concept that findings from any review are the “Keys to the Treasure Chest” in order to avoid blame and build trust
- Engage an experienced external critical evaluator to examine your approaches and assist you in overcoming groupthink situations
- Celebrate success in a fun and valid manner with all staff.
How important is board-level understanding of the Standards for RTOs when setting policy and direction?
Their understanding should be at a level that enables them to make effective decisions on policy and direction. Their decisions will come from accurate trends resulting from analysis of operational, financial and human resource information, and they need to be comfortable that this information is clearly aligned with the Standards for RTOs.
The timing should be such that it avoids ‘knee-jerk’ reactions to symptoms rather than causes. They should understand that these trends derive from the use of a range of continuous improvement tools.
Quality Assurance and Continuous Improvement
What are the most effective practices for implementing continuous improvement systems that actually result in better learning outcomes?
Implementation practices should include ensuring:
- practices have been communicated to all staff
- staff have been trained in using the systems
- data collected and recorded can be compiled into reports against specific activities such as complaints and appeals, stakeholder feedback, validation outcomes, etc.
- data is analysed to identify trends
- continuous improvement tools are then used to separate the symptoms from the causes
- cause-and-effect analysis results in actions being planned and implemented
- outcomes of actions are evaluated and compared to the initial presenting cause
- improvements are communicated to all relevant stakeholders; and
- systems are changed to incorporate the changes that have been deemed effective.
How can validation be reimagined to serve as both a compliance tool and a mechanism for pedagogical reflection?
My experience from considerable involvement in pre- and post-assessment validation activities reinforces that trainers and assessors on the one hand, and auditors on the other, look at assessment through different lenses due to their innate characteristics.
Experienced Trainers and Assessors are generally ‘Expressives’ who know instinctively that what they are assessing meets workplace expectations, whilst Auditors are ‘Analyticals’ who require evidence that the ‘atomistic’ components in a unit of competency can be assessed and evidenced.
As a ‘validator’, I must take off my natural ‘Expressive’ hat and don an ‘Analytical’ auditor’s perspective. If a performance criterion requires the learner to demonstrate how they “develop a solution, implement it, monitor its use and effectiveness, and make changes for improvement”, then an auditor is looking for evidence of each of those active verbs.
If I don’t find them all, then my approach is to ask where they are assessed, and invariably, they are assessed in a different unit or units. Of course, I’ll ask to see that unit, its assessment tools, and particularly the assessment mapping for the units to confirm validity.
In all honesty, the other difference between Trainers and Assessors and Auditors is that Trainers and Assessors understand that very rarely does a unit of competency exist as a single task in industry. Competent industry people don’t just ‘Use hand tools’. Competent people use hand tools when conducting general engineering tasks.
Consequently, consider this situation: the assessment developed by the Trainer and Assessor may not have been designed to capture all of those active verbs because another unit of competency actually assesses them! When developing an assessment for a unit, the ‘Application’ statement in the unit is an essential component to review before developing the assessment tools.
Consequently, a critical component of reimagining validation is considering clustering of assessments to represent what the activity actually looks like in the workplace and not just in the unit. This should be undertaken as a component of risk assessment when creating a validation schedule.
And remember, Pedagogy is about creating an environment that encourages critical thinking through active engagement, making learning relevant, and consequently providing a foundation for future learning. The processes of validation are important in achieving this result.
What do you believe has gone wrong with assessment validation in many RTOs, and how can it be fixed?
The answer to this question is generally covered in the response to the previous question regarding the atomistic approach by auditors compared to the practical approach by trainers and assessors. However, strong consideration needs to be given to the choice of validators, and I would always recommend the inclusion of independent validators, particularly from industry who will ask, and should be encouraged to ask, the awkward questions that inevitably result in opportunities to improve the assessment process and practices.
How can RTOs demonstrate a quality culture under the new self-assurance framework being advanced by ASQA?
The first step is ensuring all staff have a sound understanding and show commitment to the quality principles that are based on meeting the requirements of the customer consistently, compared to the principles of compliance that focus on meeting the requirements of the standards.
Consequently, a concept of quality assurance and self-assurance that should be promoted is one of satisfying the requirements of both internal and external customers. ISO 9001-based quality systems recommend that the first step in achieving this is to produce a process flow diagram of the organisation’s activities and support mechanisms.
This approach, when clearly documented, identifies the relationship between inputs and outputs in each operational task and, more importantly, the relationship between internal customers and external customers. It clearly identifies that if an RTO wants to satisfy the requirements of its external customers, then it needs to satisfy the requirements of its internal customers first.
This approach supports good communication, the use of teamwork, and the trust that internal customers have in each other as they depend on timely and accurate information to be transmitted. The result is confidence in the systems and everyone’s role in supporting them. Confidence is what self-assurance is about.
This internal and external customer approach, typically represented as a process flow diagram, defines self-assurance as meeting the requirements of the customer.

What metrics or indicators would you recommend that RTOs use to measure the health of their compliance systems?
There is a saying that quality professionals use when describing metrics. It’s based on the concept that “if you can measure it, then you can control it!” Read through the comments below, and you’ll see how that statement needs to change.
RTOs should have established a series of KPIs to inform their business planning and marketing strategies. Consequently, measures established against these KPIs should be the basis for determining the health of compliance and quality systems.
These measures should cover:
- student focus: progression, completion and satisfaction
- industry satisfaction
- third-party performance
- compliance and quality: results of internal reviews, validation, customer complaints and appeals; and
- financial performance: ASQA’s Financial Viability Risk Assessment, profitability, employee costs and cash flow forecasts.
The main issue that I observe is that data is gathered and generally represented graphically, but not necessarily adequately analysed to determine trends, separate symptoms from causes, develop action plans, implement solutions and evaluate their effectiveness.
Consider changing the phrase that quality professionals use to: “if you can measure it, then you can control it, provided you have sufficient data to avoid knee-jerking into solutions, and the data has been analysed and appropriately acted upon”.
Audit and Compliance Reform
From your perspective, what are the biggest strengths and weaknesses of ASQA’s new progressive audit model?
The strengths are that ASQA has clearly defined the features and steps in the audit process. The features are:
- Risk-based: the model uses data and intelligence to understand risks, and the scope and scale of audits are informed by this analysis, along with information from students, industry, and a provider’s profile.
- Outcomes-focused: it prioritises the actual delivery of high-quality training and the results for students, rather than just an RTO’s systems and processes.
- Student journey-focused: audits are designed to follow the “journey” of a student to get a holistic view of the provider’s practices.
- Proactive and behavioural: the model focuses on the actual practices and behaviours of providers, with ASQA engaging with providers using communication and education to drive positive behaviours.
The five-step process involves:
- Planning and initial contact: ASQA contacts the RTO, requests preliminary information, plans the scope of the audit, and holds an opening meeting.
- Compliance testing: ASQA collects and analyses evidence through various means, including information requests, virtual or onsite observations, and interviews with staff and students.
- Closing meeting: ASQA holds a meeting to explain its findings to the RTO.
- Reporting: a performance assessment report is written to identify areas of compliance and non-compliance.
- Next steps: ASQA shares the findings with the RTO and outlines the next steps.
In my past sales training role, I always taught budding salespeople: “You don’t sell features, you sell benefits!” Benefits are what customers use to make their decision to buy or, in this case, ‘buy in’. A benefits statement that accompanies the features would be well received by RTOs and assist in developing their belief and, more importantly, their confidence (self-assurance) in the regulator. In my considered opinion, confidence in the regulator has not been effectively promoted as being an important component of an RTO's self-assurance, and it should be.
How can RTOs prepare for ASQA’s performance assessments that now focus on continuous self-assurance rather than reactive audits?
Several earlier questions in this article relate to developing a quality culture, and their responses provide part of the answer here. The first step should be for the RTO to clearly communicate, through leadership commitment and involvement, that this is an opportunity to manage quality and compliance progressively rather than seeing it as a ‘punishing and unnecessary event’ that may only happen once a year in order to submit the Annual Declaration of Compliance.
Progressive monitoring with an effective improvement regime takes the sting out of an annual realisation that we’ve had it wrong for so long.
Understand that merely changing the word from ‘Audit’ to ‘Performance Assessment’ means little if the negative experiences associated with being exposed to it continue to affect morale and self-esteem. I can remember an edict released years ago to remove the word ‘catastrophe’ from its association with a disaster and replace it with ‘event’, when the outcome was actually the same. The words ‘absolute event’ don’t carry anywhere near as much emotion as ‘absolute catastrophe’, but the fear, loss of self-esteem and emotional responses were the same. Words mean nothing; it’s actions that are important. Reward positive achievement, and address less-than-positive outcomes through appreciative enquiry focused on what you do well rather than what you do wrong.
What legal and operational mistakes do RTOs make most frequently when appealing ASQA decisions?
I’m not qualified to provide a legal viewpoint; however, when RTOs look back on what they provided from an operational viewpoint, they generally realise – whoops, I didn’t respond correctly or systemically to what was required.
My recommendation is to organise for external experts, both operational and legal, to review your response.
In light of the 2025 SRTOs changes, how will capability-based assurance reshape how we measure training quality?
Capability-based assurance is about an RTO, through its governance, staff and systems, demonstrating evidence that it delivers high-quality and consistent assessment outcomes.
Each RTO will need to determine what this evidence looks like for its own operations. For all RTOs, the focus will no longer be solely on administrative procedural documents that merely describe the process, but rather on providing evidence that the outcomes of those processes meet customer requirements.
I’ll often say to an RTO, “You don’t need a belt and braces to hold your pants up”. And for some of them, there also appears to be the use of ‘sky hooks’ and ‘gaffer tape’: megabytes of words and ‘minibytes’ of action.
In summary, if an RTO can clearly demonstrate, and provide evidence of, leadership communication, accountability, risk management, training of all staff, effective implementation of controls and a commitment to continuous improvement, it will demonstrate how it measures training quality.
Do you believe ASQA’s engagement-first approach is achieving its goal of improving regulatory trust across the sector?
I’m not aware of enough evidence related to the current engagement-first approach to make an informed judgment; however, I can provide first-hand knowledge of how ASQA used a similar approach in its audit of RTOs that had a Delegation to manage their own scope of registration.
I was appointed by ASQA to their panel of Delegation Auditors. There were about a dozen auditors, selected from across the country several years ago, to conduct delegation audits. Approximately 20 RTOs with Delegation status were selected by ASQA for audit and were given approximately six (6) months to prepare and be subject to an external audit.
One of the main features of this approach was that RTOs could select which auditor they wanted to use from the panel. The benefit was that, over a period of months, a relationship could be established between the auditor and the RTO. This preparation on both sides helped to develop trust, ensured no surprises at the audit, and resulted in an audit that progressed in a professional manner, enabling communication of issues and their resolution that had been discussed months earlier.
Five of the six requirements for Delegation were based on the quality of the requirements being met and recorded as Met or Not Met; the other requirement was a compliance requirement recorded as Compliant or Not Compliant. However, an overall Rating of Findings was also recorded, and outcomes were rated as Poor, Adequate, Good or Excellent.
In my considered opinion, and through feedback from the eleven (11) RTOs that I audited out of the 20 on the list, they believed the following opportunities were a welcome difference compared to previous audit experiences:
- selecting their auditor
- developing a professional working relationship through the journey to audit
- communicating progressively during the lead-up to the audit; and
- being involved in a comfortable experience with no surprises.
In summary, the answer to this question is potentially yes. I wonder what the appetite would be for reproducing the Delegation audit system?
Teaching, Assessment, and Professional Development
What are your top recommendations for maintaining trainer and assessor currency effectively without overwhelming staff?
Firstly, my experience is that most trainers and assessors maintain their vocational currency through a range of activities and events they attend throughout the year. However, many fail to record these activities in a way that aligns them with the delivery and assessment they conduct.
Secondly, consider the concept of ‘shelf-life’, or ‘use-by-date’ when planning vocational currency activities. This concept is based on recognising that many vocational tasks undertaken in the past are the same as those undertaken today. In effect, they are not ‘out-of-date’ and returning to industry to perform them would do very little to maintain a trainer and assessor’s currency.
The 2025 standards reinforce the need to determine, through ongoing industry engagement, what trainers and assessors need to do to maintain vocational currency. I see lots of examples of ‘the how’ to maintain currency, e.g. return to industry, attend industry events, subscribe to industry publications, but generally very little on ‘the what’ should be focused on during this time, related to the results from industry engagement.
Thirdly, consider developing a list of topics, based on industry engagement, that identify the latest technology, techniques, materials, legislation, etc. and compile them into a schedule that indicates how often these should be undertaken, the location, the provider, etc. This may result in some topics being undertaken at least annually and others once or twice over a five-year period. The process that is developed to support this approach reinforces the need to maintain evidence of what was done and align it to an outcome of engagement with industry.
Fourthly, revisit the schedule at least annually based on the outcomes of industry engagement strategies, make adjustments and implement the revised activities.
How do you conduct professional development that produces measurable improvements in assessment design and delivery?
Reinforce the need to firstly consider what the unit or units of competency ‘look like’ in the workplace. This enables a holistic consideration of the relationship of the activities in a unit to activities in other units in the context of the workplace.
This initial approach enables consideration of clustering of assessment activities based on workplace tasks and/or common activities within units, such as planning, recording of actions taken, PPE, clean-up, etc. Ensure the approach to clustering common activities takes into account the context in which the units are delivered, particularly with respect to PPE.
From this point, the process requires the selection of the appropriate mode of assessment and consideration of the opportunity to assess at least some knowledge evidence through application of that knowledge in practical activities.
Conduct pre-use validation on the assessments produced and check that the assessment mapping, benchmarks, learner and assessor instructions clearly reinforce the Principles of Assessment and have the capacity to produce evidence that will satisfy the Rules of Evidence.
Trial the assessment, have a peer review the process, moderate the outcomes and make improvements.
What critical gaps exist in current trainer preparation programs, and how might the VET sector bridge them?
The Certificate IV in Training and Assessment has always been suggested to be an entry-level qualification, which I agree with. However, I’m not convinced that proceeding to acquire a complete TAE Diploma, etc., is a necessary step for most trainers and assessors.
I acknowledge there are ‘Skill Sets’ that may be relevant, but I’m not convinced that the practical need in the learning environment for daily use of educational psychology, sociology and philosophy has been adequately considered. These topics I consider to be a major gap, and I vividly remember during my initial ‘teacher training’ in 1976 struggling with the relevance of educational psychology, sociology and philosophy; and I was never more surprised than when I entered the learning environment to see their value and importance.
How can recognition of prior learning (RPL) processes be simplified while maintaining integrity and compliance?
We’ve been continually reminded that RPL is just like any other assessment process. And yes, the Principles of Assessment (PoA) and the Rules of Evidence (RoE) do need to be met. However, I believe, through my experiences of delivering PD on RPL around the country, as well as at an ASQA Assessor Moderation session, and conducting RPL, that the most overlooked element is the application of the Principles of Assessment.
How often do I see RPL candidates being asked to interpret units of competency (invalid, unfair and not reliable) and to provide examples of evidence for every component of the units (unfair and inflexible)?
The RPL process should be very simple and involve:
- An application that requires minimum evidence of past experience at that point in time
- A good self-assessment that acts as an opportunity for the candidate to say, either this isn’t for me, or build their confidence for the next step in the process
- A Competency or Professional Conversation that asks questions phrased in a manner that elicits evidence of practical examples performed in the workplace, e.g. “How did you …?” rather than “How would you …?”
- An opportunity to flexibly switch between the conversation and the use of practical activities that reflect the environment that the candidate worked in, which may be different to the context that the RTO delivers in.
- Practical consideration of whether supplementary evidence needs to be gathered from third parties. Consideration may include privacy, confidentiality, intellectual property, and the availability of third parties
- Clear and accurate records of how the assessment evidence for each particular candidate aligned with the unit/s requirements for the assessor to make their judgement. This considers that every candidate may provide a range of evidence that is different to other candidates; and
- The opportunity to fill gaps in evidence through several options rather than just completing the training and assessment.
Could you share a few examples of innovative RPL or validation frameworks that you’ve seen successfully implemented?
The response above provides some insight into what the RPL process should involve and may give readers something to consider when adapting it to the context of their RPL cohort.
Some assessors have told me that the opportunity to observe the candidate in the workplace is invaluable in making the candidate feel comfortable. The assessor then uses their knowledge of the units and their currency in the workplace to ask competency conversation questions or observe smaller tasks to fill any gaps.
They are also ‘surrounded’ by evidence of tasks previously completed in records of jobs and conversations with colleagues and supervisors. Assessors make statements like “I was only there for 5 or 6 hours, but I was able to see evidence of 10 years of the candidate's work!”
Other assessors report on evidence they receive in video or pictorial format that ‘tells the story’. They then authenticate this evidence through a competency conversation that asks questions such as “What was involved in the planning for this task?” Again, for a competent RPL assessor, this process saves time, relates directly to the work the candidate has done, and identifies gaps that can be filled through further questioning.
The Future of the VET Sector
With the sector moving toward digital transformation, how can technology enhance audit readiness and compliance reporting?
Digital transformation can certainly assist in providing information on quality and compliance in a timely manner. However, the saying “Garbage in – Garbage out” still applies.
RTOs need to know:
- WHAT data will need to be gathered?
- WHO will gather it?
- WHERE will it be gathered?
- WHEN will it be gathered?
- HOW will it be analysed?
- WHAT tools will we use to separate the causes from the symptoms?
- HOW will we plan to change the systems to produce better outcomes? and
- WHEN and HOW will we report it to Senior Management?
Only when the steps above are designed and implemented will digital transformation aid in preparing an RTO for audit.
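The design questions above can be sketched as a structured plan with a completeness check, so that “garbage in” is caught before any digital tooling is built on top of it. This is a hypothetical sketch; the field names and the example answers are illustrative assumptions, not a prescribed template.

```python
# The WHAT/WHO/WHERE/WHEN/HOW questions as required fields of a data plan.
REQUIRED_FIELDS = (
    "what", "who", "where", "when",
    "analysis", "tools", "improvement_plan", "reporting",
)

def plan_gaps(plan: dict) -> list:
    """Return any of the design questions left unanswered (blank or missing)."""
    return [f for f in REQUIRED_FIELDS if not plan.get(f)]

plan = {
    "what": "assessment completion and validation outcomes",
    "who": "compliance officer",
    "where": "student management system",
    "when": "end of each teaching period",
    "analysis": "trend analysis across cohorts",
    "tools": "",  # cause-vs-symptom analysis tool not yet chosen
    "improvement_plan": "quarterly system review",
    "reporting": "monthly dashboard to senior management",
}
print(plan_gaps(plan))  # -> ['tools']
```

A check like this makes the point of the passage concrete: digital transformation only helps audit readiness once every one of these questions has a designed, implemented answer.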
How do you foresee AI and automation influencing VET compliance systems in the next decade?
The comment above goes some way to answering this question with respect to compliance systems, and most of the current discussion on AI concerns determining the authenticity of learner responses. I’m not sure we have a complete answer for that at this time.
However, this question should also look at how AI and automation can positively influence the quality requirements embedded throughout the 2025 Standards.
RTOs should consider the positive effects that AI can have on student engagement, well-being support, and maintaining or accelerating progression on the way to achieving successful outcomes. VETNexus, through Kerri Buttery, has been involved in a considerable amount of work on the positive use of these systems by trainers and assessors, and any PD in that area should be seriously considered.
What is your view on the balance between regulatory oversight and innovation? Are RTOs still given enough room to innovate?
It is my considered opinion that aiming for a compliant outcome has created a risk aversion to innovation, particularly associated with assessment. A few years ago, I had the opportunity to share a table with an ASQA Chief Commissioner. One of his questions to the people present was along the lines of “Why aren’t RTOs more innovative in their assessment practices?” You can imagine the response from those present! And yes, you are right, the response focused on the perceived inflexibility of ASQA auditors to see ‘outside the box’.
RTOs would love to be more creative in their approach to assessment, but can only do this if they feel safe and confident (self-assured with respect to the regulator) that they can discuss why they chose a particular approach to performance assessment with auditors who thoroughly understand the VET environment, as well as the cohort being delivered to.
This comment supports a previous comment on ASQA’s engagement-first approach, which, when fully implemented, should assist in this opportunity being realised.
What accountability mechanisms should regulators and governments build in to ensure fairer treatment of compliant, high-performing RTOs?
They should consider that some RTOs may also be subject to other systems of compliance and quality, such as the National Code 2018 (CRICOS), Higher Education, and international systems based on ISO 9001 and its interpretation against AS/ISO 21001:2019 Management Systems for Educational Organisations.
Achieving the above requirements is no mean feat, and consideration of that achievement should be made when ASQA determines the regulatory risk of a provider.
Finally, if you could rewrite one part of the national VET regulatory structure, what would it be and why?
I believe the 2025 Standards in the VET regulatory structure are now more closely aligned to the international standard AS/ISO 21001:2019, Management Systems for Educational Organisations. However, more consideration should be given to assessing the quality of performance of any supplier or third party that delivers services or training and assessment for RTOs.
A ‘guesstimate’, probably conservative, is that more than 20,000 third parties are recorded by ASQA. Consequently, greater consideration of their performance should be emphasised in the RTO Standards. One option would be to take a lead from the ISO standards, which require effective controls over, and approval of, any supplier that has a direct effect on the quality of the product.
