Depending on who’s doing the reading, the 2017 Employer Satisfaction Survey shows the overwhelming majority of employers think graduates are well-prepared for jobs. Or it might show universities need to do more to address the needs of graduates and employers. Or even that universities are not delivering to businesses.
In fact, you can go back in time even further. In the 1950s, the Report of the Committee on Australian Universities called upon both universities and government to do more to work with industry to identify future labour demand and offer courses accordingly.
The release of the latest Graduate Outcomes Survey had a similar effect in focussing attention on higher education performance.
A focus on graduate employability is not surprising. What is surprising is that we are still measuring university performance in largely the same ways we have for decades, when more accurate means exist.
Why the way we use the surveys is flawed
Like all surveys, the Employer Satisfaction Survey has to account for and overcome a number of factors that can affect the validity of its results. For example, 4,348 survey responses sounds like a lot, but this represents only 9.3% of all possible employer contacts.
Also, the way employers are contacted is a problem. It's the graduate who is contacted and invited to provide their supervisor's details to the survey team. So the surveyors start with almost 100,000 graduate contacts, of whom fewer than 10% provide their supervisor's details; of those supervisors, fewer than half participate in the survey.
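The funnel described above can be sketched with some rough arithmetic. Only the roughly 100,000 graduate contacts and the 4,348 final responses come from the survey itself; the intermediate rates below are illustrative assumptions consistent with "less than 10%" and "less than half":

```python
# Rough sketch of the Employer Satisfaction Survey response funnel.
# The ~100,000 starting contacts and 4,348 final responses are reported
# figures; the two intermediate rates are illustrative assumptions.

graduate_contacts = 100_000           # graduates invited to name a supervisor
supervisor_details_rate = 0.095       # "less than 10%" name a supervisor (assumed)
supervisor_participation_rate = 0.46  # "less than half" of supervisors respond (assumed)

supervisors_named = round(graduate_contacts * supervisor_details_rate)
responses = round(supervisors_named * supervisor_participation_rate)

print(f"Supervisors named: {supervisors_named}")  # roughly 9,500
print(f"Survey responses:  {responses}")          # close to the reported 4,348
print(f"Share of initial contacts: {responses / graduate_contacts:.1%}")
```

Under these assumed rates, the survey ends up hearing from well under 5% of the graduates it started with, which is why the headline response count overstates its reach.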
Another issue is the survey relies largely on subjective measurements of perception. For example, data shows the supervisors of graduates are more likely than the graduates themselves to think the graduate’s qualification is important. Two perceptions of the same qualification in the same context – which one, if either, is right?
The Graduate Outcomes Survey also relies on graduates being willing to complete the survey. The latest survey had a response rate of 45%, which is very good for surveys. But the survey is sent out only four months after graduation. It does not, then, necessarily reflect the short-term – let alone medium- or long-term – employment prospects for the individual.
This is not to say the methodologies underpinning these surveys are not robust, or that the Social Research Centre, which delivers the surveys, is not expert in its field. They are, and it is. When a survey is the best option for gathering data, then these types of survey should be run. But we shouldn't be using findings such as these to measure university performance when there are better options available.
The missing link
For decades now, there has been an administrative link between a graduate’s education and taxation records. If domestic students have ever wondered why they are issued with a Commonwealth Higher Education Student Support Number (CHESSN), and why they need to provide their Tax File Number (TFN) to the university, this is the reason.
The CHESSN tracks their educational history, even when they change courses or institutions. Consequently, it also keeps track of their HECS-HELP debt. By linking the CHESSN to the TFN, a record of the debt can be provided to the Australian Taxation Office for future collection.
This administrative link could also be used to provide accurate and detailed longitudinal analyses of which jobs all graduates end up in, not just those motivated to respond to a survey. As time passes – or by going back further into the records – detailed pictures can be provided about how graduates perform over time, which organisations recruit and retain the most graduates, which courses show evidence of greater graduate mobility, actual lifetime earnings (as opposed to predicted), and so on.
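In essence, the kind of longitudinal analysis described above is a join of two record sets on a shared identifier. The sketch below is purely hypothetical: the real CHESSN-TFN link is an administrative arrangement, not a public dataset, and every field name and figure here is invented for illustration:

```python
# Hypothetical sketch of linking education records to taxation records.
# All record layouts, field names and figures are invented; only the idea
# of joining on a shared identifier reflects the arrangement in the article.

education_records = [
    {"chessn": "A001", "course": "Nursing", "completed": 2012},
    {"chessn": "A002", "course": "Law", "completed": 2012},
]

# The administrative link: each CHESSN is associated with a TFN.
chessn_to_tfn = {"A001": "T9001", "A002": "T9002"}

# Tax records keyed by TFN (illustrative incomes only).
tax_records = {
    "T9001": {"income_2017": 72_000},
    "T9002": {"income_2017": 95_000},
}

# Join the two datasets on the shared identifier to follow actual
# earnings over time, rather than relying on self-reported survey data.
linked = []
for rec in education_records:
    tfn = chessn_to_tfn[rec["chessn"]]
    linked.append({**rec, **tax_records[tfn]})

for row in linked:
    print(row["course"], row["completed"], row["income_2017"])
```

The point of the sketch is simply that once the identifiers are linked, outcomes for every graduate can be observed directly from administrative data, not just for the minority motivated to answer a survey.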
Graduate outcomes would also be better contextualised against non-graduate outcomes, as well as national and international labour market trends. One-off, or purpose-specific analyses could be more easily provided to address specific government or community concerns as and when they arise.
If the government were to make key findings of these analyses publicly available on a regular basis, students, politicians and policymakers would be able to make much more informed decisions regarding future study requirements. The current surveys would still be important, as they can provide additional information government records cannot. But when it comes to measuring university performance, hard data is the key.
Linking government records in this way is a sensitive issue. It would almost certainly require specific legislation to ensure data privacy. The current arrangement between the Department of Education and Training and the ATO is restricted, essentially, to the department providing only the basic information required to report the student debt. The legislation at the time was not designed to allow wider sharing of student information between the two organisations.
Also, there would need to be a significant investment in the right infrastructure and systems to ensure the data was protected and analysed appropriately. Given the centrality of graduate employability to government higher education policy, now might be the time for this investment.
Tim Pitman does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.