Three steps to an AI-empowered university

In my last article I argued that universities must own their own AI. Not rent it from Silicon Valley. Not outsource it to consultants. Not rely on vendors who sell the same intellectual property to multiple clients while charging universities handsomely to access their own data. As The Saturday Paper recently described in its coverage of the Australian National University consultancy saga, this practice amounts to “mind-boggling stupidity” — millions of dollars paid to firms like Nous Group for work based on data that should never have left campus in the first place. Universities must own their AI because whoever controls the institutional brain controls the institution itself.

But ownership is only the beginning. The harder question is: what do you do with it? How do you turn ownership into empowerment? The answer is not a five-year digital strategy destined to sit on a shelf. It is three simple, practical steps. If universities followed them, they could shift from being laggards in the AI revolution to leaders — setting the standard for transparency, accountability and effectiveness in higher education.

Step one: the AI-empowered website. Think of a Q&A function on steroids. Instead of static FAQs or labyrinthine menus, every stakeholder — student, staff, applicant, alumnus — can ask questions in natural language and receive intuitive, accurate and transparent answers. “What scholarships am I eligible for?” “When does my assignment need to be submitted?” “How many international students are enrolled in our MBA?”

These queries no longer depend on the luck of reaching the right administrator or the patience required to navigate fourteen sub-pages of a university website. They are answered instantly, consistently and, crucially, logged. Every query becomes a data point. And those data points reveal what people are really asking. Universities could quickly see where information is missing, where students are confused and where staff are overloaded, and then use that insight to improve and personalise the user experience.
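The logging idea above can be sketched in a few lines. This is a hypothetical illustration, not a real system: the names (`answer_query`, `QUERY_LOG`, `most_asked`) and the canned answer are invented, and the AI call itself is a placeholder.

```python
from collections import Counter
from datetime import datetime, timezone

QUERY_LOG = []  # in production this would be a database table, not a list


def lookup_answer(question: str):
    # Stand-in for the actual AI model; one canned answer for illustration.
    canned = {"when is the fee deadline?": "Fees are due on 1 October."}
    return canned.get(question.strip().lower())


def answer_query(user_role: str, question: str) -> str:
    """Answer a natural-language question and record it as a data point."""
    answer = lookup_answer(question)
    QUERY_LOG.append({
        "role": user_role,
        "question": question,
        "answered": answer is not None,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return answer or "Sorry, I don't know yet. A colleague will follow up."


def most_asked(n: int = 3):
    """The log reveals what people are really asking, ranked by frequency."""
    return Counter(entry["question"] for entry in QUERY_LOG).most_common(n)
```

Even this toy version shows the principle: the answer and the audit trail are produced by the same call, so insight into demand comes free with every response.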

Transparency here is not an abstract principle, but trust in action. When information is easy to find, people stop assuming it is being hidden. When every question is logged, managers can see patterns rather than firefight individual complaints. Imagine how many hours of academic and professional services staff time could be saved if even 30% of routine queries were answered automatically. And imagine the reputational value if students felt that their university was as responsive as the tech platforms they use every day.

Nick Hillman of HEPI has often spoken about the importance of “clarity of communication” in universities, particularly around admissions and fees. Yet clarity remains patchy, and frustration is common. An AI-enabled website could provide that clarity in a way no prospectus or PDF ever could.

Step two: guardrails, security and defining users. If step one is radical openness, step two is controlled access. Not all queries are created equal, and not everyone should have the same access to data. A Vice-Chancellor, for example, needs the ability to ask, “What is our total tuition income this year by source country?” or “How many staff grievances have been lodged in the Faculty of Arts?” But that is not information a junior administrator should be able to access. A department head should be able to interrogate workload models for their faculty, but not see the individual pensions data of staff elsewhere.

Guardrails don’t restrict progress; they enable it. They create accountability and clarity. If everyone knows what they can query, what they can’t, and how the system will respond, the AI becomes an institutional asset rather than a reputational risk.
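In code, a guardrail of this kind is just a permission check that runs before the AI does. The sketch below is illustrative only: the roles, domains and function names are invented, and the AI call is again a placeholder.

```python
# Map each role to the data domains it may query. Denied requests get a
# clear refusal rather than a silent failure, so the rules stay visible.
PERMISSIONS = {
    "vice_chancellor": {"finance", "hr", "admissions", "estates"},
    "department_head": {"workload", "admissions"},
    "junior_admin":    {"admissions"},
}


def can_query(role: str, domain: str) -> bool:
    """True only if this role's guardrails allow questions in this domain."""
    return domain in PERMISSIONS.get(role, set())


def run_ai_query(question: str) -> str:
    # Stand-in for the actual AI model.
    return f"[answer to: {question}]"


def guarded_query(role: str, domain: str, question: str) -> str:
    if not can_query(role, domain):
        # Refusals should be logged too: denials are data points as well.
        return f"Access denied: role '{role}' cannot query '{domain}' data."
    return run_ai_query(question)
```

The point of the design is that the rules are declared in one place and enforced everywhere, which is exactly what "everyone knows what they can query" requires.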

The importance of this is underscored by the ANU example, where institutional data was handed wholesale to external consultants. As The Saturday Paper reported, senior executives “misled ministers” about invoices and signed off on millions with minimal oversight. That episode, now a case study in failure, was not just about money wasted but about control lost. Guardrails are the antidote to that loss of control. They ensure that power resides where it should — within the university, with clear rules and clear accountability.

Step three: widgets by function. This is the real game-changer. Once the foundation is in place, universities can develop functional widgets for every part of the institution: finance, HR, estates, student information, admissions, catering, research. Each widget is an AI-driven dashboard that can be queried in plain English. “How much tuition revenue does our MBA generate?” “What is the average cost of acquiring an undergraduate student from China — broken down by marketing spend, admissions, and agent commission?” “List our catering contracts and their expiry dates.”
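One way to picture the widget model is a registry that routes each question to the function that owns the data. Everything below is invented for illustration: the widget names, the routing scheme and the sample figures are assumptions, not real university data.

```python
def finance_widget(question: str) -> str:
    revenue = {"MBA": 4_200_000}  # illustrative figure only
    if "mba" in question.lower():
        return f"MBA tuition revenue: £{revenue['MBA']:,}"
    return "Finance widget: question not recognised."


def catering_widget(question: str) -> str:
    contracts = [("Campus Cafe", "2026-07-31"), ("Halls Catering", "2025-12-31")]
    return "; ".join(f"{name} expires {date}" for name, date in contracts)


# One queryable widget per institutional function; more can be registered
# (HR, estates, admissions, research) without touching the router.
WIDGETS = {"finance": finance_widget, "catering": catering_widget}


def ask(function_area: str, question: str) -> str:
    """Route a plain-English question to the right functional widget."""
    widget = WIDGETS.get(function_area)
    if widget is None:
        return f"No widget registered for '{function_area}'."
    return widget(question)
```

In a real deployment the routing and the answers would come from the AI layer rather than keyword matching, but the architecture is the same: each function owns its widget, and the front door stays uniform.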

Right now, universities spend millions on management information systems that lock data in black boxes, accessible only to a handful of analysts and published as stale PDF reports. NashTech (2023) puts it bluntly: universities are “data rich but insight poor.” An AI-empowered university flips this model. It opens the black boxes, invites questions and delivers answers. From participation and pensions to postgraduates and parking, the principle is the same: transparency for all.

The benefits are enormous. For leaders, it means decisions are made on evidence, not hunches. For staff, it means autonomy to access the data they need without endless requests to “central.” For students, it means clarity about what their institution is doing and why. It is the difference between a university that hides behind bureaucracy and one that empowers its community to know itself.

Sceptics will say this is too ambitious. They will point to the complexity of systems, the diversity of data, the risks of misinterpretation. But every other sector is already moving in this direction. Finance, healthcare, retail — each is deploying AI systems that integrate multiple datasets and make them queryable by non-specialists. If universities do not follow suit, they will once again be disrupted rather than doing the disrupting.

Of course, there are risks. Misuse of data. Over-reliance on AI outputs. Bias creeping into algorithms. But these are not arguments for inaction; they are arguments for deliberate, responsible design. As Jason Clare, Australia’s education minister, has repeatedly argued, “transparency builds trust.” Universities have the intellectual capital to get this right. What they lack, at present, is the will to invest in their own capacity rather than endlessly funding external providers.

The opportunity cost of doing nothing is stark. Already, universities are haemorrhaging money on duplicated systems, losing credibility with students who expect better digital experiences, and ceding control of their data to third parties. In a world where international students are questioning the value of their degrees and governments are questioning the migration impact of international education, the last thing universities can afford is to look technologically obsolete. An AI-empowered university is not a luxury. It is survival.

This is why step one, step two and step three matter so much. They are not theoretical. They are practical, achievable and incremental. Any institution could begin today. Start with a Q&A front door that logs queries. Build guardrails that clarify access. Layer functional widgets to democratise data. The result would be transformative. Not just for efficiency, but for culture. Transparency would cease to be a buzzword and become a lived reality. Staff would stop feeling in the dark. Students would stop assuming information is hidden. Leaders would stop managing blind.

The final piece in this trilogy will explore what this looks like in practice: a day in the life of a Vice-Chancellor, armed with a university-owned AI. How do the questions they ask change? How does the decision-making process evolve? How does it transform their role from reactive to proactive, from defensive to strategic? Because ultimately, the measure of AI in universities will not be the elegance of the system but the quality of the decisions it enables. And that is a test no institution can afford to fail.
