
Vice-Chancellor's Communications Blog


AI-Powered Universities

13 April 2023


This is an extended version of a blog which was originally published on the Times Higher Education website on 11 April 2023.

Artificial Intelligence (AI) will soon power our universities, transforming the way we undertake research, educate students, and run our institutions. Early glimpses of this are evident today, but we have only just started to scratch the surface of what AI can do for the sector.  

I have spent my professional life researching AI – and now I’m a Vice-Chancellor, I’m frequently asked about AI’s impact on universities. While I certainly don’t claim to have all the answers, I have a few thoughts from my perspective of leading Loughborough University.

AI is transforming society: we already see significant impacts on the way we live, work and play. This will become ever more pronounced as the technology improves and we think of more creative ways to deploy it. When I give talks on AI I often enliven proceedings by asking the audience to shout out an area or topic, and then I go on to describe a related application of AI. I have yet to be stumped, despite some fairly wild suggestions! However, I don’t believe that AI systems will come to dominate the world, nor take over all our jobs. AI systems will be most effective when they work in partnership with humans, making the most of the complementarity between the tasks that smart machines are good at and those at which humans excel[1].

With this perspective in mind, let me turn to universities in particular.

In terms of research, there is exciting activity in all areas of academic endeavour, be that exploring the rights of sentient machines or using novel computational techniques to discover new drugs and materials. These are, respectively, profound new research questions and fundamentally new approaches to solving complex problems. It is important to emphasise that such activities are absolutely not confined to STEM subjects, nor to those with a STEM background. All research disciplines are already, or will shortly be, influenced by AI. Many will be powered by it. There are examples of Loughborough research already embracing AI technologies and approaches.

In terms of education, there has long been the tantalising promise that AI will personalise the learning experience for each and every student. AI-powered systems could follow an individual’s progress and present content and assignments tailored to their particular learning style and ability. This truly bespoke offering would operate at a scale far beyond that available at any university today and would help meet the increasingly diverse needs of today’s student cohorts. Now, we are beginning to see elements of this vision become a reality. But we still have a long way to go to get anywhere near this degree of sophistication. Moreover, I also believe (and fervently hope!) there will always be a need for human educators to support and curate this content and to provide the inspiring and insightful connection that lies at the heart of a high-quality Loughborough education.

Much of the very recent explosion of interest in AI has been driven by the development of powerful chatbots such as ChatGPT. In this context, there is much debate about their role in completing assessments. Such bots can produce highly credible essays, answers to assignments, computer programs and blogs… although not this one, obviously. We cannot, and we should not, simply ban such systems. This is both unworkable and undesirable. Rather, we must think carefully about how these powerful tools can enhance our work. We have a duty as educators to prepare our students for a global workforce in which they will have to routinely use AI tools in a responsible manner (see the development of DIGILabs as a great local example).

Chatbots can usefully facilitate idea generation, summarise significant bodies of work and critique initial drafts. However, we need to ensure that assignments, and their chosen method of assessment, require critical thinking, independent research and understanding that cannot be outsourced to a chatbot alone. This requires a shift to authentic assessments in which students are expected to deploy knowledge for an in-depth analysis or to synthesise data from specific situations. After all, if a chatbot can get good marks for a question, it probably isn’t a very good question in the first place!

From the educator’s perspective, partnership with AI systems offers an ability to aggregate, at the course-level, feedback and analysis in real-time. We will have a reliable and objective way to identify the topics and concepts that the students find most challenging, without having to wait until after exams or being overly influenced by those who are most vocal. Locally, we are currently piloting the use of AI chatbots within specific modules and early student feedback is positive.

Possibly the least developed area is how AI can improve the way that universities operate. There are many ways in which universities are just like many other large institutions. So, AI advances in areas like HR, finance, and marketing should naturally flow into universities via standard products and services. We have already used ChatGPT to generate content for recruitment exercises and critique documents prior to publication. Going further, imagine if Loughborough were a university in which data is input only once. Data is not lost. Data is joined up to provide useful, valuable services. AI could make Loughborough, and all universities, the modern, digitally-powered institutions they should be.

What’s even more interesting, however, is where AI will have an impact on the things that are particularly prominent in, or distinctive to, being a university. Turning first to students, I see significant opportunities for AI assistants to support their journey in a joined-up manner. Such assistants could amplify (human) personal tutors by identifying relevant course options based on the millions of data points generated by each student’s individualised learning journey, highlighting and scheduling interesting extra-curricular activities and opportunities, and keeping an eye on mental health and well-being.

For staff, AI systems could automate the routine administrative tasks that we all spend too much time on (think meeting scheduling, expense claims, and form filling), summarise free-form feedback from surveys and questionnaires, and identify new connections with relevant researchers who are working in adjacent or complementary fields.

While it is clear there are many opportunities for AI-powered universities, significant challenges need to be overcome. Most AI systems need data. Lots of it. This brings in issues of privacy and ethics, data ownership, copyright, GDPR and bias. These are all genuine showstoppers if handled incorrectly. On top of this, there are issues with the way AI systems make decisions. Most of them cannot readily explain why they made a particular decision – so while the computer might say no, it cannot explain why. AI systems, and chatbots in particular, are also prone to hallucination: giving convincing, but entirely fictitious, answers. Finally, AI systems are poor at social interactions. They just don’t know how to be effective collaborators, good team players or robust challengers of human decisions. The human still has to adapt to accommodate the foibles of the machine, and this is not a great basis for an effective partnership.

I firmly believe AI will revolutionise all aspects of university life and that we should be in the vanguard of this in Loughborough. ChatGPT and I agree that “if used wisely it will make universities more effective, rewarding and inclusive for everybody.” There are challenging issues to be resolved, both with the technology and its application, but by embracing the opportunities and working in partnership, AI will help staff and students, research and education to flourish.

[1] S. D. Ramchurn, S. Stein and N. R. Jennings (2021) “Trustworthy human-AI partnerships”, iScience 24 (8), 102891.


Opinions and comment from the Vice-Chancellor, Professor Nick Jennings
