Artificial intelligence is here ... and it is already rewriting the rules of education

‘As educators, our job is not to shield students from AI, but to prepare them for the reality of the working world’

Handled carefully, artificial intelligence can become a powerful learning tool. Photograph: Getty Images

Artificial intelligence is everywhere. When I open Microsoft Word to draft this piece, a Copilot icon pops up, asking if I want help. One click, one prompt, and a passable first draft would be on the screen in seconds.

Every Google search begins with an AI overview, serving up neatly packaged answers – no need to scroll, no need to think. AI is now integrated into almost every digital tool we use, and the message is loud and clear: AI has arrived, and it’s not going anywhere.

For students, AI isn’t just a novelty; it’s a tool, and they’re using it more and more. A recent UK study found that 64 per cent of students now use generative AI to produce text, up from 30 per cent in 2024. Many rely on it as a learning aid, using it to clarify concepts, summarise articles and brainstorm ideas. Faced with a looming essay deadline, a student can also feed the question into ChatGPT, generate a convincingly written response, then run it through again to add academic references and a bibliography that looks – and often reads – authentic.

Given the pressures of long commutes and part-time jobs, it’s no wonder that some students are turning to these shortcuts to get work done.

Since the launch of the first version of ChatGPT in 2022, AI has exploded into our lives. Tech giants are locked in a billion-dollar arms race, churning out faster, smarter, more powerful versions of these tools with weekly regularity. While Silicon Valley accelerates, educators everywhere are left grappling with one pressing question: What does this mean for the future of learning?

The term AI usually refers to generative artificial intelligence – software that can produce text, images and other content using vast datasets and predictive algorithms. When students use these tools to generate assignments, the impact on learning is significant. At first glance, the results might seem to meet the brief, but students miss the opportunity to develop the very skills that higher education is designed to foster – critical thinking, analysis, originality and effective communication – skills highly valued by employers.

Yet, AI now makes it easy to bypass many of these learning processes. Tools such as Canva and Gamma, for instance, can generate professional presentations in minutes. For some, this feels like innovation; for others, it’s academic short-circuiting.

It also appears that an overreliance on AI could weaken cognitive function. According to brain health expert Dr Daniel Amen, if you’re not actively engaging your brain, it becomes weaker. His advice? Use AI to amplify thinking, not replace it. The challenge for educators – and their students – is to find that balance.

How are universities responding to this crisis?

At first, many institutions went into defensive mode – concerned about the impact of the technology on student learning. Some universities banned it outright; others brought back timed, in-person exams in a bid to sidestep AI completely.

Susan Galavan.

As educators, however, our job is not to shield students from AI, but to prepare them for the reality of the working world. In my profession of architecture, for instance, an estimated 59 per cent of practising architects in the UK now use AI on at least the occasional project, up from 41 per cent in 2024. These tools are already reshaping the profession – from visualising designs to managing projects and rethinking how practices operate and grow. Architectural firms are experimenting with tools that can generate faster design concepts, optimise building layouts for energy efficiency, or even flag project delays during construction.

While some architects fear a loss of creativity, others see AI as a collaborator that handles some of the heavy lifting, freeing them to focus on creative vision and human experience. There are, of course, difficult legal and ethical questions which we must address. But by the time my first-year students graduate in 2030, artificial intelligence is going to be a fundamental part of their working lives. Our role as educators, therefore, must evolve from policing technology to teaching how to use it critically and ethically.

Three years on from the launch of ChatGPT, universities are now moving from prohibition to engagement with AI. Academics across disciplines are joining forces to rethink what teaching, learning and assessment should look like in an age of intelligent machines.

What they are finding is that a radical redesign of assessment is required. The first step is to adapt – or abandon – assignments that AI can easily complete. The second is to shift the focus to the process of learning itself, and to test that learning through more in-class engagement, learning journals and oral exams. Some lecturers are showing students how to use AI as a “thinking partner”, helping them explore ideas, analyse data or refine arguments ethically and transparently.

There’s also growing recognition that generative AI can enrich learning. It can offer personalised learning through AI-powered tutors, for example, or provide new ways to engage students through interactive platforms. The challenge now is balance – harnessing innovation without losing the authenticity and depth that define true education. Not everyone is on board, but the universities that strike that balance will be the ones that thrive in the AI era.

Without guidance, students may treat what artificial intelligence produces as gospel – blindly trusting its outputs without questioning the source, the logic or the accuracy. But AI isn’t always reliable. It can fabricate facts (“hallucinate”), replicate bias and rely on outdated data. It produces polished text within seconds that sounds convincing but can be wrong. There are also many ethical and legal concerns, and the rapid expansion of data centres is likely to have significant environmental implications. That’s why AI literacy – understanding how these tools work and where they fail – is now as essential as digital literacy once was.

Let’s be clear – the technology is useful. In writing this article, I used it to reframe my writing, sharpen language, and improve the flow. It saved time. It streamlined the process. It even opened up other ways of looking at the problem.

But it got things wrong sometimes, and it didn’t do the thinking. I’m still the author and originator of these ideas and thoughts, based on 30 years’ experience in my profession. I also decided how and when to use the tool: what is useful, what is not. This is a key point when it comes to adapting to AI: understanding when to lean on the machine and when to rely on our own minds.

This isn’t the first time technology has disrupted the status quo.

In the 19th century, the arrival of steam power sparked fears of mass unemployment. Jobs were lost, but new ones were also created in the factories, offices and institutions which sprang up to support industrialisation. Fast forward to the 1970s, when calculators caused panic in classrooms. Would students ever learn to add again? Of course they did, because we adapted. We taught them when and how to use the calculator wisely, not blindly.

Artificial intelligence demands a similar response.

The future isn’t AI versus educators. It’s AI with educators. We can’t wait around for governments to draft policy – we must lead. Our job as educators is to guide students through the noise, to help them question, analyse, and create with intelligence, integrity, and insight.

Change is never easy. But neither was teaching during the pandemic. The difference here is that AI won’t go away; it will only become smarter. Our challenge is not to stop the technology, but to shape it – ensuring students remain active thinkers, not passive consumers.

If we adapt wisely, artificial intelligence can become one of the most powerful learning tools ever created. But if we ignore it, we risk losing the very essence of higher-level education: the ability to think for ourselves.

  • Dr Susan Galavan is an architect and lecturer in the Department of Architecture, Atlantic Technological University, Sligo