The rise of artificial-intelligence-powered tools is influencing research across many disciplines. But with these new tools come new responsibilities. For PhD students navigating the shifting landscape of AI in academia, AI literacy is becoming an essential skill.
So what does AI literacy actually mean for PhD students, and how can you develop it?
1. Understand what AI tools can (and can’t) do
The first step in AI literacy is to understand what current AI models, especially large language models (LLMs) like ChatGPT, are capable of, how they work and what limitations they have.
This blog post from the London School of Economics is a good starting point for reading about how LLMs generate text (or speech) without getting bogged down in technical details. The blog makes the crucial point that text produced by generative AI needs to be checked, edited and updated for accuracy by a human. Being literate in AI means knowing when an AI tool is useful for PhD work, and when it isn’t.
2. Stay informed about your university’s policies
The use of generative AI tools in research is a fast-moving area. Always check your university’s policies, including those on:
- plagiarism
- research integrity
- academic integrity
- the PGR code of conduct
Policies in these areas are expected to evolve, so reading and engaging with policy updates will be essential. Take some time to discuss your use of generative AI tools with your supervisor too, and find out their standpoint.
3. How are postgraduate students using AI tools?
A recent study surveyed 75 postgraduate students across 19 different academic institutions in the UK. The findings showed that the most prevalent uses of AI tools by students were to:
- Save time, particularly in the early stages of research where the amount of information available may seem overwhelming
- Edit text, either for spelling and grammar or to improve the writing style
- Act as a colleague, helping to generate ideas and serving as a source of support or reassurance
Surveyed students showed an awareness of the limitations and ethical issues connected to the use of AI tools. However, the authors concluded that there is a need for greater clarity about the appropriate limits of AI tool use.
4. Using generative AI tools for more than just words
While AI tools are often used for working with text, it’s also important to consider how they can be used to generate code, particularly in computer science education research. The Alan Turing Institute has a useful blog which includes suggestions of AI tools to use, some suggested tasks and helpful guidance on when to be cautious about AI-generated code.
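To make that caution concrete, here is a minimal sketch in Python (a hypothetical scenario of our own, not taken from the Turing Institute blog) showing how comparing generated code against a trusted implementation can surface a subtle bug before it reaches your results:

```python
# Hypothetical scenario: an AI assistant was asked for a function that
# computes the sample standard deviation of some measurements.
import statistics

def sample_std(values):
    """AI-suggested implementation (contains a subtle bug)."""
    mean = sum(values) / len(values)
    # Bug: dividing by n gives the *population* standard deviation;
    # the sample standard deviation should divide by n - 1.
    return (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
expected = statistics.stdev(data)  # trusted standard-library result (n - 1)
actual = sample_std(data)

print(f"standard library: {expected:.4f}, AI-suggested code: {actual:.4f}")
if abs(expected - actual) > 1e-9:
    print("Mismatch: the generated code needs a human fix before use.")
```

The habit generalises: before relying on any generated code, run it against cases where you already know the answer.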
Another consideration is the use of AI tools to generate images. Context is important here: an AI-generated image that serves as an illustration in a presentation is a very different scenario from using an AI tool to create images and presenting these as results in a publication. This shows that deciding when to use an AI tool requires careful thought.
5. Consensus is not yet established
There are currently many “grey areas” where the use of AI tools in research is not yet agreed. An example is a recent blog post sharing the results of an online poll with responses from 5,000 researchers. The results showed a variety of perspectives on when to use AI in research, as well as some disagreement about when to disclose the use of AI tools. The poll did not capture different user perspectives, such as whether the researcher spoke English as an additional language or required additional support for neurological differences such as dyslexia.
Here’s a link to the poll questions used in the survey.
Summary
Being an AI-literate postgraduate student means taking time to understand how AI tools work, engaging with them critically, understanding the limits of their use and staying curious about how they can support postgraduate research activities.
Further reading
English, R., Nash, R., & Mackenzie, H. (2025). ‘A rather stupid but always available brainstorming partner’: Use and understanding of Generative AI by UK postgraduate researchers. Innovations in Education and Teaching International, 1–15. https://doi.org/10.1080/14703297.2024.2446236
Iatrellis, O., Bania, A., Samaras, N., Kosmopoulou, I., & Panagiotakopoulos, T. (2025). ChatGPT in PhD Mentoring: Exploring the Potential of Generative AI for Academic Guidance and Sustainable Educational Practices. [Pre-print].
