Guide last updated by Katie Radford, February 2025
This guide was created by Katie Radford, Access Librarian at the IALS Library.
Email katie.radford@sas.ac.uk
Reference Desk hours:
Monday-Friday 9:30-17:00
Tel (0)20 7862 5790
Email ials@sas.ac.uk
Generative AI is a type of artificial intelligence that uses machine learning models to generate new content, including text or images. One such type of model is the large language model (LLM), which generates content in the form of natural language by predicting which words are most likely to follow in a given context. These predictions are based on the patterns the model has learned from the datasets on which it was trained.
Generative AI tools like ChatGPT and Gemini can be valuable as part of academic study and writing in various ways. This guide will discuss ways in which you can use these tools to support your academic work whilst ensuring that you respect the principle of academic integrity.
The first thing you should do is check your institution's guidance on the use of AI in academic work. Many universities publish guidance on using AI on their websites, often as part of broader information on academic integrity.
However, given that these technologies have only recently become widely available, your institution may not yet have one cohesive approach to the use of AI. As a result, individual course tutors might give different guidance on the use of generative AI tools. Check with your course tutor or supervisor whether the use of generative AI tools is permitted in any capacity when an assignment is set.
Any use of generative AI in assessed work must be in line with existing policies surrounding academic integrity and plagiarism. Academic integrity policies aim to ensure that no student has an unfair advantage over another, by requiring honesty and openness in academic practice.
The key principle behind academic integrity is that any work you produce must be your own, and you must acknowledge and reference any ideas in your work that are not your own. Accordingly, presenting content generated by AI as if it were your own work, without appropriate acknowledgement, is likely to constitute academic misconduct.
The University of Cambridge has provided the following definition of academic misconduct involving AI:
“A student using any unacknowledged content generated by artificial intelligence within a summative assessment as though it is their own work constitutes academic misconduct, unless explicitly stated otherwise in the assessment brief.” University of Cambridge, 'Artificial Intelligence' (2025)
Remember that academic misconduct is taken seriously by institutions and can lead to disciplinary procedures and penalties, including failure of modules and even entire degree programmes.
The key point to remember is that any work submitted must be your own. It may seem clear that asking another person to write your assignment for you and then presenting it as your own work would be a breach of academic integrity. In the same way, it would be a breach of academic integrity to ask a generative AI tool like ChatGPT to write part of your assignment for you and present it as your own.
However, AI can be helpful when preparing for assignments, as long as it is not being used to produce work for you. The next section outlines some of the ways in which AI can be used in your academic work.
The following are some examples of the sort of ways in which you can use generative AI to help you prepare an assignment. Good uses of generative AI tools are those that enable you to enhance your own study or writing process, without producing content or ideas for you.
If you are not sure whether you should be using AI in a particular way, imagine the AI as another person and ask yourself: would this be acceptable if it were another person? If the answer is no, you shouldn’t be using generative AI for this purpose in your work either.
It is also important to bear in mind the weaknesses of generative AI when using it for any purpose. Some of the limitations of AI tools when used for academic study and writing are:
It’s often not possible to identify the source of the information produced by an AI tool. This is a problem in academic settings as acknowledging and citing sources are integral parts of academic writing. Any ideas in your written work that are not your own must be cited, and so you may be breaching academic integrity if you use information generated by AI tools for which you cannot identify and acknowledge the source.
The information produced by generative AI tools can be inaccurate, out of date, and in some cases completely fictitious. These models generate new content by predicting what is likely to follow in the given context, based on the datasets on which they have been trained. Because this information is generated by prediction, AI tools may produce content that is plausible in context but factually incorrect. For example, AI tools have been known to generate citations to articles that do not exist.
The responses of generative AI tools may replicate bias inherent in the datasets on which they have been trained. This can be a problem as many AI tools are trained on publicly available information from the internet which contains views that may be biased or prejudiced. These views can then be reflected in the generated content.
Copyright and intellectual property
Generative AI tools raise a host of questions about copyright and intellectual property, including concerns surrounding the data included in the datasets used to train the models, as well as the ownership of any output produced.
Remember the limitations when using AI and do not assume that everything it produces will be correct. Ensure that you critically evaluate the output of AI tools, as you would when using other sources in academic work.
Newcastle University has produced a useful document entitled Critical Evaluation: Evaluating information generated by AI, which provides guidance on how to evaluate the output of AI. This document outlines five key areas to consider when critically evaluating any information generated by AI tools:
If you are permitted to use generative AI tools, you must make sure you acknowledge and reference them appropriately. Bearing in mind academic integrity, you should explain honestly if and how they have been used.
It is worth checking with your university and tutor how they require you to cite AI in your assignments, as guidance varies between institutions. It may be sufficient to reference AI in the same way that you would with any other source, in accordance with the required citation style.
There is currently no formal guidance in the 4th edition of OSCOLA on how to reference generative AI. In the absence of formal guidance, it has been suggested that you should cite it in the same way as personal correspondence.
Newcastle University has produced a useful guide on referencing AI using different citation styles, including OSCOLA.
Until official guidance is released, you will need to reference generative AI output as personal communication in OSCOLA. Personal communications are cited in footnotes only.
Footnote format: Footnote number. Form of communication from Author (Date).
Footnote example: 2. ChatGPT 3 response to prompt to outline 3 reasons why children can't form circles from OpenAI (7 February 2023).
Newcastle University, 'Citing ChatGPT and other generative AI' (2023).
However, some institutions require that any use of AI should be acknowledged separately from your references list in an acknowledgements section.
For example, UCL guidance on Academic Integrity suggests that it is not appropriate simply to add the generative AI tool to your references list or bibliography. The point of the references list is to enable the reader to identify the original source of an idea. It is not possible to identify the original source of information generated by an AI tool, so it shouldn’t be added to your list of references. As such, the UCL guidance suggests that you should include an acknowledgement of your use of AI in an Appendix or Methods section, with details such as the following:
Name and version of the generative AI system used; e.g. ChatGPT-3.5
Publisher (company that made the AI system); e.g. OpenAI
URL of the AI system.
Brief description (single sentence) of context in which the tool was used.
For example: I acknowledge the use of ChatGPT 3.5 (Open AI, https://chat.openai.com) to summarise my initial notes and to proofread my final draft.
University College London, 'Acknowledging the use of AI and referencing AI' (2023).
Bear in mind that generative AI content is a “nonrecoverable source”, so individual content cannot be linked to or retrieved directly (Newcastle University, 2023). There is a Google Chrome extension called ShareGPT that allows you to generate a unique URL, which you can use in your acknowledgements if this is required.
You may also be required to include details on the prompts used and exactly the content that was generated.
Generative AI tools can be helpful in various ways for your academic study and writing. If you do use AI for any assessed work, remember to bear in mind the following key points: