AI (Artificial Intelligence) in Teaching and Assessment

Updated on 13 November 2023

Guidance for staff on the use of AI in teaching with notes on practice and aspects of generative AI (GenAI) in higher education to consider.

The artificial intelligence (AI) landscape continues to evolve at pace. This second briefing note for staff on the use of AI in education and assessment provides an update on developments together with notes on practice and outlines aspects which need careful consideration.

The first briefing note is available on CTIL's blog: Briefing note on AI in education and assessment.

Whilst AI has been working in the background of many of the digital spaces and platforms that we interact with daily, the focus is increasingly on generative AI (GenAI) technologies. These tools can generate writing in a broad range of styles on almost any topic, as well as code, presentations, artwork, graphics, video and audio. There are hundreds of GenAI tools, such as ChatGPT, Google Bard, Bing Chat, Claude, DALL-E and Gamma. GenAI is also coming to My Dundee: Blackboard can now use AI to generate images, quizzes and module outlines, and there are more developments in the pipeline.

GenAI technologies are developing and improving rapidly; they are disruptive and have the potential to be transformative. It is not realistic for universities to ban these tools or to think they can be ignored. With the professional and business worlds adopting GenAI and AI more generally, it is essential that our staff and students have the skills and knowledge to navigate the world of GenAI and understand how it can be applied.

If you have not yet used any of the GenAI tools, we would encourage you to try them out, to play and experiment with them. Get a feel for how well these tools can perform in your own discipline and specialist fields, and understand their limitations.

This briefing note and guidance touches on the following areas and aspects of GenAI in higher education:

  1. Academic integrity
  2. Privacy, ethics and risks of using GenAI tools
  3. GenAI in teaching, learning and assessment
  4. Using GenAI to support learning design and content development


Academic Integrity

University Policy

Academic integrity is central to the University’s core values of valuing people; working together; integrity; making a difference; and excellence. The expectation is that the work students submit for their assessments is solely their own work, or in the case of group work, solely the work of the group. The University Academic Misconduct by Students Code of Practice (section 2) defines the types of activities that would constitute academic misconduct. The unauthorised use of AI systems is included in this list of activities and whilst Wolfram Alpha and Sudowrite are mentioned as examples, this equally applies to ChatGPT and other AI tools. 

Where a lecturer authorises the use of AI in an assessment, the University’s guidance for students on the use of generative AI clearly details how students should acknowledge and reference its use, as they would other sources that have helped to shape and inform their work. Where the use of AI is authorised, students may be permitted to use GenAI for specific purposes to assist with their assessment or encouraged to use it more extensively as a core part of the assessment.

AI writing detection

Following the launch of ChatGPT on 30 November 2022 there was immediate concern about the integrity of assessment and how we might be able to detect whether any part of an assessment had been generated by AI. Turnitin announced their intention to switch on AI detection for all customers in April 2023. The University, along with most UK universities, decided to opt out of this launch. A key factor in this decision was that we were unable to test and evaluate the reliability of the detection tool and were not provided with independently verified data to give us sufficient confidence to implement it at that time.

Almost six months on, it is still not clear how Turnitin detects AI writing. No information has been forthcoming detailing how its detection tool identifies AI-generated writing. Meanwhile, research internationally is highlighting a range of issues with AI detection. We previously highlighted the following from Jisc:

Jisc notes: “AI detectors cannot prove conclusively that text was written by AI.” 

Michael Webb (17/3/2023), AI writing detectors – concepts and considerations, Jisc National Centre for AI

In response to calls from the UK HE sector for an update on their recommendations, Jisc have reiterated their position and guidance:

First, the information in our previous blog post ‘AI writing detectors – concepts and considerations’ still holds, so we won’t repeat much from there, just to reiterate the four points:

  • No AI detection software can conclusively prove text was written by AI
  • It is easy to defeat AI detection software
  • All AI detection software will give false positives
  • We need to consider what we are actually trying to detect as AI-assisted writing is becoming the norm.

Michael Webb (18/09/2023), AI Detection – Latest Recommendations, Jisc National Centre for AI

Our earlier guidance also highlighted that lecturers should *NOT* use unauthorised AI detection tools. This remains the case. We have not approved any of these tools and we do not have student consent to upload their work to third-party sites. It is also worth noting that OpenAI, the company behind ChatGPT, withdrew their AI classifier for indicating AI-written text:

“As of July 20, 2023, the AI classifier is no longer available due to its low rate of accuracy.”

OpenAI Blog, New AI classifier for indicating AI-written text.

A recent white paper published by Anthology (the parent company of Blackboard) also considers some of the emerging research on AI detection tools that highlight unreliability and bias against students for whom English is not their first language. This paper states:

“At Anthology, we have conducted a thorough beta test in collaboration with market-leading AI detection tools, from which Anthology and participating clients concluded that AI detection is not currently fit for purpose in education.”

Anthology, AI, Academic Integrity, and Authentic Assessment: An Ethical Path Forward for Education

If you would like to gain more insights into why AI detection tools are unlikely to be reliable indicators of AI generated writing you may want to watch our webinar with Robin Crockett, Academic Integrity Lead at the University of Northampton. Robin outlines how these tools work and how their tuning leads to higher or lower levels of false positives and how this then impacts on the likely percentages of false negative scores.

Suspected cases of AI generated assessment work

As we are now able to create a test environment for the Turnitin AI detection tool, we are evaluating it with both human-written and AI-generated work. If you have an assessment that you think is not the student’s own work, you should refer this to your School’s Associate Dean for Quality and Academic Standards (AD QAS) for review, and where appropriate this can be referred to our academic integrity panel for further review.

It is important to remember that lecturers may also be able to identify some cases where student work may include AI-generated content without the need for additional tools. GenAI is fallible and may generate inaccurate statements and make up or hallucinate references. It can also generate text that mixes UK and American spellings in the same paragraph or sentence. You may also spot a change in the tone and style of writing from previous assessments. If you spot any of these signs in work that you are marking, please refer it to your AD QAS.


Privacy, ethics and risks of using AI tools

Institutionally licenced AI tools

We appreciate that lecturers may be designing teaching and learning activities and assessments that incorporate the use of GenAI technologies. Just as with all the other digital tools that we use in our educational practice, it is important to consider issues such as GDPR, data privacy and security when using GenAI. ChatGPT, for instance, requires users to provide personal details including their mobile phone number, and students may be uncomfortable sharing these details. It is therefore important to remember that we cannot compel students to sign up to use these tools as part of their learning even if they are free to use.

Therefore, if you do choose to use GenAI tools with students, we advise using Bing Chat, which is built into the Microsoft Edge sidebar as part of our Microsoft 365 licence and can generate both text and images. Bing Chat does not retain personal data, nor does it use your data to train the underlying large language models. Staff also have access to Adobe Firefly as part of our institutional Adobe licence, which can be used to generate images and text effects. Padlet also includes a generative AI feature which can create images and is available to both staff and students. Students can of course make the choice to use other GenAI tools such as ChatGPT, Bard, Claude, Midjourney etc. The proviso with all of these AI writing, image and content generation tools is that they cannot be used in assessments unless expressly authorised.

This guidance will be updated with details of any other AI tools the University might licence for staff and student use. It is also hoped that as we progress our work around cyber security we will be able to develop a list of trusted AI and other digital technologies that can be used in teaching and learning. Similarly, a list of tools that should not be used, or indeed may be banned on campus, will also be created.

Ethical and privacy implications

As we have suggested above, we do encourage you to explore, understand and become familiar with the range of AI tools that are available, particularly Bing Chat. There are literally tens of thousands of AI tools, and it is not just the generative tools that might have an application in your teaching and research practices. Tools like ResearchRabbit and Elicit support literature review, summarising of papers and the creation of collaborative collections, and can play a useful role in helping to develop students’ research skills.

Before you sign up for these or any other AI or GenAI tools, it is important to read the terms and conditions carefully. Many of us skip reading these and just blindly accept them. Take time to understand how these tools might use your personal data and the data you share with them in prompts. Think carefully about any potential risks, ethical and privacy issues, as well as benefits before you sign up. This is particularly important for free GenAI tools; these are rarely genuinely free and often make money by using or selling your data. You might find it helpful to refer to Jisc’s article on Navigating the terms and conditions of generative AI to better understand how some of the commonly used GenAI tools such as ChatGPT, Google Bard and Claude may use your data. Similarly, you may find it helpful to share this article with students.

As you explore and evaluate GenAI tools we encourage you to adopt a thoughtful and critical approach. Be aware of some of the key ethical concerns associated with GenAI to help ensure you use these tools with care and creativity.  

  • Privacy - Sharing personal or sensitive data with an AI platform, just as with social media, has the potential for it to become publicly available. This also applies to any unpublished research data or intellectual property that you may share, and we advise against inputting this type of data into any GenAI tool.
  • Copyright - One of the criticisms of GenAI is that it does not respect copyright or credit the sources that it has been trained on. These criticisms are particularly acute in relation to creative works in art, literature and computer code, and there are several pending lawsuits against GenAI companies for breach of copyright. The News Media Alliance in the US have also published a white paper highlighting that the “pervasive copying of expressive works to train and fuel generative AI systems is copyright infringement and not fair use.” There are developments such as the UK Government’s work to develop a code of practice on copyright and AI that may in time help start to address some of these concerns.
  • Data use - GenAI companies train their models on the content that we submit, which in turn helps them generate income.
  • Misinformation - The accuracy and integrity of information generated by AI is a widespread concern. It has been well known for several years that AI has been playing a significant role in generating misinformation and deep fakes. The most recent advancements in GenAI tools have only served to make it even easier to generate and distribute misinformation. Academics are not immune to falling for false information generated by AI, as a recent case in Australia highlights, where academics used GenAI to make false accusations in a submission to a parliamentary inquiry.
  • Cognitive bias - Content generated by AI can replicate human bias. The outputs of GenAI tools will only be as inclusive and equitable as the data they have been trained on and informed by, and they may therefore be discriminatory. There are also concerns around the inclusivity of training datasets; much of the training data originates in the global north. As such it replays some of the challenges faced with Wikipedia content, with the voices of the global south and women under-represented.
  • Accessibility - Tools such as ChatGPT are not accessible in all countries due to privacy concerns, government regulations, censorship or other Internet restrictions. This is the case in China where the use of GenAI is illegal and therefore it must not be used in teaching. If you are lecturing on a distance learning programme and want to incorporate GenAI into your teaching be mindful that not all of your students may have access.
  • Exploitation - There are concerns that some of these platforms have adopted exploitative and unethical practices to cleanse their data of violent and hateful content by employing individuals in developing countries who have been traumatised by the content they have had to vet.
  • Sustainability - Concerns also prevail about the environmental impact of GenAI. These technologies require enormous computational power to process large amounts of data leading to high energy and water consumption. Also, as GenAI requires ever more powerful hardware to run it is likely that it will also contribute to the growing problem of digital waste.

These are issues that we also need to make our students aware of as we support them to develop their AI fluency and AI literacy skills.

AI and Teaching, Learning and Assessment

Integrating GenAI into your teaching to support student learning

AI technologies are already being used in many professions and areas of business that our graduates will be working in. These tools are also being used in schools and we need to ensure that we are ready to teach students who may already have developed a level of competency in using GenAI. We are also conscious that there may be pockets of hesitancy amongst staff and students and a wariness of using and engaging with GenAI. The reality is that with Microsoft’s Copilot already bringing AI capabilities to Windows 11 and with upcoming integrations into MS365, AI is going to become an integral part of the productivity tools used in workplaces globally. It is essential therefore that our teaching programmes can prepare our students to succeed and thrive in an AI pervasive society, helping them to apply and use AI ethically, critically and creatively.

Opportunities to do this will become easier as GenAI becomes increasingly embedded into the tools we already commonly use. This is already possible with Bing Chat in Microsoft Edge, which allows for social interaction between humans and AI that can support learning conversations. Mike Sharples (Professor Emeritus of Educational Technology, Open University, UK) has outlined some roles for GenAI in cooperative and social learning that can be incorporated into your teaching. These are detailed in the list below, and whilst it refers to ChatGPT this can be interchanged with Bing Chat or any other GenAI tool.

Some roles for generative AI in cooperative and social learning

  • Possibility engine - AI generates alternative ways of expressing an idea. Example: students write queries in ChatGPT and use the Regenerate response function to examine alternative responses.
  • Socratic opponent - AI acts as a respondent to develop an argument. Example: students enter prompts into ChatGPT following the structure of a conversation or debate; teachers can ask students to use ChatGPT to prepare for discussions.
  • Collaboration coach - AI helps groups to research and solve problems together. Example: working in groups, students use ChatGPT to find out information to complete tasks and assignments.
  • Co-designer - AI assists throughout the design process. Example: students ask ChatGPT for ideas about designing or updating a website, or focus on specific goals (e.g. how to make the website more accessible).
  • Exploratorium - AI provides tools to play with, explore and interpret data. Example: students use ChatGPT to explore different ways to visualise and explain a large database, such as census data.
  • Storyteller - AI creates stories that include diverse views, abilities and experiences. Example: students take it in turn to ask ChatGPT to continue a story, prompting it to include a diversity of characters.
Mike Sharples (2023) Towards social generative AI for education: theory, practices and ethics, Learning: Research and Practice, 9:2, 159-167, DOI: 10.1080/23735082.2023.2261131

Try out and test some of these activities yourself or with colleagues to gauge the level and quality of content that GenAI tools are capable of generating in your discipline. There are lots of educators writing and vlogging about how they’re incorporating GenAI into their teaching practice and sharing their experiences on LinkedIn, Twitter (now X), BlueSky, YouTube etc. 

Additional resources:

101 Creative ideas to use AI in education: A crowdsourced collection, Edited by Chrissi Nerantzi, Sandra Abegglen, Marianna Karatsiori & Antonio Martínez-Arboleda

Ethan Mollick Substack – Ethan is a professor at the Wharton School of the University of Pennsylvania and has emerged as a key writer and thought leader on working with AI. Wharton have also published an interactive crash course on AI for instructors and students with Ethan and Lilach Mollick which provides a helpful overview of AI.

AI in Education, Dr Tarsem Singh Cooner is a senior lecturer in social work at the University of Birmingham. His website includes a series of videos outlining how he uses ChatGPT to support his teaching practice and to help his students prepare for a future in which they will have easy access to GenAI.

CTIL curated Padlet – AI and Higher Education with links to research on AI, tools, guides, workshops etc. 

Student adoption to support learning

Just as staff have mixed views on using GenAI, so do our students. Some are worried that relying too heavily on or overusing GenAI will limit their own learning and the development of core skills which many employers are looking for in graduates, such as problem solving, critical thinking, analysis and critical reflection. Such concerns have already been raised by UK-based students in Jisc’s report on ‘Student perceptions of generative AI’:

Students have also expressed fears about the potential negative consequences of relying too heavily on GenAI tools. For example, some think it could impede their intellectual growth. They worry that an excessive reliance on such tools may lead to a decline in knowledge acquisition and hinder the development of critical thinking skills. (Jisc National centre for AI in tertiary education – Student perceptions of generative AI – 15 August 2023)

There are, however, students with well-developed self-directed, self-regulated learning and information literacy skills that are using GenAI and other AI tools to develop new approaches to enhancing their learning. Our guidance to students highlights some of these and reflects what students shared with us in an AI workshop. It is essential that we help all of our students to develop their information and AI literacy alongside the graduate attributes that are central to our Curriculum Design Principles.

GenAI and assessment 

The Covid pandemic and associated lockdowns had already prompted increasing reflection, review and debate about our assessment practices. The advent of AI has heightened this debate and underlined the importance of assessment design and the need for more authentic assessment. As the University progresses its review of the current assessment policy there is an opportunity to reflect on our existing assessment practices and develop new guidance that can inform new approaches to assessment that align with our Curriculum Design Principles (CDPs). Considering more authentic approaches to assessment also requires us to think about programme and learning design, and here our CDPs are key.

In the immediate term, we would encourage you to think about whether your assessments are fit for purpose. Think about the learning outcomes and how students can demonstrate these, then choose the assessment type which facilitates this. Try running your assessments through GenAI to see what type of output it generates, and consider how you can tweak the assessment to encourage students to demonstrate skills and application (rather than repetition of knowledge), which are harder for AI tools to address as they require more personal understanding or critical thinking.

Think about whether you can ask students to undertake tasks which apply their knowledge and skills to real world tasks. For example:

  • Focus on a course topic of particular interest to them and explain their interest in it as part of the assessment.
  • Relate their learning to their personal and / or professional circumstances.
  • Work with a given data set, case study etc.
  • Interpret or evaluate a real-world example or artefact.
  • Address a real-world design brief or engage with a real-world community. 
  • Undertake a task based on one which might be carried out by a professional in that discipline. 

Your School will also be developing recommendations around assessment design that take account of the various disciplines that we teach. If you are unsure about these then speak to your Associate Dean for Teaching and Learning. As these recommendations develop, we will share them more broadly so that we can learn from other disciplinary approaches.

We would also recommend taking a look at Jisc National Centre for AI’s helpful interactive menu of ideas for assessment for an AI-enabled world. It covers a range of assessment categories including controlled exams, take-home papers, quizzes and in-class tests, dissertations and coursework. Each idea also offers a rating for the authenticity of the assessment, the level of challenge, the output, and the demand on staff time. You can download this resource and use it with your programme team to help you review your current assessments.


Using GenAI to support learning design and content development

As mentioned previously, GenAI capabilities are going to be increasingly integrated into our Microsoft productivity tools. You may already have been using the AI-based Designer tool in PowerPoint to help with the design of your teaching slides. If you are developing a new lecture, seminar, workshop, handouts, case studies, simulation scenarios etc., you can also use GenAI tools to help you with initial planning or brainstorming. Tools like Bing Chat or ChatGPT can generate a class outline for you and, through further prompting, can help you develop this further, expanding the various sections. You might want to try this out with a module you teach on or a lecture that you give and compare with what you deliver.

The results can be variable and also vary across the different GenAI tools. For some topics and disciplines the results may not be particularly impressive, but these tools can help with initial ideation and move you on from a blank piece of paper to an outline learning design for your teaching. Trying an exercise like this highlights the importance of having underpinning knowledge of a topic and using that knowledge to write more considered and explicit prompts for the AI. For example, AI does have a propensity to present content based on a US setting, so you can prompt it to give examples or set the information in a UK or other national context. Again, looking at how other lecturers are using GenAI to help with the process of learning design and content development can be helpful. This video from Tarsem Singh Cooner demonstrates the pros and cons of using ChatGPT to create learning designs based on his own published research.

If you are looking for images to include in PowerPoints and other teaching resources you can use Adobe Firefly, which is available to all staff, or Bing Chat. They will generate various styles of images in response to your prompts, and you can download the images they create. Tools such as Midjourney and DALL-E can also be used.

GenAI in Blackboard

GenAI tools are also coming to Blackboard. The latest release supports the AI generation of images, module outlines, and quizzes based on module content. We have not yet enabled these features but do have a group of lecturers trying them out. Once we’ve reviewed them, we will present our recommendations to Senate Learning and Teaching Committee for review and decisions on whether we should turn these on as standard features.

Acknowledging use of GenAI

If you are making use of GenAI tools to support the development of teaching content this should be acknowledged, just as we currently expect students to reference that AI has been used where it is permitted. This can be done by including a statement on which tools have been used to help write some of the content or generate an image.

Caution: being aware of the pitfalls 

Finally, a reminder of one of the key pitfalls of using GenAI: it can make things up and hallucinate, so reviewing any outputs from GenAI is critical. Academics and professionals are not immune from being caught out here. There have been high-profile cases in the US of lawyers using AI to help draft legal submissions which have included fake case law. Meanwhile, academics were caught out making false accusations in a submission to a parliamentary inquiry in Australia.

If you do use GenAI to help you generate any type of content remember to critically review it and ensure the information you present is correct. Don’t assume all the content it generates is factually accurate.

Advice and support

To help staff apply this guidance and develop their practice in the use of AI in teaching we will be developing a number of staff development opportunities. Keep an eye out for notifications on these in the EduZone 2.0 Teams Channel, our AI Teams Channel, Staff Newsletter and other communications. We will also continue to review and update this guidance.

In the meantime, if you require specific advice or support on AI in teaching or assessment please get in touch with Natalie Lafferty and Emma Duke-Williams in CTIL.

Guide category: Staff support