Following a meeting of Senate on 8 November 2023, the University has adopted a permissive approach to GenAI in Education in order to minimise the risks and maximise the future benefits. It agreed to align the University with a set of high-level principles adopted by the Russell Group of universities, and to work collectively as a community to:
Support students and staff to become AI-literate.
Equip staff to support students to use generative AI tools effectively and appropriately in their studies.
Adapt teaching and assessment to incorporate the ethical use of generative AI and support equal access.
Ensure academic rigour and integrity are upheld.
Work collaboratively across the sector to share best practice as the technologies and their application in education evolve.
Work is already underway, led by two groups with representation from across the institution (one academic and one professional services). More information can be found on the Teaching hub; a summary is presented below, and more specific communications are planned over the next few weeks.
Key Developments
We have updated our academic integrity statement, related Quality Assurance documents, and generic guidance for referencing and acknowledging the use of AI tools in assessments.
Course teams retain full academic freedom to decide if, and to what degree, students are permitted (or not) to use GenAI in assessments.
Where students are allowed to use GenAI tools, they are NOT permitted to pass off any content created by GenAI tools as if it were their own work.
The Skills Centre has produced a self-paced module on AI Literacy for students, and a similar resource will be developed for staff.
In response to potential uncertainty around students’ permitted use of GenAI in assessments, and in recognition that this may differ depending on the learning outcomes of specific subject disciplines, EQSC has recently approved a simple tool and light-touch approach to enable staff to review their coursework.
The tool can be used to help colleagues classify and communicate their current course assessment portfolio according to three types of GenAI usage. This will help to ensure greater clarity and transparency for students. The tool can also be used to help us better future-proof our assessments by identifying quick wins to minimise risk and/or longer-term opportunities to develop, or change, assessments in light of GenAI. This approach aligns with existing work on CT (notably Assessment for Learning) and is intended to complement existing assessment plans.
We are currently working with the Associate Deans (Education) and Education Managers to ensure a joined-up approach to the review, to plan development sessions for staff, and to prepare further communications. This will be announced in the next few weeks; in the interim, the CLT have organised a series of talks from external experts.
We have discussed our approach with the Students’ Union via the Student Representatives, and they have helped to inform our plans and guidance.
Finally, we are working closely with DDAT to review a range of GenAI tools, potential licensing requirements and costs, and data privacy concerns, and to keep a watching brief on Microsoft’s Co-Pilot tools (if/when announced for Higher Education).
Please note: at present we have ruled out using so-called AI Detector Tools. The evidence so far indicates these are fundamentally flawed in concept, do not work effectively, and are prone to bias against certain groups or individual characteristics. Staff must not upload student work into such tools as this may compromise students’ intellectual property and personal data.