Guidance for University of Cambridge Staff on the Administrative Use of Generative AI

This page provides guidance on the use of Generative Artificial Intelligence (GenAI) for administrative tasks in support of University activities, and links to sources of further information.

 

A summary of key points is also available here.

 

Scope

Please be aware this guidance only applies in limited circumstances, as described below.

Scope of this guidance: technology

This guidance exclusively concerns the use of GenAI. The term GenAI typically refers to any type of AI that can be instructed to produce content.

Large Language Models (LLMs), such as Microsoft Copilot, OpenAI ChatGPT and Google Gemini (formerly Bard), are part of this category of AI. They produce text-based outputs based on written or verbal instructions from human users. Such instructions are sometimes referred to as natural language text prompts.

There are also GenAI tools such as DALL·E, Midjourney and Stable Diffusion that produce digital imagery based on written or verbal instructions.

In addition to the standalone GenAI tools described above, GenAI is increasingly incorporated into software and systems to enhance their capabilities.

This guidance applies both to the use of standalone GenAI tools and to GenAI features within software and systems. It does not apply to the use of systems or software that include GenAI features where those features are not being used.

This guidance does not apply to other forms of AI that may be conceived, developed or utilised across the University.

Nevertheless, some of the broad principles for good practice outlined below may be relevant in other contexts.

Scope of this guidance: audience and context

This guidance is aimed at members of staff (or equivalents such as casual workers, interns, volunteers or visiting staff) considering the use of GenAI for administrative tasks in support of University activities.

This guidance applies to routine tasks such as the production of content for an email to a student or colleague. It also applies to more complex tasks such as the production of content for formal policies or strategic reports.

Ultimately, staff are responsible for ensuring any use of GenAI is reasonable, lawful, and in accordance with relevant University policies and procedures.

Scope of this guidance: where this guidance does not apply

This guidance does not apply to use of GenAI in other contexts. In particular, this guidance does not apply to:

  • Education

    • This guidance does not apply to any task or content creation undertaken in the direct provision of teaching, student learning and/or academic assessment.

    • It does not apply to the appropriate use of GenAI by students to support their learning.

    • It does not apply to inappropriate use of GenAI by students in regard to plagiarism or any other perceived form of academic misconduct.

    • It does not apply to programmes of study, courses or lectures where GenAI is the subject matter.

  • Research

This guidance does not apply to GenAI as a topic of research – it does not apply to research regarding any dimension of GenAI as a technology, its application, or the impact of its application in any given scenario.

    • It does not apply to any research undertaken with the assistance of any form of GenAI. The ethics and appropriateness of its use in those circumstances will be assessed by other established means (e.g. research ethics processes).

  • Colleges and other non-University legal entities

    • This guidance does not apply to use of GenAI tools by College employees for any College business, or to the Students’ Union, student societies, or Cambridge University Press & Assessment.

Please see the ‘Further reading on best practice with use of GenAI’ section for other University resources on use of GenAI that may be relevant to the matters identified above.

 

Structure and overview of this guidance

Given the diverse range of work the University undertakes, it is not possible for the University to provide comprehensive instruction on circumstances in which it would or would not be appropriate to use GenAI tools.

It is up to departments, faculties and individual members of staff to balance the benefits and risks of GenAI, and to make an informed decision about whether its use is appropriate or desirable before that use begins. In the limited circumstances where a formal risk assessment is required, the University Data Protection Impact Assessment (DPIA) and Information Security Risk Assessment (ISRA) processes remain the relevant risk assessment processes to follow.

To help guide staff in their decision making we have provided the following:

  • a brief outline of some of the benefits of the use of GenAI

  • guidance on the risks associated with use of GenAI tools

  • practical examples of good and bad practice

  • useful facts about Copilot

Regardless of the work being undertaken, it is recommended that staff avoid inputting confidential, sensitive or personal information into GenAI tools unless warranted, and then only in accordance with this guidance. Please see the UIS guidance on different data types: https://help.uis.cam.ac.uk/service/security/data-sec-classes.

Where the use of GenAI will not involve confidential, sensitive or personal information, the risk to the University is reduced. However, it is still important to consider whether its use is the best available means of completing a task. Any use of AI must be appropriately transparent and accountable.

If it is necessary and beneficial to use GenAI tools, it is recommended that, wherever practicable, the University’s licensed AI tools, Microsoft Copilot and Copilot for Microsoft 365, are used, regardless of the type of data that may be input. Use of other licensed GenAI tools is not prohibited, but such tools must be procured in accordance with any applicable procurement policy or process, including but not limited to the completion of any requisite risk assessments such as DPIAs and/or ISRAs. Any risk associated with their deployment rests with the relevant department. Any use of such GenAI tools should also be in accordance with this guidance.

Copilot is already available to staff. Microsoft Copilot is the basic licensed version, available via a staff login; Copilot for Microsoft 365 is the more comprehensive version, available via an additional purchased licence. The public free version, available via a web browser, must not be used for University activities and purposes as it does not guarantee a sufficient level of information security or confidentiality. More information on Microsoft Copilot and Copilot for Microsoft 365 is available on the UIS website: https://help.uis.cam.ac.uk/service/collaboration/365/copilot.

Please remember the Acceptable Use Policy (AUP) continues to apply, including its compliance monitoring and enforcement provisions, when using GenAI tools and staff are reminded of their obligation to abide by its terms.

In all circumstances, staff are expected to exercise caution and use their judgement in their use of GenAI.

 

Benefits

Many potential benefits of using GenAI tools are already apparent, and the University encourages all its staff to make the best use of the information services made available to them.

Below is a brief list of potential benefits of GenAI in supporting University activities.

Efficiency and productivity

GenAI tools can quickly process large volumes of information and produce substantial output, saving time that can be spent on other tasks and boosting efficiency and productivity.

Flexibility

GenAI can create content in various formats, styles and languages, making it useful in a wide range of circumstances. For example, it could be used to generate drafts of meeting minutes, letters, press releases, social media posts, and other forms of documentation produced in the course of University business.

That flexibility also allows the University to engage with different audiences across various platforms. In some circumstances, it may be appropriate and beneficial to use it to produce personalised content.

Creativity

GenAI can produce novel or unusual content from existing source material, making it a useful tool for the initial part of some creative tasks. For example, it could be used to draft materials that could form part of a marketing campaign.

Understanding

GenAI need not be used only for content production and creation. It can also be used to discover and summarise vast amounts of information in more digestible formats, making it a valuable tool in the early stages of research undertaken to support administrative tasks.

 

Risks

There are, however, significant legal and reputational risks associated with use of GenAI tools, which can be broadly categorised as follows:

  • Bias, misinformation and inaccuracy

  • Data protection law non-compliance

  • Intellectual property and copyright law non-compliance

  • AI legislation non-compliance

  • Reputational damage and other risks

This list is not exhaustive, given the overlapping nature of risk categories and the pace of AI development.

Staff must consider relevant risks, and whether they are applicable to the circumstances in which they are considering use of GenAI, before any use of GenAI is initiated.

Risk: Bias, misinformation and inaccuracy

Much GenAI is trained to produce outputs based on information and patterns extracted from data in the public domain. Consequently, outputs will sometimes reflect the biases and prejudices of individuals and groups in society.

GenAI can also be manipulated to deliberately produce biased or inaccurate outputs, presented as fact. Unlike human intelligence, AI does not have the capacity to apply judgement or understand context. It cannot apply moral or critical perspectives.

Additionally, there is an inherent risk that the ongoing use of GenAI tools will create and sustain feedback loops, whereby GenAI tools will be trained on text outputs generated by other GenAI tools, further reinforcing existing biases and inaccuracies.

In practical terms, that means some outputs will be inaccurate or unbalanced regardless of the phrasing of the text prompt, because the source material is inaccurate or unbalanced. Some output may also be inaccurate because the GenAI tool is ‘hallucinating’ – creating nonsensical or inaccurate outputs that are not directly attributable to the source material.

For those reasons, it is important that any GenAI output is thoroughly evaluated by a human being before it is utilised for any purpose. This is especially important where there is the risk of an adverse impact on a group or individual, or the potential for infringement of a person’s rights (e.g. as defined under the Equality Act 2010, the Human Rights Act 1998 or any other relevant legislation).

Risk mitigation(s): Ensure that all GenAI outputs are thoroughly evaluated by a human being before they are used. Ensure use of GenAI is acknowledged if it is used to make a significant and unrevised contribution to a substantive or impactful piece of work such as the production of content for formal policies or strategic reports.

Risk: data protection law non-compliance

GenAI tools and their use are subject to existing laws and regulations and any University policy relevant to their use. This includes data protection laws such as the UK GDPR, the Data Protection Act 2018 and the University Data Protection Policy.

As with the use of other information tools and services, it is important to use licensed versions of tools paid for from a University budget and procured through any applicable procurement process. (Free, unlicensed information tools and services often rely upon intrusive data processing and/or selling as part of their business model.)

The University’s standard licensed AI tools are Microsoft Copilot and Copilot for Microsoft 365, and these are the tools that should be used to process personal data, where necessary, for which the University is responsible. This ensures that technical, operational and legal safeguards are in place to protect any personal data that the University controls or otherwise processes.

By contrast, inputting data into a free or unlicensed GenAI tool could be considered equivalent to putting it into the public domain – signifying a potential personal data breach.

Information input into GenAI tools is also often used to train those tools, which may not be a lawful use of personal data – especially if that data cannot be retrieved or deleted, for example from an AI neural network. Data input into the University’s licensed version of Microsoft Copilot and Copilot for Microsoft 365 is not used to train those tools.

It is particularly important to be cautious about inputting sensitive personal information (special category data) such as medical information into GenAI tools. Processing of such information is subject to additional safeguards, and misuse could represent a serious breach of data protection law and/or a breach of confidentiality.

Regardless of the type of information you wish to input into a GenAI tool, you must consider the task you are using it to support. Personal data must be collected, held and used only for a specified purpose – for example, as part of a governance process – and any use of AI must be compatible with that purpose. With some exceptions (e.g. for academic research), it is not appropriate to take personal data that was collected for one purpose and use a GenAI tool to do something else with it.

It is also important to be mindful of ‘automated decision making’. Automated decision making involves:

  • making a decision that will affect a person solely by automated means without any human involvement; or

  • profiling someone – automated processing of personal data to evaluate certain things about an individual that could be used as part of a decision-making process.

A common example of automated decision making is a credit check conducted exclusively using AI to decide whether to provide someone with a personal loan.

While the use of GenAI to produce text or digital imagery would not necessarily result in a decision being made about an individual, it is not inconceivable that such technology could be used as part of an evaluation process – for example, to review an application for a parking permit to check that the applicant meets the eligibility criteria. In such circumstances, it is imperative that a human being is ultimately involved in the decision-making process.

If, after careful consideration, you decide to use a GenAI tool, you must tell the people affected. That does not mean you must seek consent on a case-by-case basis, but it does mean that relevant privacy notices should be provided or updated.

Risk mitigation(s): Be cautious about inputting personal data into GenAI tools, especially sensitive personal data, and only do so where warranted. Do not use free GenAI tools. Use the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365. Deploy relevant privacy notices.

Risk: intellectual property and copyright non-compliance

Some affected parties have argued that the entire basis for training GenAI tools on publicly available data violates intellectual property rights and copyright law. A series of ongoing and well-publicised copyright cases has seen high-profile individuals and organisations sue companies that own and operate GenAI tools.

Bearing that in mind, it is important to consider the nature of the content being input into any GenAI tool and to consider whether the University has the right to input that information into that GenAI tool.

It is also important that GenAI outputs are thoroughly checked by a human being and cited where appropriate.

Risk mitigation(s): Consider whether it is appropriate to input material into GenAI tools. Cite outputs where appropriate. Use the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365.

Risk: AI legislation non-compliance

In March 2024, the European Union (EU) adopted the EU AI Act, which requires organisations using AI in the EU to take a risk-based approach to its use, especially with regard to the use of personal information. Its provisions will take effect incrementally from February 2025, and non-compliance could result in penalties from European regulators. It will affect the University where the University processes EU citizens’ data using AI.

Risk mitigation(s): Exercise caution and judgement regarding deployment of GenAI on EU citizens’ data. Document any use of GenAI tools in these circumstances.

Risk: Reputational damage and other risks

As outlined above, there are various legal risks posed by the use of GenAI, all of which could result in reputational damage to the University.

There is also an environmental cost associated with the use of GenAI: training GenAI tools requires vast amounts of power and indirectly generates enormous amounts of carbon, meaning there is a sustainability consideration to its use.

In the development of some GenAI tools, outputs are checked and refined in a process known as Reinforcement Learning from Human Feedback (RLHF). That process includes human reviewers, often working in countries in the global south for very low wages, routinely viewing objectionable and distasteful materials, and has been reported to have a profoundly negative impact on the health and well-being of many of those workers.

University staff will also be expected to provide human oversight of GenAI outputs. Whilst they would not ordinarily be exposed to objectionable or distasteful materials in the course of administrative work, it is important to consider the risk to University staff wellbeing posed by increased workloads and unrealistic expectations of what can reasonably be achieved using GenAI tools.

Risk mitigation(s): Consider the wider impact on the University’s reputation before using GenAI. Consider any other relevant risks.

 

Best practice: practical examples

The following practical examples are intended to show both best practice and scenarios in which the use of AI should be considered carefully. It remains important to bear in mind the general risks and benefits associated with use of GenAI in these different contexts.

Example 1: Using GenAI for background research

Capability: GenAI can be used as a research tool to help gather background information on a topic relating to something you are unfamiliar with.

Example scenario: Using GenAI to provide an overview of a new piece of legislation that may affect your work at the University.

Good practice: Using the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365. Not inputting any personal or confidential information into Microsoft Copilot or Copilot for Microsoft 365, because it is not required to understand the new legislation.

Bad practice: Using an unlicensed GenAI tool. Inputting confidential data into that unlicensed AI tool with the aim of further understanding the legislation, thus committing a personal data breach and/or breach of confidentiality.

Example 2: Using GenAI to summarise information

Capability: GenAI can be used to summarise information.

Example scenario: Using GenAI to prepare a briefing on a University project for a meeting with a commercial partner.

Good practice: Using the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365. Treating the output as a draft and amending it as appropriate. Thoroughly checking the output to ensure it is factually accurate and unbiased, does not reveal anything confidential to the commercial partner and is reflective of the source material. Acknowledging GenAI has been used to help create the briefing if it has not been materially altered as part of the fact-checking or drafting process.

Bad practice: Using an unlicensed GenAI tool. Treating the output as a final draft. Accepting the output as factually accurate and suitable for sharing with a third party without checking it before presenting it to the commercial partner. Failing to mention it has been created using a GenAI tool.

Example 3: Using GenAI to draft a document

Capability: GenAI has the ability to produce written outputs in various styles and formats.

Example scenario: Using GenAI to create meeting minutes.

Good practice: Considering whether use of GenAI is appropriate in these circumstances – this will depend on what the meeting is about. Considering whether the document is likely to contain sensitive information. If the document is likely to contain sensitive information, considering the type of sensitive information, and whether use of GenAI is warranted. If it is appropriate to use a GenAI tool, using the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365. Using the output as a draft and amending it as appropriate.

Bad practice: Using an unlicensed GenAI tool. Inputting confidential or sensitive data into that unlicensed GenAI tool thus committing a personal data breach and/or breach of confidentiality.

Example 4: Using GenAI for numerical analysis

Capability: GenAI can perform numerical analysis.

Example scenario: Using GenAI to analyse the statistical results of a staff survey.

Good practice: Considering whether the raw survey data contains personal or sensitive information. If it is appropriate to deploy a GenAI tool, using the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365. Telling staff that AI will be used to analyse responses when they are choosing whether they wish to participate. Treating the output as a draft and verifying and amending it as appropriate.

Bad practice: Using an unlicensed GenAI tool. Not telling staff that GenAI will be used to analyse their responses. Accepting the output as factually accurate and making an unquestioning decision based on the analysis.

 

Copilot: Useful Facts

This section contains a few useful facts that may assist staff in using the University’s licensed version of Microsoft Copilot or Copilot for Microsoft 365.

The version of Copilot accessible to most University MS365 users is called “Microsoft Copilot”. A more comprehensive version of Copilot is available called “Copilot for Microsoft 365”, and a licence for its use can be purchased by contacting UIS.

This table compares the functionality available in Microsoft Copilot against the functionality available in Copilot for Microsoft 365. (Please be aware that GenAI tools are being constantly updated, so the information tabulated below may be out of date).

| Functionality questions | Microsoft Copilot | Copilot for Microsoft 365 |
| --- | --- | --- |
| Is it available to University MS365 users? | Yes, through their standard login. | No, an additional licence must be purchased from UIS. |
| Does it integrate with Microsoft apps? | No. | Yes. |
| Can it draft emails? | Yes, but text output must be copied into Outlook. | Yes, it will draft emails in Outlook. |
| Can it draft documents? | Yes, but text must be copied into Word. | Yes, it will draft documents in Word. |
| Can it be used with spreadsheets? | Yes, spreadsheets can be attached to chats. | Yes, but it will sometimes fail to produce outputs from Excel files with complex formatting. |
| Can it be used to create PowerPoint presentations? | No, but it can be used to create the content for a presentation via the chat function. | Yes. |
| Can it summarise documents? | Yes, documents can be attached to chats. | Yes, documents can be attached to chats and it will summarise Microsoft documents in those apps. |
| Will it summarise PDFs and other documents in standard formats? | Yes, via a chat. | Yes, via a chat. |
| Is there a word limit for documents attached to a chat? | There is no known limit. | There is no known limit. |
| What is the character limit for chat instructions? | 8,000 | 16,000 |
| Is there a limit on the number of questions I can ask? | There is a conversation limit of 30 questions, but you can start as many new chats as necessary. | There is a conversation limit of 30 questions, but you can start as many new chats as necessary. |
| What is the word limit on chat text outputs? | There is no known limit. | There is no known limit. |

Here are some potentially useful tips for using Copilot:

  • When accessing Microsoft Copilot, you can check whether you are logged into your University MS365 account by viewing the top right-hand corner of your window – you will be able to see your initials (and your name if you click on the initials) if you are logged in.

  • When attaching documents, these will be automatically saved in a folder called ‘Microsoft Copilot Chat Files’ in your University MS365 account OneDrive. 

  • Delete chat files that are no longer required.

  • Delete files uploaded into OneDrive as part of the chat process when no longer required.

  • There is more information regarding OneDrive on the UIS website: https://help.uis.cam.ac.uk/service/collaboration/365/onedrive.

 

Further reading on best practice with use of GenAI

University resources:

External resources:

Authorship

This guidance was composed without the use of AI.