AI (Artificial Intelligence) Guidelines
About the guidelines
These guidelines have been developed in support of a strategic, collaborative approach to the use of AI tools and systems in the Directorate of Communications, Marketing and Student Recruitment (DCMSR). The guidelines aim to ensure that all our work continues to be trusted by our audiences and that University content retains a distinct tone of voice.
The aim is also to provide examples for colleagues of when to use AI and when not to. These principles can be adopted in local areas to support work related to communications, marketing and student recruitment. Practical examples appear throughout this document to provide a steer on how the guidance can be applied in your work.
These guidelines have been produced by the DCMSR AI working group, which formed in September 2023. We will revisit and update this guidance regularly as the use of AI evolves rapidly.
- For more information on these guidelines, please contact socialmedia@manchester.ac.uk.
AI at The University of Manchester
AI describes computer systems which can perform tasks usually requiring human intelligence (source: NCSC.GOV.UK). Our University is undertaking a wide-ranging review of AI, including its use in teaching and learning, research, and operations. University guidelines have been published for teaching and learning, and we have many research strengths in this field across all Faculties. These guidelines form part of this activity.
Key principles
When used appropriately, AI tools have the potential to enhance our work and can support inclusivity and accessibility. Output from AI systems must be treated by staff and student contributors (including our content creators) in the same manner as work created by another person or persons: used critically, within the permitted licence, and cited and acknowledged appropriately.
Our key principles:
- AI tools should enhance our work and be used as a supportive tool to help colleagues save time and work more effectively.
- While we may use AI tools in our day-to-day work, there will always be human input to ensure these tools are used responsibly. A human will always approve the final version of anything where AI is used.
- AI tools should support – not replace – our people. We will support colleagues and provide training and development opportunities to help them use AI effectively.
- The pace of AI development means that we will regularly review our principles and approach.
- AI should not be used to create media content related to sensitive situations or topics.
- It is recommended that colleagues use the companion tool Microsoft Copilot when working on AI-related projects, if it meets their needs. Use of this tool protects both personal and University data and is GDPR (General Data Protection Regulation) compliant.
Generative AI
Generative AI tools like Copilot and ChatGPT are capable of processing vast amounts of information to quickly produce an easy-to-understand summary of a complex topic. This can help colleagues work faster and understand new ideas.
- We may use text generator tools to help us research a topic or to spark inspiration.
- We will not publish any content that has been written 100% by text generators.
The default written style and tone of content produced by generative AI may not be appropriate for our target audiences in its unedited form.
The risk of plagiarism is high when lifting content wholesale from an AI content generator, where original sources are often not cited. It is essential that our outputs are original and do not plagiarise.
AI generated multimedia content: general
AI generated visual media, such as graphic design, illustrations, photo-realistic imagery and video, offer a highly efficient and accessible route to content creation.
The breadth of the visual medium poses a challenge in defining what communications and marketing teams should and should not use these tools for. Finding a balance is essential if we are to make the most of these new tools, while avoiding any harm to the authenticity and identity of the University brand.
AI generated visuals: content creation
In the same way we fact-check our written content, we must be careful to represent the University accurately and authentically through our visual assets.
To ensure we accurately represent the University, we (DCMSR) will not use AI visual content (imagery and video) to create imagery that represents staff, students, our campus or facilities in our communications or marketing material.
In most cases, AI tools should help support the creative process and its final output – not replace it. In practice, this looks like:
- Using AI to help storyboard a video, ideate visuals for a marketing campaign, or iterate on existing visuals to explore different options.
- Avoiding the use of AI to generate the final visual output for marketing campaign content.
Exceptions
- Conceptual visuals: We may experiment with using AI generated multimedia content to help communicate ideas and topics that are difficult to represent using photography, in the same way we use stock imagery; for example, in a magazine feature, a blog post or a research case study. We will credit or acknowledge that this multimedia content has been generated using AI applications.
- Supporting visual imagery: We may use or create AI generated multimedia content to provide supporting visual imagery. This is especially useful for platforms that produce a high volume of content, such as news and events content, where AI could potentially help provide better consistency across a group of assets. Examples: objects such as study materials, laptops, shopping bags, calculators and sports equipment.
- Backgrounds: We may use AI tools to extend or modify backgrounds, such as extending a portrait-oriented image to a landscape format. Designers should exercise their discretion, in collaboration with stakeholders, to ensure that any alterations maintain a realistic interpretation of the environment and do not deviate into unrealistic representations.
Any assets developed using these tools must adhere to the University’s visual identity guidelines and meet the same level of quality and consistency expected from the Directorate. It is important that colleagues continue to work in close collaboration with their content and design teams.
As part of the sign-off process, senior marketing colleagues should be made aware of how AI tools have been used in the creation of any content, especially if the final image is generated using AI.
AI generated content from stock image libraries
Content creators should follow the guidelines above when accessing imagery from stock image libraries, ensuring imagery has not been created by AI image generators.
Advice to third party content creators (such as agencies, photographers, student content creators): All design work completed by suppliers must adhere to these guidelines, just as it must adhere to our University brand guidelines.
AI generated visuals: editing
In-built AI software in tools such as Adobe Photoshop, Premiere Pro and Canva can help colleagues to enhance and optimise photographs. We will ensure that any modifications made with AI are both practical and ethical, and maintain the individuality of the subjects in the photograph.
- We will use image editing tools ethically and not change the essence of any original image.
- We will not use AI image editing tools to change the core features of people or groups captured in our photography.
- We may use image editing tools to make minor edits only where those edits are consolidated from other images and capture the essence of the individual.
- We may use image editing tools to make corrections and other minor edits.
Practical example / In practice:
A content coordinator needs to resize an image for a University website or blog and uses an in-built tool within Photoshop to fill in the background.
AI generated audio: content creation
- We will not use voice clone generators.
- We may use voice generators to add narration to video content, being careful to select neutral accents.
AI generated audio: editing
- We may use audio and video tools to remove distractions for audiences.
Practical example / In practice:
A digital content editor may use audio clean-up tools to remove background noise, or video clean-up tools/apps to remove distractions from background shots. This will save time.
Privacy and ethics
- We will not input any sensitive, restricted, private, or embargoed information into AI generators.
Text generators allow users to paste in articles or data and build a prompt around them, meaning there are privacy and intellectual property risks associated with the information we enter. As mentioned above, use of Microsoft Copilot protects both personal and University data and is GDPR (General Data Protection Regulation) compliant.
There is no guarantee that information inputted into external AI tools (such as ChatGPT or Claude) will remain confidential. Sensitive or restricted information should not be shared, in the same way that we would not share it on other external channels or discuss it in public.
Colleagues will be expected to use their professional judgement no matter how advanced AI technology may become.
Practical example / In practice:
A media relations officer may receive a research paper from an academic that is embargoed until its release in an academic journal. The research paper should not be uploaded to a text generator like ChatGPT before the embargo lifts.
Even after an embargo has lifted, we would still have to fact-check and rewrite anything that is AI generated to ensure it is 100% accurate and meets the needs of our intended audiences.
While text generators can be used as a tool in drafting written press materials, a clear sign-off process remains in place with the lead academic and any relevant partners required to sign off the final copy.
- We will not assume that all outputs are accurate.
Many AI tools generate results that seem realistic but are in fact false “facts” invented by the system. Always double-check information for accuracy, bias, errors or ‘AI hallucinations’. The University should only publish unbiased and factual written content.
- We will ensure that outputs are consistent with our vision and values.
Colleagues should review AI outputs to ensure accuracy and alignment with the University’s vision and values. Outputs should also be aligned with our brand narrative, visual identity, and written word guidelines.
- We will ensure any AI generated outputs are accessible to all audiences and follow best practice guidance related to content accessibility.
- We may use AI to analyse public data collected by media or social media monitoring tools.
Training and development
Our teams are made up of skilled creatives and strategists who want to use AI tools to support – not replace – their creativity.
During the annual Performance and Development Review process, DCMSR colleagues are encouraged to reflect on how generative AI tools could help them save time and effort.
We will soon publish a list of recommended tools, covering both those supported by the University's security protections and those that are open source. We will provide training and development opportunities to help DCMSR colleagues use AI tools effectively, and create opportunities for colleagues to learn from each other, including hands-on learning experiences that allow us to interact with AI technologies and see their practical applications.
As communications and marketing professionals, we should constantly review our understanding of ethics and use test cases to prepare for potential pitfalls when using AI, as well as to maximise the opportunities.
As noted above, we recommend that colleagues use the companion tool Microsoft Copilot when working on AI-related projects, if it meets their needs. Use of this tool protects both personal and University data and is GDPR compliant.
Training and using custom GPTs, such as those built on OpenAI's GPT models, carries risks. Please think carefully and test extensively.