What you need to know #
AI tools can be a valuable support in your learning journey — helping you to brainstorm ideas, summarise academic texts, generate revision questions, and consolidate your understanding of complex topics. These tools range from everyday AI like grammar checkers and search engines, to more advanced generative AI (GenAI) tools that can create entirely new content, such as essays, code, or images, based on your prompts.
While these technologies can enhance your productivity and creativity, they also come with important limitations and responsibilities. GenAI tools, in particular, can produce inaccurate or misleading information (known as “hallucinations”), including fabricated citations, data, or visuals. There are also concerns around data privacy, copyright, and academic integrity.
As such, it’s important that you understand how to use these tools responsibly and in line with university guidance.
Which GenAI tools can I use? #
University-supported AI tools
When using university data (such as lecture content, learning resources, research data, special category or personal data), only use Microsoft Copilot 365, logged in with your official university IT account. You can verify this by the visible green shield, which confirms that your data will be kept safe. This tool is designed to comply with our data protection policies and offers better security measures.
When using other AI tools (not licensed or supported by the University)
If you want to use other AI tools that are not centrally supported by the University, please ensure that you follow these rules:
Do not sign up using your university account or email.
Do not input any university data (such as lecture content, learning resources, research data, special category or personal data), and do not upload content from academic books or journals, as this breaches copyright.
Do not make audio or video recordings of lectures or meetings using unsupported AI tools.
AI notetakers/meeting assistants (such as Read.ai) are prohibited.
University Guidance #
Using GenAI: Key Rules
- Fact-check all AI-generated content:
  - Verify the accuracy of any AI-produced text, references, or visuals by consulting credible and authoritative sources.
- Do not copy and paste content from an AI tool into your work:
  - This is plagiarism, unless you use quotation marks and reference the source. Submitting AI-generated content as your own gives an unfair advantage and violates academic integrity.
- Assess potential bias:
  - AI tools generate content based on patterns in the data they were trained on — and that data often reflects real-world inequalities, stereotypes, or gaps in representation. This means AI-generated content can sometimes reinforce biased views, especially around topics like gender, ethnicity, culture, disability, or other marginalised identities. For example, an AI tool might consistently associate certain professions with one gender, or underrepresent perspectives from non-Western cultures.
  - As a student, it’s important to critically evaluate AI outputs and consider whose voices or viewpoints might be missing or misrepresented. Always cross-check information with reliable academic sources and apply your own judgement when using AI-generated content in your studies.
- Check intellectual property compliance:
  - Ensure that AI-generated content respects intellectual property rights. For example, don’t copy and paste AI-generated text that closely mimics a published article without proper citation, or use AI-generated images that replicate a known artist’s style without permission. Avoid uncredited use of open-source data (e.g. using datasets from GitHub or Kaggle in your assignments without acknowledging the original source), and always cite any AI tools and prompts you use according to your department’s preferred referencing style. See the Citing AI use within assignments section below.
- Evaluate AI-generated visuals and videos (e.g. DALL·E, MidJourney, Invideo):
  - Ensure visuals are relevant, accurate, and ethically appropriate, avoiding stereotypes and copyright infringements. For more advice on copyright related to images, see the Referencing AI generated images section below.
University's principles for ethical AI use
At the University of Chichester, we support the responsible, transparent, and fair use of AI technologies in teaching and learning, guided by the following ethical principles:
1. Transparency
Users should openly acknowledge and cite any AI tools that have contributed to their work. Transparency ensures academic honesty and allows others to understand how AI has influenced the process or outcome.
2. Accountability
Students and staff remain responsible for the content they submit, even when AI tools assist in its creation. This includes verifying accuracy, originality, and appropriateness of AI-generated and AI-supported material.
3. Fairness and equity
AI use must not disadvantage any individual or group. Awareness of AI biases—such as those related to gender, ethnicity, disability, or socioeconomic status—is essential. The University is committed to promoting equitable access to AI resources and mitigating digital divides.
4. Privacy and data protection
Respect for personal and institutional data privacy is paramount. Sensitive information must not be shared with AI platforms unless they are officially supported and comply with data protection regulations.
5. Academic integrity
AI should support learning and creativity without undermining the development of original thought and critical skills. Misuse of AI to gain unfair advantage is considered academic misconduct.
6. Sustainability
Given the environmental impact of training and running AI models, users should consider sustainability in their AI practices, striving to balance innovation with responsible resource use.
7. Continuous reflection and adaptation
Ethical AI use is an evolving area. The University encourages ongoing dialogue, education, and review to adapt policies and practices as AI technologies develop.
By embedding these ethical principles into all AI-related activities, the University of Chichester fosters a culture of integrity, respect, and innovation that benefits our entire academic community.
Protecting your data and the University's data
Using AI and Cloud Tools Responsibly: Guidance for Students
AI tools — especially those not provided or approved by the University — often work by sending your data through multiple systems across the world. These systems vary in how securely they handle your information. Some may deliberately collect and reuse your data, while others may be poorly designed and vulnerable to misuse.
What You Need to Know:
Be cautious with unsupported tools. If you use AI or cloud-based apps (like ChatGPT, Zoom, Discord, etc.) that aren’t provided by the University, assume that any data you enter could be accessed by others.
You have a legal responsibility to protect personal data. This includes any information about someone’s identity or characteristics — such as gender, age, ethnicity, or other protected attributes. Even if someone gives permission, you are still legally required to keep their data safe.
Your own work is valuable. Your research, writing, and ideas are your intellectual property — and they’re also valuable to the University. If you share them with AI tools or online platforms, there’s a risk they could be reused or published by someone else, even before you submit or publish your own work.
Academic integrity matters. If your work is copied or reused without your knowledge, it could lead to accusations of plagiarism — even if you were the original author. Protect your work by keeping it secure and using trusted tools.
Follow University guidance. Always use University-approved platforms for your studies and research. If you’re unsure whether a tool is safe or supported, ask your tutor or check the University’s IT and data protection policies.
AI and cloud-based tools can be helpful, but they also carry risks around privacy, copyright, and academic integrity. Use them wisely, and always protect your data — and the data of others — in line with University policies.
Use of Generative AI in Assessment #
For assessed work (e.g. essays, assignments), your department/tutor will state the extent to which GenAI tools can be used.
Check with your department to see which level of AI use is permitted for your assignment.
Unless your tutor states otherwise, the default is level 2, “AI for Planning and Structuring”.
If you are permitted to use AI, then you will need to be transparent about its use (see ‘Evidencing the use of generative AI tools’ below).
The University of Chichester AI in Assessment Scale
| No. | Level | Student Guidance |
|---|---|---|
| 1 | No AI – Independent Knowledge Demonstration | You must complete the task without any AI assistance. Focus on demonstrating your own understanding and skills. Prepare using traditional methods (notes, textbooks, discussions). |
| 2 | AI for Planning and Structuring | You may use AI to generate ideas, outlines, or research questions. Your final submission must show how you refined and built on these ideas. Keep records of your AI interactions and cite them if required. |
| 3 | AI for Editing and Refining | You may use AI to help edit or refine your work. You must critically assess and improve any AI-generated content. Clearly indicate where AI was used and how you modified it. |
| 4 | AI for Task Completion with Human Evaluation | You may use AI to complete parts of the task. You must demonstrate your understanding by evaluating or extending the AI’s work. Include commentary or annotations explaining your decisions. |
| 5 | AI Exploration and Co-Design | You are encouraged to use AI creatively to solve problems or explore new ideas. You may co-design the task with your instructor. Document your process and reflect on how AI shaped your work. |
Evidencing the use of generative AI tools #
Student declaration of AI use
Standard declaration
As a minimum requirement, you must include a statement in your assignment describing your use of AI.
If you have not used AI, please include the following statement:
“No content generated by AI technologies has been presented as my own work and I have not used AI at any stage of the assignment writing process.”
To acknowledge AI use, please include the following information:
The name and version of the GenAI system used (such as Microsoft Copilot (version GPT-4) or ChatGPT-3.5).
Specify the publisher of the GenAI system (for example, Microsoft or OpenAI).
Provide the URL of the GenAI system.
Include a brief description (single sentence) outlining the context in which the tool was utilised.
For example:
“I acknowledge the use of Microsoft Copilot (version GPT-5, Microsoft, https://copilot.microsoft.com/) to summarise my initial notes and to proofread my final draft.”
“I acknowledge the use of Microsoft Copilot (version GPT-5, Microsoft, https://copilot.microsoft.com/) for initial research and to suggest a structure for my essay.”
Enhanced declaration
Your department or tutor may require you to provide additional information about your AI usage.
You may be asked to provide further details in an ‘Appendix’ section, such as:
The prompt(s) used to generate a response in the GenAI system, if applicable.
The date when the output was generated.
The output obtained (for example, a ‘link to chat’ if ChatGPT was used, or a compilation of all generated outputs in an appendix).
An explanation of how the output was modified before being incorporated into the work (such as with a tracked-changes document or a descriptive paragraph).
As such, keep a record of how you have used AI tools. Retain this information until your work has been assessed.
Citing AI use within assignments
Some referencing guides offer suggestions for how GenAI systems should be cited within an essay and in the reference list. However, there are issues with citing GenAI systems:
A GenAI tool cannot be classed as an author: it cannot take responsibility for its work, and it does not generate original ideas but reproduces ideas found elsewhere.
A reference list is designed to enable the reader to refer to the original source, which is not always possible with AI generated content.
There may be cases where it is appropriate to cite AI generated content within the body of your essay, for example, where the assignment focus is on the topic of GenAI and its output. If in doubt, please check with your department.
If you do need to reference GenAI in your work, then use your department’s preferred reference style. Referencing guides can be found on the Study Skills help pages.
Here is an example of how to reference GenAI in the Harvard style:
In-text citation:
When prompted to define ‘authenticity’, ChatGPT (OpenAI, 2022) responded…
Reference list:
OpenAI (2022) ChatGPT: Optimizing Language Models for Dialogue. Available at: https://openai.com/blog/chatgpt/ (Accessed: 30 June 2023).
Referencing AI generated images
Copyright-safe image sources: https://help.chi.ac.uk/copyright-safe-image-sources
Example of how to reference AI generated images (the examples use Harvard style; please consult your department’s referencing guide for advice on other referencing styles):
In-text citation
Hotpot AI (2024) was used to create a beach scene.
Reference list
Hotpot AI (2024) AI generated image by Hotpot AI art generator with prompt ‘Draw a beach with white sand, turquoise sea and blue sky’, 23 May. Available at: URL (Accessed: 31 May 2024).