Teaching and Learning Centre

Artificial Intelligence

Browse this page to explore the UFV AI Principles, the AI Guidelines developed by the TLC, and other AI resources for teaching and learning. As the field of generative AI is evolving rapidly, this page will continue to be updated with additional guidance and resources to support instructors in navigating the changing educational landscape.

What is AI? 

Artificial Intelligence (AI) aims to create machines, controlled by computers or software, that mimic human cognitive functions (intelligence) such as learning, problem-solving, logical reasoning, perception, or understanding natural language.

There are three types of AI:

  • Reactive AI always responds to a specific input with the same output, without learning from past experiences. An example of reactive AI is the autonomous robotic vacuum cleaner.
  • Predictive AI analyzes data to predict future events or behaviors, looking for patterns in data collected from users. Examples of predictive AI are the personalized recommendation systems in platforms such as Amazon, Netflix, or Spotify, where products are suggested based on each user's preferences.
  • Generative AI (GenAI) is a type of AI that can generate new content. It refers to algorithms or models that have been trained on large sets of existing data so that they can create text, images, code, and other forms of content. GenAI models learn to recognize patterns in the training data and build predictive models based on this learning. You can further refine the generated content by providing feedback to the AI tool or by editing your original prompt to meet your specific needs. Examples of GenAI tools are ChatGPT, Microsoft Copilot, and Midjourney.

UFV AI Principles 

Artificial intelligence is transforming the way we teach, learn, and work, and we must carefully consider both the benefits and challenges that come with it. Balancing human-AI interactions requires careful planning and consideration. UFV created the Artificial Intelligence Task Force (AITF) to gather input from all university stakeholders and develop institution-wide AI principles to ensure that the integration of AI into education is beneficial and aligned with UFV core values. These seven overarching principles were developed through a receptive, flexible, and proactive lens, keeping in mind the diverse needs of the various sectors of the UFV community. Academic, research, and administrative units can apply these principles to guide the degree and nature of AI use in their respective areas.

UFV AI Principles (PDF)

  1. Integrity and Innovation
  2. Flexibility, Adaptability, and Effectiveness
  3. Informed, Balanced, and Appropriate Use
  4. Data, Content, and Governance
  5. Ethics, Digital Literacy, Regulation
  6. Inclusion and Accessibility
  7. Positive Mindset, Forward Leaning Approaches 

TLC AI Guidelines

TLC has focused on creating guidelines that support instructors in applying each of the UFV AI Principles in their pedagogy. The guidelines urge instructors to be open, innovative, and flexible in their use of technology. They also encourage instructors to critically analyze the ethical implications of using generative AI tools and take steps to mitigate potential harms.

TLC AI Guidelines (WORD)

TLC AI Guidelines (PDF)

A large number of detection tools have emerged in the market to address educators' concerns around academic integrity that have resurfaced with the prevalence of generative AI tools. These tools currently fail to reliably distinguish original content from AI-generated content. They frequently produce false positives, incorrectly detecting AI-generated content in a student's original work (Dalalah & Dalalah, 2023; Elkhatat, Elsaid, & Almeer, 2023).

Moreover, when content is flagged as generated, detection tools do not produce a similarity report showing the sources from which that content may have come (Dalalah & Dalalah, 2023). Of great concern is also the potential for bias in detection tools due to the language models used to develop them. Evidence shows that detection tools are more likely to wrongly label non-native English writing as generated (Liang et al., 2023). The ability of detection tools to mitigate such bias has not yet been substantiated. Due to the fallibility of AI detection tools, UFV has not approved the use of any detection tools as of yet. This stance is consistent with that of several other post-secondary educational institutions across British Columbia.

Since detection tools are likely to be more detrimental to teaching and learning than beneficial, the university has decided to err on the side of caution until the effectiveness and accuracy of these tools has been widely established.   

  • A Guide to Generative AI for Educators: Ryan Mann at SAIT has compiled this guide to support instructors in using AI constructively to enhance teaching and learning. The guide shares ways in which AI can serve as a teaching aid and a learning tool, while also cautioning instructors about challenges around data privacy, transparency, and appropriate use of AI. It is a useful resource for faculty and instructors to use in conjunction with the AI principles and guidelines referenced above.
  • The Curious Educator's Guide to AI: This open educational resource authored by Kyle Mackie and Erin Aspenlieder aims to inform educators of the academic possibilities that generative AI offers, while also discussing  the challenges involved in its application in education. It not only shares practical ways to use generative AI for enhancing teaching and learning, but also communicates the importance of reflection and evaluation in an effort to promote ethical and responsible use.
  • The Artificial Intelligence Assessment Scale (AIAS): Perkins et al. (2024) have designed this simple yet comprehensive tool for educators to use in determining the extent to which generative AI may be used in assessments within their courses. The tool is based on the foundational assertion that the use of generative AI should be permissible to the extent that it does not contradict the learning objectives of the course, and provides a framework for educators to apply to their pedagogical practice.
  • Five Principles for the Effective Ethical Use of Generative AI: This website outlines five student-centered principles that foster effective and ethical use of generative AI in education. It also presents practical strategies that educators can adopt to implement each of these principles.
  • Introduction to Generative AI for Educators, by McMaster University: This online module aims to provide educators with an understanding of generative AI and to help them think through how these technologies intersect with their teaching practices. Whether you have reservations or enthusiasm about AI in education, "Introduction to Generative AI for Educators" offers a space for exploration and thoughtful consideration. Topics include what GenAI is, what GenAI tools can and cannot do, how to use GenAI tools, and how GenAI is changing the teaching and learning landscape.
  • The Artificial Intelligence Disclosure (AID) Framework: An Introduction: Kari D. Weaver (2024) explains why traditional citation practices are sometimes insufficient for documenting the many ways generative AI tools can be used. Weaver introduces the AID Framework as a structured, transparent method for disclosing how AI tools are used in education, academic work, and research, addressing growing calls for clarity and accountability in AI-supported tasks.

Teach with AI FAQ (for instructors)

Yes, instructors can use GenAI tools for support with development of teaching materials such as slide decks, assignment descriptions, rubrics, or other content. It is the instructor’s responsibility to ensure that any AI-generated content is accurate, up-to-date, and inclusive before using it in a course. Use of GenAI tools must align with UFV AI Principles.

Instructors should acknowledge their GenAI use in their syllabus and on AI-generated material, similar to how they would credit materials borrowed or adapted from a colleague. Consistent acknowledgement will model appropriate use and citation of GenAI tools for students.

Yes, instructors may use GenAI technology in their teaching face-to-face or online. Instructors can model the ethical and judicious use of a GenAI tool themselves or give students the option to use a GenAI tool to accomplish the learning outcomes of a course. In designing learning experiences that integrate AI use, instructors should consider ways to scaffold AI literacy skills such as:

  • understanding how AI works,
  • interacting effectively with AI technology through prompt engineering,
  • recognizing ethical implications such as bias, cultural appropriation, stereotypes, privacy and copyright issues, among other elements,
  • critically evaluating AI outputs and use, watching for AI hallucinations, dated language, and missing perspectives, voices, and knowledge.

Instructors could ask their students to use the protected version of Microsoft Copilot. However, asking students to access GenAI tools that have not been vetted by UFV for privacy and security is not recommended or supported. The University generally discourages the use of such digital tools for instruction until it is assured that the tool protects any personal data (e.g., the email address used to register on the system).

Instructors should provide alternative means to complete assignments for students who choose not to use GenAI technology. Instructors should indicate on their syllabus what AI tools may be used in the course.

It is up to individual instructors to decide if AI is allowed when completing assignments. 

Instructors are strongly encouraged to explicitly specify in their syllabus the extent to which the use of generative AI tools and technology is permitted in the course, which tools are permitted or prohibited, and conventions for referencing AI use. (See Examples of syllabi statements)

Students are responsible for submitting assignments that are their original work and represent their own learning.

Students should first carefully review their course syllabus to understand expectations around AI use and academic integrity before starting assignments. Students should not use AI to generate or complete assigned tasks unless explicitly permitted by the instructor within their course. Instructors should clearly communicate expectations around AI use and give clear parameters as to what the AI tool may be used for when completing a task. If the use of AI is not addressed in the course syllabus, or if it is unclear, students should discuss it with their instructors to clarify expectations around its potential use.

If students are permitted to use AI-generated information, they must cite the AI tool following the instructor’s guidelines.

Students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information. They must also adhere to copyright compliance guidelines (see copyright question below) before inserting any information when using an AI tool.

Course materials developed by instructors (e.g., syllabus, lecture notes, slides) are their intellectual property. Similarly, original work and assignments submitted by students are their intellectual property, and course texts (e.g., textbooks, readings) are likely copyrighted. Since many GenAI tools use prompts, copied text, and uploaded documents to train their models, students and instructors may violate copyright laws if they submit these materials to third-party GenAI tools without the prior consent of the content owner.

Students are also not allowed to record or capture class lectures using AI tools without obtaining explicit instructor permission.

Students and instructors should not input personal or confidential information, such as name, contact information, or other identifying information into GenAI tools (e.g., imagine you want to get advice on writing a better CV, remove all personal information before uploading).

Copyright ownership of outputs produced by GenAI is still an emerging area of law and there is no clear legislation about GenAI and copyright yet. Instructors considering using these tools should:

  • Understand that they may not own or hold copyright over the work generated with these AI tools.
  • Review the terms of service of each AI tool carefully, as they dictate the use and ownership of inputs and outputs. Be aware that these terms can change without notice.

The use of AI tools such as Microsoft Copilot, ChatGPT, or other generative AI tools does not automatically equate to academic misconduct at UFV. At this time, the use of AI tools in courses is determined by individual instructors, who clarify expectations in their syllabus and explicitly inform students at the beginning of the course.

If using AI tools has been prohibited by the instructor, then using these tools to complete assignments would be considered academic misconduct.

If using AI tools has been permitted or permitted in specific circumstances by the instructor, then the extent to which the use of GenAI tools is permitted, which tools are permitted or prohibited, and conventions for referencing AI use should be transparent and clearly conveyed. If students use AI tools outside of the parameters set by the instructor, then it could be considered academic misconduct.

If the use of AI tools has not been discussed or specified by the instructor, then students should not assume that all available technologies are permitted. If students are not sure about whether AI tools are allowed in a course, they should ask their instructor for clarification and guidance.

 

Policy 70 provides guidance as to what is considered academic misconduct at UFV and addresses the use of AI, referring to it as "unauthorized technologies".

It is important for instructors to promote academic integrity throughout their course term and provide opportunities for explicit conversations to reinforce students' commitment to uphold UFV's values and the importance of honesty and integrity, while encouraging responsible use of AI and clarifying expectations.

Refer to the Academic Integrity, Student Rights & Responsibilities office for more information on academic integrity, including the student academic misconduct Policy 70, regulations and procedures, appeals, and more.

Like most post-secondary institutions in British Columbia, UFV does not currently support AI-detection software on student work for two important reasons. First, inputting student work into a detector without their knowledge or consent is a violation of privacy law. Instructors are currently not authorized to share student work outside of the course without the students' permission, and sharing it with an AI detector, a third-party for-profit company, is no exception. Second, research has shown that these tools currently fail to reliably distinguish original content from AI-generated content. They are known to frequently produce false positives, incorrectly detecting AI-generated content in a student's original work, and often produce results that are biased against speakers of English as an additional language (Dalalah & Dalalah, 2023; Elkhatat, Elsaid, & Almeer, 2023; Liang et al., 2023). Until the effectiveness and accuracy of these tools has been widely established, the university has decided to err on the side of caution and does not support AI detection tools.

With these reasons in mind, AI detection tool results cannot be used as the sole factor for an allegation of academic misconduct. If an instructor suspects an unauthorized use of AI to complete an assignment, they should proceed as they would for any other potential allegation of academic misconduct. The procedure for investigating academic misconduct is outlined in Policy 70, Student Academic Misconduct Procedures.

Provide opportunities for open discussions:

Instructors could provide opportunities for open discussion with their students about GenAI tools, how to interact with them effectively, and their limitations. These discussions will help set clear guidelines for students on which tools may or may not be used to complete an assignment and why.

 

Promote academic integrity:

Instructors should provide opportunities for explicit conversations about academic integrity. They can co-create an honour pledge with students to reinforce their commitment to uphold UFV's values and the importance of honesty and integrity, while encouraging responsible use of AI and clarifying expectations (Eaton et al., 2022; Stoesz & Eaton, 2022).

 

Design AI-resistant assignments:

Instructors can design assignments and activities that support authentic evaluation of students’ knowledge and skills and that cannot easily be completed by AI. The following strategies are often effective in ensuring that students limit their use of AI in assignments:

  • Diversify formats of assignments: Have students create products such as videos, podcasts, or infographics that promote authentic demonstration of learning.
  • Add context: Include a contextual component in assignments that address course-specific experiences, such as classroom discussions, readings, course content, and other considerations.
  • Capture process: Include opportunities for students to reflect on the process of completing an assignment through personal meaning-making and individual learning.
  • In-class assignments: Create individual or group assignments that are completed in ‘real time’ in class that center problem-solving, generating questions and summaries from group input, and promote collaborative discussion-based content generation.
  • Indigenize assignments: AI often lacks the context and nuance necessary to represent First Nations' voices, stories, and history, and does not understand Halq'eméylem. Adding an Indigenous component to an assignment will make it more difficult to complete with AI.

If an instructor suspects an unauthorized use of AI to complete an assignment, they should proceed as they would for any other potential allegation of academic misconduct. The procedure for investigating academic misconduct is outlined in Policy 70, Student Academic Misconduct Procedures.

Please note that instructors should not rely on AI detection tool results as the sole basis for an academic misconduct claim or allegation. Such tools are unreliable, and UFV does not support the use of AI-detection software on student work.

If a student has concerns about an allegation of academic misconduct against them, they can contact the Student Rights and Responsibilities Office for guidance.

When allowing the use of generative AI tools in coursework, it is important that instructors explicitly educate students about the appropriate disclosure or citation of such content. Instructors can provide a disclosure framework or citation guidelines and conventions, and model how these are applied by disclosing their use of AI for the development of course content and materials. The AI disclosure framework or citation guidelines could be added as an additional form to assignments to ensure students disclose their use of AI.

Citation practices help provide some transparency on the use of AI, but they often do not capture the varied ways AI tools function. AI disclosure statements are often longer but are clear, consistent, and adaptable to many AI use scenarios.

Style rules on how to disclose or cite AI generated content are new and evolving. Please consider the following options: 

 

  1. Weaver (2024) AI Disclosure framework:

Artificial Intelligence Tool: [description of tools used]; [Heading]: [description of AI use in that stage of the work].

Multiple heading statements can be added to the disclosure. Please refer to Weaver (2024) for the full list of possible headings.

 

e.g. Artificial Intelligence Tool: Microsoft Copilot (University of the Fraser Valley institutional instance); Conceptualization: Microsoft Copilot was used to brainstorm possible approaches to adapting a traditional lemon pound cake recipe into a gluten‑free version; Information Collection: I used Microsoft Copilot to help locate relevant culinary resources, including common ratios for gluten‑free flour substitutions, best practices for preventing graininess, and typical baking times for dense gluten‑free cakes; Visualization: Microsoft Copilot assisted in creating a draft ingredients table that compared different gluten‑free flour options, and a step‑by‑step process outline that I used to build a final recipe; Writing—Review & Editing: I used Microsoft Copilot to refine the written recipe instructions by using shorter sentences, improving clarity, and standardizing measurement language.

 

  2. AI Citation:

[Generative AI tool]. (YYYY/MM/DD of prompt). "Text of prompt". Generated using [Name of Tool]. Website of tool

e.g. ChatGPT4. (2023/03/13). "Suggest a recipe for gluten-free lemon pound cake". Generated using OpenAI's ChatGPT. https://chat.openai.com

  3. The American Psychological Association (APA), Modern Language Association (MLA), and the Chicago Manual of Style have also provided recommendations on how to cite AI-generated content. The UFV Library has developed an AI Citation & Ethics website with examples for these options.

Currently, Microsoft Copilot is the only recommended GenAI tool at UFV, when users sign in using university credentials. This protected version of Microsoft Copilot conforms to UFV's privacy and security standards. It is free to use, does not share any data with Microsoft or any other company, and conversations are not saved.

There is no university-wide policy. It is, however, recommended to check with your department or faculty for discipline- or program-specific AI guidelines. Ultimately, it is up to individual instructors to decide whether AI is allowed in their course and to what level.

UFV created the Artificial Intelligence Task Force (AITF) to gather input from all university stakeholders and develop UFV institution-wide AI principles to ensure a responsible adoption and integration into education aligned with UFV core values and mission. These seven AI principles cover integrity, flexibility, appropriate use, data governance, ethics, inclusion, and forward-leaning approaches. Academic, research, and administrative units can apply these principles to guide them in the degree and nature of AI use in their respective areas.  

The Teaching and Learning Centre used the UFV AI Principles to develop more specific TLC AI Guidelines that support instructors in applying each of the principles in their pedagogy.

In the syllabus, instructors should define what constitutes acceptable and unacceptable use of AI in their courses, specify which tools can be used and in what context, and explain how AI-generated content should be cited or disclosed. Presenting an AI use scale, such as the AI Assessment Scale (Perkins et al., 2024), could support students in understanding what level of AI use is authorized for each assignment.

A sample syllabus with an AI use statement and citation guidelines can be found in the TLC AI Guidelines.

It is the instructor’s responsibility to be proactive in communicating expectations around the use of AI in their course, or in each assignment, while promoting academic integrity. Fostering an open dialogue about AI can help students navigate the complexities of using AI tools.

Learn with AI FAQ (for Students)

Students may use GenAI as a learning partner, as long as their use complies with legal requirements, university principles, program guidelines and course syllabus. For example, GenAI can support students with comprehension of a topic, translation, idea generation, comparative analysis, writing skills, accessibility, or research.  

This use of AI is generally acceptable. It requires solid AI literacy skills, and students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information. They must adhere to copyright compliance guidelines (see copyright question below). This use of AI generally does not need to be cited or disclosed, as it is not submitted in an assessment.

Students are responsible for submitting assignments that are their original work and represent their own learning.

Students should first carefully review their course syllabus to understand expectations around AI use and academic integrity before starting assignments. Students should not use AI to generate or complete assigned tasks unless explicitly permitted by the instructor within their course. Instructors should clearly communicate expectations around AI use and give clear parameters as to what the AI tool may be used for when completing a task. If the use of AI is not addressed in the course syllabus, or if it is unclear, students should discuss it with their instructors to clarify expectations around its potential use.

If students are permitted to use AI-generated information, they must cite the AI tool following the instructor’s guidelines.

Students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information. They must also adhere to copyright compliance guidelines (see copyright question below) before inserting any information when using an AI tool.

Instructors are strongly encouraged to explicitly specify in their syllabus the extent to which the use of generative AI tools and technology is permitted in the course, which tools are permitted or prohibited, and conventions for referencing AI use. (See Examples of syllabi statements)

Course materials developed by instructors (e.g., syllabus, lecture notes, slides) are their intellectual property. Similarly, original work and assignments submitted by students are their intellectual property, and course texts (e.g., textbooks, readings) are likely copyrighted. Since many GenAI tools use prompts, copied text, and uploaded documents to train their models, students and instructors may violate copyright laws if they submit these materials to third-party GenAI tools without the prior consent of the content owner.

Students are also not allowed to record or capture class lectures using AI tools without obtaining explicit instructor permission.

Students and instructors should not input personal or confidential information, such as name, contact information, or other identifying information into GenAI tools (e.g., imagine you want to get advice on writing a better CV, remove all personal information before uploading).

Copyright ownership of outputs produced by GenAI is still an emerging area of law and there is no clear legislation about GenAI and copyright yet. Instructors considering using these tools should:

  • Understand that they may not own or hold copyright over the work generated with these AI tools.
  • Review the terms of service of each AI tool carefully, as they dictate the use and ownership of inputs and outputs. Be aware that these terms can change without notice.

The use of AI tools such as Microsoft Copilot, ChatGPT, or other generative AI tools does not automatically equate to academic misconduct at UFV. At this time, the use of AI tools in courses is determined by individual instructors, who clarify expectations in their syllabus and explicitly inform students at the beginning of the course.

If using AI tools has been prohibited by the instructor, then using these tools to complete assignments would be considered academic misconduct.

If using AI tools has been permitted or permitted in specific circumstances by the instructor, then the extent to which the use of GenAI tools is permitted, which tools are permitted or prohibited, and conventions for referencing AI use should be transparent and clearly conveyed. If students use AI tools outside of the parameters set by the instructor, then it could be considered academic misconduct.

If the use of AI tools has not been discussed or specified by the instructor, then students should not assume that all available technologies are permitted. If students are not sure about whether AI tools are allowed in a course, they should ask their instructor for clarification and guidance.

 

Policy 70 provides guidance as to what is considered academic misconduct at UFV and addresses the use of AI, referring to it as "unauthorized technologies".

It is important for instructors to promote academic integrity throughout their course term and provide opportunities for explicit conversations to reinforce students' commitment to uphold UFV's values and the importance of honesty and integrity, while encouraging responsible use of AI and clarifying expectations.

Refer to the Academic Integrity, Student Rights & Responsibilities office for more information on academic integrity, including the student academic misconduct Policy 70, regulations and procedures, appeals, and more.

When generative AI tools are allowed in coursework, it is important to disclose their use explicitly and appropriately. Instructors can provide a disclosure framework or citation guidelines and conventions, and model how these are applied by disclosing their own use of AI in the development of course content and materials. This disclosure framework or citation guidelines could be added as an additional form to assignments to ensure students are transparent about their use of AI. If students are unclear about how to cite their use of AI, they should ask their instructor.

 

Contact Us