AI at UFV

Frequently Asked Questions

Learn with AI FAQ (for Students)

Students may use GenAI as a learning partner, as long as their use complies with legal requirements, university principles, program guidelines, and the course syllabus. For example, GenAI can support students with comprehension of a topic, translation, idea generation, comparative analysis, writing skills, accessibility, or research.

This use of AI is generally acceptable. It requires solid AI literacy skills, and students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information. They must adhere to copyright compliance guidelines (see copyright question below). This use of AI generally does not need to be cited or disclosed as it is not submitted in an assessment.

Students are responsible for submitting assignments that are their original work and represent their own learning.

Students should first carefully review their course syllabus to understand expectations around AI use and academic integrity before starting assignments. Students should not use AI to generate or complete assigned tasks unless explicitly permitted by the instructor within their course. Instructors should clearly communicate expectations around AI use and give clear parameters as to what an AI tool may be used for when completing a task. If the use of AI is not addressed in the course syllabus, or if it is unclear, students should discuss it with their instructors to clarify expectations around its potential use.

If students are permitted to use AI-generated information, they must cite the AI tool following the instructor’s guidelines.

Students should be aware of the limitations of GenAI tools, including the potential for generating inaccurate or biased information. They must also adhere to copyright compliance guidelines (see copyright question below) before inserting any information when using an AI tool.

Instructors are strongly encouraged to explicitly specify in their syllabus the extent to which the use of generative AI tools and technology is permitted in the course, which tools are permitted or prohibited, and conventions for referencing AI use. (See Examples of syllabi statements)

Course materials developed by instructors (e.g., syllabus, lecture notes, slides) are their intellectual property. Similarly, original work and assignments submitted by students are their intellectual property, and course texts (e.g., textbooks, readings) are likely copyrighted. Since many GenAI tools use prompts, copied text, and uploaded documents to train their models, students and instructors may violate copyright laws if they submit these materials to third-party GenAI tools without the prior consent of the content owner.

Students are also not allowed to record or capture class lectures using AI tools without obtaining explicit instructor permission.

Students and instructors should not input personal or confidential information, such as names, contact information, or other identifying details, into GenAI tools (for example, if you want advice on improving a CV, remove all personal information before uploading it).

Copyright ownership of outputs produced by GenAI is still an emerging area of law and there is no clear legislation about GenAI and copyright yet. Instructors considering using these tools should:

  • Understand that they may not own or hold copyright over the work generated with these AI tools.
  • Review the terms of service of each AI tool used carefully, as it dictates the use and ownership of inputs and outputs. Be aware that these terms can change without notice.

The use of AI tools such as Microsoft Copilot, ChatGPT, or other generative AI tools does not automatically equate to academic misconduct at UFV. At this time, the use of AI tools in courses is determined by individual instructors, who clarify expectations in their syllabi and explicitly inform students at the beginning of the course.

If using AI tools has been prohibited by the instructor, then using these tools to complete assignments would be considered academic misconduct.

If using AI tools has been permitted or permitted in specific circumstances by the instructor, then the extent to which the use of GenAI tools is permitted, which tools are permitted or prohibited, and conventions for referencing AI use should be transparent and clearly conveyed. If students use AI tools outside of the parameters set by the instructor, then it could be considered academic misconduct.

If the use of AI tools has not been discussed or specified by the instructor, then students should not assume that all available technologies are permitted. If students are not sure about whether AI tools are allowed in a course, they should ask their instructor for clarification and guidance.

 

Policy 70 provides guidance on what is considered academic misconduct at UFV and addresses the use of AI, referring to it as ‘unauthorized technologies’.

It is important for instructors to promote academic integrity throughout the course term and provide opportunities for explicit conversations that reinforce students’ commitment to upholding UFV’s values and the importance of honesty and integrity, while encouraging responsible use of AI and clarifying expectations.

Refer to the Academic Integrity, Student Rights & Responsibilities Office for more information on academic integrity, including the student academic misconduct Policy 70, regulations and procedures, appeals, and more.

When using generative AI tools in coursework, if allowed, it is important to explicitly and appropriately disclose that use. Instructors can provide a disclosure framework or citation guidelines and conventions, and model how these are applied by disclosing their own use of AI in the development of course content and materials. This AI disclosure framework or these citation guidelines could be added as an additional form to assignments to ensure students are transparent about their use of AI. If students are unclear about how to cite their use of AI, they should ask their instructor.

 

Teach with AI FAQ (for instructors)

Yes, instructors can use GenAI tools for support with development of teaching materials such as slide decks, assignment descriptions, rubrics, or other content. It is the instructor’s responsibility to ensure that any AI-generated content is accurate, up-to-date, and inclusive before using it in a course. Use of GenAI tools must align with UFV AI Principles.

Instructors should acknowledge their GenAI use in their syllabus and on AI-generated material, similar to how they would credit materials borrowed or adapted from a colleague. Consistent acknowledgement will model appropriate use and citation of GenAI tools for students.

Yes, instructors may use GenAI technology in their teaching face-to-face or online. Instructors can model the ethical and judicious use of a GenAI tool themselves or give students the option to use a GenAI tool to accomplish the learning outcomes of a course. In designing learning experiences that integrate AI use, instructors should consider ways to scaffold AI literacy skills such as:

  • understanding how AI works,
  • interacting effectively with AI technology through prompt engineering,
  • recognizing ethical implications such as bias, cultural appropriation, stereotypes, privacy and copyright issues, among other elements,
  • or critically evaluating AI outputs and use, recognizing issues such as AI hallucinations, dated language, and missing perspectives, voices, and knowledge.

Instructors could ask their students to use the protected version of Microsoft Copilot. However, asking students to access GenAI tools that have not been vetted by UFV for privacy and security is not recommended or supported. The University generally discourages the use of such digital tools for instruction until it is assured that the tool protects any personal data (e.g., the email address used to register on the system).

Instructors should provide alternative means to complete assignments for students who choose not to use GenAI technology. Instructors should indicate on their syllabus what AI tools may be used in the course.

It is up to individual instructors to decide if AI is allowed when completing assignments. 

Instructors are strongly encouraged to explicitly specify in their syllabus the extent to which the use of generative AI tools and technology is permitted in the course, which tools are permitted or prohibited, and conventions for referencing AI use. (See Examples of syllabi statements)


Like most post-secondary institutions in British Columbia, UFV does not currently support AI-detection software on student work for two important reasons. First, inputting student work into a detector without their knowledge or consent is a violation of privacy law. Instructors are currently not authorized to share student work outside of the course without the students’ permission, and sharing it with an AI detector, a third-party for-profit company, is no exception. Second, research has shown that these tools currently fail to reliably distinguish original content from AI-generated content. They are known to frequently produce false positives, incorrectly detecting AI-generated content in a student’s original work, and often produce results that are biased against speakers of English as an additional language (Dalalah & Dalalah, 2023; Elkhatat, Elsaid, & Almeer, 2023; Liang et al., 2023). Until the effectiveness and accuracy of these tools has been widely established, the university has decided to err on the side of caution and does not support AI-detection tools.

With these reasons in mind, AI-detection tool results cannot be used as the sole factor in an allegation of academic misconduct. If an instructor suspects an unauthorized use of AI to complete an assignment, they should proceed as they would for any other potential allegation of academic misconduct. The procedure for investigating academic misconduct is outlined in Policy 70, Student Academic Misconduct Procedures.

Provide opportunities for open discussions

Instructors could provide opportunities for their students to openly discuss GenAI tools, how to interact with them effectively, and their limitations. These discussions will help set clear guidelines for students on what tools may or may not be used to complete assignments and why.

 

Promote academic integrity:

Instructors should provide opportunities for explicit conversations about academic integrity. Co-create an honour pledge with students to reinforce their commitment to uphold UFV’s values and the importance of honesty and integrity, while encouraging responsible use of AI and clarifying expectations (Eaton et al., 2022; Stoesz & Eaton, 2022).

 

Design AI-resistant assignments:

Instructors can design assignments and activities that support authentic evaluation of students’ knowledge and skills and that cannot easily be completed by AI. The following strategies are often effective in ensuring that students limit their use of AI in assignments:

  • Diversify formats of assignments: Have students create products such as videos, podcasts, or infographics that promote authentic demonstration of learning.
  • Add context: Include a contextual component in assignments that address course-specific experiences, such as classroom discussions, readings, course content, and other considerations.
  • Capture process: Include opportunities for students to reflect on the process of completing an assignment through personal meaning-making and individual learning.
  • In-class assignments: Create individual or group assignments that are completed in ‘real time’ in class that center problem-solving, generating questions and summaries from group input, and promote collaborative discussion-based content generation.
  • Indigenize assignments: AI often lacks the context and nuance necessary to represent First Nations’ voices, stories, and histories, and does not understand Halq’eméylem. Adding an Indigenous component to an assignment will make it more difficult to complete with AI.

If an instructor suspects an unauthorized use of AI to complete an assignment, they should proceed as they would for any other potential allegation of academic misconduct. The procedure for investigating academic misconduct is outlined in Policy 70, Student Academic Misconduct Procedures.

Please note that instructors should not rely on AI-detection tool results as the sole basis for an academic misconduct claim or allegation. Such tools are unreliable, and UFV does not support the use of AI-detection software on student work.

If a student has concerns about an allegation of academic misconduct against them, they can contact the Student Rights and Responsibilities Office for guidance.

When allowing the use of generative AI tools in coursework, it is important that instructors explicitly educate students about the appropriate disclosure or citation of such content. Instructors can provide a disclosure framework or citation guidelines and conventions, and model how these are applied by disclosing their use of AI for the development of course content and materials. The AI disclosure framework or citation guidelines could be added as an additional form to assignments to ensure students disclose their use of AI.

Citation practices help provide some transparency on the use of AI, but they often do not capture the varied ways AI tools function. AI disclosure statements are often longer, but they are clear, consistent, and adaptable to many AI use scenarios.

Style rules on how to disclose or cite AI generated content are new and evolving. Please consider the following options: 

 

  1. Weaver (2024) AI Disclosure framework:

Artificial Intelligence Tool: [description of tools used]; [Heading]: [description of AI use in that stage of the work].

Multiple heading statements can be added to the disclosure. Please refer to Weaver (2024) for all the possible headings.

 

e.g. Artificial Intelligence Tool: Microsoft Copilot (University of the Fraser Valley institutional instance); Conceptualization: Microsoft Copilot was used to brainstorm possible approaches to adapting a traditional lemon pound cake recipe into a gluten‑free version; Information Collection: I used Microsoft Copilot to help locate relevant culinary resources, including common ratios for gluten‑free flour substitutions, best practices for preventing graininess, and typical baking times for dense gluten‑free cakes; Visualization: Microsoft Copilot assisted in creating a draft ingredients table that compared different gluten‑free flour options, and a step‑by‑step process outline that I used to build a final recipe; Writing—Review & Editing: I used Microsoft Copilot to refine the written recipe instructions by using shorter sentences, improving clarity, and standardizing measurement language.

 

  2. AI Citation:

[Generative AI tool]. (YYYY/MM/DD of prompt). “Text of prompt”. Generated using [Name of Tool.] Website of tool  

e.g. ChatGPT-4. (2023/03/13). “Suggest a recipe for gluten-free lemon pound cake”. Generated using OpenAI’s ChatGPT. https://chat.openai.com

  3. The American Psychological Association (APA), Modern Language Association (MLA), and the Chicago Manual of Style have also provided recommendations on how to cite AI-generated content. The UFV Library developed a website for AI Citation & Ethics with examples for these options.

Currently, Microsoft Copilot is the only recommended GenAI tool at UFV, when users sign in with their University credentials. This protected version of Microsoft Copilot conforms to UFV’s privacy and security standards. It is free to use, does not share any data with Microsoft or any other company, and conversations are not saved.

There is no university-wide policy. It is, however, recommended to check with your department or faculty for discipline- or program-specific AI guidelines. Ultimately, it is up to individual instructors to decide whether AI is allowed in their course and to what extent.

UFV created the Artificial Intelligence Task Force (AITF) to gather input from all university stakeholders and develop institution-wide AI principles that ensure responsible adoption and integration of AI into education, aligned with UFV’s core values and mission. These seven AI principles cover integrity, flexibility, appropriate use, data governance, ethics, inclusion, and forward-leaning approaches. Academic, research, and administrative units can apply these principles to guide the degree and nature of AI use in their respective areas.

The Teaching and Learning Center used the UFV AI Principles to develop more specific TLC AI guidelines that support instructors in applying each of the UFV AI Principles in their pedagogy.

In the syllabus, instructors should define what constitutes acceptable and unacceptable use of AI in their courses, specify which tools can be used and in what context, and mention how AI generated content can be cited or disclosed. Presenting an AI use scale, such as the AI Assessment Scale (Perkins et al., 2024), could support students in understanding what level of AI use is authorized for each assignment.

Sample syllabi with AI use statements and citation guidelines can be found in the TLC AI Guidelines.

It is the instructor’s responsibility to be proactive in communicating expectations around the use of AI in their course, or in each assignment, while promoting academic integrity. Fostering an open dialogue about AI can help students navigate the complexities of using AI tools.

Contact Us