Welcome to our resource hub for faculty exploring the evolving role of generative AI in education. Here, you’ll find tools, strategies, and guidelines to help you integrate AI thoughtfully and responsibly into your teaching. From understanding AI's potential and limitations to addressing concerns around academic integrity, we aim to provide insights that support both faculty and students. Be sure to also check out the companion page for students, designed to foster responsible and informed AI usage in their academic work.
Since the release of ChatGPT in November 2022, there has been a flurry of articles written on the pros and cons of using generative artificial intelligence (GenAI) in the classroom. While many embrace the changes that GenAI has the potential to bring to our understanding of authorship, plagiarism, and assessment (e.g., Compton, n.d.), the International Centre for Academic Integrity’s Statement on Academic Integrity and Artificial Intelligence (PDF) highlights many of the concerns raised in the literature. The use of text-generating tools in the classroom generally, or for writing activities more specifically, is an ongoing debate worth considering.
When used with caution and discretion, however, introducing students to appropriate uses of GenAI tools may help mitigate academic integrity breaches and teach core critical thinking and literacy skills. Consider how you might use GenAI to help students:

- Brainstorm topics
- Identify keywords
- Create or critique an outline
- Identify sources (e.g., organizations, individuals)
- Summarize
- Revise
- Fact-check
- Give or receive feedback
- Develop metacognition and self-regulation (e.g., the Brett D. Jones and David Hicks example)
Generative AI tools can indeed make life and work a lot easier! Like any tool, however, there are many things to consider before using generative AI. You are responsible for educating yourself on the ethics of any AI you use and understanding the risks when you are permitted to use it.
While AI offers exciting opportunities to enhance accessibility and inclusivity, it’s crucial to recognize that it can also inadvertently exclude many individuals, particularly those who already face accessibility challenges. As we explore GenAI tools, it's essential to prioritize equitable access for all users. For example, although many generative AI tools are currently free, a growing number are starting to charge fees for access, which can create barriers for those who may not be able to afford it.
At the same time, GenAI tools can also function as accessibility aids for learners.
One may think that technology is objective and neutral. Generative AI, however, is trained on real-world data, such as images and text scraped from the internet, and that information is rife with human biases. Biases can be embedded in a model during its creation, and biases in the training datasets can influence the content it generates. Additionally, a model can develop its own biases based on how it interprets the data, and user input may inadvertently guide it toward biased responses.
AI Bias: "also referred to as machine learning bias or algorithm bias, refers to AI systems that produce biased results that reflect and perpetuate human biases within a society" (IBM Data and AI Team, 2023). Some common biases include gender stereotypes and racial discrimination.
Recognizing these factors is essential for critically evaluating AI-generated content. By understanding potential biases in the data, model design, and user input, you can better assess the credibility and accuracy of the AI's output.
Developing, training, and using GenAI models requires systems that consume significant amounts of energy and contribute to carbon emissions. These systems also consume vast amounts of water for cooling and generate electronic waste from the hardware they run on. While developers continue to seek more sustainable ways of operating, it is important to consider whether your use of AI is worth the environmental impact and to look for ways to use GenAI tools more efficiently when possible.
As the reliance on AI tools grows, it's crucial to prioritize sustainable practices and consider the ecological footprint of these technologies to mitigate their impact on the environment.
Since GenAI is trained on real-world data, text, and media from the internet, the content it provides to users may be misleading, factually inaccurate, or outright misinformation (like deep fakes, for example). Because it’s unknown exactly where the data used to train AI originates and AI cannot specify its sources, its output may not be credible or reliable for academic use. The information provided may be implicitly or explicitly biased, outdated, or a “hallucination.”
AI hallucination: when GenAI fabricates information or sources, even though it is meant to be trained on real-world data. IBM (n.d.) examines the various causes of AI hallucinations, indicating that common factors include “overfitting, training data bias/inaccuracy and high model complexity.”
To avoid using or spreading misinformation, verify the accuracy of AI-generated content using reliable sources before including it in your work.
Like other digital tools, generative AI tools collect and store data about users. Signing up to use generative AI tools allows companies to collect data about you. This data can be used to make changes to tools to keep you engaged.
User data may also be sold or given to third parties for marketing or surveillance purposes. When interacting with AI tools, you should be cautious about supplying sensitive information, including personal, confidential, or proprietary information or data.
Note: Always review the terms and conditions of any application you use, and be aware that these agreements often include a clause allowing the company to modify the terms at any time.
As you guide students in navigating the world of generative AI, it’s essential to emphasize the importance of upholding academic integrity. Begin by reviewing your institution’s policies regarding AI tool usage in assignments, as these guidelines can vary. Familiarize yourself with your college’s academic integrity and generative AI standards to ensure your teaching aligns with institutional expectations. Encourage students to experiment with AI tools to enhance their learning while stressing the importance of critically evaluating AI-generated content for credibility and potential biases. Finally, remind students to properly disclose or cite AI-generated outputs in their work, promoting transparency and respect for intellectual property.
Generative AI tools rely on information from the internet, which includes both accurate and reliable sources as well as outdated, biased, or incorrect information. These tools can sometimes provide inaccurate explanations or misrepresent sources, so it’s essential to critically evaluate their outputs—just as you would with any information source. Learn more about evaluating generative AI outputs on the AI Literacy & Evaluation Criteria page.
Students should be aware that using generative AI tools typically requires creating an account. When they create accounts or interact with AI platforms, their personal information, inputs, and even behavioural data can be collected, and may be stored or shared with third parties.
Additionally, there’s a risk that sensitive information shared with AI could be used to further train the model, potentially leading to unintended data exposure. Encourage students to be mindful of the information they share during account creation, to review the privacy policies and terms of use of any generative AI tool to understand how their data will be handled, and to be cautious about sharing any personal or sensitive information.
When using generative AI tools, it’s essential to consider the purpose and intent behind your use to maintain academic integrity and ethical standards. Ask yourself: What am I using this software to create? For instance, if you’re using an image generator, are you trying to replicate the style of a specific living artist? This raises ethical concerns, as it may infringe upon that artist’s intellectual property. Similarly, consider whether you’re using AI to complete assignments or produce content without informing your instructor. Using generative AI tools in a class setting without your instructor’s knowledge can lead to misunderstandings and potential breaches of academic integrity policies. Always be transparent with your instructor about how you’re incorporating AI into your work, and ensure that your use aligns with your institution’s policies.
Generative AI can enhance various tasks, but it cannot replace essential skills like research, critical thinking, and evaluating information. While AI tools can quickly provide information, the ability to conduct thorough research—formulating effective questions, identifying credible sources, and synthesizing information—is vital. Critical thinking enables you to analyze AI-generated content for validity and discern bias or inaccuracies. Instead of replacing these skills, generative AI should be viewed as a supportive tool that enhances your research capabilities, allowing you to focus on deeper analysis and critical engagement with the material for more informed outcomes.
When you use generative AI tools to create content, it’s essential to disclose that it’s AI-generated. In academic work, be sure to check your college's policies and let your instructor know how you’re using AI, as transparency is part of academic integrity.
When using generative AI tools, it's important to include clear disclosure statements to inform your audience about the AI’s role in content creation. Unlike citations, which specifically credit the source of information, disclosure statements provide transparency about how AI has been used—whether it's for generating ideas, drafting content, or providing assistance. This distinction helps maintain ethical standards and fosters trust, ensuring that your audience understands when and how AI has contributed to the material.
There are many different ways to present disclosure statements, depending on how AI was used and the requirements of your institution. Whether brief or detailed, the important thing is that they clearly explain how AI contributed to your work, ensuring transparency and academic integrity. When in doubt, always consult your instructor about their expectations.
Example: “Portions of this assignment were developed with the assistance of a generative AI tool (e.g., ChatGPT) for brainstorming and outlining; all final writing, analysis, and fact-checking are my own.” (This is an illustrative sample; follow your instructor’s or institution’s preferred format.)
Generative AI tools often use material sourced from the internet, typically without explicit permission from original authors. Some argue that this practice can resemble "copyright laundering" by creating derivative works based on existing material without directly infringing on copyright laws.
The nature of generative AI (GenAI) content and the increasing popularity of GenAI tools have led to important copyright questions, including defining copyright ownership of generative AI works and the rights of creators whose works have been used to train AI models. The following provides an overview of copyright considerations to keep in mind when using GenAI tools. Due to the rapidly changing GenAI copyright landscape, the information presented below may change over time.
The two main areas of copyright concern revolve around generative AI input and output.
Check your college's copyright guides for more information about Generative AI & copyright.
Explore these faculty resources tailored to support your college community in effectively incorporating generative AI into teaching and learning.
Note: If your college isn't listed above, there are no faculty-specific resources available at this time. For more student-centered information about GenAI, try our Generative AI module.
Tell-Tale Signs of AI-Generated Text

| Sign | Description |
|---|---|
| Generic Style | Repetitive writing that lacks a unique voice, perspective, or specific details |
| Missing Context | Lacks detailed information or a nuanced understanding of your specific topic |
| Missing or Fake Sources | Citations are missing, or the tool may fabricate (i.e., “hallucinate”) citations |
| Overuse of Jargon | Overly reliant on certain words that are not commonly used in everyday language |
| Inconsistencies | Statements may contradict one another or may be completely unrelated to the topic |
| Outdated Information | Information is not always up to date and might contain inaccuracies |
Generative AI offers a wide range of possibilities for enhancing teaching and streamlining instructional tasks. From drafting course materials to designing assessments, AI tools can save time and inspire creativity in the classroom. To help you explore these options, we've compiled a list of AI tools tailored for faculty needs.
For even more options, check out Futurepedia, a comprehensive directory of AI tools that is frequently updated to include the latest innovations in AI technology. With these resources, you can discover new ways to support your teaching and engage your students.
Explore a variety of AI tools beyond Copilot and ChatGPT to enhance classroom learning and teaching practices.
Information on this page was adapted, with permission, from "What Students Should Know About AI & ChatGPT" by Centennial College Libraries, "The Generative AI Toolkit" by Conestoga College, the "Generative AI @ Fleming" guide by Fleming College, the "Generative AI Guide" and "AI & Copyright" by Seneca Polytechnic Libraries, and "Designing Research Assignments that Enhance Student Research Skills" by Kathleen Oakey, a faculty librarian at Sheridan College.