Because of my work exploring the intersection of authentic assessment and academic integrity, I am interested in the recent chatter about ChatGPT. What does it change, exactly? How can or should educators respond? What new learning opportunities does it present? This is a short selection from the explosion of articles about how the development of artificial intelligence will transform postsecondary assessment practice.

Mollick, E. R., & Mollick, L. (2022). New modes of learning enabled by AI chatbots: Three methods and assignments. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4300783

AI is a cheap way to provide students with many examples, some of which may be inaccurate, or need further explanation, or may simply be made up. For students with foundational knowledge of a topic, you can use AI to help them test their understanding, and explicitly push them to name and explain inaccuracies, gaps, and missing aspects of a topic. The AI can provide an unending series of examples of concepts and applications of those concepts and you can push students to: compare examples across different contexts, explain the core of a concept, and point out inconsistencies and missing information in the way the AI applies concepts to new situations.

Eaton, S. E., & Anselmo, L. (2023, January). Teaching and learning with artificial intelligence apps. Taylor Institute for Teaching and Learning. https://taylorinstitute.ucalgary.ca/teaching-with-AI-apps

Six thoughts on artificial intelligence and academic integrity (Eaton, 2022): 

  1. Using artificial intelligence for school work does not automatically equate to misconduct. 

  2. Artificial intelligence can be used ethically for teaching, learning, and assessment. 

  3. Trying to ban the use of artificial intelligence in school is not only futile, it is irresponsible. 

  4. Human imagination and creativity are not threatened by artificial intelligence. 

  5. Assessments must be fit for purpose and should align with learning outcomes. 

  6. Artificial intelligence is not going anywhere. We must learn to work with new technology, not against it. 

Schmidt, H. (2023, January 22). ChatGPT, popular AI programs under watch at Waterloo region universities. CTV News Kitchener. https://kitchener.ctvnews.ca/chatgpt-popular-ai-programs-under-watch-at-waterloo-region-universities-1.6241471

"When I have essays or something, if I need to clarify the instructions, I’ll put the instructions into ChatGPT and then it’ll just give it to me in an easier way,"

Parsons, J. (2023, January 30). Post-secondary sector must embrace AI technology in education. University of Waterloo. https://uwaterloo.ca/news/post-secondary-sector-must-embrace-emerging-ai-technology

“The first thing you have to say is it’s super disruptive,” says Dr. Marcel O’Gorman, a professor in the Department of English at the University of Waterloo and the founding director of the Critical Media Lab. “The question is, what is it disrupting? Sure, some of the discussion has to be about impacts in education, but I think that might be missing the mark.” (para 3)

There are always straightforward ways to adapt assessments to foster a culture of academic integrity and engagement. One easy way is to have students complete coursework that involves the creation and evaluation of knowledge, rather than more rudimentary assessment of memorization or simple understanding. (para 9)

Rigolino, R. E. (2023, January 31). With ChatGPT, we’re all editors now. Inside Higher Ed. https://www.insidehighered.com/views/2023/01/31/chatgpt-we-must-teach-students-be-editors-opinion

there’s an existential difference between spell-check and AI-generated writing. While computer programs can be leveraged to reduce the drudgery of proofreading for spelling, grammar and citation errors, these programs aren’t like ChatGPT, which produces coherent texts that students can hand in, with no revision, for a passing grade (at least some of the time). The text is being generated on behalf of the student and is being substituted for the student’s self-generated text. This use of AI is inherently dishonest. (para 9)

Kelly, S. M. (2023, February 1). ChatGPT creator rolls out ‘imperfect’ AI detection tool to help teachers spot potential cheating. CNN.

"We really don't recommend taking this tool in isolation because we know that it can be wrong and will be wrong at times -- much like using AI for any kind of assessment purposes," Ahmad said. "We are emphasizing how important it is to keep a human in the loop ... and that it's just one data point among many others."

Ahmad notes that some teachers have referenced past examples of student work and writing style to gauge whether it was written by the student. While the new tool might provide another reference point, Ahmad said "teachers need to be really careful in how they include it in academic dishonesty decisions."

McMurtrie, B. (2023, February 2). Rethinking research papers, and other responses to ChatGPT. Teaching: The Chronicle of Higher Education. https://www.chronicle.com/newsletter/teaching/2023-02-02

Maier tried out an approach called the I-Search Paper, in which the subject becomes the process of searching for information, what the student learned, and what questions arose from that.

“It really is like a dissertation proposal,” he says, “on a much more informal level”: This is what I learned. This is a question that stemmed from that. This is why I think it’s really important and interesting. These are the kinds of sources I plan to use to find answers.

Most important, says Maier, students must explain how their research changed their thinking. While ChatGPT could be used as a research tool, the final product would be an original work.

Griffith, T. L. (2023, February 14). Why using AI tools like ChatGPT in my MBA innovation course is expected and not cheating. The Conversation. https://theconversation.com/why-using-ai-tools-like-chatgpt-in-my-mba-innovation-course-is-expected-and-not-cheating-198957

In my course, the notion of “individual work” must change.

I’ll be adjusting the assignments and requiring an appendix describing the toolkit and practices students use. Using AI is not cheating in my course, but misrepresenting your sources is.

The AI will get better, and there will be more of them. Guidelines in work and education need to keep pace and be thoughtfully aligned to how knowledge is constructed in different fields.

Nagel, D. (2023, February 16). K16, GPTZero partner on AI writing detection tool. Campus Technology. https://campustechnology.com/articles/2023/02/16/k16-gptzero-partner-on-ai-writing-detection-tool.aspx

"This technology eliminates the manual process of faculty spot-checking student submissions one by one for potential AI-generated content. It also provides academic leaders with a complete and holistic picture of just how much student-submitted content across their institution is potentially AI-generated."

The academic integrity technological arms race is on. Students will pay for a subscription to ChatGPT (which will continue to get better at evading detection), and academic institutions will use taxpayer dollars and student tuition to subscribe to services that detect AI-generated work. The cycle will likely ratchet up, becoming both more sophisticated and more expensive for institutions and students.

Benson, A. (2023, February 25). AI programs like ChatGPT could change Saskatchewan education, experts say. Global News. https://globalnews.ca/news/9511957/sask-education-chatgpt-ai/

some academic leaders are already turning to the program for help.

“Personally, I’ve been using it for just about everything,” University of Regina technology professor Alec Couros said. “Emails, discussions with students and developing lessons.”

McMurtrie, B. (2023, March 16). What you need to know about ChatGPT. Teaching: The Chronicle of Higher Education. https://www.chronicle.com/newsletter/teaching/2023-03-16

  • Communicate with students

  • Be cautious about detection tools

  • There are better ways to bolster academic integrity (tapping into what is already known about good pedagogy can help, including designing assessments that seem valuable to students).

  • These tools can be an educational aid

  • Digital literacy is more important than ever

  • Start a conversation!

Milian, R. P., & Janzen, R. (2023, March 29). How are Canadian postsecondary students using ChatGPT? Academica Forum. https://forum.academica.ca/forum/canadian-students-and-chatgpt-a-new-learning-tool

Overall, these results suggest that as of February 2023, there were high levels of awareness of ChatGPT among Canadian postsecondary students, but relatively low rates of problematic use as it pertains to blatant cheating…. Documented use of ChatGPT as a learning aid potentially reflects a need for additional learning supports, and an opportunity for institutions to intentionally employ AI technologies for these purposes. Technologies like ChatGPT can provide an invaluable resource for students who need technical terms explained in plain language or wish to have assignment questions reworded, for example.

Yang, H. (2023, April 12). How I use ChatGPT responsibly in my teaching. Nature. https://www.nature.com/articles/d41586-023-01026-9

In previous years, I’ve assigned a literature review to my students. This year, to avoid plagiarism and encourage creativity, I asked students to work in small groups to collect air-quality data on campus. However, the students will still use statistical methods to analyse the data themselves and write individual essays.

Of course, many students are not familiar with creating projects. Some struggled to come up with a suitable method to assess carbon dioxide emissions — so I suggested that they use ChatGPT to help them to design their projects. The model can outline several steps: from identifying a location to choosing a CO2 monitoring device, setting up the equipment, collecting and analysing data and presenting and disseminating the results.

The students did all of the work when it came to scientific analysis and writing their essays — but they also learnt how LLMs can generate scientific ideas and help to plan generic experiments.

Prentice, A-E. (2023, April 26). ChatGPT not the cheating wingman you need, Manitoba colleges, universities warn. Global News. https://globalnews.ca/news/9653932/chatgpt-cheating-manitoba-colleges-universities/

Foltynek, T., Bjelobaba, S., Glendinning, I., Khan, Z. R., Santos, R., Pavletic, P., & Kravjar, J. (2023). ENAI recommendations on the ethical use of artificial intelligence in education. International Journal for Educational Integrity, 19(12). https://edintegrity.biomedcentral.com/articles/10.1007/s40979-023-00133-4

Day, T. (2023). A preliminary investigation of fake peer-reviewed citations and references generated by ChatGPT. The Professional Geographer. https://www.tandfonline.com/doi/full/10.1080/00330124.2023.2190373

The discovery of fake sources also calls into question the veracity of statements made in ChatGPT answers to questions. The chatbot will likely improve, but initial enthusiasm should be tempered with a more nuanced and cautious approach to the application of AI chatbot technology to teaching and research. It is unlikely the technology will disappear because of the teething problems identified here and there are still many potential applications of AI chat-bots in postsecondary education (Conclusion, p. 3)

Janzen, R. (2023, May 17). Canadian PSE and the machine: Faculty, staff, and leaders share their thoughts on AI. Academica Forum. https://forum.academica.ca/forum/canadian-postsecondary-professionals-share-their-perspective-on-ai