This month’s professional development theme for me is Artificial Intelligence (AI). I have some sessions lined up, and I am very excited to start learning more about my district’s stance on AI and how they encourage us to use it. Carol Todd sent me an invitation to the workshop I will be discussing and reflecting on today, and I knew I had to jump at the opportunity.

Our workshop was hosted by my district’s Principal of Technology, Dave Sands, and Dr. Alec Couros, who is the Director for the Centre of Teaching & Learning at the University of Regina on top of being a professor of Educational Technology and Media. It was supposed to be in person, but due to the recent snowstorm it was moved online.

The main focus of the workshop was to discuss how teachers, students, and the general population have been using ChatGPT and other AI platforms. We were interested in how AI could drastically change the learning landscape for both teacher and student, while also considering the ethical and security issues that may come with it.

At this moment in time, many of the big tech companies are investing heavily in AI (Microsoft’s major investment in OpenAI, for example) and integrating it into their platforms, such as: Microsoft Office’s Copilot, PowerPoint’s Speaker Coach, Google’s Bard (not currently available in Canada), Google Workspace’s Petri, Khan Academy and Duolingo’s tutoring tools, Snapchat’s AI bot, and Meta’s collaboration with Ray-Ban.

There have been rapid advancements in AI: these platforms can now reason better, do complex tasks (i.e. synthesize large documents), produce output in multiple languages, vocalize outputs, and offer more multimodal capabilities. Many AI platforms are now coming out with custom GPTs and AI agents, where the AI is subject- or purpose-specific (from colouring-page generators to advertisement generators).

The biggest takeaway that Dr. Couros brought to the discussion about teachers and AI use is to be very specific with your prompting. You can either try incremental prompting to get closer and closer to your desired output, or you can write one very long, specific prompt that encapsulates everything you want, down to the tone and the purpose. Below are the presentation notes that Dr. Couros shared with us, which have examples of both types of prompting.

Some of the major caveats of utilizing AI in the educational sense include:

Some of our global concerns on the use of AI include:

This talk on the concerns we have about AI came highly recommended by Dr. Couros (and it was from 9 months ago!).

Interesting applications for AI use outside of the educational sphere:

Dr. Couros painted a very detailed picture of what AI is, how we use it in education and beyond, the concerns we should be aware of, and how we can utilize AI in our practice. Of all the workshops I’ve taken on AI this past school year, this one was the most thorough, with the most tangible examples presented. I was so impressed with all of the different ways he could show us how AI is being used in the world today. I had heard many times before about how important prompting is, yet I never really took it too seriously until Dr. Couros showed us how results can vary drastically simply by changing a small detail.

I have since tried both incremental prompting and large-format prompts, and I think I have a preference for the incremental style. I tried to make a sample rubric for a unit I would like to try in the near future, and I enjoyed seeing all of the changes that were made along the way. I started with “make me a rubric using the BC proficiency scales as headings”, added “write the rubric in the first person so a student could identify with the rubric”, elaborated further by asking the AI to “write the rubric for a grade 3 reading level”, and finally asked it to “weight the rubric more on the technical side of the project”. Each time I made a suggestion, the rubric changed for the better. I still need to make some personal tweaks if I do choose to use it, but it has given me a great foundational start to the plan.
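For anyone curious about why incremental prompting works at all: chat-style AI tools keep the whole conversation as context, so each follow-up request is interpreted against every earlier prompt and reply. Here is a minimal sketch of how that history accumulates, assuming a chat API that takes a list of role-tagged messages; the `ask_model` function is a hypothetical stand-in for a real provider call, and the prompts are the four rubric refinements from my exercise.

```python
def refine(history, follow_up, ask_model):
    """Send a follow-up prompt along with the full conversation so far.

    `ask_model` is a hypothetical stand-in for a real chat-API call:
    it receives the whole message list and returns the model's reply.
    """
    history.append({"role": "user", "content": follow_up})
    reply = ask_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply

# The four incremental rubric prompts from the exercise above:
prompts = [
    "Make me a rubric using the BC proficiency scales as headings.",
    "Write the rubric in the first person so a student could identify with it.",
    "Rewrite the rubric at a grade 3 reading level.",
    "Weight the rubric more on the technical side of the project.",
]

history = []
# A fake model stands in here so the sketch runs without an API key.
fake_model = lambda msgs: f"(rubric draft after {len(msgs)} messages)"
for p in prompts:
    refine(history, p, fake_model)

# After four rounds the history holds all eight turns (4 prompts +
# 4 replies), so each new request builds on everything asked so far.
print(len(history))  # 8
```

This is also why starting a fresh chat mid-task loses your earlier refinements: the accumulated history is the only thing carrying them forward.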

I was amazed that there were so many custom GPTs out there just in the educational sphere, and I was really intrigued by IEP generators, since they handle highly specific and sensitive information. I wondered if there would be a time and place for such a GPT when privacy is of the utmost importance. As a sometimes part-time student support teacher, I often meet with the full-time student support staff, who struggle with all of the paperwork that is expected of them on top of providing support for our students on IEPs. If they were able to get an AI to write our IEPs, it would free up more time for them to see these students one-on-one. But each student is so unique in what they need an IEP for that I feel as if student services couldn’t use a GPT to write any part of the IEP. The strategies and learning plans for one student with Autism might not work for another, and the GPT would inherently ask our student services teachers for more personal information. With this consideration in mind, I worry that even amongst teachers in a school, AI use might not be equitable. I would be curious to speak with student services at the district level to see what their stance is on using AI in their roles.

What’s next? My district has very recently allowed us, as teachers, to use Bing’s Copilot to help with resource generation, lesson planning, and unit planning. Use directly with students is strictly prohibited at this time, and the feature is not available to students on their school accounts. With this new knowledge of highly specific prompting and of how AI is being used across many industries for a multitude of purposes, I want to play around with AI more in terms of using it to revamp my own curriculum. I want to make new lessons and units and see the AI’s perspective on things. I might be surprised by what it comes up with.

I also want to rework some of my digital literacy and digital citizenship lessons with AI in mind. At the start of the school year, I always start my K-5 classes on digital literacy and digital citizenship. I often do a funny, yet serious, lesson on digitally altered photos where I show the students pictures of mixed-up animals, people from different time periods together in one image, digitally restored photographs, and digitally altered photographs for media (i.e. supermodels and celebrities). We talk about why people alter images for a myriad of reasons, and come out of the discussion more critical of what we see online. Now, I feel like I need to start bringing AI-generated images into the conversation. Our students need to know how to recognize a reputable source of information, how to protect themselves online, and that making AI images of others without their consent is not ethical. I am usually a big fan of the MediaSmarts website for great Canadian content on digital literacy, and I think I will start there to see if they have any resources. My next step would be to connect with Carol Todd to see if she has any resources at the district level that might be coming down the pipeline.

Overall, I was really impressed with this workshop. Although I had heard a lot about AI from other workshops in the past, this one helped provide some clarity and real-life application for me. I am excited for my next professional development day, where I have signed up for courses on AI and utilizing it for Universal Design for Learning (UDL).