By Amanda Robbins, Educator, University of British Columbia
Lately, it feels like educators are stuck between a rock and a hard place. On one side, there’s the nagging worry that students will use AI to shortcut their way through an assignment; on the other, there’s the pressure to adopt every shiny new tool that hits the market.
In our recent BCcampus EdTech Sandbox session, we stopped talking about how to block AI and started looking at how to weave it into pedagogy. The goal? To move away from policing AI and toward scaffolding learning with AI.
Moving Past the Answer Machine
The biggest hurdle is the “answer machine” mindset: the idea that you ask a question and, poof, a finished essay appears. In the sandbox, we flipped that script. We looked at teacher-designed chatbots as digital scaffolds: tools that support the process of thinking and reflection rather than just the final product. We weren’t looking to automate the learning, but to amplify the parts of teaching that actually matter.
What the Research (Actually) Says
The data is starting to catch up with our gut feelings. Simply giving a student general-purpose AI doesn’t magically lead to learning. In fact, if a student uses AI passively (letting the bot do the heavy lifting), their cognitive engagement falls off a cliff. When tasks become too easy, we rob students of the productive struggle that makes deep learning stick (Kosmyna, 2025).
On the other hand, a 2026 Organisation for Economic Co-operation and Development report suggests that narrow bots (ones designed with a specific pedagogical purpose) actually work. Research indicates that when we shrink the AI’s world to a specific task, it stops acting like a shortcut and starts acting like a cognitive partner or a well-informed assistant.
The Anatomy of a Bot
We broke down what actually makes a teaching bot work, without coding.
- System prompts (the brain): This is where your teaching philosophy lives. You decide if the bot is a Socratic coach, a gentle encourager, or maybe a bit snarky to keep things interesting. You’re anchoring the bot to how you teach.
- Knowledge bases (the guardrails): Instead of letting the bot roam the whole internet, you feed it your specific rubrics, exemplars, and style guides. It keeps the bot focused and, crucially, stops it from hallucinating. (Most of the time.)
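To make those two pieces concrete, here is a minimal sketch in Python of how a system prompt and a small knowledge base combine into the message list a chat model receives. Everything here is illustrative: the prompt text, rubric excerpts, and function name are placeholders I invented, not the actual bots from the session.

```python
# Minimal sketch of a "narrow" teaching bot: a Socratic system prompt
# (the brain) plus a small, instructor-curated knowledge base (the
# guardrails). All prompt text and rubric excerpts are hypothetical.

SYSTEM_PROMPT = (
    "You are a Socratic writing coach. Never write the student's essay. "
    "Respond with questions that push the student to reflect, and ground "
    "every response in the rubric excerpts provided."
)

# Instead of letting the bot roam the whole internet, it only sees
# materials the instructor chose to feed it.
KNOWLEDGE_BASE = {
    "thesis": "A strong thesis makes one arguable claim in one sentence.",
    "evidence": "Each body paragraph cites at least one course reading.",
}

def build_messages(student_question: str) -> list[dict]:
    """Assemble the message list: teaching philosophy first,
    then the guardrails, then the student's actual question."""
    rubric = "\n".join(f"- {k}: {v}" for k, v in KNOWLEDGE_BASE.items())
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "system", "content": f"Rubric excerpts:\n{rubric}"},
        {"role": "user", "content": student_question},
    ]

messages = build_messages("Can you just write my intro paragraph?")
```

The same structure underpins the no-code platforms we explored: in a Custom GPT, the “Instructions” field plays the role of the system prompt, and uploaded files play the role of the knowledge base.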
Bots in the Wild
Participants in the session explored two bots I created for class use:
- The Creative Coach: a Dungeons & Dragons-inspired bot for Grade 11s to help them build out character backstories without doing the writing for them.
- CiteGPT: my own creation for the ETEC512 class at UBC, a specialized assistant that checks APA citations for accuracy.
We explored platforms like SchoolAI and Custom GPTs, but the conversation quickly turned to the fine print.
The Reality Check: It’s Not All Rainbows
We had some honest (and necessary) discussion about the downsides. One big takeaway? Guardrails are brittle. A tech-savvy student who tries hard enough can usually turn a coach back into a cheat code, especially with Custom GPTs. Some ready-made educational platforms, such as SchoolAI, handle this better, but they can be costly or skew juvenile given their K-12 focus. We also wrestled with the black box problem:
- The ethics of uploading course materials into a model we don’t own
- The paywall problem
- Constant shifts in privacy policies that make it hard to trust where your data is going
My Take: You’re Not Being Replaced
If the sandbox session proved anything, it’s that bots aren’t replacing us. If anything, they highlight why expert human judgment is more important than ever.
Take CiteGPT. It’s helpful, but it still makes mistakes — like a student who confidently cites The Onion as a peer-reviewed source. It also gets easily distracted by conversations about cats. But even with its quirks, it offloads the grunt work of citation formatting, freeing me up for the stuff only humans can do: mentorship, nuance, and real connection.
Webinar Resources and Transcript
If you missed the webinar, or want a quick refresher, you can access the webinar recordings and transcript here:
References
Chow, A. (2025, June 27). Is using ChatGPT to write your essay bad for your brain? New MIT study explained. TIME.
Kosmyna, N. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for an essay-writing task (Preprint). arXiv.
Organisation for Economic Co-operation and Development. (2026). OECD Digital Education Outlook 2026: Exploring effective uses of generative AI in education. OECD Publishing.