by Gwen Nguyen, Advisor, Learning + Teaching
“You ask me what I mean
by saying I have lost my tongue.
I ask you, what would you do
if you had two tongues in your mouth,
and lost the first one, the mother tongue,
and could not really know the other,
the foreign tongue.”
— Sujata Bhatt
Have you ever found yourself searching for your voice or feeling that you were muted, intentionally or unintentionally, by the situation you were in? Bhatt’s words resonate with me — not just as someone who has often navigated between different languages and cultures, but as someone who also spent years in classrooms where my own voice felt lost.
I grew up in Vietnam within an educational system that emphasized traditional teaching approaches, content mastery, and seniority, and there was little room for student voice — our aspirations, our life stories, our concerns, and our agency. Throughout all my K-12 years and into college, I cannot recall a single moment when we were asked what we thought of a course, let alone invited to shape it. It wasn’t until my master’s program in applied linguistics in the United States that I first encountered course evaluations, open-ended feedback, and classrooms built on discussion and peer dialogue. At first, it was not easy — I didn’t know what to do. Was it okay to comment on my professors? I worried about saying the wrong thing in a language that still felt like Bhatt’s foreign tongue. But it was powerful, and I felt happy and connected every time my voice was heard. It is our nature that we speak to hear and we hear to speak.
Those experiences shape how I approach my work and my commitment to cultivate spaces where learners can bring their stories, voice their aspirations, and practice agency throughout our learning and teaching journeys. As AI becomes entangled with how we learn, teach, read, write, research, and demonstrate our understanding, I find myself returning to Bhatt’s question: what happens when we think we have lost our tongue(s)? Is it possible to help our students become confident thinkers and decision-makers amid pervasive AI storms? How do we do that? Or are we risking classrooms where students lose their voice and agency to algorithms that speak to them, speak for them, think for them, and decide for them?
In this piece, I share how I understand student voice and choice, how they can be muted in AI-enhanced learning environments, and what I call “loving out loud” — practical approaches for centering student agency as we teach with AI.
1. What Do We Mean by Voice, Choice, and Agency?
Student voice: more than just opinions
In education literature, student voice is not simply letting students speak. It refers to practices that position students as partners whose perspectives shape curriculum, pedagogy, and institutional decisions (Toshalis & Nakkula, 2012). In higher education, student voice is often understood along a spectrum of activities from basic expression (students join focus groups or complete feedback surveys) through consultation and participation (students collaborate, co-design activities, or co-lead elements of curriculum) to activism and leadership (students as decision-makers and leaders driving change).
Consider: do students have opportunities to decide on any changes to their learning in your course or program? Toshalis and Nakkula (2012) observed that most activities at schools remain at expression, consultation, and participation. They also suggested that when student voice moves toward authentic partnership — where students can influence assessment practices, curriculum focus, and classroom culture — it tends to support engagement, belonging, and ownership of learning.
Student choice: autonomy that actually matters
Student choice means structured opportunities for students to select among meaningful options in their learning, including instructional approaches shaped by their interests and passions (Bray & McClaskey, 2015). In course and assessment design, choice often appears as reading options, assessment menus, format alternatives, flexible deadlines and weightings, or pathway options. When students can choose what they learn and connect it to what really matters to them, they are more likely to experience the deep, absorbed engagement often described as flow.
However, choice does not mean anything goes. I would not consider it choice pedagogy if students are offered a long list of course readings and told to pick any three. Simply giving students a massive menu does not directly support learning and can even disadvantage some students who thrive with more structure. The most beneficial forms of choice are bounded and purposeful: options that align with course or program outcomes while allowing students to pursue their interests, identities, and goals, supported by educators’ intentional guidance and scaffolds so that learners can become more confident and prepared to make informed decisions about their learning (Goel, 2025).
Student agency
Another key word that often appears in the literature when discussing student voice and choice is student agency, which refers to students’ capacity to intentionally influence their learning through goal-directed action, self-reflection, and strategic decision-making (Bandura, 2006). Voice and choice are key pathways through which a broader construct — learner agency — is enacted across the following six dimensions (Suárez et al., 2018).
- Control over goals – participating in defining and negotiating learning objectives
- Control over content – selecting and critically working with materials and sources
- Control over actions – choosing learning activities, pacing, and intensity
- Control over strategies – choosing learning and metacognitive approaches
- Opportunities for reflection – access to spaces/tools for metacognition
- Opportunities for monitoring – ability to track and act on their own data
Take a moment to consider, in your context, whether you support learners in enacting their agency throughout their study process. I believe we are all doing some parts of this. As Garry (2018) noted, “providing students a place to voice their needs and interests and a place for choice in the process starts with teachers” (p. 5). But where is the gap? Where is the part that might need more attention?
Why do voice and choice matter?
As Bray and McClaskey (2015) observed, when learners can express their voice and have opportunities for choice, this changes how they engage with the content, the instructor, and each other. Student voice and choice are also foundational for fostering higher engagement, a stronger sense of belonging, and greater academic persistence and success.
Little reflection corner: in your course, where do students currently have voice and choice? Why do you offer spaces for students to have a voice and choice in their learning process?
2. How Student Voice and Agency Get Muted in AI-Enhanced Learning
Now let us turn to AI-enhanced learning environments. Have you heard or seen recent posts or voices from others in your academic community saying:
- “Students are not learning at all.”
- “Students just use AI to cheat through their degrees these days.”
- “I thought I made the assignment so personal and reflective, encouraging them to think critically, and I ended up receiving 20 answers that sound exactly the same.”
- “Students use AI to cheat and teachers use AI to grade.”
- “Students are losing all their cognitive abilities by totally relying on AI.”
While AI promises considerable potential for enhancing student voice and choice in learning — for example, by offering personalized support for educators designing adaptive materials and by providing on-demand tutoring support for students — research also shows that poorly designed systems can create automation dependency, reduce critical thinking, and flatten diverse learning experiences (Suarez, 2025). Suarez further argues that specific characteristics of AI systems “position AI as an active participant in educational decision-making, requiring careful consideration of how human agency is preserved within human-AI collaborative learning environments.” AI’s proactive recommendations (telling us what to expect or do next), opacity (not understanding how the system arrives at answers), and adaptive behaviour (generating responses based on iterative data input or training datasets) are examples of AI characteristics that impact educational decision-making.
Here’s how this erosion of agency unfolds across the six dimensions of student agency:
- Goals
If students are not taught to question AI’s suggestions, even a well-intentioned activity such as asking them to set or reflect on their goals for a course or program can be subtly redirected. Once they begin working with AI’s recommendations, they may allow the system to set or shape their learning goals rather than thinking critically about what they truly want to learn and why.
- Content
Imagine you encourage students to use GenAI to brainstorm or to conduct a quick literature scan on Vygotsky’s zone of proximal development in higher education. Automated recommendation systems may surface only the most popular or most aligned content, while obscuring alternative, critical, or marginalized perspectives. Over time, this narrows the intellectual landscape students encounter and can limit their exposure to diverse scholarly voices.
- Learning actions
When AI performs tasks that previously required human decision-making and planning, students lose valuable opportunities to practise these skills. This is at the heart of many educators’ concerns. For instance, think about a midterm assignment that requires a five- to nine-page business report. You might try to “cheat-proof” it by asking students to build on an in-class discussion. Yet if students are juggling two part-time jobs and multiple other midterms, and if a free version of any GenAI platform can generate a passable nine-page take-home assignment that earns a B+, many will simply “borrow” GenAI’s voice instead of engaging in the deeper cognitive work the task was meant to support.
- Strategies and reflection
Consider an instructor who adopts AI early and integrates it into a statistics course to provide feedback on mini-unit quizzes. On the surface, this looks ideal: students have many opportunities to practise, and AI gives them immediate, highly accurate feedback before their final departmental exam. However, if AI primarily delivers correct answers without explaining the underlying reasoning or multiple solution paths, students’ opportunities to reflect on how to solve a problem and why a particular approach works are diminished.
- Monitoring
Many AI-enhanced educational technologies now offer learning analytics dashboards to help administrative staff and educators better understand students’ progress, with the hope that such tools will also foster student self-monitoring. Yet not all dashboards are designed with students as primary users. When learning data are difficult to interpret or presented in ways that are not meaningful to students, learners cannot easily use that information to guide or adjust their own learning. In such cases, AI-driven monitoring strengthens institutional oversight more than student self-regulation, further shifting agency away from learners.
But students do have a voice… what do they say?
Students themselves voice this tension. Recent surveys (Attewell, 2025) show they recognize AI skills as essential for their futures and want meaningful integration. Yet they’re also anxious about the pace of change, unclear expectations, and whether AI might undermine the very learning they came to college to pursue. What they’re asking for is clarity, guidance, and, most importantly, a seat at the table in deciding how AI shapes their education.
Students aren’t asking us to remove AI. They’re asking us to love them out loud — to demonstrate through our policies and practices that their learning, integrity, and futures matter more than technological efficiency. That work starts with their aspirations, their decisions, and their agency.
3. Centering Student Agency in AI-Enhanced Teaching: Loving Out Loud
I would like to explain how the phrase “loving out loud” came into my thinking. Think about why you came to teaching in the first place. Why this program? This course? If your role were merely to deliver required material, test students on it, and collect a paycheque, you would not be reading this now. You are here because you care about your students and their learning and because you have done work to support them in becoming drivers of their own learning. But, collectively, have we done enough?
The goal of education is to help learners become expert learners: individuals who want to learn, who know how to learn, and who, in their own individual and flexible ways, are prepared for a lifetime of learning. As Bowen and Watson (2024) powerfully stated, it is the job of educators to help students become better thinkers. Our new job is to help them become even better thinkers with AI.
So, how do we do this? How do we love our students out loud in an AI-enhanced learning environment? We begin by designing learning spaces that show, in concrete and consistent ways, that student voice and choice truly matter.
Cultivate learning spaces that demonstrate student voice matters
Acknowledge your stance on AI in the course or program. Be transparent about your own position. Why are you allowing AI? Where are the boundaries? What values guide those decisions? When students see our reasoning, they also learn to develop their own. For example, in a writing course focused on reflection, explain that the writing process is like going to the gym or writing wedding vows — it is a way of expressing commitment. If we do not do the work, we do not develop needed skills or genuine connection.
Invite students to co-draft AI use guidelines for the course. Rather than handing down policies, create space for dialogue. What do students see as fair? What concerns do they have? What opportunities excite them? Co-creation signals that their perspectives shape the learning environment. Given that we do not yet know whether AI will prove supplementary or competitive, I believe it is fair to invite students to revisit their values and goals for the course and, with that understanding, set their own boundaries with the tool and make healthy decisions about adoption.
You can explore examples of activities and resources for this kind of work in the BCcampus GenAI in Teaching and Learning Toolkit for Educators.
Position students as AI literacy experts. Have students prepare AI literacy lessons for their peers on topics such as misinformation, deepfakes on social media, bias in AI content, AI and environmental issues, or GenAI tools for collaboration. This helps students become knowledge-creators with shared responsibilities in the learning journey and supports their informed decision-making.
Design and offer different assessment pathways. Whenever possible, provide options for learning with or without AI so that students can contribute their thinking, values, and lived experiences — the things AI cannot do or might do poorly. Offer meaningful choices that align with their goals and values (Liu, 2024). EDUCAUSE’s AI Ethical Guidelines (2025) highlight the principle of respect for autonomy, calling on institutions to ensure that learners can “make independent choices regarding their engagement with AI systems without undue influence or coercion.”
The good news is that you can now use GenAI as a design partner when creating assignment materials and pathways, modelling these choices explicitly for learners. This becomes even better when we apply the “I Care, I Can, I Matter” framework (Bowen & Watson, 2024, pp. 185–188), which centers student agency in AI-supported activities.
- “I care” (Purpose)
How are we ensuring that AI-supported assignments are grounded in questions, contexts, and communities that matter to students? Can they see why their voice and perspective are needed here and now? How will the skills they are building travel with them beyond the course?
- “I can” (Task clarity)
In an AI-rich environment that can easily feel overwhelming, students need transparent expectations about what they are being asked to do, how they may and may not use AI, and why those boundaries exist. This level of clarity does not reduce student voice — it creates a safer space for them to take intellectual risks, make choices, and experiment with tools.
- “I matter” (Criteria for success)
When students co-construct criteria, have choice in the forms their work can take, and can see how their contributions inform peers, communities, or future practice, assessment becomes a site of agency rather than compliance.
Building assessment around these questions — Why should I care? How can I do this? How do I know my work matters? — keeps human effort, student voice, and student choice at the center, and resists the pull to let technology define what counts as meaningful learning.
Offer space for self-reflection and co-reflection. Build in regular opportunities for students to think metacognitively about their learning process, their use of AI, and how they are developing as thinkers. For example:
- Before using AI for an assignment, ask students to set a purpose: my goal in using [GenAI tool] for this task is to ____.
- Revisit this goal at the end of the task: did it help, or did it just distract?
- Prompt reflection: identify one perspective that seems absent when you work with the tool.
- Ask: how did working with AI in this task influence your process, confidence, and learning both positively and negatively?
All of this takes time and effort. But as students express that they want their voices to be heard and to shape the future of education, we are called to create that space by inviting them into the conversation. As Buolamwini (2023) reminds us, as long as we have a face, we have a voice and mission in this journey. We are responsible for the collective decisions we make together with AI. I believe it continues to be our role to communicate with students, model ethical use, and invite them to join us in deciding how AI should be used in their education.
Conclusion: Still Rising
I began with Sujata Bhatt’s question about what happens when we lose our tongue. In AI-enhanced education, we face a similar question: what happens when students lose their voice to algorithms designed to speak for them?
As we all know, our aspirations are strong and resilient. As Bhatt writes, “but overnight while I dream… it grows back, a stump of a shoot … Every time I think I’ve forgotten … it blossoms out of my mouth.” We just have to find them and make space for them.
We can love our students out loud, through our pedagogies and our daily practices, in ways that show them their thinking matters, their perspectives are needed, and their futures are worth fighting for.
I’ll end with some beautiful words from Maya Angelou that capture the resilience I see in our students, and the hope I hold for what we can build together.
“Just like moons and like suns,
With the certainty of tides,
Just like hopes springing high,
Still I’ll rise.”
References
Angelou, M. (1978). Still I rise. In And still I rise: A book of poems. Random House.
Attewell, S. (2025, May 22). Student perceptions of AI 2025. Jisc.
Bandura, A. (2006). Toward a psychology of human agency. Perspectives on Psychological Science, 1(2), 164–180.
Bandura, A. (2020). Social cognitive theory: An agentic perspective. Psychology: The Journal of the Hellenic Psychological Society, 12(3), 313–333.
Bhatt, S. (1988). Search for my tongue. In Brunizem. Carcanet Press.
Bowen, J. A., & Watson, C. E. (2024). Teaching with AI: A practical guide to a new era of human learning. Johns Hopkins University Press.
Buolamwini, J. (2023). Unmasking AI: My mission to protect what is human in a world of machines. Random House.
Bray, B., & McClaskey, K. (2015). Make learning personal: The what, who, WOW, where, and why. Corwin.
Daoayan Biaddang, L. M., & Caroy, A. A. (2024). Student voice and choice in the virtual classroom: Engagement strategies. Discover Education, 3, 162.
Garry, A. (2018). Student voice and choice. Education Week, 37(25), 5.
Georgieva, M., et al. (2025). AI ethical guidelines. EDUCAUSE.
Khedr, N. (n.d.). Voice and choice in student learning. Teacher Education Resource Hub – Scarfe Digital Sandbox.
Liu, D. (2024). Menus, not traffic lights: A different way to think about AI and assessments. Teaching@Sydney.
Suarez, A. (2025). AI and learner agency: A framework for preserving student autonomy in educational technology. Rural, Digital y Descentralizado (RDD).
Suárez, Á., Specht, M., Prinsen, F., Kalz, M., & Ternier, S. (2018). A review of the types of mobile activities in mobile inquiry-based learning. Computers & Education, 118, 38–55.
Toshalis, E., & Nakkula, M. J. (2012). Motivation, engagement, and student voice. Jobs for the Future.