
Entry 7: Ethical Reflections on Surveillance and Autonomy

Date: June 2, 2025

Quotation: "Custom feedback and goal-setting to promote skills mastery" (Khan Academy, as presented by Auston & Basma, 2025).
Reference:
Auston, A., & Basma, B. (2025). AI in the classroom: Khan Academy tools [Class presentation, Module 3 discussion].
Ben-Porath, S., & Ben Shahar, T. H. (2017). Introduction: Big data and education: Ethical and moral challenges. Theory and Research in Education, 15(3), 243–248. https://doi.org/10.1177/1477878517737201

Why I Included This:
This reading and class presentation impacted me profoundly by exposing both the risks and the potential of relying on AI tools in education. It connects closely with Entry 3’s concerns about adaptive systems and pushes the idea further. Are we reducing students to patterns? Or are we using data to personalize support?

Auston and Basma’s discussion of Khan Academy’s use of custom feedback illuminated how AI might offer valuable insight for promoting learner autonomy. Yet, as Ben-Porath and Ben Shahar (2017) warn, “data-driven decisions in education may prioritize efficiency over empathy” (p. 244). This tension stood out to me.

In my LINC class, we use Avenue to collect learner artifacts. While it is useful, I’ve begun to question: Where is this data stored? Who can access it? Are students aware of how their learning is documented? This reflection encouraged me to bring these issues into the classroom, inviting students to discuss their digital footprints and choose what they feel comfortable sharing.

This entry also builds on Entry 6: tech data might track clicks, but it can’t capture emotional nuance. As a result, I strive to merge tech feedback with human interpretation, ensuring students feel seen, not scored.

This entry shows a shift in me (from data user to data questioner). I’m beginning to see that technology in education is not neutral: it reflects values, and I want mine to prioritize trust, dignity, and dialogue.