
Reader interactive · panel responses

How three educators are actually learning AI

Four questions from a panel on AI in education. Three working answers from a faculty developer, an instructional designer, and an 8th grade math teacher. Pick whose framing resonates with you on each one.

Akesha Horton, PhD
Director of Academic Engagement and Learning, Indiana University Bloomington
Yousuf Marvi
Math Teacher and ELD Coach, Sierra Vista Middle School (Irvine USD)
Anne Fensie, PhD
Director, Center for Teaching and Learning, University of Maine at Presque Isle
Two ways to read. Same four questions to all three panelists. Pick the framing on each question that lines up with how you think about your own work — and click any individual sentence to highlight the lines that stick. At the end, you'll see your pattern alongside every line you marked.
Pick a framing per question · Click any sentence to highlight it

Question 01

What is your why for AI in your work?

The opening question. The moderator noticed that none of the three led with personal efficiency, which is the answer she usually hears. Each landed somewhere different.

Akesha
Faculty designer in the seat

"AI puts educators in the design seat, instead of waiting for someone else's tool to fit our context."

Working with Punya Mishra taught her that there is no such thing as educational technology. Every tool was designed to make a profit. PowerPoint, Canvas, all of it. AI lowers the cost barrier so educators can build the tools their classrooms actually need, instead of bending borrowed ones.

Yousuf
Every child, every day

"My students are my why. When AI came about, I saw multimodality."

As a math teacher of English language learners, he could not always tell what his students were doing in transformations or dilations through text alone. AI changed what formative assessment can look like. The orientation is to every single student, not to his own workload.

Anne
Help, not hinder, learning

"My why is to figure out the potential here. But I also want to know where the dangers are."

When ChatGPT launched, she dug in instead of banning it. She wrote a chapter on using AI to help, not hinder, learning. The driving question is one she came to from teaching: where do students need productive struggle, and what can they offload?

Question 02

How do you preserve friction when AI is designed to remove it?

A follow-up the moderator built in real time. AI fluency is complex. So is the question of what students need to keep struggling with even when the tool would happily struggle for them.

Akesha
Failure as the teacher

"When AI fails, students realize they need a human in the loop. That's where the friction lives."

She does not try to put friction back into AI. She watches for the moments AI breaks down and uses those as the teaching surface. The job, when working with faculty, is to make sure the design does not erase the struggle students need to grow.

Yousuf
Effort over output

"If your effort over output ratio is close to one, learning is happening."

AI is designed not to have friction. The educator's job is to bring it back. He uses a working ratio: effort divided by output. Close to one means students are doing the cognitive work. Far off in either direction means there is a problem on one side of the tool or the other.

Anne
Memory as residue

"Memory is the residue of thought. We don't absorb. We think, and that's what we remember."

She borrows Daniel Willingham's line as the test. The question is not whether AI is in the room. The question is whether students are still doing the thinking. If AI is helping them think more deeply about content, fine. If it is replacing the thinking, the residue is gone.

Question 03

How are you actually learning to use AI?

This was the practical turn in the panel. Habits, routines, what actually moves the needle. None of the three landed on traditional professional development as the answer.

Akesha
Project-based, peer-based

"I've made about 30 apps in two months. The learning is in asking other people."

She has taken courses and certificates, and found most weren't built for educators. The shift was to project-based learning. Build something real. Then ask other people which tool, what it costs, what you give up in privacy. The peer network is the curriculum.

Yousuf
Structured PLC plus near-peer

"Paid, opt-in, with a teacher on special assignment as facilitator. Both the structure and the near-peer matter."

Irvine Unified runs an AI Pioneers PLC. Five times a year. Structured peer collaboration plus a thought partner who is one step ahead. He credits both. The structure makes the time exist. The near-peer makes the learning specific.

Anne
Self-set challenges

"I didn't sign up for classes. I set myself challenges and tried to break things, safely."

She skipped the formal courses on the theory that there are no settled best practices to teach yet. Her version of learning is annotating chatbot conversations, writing chapters, breaking platforms in low-stakes places, and leaning hard on cross-institution peer networks.

Question 04

How has your thinking on opportunities and risks evolved?

The hardest question, in some ways. Not what you think now, but what you used to think and don't anymore. Each panelist named a specific shift.

Akesha
Sociotechnical lens

"When someone benefits from a technology, someone else is usually being disenfranchised by the same one."

Working alongside an informatics department changed her frame. Reading Safiya Noble and Ruha Benjamin made the questions about each tool sharper. Tokens, privacy layers, environmental cost, who pays. Her stance became: never jump in. Understand what you want to do first.

Yousuf
Not every teacher needs AI

"It took me three years to come to this point: not every teacher needs to use AI."

His evolution was past evangelism. Most opposition to AI in schools is opposition to ChatGPT specifically. The question for educators is whether they have a problem of practice that AI is part of the answer to. If they don't, the answer is not to push the tool. It is to ask a better question.

Anne
Always start with the lowest tier

"I will never use AI to create video. The energy use is exponential as the video gets longer."

She found what-uses-more.com, which compares the energy use of AI tasks against familiar baselines. The shift was technical and ethical at once. Now she always starts with the simplest, lowest-tier tool and only goes deeper if she needs to. The tools are not going anywhere. The question is how to use them responsibly.

Take it further

Continue in the Teaching, Learning & AI Workbook

This page is a fast read — four questions, three working answers, your picks. The workbook is the long version of the same conversation: an hour of guided writing through six tensions, three pathways, and a one-sentence working stance you sign and revisit. Same panelists. Same questions. More room to write.

Open the Workbook →
Your picks and highlights here stay on this page. The workbook keeps its own private copy of your writing.

More like this, in your inbox

Spilled and Studied is a Substack on instructional design, AI in education, and what's actually changing in how people learn.

Subscribe at akesha.substack.com
Quotes lightly edited from a panel discussion on educator learning about AI.
For the curious. This page records your picks and highlights in your browser. To aggregate votes across readers, replace the placeholder endpoint at the top of the script (VOTE_ENDPOINT) with a real one (Formspree, Tally, a function on your domain, etc.). The payload shape is documented inline.
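If you'd rather self-host than use Formspree or Tally, the endpoint only needs to accept the page's JSON POSTs and tally them. A minimal sketch of the server-side handling, in Node.js: the payload fields here (`questionId`, `panelist`) are illustrative placeholders, not the page's documented shape — check the inline payload documentation in the script and adjust the validation to match.

```javascript
// Sketch of a vote-tally handler for a self-hosted VOTE_ENDPOINT.
// NOTE: the field names below are assumptions for illustration;
// the real payload shape is documented inline in the page's script.
function recordVote(store, rawBody) {
  const vote = JSON.parse(rawBody); // throws on invalid JSON → reply 400
  if (typeof vote.questionId !== "string" || typeof vote.panelist !== "string") {
    throw new Error("malformed vote");
  }
  // Tally one counter per (question, panelist) pair.
  const key = `${vote.questionId}:${vote.panelist}`;
  store[key] = (store[key] || 0) + 1;
  return store[key];
}
```

Wrap this in whatever request handler your host provides (an `http.createServer` callback, a serverless function, etc.), swap the in-memory `store` for a file or database, and return a 204 on success so the page's fetch resolves quietly.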