The School Homework Debate No One Wants to Have

Former OpenAI researcher says AI detection of homework is failing. Schools say we can’t give up: academic integrity matters. Both might be right.

Andrej Karpathy recently stood in front of a school board and delivered a message that made everyone uncomfortable. “You will never be able to detect the use of AI in homework. Full stop.” The former OpenAI researcher and Tesla AI director did not mince words. Every AI detection tool can be defeated. Schools should assume any work done at home used AI assistance. His recommendation: move all grading to in-class assessments where teachers can physically monitor students. The reaction was predictable. Some educators agreed. Others pushed back hard. And most just felt overwhelmed. But here is what makes this conversation worth having. Both sides have legitimate points. And pretending otherwise does not help anyone.

What Schools Are Actually Worried About

When educators resist Karpathy’s framing, they are not being stubborn. They are worried about students who submit AI-generated work without understanding any of it. Students who get A’s on essays they never read. Students who graduate with transcripts that mean nothing. A Pew Research Center survey found that 26% of teenagers admitted to using ChatGPT for schoolwork in 2024. The real number is almost certainly higher. Schools see students who cannot write a coherent paragraph without AI assistance. Students who panic when asked to solve problems without their phones. Students who have outsourced thinking to the point where they cannot think anymore. That is a real problem. And telling schools to just accept it feels like surrender.

What Karpathy Is Actually Saying

But Karpathy is not advocating for giving up on education. He is making a different argument. He is saying the detection approach is doomed. Not difficult. Doomed. As in, it will never work no matter how hard schools try. Google’s Gemini Nano can now look at an image of a chemistry exam and solve every problem. Students can get AI to produce homework solutions in their own handwriting. The technology is not going backwards. So if detection is impossible, what are schools supposed to do? Keep pretending it works? Keep punishing students based on unreliable tools? Karpathy’s point is that schools need to stop fighting reality and start designing for it.

The Calculator Analogy Everyone Keeps Using

Karpathy draws a comparison that resonates with a lot of people. Calculators. When calculators first appeared, teachers panicked. They said students would never learn real math. Schools banned them. Everyone predicted disaster. Eventually schools figured it out. Teach foundational arithmetic so students understand what the calculator is doing. Then let them use calculators for complex problems. Karpathy argues AI needs the same treatment. Teach the fundamentals. Then let students use AI as a tool.

But here is where the analogy cracks. Calculators are reliable. 2+2 always equals 4. AI hallucinates. It generates plausible-sounding nonsense. It confidently gives completely wrong answers that look perfect.

Students using calculators need basic math literacy. Students using AI need deep subject expertise to catch when the AI is confidently wrong about everything. That is not the same thing. The calculator comparison suggests AI is just another tool. But AI does not just speed up work. It changes the nature of work. It can make thinking optional in ways calculators never could.

Where Both Sides Miss the Mark

Schools investing in AI detection software are fighting a losing battle. The tools do not work reliably. They can be fooled. And they often flag innocent work while missing actual AI use. Worse, detection teaches students the wrong lesson. That learning is about not getting caught. That the goal is beating the system rather than understanding material. But Karpathy’s solution has problems too. Moving all assessment to in-class work assumes schools have resources they do not have. Smaller class sizes. More direct observation. That costs money. It also assumes in-class monitoring is foolproof. It is not. Smartwatches. Hidden phones. Bathroom breaks with quick AI consultations. The monitoring approach has limits too.

The Questions No One Is Asking

What if homework was always a flawed assessment tool and AI just exposed it? Traditional homework measured compliance as much as understanding. It rewarded students with time, quiet space, and parental support. It penalized students who worked jobs or cared for siblings. AI levels that playing field in some ways. But it creates new inequalities. Students who understand how to use AI effectively gain advantages. Students who do not will fall further behind. If we are redesigning assessment, maybe the question is not just how to prevent AI use. Maybe it is what we are actually trying to measure. Are we testing memorization? Understanding? Application? Critical thinking? Different goals need different approaches.

What Might Actually Work

The middle path is not AI detection. And it is not pure in-class monitoring either. It is redesigning education around the reality that AI exists and students will use it. That means teaching AI literacy. Not how to use ChatGPT. But how to evaluate AI outputs. How to catch when AI is wrong. How to use AI as a tool rather than a crutch. It means designing assessments AI cannot easily complete. Open-ended questions requiring judgment. Projects requiring iteration. Work demonstrating understanding rather than just correct answers. It means making homework optional for grades but valuable for learning. If students can pass exams without homework, maybe the exams test the wrong things. It means accepting that some students will shortcut learning. Students have always found ways to cheat. AI just makes it easier. But most students actually want to learn if learning is relevant and engaging.

The Real Tension

The world students are entering is saturated with AI. Jobs will require working with AI. Problems will be solved with AI assistance. If education pretends AI does not exist, students are unprepared for reality. But if education allows unlimited AI use without teaching foundational understanding, students are equally unprepared. They become dependent on tools they do not understand. The goal is not AI or no AI. The goal is students who can think with or without AI. Schools that ban AI entirely are preparing students for a past that no longer exists. Educators who embrace AI without teaching critical evaluation are preparing students for a future where they cannot think independently.

Where This Goes

Some schools will adapt thoughtfully. They will redesign assessment. They will teach AI literacy. They will prepare students for a world where AI is everywhere. Other schools will keep fighting. They will ban AI. They will invest in detection. They will punish students. Both approaches will produce results. But different results. The schools that adapt will graduate students who can work effectively with AI. Who understand its capabilities and limits. Who can think critically about AI outputs. The schools that resist will graduate students who… also use AI. They just learned to hide it better. They learned school is about gaming the system.

The Question That Matters

Karpathy founded Eureka Labs, an AI-native education company. He is not anti-education. But his message forces an uncomfortable question. If the only way to verify learning is constant monitoring, are we testing understanding or compliance? If students can get perfect grades using AI but fail without it, what did school actually teach them? Schools are right to worry about academic integrity. Karpathy is right that AI detection is doomed. The answer is not choosing one side.

The answer is accepting that education needs to fundamentally change. Not because AI is replacing human intelligence. But because AI is exposing what was always broken about how we measure learning. The homework debate is not about AI. It is about what education is actually for. Are we preparing students to pass tests in controlled environments? Or are we preparing them to think, learn, and solve problems in the messy real world where AI exists? Because those require different approaches.

And pretending we can have both without changing anything is the real fantasy.

Read the full article here: https://ai.plainenglish.io/the-school-homework-debate-no-one-wants-to-have-0b5bf9a1471c