English teaching, AI and the thermostatic principle
A one-off issue between Fortnightlies 172 and 173: the latter will be along this time next week, followed by an Occasional for paid subscribers in two weeks.
Nobody knows anything, as the screenwriter William Goldman famously said about Hollywood and films:
Not one person in the entire motion picture field knows for a certainty what’s going to work. Every time out it’s a guess—and, if you’re lucky, an educated one.
Alan Jacobs refers to ‘the seductions of prediction’, and these seductions have been very much on show since the release of ChatGPT to the general public in November 2022, followed by many new iterations from several tech companies. There are a lot of people who have made completely confident pronouncements about what AI will change in the world of education. But widely accessible Generative AI is not even two years old! It is guaranteed that most of those pronouncements will turn out to be wrong.
There is also an inbuilt bias in the tech world towards positivity, certainty and boosterism, partly for commercial reasons, and a blindness to unintended consequences (I’m looking at you, social media). The tech-bros are quivering with excitement right now, as are their bank managers (maybe misguidedly).
Others keep a properly sceptical eye on the pushing of tech services such as Khanmigo and Magic School to teachers, let alone children.

This post is a scattering of thoughts and links on the teaching and learning of English as a secondary-school subject in the infant years of Generative AI, and I’m fully prepared for many of them to turn out to be off-target in the future.
The question: in English, just what problem is AI solving? Could it actually generate problems instead?
An analogy: is using AI in English classrooms the equivalent of being a sports coach who shows the team videos on how to get fit, rather than actually getting them to train?
Thomas Newkirk wrote one of the best books I have read about teaching, The Art of Slow Reading (2012). In it, he quotes Neil Postman in Teaching as a Conserving Activity:
Schools, Postman argues, should act on a thermostatic principle: a thermostat acts to cool when a room is too hot, to heat when too cool. Schools should act to check (and not imitate) some tendencies in the wider information environment: ‘The major role of education in the years immediately ahead is to help conserve that which is necessary to a humane survival and threatened by a furious and exhausting culture.’
Postman wrote those words in 1979.
It’s time to turn down the thermostat.
AI and English
So, some jottings, and some useful recent references and links to helpful current thinkers in a kind of commonplace post: a collection of interesting things.
I write as a technophile who enjoys using new tools (soft and hard alike). But tech boosters always over-claim their worth in education. As the author of Teachers vs Tech shows in her book, there is a long history of blustering overpromise and subsequent disappointment.
The pattern has been reasserting itself since the arrival of Generative AI. Among the usual suspects: that personalised education will change everything (a cousin of the discredited idea of learning styles), and that we can easily grab knowledge and understanding from the online world (‘you can just Google it nowadays’). The tech world regularly conflates information and knowledge, unable to see the profound difference between the two.
Recommended reading: More Than Words: How to Think About Writing in the Age of AI (early 2025), which looks like being an essential book, along with the author’s collected writings on this subject.
A crucial point:
Writing is thinking. Writing is simultaneously the expression and the exploration of an idea. As we write, we are trying to capture an idea on the page, but in the act of that attempted capture, it’s likely (and even desirable) that the idea will change. The leapfrogging of AI is going to miss that.
And another:
The fact that writing can be hard is one of the things that makes it meaningful. Removing this difficulty removes that meaning.
This talk by Dan Meyer in the lions’ den at a tech conference on ‘The Difference between Great AI and Great Teaching’ tears apart the tech industry’s perennial presumption: that its tools can ‘transform’ education in a positive way. He shows how little understanding it has of the fundamental principles of teaching and learning. Worth 18 minutes of your time (the YouTube video is also embedded at the bottom of this post).
What we need when we teach writing, and encourage reading, is friction: writing should be effortful and challenging. But AI’s impulse is to remove friction. For pupils - all people - the idea of bypassing productive struggle is another seduction. Vocabulary is an example: it is built up over time as reading becomes more sophisticated, and its effectiveness depends on a slow development in each individual.
Technology tends towards speed. Good writing comes out of slowness over time, and drafting, and messiness. Then its foundations are solid.
Since well before the invention of the internet, English teachers have been aware of the problem of CNS, Coles-Notes-Syndrome. ‘Reading’ a book via summaries is now old hat compared to what AI can do. We need to show our pupils how disastrous to their understanding this will be.
It is also a fundamental misunderstanding of a literary text to think it can be ‘summarised’. This is parallel to the misguided notion that you can ‘always Google’ something. The value of the text is in the parts filtered out: the sentence-level texture, the rhythm of the narrative, the incidental pleasures.
From No One is Talking About AI's Impact on Reading:

When we teach reading as a skill, we’re asking students to practise more than analytical thinking through close examination of a text—we’re inviting them into a conversation with the reader about their ideas. We want maturing readers to engage with the text, not a summary of it, because we know that doing so means those ideas can help shape and mould a student’s thinking, challenging their assumptions, and making them argue within themselves over deeply held beliefs they may have about our world. A bespoke generative summary just doesn’t do that. If students adopt it as a replacement for close reading, it may rob them of the opportunity to fully engage with an author’s ideas.
and
Unchecked, the lure of frictionless reading could produce a profoundly atrophied culture of surface-level ideas rather than exploring them in depth. In such a world, I shudder to think how blind obedience to authoritative-seeming AI outputs could allow misinformation and weaponized narratives to proliferate unabated.
One of the most consistently interesting, well-balanced and precise thinkers about AI in education writes a newsletter including a ‘Beyond ChatGPT’ series, which is well worth seeking out.

Maryanne Wolf’s important book Reader, Come Home: the reading brain in a digital world (2018) explores the centrality of what she calls ‘cognitive patience’, precisely what Generative AI leapfrogs.
The digital chain that leads from the proliferation of information to the gruel-thin, eye-byte servings consumed daily by many of us will need more than societal vigilance, lest the quality of our attention and memory, the perception of beauty and recognition of truth, and the complex decision-making capacities based on all of these atrophy along the way.
Here’s my essay on the book: better to read the book itself, of course.
Tony Wan in EdSurge:
For many people writing is the most brutal exercise in thinking. It reflects and tests our assumptions, pushing us to refine our ideas and uncover new ones. It leads us down rabbit holes that we have to crawl back from. It requires us to connect the dots and think about what makes sense or doesn’t, to transition between ideas and evidence, and to consider what makes the cut and what doesn’t. When AI is used as a shortcut, we lose some of these muscles, as painful as they are to build. For developing young writers, this can be a major setback.
AI for ‘brainstorming’ ideas for an extended piece of writing: the very worst time for pupils to use AI is early on, for gathering ideas or producing an early draft, and that is just the time they are most likely to use it. It is not an effective tool for novices, who are the most likely people to be seduced by it: they don’t know enough to know what they don’t know (not to mention all the ‘hallucinations’) - Dunning-Kruger in action. Experts in a subject, by contrast, can rapidly assess what AI produces. When I tried to find ‘new’ questions to ask about King Lear, AI produced some fairly standard ones that didn’t help a great deal - but I knew that instantly, from a deep knowledge of the play.
It is explained better in First Drafts in the AI Era:
I don’t like the idea of students going to AI and prompting a first draft. I know some have argued that this could be a helpful method to fight the blank-page anxiety most writers feel. Others view this as helping maturing writers by giving them a template or outline to help them organize and scaffold their ideas. I think there may be some value in those approaches, especially in terms of helping struggling students who might otherwise balk at writing, but all of these approaches assume a maturing writer will then use their budding rhetorical knowledge, content knowledge, and contextual knowledge to complete the draft. Those of us who’ve taught first-year writing likely raised a questioning eyebrow at that idea.
It’s entirely possible that Generative AI writing will get worse rather than better, as it eats its own tail, feeding on its own flabby flesh.
Richard Hughes Gibson in The Hedgehog Review:
When writing meets no impediments, we can easily become links in a chain through which misinformation spreads. Yet my appeal for friction writing goes to something even more basic: When you encounter (and pay heed to) resistance in your writing, you have the chance to change not only your words but also your mind - and even to consider whether you need to be writing something at all, or at least at this moment.
Jane Rosenzweig in Writing Hacks also says it well:
We need to be able to recognize when removing the friction from the process might mean losing something important. We may love having written for different reasons, but the friction contributes to that feeling of satisfaction. I love having written when I am able to give structure to my thoughts or discover something in the process of writing that is satisfying or even profound—when I find an answer or solve a problem or arrange words in a way that makes me see something more clearly.
From Warning Bell:
It’s not much of a take to say that writing well is difficult. So is thinking critically. But, crucially, both endeavors help me better understand topics when I have to explain them for others. In other words, doing things on my own is the helpful part.
He quotes high school teacher Liz Schulman writing in the Boston Globe:
ChatGPT eliminates the symbiotic relationship between thinking and writing and takes away the developmental stage when students learn to be that most coveted of qualities: original.
Isn’t originality the key to innovation, and isn’t innovation the engine for the 21st century economy, the world our students are preparing to enter?
You can sense the aliveness in the classroom when students use their imagination and generate their own ideas. Their eyes become warm. They’re not afraid to make mistakes, to shape and reshape their ideas. The energy shifts. I’ve seen it happen in their discussions and with stages of the writing process, from brainstorming to drafts to silly stories to final essays. They’re more invested in arguing their points because they’ve thought of them themselves.
A useful summarising post on ‘friction’ by Leon Furze:
We need to find ways to convince students and early writers that the struggle – the friction – is worthwhile. With technologies that increasingly remove all of the barriers from first draft to final edit, we need to re-evaluate how and why we teach and assess writing.
Conor Murphy, English teacher in Ireland:
The English classroom is about sharing the writings of other human beings, of sharing thoughts and experiences from our tangible, and intangible, realities. There's enough writing out there for me to source poems, short stories, articles, novels, comics, films, plays etc etc without having to ask AI to help out.
Writing and meaning are intertwined. How we write reflects how we think. I want my students to see that, to see that this person used this technique for this reason (or reasons, even if that reason is to be opaque). I want my students to then be inspired to create, cultivate, their own voices, their own way of expressing themselves.
More from Conor, as Ireland reassesses its Leaving Certificate courses: he’s referring to the danger of allowing AI to be used in uncontrolled conditions.
The idea that you would create a new course without considering the impact of AI is to disregard education itself. Only a fool, or someone with no interest in our society as a whole, would do such a thing. I presume my own subject, English, will steer clear of that thinking. If we get a new course in English that ignores AI we will be a country that no longer values critical thinking, creative writing, or nurturing our society’s intellectual capabilities. I can't wait to see how they solve that problem.
Still in Ireland: the current Senior Cycle reform process involves reshaping courses so that 40% of the Leaving Certificate terminal grades will be allocated to ‘Additional Assessment Components’. In Autumn 2024 the distinguished academic Professor Áine Hyland wrote about these in her article ‘Assessment for Equity and Excellence?’ in Leader magazine, and stated that these AACs have caused a lot of concern among students, teachers, university representatives and the general public. She points out the inevitable disadvantages for equity:
A greater concern has to be the advantage that such a decision will inevitably confer on students from already advantaged backgrounds. Regardless of how carefully a student’s work outside the examination hall is monitored by their teachers, students who have support and resources and/or access to additional advice and teaching outside school will be at considerable advantage when preparing their investigative projects.
This long-understood concern has been supercharged by Generative AI, as Professor Hyland shows in the area of science. English is currently protected from such concerns, but
in view of these developments, there is a strong argument for postponing the introduction of the 40% AAC in the Leaving Cert, pending study of international experiences of the implications of Generative AI for assessment – or at least reducing the proportion to a more reasonable 20%. Given the high-stakes nature of the Leaving Cert and the fact that there will always be some parents who will do whatever it takes to secure a place on a high-points university programme for their offspring, it is unrealistic and unfair to ask teachers to verify that the submitted work is the unaided work of the student.
In the light of the above and in the interests of equity and fairness, it is to be hoped that this ill-advised reform will be reconsidered and that the proposal to allocate 40% of Leaving Cert marks to an additional assessment component be reconsidered.
One commentator has produced a really clear and helpful 11-page document called ‘Education Hazards of Generative AI’ for teachers, administrators and others, which explains large-language models and their considerable limitations: for instance, that they seem more authoritative than they are, and that they are essentially ‘role-playing entities that imitate intelligence’.

My claim is that AI in the form of large-language models is a tool of cognitive automation – and that’s all it is. All it does, all it can do right now, is make statistical guesses about what text to produce in response to text that it’s been given (and often it guesses wrong) … Using a tool that automates student cognition will lead to less effortful student thinking, which will in turn lead to less student learning.
More points from the document (with English in mind):
‘When providing feedback on essays, LLMs may not focus on the aspects of student work that are most important from a teacher’s point of view.’
Models available in the US are ‘WEIRD’ (Western, Educated, Industrialised, Rich, and Democratic), and thus ‘exposed to a biased sample of cultural practices and values.’ See that very point made in the next section.
‘LLMs may not recognise student creativity if the student’s work does not align to the data that they have been trained on.’
‘Knowledge cannot be outsourced to AI, and students who have not built a broad base of knowledge will not be able to make best use of this new technology. Educators should continue to focus on building student knowledge across all subjects.’
Writing: ‘Educators should be particularly careful regarding the use of AI for writing tasks. In many cases, the purpose of a writing assignment is to make students think effortfully and develop their writing; if AI does that work for them, they can miss opportunities to learn how to think critically, to assess ideas, and to consider alternate viewpoints.’
It ends, as I began this: ‘But no one can predict what the future holds. Be sceptical of those who claim otherwise.’
One point always to bear in mind: it is all too easy for the unwary to be unaware of AI’s biases, especially in matters of diversity, which, as one writer puts it, is

a very real problem for educators and AI. If AI is learning, where is it learning from? What sources is it using? Who is ‘teaching’ AI when there is human input and guidance? How diverse are those teams? … With no specific prompt, AI places poetry and the Western canon together as the definition of poetry. Knowledge and its curation has always been a biased affair. If we look at the National Curriculum in the UK and the process through which it was curated, we can see that it resulted in a curriculum that was deeply rooted in the Western European canon. When learning is impacted by colonial ideas, the result can only replicate colonial ideas. We see this happening with AI now.
Right now, there is an opportunity cost to spending too much time absorbed by this topic:
Talking about AI is a fun way of neglecting real professional growth. We can feel so progressive and productive.
US teacher Chanea Bond caused a stir by stating that any of her students using AI would receive a zero grade. Her coherent, robust and unapologetic reasoning:
They’re not using AI to enhance their work. They are using AI instead of using the skills they’re supposed to be practising. So, I decided we’re not going to use it in the classroom.
and
Her policy isn’t about asking students to bury their heads in the proverbial sand. She’s more concerned with what her students are learning—or more often, not learning—by leaning on AI to help them formulate and write their assignments. Bond believes that allowing students to outsource their ideas and rough-draft thinking to AI doesn’t help them and in fact devalues vital literacy skills like originality, creativity, analysis, and synthesis. “The original ideas are the most important component in a student’s writing,” Bond told me. “You can polish everything else. But how are you going to polish an idea that you didn’t originally have, that you didn’t originally think of, and that you don’t really have any investment in?”
and
There are a lot of things we don’t teach kids how to do that they end up using in their careers. That’s not my job. My job is to help kids develop foundational skills. Using AI at this point in time is not a foundational skill. If they need it, they will learn it on the job, in a job-specific way—just like we are doing right now.
Full interview with Andrew Boryga.
Antidotes.
Last term I spoke to our school on the provocative idea that the most ‘important’ subject to study, indeed of the most practical use in our lives, is poetry. That is based on this essay. Reading, studying and writing poetry are antidotes to the dehumanisation of Generative AI.
On which topic: you should read English teacher Carol Atherton’s recent book Reading Lessons: the books we read at school, the conversations they spark and why they matter, which shows powerfully how beautiful works of literary art engage our humanity in the classroom.
Adapted, expanded and reshaped from this original post.