39 Comments
Erin Leigh | WitchySelfHealer

I am of the opinion that we need to be more careful about which technologies we engage with and push forward. For many reasons, AI is one that I do not support. Rather than continuing to adopt this position of immediately assimilating in order to fit in and make sure we don't fall behind, maybe we should stick to our values and morals. Why are we so eager to adopt technology that makes things more productive rather than technology that makes us happier, more at peace, more emotionally intelligent? Maybe we should focus more on how to evolve emotionally, rather than on how to evolve technologically, be more productive, and make our children more productive. This is just so the wrong direction for humanity.

Hana

Agreed, Erin. You have uncovered the next level of what Sasha has been trying to bring to light here.

Tina Ye

Noting that a lot of the responses are focused on the AI usage itself, but I wanted to comment on the parenting and schooling aspects.

As a parent, I share your concerns about what "meaningful, life-giving work" means in the future when so few of us can even find it now. I'm also slowly embracing unschooling/deschooling as my kid nears school age, which means questioning and resisting the assumptions put in place by schools, among them the necessity of mandated assignments and curriculum. Much has been written and studied about how standardized schooling suppresses and destroys a person's desire to learn, leaving them good instead at "performing studenthood" for the authorities. How much of that actually builds skills, resilience, and creativity for navigating the challenges of today's world?

I'm personally opposed to the wanton use of GenAI, because it's both energy-intensive and yields unimaginative results that make a mockery of human ingenuity, BUT there is something to this notion of teaching our kids to navigate the world in ways that are "smart" and maybe a little cheeky/disruptive, rather than simply abiding by the expectations put on us by institutions that were designed to create obedient, predictable workers to keep the capitalist machine going.

It can be tough to put yourself out there as a parent, and I feel like sometimes parents are the most-criticized people in the world even though no one gave us a training manual and it's one of the hardest jobs in existence, so I applaud you for starting this conversation! I believe it is a good one.

Lily East

Excellent analysis.

Akram Khan

I like where your head was at regarding being disruptive to the capitalist machine, but I'm sorry, using this tech is not cheeky. You're falling for what they want - to use your data and thoughts, to use your artistic creations to generate lifeless mockeries, to numb you down until you can't think for yourself. It's an insult to life itself, as Miyazaki from Ghibli said.

To be disruptive, to really be punk, I personally think we need to wait for more regulation with this tech before we palm it off to our kids. We need to teach them time management, teach them to question everything. We need to tell them to slow things down rather than perpetuate the world where instant gratification is the norm. Read books. Take time. That's punk.

Tina Ye

Hard agree. And yet, at work… everyone has succumbed to using GenAI to write OKRs and other corporate busywork, because there is no purity in a system in which time is not your own. That's the reason I don't fault the parent for their "moment of weakness." The more interesting conversation is "now that you've tried using GenAI for homework, how did that make you feel? What was the impact locally and systemically? What is your role and responsibility in all this?" That is the teaching moment for our present times, not the writing of the book report or the assembly of the slide deck.

Keith Friedlander

Pretty disappointing to see such a blasé approach to AI in education coming from the Center for Humane Technology. As others have said, the ability to locate, comprehend, and explain information to others isn't just "assembling dry information." It's foundational to developing communication and information literacy. This is like saying, "Why should my son have to learn his scales to play the piano when we can just get an AI to play piano music for us?"

Listen, I get it. AI will be a part of our future and the nature of essential literacy skills will change in the coming decade. But having a child just use AI to perform every part of the task will not help prepare them for a more complex information ecosystem.

I teach a college-level writing class. We spend the first two weeks playing with generative AI. But when I have students play with prompt engineering, I task them to identify the errors in its results and consider where it requires intervention, and I prompt them to think critically about scenarios where it is appropriate and inappropriate to apply the tool. I give them readings about the environmental, legal, ethical, and developmental aspects of the technology. We can't pretend that students won't engage with AI or that it won't be part of their future, but you have to at least present it in a proper context.

I feel bad for your child's teacher.

Sasha Fegan

Hi Keith - thanks for your comments. You're right that assembling information involves cognitive effort, but the main issue with assignments like this is their poor adaptation to the 21st century. Asking an 11-year-old to do this as a Google slide deck is just asking for plagiarism, IMO.

Even in the pre-LLM world, we knew that these digital homework assignments usually translated to a combination of Google + Wikipedia + parents. How many Grade 6 kids are genuinely synthesizing information versus just copying and pasting?

If we truly want students to develop these skills, the education system needs to acknowledge the reality of the digital world we live in. That likely means shifting these assignments to in-class, book-based activities - which I'm firmly in favor of and have been advocating for at my kids' schools.

But concurrently, we can't pretend that AI is not at our kids' fingertips. They need to be taught how to use AI responsibly and critically - which is what I've been teaching at home.

Joel D. Kennedy

Uh oh! Sounds like the armchair Luddites are "quite put out" by your honesty and vulnerability. I think the point of the article is that we need to be realistic and ask ourselves: how do we manage our own impulses and teach our kids in a world that is rapidly changing and adopting new technology, whether we're fully on board or not? Because, whether we like it or not, the raised expectations that come with expanded technological capabilities are here, and they expand every minute that we sit and debate the virtues of AI. That genie is not going back in the bottle, just like no one can un-invent the atomic bomb. All we can do at this point is decide how we are going to let it affect our lives from here on out. I, for one, am cautiously optimistic, though I do worry about the effects it will have on the mental development and environment of generations to come.

Valéria Souza

I train AI for a living, and I can assure you that you have done your child a disservice with what you documented here.

Obnubilated

I also work in this area. Just because you can use it, does not mean that you should. Tech companies have done many internal studies showing how casual use of AI causes de-skilling over time.

There are many appropriate uses for this very resource intensive, environmentally unfriendly technology. Avoiding having to do basic research and writing is not one of them.

This Be The Verse

I too was pretty disappointed to read this introduction to the latest podcast episode. While I don't know the complete details of the task that the kid's teacher assigned, it seemed like a cool research presentation where a lot of learning could have taken place. Sifting through websites and articles and synthesising findings into coherent paragraphs sounds like a valuable learning experience to me. And by the way, if the child had turned up to class without a good presentation, they would have had to own up to their laxity, which would also have been an opportunity to learn accountability, a rare quality these days.

I teach film studies and theory of knowledge at a high school in Western Europe, and I already see the effect of AI on students' cognitive development. The unregulated availability of AI means that students now automatically reach for it to bypass the hard-thinking stage of any task they're given. I've witnessed their higher-level thinking skills atrophying over the last two and a bit years, and I'm so, so worried that they are becoming prime targets for manipulation by bad actors, as their critical thinking skills aren't given the chance to develop. It has been very hard for teachers who deeply care, like I do, because we are fighting what seems to be a losing battle. Yes, we cannot ignore AI, and I try to teach students how to use it critically, but this is an (educated) adult technology; students simply don't have the intellectual maturity to spot epistemological bullshit when they see it most of the time.

Please, let's not forget that teachers are professionals in pedagogy; we think about how best to help young people learn all the time. Yet we are often disregarded in the conversation about AI in education, while people who have never actually taught make all the decisions.

Thea

This is nonsense. Seeking out and ordering knowledge is a form of intellectual production (i.e., something that one can and should learn - it's not a mere matter of Googling or using ChatGPT). Just because you can outsource that work (particularly to something as grossly destructive of the environment and of the best human impulses as ChatGPT) doesn't mean that you should.

What did your son learn from using ChatGPT to produce this work? What would he have learned if he'd done it himself?

Why am I subscribed to the Center for Humane Technology, anyway?

E Olson

To me, this doesn't seem like a humane use of technology. For one thing, what about the teacher who assigned the presentation? Surely they had an educational goal here, whether it was to teach students about research or how to present information to a group. Isn't it seriously disrespectful to that human being to tell your kid that he can cheat on an assignment just because you have a personal disagreement with the pedagogy? You seem to disregard the teacher entirely, and you're teaching your son not to respect teachers who ask him to do work he isn't interested in, too.

You glibly mention being a helicopter parent - but I do have to ask: what do you see as the value of education? Is it training to be a citizen, or is it a series of hoops to jump through to get credentials that make you a better asset to the market? Because lying to a teacher to get a better score seems to teach a very cynical lesson about why anyone attends school in the first place.

Center for Humane Technology

Yes, good points. Well, it was certainly a humane use of technology for me, an overworked parent! ;-) But more seriously, I do totally understand your concerns.

But I think we need to be realistic. Even in the academic stream that my son is in, most kids handed in work that still had the blue Wikipedia links in it. That's just reality, and no less 'disrespectful' to the teacher or the educational goal than using AI.

If the task had required the kids to write an essay about something in another country they found really interesting, then my son would have been fine. He was well on the way with his study on the myth of Dracula.

The problem is that this would not have been rewarded by the strict grading rubrics that accompany assignments. The education system, as it currently stands, rewards people who are good at jumping through hoops and parroting information - kids who are great at 'achiever mode', as Rebecca Winthrop described in the podcast - and they emerge as anxious, fragile people who are good at doing exactly what they are told.

I'm not sure this is a good skill to encourage when AI can do that better. That is what needs to change if we are going to set our kids up for good lives as citizens and humans.

E Olson

These are intriguing ideas - and there is something romantic and exciting about the prospect that AI could catalyze curriculum reform. Tomorrow, middle school will be interesting! Fewer five-paragraph essays and geography quizzes: yes, please! I also appreciate your candor and openness in discussing a real ethical dilemma from your life. (And please do take all of these words in the spirit of no-judgement, if possible. :) I'm stuck on this post as an amateur ethicist, but it's a hard world out there for parents.)

All that being said, I'm still not convinced.

1. If I was the teacher, I would feel disrespected by the lying more than anything - especially being lied to by another adult.

2. If we're trying to break out of the achiever mindset, then why are we cheating on middle school homework assignments? Assuming that it was a mindless assignment of little educational value, then why care at all about the grade? Implicitly, what you're teaching is that the grade matters more than the educational content of the assignment, even if you need to cheat to get a good grade. Even more than the strict rubric, I think this mentality contributes to the cultural problem of anxious over-achievers that you cite. Letting kids use AI to get good grades more efficiently without learning doesn't solve the problem - it digs the hole deeper.

3. Expecting kids to be honest and responsible community members isn't evil capitalist mind-control - it's just part of raising good people, isn't it? Although we might be able to use AI to accentuate critical thinking and creativity in school, I'm skeptical about the notion that we can really eliminate all of the drudgeries of learning. Algebra - which, I'm told, is fairly important for building and running these LLM things - is still algebra.

Coming back to my bigger point, this post rankles because it feels like the teacher is so totally left out of the picture! Remember, teachers are an extraordinarily smart, driven, and compassionate group of people. They don't want to make children robots any more than parents do - and they HAVE to be involved in the conversation about how AI is used in education. We can't neglect these individuals in large-scale critiques of the education system or capitalism.

The Mentat Project

More importantly - you're not teaching your child how to fail. So what if he didn't complete it? That's on him. If he got a bad grade - again - that's a consequence. You have to let children learn that: a) failure is a lesson, not a personal or ethical failing - you learn and move on; b) if he doesn't do the work, he's not going to get the grade - cheating only helps him bypass the process, and he can go through life knowing he can take this shortcut or that cheat and it's OK - hey, Dad said it was; c) this is why you helicopter parents fail your children - you do not help them learn resilience in the face of disappointment, the importance of failure in inculcating a sense of humility and perspective, and the necessity of picking up and moving forward again.

Rather than discounting the curriculum or the process because YOU DON'T AGREE, talk to the teacher - get their perspective on it. If the other kids have Wikipedia links, why do you care? Why hold your child to the lowest standard and encourage him to cheat to get the better result? At least those other kids looked up the information and read something about it on their own. All you did was pose a question to a bot. This is why we're in the situation we find ourselves in - too many people never learning to be resilient, self-reliant, and humble. Humane technology - just another hypocrite.

Michele

Teachers assign these types of assignments for many reasons. I'm sure time management was an objective of the assignment. Your son wasn't supposed to do all of that in one night. One of the lessons you stole from him was the realization that leaving it all to do the night before is not a good approach. You also showed your son that his teacher's assignment wasn't worth his time - a subtle reinforcement that you don't respect the teacher.

S. T.

This article convinced me to unsubscribe from the Center for "Humane" Technology. I understand taking a nuanced look at homework and our approach to education, but the nonchalance with which the author completely dismisses the value of the work is quite concerning.

| So I told him, this is not a skill for humans anymore. It’s a task for AI.

Such a quote seems quite irresponsible, and is a surprising take from someone employed as a content director who could not have attained their position without these skills.

susan fitzpatrick

As someone who recently signed on to the Center for Humane Technology, receiving my first email with the subject line "Teaching My Son to 'Cheat' with AI: A Parenting Confession in the Age of ChatGPT" was a really unfortunate way to get a first look at the Center's work. Perhaps I should read it, but I'm not going to. I'm unsubscribing because I can't manage shock-surprise emails randomly arriving in my inbox.

Dana Polojärvi

This is my first read of any article from your organization. I'm surprised that a center for humane technology would be advocating for cheating. Would you be so kind as to tell us why you put quotes around the word "cheat" in the title?

Would you clarify how the activity was humane? Did you use it as a way to communicate with the teacher by having your child openly co-author the paper with ChatGPT? Did you use it as a means of building a bridge between yourself and the teacher? I'm confused.

Steve Muth

I thought that after actually having a child, all us parents took the no-judge pledge! I loved hearing a real story about pedagogy meeting parenting. But for me, the part of the story that resonated most wasn't the parenting part; it was the impact of that very traditional assignment on a young person's mind. He was quite obviously disengaged in the work, or it would have been done already.

To use Jenny's language from the interview, he was either in "passenger" or perhaps "resistor" mode, and it turns out that neither of those states is quite the cause for alarm that it has long been. As a young student, I was a stone-cold resistor and got the grades to prove it. But because I grew up in a pre-internet age, I simply read the books I loved instead of doing my classwork. I graduated with an abysmal GPA but an intact sense of agency and energetic curiosity that has served me really well.

Similar to my no-judge pledge on parenting, I won't judge that teacher who gave that assignment because I'm missing a boatload of context that matters. But in general, I'm hoping that gen AI's unpleasantly quick arrival in education will act like a brushfire and clear out some of the dead pedagogical underbrush. There are many, many wonderful learning activities that play just fine in our new AI world, and we just need to move to them, very, very quickly.

This Be The Verse

"He was quite obviously disengaged in the work, or it would have been done already." I hear this argument often but the reality of why kids don't complete assignments is much more complex. For example, their poor attention and time management skills often stand in the way of their potential for engagement. Secondly, what is an engaging task for one student might not be for another, there are no rules, we are dealing with individual human beings. Usually students find writing essays hard for example. They're lengthy assignments that demands that students are left alone with their thoughts for an uncomfortable amount of time while they makes sense of what's in their head, they often lose engagement while working on them. But, it doesn't mean that the assessment is flawed, it just means that it's hard. Hard often means valuable.

"There are many, many wonderful learning activities that play just fine in our new AI world," Like what?

Students are usually engaged when I use multiple choice quizzes, but these are low-quality learning activities that I rarely use. Instead, I assign tasks that offer more substance. In the same logic, just because kids prefer junk food over veggies doesn't mean they should be given it without restraint.

I don't doubt for a second that some teachers need to revamp their teaching strategies, and I agree that education in the Western world needs to be reformed to an extent, but I've yet to hear any good, practical ideas of how to do that.

In my experience, an engaged, passionate, happy teacher means engaged students. As simple as that. 🙂

Steve Muth

I do agree with quite a bit of that. The reason for not completing the work may well have been poor attention and management skills; however, it also could have been that they simply weren't intrinsically motivated, which is a difficult but foundationally important thing to assess. I think I'd really only trust three voices to figure that out: the teacher, the parent, and the student. But IF the problem was a lack of intrinsic motivation, then that is the absolute most important problem to solve for. Unfortunately, I think it also happens to be the hardest problem to solve, particularly at scale.

This is why my personal preference is for a model where the teacher is given a great deal of time, space, and general deference about how to solve for that. Teachers have been doing it since before teaching was a profession, and they are uniquely positioned to know whether a learning task is a desirable difficulty or just curiosity crushing. Two books I'd recommend that dig into this issue of outcome based assessments causing profound harm are John Warner's "Why They Can't Write" and Kelly Gallagher's "Readicide."

I think I might not have been clear enough when I said there are many wonderful learning activities available in our new AI world, because I wasn't talking about anything that uses AI, or anything new at all. There are many kinds of amazing alternative grading techniques that have been around for some time but are now being explored and scaled up with much more intensity, because gen AI use by students pretty much makes many traditional assessments even more harmful than they already were. Emily Pitts Donahoe is just one of a number of people doing some really interesting things with alternative grading. https://open.substack.com/pub/emilypittsdonahoe/p/the-autonomy-accountability-paradox?r=38n142&utm_campaign=post&utm_medium=web&showWelcomeOnShare=false

But I did also have a personal favorite example to share. It's a video about an "AI Aware" stacked group assignment where peers use interactive video commenting to critique the work of a peer. It's quite obvious in the example that gen AI, even if it was used, could not really have played any significant role at all, and... critiquing (or poking fun at) one's peers is, I think, very intrinsically motivating ;-)

Here's the link to a video about that example - https://voicethread.wistia.com/medias/2p9dgteqkp

Agnese

I also avoid judging other parents. I appreciate your honesty in sharing this anecdote. Of course, like many commenters, I'm concerned about the impact on learning and critical thinking of relying on AI to complete homework and assignments.

I'm a writing instructor in higher education. Next semester I won't be assigning any homework and will go back to vivas and handwritten assignments.

Tina Ye

My first thought upon reading the harsher comments was "some of y'all must not be parents" 😅

The Mentat Project

Call me what you like, but teaching your child at this stage of his development to rely on ChatGPT is abuse. You are teaching him to curb his own powers of creativity, synthesis, research, and analysis by relying instead on a bot that will use him as the tool. All you people out there proclaiming the glorious AI future - WAKE UP! The move-fast-and-break-things ethos of an unfettered Silicon Valley has led, over the past decade, to deep knowledge silos and ever-shorter attention spans; a tribalized and magical-thinking society; citizens who are less literate, creative, or capable of nuanced critical and abstract thinking than previous generations; human beings who are lonelier, angrier, and prone to mental illness; a public discourse mired in shallow puffery and demagogic certainty; a nation unable to remember its past or to actively work towards its future. This was the result of outsourcing human thought and socialization to the algorithm. The manic drive to release ever more malignant and powerful forms of machine intelligence is the culmination of these nation-shattering trends.

AI copywriters, AI lawyers, AI accountants, AI programmers, AI engineers, AI architects, AI musicians and composers, AI novelists, poets, and visual artists, AI administrators, AI filmmakers and actors, AI teachers, AI philosophers, AI doctors, and even AI friends and AI lovers are all touted by the dark cheerleaders - Musk, Vance, Thiel, Altman - as "Progress". We need to strangle AI in its crib - now.

dotpip

The assignment seems to have been to obtain and organise factual information in a coherent way from multiple sources so that the student may build on these skills in the future - the ultimate aim down the track being to produce a critical thinker. Rather than gaining middle-class advantage by plundering the dross-blend of others' intellectual property that is AI output, a more constructive approach may have been to (a) encourage the child to do the work of building their brain by doing the assignment themselves, or (b) help the child through accepting the consequences of not being organised in time to submit the assignment, and to help the child plan better in future. What happens next time? And the time after that? And in exams in the future? This was a valuable step missed in building a working method and a thinking mind.

Dave Talas

I appreciate you sharing this, and I respectfully disagree. I teach AI to entrepreneurs for a living, and I'm seeing more and more that using AI is making them dumber and dumber.

There is a real risk in offloading all intellectual tasks to AI. Sure, it can do it, but the "use it or lose it" principle still applies here.

You said: "The assignment wasn't teaching him how to think. It was teaching him how to assemble dry factual information and lay it out nicely on a page." - and that's where you are wrong. Here are a couple of things this assignment was teaching him:

- time management

- planning and estimating how long a task takes

- project management (in which order do I do which task? What are the tasks I even need to do? if I'm starting to go off schedule, how do I realign myself?)

- research skills

- fact validation (how do I know what I read online, or from an AI is actually true?)

- staying focused while researching stuff online

- distilling valuable information (signal) from noise

- accountability (I stole this from another comment. What if you let your kid fail the assignment or miss the deadline?)

- presenting information to others (a.k.a. empathizing with an audience) in a way they can also understand, without knowing everything that I know

And the most important one:

- the self-confidence that I can do stuff.

I wonder how your kid will feel when they receive praise and admiration for work that wasn't theirs. They will become more and more dependent on you and on an AI - which is actually quite the opposite of what I think parenting is about.
