In Focus

‘Learn to code’ is dead. So what the heck should you actually teach your kids in the age of AI?

Holly Baxter asks tech experts what students should actually study, now ‘learn to code’ is dead — and gets some surprising answers

The kids of today aren’t just digital natives — they’re AI natives. But computer science graduates aren’t that employable anymore, and coding is becoming a blue-collar job. So what’s next? (The Public Interest Network)

Once upon a time, “learn to code” was the answer to every dying industry. It was the mantra of bootcamps, the moral of every op-ed about automation, the smug refrain lobbed at laid-off journalists and humanities majors alike. Coding was supposed to be the last safe job in a world eaten by software.

And it was hard not to believe the hype. Software engineers in their first jobs were raking in high six-figure salaries and complaining on Reddit about million-dollar packages from Google, while Philosophy grads shared tiny apartments and subsisted on ramen. Coding was rational, lucrative, and future-proof, we were told. In an uncertain environment, the only thing that made sense was to learn the languages underpinning all recent technological developments.

But now, the coders are out of work. Tech layoffs have gutted the field, and talk has turned to replacing entire engineering teams with one well-prompted chatbot. According to the New York Federal Reserve’s analysis of Census and employment data, recent computer science graduates now face a 6.1 percent unemployment rate, placing CS among the undergraduate majors with the highest unemployment (yes, even higher than journalism). To put that in further context: the unemployment rate for all recent college graduates (ages 22–27) hovers around 5.8 percent — meaning both computer science and computer engineering grads are overrepresented among the unemployed. A recent report by the venture capital firm SignalFire found that recruiting of new graduates by the top 15 tech companies has fallen by more than 50 percent since 2019, especially for entry-level roles. Couple that with frequent rounds of layoffs — in 2024 alone, over 150,000 tech employees were laid off across 551 tech firms; through mid-2025, nearly 89,000 more have been cut — and the message seems pretty clear: coding isn’t the safe bet it used to be.

The great irony of automation is that it has come for the very people who built it. CS graduates are now finding themselves competing with machines as well as each other, and then, when they do get jobs, they risk creating even more sophisticated algorithms that could become their own replacements. So what the hell do we teach our kids?

“Coding used to be like: Hey, I know Python, I know C, I know Java, I know this stack or that stack, and I have five or ten years’ experience on this,” says Nicolas Genest, an AI entrepreneur and CEO of the tech education startup Codeboxx. “And it no longer matters. AI codes better today. I’ve seen it, I’ve experienced it, I’m saying it: it codes better today than any software engineer you can think of. It’s faster, it secures more coverage, it writes all the tests, it doesn’t have a blind spot, doesn’t get tired. It can work 24/7 and doesn’t react negatively to undesired feedback from the product team. It doesn’t flinch when you tell it ‘Do this again’ for the seventh or eighth time, or ‘You’re not done.’ Engineers hate being told they’re not done.”

Genest should be an unlikely candidate to say this, since his entire career has been built around his own coding skills. He’s been coding since 1985, when his grandfather let him borrow his computer and he taught himself the basics. But these days, Genest hardly codes at all: he “vibe codes” instead. In other words, he writes prompts (such as “create the code for a phone app game about farming”) in language models like Claude or ChatGPT, and then he simply tinkers with what’s there. He rarely finds that many mistakes. In journalistic terms, he’s moved from being a staff writer to an editor. Why would he go back?

It isn’t just computer science that’s in danger, either. Genest, who is Canadian, had his green card denied four years ago.

“I spent tens of thousands of dollars on lawyers to get denied that green card,” he says. “And this year, I decided to give it another shot — but instead of hiring lawyers, I opened a conversation with ChatGPT.” With a few short prompts, the LLM had built a new green card strategy for him, created letters of recommendation and support that he could send to the appropriate people, and had advised him to check six boxes on the standard application rather than the three that are usually required. Genest asked it to take into account his failed application, the ways in which other applications had failed, and strategies that had been used in the past for people in a similar position to him. He then followed its instructions and put in a new application, which was successful. His green card arrived in July.

The most successful coders of today learned on machines like these. Their kids have access to much better technology, but also potentially much more limited opportunities (Getty/iStock)

“I was able to achieve all that without using flawed lawyers — expensive, flawed lawyers,” he says. The key was augmenting all the usual steps with personalized ones, he believes. He also asked it to take into account types of green card applications that seemed to have been successful most recently. A single lawyer would struggle to stay on top of their game like that.

In his day job, Genest often sees new computer science grads sign up to educational courses in AI, because the developments have moved faster than college curricula. That’s one way of dealing with the current landscape. It’s similar to the way in which Genest’s eldest son, who is 18 years old and studying at the University of Florida, has also responded: he’s doing a double major in math and finance, with AI-focused internships. Genest’s younger son, however, who is 15 years old, has responded very differently.

“He said, ‘I’m gonna be a baker and I’m gonna bake bread. I’m gonna have my own bakery,’” says Genest. “‘Because people need to eat and people love bread, and I’ll do pastries.’ And so a lot of people of this generation cope that way. He was like: ‘I need certainties and I’ll go physical, because the physical world remains existing. The white-collar jobs are at risk. I’m not going to risk myself going into this right now. I’m going to try to do something very concrete.’”

The new consideration for parents, then, might not be how to make their children fluent in technology, but how to make them more human in a technological world.

A digital world shaped by white men

Kate Arthur is an entrepreneur, charity founder and author of two books about how to develop “AI literacy” in a rapidly changing landscape (her latest, Algorithm to Adulthood: Becoming an Adult in an AI World, comes out later this year). She’s also a mother to three girls, one of whom recently left her computer science degree after four years to transfer to Women’s Studies.

“We had a long discussion about why,” Arthur says. “And she just said that in the university system, her classes are so dry and so boring. There’s no discussion, no debate, there’s no real thinking in terms of world problems. And so she said: ‘Actually, I’m not engaged anymore. I know how to code. I’ve done four years of this. Now I really want to challenge my creative side and my debating and conversation.’”

To Arthur’s mind, her daughter is entirely justified in moving out of CS at this juncture — in fact, she might be doing the savviest thing she could possibly do. A basic knowledge of coding is great, but the companies of the future may well only employ a small fraction of the number of coders they once did. What will be more important is being able to inject the humanity back into the code.

“In the ‘80s, it was a very minority, small group of white men who knew how to code and who started to shape this digital world,” Arthur adds. “So we've got this physical world where we were all in, and then they were building this digital world to mirror the physical world. And in that, all of their biases and all of the inequalities were baked into that environment. And so now we have this digital space where it's so in its nascency, so young — we don't know how to communicate in this space. We don't have values in this space. It has no borders.” More than anything, this new digital realm needs proper architects: people who have thought long and hard about its inadequacies and people who have engaged in critical thinking about it. They will have to smooth out the edges. They will have to debate about how to nudge subjectivity toward objectivity. They will have to consider how much of the code from the eighties should be used today, and how much of it might do damage; what is scaffolding and what is rot.

Arthur’s daughter doesn’t want to be a feminist academic; she wants to be a data scientist. When she told Arthur that, her mother asked her: “Well, what would an AI be able to do in that?” They had a discussion about what can or should be automated versus what shouldn’t. Arthur still thinks it’s important to learn how to code — to know what constitutes the building blocks you’re using — but she thinks that coding will become just another form of literacy, like reading or writing. The skilled workers of the future will be trained in critical thinking, creativity and community-building: things that can’t be easily replicated by AI.

“To parents who worry about ‘What should I be doing for education?’, I like to ask them how they define education, what is its purpose for them? Because we get very quickly into the myopic and forget to think about: why do we send our kids to school for 20 or 25 years?” she says. “Why is it mandatory? And a lot of it does link back to that first industrial revolution and the need for us to be literate, for us to be able to read and write and communicate, and to have the numeracy skills to be able to understand numbers. Those three — reading, writing, and numeracy — anchor the creation of the digital world. So lines of code are letters, words, and numbers. And I still am a believer that we do need to know how to code — not everybody, and it doesn't need to be a massive army — but if we are now using machines that are coding for us, who's checking the machines? I would say the same to someone saying: Do we need to read and write? Because if machines are reading and writing for us, do we need to read and write? The moment we give over that human agency to machines, we're losing control.”

There’s no way to predict how the technological landscape will look in a decade, Arthur adds, so instead we should be doubling down on critical thinking skills. Just like Nicolas Genest — who says his own kids were able to unlock a phone at 18 months old — she believes the best thing we can do is introduce critical thinking very early in education, so kids are given the tools to be able to parse out what’s real and what’s not from an AI environment. Genest says he worried most about the effects of social media on his teens, but now there’s a host of other problems for the newest parents to have to worry about: AI psychosis, AI boyfriends and girlfriends, isolation driven by a dependency on AI friends, de-skilling caused by AI programs writing essays and conducting research for today’s students.

“Needing to read and write, needing to understand numbers, needing to learn to code, are all part of a skillset that strengthens our ability to communicate,” says Arthur. “And they’ll go into that AI space understanding how machines work, understanding how data is manipulated in these machines. And then whether you go into building those AI models or designing them or using them for your own job, that all just depends on who you are as a kid and where your interests go.”

Encouraging a love of education for education’s sake, rather than pushing people to follow subjects in the name of employability, is the only way to be truly future-proof, she adds: “If they can just get the love of learning as a skill, because this is like a lifelong skill that this generation is going to need to know, because it is all moving so fast. To say that I want my kid to do this job — the job might not be there in five or 10 years’ time. We just don’t know. So to be able to continuously learn and relearn and pivot and build those human skills is most important.”

The best way an education system can respond to all this is by bringing the arts and sciences back together, Arthur believes: “I think our education systems in the West do a disservice by separating the sciences and the arts. Some of the greatest innovations of our time, of human history, have been through the lens of both arts and sciences. And if you just look at the beauty of nature, it's so mathematical and so scientific, yet so beautiful and so artistic.” For too long, we’ve allowed a strict separation of humanities and STEM subjects, but “in the world that we're going into with AI, we need to be bringing those two together.”

It’s not necessarily just the kids who need to pivot, she adds. If essays for homework are being compromised because students are using ChatGPT at home, then maybe this should be the end of writing essays outside of the classroom. This is a good opportunity to “overhaul the education system” for the better, Arthur says, and it’s the responsibility of adults first and foremost to become AI-literate so they can teach that literacy to their charges: “The role of the educator should be changing as well as the student’s.”

Bringing the arts back

Mitch Resnick has spent most of his career trying to make technology feel more like play. At the MIT Media Lab, where he leads the Lifelong Kindergarten group, he developed Scratch — the block-based programming platform that turned millions of children into coders without them ever typing a line of syntax. The idea was simple but quietly radical: instead of teaching kids to memorize commands, let them drag and snap colorful logic blocks together, more like Lego than code. In the process, they’d learn the habits of mind behind programming — breaking problems into parts, debugging, iterating — without the intimidation of text-based languages. Scratch was phenomenally successful and is now used worldwide in schools from kindergarten age onwards. But when I ask Resnick whether Scratch successfully prepared kids for a coding-based jobs market, he’s clear: “Preparing young people to get jobs as software developers was never the goal.”

Resnick’s “ultimate goals have never been specifically about coding,” but instead about “helping young people develop as creative, curious, caring and collaborative learners.” That’s an important distinction, because the qualities he names are specifically human qualities that an AI would struggle to emulate.

Like Kate Arthur, Resnick believes that the most successful workers of the future will need to be adept in the arts and humanities, as well as the pattern-based, systematic thinking common among software engineers and computer science grads. That’s something he believed even when he was developing Scratch over 20 years ago.

“In the beginning, we wanted Scratch to appeal especially to kids who never imagined themselves being interested in coding,” he says. It was great that naturally pattern-based thinkers also got on board, but he was especially happy when “someone's friend told them: Oh, with Scratch, you can make your drawings come alive,” and then the kid with the sketchbook had their curiosity piqued.

Scratch is now a component in most curricula across the US and the UK, teaching children how to code through simple setups that allow them to bring their drawings alive or move a robot around the table

It’s not just children who have benefited from Scratch. When the globally successful CS50 course at Harvard — a course for on-campus students that was made available for free online in 2015, and has since been taken by well over 5 million people — implemented a small Scratch project in its first week in 2007, it made a remarkable discovery: the retention rate went way up. People who might previously have dropped out because they felt that computer science was too technical or stale, or didn’t speak their language, were able to stay on for weeks of technical lectures after just a small amount of Scratch. It seemed particularly good at keeping retention rates up for women.

Resnick cautions against trying to make too many predictions for the future, considering how fast AI developments are moving. But when he thinks about what coding looks like in the future, he believes we have a responsibility to urge our systems-minded kids to consider dabbling in the arts, and vice versa. “Vibe coding” — writing good prompts into LLMs — will be a useful skill to have, but it’s best deployed with a knowledge of the technology underneath.

“A type of thought experiment I would do is… as 3D printers become less expensive and more capable, if someone wanted to make a dollhouse, they could just sort of tell the 3D printer to print a dollhouse for them,” he says. “Or you could use Lego bricks and build a dollhouse yourself. Each has an attraction. It's not that one is right and one is wrong. For certain things, if I want to quickly build a dollhouse, I want to make sure it's sturdy, then I can download some plans for a dollhouse, put it into my 3D printer, print it out, and I'll have a good dollhouse. And you could then play with it and it could be a good experience. But even if you could do that at a very low cost, I think there'd still be a lot of kids who want to build with Lego bricks. There's a joy that comes from building with Lego bricks.”

There’s an in-between path here: you can build your initial dollhouse and then change parts quickly with a 3D printer, or print out a generic house and add to it with personalized Lego. That’s probably how coding jobs will look. But if you dismiss coding languages entirely, Resnick adds, “there’s a risk that you might lose the joy of creating, which I think is an important thing.” There’s also a risk that only systems-minded people would then learn the actual code, while humanities-minded students default to “vibe coding” and prompt engineering and, in doing so, miss out on a valuable skillset. It’s important on a social and personal level, Resnick believes, for those groups to interact with each other as we continue to build an AI world.

Making an effort to stay human

The professor at the helm of Harvard’s CS50 course became something of a celebrity as his subject skyrocketed in popularity. David Malan has just under 125,000 followers on Twitter and 155,000 on Instagram: a staggering amount for an Ivy League computer science academic. He is responsible for thousands of career pivots and many a ruminative LinkedIn post, a coveted guest on podcasts and the subject of serious media profiles; he was even recently tapped to lead Arlington’s Regents Theatre because of his “educational theatricality”.

“I think back on my own experience and for the past 20-plus years, I've been thinking to myself, as have I presume a lot of my friends in industry, how fortuitous it was that we all happened to like computer science back in the nineties when we were studying it and look at where it's led us, which has been a very happy accident in some sense,” Malan says, when I ask about the building of his CS empire. “And now all of a sudden the world — or parts of the world — are claiming that, well, that's it for computer science. So what should we have all chosen instead that would've given us 50 years of runway instead of 20 years of runway? It just seems like an impossible question to answer.”

At Harvard, he sees pretty quickly how changes in tech are affecting the classes he’s teaching: “We do certainly see this at the level of the undergraduates, because we see the changes over the years and just how active recruiting is by industry on-campus or off. And absolutely, it's definitely a different feel right now for students in terms of the availability of opportunities and the compensation packages certainly that they're being offered. And that does certainly seem to be a mix of the sort of contraction of a lot of tech companies and also the current trend toward AI. And the reality is that the industry tools in AI right now perform at the level of like a junior engineer for the most part. And that does call into question: why hire a junior engineer?”

Malan prefers, however, to take a “glass half-full” approach to this new environment. He knows some students have recently graduated without the AI knowledge they need to succeed: the landscape moved so quickly that even students who graduated last year will probably already need to take an extra course to get up to speed. But he knows that there are industries — healthcare, for example — where there were simply never enough humans available to fix the problems that needed fixing in the system day-to-day. He believes that AI will allow us to do more, and that coding will always be a useful skill to have, even if coding jobs are more likely to be fractured among multiple smaller companies rather than concentrated in larger enterprise corporations.

Ultimately, Malan believes, “the only right decision, whether it’s in K-12 or higher education, is to pursue one’s passions.” He’s already improved on the CS50 course by building an AI program that acts as a 1:1 teaching assistant to students in his class. The program — known as the Rubber Duck — started out as a way for students to simply talk through their own problems on the page, and it would answer only with “quack” (or “quack quack”, or “quack quack quack”). Now, it’s a fully trained AI model built on top of ChatGPT. Crucially, Malan and his colleagues designed the program so that it would never just give the student an answer outright: instead, it gives small hints and deploys the Socratic method. It also won’t give answers to any questions unrelated to computer science, so that students interacting with it don’t go off-track. Since the improved Rubber Duck went live, Malan says far fewer students have posted questions on the course’s message boards and fewer attend office hours. In other words, it seems to be working. Is he unwittingly building his own replacement?

“No, I don’t think so,” he says. There’s always going to be a role for “smart humans to teach other smart humans,” and AI is simply a useful tool to facilitate that. Besides, he adds, if the problem can be automated away, “I would argue that perhaps humans shouldn’t have needed to be solving that problem in the first place. So it isn’t really a loss.”

Malan sees a lot of concerned parents in his line of work, and he knows that when they’re shelling out for a Harvard education, a lot of them want to think strategically. He still thinks the best advice one can give is to pursue one’s passions. “You can't go wrong in general with studying some aspect of STEM, just because the world is so increasingly technological and scientific in nature,” he adds. “I don't think that should be to the exclusion of the arts and the humanities because I think it is that combination of STEM and non-STEM that really tends to yield interesting applications and intersections.” Instead, the focus should be on building critical thinking skills, he adds.

MIT’s Mitch Resnick has a similar outlook. “I think it's always important to emphasize — it's going to make you happier, but I also think it's gonna be a successful strategy — to follow your passions and work on things you really care about,” he says. Additionally, “I think with all the changes going on in the world and the proliferation of AI, it’s going to be more important than ever for young people to develop the most human of their abilities. In talking about what discipline of computer science or physics or literature or whatever field you choose, make sure that you keep alive your creativity, curiosity, care, and collaboration. I think that's going to really pay off in the future.”

The skills that matter most, then, even according to the most ardent computer science defenders — imagination, empathy and critical thinking — aren’t new at all. They’re just the ones we forgot to call essential until the algorithm learned everything else. In other words: “learn to code” is dead. Long live “learn to code”.

“Knowledge is no longer an issue,” says Nicolas Genest, the AI entrepreneur. “Speed? No longer an issue. Accuracy? No longer an issue. We have all the tools to make our results reliable and trustable… What we inject as humans is the consciousness.”
