AI is taking our jobs – but these skills are uniquely human
Artificial intelligence is flooding into the workforce and graduate jobs are in freefall, writes Anthony Cuthbertson – but amid the upheaval there might be some good news

Twenty-five years ago, the British science journal Nature published a fictional article about the advent of superhuman intelligence. Titled ‘Catching crumbs from the table’, it imagined a future in which the frontiers of scientific inquiry had moved beyond the comprehension of humans.
The piece speculated that a form of artificial intelligence would become responsible for all new discoveries, leaving human researchers with the realisation that they would never make an original contribution to science again. Great advances were forecast, from which humanity would still benefit, but the details of the findings would be practically indecipherable to even the most brilliant minds.
In this imagined future, which incidentally takes place in 2025, some researchers give up on science entirely, while others shift their attention towards attempting to understand the discoveries made by AI. “The question,” the article asks, “is are these worthwhile undertakings for scientists?”
We may not have reached this point with science – although it may be getting close – but AI is already forcing many professions into a reckoning.
A study last week by Stanford economists found that employment growth for young people has declined by six per cent since the launch of OpenAI’s ChatGPT in 2022, with this new era of generative AI able to replicate the kind of book learning that university students receive before entering the job market.
The number of entry-level jobs in the UK is down by almost a third over that same period, according to separate research by job search site Adzuna. Meanwhile, a survey of 7,000 students across nine countries by student housing operator Yugo found that four in five students fear AI will make human workers redundant.
The students’ fears are reflected by recent online search trends, with data from Google showing a 5,000 per cent spike in the past 30 days in searches relating to AI job replacement.
“AI is changing everything – not just how we study, but what jobs we’ll be applying for after graduation,” Sunjaya Philips, a 22-year-old marketing communications student at Oxford Brookes University, told The Independent. “I’m trying to stay ahead of it by building both my tech skills and my creative thinking. That’s what I think will make the difference in a job market where machines can do all the basics.”
The worry is that retraining will not keep pace with AI advances, and that for many it is a game of catch-up in a race that will soon be lost. The jobs most in jeopardy, according to a study from Microsoft in July, range from data scientists and economists to historians and authors.
This last profession, which until relatively recently was considered the sole domain of humans, was put to the test last month by fantasy writer Mark Lawrence. In a blind test, he pitted AI writing against a selection of flash fiction written by himself and other award-winning authors. He was humbled to find that most readers preferred the AI-written stories.
“It’s a pretty grim outlook, especially for new and future authors,” he noted. “On the face of it, it undercuts so many things we value about being human.”
The same thing appears to be happening with music. In August, 37-year-old Oliver McCann became the first AI music creator to land a record deal after his songs reached over three million streams on Spotify.
“I have no musical talent at all,” McCann, who goes by the stage name imoliver, admitted to AP. “I can’t sing, I can’t play instruments, and I have no musical background at all.”
His success is not unique. AI indie band The Velvet Sundown has racked up hundreds of thousands of streams, while music streaming platform Deezer estimates that 18 per cent of daily uploads are now AI generated.
Microsoft’s list of jobs at risk from AI was mostly made up of white-collar or creative roles, with physical tasks deemed to be unaffected by generative AI – at least in the short term. But this could be about to change on a massive scale following the recent launch of Nvidia’s new “robot brain” chip.
Designed for a new generation of highly competent humanoid machines, the Jetson AGX Thor is capable of running generative AI models like ChatGPT in order to interact with humans. Embedded visual models also allow the robots to interpret the world around them and adapt to perform tasks accordingly. Nvidia boss Jensen Huang described it as “the ultimate supercomputer to drive the age of physical AI and general robotics”.

If it lives up to its hype, and trends in other fields continue, humanity may well reach the long-predicted point where human endeavour is no longer necessary. Innovation theorist John Nosta described the developments as “delightful, if not magical”, but warned in a recent article that they could expose a deeper human truth.
“If AI can do this, too, what’s left that’s really ours?” he wrote. “Maybe the things we believed defined us were never truly ours to begin with… They were only ours because no one else, ‘no thing’ else, could do them.”
Nosta noted that AI’s mimicry is not the same as the human act, as it lacks intention, lived experience and mortal awareness. But that doesn’t mean it is unable to perform the tasks. “The quiet and uncomfortable gift AI gives us”, he wrote, is that it will force us to reclaim what it is to be human. “When AI steals the doing, what’s left is being.”
Other philosophers have questioned whether AI will not just take our jobs, but also our sense of purpose. Nick Bostrom, whose 2014 book Superintelligence offered a dystopian vision of what might happen when artificial intelligence surpasses human intelligence, addressed this outcome in his follow-up book last year, Deep Utopia.
Speaking to The Independent ahead of its publication, the Oxford University professor said that what is ultimately needed is a culture shift, whereby the emphasis is on “enjoyment and appreciation rather than usefulness and efficiency”. For this to happen, he is advocating for a complete upheaval of the entire school curriculum as we know it.
In the absence of traditional employment, students would instead learn “appreciation of the arts, literature, sports, nature, games, food, and conversation, and other domains which can serve as playgrounds for our souls that let us express our creativity, learn about each other and about ourselves and about the environment, while enjoying ourselves and developing our virtues and potentialities”.
All this, of course, assumes that this hypothetical superhuman AI is benevolent. The conclusion of the 2000 Nature article offered the scant consolation that the technologies that made this takeover possible were originally invented by humans – and that at least we would, at last, be able to enjoy the fruits of our labours.