The Five Stages of AI Grief
A conversation with Ewan Morrison, author of the new novel For Emma.
1. Denial
When I first wrote about AI, I did so out of curiosity. In Elisabeth Kübler-Ross’s framing, I was in the denial phase of the grief cycle about this potentially killer technology. Two years on and many of us are heading towards acceptance. But not Ewan Morrison.
Morrison is still angry at the liberties taken by OpenAI and other companies and has become a prominent sceptic of their product’s capabilities. His latest novel, For Emma, takes his research into the realm of fiction. It focuses on one father’s deranged mission to take vengeance on the tech CEO who killed his daughter by microchipping her brain. As someone who is still trying to make sense of the AI revolution, I met up with Ewan to understand why he is so passionate about this topic. Before I started, though, he had a question for me.
Ewan Morrison: Will you be using AI to transcribe this interview?
Neil Scott: Yes, I'm afraid so. I hate the sound of my own voice so AI transcription is a blessing.
Ewan Morrison: AI has limited uses. As long as we know what the limits are, it's fine.
This is round three of our conversations. This time it’s on the occasion of the release of your latest book, For Emma, which is a great novel about searching for meaning in an uncaring universe. A lot of the book centres on the overarching question of “Why?” And I am curious to know why this novel? Why now?
My kids’ generation, Generation Z, is grappling with the problem of nihilism and asking why of everything. It's a strange thing for us in Generation X to see nihilism go mainstream. We experienced nihilism in its rebel phase with bands like Nine Inch Nails screaming, "God is dead, and no one cares. If there is a hell, I'll see you there." But, Generation Z, they try to use nihilism as a living philosophy.
You can tell yourself that it's okay if you don't have a boyfriend, if you fail at school, if you fail to be attractive, if you fail to get into university because, ultimately, we're all going to die. Nothing has any value. Even planet Earth is going to be destroyed when the sun becomes a red giant. They're using nihilism as a way out of dealing with daily reality.
That troubles me because we bequeathed nihilism to our children and we still don't have the answer to their big why questions. A lot of them are terrified of the various apocalypses that they believe are coming as well, whether these are ecological or biological. This demotivates them because they can't find a greater meaning. So I wanted to lump this on to Emma and her dad, who doesn't have answers to any of her questions.
2. Anger

One thing For Emma taps into is the question of what people really need. Not just what we want, but what we truly need. The main character has a strong connection with his daughter, but the novel shows what happens to people when they’re alienated from society, and the depression that ensues. This got me thinking about the theory of the Triune brain …
I don't know this theory.
It's this idea that the human brain evolved across millions of years from the lizard brain which regulates eating, sleeping, fucking ...
[Laughs] I've been there.
Everyone loves a lizard brain. But then you've got the mammalian brain with its doglike loyalty and affection. Finally, you get humans where the neocortex adds abstract reasoning. When I see AI creations, especially all these kitsch image creations, they're very mammalian: everything is on an affective level. Your novel seems to connect to the mammalian heart of humanity beyond rationality.
Yes, life experience shows us that while we might go searching for life's ultimate meaning in abstract ideas, the more abstract we get in these searches—whether we're searching across the stars or searching down into the core of our DNA—we're missing out on the actual thing itself that is present within us, which is our intimate connection with other people, our empathic connection.
The deeper we go into seeing society as a bunch of problems to be solved by reason, the more we isolate people and turn them into just little dots on a spreadsheet. You see this with people like the transhumanist David Pearce who is a member of Humanity Plus, which used to be the Transhumanist Society, which was created with Nick Bostrom, another one of these gurus. This is rationalism and utilitarianism gone to the nth degree, reaching a conclusion that is obsessive, neurotic, and destructive.
They take the utilitarian, rational premise—which goes back to the beginning of the Enlightenment—that you want to create a society in which there's no more suffering, in which there's no more pain. Pearce argues that we can redesign the entire human species and every other species to remove the capacity to cause suffering to others and remove the ability to feel pain. This would require three or four different augmentations of humanity: a genetic one, an alteration of brain chemistry through technology, the addition of drugs, and some social engineering.
It's a monstrosity of reason. One of the earlier AIs was asked, "How do we end the sufferings of humankind?" and it replied, "End humankind." And that’s really the solution; that’s the end point of reason: Human nature is a problem.
Reason wants there to be a blank slate human being that can be redesigned from scratch. It hates the fact that there’s all this genetic baggage; that there's love and lust and anger and all these things that seem to be hardwired into us. Pure reason wants to start again: redesigning a blank-slate society with blank-slate humans.
3. Bargaining
When we think of Western individualism, we're basically talking about the last 500 years. To paraphrase Simone Weil, the Gregorian chant, Romanesque architecture, the Iliad, and the invention of geometry don't have authorship in the same way the novel does. Perhaps we've reached the terminal endpoint of individualism with people using AI to create novels or films for themselves only.
If everyone is using AI to write novels, these novels will never be read by anybody other than a tiny circle of friends. People won’t even have time to read those novels, so they'll get the AI to skim through them, write a synopsis, and then they'll get the AI to write a tweet saying how much they really enjoyed the novel that someone had sent them.
But vanity is seemingly much more appealing in our individualistic society than transcendence or quality. It's easy to imagine a future where we're all famous for 15 million bots.
I question whether we live in an individualist society. Certainly, the 80s and 90s were grounded in individualist capitalism with people like Madonna. But I've seen so many people who have succumbed to the temptation of non-individualism and statism.
But what do you think people want from art?
On X, I discovered that all the people who felt alienated from postmodern, politically correct art, which has been the dominant mode in the last 20 years, have gone into creating concept art. They create superheroes, dragons and all these kinds of highly skilled, artistic drawings. These people often started working in sectors like comics, illustration, or graphics. They are now the anti-AI army online.
You won't find many people in the postmodern art scene standing up against AI. They tend not to get involved in that kind of politics, but those who do drawings of dragons and robots and Lord of the Rings figures or whatever, who've made a living as concept artists, they are fighting passionately. I see them on my Twitter feed all the time. Some are trans. Some are Christian. Some are left-wing. They're all fighting against the demonetization of their art because they know that it's massively under threat from AI because AI has been hoovering up tons of this art.
What AI can't do is schmooze at a party with a curator. It can’t convince an administrator that their politics are the right ones to get funding. The art world is one of the last places where hierarchy exists. If you're represented by a big gallery, you're in the game. If not, you don't really exist. Success in the art world is based on the opinions of a tiny group of people. In some ways, that's a good thing because if those people have good taste then we could have good art. Should we avoid democratic art forms and return to a hierarchical mode of art production?
I don't know what a democratic art form would look like. I don't think it's ever existed.
The novel feels fairly democratic because you actually have to sell products to real people.
There are forms of music that are definitely democratic: things that stay away as much as possible from state control. Maybe the confusion happens around the word ‘democratic’ and our ideas about the state. The novel and music exist in the free market of ideas—or what's left of it—and they exist in niches that have their own little markets as well.
I take great inspiration from the way that music survived, the way that, for example, in the 90s and 2000s people were saying rock music was dead. Yet now there's a huge explosion of post-rock, post-metal, death metal, dark metal and black metal. And these are really niche phenomena that gather huge followings. I don't know what we gain by describing them in any way as democratic, though. Perhaps the idea of the demos is always going to be the enemy of art in a sense because it seems to imply a form of centralization at some point.
4. Depression
In the novel, Emma goes through a period of depression and despair, and then she discovers a TED Talk by a tech CEO who gives her some optimism about what technology is to come in the future. Reading your X account, it seems you think these AI people are frauds committing a Theranos-style con. However, the novel really comes alive when you describe this incredible brain-chipping technology. Do you think these kinds of technologies are coming?
I'm a strategic decelerationist. I'm doing my best to discourage the advance of AI because it's based on a myth anyway, which is Nick Land's notion of hyperstition, which says that if you believe in something enough, it will manifest. If you get enough people to believe in it and invest in it, then these new technologies you dream of will realize themselves. It's a sort of magical thinking that lies underneath the philosophy of AI investment. But you can also have a negative hyperstition where you can disillusion people and stop this horror from developing. I'm actively involved in that; trying to find the holes in the argument all the time.
My understanding of Nick Land is that he believes technology is inevitable rather than wish-fulfilment. There's something inhuman in capitalism that doesn't care about human wishes. I'm curious why you never take the critique back to capitalism itself.
We don't actually have an alternative to capitalism. We have a zombie left that was over as soon as the planned economy died as a concept. And we have an identitarian left that deals with some of the superficial effects of what was the grand left project. And we have bureaucratic statism, which was one of the emerging properties of the old left. But every functioning economy is at least a mixed economy or a capitalist economy. We have to deal with modifications rather than an alternative solution to capitalism.
Surely this failed planned economy would help decelerate everything!
If you look at the stagnation of the Soviet Union, you would be talking about deceleration to the point of 300 people queuing for a loaf of bread or 800 bureaucrats working on the calculations for who gets potatoes next year. Deceleration isn't an end in itself. We don't want to just destroy society. What we want to do is take the elements of hyperstition and speculative capitalism and try to fasten them back onto real, achievable, concrete goals so that we're not in this runaway accelerationist craziness where regulations just get crushed all the time.
Sam Altman is a big proponent of UBI, but it's interesting that there is so little talk about AI being used to create an efficient planned society.
Things have shifted. About 10 years ago, you had people in the Zeitgeist movement talking about a futuristic planned economy where every single item in the world is fed into a huge system which will then deliver value and goods to everyone on an egalitarian basis. But it's a monster of rationality: it's an imposition of a single universalist solution on a very diverse world ... like forcing everyone to live in modernist tower blocks.
5. Acceptance
You are one of the most active people I know on X at a time when all the virtuous people have departed to Bluesky. It seems to me that they are missing out on some of the most important conversations that are happening in the world right now. If you want to understand the emerging ideology that’s coming out of America with people like Marc Andreessen …
Marc Andreessen is a very dangerous person.
… you have to be on X.
I'm focused on AI. I see it as a challenge. I've discovered the mindset of the techno-optimists, the accelerationists. If you're curious as to who your enemy is, you want to try to engage with them as much as possible. You want to find out their language and decode it. I've discovered from talking to these people—and being hounded by them occasionally—that they're trying to fill the hole in their lives with an almost theological belief in the role of AI. It’s the new saviour for them, even though, technologically speaking, large language models are not going to lead to the AI God.
You have an incredible collection¹ of blurbs from film directors, psychogeographers, novelists, podcasters etc. It’s interesting to me that most of them are from outside Scotland. Do you feel more connected to a global view of the world than a local one?
I was bred into internationalism from being born in the late 1960s. Internationalism was always the goal of the left-wing mindset. Internationalism unites the human race and I was always trained to despise the parochial as an extension of corruption and small-mindedness. But, as I get older, I realize that internationalism is alienated and disembodied. As Theresa May said … and I don't quote her because I like her, but because it was a good sentence … she described the internationalist world and its inhabitants as citizens of nowhere. And I think since Tales from the Mall, I've been dealing with that sense of existing in a globalist nowhere.
My worry with the current situation, especially in Glasgow and the UK, is that we seem incapable of fixing anything. Sauchiehall Street has been a wreck for months, more than a year, because they discovered a pipe or something. There's a sense that nothing can be done because there'll always be an objection or an unintended consequence.
One day you wake up to the fact that Scotland is one of the most statist countries in Europe. It's extraordinary how uncapitalist it is and how many people are employed by the state and offshoots of the state. All the things that haunted former Communist countries haunt this place: the entropy, the stagnation, the over-regulation, the inability to get things done. If you try to fix something in your location, you're put on hold by a complaints department that doesn't actually exist. We've achieved a state of almost total stagnation.
It's manifest in things like the film industry. People who work in the film industry here have to leave because everything has become so tied to state funding. You have to go somewhere else to make a film like Anora. It won five Oscars and that was largely an autonomous individual director getting people together and pulling money from elsewhere. It’s an example of venture capital at its best.
Do you even need venture capital? Sean Baker's other film, Tangerine, was filmed on an iPhone 5s.
Exactly. Get up and do it yourself. Don't sit around and wait for the state to give you permission to make art. As an artist in Scotland, I see that people get stuck in the trap of waiting around for the government to sanction and fund the art they do. It just leads to stagnation.
Thank you, Ewan.
For Emma is published in the UK on 25 March by Leamington Books and in the US on 17 June by Arcade Publishing.
Check out Ewan’s Substack.
¹ Including Irvine Welsh, Terry Gilliam, John Banville, Atom Egoyan, Ian Rankin, Iain Sinclair, Lionel Shriver, Jonathan Ames, Mark Cousins, Kieran Hurley, and J.T. Leroy.
Excellent interview. I worry, though, that trying to "negative hyperstition" our way out of an AI future leaves all the reasonable people who believe in and are truly committed to the importance of the _human_ getting blindsided by the technology. Whether or not the AI boosters are promoting the tech out of some quasi-theological imperative deep in their subconscious (and I really agree that at least some of them are - Sam Altman freaks me the fuck out, I think he's very dangerous, Andreessen too), that has nothing to do with the technical questions at the heart of "how does this technology work, and how far can it go?" I think the scariest prospect is that AI does become extremely powerful and disruptive - not necessarily in the ways the AI boosters predict, all predictions are wrong and predictions from people with dodgy psychological motivations even more so, but in other ways - and the perspectives that really need to be at the heart of the conversation at that point get ignored, because all the "humanists" (for want of a better term) were predicting the tech would go nowhere, so the people in power decide they don't know what they're talking about and turn towards the hyper-utilitarians / Landians / etc.
I tend to believe that, as a matter of technical facts about the nature of the technology, continuing to pile money and computing power into AI _will_ turn it into something powerful and dangerous, risky for humanness in art and not just in art. But the people who have control over money and computing power are exactly the people who are least likely to be taken in by attempts to 'negative hyperstition' the culture, and are the most prone to boosterism. It sounds like Ewan has some scepticism towards the state, and that's a perspective I have a lot of sympathy towards. But I think trying to influence the state into caring about this technology and paying attention to it - and as a result, hopefully, regulating it - might (maybe, not certainly) be a better bet than trying to convince people it's going to fizzle out, and letting the future of the tech be dictated by the deep-pocketed boosters who are beyond convincing.
I don't know how much of the above is a disagreement with Ewan and how much is just verbal difference, but I wanted it out in the open at least.
Also, totally different point, but the "triune brain" just sounds like Plato's Republic in a trench coat. Threefold division of the mind into the lowest part driven by base appetites, then loyalty and courage and pride, and then the highest part engaged in abstract reasoning?
This was a fascinating read! I have been concerned about AI ever since it all started. And I, too, am afraid that it will reach a tipping point.
A huge part of me wants nothing to do with it, but I think the solution, or maybe a small part of the solution, lies in this:
“AI has limited uses. As long as we know what the limits are, it's fine.”
Nothing can ever be a substitute for a human brain.