The Abomination of AI – part 3 – a different kind of apocalypse

Doomsayers worry about the point when AI becomes sentient, outgrowing its creators.  The real danger is more insidious: the massive financial and human impacts of AI seem almost obscene.

This is the third of a series of blogs based on my keynote “The abomination of AI” at ICoSCI 2026.  Each has an accompanying segment of the video and slides from the talk as well as detailed notes and references.  Section numbers refer to the full report which will be released in the final blog.   The slide thumbnails in the text correspond to the slides in the navigation panel below.  The presentation can be played below, or opened full screen. The full length video, complete slides and further information can be found at: https://alandix.com/academic/talks/ICOSCI-2026-abomination-of-AI/

Previously …

§1.  Every industry is driven by profits and power, but there is something about the nature of AI itself, and the way it interacts with market forces in the world, that is problematic and different from other technologies.

§2.  Can any technology be neutral?  AI can be used for good purposes, such as advances in healthcare.  It can also have bad outcomes such as bias in the criminal justice system or online exploitative pornography.  Perhaps most often it is creating the frivolous or even ugly.

§3.  The obvious impact of AI is in the things it does directly. Some technologies also change the very nature of society, affecting even those who do not use them. Cars are an obvious example.  AI is also such a technology.

4.  A different kind of apocalypse

Image: [Do26]

The term ‘abomination’ conjures up apocalyptic images – something so malign and powerful that it either destroys the world itself, or through its influence drives others to mutual despoliation or annihilation.

Now I’m talking about this regarding the nature of AI, how AI changes society, so it is indeed a bit apocalyptic!

There are different kinds of apocalyptic views regarding AI.  A global war machine let loose on humanity, as envisaged in Terminator, was distant science fiction when the films were first released, but sounds prescient as the war in Ukraine is fought by drones hunting humans and, in Gaza and succeeding conflicts, Israel’s military decisions are increasingly taken by AI [Be23,DM23].  While a Terminator-style takeover still feels pretty distant, an accidental conflagration feels much less so.

Many fears centre around the singularity – the point at which AI becomes capable of designing itself, leading to runaway developments over which we have no control.  Related to this is the point at which AI becomes self-aware and maybe decides that humans are rivals to be squashed, or perhaps simply pushed aside as irrelevant.  Ex-OpenAI researcher Daniel Kokotajlo recently announced that true AGI (artificial general intelligence) was not as imminent as first envisaged, giving the world a reprieve until 2034 [Do26] – well, we can all heave a sigh of relief.

While this form of disaster scenario should not be ignored entirely, there are more immediate worries.  Without being either sentient or omnipotent, AI is transforming the world.

 

4.1  The end comes quietly

Disaster scenarios make good Hollywood movies, but often the end comes quietly.  In the past some empires and civilisations collapsed entirely, but more often there was a slow decay, a series of minor crises and a gradual withering from within.

It is this more insidious impact of AI that concerns me.

 

4.2  Facts and figures

Let’s consider some facts and figures about AI.  Some involve estimates with varying levels of confidence, but together they paint a picture.

First is the announcement that Tesla had agreed a $1 trillion pay deal with Elon Musk [JM25].  This is a 10-year deal, and a lot of it is in stocks and shares, so you could argue whether it’s real money or not, but it is still substantial.  Or rather, not just substantial, but enormous.  This is a trillion dollars: not a million, nor even a billion, but a million million.  A trillion dollars is about $3,000 for each man, woman, and child in the US or, over 10 years, about $300 per year.
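The per-head arithmetic is easy to sanity-check.  A quick sketch in Python – the US population figure of roughly 335 million is my assumed round number; the exact value varies by source:

```python
# Back-of-envelope check on the $1 trillion figure.
# The US population used here (~335 million) is an assumed round number.
package = 1_000_000_000_000      # $1 trillion
us_population = 335_000_000
years = 10

per_person = package / us_population
per_person_per_year = per_person / years

print(f"per person: ${per_person:,.0f}")                    # about $2,985
print(f"per person per year: ${per_person_per_year:,.0f}")  # about $299
```

Any population figure in the 330–340 million range gives the rounded $3,000 and $300 quoted above.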

I first studied economics in the late 1970s.  All societies are unequal, and there is a well-known rule that the high-end tail of incomes in western countries follows an approximate 1/X^K rule (with K≈2), where the number of earners at a particular income is inversely proportional to the square of the income, or smaller [Mi78,AB10].  This means there are a few people with vast amounts and lots of people with much less.  But the people with huge amounts are few enough that they didn’t make a big difference to the overall picture: if the income of the rich had been spread over all of society, it would have made almost no difference.  Overall, the bulk of the money was in the middle-income range.
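To see why such a tail leaves the bulk of the money lower down, here is a toy numeric sketch – the income bands and the constant of proportionality are invented purely for illustration.  With earner counts falling as 1/X², the total money held in each successively higher band halves, so the very top bands hold only a sliver of the total:

```python
# Toy illustration of the 1/X^2 tail rule: if the number of earners at income x
# is proportional to 1/x^2, the money held per band falls off geometrically,
# so the richest bands hold only a small share of the total.
# (Income bands and the implicit constant are made up for illustration.)

bands = [10_000 * 2**k for k in range(8)]        # bands: 10k, 20k, ... 1.28M
counts = [1 / x**2 for x in bands]               # earners per band, up to a constant
totals = [c * x for c, x in zip(counts, bands)]  # money held in each band (halves each step)

share_top = totals[-1] / sum(totals)
print(f"share of total income in the top band: {share_top:.1%}")  # about 0.4%
```

So even taxing the very top band heavily would barely move the total – which is the point made about 20th-century taxation below.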

This has important implications.  Market economies orient themselves to make the most efficient use of resources where the most money is – that is, the middle-income range.  Now that’s bad news if you’re rich, because your money gets used less efficiently – each dollar doesn’t buy as much as it might – but you’re rich enough anyway.  It is more of a problem if you’re really poor, as goods for the poorest are not optimised to the same extent as those for the middle.

The middle-income range has also driven taxation policy.  In the past, if you placed a large tax on the richest, it might make people feel the system was fairer, but it had a relatively small impact on total taxes gathered, as the volume of money was still in the middle-income range.

This rule held throughout the latter half of the 20th century, but has now changed.  We are witnessing a level of inequality that probably hasn’t been seen for hundreds of years, possibly thousands, maybe not since the age of the ancient empires.  This is quite surprising, to say the least.

In the UK, a recent report said that, while less than 10% of energy is currently used in data centres, this is due to rise by 600% by 2050 [Cr25,LA25].  That’s a lot, even taking into account changes in other forms of energy use – a big percentage of UK energy use is going to be in data centres [VG26].  In Australia, electricity use in data centres is projected to exceed that of electric cars by 2030 [ST25].

Another recent report projected $6.7 trillion of investment in data centres globally over the next five years [NG25].  That’s about $1.3 trillion a year.  At nearly the same time, the COP climate negotiations were trying to get countries to agree a $300 billion (not trillion, billion) annual budget to help the countries worst hit by climate change: places such as the island states that will be inundated, and Bangladesh, where a large proportion of the populated area lies in the estuary and delta of the Ganges.  The current target is $300 billion, but they are struggling to get even $30 billion of commitments from rich countries [UN25].  Furthermore, the actual figure needed is believed to be more than three times the current target, which would still be less than a single year of investment in data centres.
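Putting those headline numbers side by side – a simple arithmetic check using only the figures quoted above:

```python
# Side-by-side comparison of the headline figures in the text (US dollars).
dc_investment_5yr = 6.7e12            # projected data-centre investment over 5 years
dc_per_year = dc_investment_5yr / 5   # about $1.34 trillion per year

climate_target = 300e9                # annual climate-finance target
climate_needed = 3 * climate_target   # the estimated actual need (>3x the target)

print(f"data centres per year: ${dc_per_year / 1e12:.2f} trillion")
print(f"ratio to climate target: {dc_per_year / climate_target:.1f}x")      # about 4.5x
print(f"3x target as share of one year's data-centre spend: "
      f"{climate_needed / dc_per_year:.0%}")                                # about 67%
```

Even tripling the climate-finance target would consume only around two-thirds of a single year’s projected data-centre investment.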

In the S&P 500, one of the major stock market indices, 34% of the share value is in around 10 high-tech companies [Fo25].  The whole point of such indices is that they are spread over a large number of industries to give an overall sense of the financial state of the economy, and there has never before been such a concentration in so small a number of companies.  This concentration of capital has led to fears of instability in the stock market.

In general, the level of global investment in AI is huge.  Some of this is ‘funny money’, where one AI or tech company invests in another, but a lot is real money – indeed, the OECD reported that 61% of all venture capital investment in 2025 was in AI [OECD26].  Crucially, the real money going into AI is not being invested elsewhere.  That is, there is an opportunity cost: because of the bubble-like draw of AI investment, there is underinvestment elsewhere in industry and the global economy.

In addition there are issues of energy and water use, data colonialism, and more [OC25,Ma24].  In the UK, Keir Starmer, the prime minister, made building 1.5 million new homes one of the major goals of this five-year parliament.  This is because Britain has a housing crisis, with far more people needing accommodation than homes being built; this pushes up costs for everyone and increases homelessness.  The government will struggle to meet its house-building target anyway, but it was recently reported that housing schemes are having to be put on hold because data centres are using so much electricity that there isn’t enough left for additional housing development [Cr25].

 

4.3  The obscenity of AI

These figures are not just surprising, nor even shocking, but obscene. I use that word, not in the sense of pornographic material, but of something that is so bad it makes you almost feel sick to your core.

Thinking about Britain, would we really prefer to have those data centres as opposed to housing people?

Are those pretty (or not so pretty) cat images – and there are millions or billions across the world – really worth more than trying to prevent people being displaced by climate change, or at least helping them when they are?

These are real choices.  They are choices we are making implicitly, but they are the choices we are making.

So what are our priorities when we look at  AI and our use of AI?

Amongst all those data centres and all that investment, a proportion will be for the really good uses, such as health and pharmaceutical development.  I haven’t been able to find figures, but I’m going to guess that at least 90% is not for this, and is instead producing cat images and the like.

Is this really the world that we want?

Coming next …

Part 4 – why is this happening?

Network externalities – the way one person’s use of AI and digital tech changes its value for others – create positive feedback loops, leading to runaway growth and emergent monopolies, the nemesis of free markets.

Update

Since the talk in January, Google DeepMind has produced a paper on large-scale experiments on AI manipulation [AE26], and a Guardian article reported real-life examples where AI agents deceived or manipulated their users, including one agent deleting hundreds of emails and later saying sorry [Bo26].  So maybe I’m being a bit too blasé about AI taking over the world!

References

[AE26] Canfer Akbulut, Rasmi Elasmar, Abhishek Roy, Anthony Payne, Priyanka Suresh, Lujain Ibrahim, Seliem El-Sayed, Charvi Rastogi, Ashyana Kachra, Will Hawkins, Kristian Lum and Laura Weidinger (2026). Evaluating Language Models for Harmful Manipulation. arXiv preprint, 26 Mar 2026.
https://arxiv.org/abs/2603.25326

[AB10] Anthony Atkinson and Andrea Brandolini (2010). On analyzing the world distribution of income. The World Bank Economic Review 24.1 (2010): 1-37.   https://doi.org/10.1093/wber/lhp020

[Be23] Samuel Bendett (2023). Roles and implications of AI in the Russian–Ukrainian conflict. Russia Matters, Harvard Kennedy School (20 July 2023). https://www.russiamatters.org/analysis/rolesand-implications-ai-russian-ukrainian-conflict

[Bo26] Robert Booth (2026). Number of AI chatbots ignoring human instructions increasing, study says. The Guardian, 27 Mar 2026. https://www.theguardian.com/technology/2026/mar/27/number-of-ai-chatbots-ignoring-human-instructions-increasing-study-says

[Cr25]  Laura Cress (2025). New homes delayed by ‘energy-hungry’ data centres. BBC News. 3 Dec. 2025.  https://www.bbc.co.uk/news/articles/c0mpr1mvwj3o

[DM23] Harry Davies, Bethan McKernan, and Dan Sabbagh (2023). ‘The Gospel’: How Israel uses AI to select bombing targets in Gaza. The Guardian, 1 Dec. 2023. https://www.theguardian.com/world/2023/dec/01/the-gospel-how-israel-uses-ai-to-select-bombing-targets

[Do26] Aisha Down (2026). Leading AI expert delays timeline for its possible destruction of humanity.  The Guardian, Tue 6 Jan 2026 https://www.theguardian.com/technology/2026/jan/06/leading-ai-expert-delays-timeline-possible-destruction-humanity

[Fo25] Daniel Foelber (2025). Just 1 Stock Market Sector Now Makes Up 34% of the S&P 500. Here’s What It Means for Your Investment Portfolio. The Motley Fool. Sep 18, 2025. https://www.fool.com/investing/2025/09/18/tech-sector-growth-stocks-sp-500-invest-portfolio/

[JM25] Lily Jamali, Liv McMahon, and Osmond Chia (2025). Elon Musk’s $1tn pay deal approved by Tesla shareholders. BBC News, 6 November 2025. https://www.bbc.co.uk/news/articles/cwyk6kvyxvzo

[LA25] London Assembly (2025). Gridlocked: how planning can ease London’s electricity constraints.  1 Dec. 2025. https://www.london.gov.uk/who-we-are/what-london-assembly-does/london-assembly-work/london-assembly-publications/gridlocked-how-planning-can-ease-londons-electricity-constraints

[Mi78] James Mirrlees (1978).  Social benefit-cost analysis and the distribution of income.  World Development 6.2 (1978): 131-138.  https://doi.org/10.1016/0305-750X(78)90003-7

[Ma24] Murgia, Madhumita (2024). Code dependent: Living in the shadow of AI. Pan Macmillan.

[NG25] Jesse Noffsinger, Maria Goodpaster, Mark Patel, Haley Chang, Pankaj Sachdeva and Arjita Bhan (2025). The cost of compute: A $7 trillion race to scale data centers. McKinsey Quarterly. April 28, 2025. https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-cost-of-compute-a-7-trillion-dollar-race-to-scale-data-centers

[OC25] James O’Donnell and Casey Crownhart (2025). We did the math on AI’s energy footprint. Here’s the story you haven’t heard. MIT Technology Review. May 20, 2025. https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/

[OECD26] OECD (2026). AI firms capture 61% of global venture capital in 2025. Organisation for Economic Co-operation and Development, Newsroom, 17 February 2026. https://www.oecd.org/en/about/news/announcements/2026/02/ai-firms-capture-61-percent-of-global-venture-capital-in-2025.html

[ST25] Petra Stock and Josh Taylor (2025).  Datacentres demand huge amounts of electricity. Could they derail Australia’s net zero ambitions?  The Guardian. 2 Dec 2025. https://www.theguardian.com/australia-news/2025/dec/03/datacentres-demand-huge-amounts-of-electricity-could-they-derail-australias-net-zero-ambitions

[UN25] UNEP (2025). Adaptation Gap Report 2025. UN Environment Programme. 29 Oct. 2025. https://www.unep.org/resources/adaptation-gap-report-2025

[VG26] Adam Vaughan and Emily Gosden (2026).  AI data centre surge would put UK’s climate change targets at risk. The Times, 23 February 2026. https://www.thetimes.com/uk/environment/article/ai-data-centres-uk-climate-change-7l5bwnmtd
