Art or wank?


Figure 1: "Creativity takes courage" – Henri Matisse

When someone recently described "vibe coding" as throwing spaghetti at the wall, I had the feeling Art History might have something to tell us about what is going on in computing. Is "AI" the Modern Art of computing? Now, I'm no scholar of Art History (as I'll no doubt demonstrate), but we all know the familiar periods:

Depictive, explanatory and narrative art. That describes the Primitive, Byzantine and Romanesque periods. Seen simply, Art evolved through many centuries, culminating in Realism, an attempt to objectively depict texture, light and perspective. Then came a period of playfulness: Expressionism, Futurism, Surrealism, where the actual visual subject matter receded in importance. In its place, technique, the emotion of the artist, social commentary and the act of artistry itself took centre stage.

Computing, as it exists publicly and politically, also has a history. Its periods and movements have their own sequence and logic, but let's just call them "Classical" and "Modern" for now. One has not yet replaced the other; presently they exist side by side as distinct but diverging spheres of computing.

Theoretical computer science comes out of mathematics and predates any actual computing devices, even those of Babbage and Lovelace. The engineering precursors, in mechanical clocks, Jacquard looms and pianolas, hardly seem more than a serendipitous confluence of progress until Shockley's transistor and the fully digital electronic age.

What came about in that period, and connects all of Classical Computing, was an emphasis on numerical methods, correctness, formal understanding, and the closely related concepts of accuracy and precision. By contrast, "Modern Computing" is about "almost good-enough magic".

Modern computing, by which I mean trends in "AI", centralised data processing, large data sets and statistical methods, has a wholly different flavour. Moreover, just as Art movements changed radically with culture and shifting power, modern computing embodies an entirely different socio-political agenda.

Here's an easy, obvious and verifiable observation: Classical computing aims to save time. Modern computing aims to waste it.

Of course, if you ask anyone invested (financially, professionally, emotionally) in "AI" they'll dispute this, and possibly take it as an insult rather than a strange but accurate ground truth. Here, stated and actual aims are at odds. Much ado is made about 'efficiency', especially by hit-squads of political ideologues in the USA. With great irony, "AI" is a set of brute-force methods, which in theoretical computer science are considered the ugliest and least efficient of all approaches, as the toy contrast below illustrates.
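For the sceptical, here is a minimal sketch of my own (the function and all numbers are invented purely for illustration) contrasting brute force with a classical method that exploits structure. Both locate the root of the same function; one burns a million evaluations, the other a few dozen.

```python
# Toy contrast: brute-force search versus a classical structured method.
# Task: find the root of f near sqrt(2). Values are illustrative only.

def f(x):
    return x * x - 2.0

def brute_force_root(lo, hi, steps=1_000_000):
    """Scan a fine grid and keep the candidate with the smallest |f(x)|."""
    best_x, best_err = lo, abs(f(lo))
    step = (hi - lo) / steps
    for i in range(steps + 1):            # about one million evaluations
        x = lo + i * step
        err = abs(f(x))
        if err < best_err:
            best_x, best_err = x, err
    return best_x

def bisection_root(lo, hi, iters=50):
    """Exploit monotonicity: halve the interval containing the sign change."""
    for _ in range(iters):                # about fifty evaluations, ~15 digits
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

print(bisection_root(0.0, 2.0))       # 1.4142135623..., 50 steps
print(brute_force_root(0.0, 2.0))     # similar answer, 1,000,000 steps
```

The answers agree; the work differs by four orders of magnitude. Scale that gap up and it is the difference between a laptop and a power station.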

"AI" is the point where granny stopped reasoning about why the TV set is not working and started bashing it on the side.

Classical computing is about the task. Modern computing is about the technology. Along the way computing became involuted and self-regarding. Analogously, for the first 10 years of mobile phones, most of the conversations on mobile phones were about mobile phones. "Can you hear me now?", "I'm on the train", "Yes, I've got a new Nokia"… etc. Today most of what concerns us online is "Online": "The Internet is dying", "Bots ate my job". The rest (about half of all traffic now) actually is bots trying to scope out the networks and discover what people are talking about, or to influence them to think about and buy… technology.

Classical computing aims to model or reveal the world. Modern computing aims to be the world. Moreover, modern computing centres on the "consumer" or "user" more than on any task. Its aim is to capture, to engage, or rather to keep the user stuck, addicted, in order to extract behavioural data and sell advertising influence. Zuckerberg's failed Metaverse project is the paradigmatic fly-paper: the idea of being in the metaverse always outsold any actual benefit of being there.

Classical computing is the bedrock of science. It is continuous with the scientific method, a logic requiring reproducibility. Much of Classical computing is about modelling. We've created ever more accurate representations of things like weather systems, the astrophysics of planets and stars, chemical processes and biological systems like proteins. This was driven by steady progress in bit depth, microprocessor ALU design and so on. This was our period of "Realism". Indeed, in the 3D "virtual reality" world of games, realism became an overarching desirable quality, which more or less led us to being able to simulate the sound, texture and physics of entire cities or even planets.
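As a tiny sketch of why that progress in bit depth mattered (the numbers are hypothetical, chosen only to expose the effect): accumulate the same ten million small time steps at two precisions and watch the lower one drift.

```python
import numpy as np

# A long-running simulation reduced to its essence: summing ten million
# tiny time steps whose exact total is 1.0. (Illustrative values only.)
steps, dt = 10_000_000, 1e-7

acc32 = np.cumsum(np.full(steps, dt, dtype=np.float32))[-1]  # naive running sum
acc64 = np.cumsum(np.full(steps, dt, dtype=np.float64))[-1]

print(acc32)  # drifts visibly from 1.0: float32 rounding accumulates
print(acc64)  # correct to many more digits
```

Keeping that drift bounded, analysed and provable is exactly the Classical discipline of numerical methods.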

Modern "AI" processors using graphics, tensor or neural processing units do vector-optimised linear algebra on lots and lots of floating point numbers. While accuracy is still important internally for each operation, the overall goal is to achieve what is really a "perceptually passable facsimile" of correct output. In other words, where simulated neural computing meets psychology, the driving goal has subtly shifted to convincing the user that the output is okay, regardless of any objective measure of correctness. While this trend started as far back as perceptual coding in MP3 music algorithms, its inevitable path is to producing perceptual effects that exploit gaps in human acuity so as to feel real or "good enough" regardless of underlying correctness. This has very close connections to "magic", both as preformative stage magic and a deeper form of psychological influence.

Generative diffusion models create what looks and feels a lot like images.

Generative language models create what seems a lot like coherent prose or instructions.

At the root of this thinking is what may be the greatest accidental disservice to computing, by none other than Alan Turing. The "Turing Test" has roots in the idealist/perceptual philosophy of Hume, Berkeley and Leibniz with regard to the Identity of Indiscernibles, but is also influenced by the context of counter-intelligence and deception. "If X is indistinguishable from something, it is that thing."

This is not only dubious but dangerous. For example, we have a name for something which produces the apparent effects of a drug but is not that drug: a placebo. In computing, Searle's Chinese Room is an argument that at least attempts to separate imitation from emulation. Throughout philosophy and science there are exhortations against conflating appearance and actuality. Perhaps most interestingly for this discussion, Modernism is the socially progressive urge to reject the nomos (in the sociological sense, the widely accepted perception of things) in favour of a 'new perception' that works just as well as, but differently from, the customary.

At best this is the very essence of innovation and creativity. At its worst it represents a break with reality and is psychotic.

Yet behind all such output the word "Artificial" is always present. Artifice means deception or trickery; its etymological root is shared with "Art". This underscores the shift to a modernist art framing: at best expressionist, but often surreal and dadaist in the form of incongruous hallucinations.

In art, Realism reached a similar peak. There was only so much more you could do with oil paint, and then photography arrived. Frustration at reaching limits often sows the seeds of a reactionary movement.

If computing ever had a Pop movement it was the DotCom times. The roughly fifteen years from the Eternal September to 2008 were glorious fun and games. Mostly this was about "The Web". About accessibility. It was the truly social period of the Internet that preceded the descent into the cloistered "social network" qua "walled garden" or "platform".

I see "AI" as the Modern Art moment for computing. We've reached the point of throwing bicycles and naked ladies covered in paint at a canvas (Action Painting). Is it art or is it wank? Nobody is really sure.

In fact, that "unsureness" is part of the shtick. The "Art World" is relatable as a strange place where mysterious "influencers" (collectors, patrons, museums, dealers, critics…) imbue artefacts with value. Essentially that value is arbitrary, and those usually wealthy and influential people are not themselves artists. Indeed they rarely experience much of artistic life at all. What they buy and sell is the struggle of others: culminations of life's work, traumas, years of study in poverty and so on.

Obviously this makes for a massively distorted market. Indeed it isn't a "market" at all by the rules of economists. It's a way for capitalists to respond to abundance. For entrenched power, DotCom was a crisis of democracy.

Within computing a similar schism has opened. The "artists" are the engineers, developers and computer scientists. We're the people who actually know how to build things and how they work. We're dying out, not ironically, along with the actual artists, who are now referred to as "content creators". The ascendancy of the Big Tech class, with its obscene concentrations of capital and corporate power, has made the actual programmer, qua artist, redundant - at least within the emerging movement of computing.

Like Abstract Expressionism, "AI" is unlikely to last in anything like its current paradigm, if only for the simple reason that any brute-force venture must exhaust the entire energy and financial resources of the planet to continue. If it 'succeeds' it will destroy the economy which creates the consumers who might buy anything, and it will most certainly destroy its own knowledge foundation: the education of, and motives for, any next generation of computer engineers (artists) to sustain it.

The questions are simply when the bubble will pop, what it will cost, and what will be viable as the next movement to build out of the wreckage.

It is tempting to say that we might benefit from looking at Art movements to see where computing might go next, but I feel that here our allegory must end. Is a post-modern reaction to modernism meaningful in this context?

Clearly we are in a process of trying to make the feeling of being correct somehow equivalent to actually being correct.

Notwithstanding the edge cases of justified true belief found in the philosophical field of epistemology, it's hard to see how this would work with GPS coordinates, airliners or missiles, unless accidentally bombing a primary school becomes politically and militarily equivalent to hitting an enemy base. How far will humans be willing to distort reality and abandon concepts of truth to fit the Procrustean frame of "AI"?

I have little doubt that the catastrophes soon to unfold in software engineering and cybersecurity, and the major disasters in medicine, aviation, critical infrastructure and social stability resulting from "AI", will eventually cause a backlash toward correctness and "classical" methods. A huge backlog of vibe-coded spaghetti is going to need mopping up, but there will be no skilled operators left, let alone any willing, even under conditions of desperate penury, to do the work. It is possible "AI" provokes a "winter" or "dark age" of computing, to be followed by a Renaissance of sorts.

Don't throw away your copies of Donald Knuth's The Art of Computer Programming just yet.




Date: 2026-03-21 Sat 00:00

Author: Dr. Andy Farnell
