Why I'm Not Suing Anthropic

ducktracy.jpg

Figure 1: "When plunder becomes a way of life for a group of men in a society, over the course of time they create for themselves a legal system that authorizes it and a moral code that glorifies it" – Frédéric Bastiat

Should I sue Anthropic? Some friends and colleagues - fellow authors and PhD types - are suing them. Just before Christmas came the letters from publishers trying to round up a class-action posse. My mate Daniel got the same letter regarding a book he wrote almost 20 years ago, a book about using Free Open Source Software. Part of me was intrigued by the prospect of a promised 3,000 dollars. After a moment's reflection I thought, "Let it go". That sort of thing absolutely isn't my style.

The problem is that settling with "AI" companies who thought they'd just nick my stuff, legitimises their actions. As the law stands today - or rather fails to stand - I can't condone that. Even if it put money in my pocket it would be harming and betraying other creators. The "AI" crusaders already factored a couple of trillion in slappy-wristies into their plan.

There's a principle here. Money won't fix it. If you dishonour someone, rape them, take their dignity, throwing some dirty cash on the bed as you swagger off only adds insult to injury. Everyone who made it through the first hour of jurisprudence class knows this; sometimes sorry doesn't cut it and you better brace for vengeance.

Now, as Rob Pike nobly wrote a few weeks ago, these fist-magnets pissed off a lot of others. For my tuppence - all of US Tech, the horse they rode in on, the ground it shits on and its eternal equine soul - can munch on my British beef banger.

Anger is not only justified, it's a credit to human beings still able to feel and express it - unlike the emotional stiffs it's directed at. The anger is not about money. After 80 million people died in a war for democracy, it's a tad annoying to see it kicked aside for the amusement of less than a hundred tech demons with a bizarre fetish for "convenience and efficiency". They defecate all over a five thousand year run of science and culture.

There are a number of factors at play. Together they make Anthropic and the other "AI" companies the worst kind of crims. Let's assemble the case…

Today Bruce Schneier and J. B. Branch reminded me these are the same people who sent Aaron Swartz to his grave for much less. Different clothes, different names, same money, same villains. Whether you call them publishers, Big Tech or "AI" companies it's a familiar game of domination; controlling other people's ideas. They want nothing less than to control science, technology, art and culture. But while Schneier and Branch see a fair likeness between "AI" company copyright infringements and those of Swartz, they miss some important differences that are even more damning.

The Internet's own scientist

What did Swartz do? He took scientific ideas that belong to the people, because they were funded with public money, and because the scientists who published them had intended their work to be widely read and studied.

We call that "science".

He took them, in the context of an education establishment, MIT no less, and tried to give them to others… so that they might, you know… cure cancer, build a better solar panel… evil stuff like that. What can only be described as a lynch mob of US officials, including the FBI and MIT, hounded Swartz to suicide.

aaron-swartz.jpg

Figure 2: "Consider the effects, consider the alternatives, but most importantly, just think." – Aaron Swartz

Here's a fact that will shock some people. There are people in this world who actually want you to take their ideas, for you to study them, replicate, copy, spread those ideas. Only by some weird, twisted and deformed interpretation of "intellectual property" - the insane conceit that someone could control and own someone else's idea - did we get into this awful mess.

Intellectual property (in particular Copyright) serves capitalism very well because it can be transferred (bought and sold). Printing presses then digital systems brought cheap replication which meant there was no way to monetise intellectual works without contriving artificial scarcity. And since authors and artists don't want obscurity and scarcity, those who exploit and control them had a problem.

But there is more to it than greed. A brilliant lawyer and scholar called Lawrence Lessig is a sound authority on this. Many years ago I read his books. Later his ideas (and those of others) grew into the Creative Commons. It separately addresses several sociological/psychological factors:

  • Creators want recompense. Recognition for our intellectual labour. When we cannot get it, but someone else profits from our work, that is an insult and a harm. That may seem resentful, but that's human nature.
  • Also humans are egotistical. We want people to associate our ideas with us. We want attribution. Without it we call that plagiarism. Having someone else take your work and put their name on it is a knife in the gut - as everyone who's worked as an artist or in academia knows.
  • We want our work respected. We don't want others to misrepresent the intent or values of art. It must have its integrity preserved.
  • Some people get off on controlling the ideas of others, by censorship, suppression, selecting 'winners' according to their own arbitrary criteria and control of channels. Indeed the basis for copyright, the Statute of Anne, has its origins in a prior Censorship Act. In many ways Copyright is Censorship in disguise. Copyright law softens control of dissemination by pretending it's all about "protecting authors".

In a nutshell, today's "intellectual property" law favours big business and harms the individual creator. It patches up the worst aspects of human nature and tries to keep a lid on many tensions. Its basis and stated aims are dishonest. In trying to achieve them it stands as an obstacle to us realising the best of our naturally creative and sharing side.

There is a good case emerging for the total abolition of intellectual property law. Indeed that would greatly accelerate human progress. It would certainly please Big "AI" companies. They would use the same moral argument as I made above to say: "Copyright law stands in the way of progress" - where 'progress' is their right to take whatever they please. By the same token you and I should be able to download any movie or book we like and sell it, right? Fair is fair.

Here's what may surprise you. I've no problem with an actual OPEN "AI" project taking my work as training data. As a computer guy I love the idea of "AI". I worked on it in my younger days (Boltzmann machines, fuzzy logic and expert systems - so I'm as guilty as the next man). "AI" might amplify, integrate and synthesise my research and original ideas alongside other scientists. Hoorah! For it to be made freely available to everyone for all time is exciting and surely positive.

That's not what's happening. That's not what's going to happen. "AI" is bending us over and taking everyone up the wrong'un. It's a scam. A ruse. A flim-flam. "AI" is a bunch of loaded capitalists realising that we've passed "peak tech", worrying about their money, and conjuring up another hype cycle to keep the tech industry on its feet.

We're supposedly entering a world where there are no creative jobs or livelihoods to be had. So creators have no further stake in defending copyright or patents. Is that a road we want to go down?

One of the hardest things for me to admit as a computer scientist is that the world just isn't ready for "AI". We can't even transfer a file from one device to the next properly. There's so much basic stuff to be done getting ordinary computing to even be good enough.

"AI" is a cybersecurity disaster. It will set us back decades. Chat is for smartphone zombies to amuse themselves to death. It makes them go crazier, get stupider and more lost in their noo-noo comfort world. It's a damn shame society has nothing better for most people to do. That's a failure of purpose, a vacuum that mindless technology rushes in to fill. Under current political and economic structures it's poison to everything we hold dear, a distraction from everything we should be working on to resolve poverty and climate ruin. Maybe in the next century, we'll have a real reason for "AI", if we're still here and have got rid of all the megalomaniac tech nob-jockeys.

"AI" is solutionism because it tries to make things we shouldn't need to do anyway more efficient. Wrong! An even more 'efficient path to efficiency' is to stop doing bloody stupid things in the first place.

The "AI" industry is filled with liars. No more proof of bad faith is needed. OpenAI moved the goal posts and shed their roots as an ethical non-profit right out of the gate. In absolutely unapologetic Darth Vader style they "altered the bargain", and "open AI" became "closed for private profit AI". If that's their opening gambit what else are you going to trust them and their dodgy Epstein Island paedo mates with? Your kids?

vader.jpg

Figure 3: "I am altering the bargain. Pray I don't alter it any further." – Lord Vader

"AI" does not faithfully reproduce my work. It bastardises it. It makes colossal mistakes. It gets fundamental ideas arse-backward. It misattributes and credits other people with my ideas. It's a chaos machine.

"AI" does not make my work available freely to everyone. It imprisons it deeper within a proprietary labyrinth of unfathomable, inscrutable matrices of floats, guarded by obscurity and the attack dog of the world's most disgraceful law, the DMCA. Yarbles to your digital millennium. The reality of "AI" is bollocks and the people behind it are total spanners. It's corrosive to science, education, mental health and human progress.

"AI" threatens peace and stability everywhere. Prominent people who should know better are talking about bombing data centres. Where's the moderate voice of Ted Kaczynski when you need him? If "AI" were any other kind of phenomenon governments around the world would be gathering to enact sweeping laws to contain and destroy it, eliminate its sources, call it "terrorism", "economic warfare" or a natural disaster.

Bruce Schneier doesn't quite get around to saying it as Agent Smith would say; "It's the smell…" It's the rank, foetid hypocrisy of civilisation's rotting corpse as rules break down and corporations do as they please. They ruin society by making us stupid and dependent. They destroy the planet. Impotent, Law stands aside playing pocket billiards.

Why is the law so weak? Is it just because a trillion dollar company is a tougher opponent than a young student? Or because one law applies to the rich and another to the poor? Or is it because there simply isn't a law that even covers the harms "AI" companies are wreaking or enough prisons to hold all the executives?

Reckoning up the hurt

So here's a taxonomy of IP harms, admittedly one I just pulled out of my backside. It's a scale, or circles of Hell, in which the actions get more egregious and objectionable. With it we can compare the acts of Aaron Swartz (or SciHub, ZLib, Anna's Archive) with those of Anthropic and the like.

pain.jpg

Figure 4: "A computer shall not harm your work or, through inaction, allow your work to come to harm." – Jef Raskin, The Humane Interface

With respect to what I'll just call Lessig's criteria we need to consider the following factors:

  • Amount: How much of a work is taken? A few words? A single video frame? A paragraph? Or the whole book, song or film?
  • Attribution: Is the creator credited? Fully referenced in academic style? Is she simply omitted? Or does a new creator lay claim to the work?
  • Accuracy: Is the work faithfully copied? Is it strewn with errors? Is it seriously and deliberately misquoted?
  • Intent: Is the work copied respectfully as part of another work? Is it fair use for critique or parody? Is it libellously used to attack the original author?
  • Gain and loss: Does it help the author sell copies or earn royalties? Does it directly cause loss or harm opportunity? Does it directly profit the person making the copy?
  • Context: Like intent, context is everything. Is the work copied to save a life, or to further enrich a criminal oligarch? It's a broad moral qualifier.
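Being a computer guy, I can't resist sketching the criteria above as a toy scoring function. To be clear: every field name, weight and score here is my own illustrative assumption, nothing from Lessig and certainly nothing from law, just a way of making the taxonomy concrete.

```python
from dataclasses import dataclass

@dataclass
class CopyAct:
    fraction_copied: float   # 0.0 (a few words) .. 1.0 (the whole work)
    attributed: bool         # is the creator credited?
    accurate: bool           # is the copy faithful, or strewn with errors?
    respectful: bool         # homage / fair use vs. disparagement
    copier_profits: bool     # does the copier gain financially?
    author_loses: bool       # lost sales, royalties or reputation?

def harm_score(act: CopyAct) -> int:
    """Rough ordinal harm score: higher is worse. Weights are arbitrary."""
    score = 0
    if act.fraction_copied >= 1.0:
        score += 1   # whole work taken
    if not act.attributed:
        score += 2   # plagiarism cuts deep
    if not act.accurate:
        score += 1   # mangled, misquoted copy
    if not act.respectful:
        score += 1   # beyond fair critique
    if act.copier_profits:
        score += 2   # commercial piracy
    if act.author_loses:
        score += 2   # real loss of money or reputation
    return score

# Aaron Swartz: whole works, attributed, accurate, respectful,
# no profit to him, no real loss to the authors.
swartz = CopyAct(1.0, True, True, True, False, False)

# Big "AI": everything, stripped of attribution, mangled,
# sold for profit, at the creators' expense.
big_ai = CopyAct(1.0, False, False, False, True, True)

print(harm_score(swartz), harm_score(big_ai))  # → 1 9
```

Crude as it is, the sketch makes the point of this section: on every axis the two cases sit at opposite ends.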

Harm Level 1

The amount of the work copied is small. It is accurate and correct. It is properly, generously attributed. Intent is flattering and in proper context. An example would be a literary quote in academic work, a sound sample of a single instrument, scratch or breakbeat used in hiphop. It is a homage with artistic merit and cultural relevance. Harm is negative in that fair use amplifies and honours the original.

Harm Level 2

Someone copies all of the work. However it is properly attributed. The act is a form of flattery with intent to amplify the work and extol the creator. There is no direct financial loss to the author, and crucially there is no gain to the publisher. An example might be broadcast of music on non-commercial pirate radio, or pop music used by kids to make a dance routine uploaded to YouTube. Another would be Aaron Swartz collecting scientific papers to help students.

Harm Level 3

This is the same as level 1, but although accurate and attributed the context is unflattering. This is the nature of fair critique so long as the fair use doesn't stray into libel. There's no financial loss or gain for either party. An example could be a podcast host or political satire show including footage from another show in order to rebut it.

Harm Level 4

Again only a partial copy is made, but it is misattributed, misquoted, or disparaging. It goes beyond fair critique into the realms of trolling, academic misconduct or simply bad journalism. No significant financial loss or gain to either party, but there's a reputation loss to the creator.

Harm Level 5

Like level 2, the work is copied in its entirety, properly attributed, and the intent is to amplify the creator. However there is real or potential loss to the author through lost sales or web impressions even if there is no gain to the publisher. An example might be those games bundles that include other developers' work without any permission.

Harm Level 6

This expands on level 5 but the publisher has a clear intent to make money whether or not that also causes corresponding loss to the author. It is an example of what most realistic thinkers consider "commercial piracy".

Harm Level 7

Commercial piracy where all of the work is faithfully copied for gain or sold for profit, but with the further harm that the work is misattributed, causing an additional reputation or opportunity loss to the author. An example might be the "Dodgy Dossier" plagiarised by the UK Labour government from a political science student and used as the faulty basis for starting a war with Iraq.

Harm Level 8

This miscreant copies all of the work, misattributes it, misquotes it, and does so for money, while causing the author actual loss of money and reputation. An example is the time a fraudulent professor at Queen Mary University commissioned me to give a course, then turned the contents of my website into PowerPoint slides, removed my name, and passed off an error-filled version as their own lecture.

Harm Level 9

What Anthropic and OpenAI do takes the biscuit; they steal people's stuff, not just one person's stuff but everyone's stuff, in entirety, after you told them not to, remove attribution, mash it up and cook it, fill it with mistakes, pass it off as their own fuck-witted "Claude" or "ChatGPT" or whatever, for money, all while causing loss of livelihood to the creator, and all other creators, while simultaneously damaging the environment and the political and economic stability of the world.

Implications

Hopefully Bruce and J.B. can see where the comparison of Swartz and BigTech is weak, even though their remarks on astonishing hypocrisy stand strong.

Can the conceit of intellectual property survive this era? If the Law will not defend it, Big Tech gets to scoop up everything you create. It can take your poems, essays, music, videos and life data to use as its own. By the same token you get to download every movie, album, game and ebook, without let or hindrance and a clear conscience. Hooray. "Piracy" is to be legalised on both sides?

"AI" means the end of intellectual property?

Fat chance! It'll mean the end of your property. Period.

Personally I think that would be a very bad deal for people, for culture, arts and science, and a good deal for predatory business. We'd get to keep all the Hollywood movies made up to 2026 (there won't be any more). They'll get to subvert democracy and control our lives forever.

I'm not suing Anthropic. That will fix nothing and lend credence to their crimes. Forget "intellectual property". Intellectual dignity cannot be protected by damages and a circus of misdirection, pantomime hearings in clown courts, while we pretend we've still even the flimsiest grip on any idea of 'justice'. The Law needs to step up and grow a pair. Money won't fix anything. Big Tech must simply be stopped. Hopefully vast swathes of it will be eliminated when the "AI" bubble collapses. And don't let the door hit your massive ugly bum on the way out, "AI" companies.

Copyright © Cyber Show (C|S), 2026. All Rights Reserved. Site design 2023-2025: Ed. Nevard


Date: 2026-01-20 Tue 00:00

Author: Dr. Andy Farnell

Created: 2026-01-22 Thu 09:06
