Now that NFTs have finally died and we don’t have to look at those ugly goddamn monkeys any more, those very same tech bros are coming at us with something even more stupid: AI writing.
Because, of course, we are constructing a world in which machines get to do all the wonderful things, like art and creating, and humans get to do <checks notes> housework and taxes. Cool cool cool.
Blah blah disruptive tech. Blah blah innovation. Blah blah synergy. Here’s a quick (and extremely biased) WHQ rundown of everything you need to know about artificial intelligence and whether it’s coming for your writing.
I need a quick primer – what’s going on?
As soon as ChatGPT was released, everyone decried the death of the author. Not a Barthesian death. A capitalist death. The horror. Why would we need to hire writers when we can just get our machine slaves/overlords to spit it out instead? But it quickly became apparent that maybe this writerless utopia wasn’t quite so close.
Jo-ann Fortune, director of content at Brighton-based content agency iCrossing, wrote about her experience trying out ChatGPT and came to the conclusion that it’s just not very good. “Writers will always need experienced editors to refine the content, structure and tone of their work, as well as to fact-check and proof. AI-written content even more so – there’s no cost saving there. Editors also guard a brief and translate vague client feedback into actionable amends. It’s a human-to-human job.”
Despite that, in September ‘23, tech blog Gizmodo laid off the writers on its Spanish-language website and opted to use AI to translate stories from English to Spanish instead. That was after it published an absolutely disastrous AI-written blog about Star Wars, in which the computer couldn’t even get the films in the right order – a job any seven-year-old can manage without much bother. So far, so not so good.
Never mind content, what about fiction?
Well. At some point in 2023, a bunch of wise guys decided to feed a collection of nearly 200,000 books, a dataset known as Books3, into their generative AI systems in order to train them. Those wise guys were Meta, Bloomberg and other similarly unstoppable corporate behemoths. People with serious resources at their disposal and very questionable ethics.
The authors of those books – writers such as Rebecca Solnit, Stephen King, Zadie Smith, Junot Díaz, Chuck Wendig, Neil Gaiman, George Saunders etc etc and so on forever – weren’t consulted, contracted, or paid. In fact, the books appear to have been obtained from piracy websites. And according to Alex Reisner in The Atlantic, “the people building and training these machines stand to profit enormously”. Insert raised eyebrow emoji.
Meanwhile, in February 2023, sci-fi and fantasy lit mag Clarkesworld was forced to close submissions after a flood of AI-generated stories overwhelmed their systems. Clarkesworld editor-in-chief Neil Clarke told NPR, “By the time we closed on the 20th [of February], around noon, we had received 700 legitimate submissions and 500 machine-written ones.” Asimov’s Science Fiction magazine suffered a similar fate.
So, now we have stories created by machines sent to lit mags, which then have to close, so no one can read them anyway. It does raise the very important question: eh?
Is AI actually writing?
Is Data human? No human could write a poem as good as Ode to Spot tbh.
An extremely reductive explanation of AI writing: it’s like super sophisticated predictive text. It’s a language model: it looks at vast amounts of what’s been said before and makes a pretty good guess at what comes next. It’s building patterns, but not in the way, say, a poet would build a pattern, or a novelist would construct a rhythm, or a short story writer would craft a narrative. It’s just guessing based on previous structures and patterns. It can’t innovate, it can’t have new ideas, and it doesn’t understand what it’s saying. It’s only as good as the (mostly whitestraightcismen) people who built it. And it can’t do that thing writers do when they hit you in the solar plexus with their words, because the writers themselves were so deeply connected to what they were saying.
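If you want to see that “sophisticated predictive text” idea in its most stripped-down form, here’s a toy sketch in Python. To be clear, this is nothing like how a real large language model is built (those use neural networks trained on billions of words, not a little word-count table), and the tiny corpus and the next_word helper here are entirely made up for illustration. But the basic move is the same: look at what came before, guess what usually comes next.

```python
# A toy "predictive text" writer: count which word tends to follow which,
# then keep picking the most common follower. Real systems use neural
# networks trained on billions of words, but the core move - guess the
# next word from what came before - is the same.
from collections import Counter, defaultdict

# Made-up miniature "training data" for illustration only.
corpus = (
    "the cat sat on the mat "
    "the cat slept on the sofa "
    "the dog sat on the mat"
).split()

# Bigram table: for each word, how often does each other word follow it?
followers = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    followers[current][following] += 1

def next_word(word):
    """Return the most common word seen after `word` in the corpus, or None."""
    options = followers.get(word)
    return options.most_common(1)[0][0] if options else None

# "Write" by repeatedly guessing what usually comes next.
word, sentence = "the", ["the"]
for _ in range(5):
    word = next_word(word)
    if word is None:
        break
    sentence.append(word)

print(" ".join(sentence))  # plausible-sounding, e.g. "the cat sat on the cat"
```

The output sounds vaguely like English because it’s stitched together from things that have already been written. It doesn’t mean anything to the machine doing the stitching.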
So, sure, it is writing in the sense that it’s the written word. Is it writing though? Ach. That’s almost not the right question to ask. The question is more like: why do we want to train machines to do this uniquely human thing in the first place? (Psst the answer is the profit motive! I know you are absolutely off-the-chain shocked!).
Why are writers pissed off?
It’s not just the idea of having your profession or craft taken from you by a tin can. It’s that your own work can be weaponised against you in order for your work to be taken away more efficiently. Rude.
Plus, these machines just aren’t very good at writing. According to tech blog Rest of World, AI companies have started listing jobs for poets, novelists and other creative writers to come and train their bots. That’s great for often financially starved artists in the short term. But long term? The aim is that those writers can be replaced with the much lower-paid, much more casualised job of ‘AI writing assistant’ or ‘prompt writer’, or some similarly bullshitised job title.
Whoever is training these machines is using work that we have created, without permission, to create a system which in turn is designed to destroy what we have made. It’s the perfect metaphor for the way the machinery of capitalism is fuelled by human misery and suffering. I guess there’s something poetic about that at least?!
Why some people think it’s ok
I’ve trawled hundreds of social posts and blogs trying to find out why some people think AI writing is a good idea and I won’t link to them because, wow, you do not need to see it. But as far as I can make out, it comes down to this: people don’t want to put the time, energy or effort into honing their craft, into finding their niche, their truth. They just wanna spin up a thing like magic: abracadabra, done!
To be fair, this is the same as it ever was. We all want to have the opus written yesterday. It’s just that some of us have the self-awareness to realise that it’s the process of creating the opus that makes it an opus, and not the existence of the thing itself.
Recently, I was talking to Jesse Meadows, who writes the (brilliant) Sluggish newsletter, because they want to move towards writing fiction. We talked through their blocks and processes and eventually they said: “ohhhh so I guess what I’m doing is trying to skip the bad first draft and write something immediately good and it’s impossible.” I laughed. That’s exactly what everyone does. And that’s what the pro-AI-writing crowd want: to skip the long and winding process that actually makes art good. As Queen Ursula K. Le Guin says: “In art there are no easy steps”.
Why were the US writers striking and what happened?
Big up some good ol’ fashioned industrial action! The 2023 WGA and SAG-AFTRA strikes were the longest interruption to Hollywood production ever, bar the pandemic. While the main focus of the strikes was how writers are paid, the secondary issue was AI. Specifically, that AI should not be allowed to replace writers. And guess what? The writers won. Under the terms agreed with the studios, work generated by AI cannot be considered “literary material”, AI cannot be used to rewrite work written by a human writer, and AI writing cannot be used as source material.
This move pretty much kills AI writing in Hollywood dead. For now, at least. What are we to make of this? Probably that AI simply isn’t good enough yet for Hollywood execs to fight over it, but that doesn’t mean the danger is over.
Is anyone doing anything about this?
There are a handful of ongoing lawsuits being brought against Meta by a whole bunch of big-name authors, including Sarah Silverman, Michael Chabon, David Henry Hwang, and Rachel Louise Snyder. They all claim that using an author’s work to train generative AI without their permission is copyright infringement. It seems like a no-brainer, but copyright law is not always so clever, and Meta has an awful lot of money behind it, so who knows how this one will play out?
Interesting side note
Remember the archive.org case? At the beginning of the pandemic, archive.org, the not-for-profit Internet Archive, launched the National Emergency Library. They lent out thousands of e-books to people who couldn’t access bricks-and-mortar libraries due to lockdowns and sickness. But the books were in copyright, and Hachette Book Group, HarperCollins, John Wiley & Sons, and Penguin Random House sued archive.org and mostly won.
It will be fascinating to see if this case can be used as some kind of precedent – if copyright applies to archive.org, then it absolutely should also apply to training generative AI. And if not, well, then we’ll know a thing we always knew about the people with money versus the people without. Rest assured we’ll be keeping our nerdy-writerly eyes on it.
Should I be worried?
Only in as much as you should be worried about the creation of malevolent sentient computer code Skynet, which we were SPECIFICALLY WARNED AGAINST DOING.
But this is a ‘how long is a piece of string’ question. Some people have lost their jobs. Others are saying they can’t use AI for their work because it’s simply not good enough. So, maybe yes a bit, and maybe no a bit.
Plus, it depends what you’re worrying about. The fact is that any writing is meant for human eyes. Lit mags want stories written by people. Publishers want stories written by people. Competitions want stories written by people. And perhaps most importantly readers want stories written by people. Anyone reading anything is someone who wants to find connection in some way – human connection. That’s the whole point of it to begin with.
Even when you get stuck in the kind of dystopian hellscape of a computer writing content so another computer can index it and bump it up the listings – at some point the efficacy of that content relies on human eyes landing on it. If it’s too shit for those human eyes to do anything with, then all the number one Google spots in the world aren’t going to help you all that much.
And let’s be honest. You’re not going to stop writing just because ChatGPT can spit out some turgid, clichéd story. And who knows, maybe this whole thing will hasten a UBI for artists. Hashtag wishful thinking.
What can I do?
Dismantle the profit motive as a driving force behind the dissemination of art.
Questions (to the forums with ye!)
Is copyright even fit for purpose? How should we define the art we produce and our ownership of it? Should we even be able to sell the concept of ownership? WHAT EVEN IS PROPERTY?!
Come tell us your brainings over on the forums >>
Reading list
Why ChatGPT is a red herring for content marketers, from iCrossing
These 183,000 books are fuelling the biggest fight in publishing and tech, in The Atlantic
How an AI-written Star Wars story created chaos at Gizmodo, in the Washington Post
A sci-fi magazine has cut off submissions after a flood of AI-generated stories, in The Verge