It is the Year of AI. In the past few months in San Francisco, I have seen a ballet about AI (in which AI is a sort of Pandora’s Box) as well as an opera (actually an older operetta, Cinderella, in which AI was uneasily squeezed into the plot). The latter was accompanied by a talk about AI, in which I heard that food preparation will be a thing of the past, and we will have more spare time for…well, whatever humans do. (Probably half the audience lives on take-out, but no matter.) I also heard a composer/DJ claim that AI has enabled him to produce “a song per day” as well as all his marketing copy. In ten years, he promised, we’ll all produce our own soundtrack, with our own music, films, you name it.
My own personal soundtrack? Well, that is a tall order.
Now, we’ve been training machines for decades to learn from our data (from books to sensory data). We’ve trashed maps in favor of GPS, and library card catalogues in favor of Google. We’ve grown accustomed to customer service robots. As any technology grows (e.g., the automobile), some will worry about its downside (e.g., people walk less). There’s much to say about data and privacy— but here, I’m limiting my discussion to two specific (and related) concerns about machine intelligence in the arts:
— AI algorithms will violate copyright or steal content
— AI will replace human artists altogether
Let’s take the theft (or copyright infringement) problem. A friend of mine’s a painter (as well as a fine novelist)— like many painters, she does not have gallery representation and shares her images widely through social media. I’ve used a few as virtual backgrounds for my Zoom calls, which delights her. At the same time, she frets that “AI” (or a company using AI) will grab her paintings and cast them into a sea of data. She is considering deleting her posts, in case her art is stolen (even though, as I mentioned, she’s happy to give them away digitally).
In this view, AI is seen as a pilfering machine, in which vulnerable creators have their ideas stolen and morphed into…well, that’s where it gets blurry. Where will your art go? I asked her. What exactly do you think is going to happen to your paintings?
Someone’s going to sell my art without my permission, she says. She’s losing sleep over it.
But, I remind her, anyone can copy anything without a computer. It’s nothing new. I can pick up a magazine, copy a poem, plop my name on it, and presto, there you have it. In fact, several literary magazines (including ONE ART) have been duped by a certain plagiarist. No sophisticated computer model, just good old-fashioned fraud. In yet another incident, the editor of Spelt Magazine recently posted:
It deeply saddens me on so many levels to have learnt earlier this month that Taiye Ojo, whose submission I awarded second place in the Spelt Poetry Prize 2022, is a persistent plagiarist of other people's poetry. Like other poetry plagiarists, he has repeatedly taken work known in one arena, collaged it or loosely reformed it, and submitted it as original work under his own name to magazines and competitions.
Think about it. No one would invest billions of dollars to do stuff that any college student or plagiarist can do (and is doing right now as I write). As for forging paintings, that’s a large industry, and it requires nothing more than the will (and some modicum of talent, although that’s debatable). And once work is digital, anyone can save it at any time for any reason; no special software is required.
We could argue that “ideas” from art are being dumped into a machine-learning model, and the model will manage to earn money for a theoretical image without paying us for these ideas. Apparently, the model will be so clever that it can earn money when artists can’t. (If so, can I purchase it now? Please?)
Truth is, machine models require lots and lots of content, and, in the arts, canonical content. A poetry model, for example, could easily skip my work but must include the big guns like Frost. (Note: Sites that generate single-artist poems are not AI; they are amusing, but not intelligent. They can use a fill-in-the-blank type system à la Mad Libs, or take lines from existing poems and rearrange them— fun, but hardly complicated.)
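As a rough illustration of why such generators are hardly complicated, here is a minimal Python sketch of the two tricks just described; the template, word bank, function names, and Frost lines are my own invented examples, not any actual site’s code.

```python
import random

# Illustrative sketch only: two simple "poem generator" tricks,
# neither of which involves machine learning.

# 1. Mad Libs-style fill-in-the-blank: drop random words into a fixed template.
TEMPLATE = "The {adj} {noun} {verb} beyond the {place}."
WORD_BANK = {
    "adj": ["gold", "broken", "quiet"],
    "noun": ["orchard", "engine", "river"],
    "verb": ["hums", "waits", "burns"],
    "place": ["harbor", "fence line", "last streetlight"],
}

def mad_libs_line() -> str:
    # Choose one word per slot and fill the template.
    return TEMPLATE.format(**{slot: random.choice(words)
                              for slot, words in WORD_BANK.items()})

# 2. Rearrangement: take lines from an existing poem and shuffle their order.
def rearranged_poem(source_lines: list[str], length: int = 4) -> str:
    return "\n".join(random.sample(source_lines, k=min(length, len(source_lines))))

if __name__ == "__main__":
    print(mad_libs_line())
    print(rearranged_poem([
        "Whose woods these are I think I know.",
        "His house is in the village though;",
        "He will not see me stopping here",
        "To watch his woods fill up with snow.",
    ]))
```

Either trick can be amusing, but neither learns anything from data, which is exactly the distinction I am drawing.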
So, what does “being in a model” mean? Take credit card data. Each time we use a card, our transaction is incorporated into a lot of statistical models. Those data are resold (to retailers, to trend predictors, etc.)— and allow credit card companies to detect fraud. Your general habits are part of someone’s model, just as your health data are part of another model. Likewise, a painting could (potentially) be one of a million or more fed into some image-generator.
In some theoretical future, perhaps companies will pay content creators for the privilege of inclusion in models. But for most creators, this payment will be minimal, because most artists earn little on their work to begin with; the model doesn’t need them all that much. (I pity the company that tries to model storylines: there are remarkably few plot formulas, and no one needs software to learn them.)
I’d like to move to the deeper fear about AI: that language models (or, more precisely, the poems, stories, even novels they generate) will replace humans. Forget painting, forget poems: machines will do it all and, for some obscure reason, better than we do. If we don’t put a stop to these nefarious machines, who knows what will result?
This debate became inflamed when the poetry publisher DIAGRAM awarded its prize to Lillian-Yvonne Bertram’s chapbook, A BLACK STORY MAY CONTAIN SENSITIVE CONTENT. The poet had, by her own account, used language models (such as the Talk-to-Transformer text-generating neural network) in creating parts of the manuscript. Some deemed the contest unfair (although its rules allowed experiments with language models), and there was a fair amount of hand-wringing. In fairness, the backlash against this chapbook ended quickly after people began to read the poems— Bertram uses her interactions with models in a critical, humorous way. Still, the initial furor is telling— it’s as if Bertram had violated the purity of writing.
Ironically, alongside this fear of AI is an explosion of “erasure,” “cento,” and “found” poetry. An erasure poet takes a text, erases part of it, and the result is considered an original work. “Found poems” often take text straight off ads. I’ve published poems based on other poems or letters— the resulting cento didn’t manufacture itself from the original lines; I had to craft it line by line. I’ve used as many as 20 different sources to generate a single cento. I did this “by hand” because I write in notebooks. But I can’t see the harm in using AI software to find lines, if and when AI becomes more accurate. (Currently, ChatGPT and other AI software, much like Wikipedia, toss off huge amounts of misinformation.)
So, I’m generally an optimist about what AI will do for us— because I am very much aware of what it can’t do. AI models will surely replace certain types of jobs— in the healthcare field in particular. But language models will assist writers much as online dictionaries, rhyming dictionaries, and word processing tools do. Necessary tools perhaps, but just tools— unlikely to improve writing any more than typewriters did.
Take one exciting breakthrough: AI software is translating 5,000-year-old Akkadian cuneiform texts. There are thousands of tablets waiting to be translated, and finally we will have them. A stunning achievement, and one almost impossible to conceive without AI. But if we find a Gilgamesh-type masterpiece in these tablets, we will demand the eyes and ears of a sensitive translator. No fancy algorithm is going to render Homer better than Fitzgerald, Fagles or Wilson. And people who understand the reality of machine intelligence wouldn’t dream of trying; it would be a spectacular waste of time and money.
Because there’s a simple truth. Even the most sophisticated machines lack what every human has — real feelings. Machines don’t cry. They don’t feel alienated. They can’t feel joyful or insecure. All of these rather ordinary human feelings are the raw material of art. We need characters that are lively, eccentric, scheming, and, well, human. We need images that excite us, that evoke emotions and memory. Without those, there’s little point in poetry or novels or painting. So, let’s allow machines to learn as much from us as possible. After all, we made them. Perhaps it’s best to end with the final lines from “AI Duplex,” a poem that Lillian-Yvonne Bertram generated with the Talk-to-Transformer website:
“I behold the screen. The chance to get right a life
implies a different kind of intelligence.”
*
About The Author:
Carla Sarett is a poet and novelist based in San Francisco. She earned her PhD from the University of Pennsylvania and ran her own market research company.
Interesting article, Carla! Have you read/listened to Tim Green's interview with Sasha Stiles, a poet who uses AI in her work? I was extremely skeptical, but I found she made many persuasive points, as well as being a fascinating person. I like how you address the cento and erasure poems here, too. Art has always incorporated a certain amount of borrowing, tweaking or even outright theft. But AI may be different in kind and also in orders of magnitude.