Ever since ChatGPT burst onto the scene in late 2022, generative AI has dominated the technology spotlight. I loathe it, and my distaste appears to be justified. Amazon had to quickly revise its upload limits because so-called authors were generating oodles of low-quality books and flooding Kindle Direct Publishing with poorly written, banal titles. The platform also added a checkbox for authors to self-declare whether their books were substantively created using generative AI. The honor system, by the way, isn’t working.
So, why not use AI to write my stories? After all, if I dump some prompts into the system, it will spit out the tens of thousands of words necessary to make a novel with a mere fraction of the effort required to actually write the story myself. Here’s why, paraphrased from Susan Joy Paul’s commentary:
- AI is legalized, tech-enabled plagiarism.
- AI-generated content must be disclosed to Amazon when uploading to KDP. (Of course, many so-called authors don’t.)
- Any so-called author using the same prompts will get essentially the same story and release nearly identical books.
- AI-generated content isn’t necessarily accurate. It makes stuff up.
- Using AI-generated content violates the reader’s trust.
- An author’s unique perspective and voice are what make a book special. AI merely repeats what others have written.
- Editing AI-generated content to sound like the author takes more time and effort than writing the story from scratch would have.
- AI-generated content cannot be copyrighted.
That last concern is by no means the least. Protection of intellectual property is important. In fact, that’s one key aspect of my pitch when responding to requests for proposals to write or edit manuscripts: I will protect the client’s intellectual property.
I recently read a book that was written by AI (although the author did not label it as such). While initially engaging, the writing never deviated from a strict cadence established on the first page. There was no ebb and flow to the story, no ramping up or release of tension, just a rigorous march from plot point to plot point. I picked up small inconsistencies (which a human editor would have detected) that the software missed. The overall result was, in a word, mediocre.
Assistive AI, generative AI’s close cousin, may indeed be helpful, but it, too, has its pitfalls. The assistive AI found in editing software will flatten sparkling prose into banal writing. It often introduces as many errors as it corrects because it does not comprehend nuance, context, or slang. AI recognizes patterns and follows rules; it does not know when breaking a rule makes sense or is more effective than following linguistic conventions.
I’ll grant you that AI—both assistive and generative—is improving. But it will never have the spark of humanity. AI-generated content sets the bar for mediocrity.
When you read my books, you read stories I write. I labor over every word. I hear the characters speak in my mind. I envision the worlds in which they live. My stories are products of my imagination, not the byproduct of plagiarized patterns. I hire a professional editor, too—a human being who adds her human perspective. We’re not perfect, but we are authentic.