In my last blog defending em dashes from AI, I promised another blog on ways AI can ruin writing.
Again, I will say that I do not hate AI. I use it. Mostly for research, summarizing things I’ve written, outlining, and for feedback from custom GPTs designed for that. But I am not a fan of AI writing…or any AI-generated creative acts (music, video, etc.). They simply aren’t good at it.
Yes, they may be getting better, but I will always argue the need for humans to be the driving force in creativity and the arts.
That being said, I thought it would be helpful to start with a few tips for using AI if you are wedded to your GPTs. Here are a few measures you can take to protect yourself if you are going to use AI to do your writing:
- Fact check everything—AI likes to make things up
- Learn how to provide detailed prompts to get better results
- Break your tasks up into smaller pieces—AI does not do well with compound tasks
- Train a custom GPT on your voice, brand, business, etc. to get the best results
- Add in personal touches and stories on top of the AI writing
- Edit everything…multiple times
Now we can get to how AI can mess up.
How AI Ruins Writing
There are always signs that AI wrote a piece, whether it’s an email, post, or speech. The AI writing stands out. And that is an issue by itself. But let’s break it down. How can you tell AI wrote something? (Beyond its reliance on my beloved em dashes.)
1. Repetition—Redundancy—Replication
AI loves to repeat itself. It will often list or describe the same item multiple times in a paragraph, adding nothing new. It does this with ideas, phrases, and words alike. Apparently, it can’t see or comprehend the duplication. We can.
Now it’s one thing to repeat a word for rhythm or an idea for reinforcement (usually with additional details for clarification). But it’s another thing to say the same thing again and again.
2. The Rule of 3—but in a scary way
Someone programmed poor ChatGPT and Claude with the power of three, but didn’t train them to use it properly. All they know is that humans like things in threes, so they add three things to each explanation or list they provide, paying no attention to whether or not the three items go together. It doesn’t matter if they make sense. It doesn’t even matter if they say the same thing. It’s three things—so it’s okay, right?
Not so much.
3. Sentence Structures Don’t Vary—they are very formulaic
The sentences tend to be the same length, which means there’s no rhythm or cadence to them. They all sound a bit monotone.
Writing should pull the reader in with tension and rhythm. This means pairing longer sentences (that slow the reading) with shorter sentences (that speed it up). It means varying the sounds of the writing. AI misses most of this in its writing.
4. Heavy Reliance on AI-Favored Words
AI loves its preferred word list. Someone programmed it to love words like delve (which is a great word I’ve used for years), dive, furthermore, consequently, in today’s (insert adjective here) world, and adverbs of all sorts.
You can find lists of AI-favored words all over the internet and, in particular, on Reddit.
AI’s overuse is effectively driving those words out of human writing. And that’s a shame (similar to what’s happening with my beloved em dash). Stand firm. If you write well, it won’t matter if you use an AI-preferred word.
5. They Can’t Stop Themselves
They praise and fawn. They go on and on about absolutely everything. Have you ever seen such a brown-noser as AI? All you need to do is respond to it once to find out just how amazing you are.
They also provide way too many details, many of which aren’t relevant or anything like how a human would respond to the subject.
6. They Lie
If you aren’t careful, they will make crap up. In fact, ChatGPT used to come with a warning about hallucinating facts. Not sure if it still does. But it’s true. Even when using custom GPTs, you have to be careful. Go slow and stay on top of the facts.
It’s better if you split things up into tiny tasks so it doesn’t get confused.
It does help to provide the source material or data and restrict it to using what you provide. If you can’t do that, be sure to fact check everything against reputable sources.
7. They Can Be Overly Formal and Boring
If you don’t train them, they kind of stink. Even when you do train them, it’s often faster to write it yourself. That’s what I do. I might have my custom GPTs help me fill out my outline, but I don’t have them write for me. It takes too long to edit their mess and make it sound like me, even with trained bots.
They can’t help but sound like bots. That’s what they are.
But they do summarize well. Stick to that. You need a human for that quirky tone of voice and idiomatic expressions that don’t involve transportation or music. (Have you noticed they love any metaphor or analogy with trains, planes, and orchestras—or is that just with me?)
And for those of you who love AI, yes, you can train them to sound like you and their writing can get better. But is it worth the time and effort? Is it worth letting them think for you? Write for you? Put words in your mouth?
Not to me. But if that’s what you want, then train them well, because otherwise you risk boring your readers. It helps to know how to write well and structure for the brain even if you don’t plan to do all your own writing.
8. They Have a Dearth of Details
AI lacks imagination. They will fill your writing with fluff and weasel words, but when it comes to offering details, they fall flat.
They skim the surface, offering facts you could glean from a quick Google search and nothing more. They don’t lean into subjects or offer deep looks into topics. That’s up to you.
If you want to include more details in your AI-driven writing, then I suggest using Perplexity to do your research and then copying what it finds into ChatGPT, Claude, or Gemini to write. At the least, you will get a richer description of the topic than you would skipping this step. Better yet, upload your own sources into NotebookLM, paste the summary of your topic into your GPT, and restrict its content to that information.
9. New Definitions or Uses of Words
AI likes to insert words in ways we don’t normally use them. The words sound almost right, but not quite.
For example, they might say something like:
when taking photography
Instead of
when taking photographs
Photography doesn’t sound right with the verb taking. If it had said, when doing photography, we’d be okay with it. But it often struggles with matching verbs, nouns, and adjectives. Be on the lookout for odd word choices. It might be the sign of a bot.
10. If It’s Not in the Prompt, It’s Not in the Answer
AI relies on the prompt to deliver results. If your prompt lacks logical flow or details, then your response will lack those too. Or the AI will make something up and insert it without alerting you.
I once asked ChatGPT to merge two lists and make a single master without any duplicates. It gave me one list, but I found two sets of duplicates and the numbering was off. Then I discovered three items from my second list that never appeared on the master list. I ended up having to do it by hand after three problematic rounds with AI.
Maybe it was my prompt? But I don’t think so. All I asked was to merge the two lists and remove any duplicates.
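For deterministic chores like merging and de-duplicating lists, a few lines of code beat a chatbot every time. Here’s a minimal Python sketch (the list contents are made up for illustration—my actual lists were longer):

```python
list_a = ["apples", "bread", "coffee"]
list_b = ["bread", "eggs", "coffee", "milk"]

# dict.fromkeys keeps the first occurrence of each item
# and preserves order, so duplicates drop out cleanly.
master = list(dict.fromkeys(list_a + list_b))

# Print the master list with correct, gap-free numbering.
for number, item in enumerate(master, start=1):
    print(f"{number}. {item}")
```

No hallucinated entries, no broken numbering, no silently missing items.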
AI struggles sometimes.
Good prompting can help. But even then, you will have to edit what it gives you and add in personal touches.
Remember that AI is not the subject matter expert—you are! All it will give you (outside of Perplexity and other research-based AIs) is what can be found on Wikipedia and similar searches. And don’t get me started on how old the research is that it uses.
11. They Can’t Feel
People need emotions—humor, tragedy, tension, surprises. These are all things AI cannot provide, no matter how it tries to fake them. You can always see when it tries to feign emotion. It lands in that uncanny valley for us and is kind of creepy.
As Robert Frost said, “No tears in the writer, no tears in the reader. No surprise in the writer, no surprise in the reader.” If you want your readers to feel something, you need to infuse it with emotions. Your emotions.
12. The Biggest Flaw—AI has no stories!
Only humans have stories. It’s how we operate. We depend on them—for knowledge, safety, connection, community, and so much more.
Stories are paramount and AI can’t participate. Those have to come from us.
When AI tries to tell a story, it comes off weird. The conclusion often doesn’t follow the story told (chalk that up to flawed story logic). And they are not convincing as storytellers.
If you want to learn why stories are so important, why they work and how, sign up for my upcoming Neuroscience of Story Masterclass. As a bonus for early buy-in, I am offering 50% off if you buy from the wait list. No obligation, just perks. This deal expires once I go live with the course (later this summer). You can join the wait list here.
