
Yes, ChatGPT has a "tell"—and it could tank your career

Catie Phares

As an editor, I’ve never been fazed by the widespread fears that generative AI is going to gut the writing and editing industries and make my job obsolete. Not for a second. Frankly, I know what I do for my clients (business academics)—and I know AI can’t do it. 


But I am starting to be deeply worried about the damage it’s doing to many academics’ careers. I’m realizing that countless scholars still don’t understand the true risks of outsourcing their most important thinking and creative output to this burgeoning tech. 


First, let’s clarify what the biggest risk isn’t. The biggest risk for academics using AI isn’t getting caught using it, despite all the legal and ethical implications at play (although getting caught certainly won’t do anything good for your scholarly reputation). To be honest, I’ve yet to hear of a single serious consequence for any of the many professors exposed for submitting AI-written papers to journals. And don’t bother avoiding certain words because someone’s (wrongly) decided they’re a “tell” that ChatGPT wrote something; the only reason AI overuses certain words is that we overuse those words. 


No, the true giveaway—the thing that I can always spot when AI has been used to write something—and the main risk in using ChatGPT for your most important scholarly output are the same thing: ultimately, AI can’t make good art. 


Whether it’s a painting, a drawing, an article, a speech, a sculpture, a song, or something else entirely, good art shines like a light in darkness. It’s a magnet for connection. Human beings are drawn to it. We have been since … well, forever. We like to look at good art. Engage with it. See ourselves in it. Digest it. Return to it. Talk about it. Build on it. Learn from it. In short, good art has soul, and it calls to the soul in us. I’m seeing this consensus emerge as AI becomes more and more prevalent.


This lack of soul (which makes sense, right?) is the real tell of ChatGPT. It’s something that not everyone can spot at a glance, especially if they’re not actively looking for it—an ominously hollow "word salad" vibe that is difficult to describe, but easy to feel once you’re attuned to it. 


It’s an eerie feeling made even eerier by the fact that, credit where credit is due, there’s a lot that large language models like ChatGPT can do shockingly well. I was floored when even the free version of ChatGPT accurately caught all 10 of the typos I see most often in business research—and even more floored when it presented me with a pretty decent rap about editing (lol):



I’ll save the other verses for my live tour… 👑🖍️


But amusing as that rap is, there’s still something missing, isn’t there? An absence that highlights that all great writing, from sonnets to rap verses and, yes, even academic papers, is far more art than science. Don’t be fooled by the innumerable grammar “rules” out there. Language (especially English) is ultimately a paint box of myriad colors, not a set of code blocks. Attempts to treat it like the latter always fall a little flat—and will fall even flatter as generative AI floods our consciousness with more and more soulless (not to mention often unreliable) content. 


And therein lies the main risk to academics who use it to write anything of importance: using AI is an excellent way to ensure that your writing gets no attention, creates no stir, sticks with no one, changes nothing at all. 


I’ll close with one more comparative exercise. What makes this piece of art beautiful and soul-stirring?



And what makes this piece of art comically unsettling (at best)?



The answer is hard to put into words—and even harder to put into code. After all, they’re both pictures of staring faces. They’re both fairly realistic depictions that clearly took someone a fair amount of time and skill to create. Yet we (human readers) can all see the immense difference between them. 


I’ve said it before and I’ll say it again: if you’re an academic, like it or not, you are a professional writer. You write for a living. So if you rely on AI instead of mastering the fundamentals of engaging, persuasive writing, it’s like spending hours a day choosing the best shoes in the lead-up to a grueling race. While you’re fiddling around with the different laces and colors and features of the shoes, many of your competitors are training, putting in the hard work and sweat to master the activity that will actually win them the race. And unfortunately, despite all the AI hype that suggests otherwise, there are still no shortcuts to mastery. (Additionally, just how warranted this hype is remains hotly debated; the folks at Goldman Sachs, for instance, still aren’t convinced.) 


In sum, if you want a piece of writing to tick the box marked “done” and quickly fade into the growing sea of formulaic content swelling all around us, use AI. 


But if you want a piece of writing to distill your unique contributions, capture readers, advance your career, achieve your goals, and change the world? You’re going to have to write it yourself. 


The good news is that you don’t have to do this alone. My team and I have been helping business academics succeed with better, more artful writing for 14 years now, and there are several ways to work with us. 


We’ve watched with immense pride as some of our clients have risen all the way from PhD students to associate professors at excellent schools (in fact, some are even associate deans now!) with powerful writing about their research. Please don’t hesitate to reach out if you’d like to see how we can assist you too!


Sincerely,

-Catie Phares


P.S. Is using AI always a mistake or a risk? Nope, not by a long shot: here is a brief guide to using it effectively as a busy academic.

