I have used ChatGPT, the artificial intelligence program that came on like a thundering herd of wild horses last year. But I’m not a fan of incorporating AI into my work, and I’ll tell you why.
I haven’t used it for my writing, per se. Once or twice, I’ve asked it to find bits and pieces about ideas I’ve already had. The AI programs essentially plagiarise from material that’s already on the Internet. So, if you type in a prompt like, “find the top three things people worry about” – a random example I have not used – it will return results from scouring the Internet. Then you could write a post about worrying and respond to the top three issues people are having. The idea is to make your article or blog post connect to more people.
That, to me, is an effective use of a tool that, in my view, isn’t “there” yet. It’s still being tweaked and perfected. But taking material from other sources is troublesome, if not downright illegal. The New York Times is suing OpenAI and Microsoft for billions of dollars in what amounts to a huge test of copyright law. The Times alleges that ChatGPT and other AI programs are sourcing its articles, even the ones hidden behind paywalls, to “teach” themselves how to do journalism. If users can get this material from an AI program, they’ll be less likely to visit the Times.
AI Came to Work With Me, Uninvited
There I was, recording narration for an audiobook, when something felt a little off. There was repetition – not exact, but enough to make me go, hmmmm. The book’s solutions were simplistic. It was full of metaphors, nearly one in every paragraph. It read like filler. And the text was generic, not like someone who had “been there”. Like I said, it just felt off.
I decided to research the author. Searching by name, the only hits were from Amazon, where she had posted previous books. So I took a screenshot of her photo and ran a Google image search. Lo and behold, she came up as a voice actress for a foreign company. Her credits included dubbing English movies into the foreign language.
I sent the screenshot to the rights holder of the book – the person who hired me, often an agent or another representative of the author – and asked him what he made of it. His response was surprise. He said he had paid for a one-off AI photo. It was obviously being used by someone else, and he felt ripped off.
Wait – what? The photo was AI? The author wasn’t even a real person? I immediately wanted out of this project. I hadn’t done my due diligence. This was on me. But should I have to give a voice to someone who doesn’t exist if I don’t want to?
AI IE, FYI
AI doesn’t scare me. I don’t want to waste my time trying to stop it. But I sure don’t want to participate in making it more popular. Potential clients constantly ask voice-over people to help them create an AI-simulated voice. For every Kat Callaghan, who presumably gets properly compensated as the text-to-speech voice for TikTok, there are thousands of others whose voices are taken and/or used for ridiculously low fees.
All you need to create an AI voice is a set of recorded sentences that contain all of the sounds in English speech. Someone who knows what they’re doing can use those recordings to replicate the voice, then sell it or use it in their own projects. Forever. And the VO artist can’t do anything about it in most cases. (Bev Standing is an exception. She fought TikTok over unauthorized use of her voice and won. TikTok seems to have learned from that experience; Kat’s arrangement is evidence of that.)
There are many projects where the client specifies that they want a human, because AI is irritating to a lot of people. On at least six occasions last year, I was hired specifically to replace AI. Listeners still mostly want to hear a human voice. Will they always? It’s hard to say. A lot of things that sparked initial outrage eventually became acceptable, or at least tolerable. Self-checkouts. Renaming the SkyDome. Kardashians.
My AI-Book Narration Decision
I had a dilemma. Would I narrate this AI-created book and give voice to a computer-generated photo? I thought about it for a few hours. Ultimately, I decided that I’d given my word and signed a contract, so I continued with the project. Next time, I’ll do my research before accepting the offer. I had worked with this client before and made an assumption about the material. That’s on me.
This scenario is likely to recur. There’s nothing in the contract that says “all parties involved must be flesh and blood”. It’s my bias, so it’s mine to watch out for. This is the world we live in now. As many wise people have said: adapt or die. Professionally, anyway.