AI is a Bully
- revanneharris
- Jun 3
- 4 min read
A few weeks ago, I wrote about using AI. My main contention then was that using AI without full disclosure is morally wrong and carries ethical consequences. This blog post is a continuation, or perhaps the better word is an escalation, of that first post.
For about the last six months, every time I open a new Word document or begin a new email, a little icon pops up that invites me to “draft with AI”. Sometimes, in the middle of a Word document, the “suggestion” to use AI (or “Copilot”) to draft a better sentence is offered. I refuse. The ubiquity of this new tool, and the fervor with which it is being touted, is frightening. It’s almost as if you are expected to use it. It’s almost as if you would be a fool if you didn’t use it. I’m beginning to hear that little voice in the back of my head saying, “But Mom, everybody else is using it!” Mom’s answer to that childish excuse for morally suspect behavior was “If everybody else jumped off a bridge, would you?” which points out the stupidity of the act, but not necessarily the ethical ramifications!
The idea of plagiarism seems to be as old-fashioned today as writing “wouldest thou” instead of “would you”. It seems that even in the rarefied world of academia, using AI is all part of the game if you can get away with it, and it is more or less assumed that everyone is playing the game. I saw a recent post about a student who was caught using AI in a university class and then used AI to draft an apology! Lesson NOT learned, I would venture to say.
Using AI is not plagiarism in the strictest sense of the word, however. To plagiarize is to use someone else’s intellectual property and pass it off as your own. Whose intellectual property are you using when you use AI? Apparently, AI-generated writing is in the public domain and can be used by anyone. There is no copyright on the text. That presents a huge problem for publishers. If someone uses AI to create a paragraph, and someone else then steals that paragraph and claims it as their own work, a whole can of worms opens up. The “original” writer is then forced to come clean and admit that they used a chatbot. Or not.
So, if not plagiarism, then what is the crime of using AI? Put simply, it is a form of dishonesty, or what the Bible calls “false witness”. Our society is built on the principles embedded in the Ten Commandments, and things begin to fall apart if we do not adhere to them. If people are not honest, no one can trust them. I well remember the first time I caught my child in a lie. The falsehood was relatively harmless, but the loss of trust was crushing and seemed to take forever to win back.
Honesty and trust are inherent values in academia, even at the pre-college level. When you are asked to write in order to prove that you know something (for example, to answer a question on a test), the answer needs to be in your own words. When you write a poem or short story or novel, the writing must come from the soul of the author, not from some soulless chatbot.
Teachers have always had to stay ahead of the game when it comes to preventing and sniffing out cheating. Teachers now have to keep samples of writing from each student to help identify genuine writing in work completed at home. I feel for these teachers. This is extra work and effort for them. So, while the students are spending less and less time completing assignments (and learning less and less), teachers are spending more and more time grading the work. Without this veracity check, the assignment is useless as a learning exercise.
Also on the rise is the use of the old “blue book” for writing exam answers. Again, the extra work for teachers trying to decipher the words of students who so rarely write anything by hand is huge.
So, why am I being bullied into using AI every time I write? I suspect that the push to use chatbots is a market decision. The software companies are all vying for their share of users at this critical time in the development of artificial intelligence. Eventually, as the recent history of information technology has shown us, a couple of developers will rise to the top and become embroiled in the task of beating out all the competition. They will eat each other up and totally lose sight of the ethics involved, if they ever considered them to begin with.
One thing is clear: the proponents of using AI as a tool are right. AI is here to stay, and we will have to learn how to use it appropriately and ethically. I do worry about what that means for the future of creative writing. I suspect that a good portion of the many novels that flood the market monthly are already largely AI creations. But they are awful examples of creative writing, and the great writers of the past, who sweated ink and tears and labored over their creations word by brilliant word, are weeping as they look down from their heavenly writers’ desks.