In mid-2025, a New York magazine article titled “Everyone Is Cheating Their Way Through College” went viral. I think about it a lot, especially when I receive targeted ads that amount to some variation of “With our AI tools, you can cheat without getting caught.” Suffice it to say, it’s depressing.
But the problem is not that students are “using AI.” I’m “using AI” too, and learning to use it well is something everyone needs to do. The problem arises when students represent AI’s work as their own. At bottom, the question of academic integrity and artificial intelligence in higher education is not a technical one. It’s an ethical one.
I love generative artificial intelligence and use it for many different things: workout programming, recipes, outlining and revising papers and lectures, drafting multiple-choice questions, writing R code to turn spreadsheets into graphs, tracking down citations, and much more. The possibilities are endless. Used wisely, it can double your productivity. Used foolishly, it doubles your foolishness. Conversations about academic integrity and artificial intelligence force us to think seriously about who we are and what we do.
The debate has split into unhelpful camps. Some liken AI to a calculator. Others believe AI is the end of human thinking. Both miss the point. The “just a calculator” crowd forgets how much of the burden of quantitative thinking calculators and related software have already lifted from us. “It’s like a calculator” is true, in a sense, but it isn’t reassuring: knowing which button to press to make a parabola appear is not the same as knowing what a parabola actually is and why it makes sense. The “end of thinking” crowd ignores that generative AI is a powerful tool that can be used wisely. As an assistant, it’s great. As a replacement, it’s not.
The problem, though, isn’t the tool. It’s the user. Like any other tool, AI can be used wisely or maliciously. In the hands of Paul Bunyan, or of Manly Dan from Gravity Falls, an axe is a tool for felling trees and building shelter. In the hands of Jason Voorhees of the Friday the 13th horror franchise, it becomes a tool for an entirely different purpose.
In 2023, when we were all just getting to know our new AI overlords, I wrote a piece answering the cynical question students ask about the humanities and other non-vocational studies: “When are we ever going to use this?” My answer was, and still is, “Literally every time you make a decision.” Why? Because the decisions you make are a product of who you are, and who you are is shaped by the company you keep. More generally, studying history, philosophy, literature, economics, and the rest of the liberal arts is training in keeping good company and becoming a certain kind of person: someone who has spent enough time with the best that has been thought and written to be trusted with important decisions. It means becoming a person with good judgment.
This is a skill we practice poorly in a world where we can so easily outsource our thinking to ChatGPT or Gemini. Consider an example. If you haven’t seen the movie Aliens, drop everything and watch it. It’s a classic among classics. If you have seen it, recall the scene near the end when Sigourney Weaver’s Ellen Ripley climbs into the P-5000 power loader to fight the alien queen. She uses a tool that amplifies her strength, letting her do what would otherwise be impossible.
The way many students use AI is a lot like wearing Ripley’s power loader to the gym. You may be able to “lift” 5,000 pounds in the suit, but it is a mistake to think the suit has made you stronger. Believing you could lift 5,000 pounds without it is laughable self-deception, and trying to convince anyone else that you can is a laughable lie. Submitting mostly AI-generated work as your own won’t build muscle, teach you how to lift, or make you stronger. Your muscles atrophy even as you put up huge numbers.
Of course, using AI can also be like having a spotter for your squats or bench presses. I use AI at the gym as a kind of trainer that tells me which exercise to do next. That’s one way to use it. But the way too many students use AI is like going to the gym and letting the AI tool, the power loader, lift the weights for you.
Tools like ChatGPT, Gemini, Grok, and Claude are supposed to free up our time and energy for higher-level work, not hide the fact that we can’t do it. Technology has greatly increased my productivity. I dictated the original version of this essay into a Google Doc on my phone through wireless earbuds, then revised it with Gemini and Grammarly. What’s the difference between that and submitting AI-generated work as your own? Using dictation tools and AI to draft and clean up an essay like this one is like using Ripley’s power loader to move heavy cargo. Using AI to generate text and passing it off as your own is like using the power loader to fake a workout.
Thanks to ChatGPT, Gemini, Grammarly Pro, and GPTZero.me for editorial assistance.
