
Opinion: Don't You Dare Use AI to Summarize This Article

  • Writer: Aspen Hadley
  • 3 days ago
  • 5 min read

When initially contemplating the subject of this article, I swam through the possibilities for angles on AI. Do I discuss the discourse over academic plagiarism? The unethical theft of information and art used to generate soulless images and writing? Perhaps the disregard for environmental damage caused by AI data centers? Or maybe the overall neglect of regulation and media safety? Though each angle carries dense issues worth covering, I have experienced the most interesting phenomenon on the SWAU campus: the use of AI in place of genuine contemplation and emotion.


ChatGPT has 900 million weekly users, which is around 11% of the world’s population [1]. If SWAU has around 800 students (a generous estimate), then roughly 88 students use ChatGPT on a weekly basis. That figure doesn’t count those who use ChatGPT less frequently but still consistently, or those who use chatbots other than ChatGPT. I didn’t have to do the math to know students here engage in the use of LLMs (you can thank me for the math anyway), for I have encountered and witnessed many users of AI firsthand. I will start with the A’s and work my way down…just kidding, I’m no snitch, you’re safe (for now).


Whether the usage be for writing a paper, putting together a presentation, collecting and organizing information, generating images, seeking general advice, or asking a girl out, I have witnessed many applications of AI on this campus alone. Some cite their preferred chatbot as a supporting source in conversation as if it were a dear, personal friend. A sort of, “well, ChatGPT agrees with me!” or “Chat said this on the matter!” Now, far be it from me to attack your dear, personal friend Chat, but as any good friend would, I will point out the detriment of this reliance on LLMs.


Though I could huff and puff a spiel, waving my finger in the air, denouncing the use of AI and telling you all to quit, like some bad intervention, I do not enjoy wasting my breath. I am fully aware such an argument would fall on two kinds of ears: one pair that already agrees, and another that wouldn’t care. However, in place of going blue in the face on an itty bitty soapbox, I would like to share some findings of a study at MIT [2]. Researchers created an experiment consisting of three groups:

 

  1. LLM group.

  2. Search engine group.

  3. Brain-only group.

 

Each participant was tasked with writing an essay using their designated group’s tool (or managing in the absence of a tool). Using EEG to measure the brain’s electrical activity, researchers discovered quite a difference between each group’s neural connectivity patterns. In other words, there was variation in the physical and functional connections between neurons and regions of the brain among the groups. That, you should know from Human Growth and Development.


In a very brief nutshell, the LLM group showed the least activity. The search engine group showed more activity than the LLM group, but not as much as the brain-only group. There is a crucial part of the thought process where actual “thought” occurs; without that space in time, where the mind makes connections, retention and processing skills weaken. This shows there is high value in letting your brain think for itself. With the use of AI, you simply jump from “not knowing” to “knowing.” Once you lose the moment of “not knowing” something, your mind no longer has to work. And if your mind is not accustomed to making connections within itself, you will eventually lose the ability to make meaningful connections with other minds.


One in eight teens and young adults use AI chatbots for mental health advice [3]. Back to the math: if there are 800 of you, then 800 divided by eight is 100. So, 100 of you have asked a chatbot for advice relating to mental health. What could drive a person to ask AI for advice rather than a real, tangible, empathetic human? Fear of being seen. Yet being seen is, in some ways, many people’s greatest desire, and necessary to thrive. And it is that feeling which AI companies aim to target, and to manipulate.

 

“The best way to sustain usage over time, whether number of minutes per session or sessions over time, is to prey on our deepest desires to be seen, to be validated, to be affirmed,” said Allison Lee, a former researcher for Meta’s Responsible AI Division, in a Reuters article [4].


What AI companies are doing is capitalizing on human nature. Humans want to share but are scared to be vulnerable. LLMs provide low stakes, instant gratification, and a facade of accuracy (therapist Chat can’t be wrong!). It is quite an appealing prospect. However, as previously mentioned, the more you rely on AI for connection, the less connection you receive. In theory, a person confides in AI because they are afraid to divulge to a real person, or because they feel they have no one to rely on; summed up, it is loneliness and isolation (which one could argue is a product of an ever-larger digital landscape and ever-shrinking real-world interaction). Isolation has also been shown to be harmful to the brain, increasing cognitive decline by 20% [5]. Isolation also dysregulates dopamine, causing difficulty in connecting with others [6]. And thus, the circle of disconnection connects and repeats.


There are 800 of you. I can guarantee, without math, that each of you requires some form of connection to flourish and to find happiness. If connection weren’t important, AI companies wouldn’t exploit it. What I ask of you is to attempt to become connected to real people, not to your dear, personal friend ChatGPT. Try to think for yourself and enjoy the process of “not knowing.” Do so for the health of your mind and the growth of your soul. Do not fall victim to the deleterious effects of AI usage.

 

Sources:


1. Malik, A. (2026, February 27). ChatGPT reaches 900M weekly active users. TechCrunch. https://techcrunch.com/2026/02/27/chatgpt-reaches-900m-weekly-active-users/

2. Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025, June 10). Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. arXiv. https://arxiv.org/abs/2506.08872

3. One in eight adolescents and young adults use AI chatbots for mental health advice. (2025, November 18). School of Public Health, Brown University. https://sph.brown.edu/news/2025-11-18/teens-ai-chatbots

4. Horwitz, J. (2025, August 14). Meta’s flirty AI chatbot invited a retiree to New York. Reuters. https://www.reuters.com/investigates/special-report/meta-ai-chatbot-death/

5. Ansari, R. (2026, February 3). Loneliness & Brain Health. Lone Star Neurology. https://lonestarneurology.net/others/can-loneliness-impact-brain-health-the-neurological-effects-of-social-isolation/

6. Zaraska, M. (2025, July 1). How Loneliness Reshapes the Brain. Quanta Magazine. https://www.quantamagazine.org/how-loneliness-reshapes-the-brain-20230228/

 

 
