As soon as you type the letter “C” into the search bar, “ChatGPT.com” appears. The constant use of this AI site is evident: whether it’s a teacher using AI as an aid to build lesson plans or a student telling the bot to “Write me an example essay to answer PIQ #1,” AI has become prevalent in our day-to-day lives. But when does using AI like ChatGPT become unhealthy? When does our reliance on it become deadly?
People tend to flock to ChatGPT as soon as they experience a little “brain fog” – the moment they feel clueless about an assignment or a decision. Instead of attempting to work the problem out, a new tab for ChatGPT opens up. This habit stems from how readily available ChatGPT is: its free tier, which runs GPT-5, offers basic web search, image generation, file and photo uploads, data analysis, and more. With ChatGPT this accessible and simple to use, it seems harmless to treat it as just another resource.

But what happens when you’re slumped over after a long day of school, with assignments piling up from class to class? ChatGPT becomes the crutch we reach for whenever we feel “lazy” or “exhausted.” People ask it, for example, to write complex academic essays – ones that demand emotional depth and character. According to an MIT study of ChatGPT, “Two English teachers who assessed the essays called them largely ‘soulless.’” This effortless pursuit of quick answers soon hardens into a pattern, where we lean on the site again and again to produce a presentable answer. At times, individuals even ask the bot to paraphrase its own output to sound more “real,” as if an actual person wrote it. It is a dystopian process in which ChatGPT becomes, in effect, another part of someone’s brain. As AI use keeps accelerating, free thinking may become a foreign concept to many.
ChatGPT is also being used as a “companion” – an alternative to therapy. When we are in despair, our first instinct is to reach out to other people for comfort or solace. Yet ChatGPT again offers itself as the readily available resource, not just for schoolwork but for mental health struggles as well. Individuals all over TikTok rave about using AI to solve their relationship problems or as another outlet for their sadness. To some, AI is a seemingly flawless replacement for actual therapy, and this is inherently dangerous. Real therapy is built on emotional intimacy: speaking one-on-one with someone who takes in your real feelings. ChatGPT, by contrast, is designed to be “likeable”: it says what you want to hear and avoids the hard truths a real therapist would offer, truths that are essential to the therapeutic process. ChatGPT is not a sentient being; it is a program built to compute answers, and it cannot provide genuine empathy or depth. In this sense, ChatGPT is being used as an easy outlet rather than a real source of support.
Some argue that ChatGPT, and AI in general, is necessary in this day and age; the rise of AI is inevitable as society continues to advance and change. It cannot be denied that ChatGPT provides ample information that can be genuinely beneficial in our day-to-day lives. But when individuals begin integrating it heavily into their lives, whether as a makeshift therapist or as an overused calculator, we should not regard AI as something simple and safe to depend on.
It is necessary for individuals to change their outlook on AI and perceive it as a handy resource rather than a life essential. Without this change of perspective, individuals are prone to losing their critical thinking skills, or even their capacity for independent thought. Our dependence should rest not on ChatGPT, but on ourselves and our capable minds.