
Recently, Drexel University announced it had signed a deal with OpenAI to grant students access to ChatGPT Edu – a version that features higher usage limits and stronger privacy protections.
The university justified the decision by saying that Drexel “has long been a leader in technological innovation” and is therefore doing a lot of research on artificial intelligence.
As if enough students were not already using ChatGPT and giving it cringe nicknames like “Chatty,” now Drexel has to come in and encourage people to use an awful technology whose many disadvantages far outweigh its few modest benefits.
By purchasing ChatGPT Edu licenses in bulk, Drexel pays significantly less per person than if every student bought a subscription individually – an estimated $2.50 per student per month. Most of that money goes to OpenAI, which uses it to expand its computing infrastructure by building data centers and funding model training. Both of these have severe environmental and ethical consequences, as I will discuss in this article.
Had Drexel not signed this deal, only a handful of students would have bought the premium version themselves, sending far less money to OpenAI; instead, the university is funneling funds into, and incentivizing, problematic behavior.
First, it is literally destroying the environment. AI data centers run hot and need enormous amounts of water for cooling. One 100-word prompt to ChatGPT is estimated to consume around 500 milliliters of water – roughly one bottle wasted per search. A large data center can go through about five million gallons per day, equal to the water usage of a small town.
The problem is so severe that it is speeding up the depletion of the Earth’s natural freshwater reserves. In that regard alone, AI is terrible: it is ruining the planet by taking away drinking water.
Second, it steals from artists without their consent and without compensating them – for example, by downloading digital libraries and training AI models on those books. Authors pour blood, sweat, and tears into their intellectual property, and it is not fair that other people can mimic their writing style and claim credit for it just by prompting an AI. Because AI can generate stories and even entire novels, real human authors may soon have to compete in a market flooded with imitations of their own work. Far too many AI-written books have already been published, and it is sad to see the labor humans put into writing being stolen by people with nothing but ChatGPT and a dream.
The same thing is happening with visual art: entire paintings are fed into the technology, which then spits out similar images when someone types a prompt. Ethically speaking, I find this quite sad, because it steals from artists and makes the entire field precarious. AI can churn out thousands, if not millions, of books every day, while humans take far longer, so it would be easy to flood the market with AI slop. That would be awful, because AI only knows the information it has been fed and is incapable of original thought. But since it is so much faster and cheaper than human writers, publishing companies would of course be quite interested in not having to pay authors.
Just by partnering with OpenAI, Drexel is picking a side, and that is very disappointing from a university that hosts so many artistic and creative degrees. Artists’ rights should outweigh the peer pressure to jump on the AI bandwagon. How can Drexel justify charging students thousands of dollars a year in tuition to study creative arts such as creative writing, songwriting, and animation, while at the same time pouring thousands of dollars into a machine that could replace those very people in a few years? And not because the machine is better, but because it is cheaper for production executives to use ChatGPT than to pay a screenwriter.
This does not just apply to books, but also to essays and papers for school. The words and sentences AI writes for you are stolen from academic writers, who receive neither compensation nor a request for consent.
Now you might say this is a good thing because people who cannot draw or write finally have the opportunity to bring their visions to life, but writing and drawing are skills that can be learned. No creative came into this world perfect at their craft; they put in time and effort to become good. And especially nowadays, there are countless resources to help you learn – YouTube videos and books covering every aspect of creative work. So, if you have an idea for an artwork or a story, learn how to write or draw it yourself, or hire human creatives to do it for you. There are plenty of ways to do that without using AI.
Additionally, while research on the topic is still limited, early studies suggest that overreliance on AI may cause cognitive decline because people are not using their brains as much.
In the middle of both a literacy and a media literacy crisis, this is worrying, because people are unlearning how to interpret sources, for example – a crucial skill both in a university context and when reading the news, such as being able to detect a newspaper’s political leaning from an article.
Artificial intelligence is also dangerous in an academic context because it is not yet very good at citing correct sources. Plenty of academic articles have already been published with made-up sources, which delegitimizes research across the board – especially when those fabricated papers are then ingested by AI in turn, flooding platforms like Google Scholar with articles that lack proper scientific backing. Even if peer review could eventually fix this problem, is this really a world we want to live in, where every single source has to be checked because it might be made up?
I get that using artificial intelligence is faster and takes less effort than writing a paper or an essay yourself, but the fact that it destroys the planet, steals from artists, and deepens the media illiteracy crisis definitely outweighs any of the positives for me. I do not think people should use it, and I am very disappointed in Drexel University for encouraging students to do so. You are supposed to learn how to handle sources yourself, not have a machine do it for you. You are supposed to learn how to write scientific papers, not have a machine do it for you. Even if you write your paper the night before, strung out on caffeine with no sleep, that is still so much better than ruining the environment and unlearning basic skills for your own convenience.
