By: Charlotte Fox
College students and staff today are poised on the brink of a critical technological frontier. Since ChatGPT’s initial release in 2022, all students have had to adjust to a new academic landscape, whether or not they have personally used the program. At Trinity, college staff have scrambled to keep up with the rollout of ChatGPT and the many dubious AI-detection sites that have followed in its wake. In the last two and a half years, students have received a confusing mix of threats, ethical talks, cautionary jokes, and supportive messages concerning their own use of the platform. Some students have had to attach statements to their essays acknowledging that ChatGPT may be used when cited; for others, the statements warn: “Just as AI tools are evolving, so too are AI-detection tools. Turnitin has announced new capabilities to appear by the end of this year. Improper use of ChatGPT now could come back to haunt you later.”
The growing consensus seems to be that lecturers should attempt to work with ChatGPT, rather than against it. More and more classes have presented the site as a helpful tool, wagged their fingers once more about the issue of plagiarism, and simply moved on. Many lecturers have been instructed to “ChatGPT-proof” their assignments by changing up their questions or requesting writing styles that are harder for the chatbot to replicate.
With all of the warnings, ethical debates, and semi-conclusions Trinity has reached on the regulations and morality of AI platforms like ChatGPT, it is shocking how often the most lethal component is left out of the conversation completely. The environmental cost of using models like OpenAI’s ChatGPT is under-discussed in our classrooms. Alongside the AI-centred plagiarism warnings that accompany almost every module’s set of guidelines, should Trinity’s lecturers be just as responsible for reminding students of the environmental risks that come with using AI?
The uncomfortable fact is that artificial intelligence is not artificial at all: it relies on a tangible substructure with extreme planetary consequences. The argument could be made that colleges have as much responsibility to keep students informed of these consequences as they do of the academic ones.
The Material Basis of AI
In a step towards mapping out what exactly ChatGPT’s role should be in the scholarly landscape, we must outline one of its more concealed features: AI’s material basis, and its environmental cost. OpenAI itself does not disclose the number of servers and computers needed for its models to run, the number of data centres it uses, or any specific information about the amount of energy those data centres consume. However ambiguous ChatGPT’s exact ecological impact may be, it cannot be separated from the impact of AI models in general. Machine learning models require significant electricity during their training phase, both to power servers and to cool data centres.
Beyond training, it is known that AI systems like ChatGPT need substantial amounts of energy for daily operations. Data scientist Kasper Groes Ludvigsen has estimated that training GPT-3, the model originally behind ChatGPT, produced approximately 552 tCO₂e (tonnes of CO₂-equivalent greenhouse gas emissions), the equivalent of the greenhouse gas emissions from about two million kilometres driven by the average car. And that is before a single query is answered: given the high volume of user interactions, the model could generate up to four tonnes of CO₂ (carbon dioxide) per day in operation. Since the internet largely depends on fossil fuel-generated electricity, there is a high probability that AI operations are fuelled by non-renewable sources, raising serious questions about the sustainability of frequent ChatGPT use.
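For readers curious where such an equivalence comes from, here is a minimal back-of-envelope sketch. The 552 tCO₂e figure is the training estimate above; the average-car emission factor of roughly 0.25 kgCO₂e per kilometre is an illustrative assumption, not a number taken from Ludvigsen’s analysis:

```python
# Back-of-envelope check on the "two million kilometres" equivalence.
# ASSUMPTIONS: 552 tCO2e is the widely cited estimate for training GPT-3;
# the car emission factor (~0.25 kg CO2e per km) is an illustrative round
# number, not a figure from Ludvigsen's analysis.

TRAINING_EMISSIONS_TONNES = 552   # estimated CO2e from training GPT-3
CAR_KG_CO2E_PER_KM = 0.25         # assumed average passenger car

equivalent_km = (TRAINING_EMISSIONS_TONNES * 1000) / CAR_KG_CO2E_PER_KM
print(f"Equivalent to roughly {equivalent_km:,.0f} km of driving")
# -> Equivalent to roughly 2,208,000 km of driving
```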
The ‘Birthplaces of AI’ are described by political scholar Kate Crawford as the places where the minerals needed to build and power computerised systems are found. One example is the Silver Peak lithium mine in the US state of Nevada: an expansive underground lake somewhere between Death Valley and Yosemite, swallowed by desert. Lithium is an essential component of rechargeable batteries, and an even more essential element of the ecosystems it is drawn from. Once made, these batteries don’t live forever. They die, more lithium is mined, and the battery carcasses pile up. Tesla, for example, is responsible for half the planet’s total lithium consumption. Smart devices, with their short lifespans, see a similar fate as they degrade over the course of just a few years and are then tossed into vast technological dumping grounds in countries such as Pakistan and Ghana.
More ‘Birthplaces of AI’ include lithium mines in Bolivia, tin mines in Indonesia, and further mineral mines in Congo, Australia, and beyond. Rare earth minerals are needed for everything from earphone speakers and camera lenses to GPS satellites and military drones. Not only is the mining of these minerals depleting our planetary resource bank, it is also a source of great local conflict and global violence. The world’s top financial powers vie for control over the zones that contain these precious minerals, bringing war and devastation to the target and surrounding areas. In Congo, mineral mining funds the militias that have kept the region at war for years, and the labour practices in the mines themselves have been described as modern slavery. An artificially created ‘black lake’ in Inner Mongolia is filled with toxic chemicals that have poisoned and polluted the surrounding area: vast quantities of ammonium, acidic water, and radioactive residue. That these mineral mines and toxic lakes exist in deserted, remote locations contributes to the idea that the cloud is an intangible, artificial creation. Distancing the data centres from populous areas furthers public ignorance about the real, material basis of artificial intelligence.
As we know, AI only profits some because the cost is shunted onto others, in both present and future generations. Think of the data centres moving to Ireland en masse (there are currently 95 and counting), and the tech-giant offices that have cropped up in the Dublin Docklands, driving rent prices through the ceiling. Or, similarly, the ‘titans of tech’ who have infiltrated San Francisco and the wider Silicon Valley, pushing thousands of people out of their homes and leaving them without access to basic services, a situation the United Nations has condemned as a massive human rights violation.
An AI system cannot function without a battery, and so it cannot function without the mining of precious minerals, sending the planet into further degradation with each extraction. So much of modern existence has been crystallised into data and backed up to ‘the cloud’ without much consideration for the material cost of such a process. Kate Crawford explained in an essay on AI’s environmental impact: “The cloud is the backbone of the artificial intelligence industry, and it’s made of rocks and lithium brine and crude oil.” In their article “Dark data is killing the planet – we need digital decarbonisation”, Professors Jackson and Hodgkinson explain the notion of ‘dark data’: digital photos, files, and recordings used once or twice, then forgotten in a vast cyber-abyss. The data we pile up by the day has a corporeal counterpart, requiring ever more energy to store. They link all of this digital litter to its large carbon footprint, explaining: “Even data that is stored and never used again takes up space on servers – typically huge banks of computers in warehouses. Those computers and those warehouses all use lots of electricity.”
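To give a rough sense of the scale involved, the sketch below estimates the yearly footprint of a personal archive of forgotten files. The rate of about 2 kgCO₂e per gigabyte of cloud storage per year is a commonly quoted but contested assumption, not a figure from Jackson and Hodgkinson, and the example volume is likewise illustrative:

```python
# Illustrative estimate of the footprint of storing 'dark data'.
# ASSUMPTION: ~2 kg CO2e per gigabyte-year of cloud storage, a commonly
# quoted but contested rate; real values vary widely between data centres
# and energy grids.

KG_CO2E_PER_GB_YEAR = 2.0  # assumed storage emission rate

def yearly_storage_footprint_kg(gigabytes: float) -> float:
    """Rough annual CO2e (kg) of keeping this much data on servers."""
    return gigabytes * KG_CO2E_PER_GB_YEAR

# e.g. 500 GB of once-used photos, files, and recordings
print(f"{yearly_storage_footprint_kg(500):,.0f} kg CO2e per year")  # ~1,000 kg
```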
All over the world, forests are bulldozed, miners are worked to death, and animals are driven to extinction, all to feed the bottomless appetite of AI’s expansive supply chain. ‘The cloud’, despite its atmospheric namesake, relies not only on earthly minerals but on copious amounts of fossil fuels to keep itself running. AI, and indeed the entire tech industry, maintains a public image of environmentalism and hopeful, tech-based climate solutions. In reality, these computational systems consume electrical energy by the bucketload and spit carbon emissions back out. Exactly how much energy an AI system consumes is not public information; the corporations involved are the close-fisted gatekeepers of this data. Meanwhile, Microsoft, Google, and Amazon all license their AI models, engineering labour forces, and infrastructure to fossil fuel firms.
Finally, there is the logistical factor. Global tech commerce comes at a cost of its own: the cargo shipping fuel, the toxic waste in the oceans, and the low-paid workers who do the dirty work. All of this adds to the earthly, physical trail of destruction that platforms like ChatGPT leave behind.
The Ethics and Ecology of AI-Powered Learning
Students are increasingly given the choice to use AI platforms for their assignments, which is undoubtedly a complicated and difficult situation for college faculty and administrators to navigate. But students should be given the opportunity to make informed choices about using AI, and that includes knowing the environmental costs of generative AI’s many systems and models. Trinity’s official website includes lengthy chapters for both staff and students on the use of AI and its ‘limitations and capabilities’, including many links to further research and guidelines on the topic. Yet not a single one of these pages contains any information on the environmental, or material, side of the equation. Trinity, like many institutions, has grappled with the ethical and academic implications of AI, yet the environmental consequences of generative AI remain largely absent from the conversation: an omission that must be addressed if students are to make truly informed choices about their use of these powerful new technologies.
