A new report by Stanford University researchers finds that merely training the model behind the popular artificial intelligence (AI) chatbot ChatGPT released emissions equivalent to those of eight cars over the course of their lifetimes, adding another layer of scrutiny, this one concerning the future of humanity on Earth, to technocrats' promised AI revolution.
According to Stanford’s Artificial Intelligence Index, training GPT-3 — the model developed by OpenAI that powers ChatGPT — released the equivalent of 502 tons of carbon dioxide last year and consumed close to 1,300 megawatt-hours of electricity.
As the researchers calculated, this is the equivalent of the lifetime emissions of eight cars — or the yearly emissions of 109 cars — and enough energy to power an average U.S. home for over 120 years. Of the four models that the report scrutinized, GPT-3 released the most emissions and required the most power.
Put in perspective, this is a drop in the bucket compared to yearly emissions across all sectors in the U.S. But it is also potentially just a drop in the bucket compared to the overall impact of large language models and AI as a whole.
The data on GPT-3 represents the emissions of only one of many language models, and doesn’t take into account the emissions associated with manufacturing the computers used for the model’s training, emissions from OpenAI’s newly released GPT-4, or, crucially, the tech sector’s plans to massively scale up the use of language models and other AI technologies.
Like much else in emerging tech, it’s difficult to tell how big an impact such technology will eventually have on the environment and society at large; Big Tech, and the AI industry in particular, often makes huge fanfare out of technology’s ability to solve humanity’s big problems and then ultimately underdelivers. Manufactured excitement around technological solutions in the climate sector alone, such as carbon capture, is often generated by people or groups — the fossil fuel industry, for instance — with underlying motivations to capture more funding.
But researchers have raised alarms about AI’s polluting potential, saying that any purported benefits could be massively outweighed by climate harms; it could be dangerous to scale up such technology at a time when countries are supposed to be massively drawing down emissions. In its article on the Stanford report, Gizmodo likened the emissions potential of running and developing AI models to the emissions from cryptocurrency mining; mining Bitcoin consumes more energy each year than the entire country of Norway.
Likewise, the carbon footprint of GPT-3 is not dissimilar to that of other AI models, and the development of these technologies is consuming a large amount of electricity. Researchers have estimated that Google, for instance, uses as much electricity yearly in developing its AI as every home in a city the size of Atlanta. And these calculations are largely estimates, because companies are extremely secretive about their processes, making it difficult to discern their precise carbon emissions.
What is also difficult to account for is the amount of energy used to develop models that end up failing to have any sizable impact. For instance, Google’s chatbot Bard, which it developed in response to ChatGPT, is often laughably bad at doing common arithmetic or answering basic questions about history. Perhaps Bard and Google’s AI will play a huge role in science, technology and labor in the future, as many technocrats have promised. Or perhaps they won’t.
If the tech sector and its funders are to be believed, AI is slated to take over the future; technocrats say that technology like AI could usher in a future of global equity and even solve problems like the climate crisis.
But a skeptic might look to the world of cryptocurrencies, another obsession of Big Tech, for a thorough debunking of these claims. Crypto enthusiasts, seeking new buyers to fill the bottom of the pyramid and free up capital to cash out, sell crypto as the great, decentralized equalizer, free from institutional control. In reality, crypto does not exist in a vacuum but rather in a hyper-capitalist world, and much of its structure, less than 15 years after the establishment of Bitcoin, now resembles the highly stratified society it exists within.
AI could be used in much the same way, while also contributing to the climate crisis — while the promise of AI is questionable, the carbon impacts are very real. With very few regulatory safeguards, machine learning can be and has been shown to reflect the racist biases of the real world and of the people developing it; AI-powered facial recognition is already being used on a wide scale by police across the globe to repress protests and curtail other civil liberties.