Guest Blogs

Is Generative AI Bad for the Environment?

A computer scientist explains the carbon footprint of ChatGPT and its cousins

Generative AI takes a lot of computing power. How does that translate into society's future carbon footprint? Photo courtesy Brookhaven National Laboratory / CC BY-NC-ND / Flickr.

Generative AI is the hot new technology behind chatbots and image generators. But how hot is it making the planet?

As an AI researcher, I often worry about the energy costs of building artificial intelligence models. The more powerful the AI, the more energy it takes. What does the emergence of increasingly powerful generative AI models mean for society’s future carbon footprint?

“Generative” refers to the ability of an AI algorithm to produce complex data. The alternative is “discriminative” AI, which distinguishes between a fixed set of options and produces a single output, such as a decision to approve or deny a loan application.

Generative AI can create much more complex outputs, such as a sentence, a paragraph, an image or even a short video. It has long been used in applications like smart speakers to generate audio responses, or in autocomplete to suggest a search query. However, it only recently gained the ability to generate humanlike language and realistic photos.

Using more power than ever

The exact energy cost of a single AI model is difficult to estimate; it includes the energy used to manufacture the computing equipment, create the model and run the model in production. In 2019, researchers found that creating BERT, an AI language model with 110 million parameters, consumed as much energy as a round-trip transcontinental flight for one person. The number of parameters reflects the size of the model, and larger models are generally more capable. Researchers estimated that creating the much larger GPT-3, which has 175 billion parameters, consumed 1,287 megawatt-hours of electricity and generated 552 tons of carbon dioxide equivalent, about as much as 123 gasoline-powered passenger vehicles emit in a year. And that’s just the cost of getting the model ready to launch, before any consumers start using it.

Size is not the only predictor of carbon emissions. The open-access BLOOM model, developed by the BigScience project in France, is similar in size to GPT-3 but has a much lower carbon footprint, consuming 433 MWh of electricity and emitting 30 tons of CO2eq. A study by Google found that for a model of the same size, using a more efficient model architecture and processor and a greener data center can reduce the carbon footprint by 100 to 1,000 times.
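A rough sanity check of these figures (my own arithmetic, not from the study) shows how much of the difference comes down to the electricity grid: dividing reported emissions by reported energy use gives the implied carbon intensity of each training run, assuming emissions come almost entirely from electricity.

```python
# Back-of-the-envelope: grid carbon intensity (kg CO2eq per kWh)
# implied by the training figures quoted above. Assumes emissions
# come almost entirely from electricity use.

def implied_intensity(co2_tons: float, energy_mwh: float) -> float:
    """kg CO2eq per kWh implied by a training run's reported totals."""
    return (co2_tons * 1000) / (energy_mwh * 1000)

gpt3 = implied_intensity(co2_tons=552, energy_mwh=1287)  # ~0.43 kg/kWh
bloom = implied_intensity(co2_tons=30, energy_mwh=433)   # ~0.07 kg/kWh

print(f"GPT-3:  {gpt3:.2f} kg CO2eq/kWh")
print(f"BLOOM: {bloom:.2f} kg CO2eq/kWh")
print(f"BLOOM's grid was roughly {gpt3 / bloom:.0f}x cleaner")
```

The implied intensity for BLOOM is consistent with France's largely nuclear-powered grid, roughly six times cleaner per kilowatt-hour than the grid implied by the GPT-3 figures.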

Larger models do use more energy during their deployment. There is limited data on the carbon footprint of a single generative AI query, but some industry figures estimate it to be four to five times higher than that of a search engine query. As chatbots and image generators become more popular, and as Google and Microsoft incorporate AI language models into their search engines, the number of queries they receive each day could grow exponentially.
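To get a feel for the scale, consider a rough sketch. The per-search energy figure below is Google's often-cited 2009 estimate of 0.0003 kWh per query; the 4.5x multiplier is the midpoint of the "four to five times" industry estimate above, and the daily query volume is a purely illustrative assumption.

```python
# Illustrative scale estimate for AI-query energy use.
# All three inputs are assumptions, not figures from the article.
SEARCH_KWH = 0.0003    # Google's 2009 per-search energy estimate
AI_MULTIPLIER = 4.5    # midpoint of the "four to five times" figure
QUERIES_PER_DAY = 1e9  # hypothetical daily query volume

ai_kwh_per_query = SEARCH_KWH * AI_MULTIPLIER
daily_mwh = ai_kwh_per_query * QUERIES_PER_DAY / 1000
print(f"{daily_mwh:,.0f} MWh per day")
```

At these assumptions the total comes to about 1,350 MWh per day, on the order of GPT-3's entire training cost, every single day.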

AI chatbots, search engines and image generators are rapidly going mainstream, adding to AI’s carbon footprint. AP Photo/Steve Helber

AI bots for search

A few years ago, not many people outside of research labs were using models like BERT or GPT. That changed on Nov. 30, 2022, when OpenAI released ChatGPT. According to the latest available data, ChatGPT had over 1.5 billion visits in March 2023. Microsoft incorporated ChatGPT into its search engine, Bing, and made it available to everyone on May 4, 2023. If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up. But AI assistants have many more uses than just search, such as writing documents, solving math problems and creating marketing campaigns.

Another problem is that AI models need to be continually updated. For example, ChatGPT was only trained on data from up to 2021, so it does not know about anything that happened since then. The carbon footprint of creating ChatGPT isn’t public information, but it is likely much higher than that of GPT-3. If it had to be recreated on a regular basis to update its knowledge, the energy costs would grow even larger.

One upside is that asking a chatbot can be a more direct way to get information than using a search engine. Instead of getting a page full of links, you get a direct answer as you would from a human, assuming issues of accuracy are mitigated. Getting to the information quicker could potentially offset the increased energy use compared to a search engine.

Ways forward

The future is hard to predict, but large generative AI models are here to stay, and people will probably increasingly turn to them for information. For example, if a student needs help solving a math problem now, they ask a tutor or a friend, or consult a textbook. In the future, they will probably ask a chatbot. The same goes for other expert knowledge such as legal advice or medical expertise.

While a single large AI model is not going to ruin the environment, if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, the energy use could become an issue. More research is needed to make generative AI more efficient. The good news is that AI can run on renewable energy. By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared to using a grid dominated by fossil fuels.
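The scheduling idea can be sketched simply: given an hourly forecast of grid carbon intensity, run a deferrable job in the cleanest window. The intensity values below are made up for illustration.

```python
# Carbon-aware scheduling sketch: pick the hour with the lowest
# forecast grid carbon intensity (g CO2eq/kWh) for a deferrable job.
# Intensity values are hypothetical.
forecast = {
    0: 450,   # overnight: fossil baseload
    4: 430,
    8: 300,   # morning: solar ramping up
    12: 120,  # midday: solar peak
    16: 180,
    20: 410,  # evening: fossil peak
}

best_hour = min(forecast, key=forecast.get)
print(f"Schedule job at hour {best_hour} "
      f"({forecast[best_hour]} g CO2eq/kWh)")
```

Real systems use live grid data rather than a fixed table, but the principle is the same: training and batch inference jobs are flexible loads that can chase clean electricity in time or in location.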

Finally, societal pressure may be helpful to encourage companies and research labs to publish the carbon footprints of their AI models, as some already do. In the future, perhaps consumers could even use this information to choose a “greener” chatbot.

Kate Saenko is an associate professor of computer science at Boston University. This article was originally published at The Conversation.


  1. jollygreenshortguy | | #1

    Somehow humanity has managed to survive for millennia without ChatGPT and its associated carbon footprint. One wonders how.

  2. vap0rtranz | | #2

    As an ex-technologist who left the tech industry after 20+ years, I find Kate's article interesting. I'll assume she's aware of various caveats and nuances, but I'll point them out:

    >By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared to using a grid dominated by fossil fuels.

    Time-shared computing has been around for a while. If I remember right, it was first implemented by IBM, Honeywell, and other mainframe designers way back in the '50s. Fast forward, and cloud computing "rebooted" the time-sharing idea. AWS (Amazon Web Services) has had dynamically scalable compute (EC2) with on-demand consumption for some time, and though it's based on price (Spot Pricing), I don't see any reason why it couldn't be switched to energy consumption. (Dynamic power consumption monitoring & control, though newish in home building, has been around for a while in server-grade computing, and I don't mean just Intel's voltage regulators and hibernation but also remote command & control of loads within the computer by time of day, users, etc.) So the fundamental technology has existed for a while, along with the tooling needed to do it. What I'm saying is: the technology isn't the problem. User behavior is the problem. Will a 21st-century user be OK waiting for a response while computers are hibernating on dark and windless days? In today's world of instant gratification, I'm not optimistic. Besides the inconvenience, some compute services are critical. This idea would face something similar to the home-energy conundrum of asking my Floridian in-laws to delay turning on their AC. Actively cooling a building used to be thought of as a luxury, but now it's considered a matter of public health & safety, like avoiding heat stroke. AI is currently considered a luxury, or a toy depending on your view, but it's probably going to become integrated into critical compute services.

    >There is limited data on the carbon footprint of a single generative AI query, but some industry figures estimate it to be four to five times higher than that of a search engine query.

    There are underlying, systemic challenges in computing that compound a futurist view of carbon footprints. I encourage Kate and others to read about the ongoing carbon footprint of spam e-mail. What the public and some technologists outside of core computer infrastructure haven't been told is: techies never got rid of spam. What happened is that spam was shoved under the rug, so to say, into a different box in people's e-mail through smarter detection, but electricity continues to be consumed by spam. Technology, like Bayesian statistics, was applied to better detect spam, but attempts to prevent or squash spam altogether have failed. (There are means to eliminate spam, but those were seen as burdening the user and, at the risk of going philosophical, fundamentally stifled the ideals of free communication.) I asked BingAI to estimate the carbon footprint of spam: 9.7 billion metric tons of carbon per year. That's a bigger carbon footprint than Costa Rica. And instead of being useful, it's wasteful. I call this out to raise awareness of technical debt, or technical baggage: technological progress is hyperfocused on the future and avoids talking about the past. The past is a great teacher of lessons learned. Instead of continuing to run FASTER towards MORE technology, how about the idea that "less is more"? On-demand AI could be a start, but it's still more tech. I think we need another shift in users' views about tech. That's one reason I got out of tech.



