How energy intensive are AI apps like ChatGPT?

Simon Spichak, Contributor

So far, there's little information about how much energy AI chatbots are using, and especially how much of it comes from renewable sources.

Large language models are powering many artificial intelligence (AI) chatbot apps like ChatGPT.

The models are trained on enormous amounts of data, which allows them to respond to prompts and questions by predicting the next word in a sequence.

Depending on who you ask, these new AI applications could solve almost any social or technological problem, take away millions of jobs, or lead to humanity’s extinction.

But these models require a lot of energy to train and to run.

“Given that AI models are going to be increasingly deployed in the real-world it is more important than ever to make sense of the ways in which they affect climate change and the environment,” Dr. Nestor Maslej, a research manager at the Stanford Institute for Human-Centered Artificial Intelligence, told The Weather Network.

Training is the first step in creating these models. It is also the most energy-intensive, requiring specialized computer chips called graphics processing units (GPUs) to perform the computations.



What is training? Basically, scientists provide an AI model with a huge set of data to study.

“It takes millions of hours to learn patterns from the data,” Dr. Sasha Luccioni, research scientist and climate lead at HuggingFace, told The Weather Network. “GPUs use energy to function essentially and also you have all of the overhead of data centers for example, the heating and cooling, the networking, the data transfer and storage.”


Comparison of carbon emissions between BLOOM and similar LLMs. Numbers in italics have been inferred based on data provided in the papers describing the models. (Estimating the Carbon Footprint of BLOOM study)

Luccioni and other scientists at HuggingFace attempted to estimate the energy usage of some of these large language models in a 2022 study that has not been peer reviewed. They estimated that OpenAI’s GPT-3 emitted more than 500 metric tons of carbon dioxide during training, the equivalent of driving one million miles in a gas-powered car. This calculation doesn’t even account for the e-waste or the energy required to manufacture the GPU components.
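The driving equivalence above can be sanity-checked with simple arithmetic. This is a minimal sketch, not part of the study; the per-mile emissions figure is an assumption based on the commonly cited U.S. EPA average of roughly 400 grams of CO2 per mile for a gasoline car.

```python
# Back-of-envelope check of the "one million miles" equivalence.
# Assumption (not from the article): an average gas-powered car emits
# roughly 400 g of CO2 per mile, a commonly cited U.S. EPA figure.
GRAMS_CO2_PER_MILE = 400

training_emissions_tonnes = 500  # GPT-3 training estimate cited above
training_emissions_grams = training_emissions_tonnes * 1_000_000  # tonnes -> grams

equivalent_miles = training_emissions_grams / GRAMS_CO2_PER_MILE
print(f"{equivalent_miles:,.0f} miles")  # 1,250,000 miles
```

Under that assumed per-mile figure, 500 tonnes works out to about 1.25 million miles, consistent with the "more than" one million miles quoted in the study.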

But for the current models behind popular applications like ChatGPT or Google Bard, it is almost impossible to calculate energy usage. It isn’t even publicly known how large the models are, or how many parameters they incorporate.

“Unless the companies provide information on the record it's all just guesswork,” Luccioni said.


The Weather Network reached out to Microsoft, Google, and OpenAI for comment. Microsoft declined to say how much energy was used to train Bing’s implementation of GPT-4, which may replace traditional Bing search, and did not specifically comment on how deploying these AI models will affect the company’s carbon footprint. A spokesperson for Google declined to answer similar questions. OpenAI did not respond to requests for comment.

“As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application,” a spokesperson from Microsoft said. “We are also continuing to invest in purchasing renewable energy and other efforts to meet our sustainability goals of being carbon negative, water positive and zero waste by 2030.”


Table from Artificial Intelligence Index Report 2023.

“The energy usage depends on a plethora of factors, including the size of the data that systems are being trained on as well as the energy efficiency of the datacenters in which these systems are being trained,” Maslej said. “If the observed trend of increasingly larger AI model deployments continues, then, all else being equal, the amount of energy required to train them is likely to continue increasing.”

Model sizes have grown from about 100 million parameters in 2018 to more than 500 billion in 2023.
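The scale of that growth is easy to miss in prose. A quick calculation, using only the two figures quoted above:

```python
# Growth in model size described above (figures from the article).
params_2018 = 100_000_000        # ~100 million parameters in 2018
params_2023 = 500_000_000_000    # ~500 billion parameters in 2023

growth_factor = params_2023 / params_2018
print(f"{growth_factor:,.0f}x larger in five years")  # 5,000x larger in five years
```

A five-thousand-fold increase in five years helps explain why, as Maslej notes, training energy is likely to keep climbing if the trend holds.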

Training isn’t the end of the story for these AI models. Once trained, a model must be deployed online and kept running for people to access, a phase known as inference. This is what allows you to ask ChatGPT to perform tasks or answer questions. Each time the model receives a new prompt, it uses additional energy to generate a response.


Many companies, including Google and Microsoft, are testing whether these AI models can replace search engines. According to reporting from Wired, an AI-powered search would use roughly five times more energy than a traditional search engine.

There’s also the question of the servers themselves, and the mix of energy sources used to power them. Microsoft operates servers in 28 countries and outlines individual sustainability pledges for each region. In Singapore, for example, 13 per cent of the energy used by its servers comes from solar power; in Arizona, 47 per cent comes from a mix of wind, solar, and hydro power. However, the company does not publish an overall measure of how efficiently energy is used across all of its servers.

OpenAI says its servers are located in the U.S., but provides no information about their energy usage. Google has data centers in 11 countries and reports that, in 2021, 66 per cent of the energy for its data centers came from carbon-free sources.

It isn’t clear exactly where the servers running these language models are located. So even as many parts of the world transition to renewable energy sources, it is as yet impossible to determine how much renewable energy is powering the AI revolution.

Thumbnail image: Server racks at OVHcloud datacenter in Beauharnois, Que. (The Weather Network)