When you think of carbon emissions, you probably don’t think of the digital sphere. But you might be surprised to learn just how much of the energy you consume on a daily basis (and therefore your monthly energy cost) comes from your virtual actions. For instance, a single email releases about 0.3g of carbon into the atmosphere.
With more “high-powered” platforms like artificial intelligence (AI) models and automation tools spreading across the Web (and being used by major corporations like Amazon, Apple, and Meta), there’s a growing carbon footprint to contend with.
Read on to learn more about AI and its association with power consumption in the U.S. today.
- The computing power required for AI and machine learning is massive and steadily increasing. Training one AI model can use as much electricity as 100 US homes consume in an entire year.
- AI is also costly: running a model can cost a company as much per month as it pays its human employees.
- Some data centers consume more electricity than some entire countries.
- AI systems and their energy use have a real environmental impact and contribute to climate change. Today, “the cloud” produces more carbon emissions than the entire airline industry.
General AI Energy Consumption Statistics
- The computing power required for AI to operate is steadily increasing. For instance, the computing power used to train models like AlphaZero has doubled roughly every 3.4 months, increasing 300,000-fold between 2012 and 2018 alone.
- Recently, researchers in San Francisco trained an AI system to solve a Rubik’s cube. This effort likely consumed about 2.8 gigawatt-hours of electricity, roughly the output of three nuclear power plants running for an hour (Wired).
- GPUs (graphics processing units), the fan-heater-like chips used to mine crypto, also power AI. A high-end GPU can draw around 500 watts of electricity, nearly all of which is released as heat (meaning AI consumption contributes directly to CO2 emissions).
- Training a single AI model or chatbot can use more electricity than 100 U.S. homes use in a whole year. Training ChatGPT’s underlying model alone used 1.287 gigawatt-hours of energy, roughly equivalent to the annual usage of 120 American homes.
- Google researchers estimate that AI accounts for about 10-15% of the company’s total electricity use (Bloomberg).
- The upfront energy cost of training an AI model accounts for only about 40% of the electricity it will consume over its lifetime; the rest goes to everyday use. Newer AI models also have far more parameters: the model behind ChatGPT has 175 billion, compared to its predecessor’s 1.5 billion (Bloomberg).
- A 2019 study found that creating a generative AI model (“BERT”) with 110 million parameters consumed the same amount of energy as one ticket on a round-trip transcontinental flight (Scientific American).
- Creating a larger AI model, like GPT-3 (which has 175 billion parameters), was equivalent to driving 123 gasoline-powered vehicles for an entire year in terms of energy consumed (1,287 megawatt-hours) and carbon emissions (552 tons). This accounts for training the model alone, not consumer use once it’s up and running (Scientific American).
- Some companies are working to create smaller AI models for this reason. Meta’s LLaMA, for example, is a large language model roughly seven times smaller than OpenAI’s (Scientific American).
- However, the size of an AI model isn’t the only predictor of energy efficiency. A study conducted by Google found that using a greener data center, more efficient model architecture, and a more efficient processor can reduce an AI model’s carbon footprint by 10 to 100 times.
- The BLOOM model is similar in size to GPT-3, but its creation consumed just 433 megawatt-hours of electricity and generated only 30 tons of carbon dioxide (Scientific American).
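The figures above can be sanity-checked with quick arithmetic. The sketch below assumes an average U.S. home uses roughly 10,700 kWh of electricity per year (a commonly cited EIA estimate, not stated in the statistics above); all other numbers come directly from the bullets above.

```python
# Sanity-check of the training-energy figures above.
# Assumption: an average U.S. home uses ~10,700 kWh/year (EIA estimate).

AVG_HOME_KWH_PER_YEAR = 10_700

# GPT-3-scale training run: 1.287 GWh = 1,287,000 kWh.
gpt3_training_kwh = 1_287_000
homes_equivalent = gpt3_training_kwh / AVG_HOME_KWH_PER_YEAR
print(f"GPT-3 training ~= {homes_equivalent:.0f} U.S. homes for a year")

# BLOOM vs GPT-3: similar model size, very different footprint.
# Implied carbon intensity (tons CO2 per MWh) depends on the grid mix.
gpt3_tons, gpt3_mwh = 552, 1_287
bloom_tons, bloom_mwh = 30, 433
print(f"GPT-3 implied grid intensity: {gpt3_tons / gpt3_mwh:.2f} t CO2/MWh")
print(f"BLOOM implied grid intensity: {bloom_tons / bloom_mwh:.2f} t CO2/MWh")
```

The GPT-3 figure works out to about 120 homes, matching the statistic above. The comparison also shows why BLOOM’s emissions were roughly 18 times lower despite using only about a third of the energy: its grid was roughly six times cleaner per megawatt-hour.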
Costs of AI Energy Consumption
- Nvidia, the market leader, sells its primary data center workhorse chip (the GPU that powers AI training) for about $10,000 apiece.
- It’s estimated that training an AI model at the scope of GPT-3 could cost as much as $4 million (CNBC).
- Training an even larger, more advanced language model could cost more than twice as much (CNBC).
- Training Meta’s largest language model, LLaMA, which has just 65 billion parameters (compared to current GPT models’ 175 billion), cost $2.4 million over the course of one million GPU (graphics processing unit) hours (CNBC).
- LLaMA used 2,048 Nvidia A100 GPUs to train on 1.4 trillion tokens (with 1,000 tokens being roughly equivalent to 750 words). This took a total of 21 days.
- “Retraining” AI models and updating the software can be costly as well, running as much as $10,000. This is why AI models often lack the most up-to-date “knowledge”; for instance, GPT-3 is not aware of events past 2021 (CNBC).
- “AI employees” can often cost just as much as human employees: the company Latitude spent nearly $200,000 per month to power its model AI Dungeon at its 2021 peak, when it was handling millions of daily requests (CNBC).
- Inference can be extremely costly. ChatGPT had 100 million monthly active users in January 2023, and processing their millions of prompts is estimated to have cost OpenAI $40 million for that month alone (CNBC).
- Some models are even used billions of times a day, like Microsoft’s Bing AI chatbot, which serves all Bing users daily. It’s estimated the Bing AI chatbot requires at least $4 billion in infrastructure to continue to operate at this pace (CNBC).
- The startup Latitude’s language model was trained by OpenAI, so the smaller company was not responsible for training costs. However, inference costs remained steep: about half a cent for each of the millions of requests submitted per day (CNBC).
- In February 2023, OpenAI lowered the price for companies to access its GPT models. Output now costs one-fifth of a cent per roughly 750 words (CNBC).
Data Centers and AI Energy Consumption
- Globally, data centers consume about 200 terawatt-hours of electricity annually (more than some entire countries consume). As of 2021, there were about 8,000 data centers worldwide.
- Data centers account for about two percent of total U.S. electricity usage, according to the Department of Energy.
- Some estimates show that by 2030, computing and communications technology will consume 8-21% of the world’s electricity, with data centers accounting for about one-third of that total.
- Some companies offering “cloud” services have pledged or taken steps towards carbon neutrality. Google’s data centers are currently carbon-neutral, Microsoft plans to be “carbon-negative” by 2030, and OpenAI signed a deal to use Microsoft’s cloud service.
- Most data centers operate at or near 100% capacity, meaning a 20-megawatt facility would be drawing close to 20 megawatts of power fairly consistently (UPenn).
- New data creation reached 59 zettabytes globally in 2020 alone. A 146-fold increase in data creation is likely to take place between 2010 and 2025 (United States International Trade Commission).
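A rough scale check ties the data-center figures above together. The sketch below assumes total U.S. electricity consumption of about 4,000 TWh per year (an EIA-order-of-magnitude figure, not stated above); the rest comes from the bullets.

```python
# Rough scale check for the data-center figures above.
# Assumption: total U.S. electricity consumption is ~4,000 TWh/year (EIA).

US_TOTAL_TWH = 4_000
us_datacenter_twh = 0.02 * US_TOTAL_TWH    # ~2% of U.S. usage, per DOE
print(f"U.S. data centers: ~{us_datacenter_twh:.0f} TWh/year")

# Global data centers: ~200 TWh spread across ~8,000 facilities.
global_twh, facilities = 200, 8_000
avg_gwh = global_twh * 1_000 / facilities  # TWh -> GWh per facility
print(f"Average facility: ~{avg_gwh:.0f} GWh/year")

# A 20 MW facility running flat out (the near-100%-capacity point above):
mw = 20
hours_per_year = 8_760
print(f"20 MW facility: ~{mw * hours_per_year / 1_000:.0f} GWh/year")
```

The 2% DOE figure implies roughly 80 TWh per year for U.S. data centers, and a constantly loaded 20 MW facility burns about 175 GWh per year, several times the ~25 GWh global per-facility average (facility sizes vary enormously, so the average blends hyperscale sites with small server rooms).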
AI Consumption Trends and Environmental Impact
- The impact of AI on the environment is real: training a single AI model can emit as much as 626,000 pounds of carbon dioxide equivalent, nearly quintuple the lifetime emissions of an average American car.
- The cloud computing industry’s energy demand is staggering. According to MIT, “the cloud” now has a greater carbon footprint than the entire airline industry.
- To keep emissions under control, researchers estimate each self-driving car will need to use less than 1.2 kilowatts of energy for computing.
- In 2018, computers consumed about 1-2% of the global energy supply. In 2020, computers were already consuming 4-6% of the global energy supply thanks to innovations in AI. By 2030, this figure is expected to increase to anywhere from 8-21%, even in the face of the current energy crisis (UPenn).
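The “nearly quintuple” comparison above can be verified directly, assuming the underlying study used the commonly cited figure of about 126,000 lbs of CO2-equivalent for an average American car’s lifetime emissions (that baseline is an assumption, not stated above):

```python
# Checking the "nearly quintuple" comparison above.
# Assumption: average U.S. car lifetime emissions (including fuel and
# manufacturing) are ~126,000 lbs CO2-equivalent.

model_lbs = 626_000       # worst-case training emissions cited above
car_lifetime_lbs = 126_000
ratio = model_lbs / car_lifetime_lbs
print(f"Training emits {ratio:.1f}x a car's lifetime CO2e")

# The self-driving-car budget above: 1.2 kW of computing power means
# 1.2 kWh of energy per hour of driving, on top of propulsion.
kw_budget = 1.2
print(f"Onboard computing budget: {kw_budget} kWh per hour of driving")
```

The ratio comes out just under five, matching the “nearly quintuple” phrasing above.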