The global AI industry has grown at a remarkable pace over just a few years. Technological capabilities have advanced rapidly, and the applications of the technology have broadened with them. Every major tech company is now investing heavily in AI, allocating billions of dollars to develop and expand the AI infrastructure within their organisations.
According to the eighth edition of Stanford’s AI Index report, more than 58 notable AI models were launched in 2024, including 40 from the US and 15 from China. However, training these models produces significant carbon emissions, with a negative impact on the environment.
The report highlights that training the latest AI models emits over 250 times more carbon than an average American does in a year. Recent models show a dramatic increase in emissions: GPT‑3 (2020) produced approximately 588 tons, GPT‑4 (2023) reached about 5,184 tons and Llama 3.1 405B (2024) produced roughly 8,930 tons of carbon emissions.
For context, the report notes that the average American emits about 18 tons of carbon per year, which puts GPT‑4’s training footprint at nearly 290 times that figure and Llama 3.1 405B’s at close to 500 times.
The report also highlights that training early AI models, such as AlexNet in 2012, produced relatively low carbon emissions, estimated at approximately 0.01 tons. However, as the industry has advanced and demand has grown, the complexity of training these models has intensified, requiring greater computational power and, consequently, producing higher carbon emissions.
Energy Efficiency & Environmental Impact
Despite notable advancements in the energy efficiency of AI hardware, the overall power consumption needed to train AI systems continues to escalate rapidly. According to the report, the original Transformer, introduced in 2017, consumed an estimated 4,500 watts. By comparison, PaLM, one of Google’s initial flagship large language models (LLMs), had a power draw of 2.6 million watts, nearly 600 times that of the Transformer. Llama 3.1‑405B, released in the summer of 2024, required 25.3 million watts, using over 5,000 times more power than the original Transformer.
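These multiples follow directly from the power figures quoted above; the short back-of-the-envelope check below simply recomputes them from those reported numbers.

```python
# Back-of-the-envelope check of the power multiples cited above.
transformer_w = 4_500          # original Transformer (2017), estimated training power draw in watts
palm_w = 2_600_000             # PaLM, watts
llama_405b_w = 25_300_000      # Llama 3.1-405B, watts

print(f"PaLM vs Transformer: {palm_w / transformer_w:,.0f}x")                  # ~578x, i.e. "nearly 600 times"
print(f"Llama 3.1-405B vs Transformer: {llama_405b_w / transformer_w:,.0f}x")  # ~5,622x, i.e. "over 5,000 times"
```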
Citing Epoch AI, the report notes that the power needed to train cutting‑edge AI models is doubling each year. This surge in power consumption reflects the trend of training on increasingly expansive datasets. Unsurprisingly, as the total power used to train AI systems has grown over time, so too have the carbon emissions produced by these models. Several factors influence the carbon emissions of AI systems, including the number of parameters in a model, the power usage effectiveness (PUE) of a data centre and the carbon intensity of the electrical grid.
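To illustrate how those factors combine, here is a minimal sketch of the estimation approach commonly used in such analyses: energy drawn from the grid is the hardware’s power draw multiplied by training time and the data centre’s PUE, and emissions are that energy multiplied by the grid’s carbon intensity. The training duration, PUE and grid-intensity values below are illustrative assumptions, not figures from the report.

```python
def training_emissions_tonnes(power_watts: float,
                              training_hours: float,
                              pue: float,
                              grid_kgco2_per_kwh: float) -> float:
    """Rough estimate of training emissions in tonnes of CO2-equivalent.

    energy (kWh)   = power (kW) * hours * PUE
    emissions (t)  = energy (kWh) * grid intensity (kgCO2/kWh) / 1000
    """
    it_energy_kwh = (power_watts / 1_000) * training_hours
    facility_energy_kwh = it_energy_kwh * pue            # PUE accounts for cooling and other overhead
    return facility_energy_kwh * grid_kgco2_per_kwh / 1_000

# Illustrative example using the report's 25.3 million watt figure for Llama 3.1-405B,
# with hypothetical values for duration, PUE and grid carbon intensity.
print(training_emissions_tonnes(power_watts=25_300_000,
                                training_hours=1_500,       # assumption, not from the report
                                pue=1.1,                    # assumption
                                grid_kgco2_per_kwh=0.2))    # assumption
```

With these placeholder inputs the estimate lands in the same order of magnitude as the report’s figure for Llama 3.1 405B; the point of the exercise is to show how the listed factors interact, not to reproduce the report’s methodology.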