There are new technologies available to help lower the energy consumption of AI models.

You may have noticed that when you search for flights on Google, each flight's estimated carbon emissions are now displayed next to its price. It's a way to inform customers of their environmental impact so they can factor it into their decisions.

The computing sector, despite producing more carbon emissions than the entire aviation industry, still lacks a similar level of transparency. Artificial intelligence models are a major driver of this growing energy demand. Huge, popular models like ChatGPT point to a trend of large-scale AI, fueling predictions that data centers could draw up to 21 percent of the world's electricity by 2030.

As at many data centers, the share of AI workloads running on the LLSC's hardware has risen significantly. After noticing this increase in energy use, the center's computer scientists looked for ways to run jobs more efficiently. Green computing is one of the center's guiding principles, and the facility is powered entirely by carbon-free electricity.

Training an AI model, the process by which it learns patterns from enormous datasets, requires graphics processing units (GPUs), which are power-hungry hardware. As one example, the GPUs used to train GPT-3 are estimated to have consumed 1,300 megawatt-hours of electricity, roughly equivalent to the monthly electricity use of 1,450 average U.S. households.
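The household comparison above is straightforward to sanity-check. A minimal sketch, assuming an average U.S. household uses roughly 900 kWh of electricity per month (a ballpark figure not stated in the article):

```python
# Sanity-check the GPT-3 training-energy comparison.
# Assumption: ~900 kWh average monthly U.S. household electricity use.

GPT3_TRAINING_MWH = 1_300          # estimated energy to train GPT-3
HOUSEHOLD_KWH_PER_MONTH = 900      # assumed average monthly household use

training_kwh = GPT3_TRAINING_MWH * 1_000   # convert MWh to kWh
households = training_kwh / HOUSEHOLD_KWH_PER_MONTH
print(f"~{households:.0f} households' monthly electricity use")
```

Dividing 1,300,000 kWh by the assumed monthly figure gives roughly 1,400 to 1,450 households, consistent with the estimate cited above.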