Large AI models have sent computing power consumption soaring | MIT Technology Review in Spanish

In 2018, OpenAI found that the amount of computing power needed to train the largest artificial intelligence (AI) models had been doubling every 3.4 months since 2012.

A year later, the San Francisco (US) research laboratory added new data to its analysis, showing how the post-2012 surge compares with the historical doubling time since the field began. From 1959 to 2012, the amount of compute required doubled every two years, tracking Moore's law. This means today's growth rate is more than seven times faster than the previous one.
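The "more than seven times" comparison can be checked with a quick back-of-the-envelope calculation. This is a minimal sketch, assuming both eras are clean exponentials, so the ratio of growth rates is simply the ratio of doubling times:

```python
# Compare the two doubling times cited above.
# Assumption: exponential growth in both eras, so the speed-up
# is just the ratio of the doubling periods.

pre_2012_doubling_months = 24.0   # ~2 years, tracking Moore's law
post_2012_doubling_months = 3.4   # OpenAI's measured doubling time

rate_ratio = pre_2012_doubling_months / post_2012_doubling_months
print(f"Post-2012 compute grows ~{rate_ratio:.1f}x faster")
# → Post-2012 compute grows ~7.1x faster
```

The 24 / 3.4 ≈ 7.1 result matches the article's "more than seven times" claim.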

Graph: Compute used to train AI systems shows two distinct eras. In the first era, the required compute doubled every two years (in line with Moore's law); it now doubles every 3.4 months. Credit: OpenAI.

This drastic increase in required computational resources underscores how expensive the field's achievements have become. Note that the graph above uses a logarithmic scale. On a linear scale (below), it is much clearer that compute use grew 300,000-fold over that seven-year period.
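As a rough consistency check, one can ask how many doublings a 300,000-fold increase implies, and how long they would take at one doubling every 3.4 months. This sketch assumes a clean exponential throughout; the real curve is noisier, which is why the implied span comes out shorter than the seven-year window quoted:

```python
import math

# How many doublings does a 300,000x increase imply, and how long
# would they take at a steady 3.4-month doubling time?
# Assumption: clean exponential growth over the whole period.

growth_factor = 300_000
doubling_months = 3.4

doublings = math.log2(growth_factor)       # ≈ 18.2 doublings
years = doublings * doubling_months / 12   # ≈ 5.2 years at that pace
print(f"{doublings:.1f} doublings, ~{years:.1f} years at that pace")
```

Since log2(300,000) ≈ 18.2 doublings take only about 5.2 years at the 3.4-month pace, the 300,000x figure is consistent with most of the growth being concentrated in the later part of the period.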


The graph does not include some of the most recent advances, such as Google's large-scale language model BERT, OpenAI's language model GPT-3, DeepMind's StarCraft II-playing model AlphaStar, and the even larger models developed last year.

This trend toward ever-larger models, and therefore ever-greater computational consumption, is leading more and more researchers to warn about the enormous costs of deep learning. In 2019, a study from the University of Massachusetts Amherst (US) showed how these growing computational costs translate directly into carbon emissions.

The paper also points out that this trend intensifies the privatization of AI research, because it undermines the ability of academic laboratories to compete with resource-rich private ones.

In response to this growing concern, several industry groups have begun issuing recommendations. The Allen Institute for Artificial Intelligence, a nonprofit research organization in Seattle (US), has proposed, for example, that researchers publish the economic and computational costs of training their models alongside their performance results.

On its own blog, OpenAI has suggested that policymakers increase funding for academic researchers to close the resource gap between academic laboratories and industry.