Train Neural Networks More Efficiently: Significant Reduction of Electricity Consumption in AI Training

From TU Munich | Translated by AI | 2 min reading time

Training neural networks requires enormous computing resources and thus a lot of electricity. Researchers at TU Munich (Germany) have developed a method that is a hundred times faster and therefore much more energy-efficient: Instead of proceeding step by step, the parameters are calculated directly from the data based on their probability.

The SuperMUC-NG at the Leibniz Supercomputing Centre, the eighth-fastest HPC system worldwide. Researchers at TU Munich tested the energy efficiency of their new training method for neural networks on this system. (Image: Veronika Hohenegger, LRZ)

AI applications, such as large language models (LLMs), have become indispensable in our daily lives. The necessary computing, storage, and transmission capacities are provided by data centers. However, the energy consumption of these centers is enormous: in 2020, it was around 16 billion kilowatt-hours in Germany—about one percent of the entire German electricity demand. For the year 2025, an increase to 22 billion kilowatt-hours is forecasted.

Hundred Times Faster, Similarly Accurate

In addition, more complex AI applications will significantly increase the demands on data centers in the coming years, since such applications require enormous computing resources for training neural networks. To counteract this development, the researchers have developed a method that is a hundred times faster and delivers results comparable in accuracy to those of previous training methods. This significantly reduces the energy required for training.

Neural networks, used in AI for tasks such as image recognition or language processing, are inspired in their functioning by the human brain. They consist of interconnected nodes, called artificial neurons. These receive input signals, which are weighted with certain parameters and then summed. If a set threshold is exceeded, the signal is passed on to the subsequent nodes. To train the network, the parameter values are usually chosen randomly at first, for example from a normal distribution. They are then adjusted through small changes to gradually improve the network's predictions. Since this training method requires many repetitions, it is extremely resource-intensive and consumes a lot of electricity.
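The conventional procedure described above can be sketched in a few lines. This is a minimal illustration, not code from the study: the single neuron, the data point, the learning rate, and the choice of a sigmoid as a soft threshold are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# One artificial neuron: inputs are weighted, summed, and the result
# passes through a soft threshold (sigmoid activation).
def neuron(x, w, b):
    s = np.dot(w, x) + b             # weighted sum of the input signals
    return 1.0 / (1.0 + np.exp(-s))  # soft version of "fire if above threshold"

# Conventional training: parameters start as random draws,
# e.g. from a normal distribution ...
w = rng.normal(size=2)
b = rng.normal()

x, target = np.array([0.5, -1.0]), 1.0
lr = 0.1

# ... and are then adjusted in many small steps (here: gradient
# descent on the squared error). The sheer number of repetitions
# is what makes this loop so resource-intensive at scale.
for _ in range(1000):
    y = neuron(x, w, b)
    grad = (y - target) * y * (1.0 - y)  # chain rule through the sigmoid
    w -= lr * grad * x
    b -= lr * grad

print("output after training:", round(float(neuron(x, w, b)), 3))
```

After enough iterations the output approaches the target, but every one of those small corrective steps costs compute, which is exactly the cost the new method avoids.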

Parameters are Selected Based on their Probability

Felix Dietrich, professor of physics-enhanced machine learning, and his team have now developed a new method. Instead of determining the parameters between the nodes iteratively, their approach is based on probability calculations.

The probabilistic method chosen here deliberately uses values located at critical points in the training data, focusing on the places where the values change particularly strongly and quickly. The current study applies this approach to learning energy-conserving dynamical systems from data. Such systems change over time according to certain rules and are found, for example, in climate models and financial markets.
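The general idea can be sketched as a toy construction in the spirit of such sampling-based methods. Everything here is an illustrative assumption rather than the authors' actual algorithm: the 1-D data set, the network size, and the heuristic of building hidden weights from pairs of data points across which the target changes steeply. The key contrast with the loop above is that no parameter is updated iteratively; the hidden weights are sampled once, and only a single linear solve remains.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D regression data with a region of rapid change around x = 0.
x = np.linspace(-2, 2, 200).reshape(-1, 1)
y = np.tanh(5 * x).ravel()

n_hidden = 50

# Draw candidate pairs of data points and prefer pairs across which
# the target changes strongly relative to their distance -- a proxy
# for the "critical points" the method concentrates on.
i = rng.integers(0, len(x), size=1000)
j = rng.integers(0, len(x), size=1000)
dist = np.linalg.norm(x[i] - x[j], axis=1) + 1e-9
steepness = np.abs(y[i] - y[j]) / dist
p = steepness / steepness.sum()
picks = rng.choice(len(i), size=n_hidden, p=p)

# Construct hidden-layer weights directly from the chosen pairs
# (one ridge between each pair) -- no iterative updates at all.
a, b_pt = x[i[picks]], x[j[picks]]
W = (b_pt - a) / (np.linalg.norm(b_pt - a, axis=1, keepdims=True) ** 2 + 1e-9)
bias = -np.sum(W * (a + b_pt) / 2, axis=1)

H = np.tanh(x @ W.T + bias)  # hidden activations for all data points

# The output layer then reduces to a single linear least-squares solve.
coef, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = H @ coef

print("max abs error:", float(np.max(np.abs(pred - y))))
```

Because the steep region of the data attracts most of the sampled pairs, the network's nonlinear units land where they matter, and the whole "training" consists of one matrix solve instead of thousands of gradient steps.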

"Our method allows us to determine the required parameters with minimal computational effort. This enables neural networks to be trained significantly faster and thus more energy-efficiently," explains Felix Dietrich. Furthermore, it has been shown that the new method is comparable in accuracy to iteratively trained networks.
