Brute-force computing is the next “winter of AI”

The search for a greener artificial intelligence

Wednesday, January 19, 2022
How much power is too much power when it comes to artificial intelligence (AI) achieving breakthroughs that could change everyday life and make for a cleaner, more sustainable future? MBZUAI Adjunct Professor of Machine Learning Mérouane Debbah predicts that the next AI winter is coming, and that running exascale computing models may reach a point where the costs outweigh the benefits unless better algorithms and more efficient models are created.

Debbah wants to build AI systems that work more like the human brain and operate with greater sophistication. He sees industry trying to better understand the benefits of exascale computing models from a “green perspective,” but believes there is still a long way to go. AI, in Debbah’s conception, must become less power hungry.

“The amount of energy that is poured in today to build an exascale model is not sustainable, which will be a problem for AI in the future,” Debbah said. “From my point of view, we have had a lot of successes in AI and a lot of winters, and one of the next possible winters is the fact that we’re using too much energy today.”

Debbah believes that current methods rely largely on raw computation, a brute-force approach that must be addressed. He feels that current research is not focused on efficient algorithms; instead, it has become a race to see who can simply buy the most GPUs.

“If you go down this path — in the next 10 years — it is not going to be sustainable. So…just buying more GPUs and running those GPUs means more and more consumption to build up better and better models.”
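To see why this trend worries him, it helps to sketch the arithmetic of a large training run. The short Python estimate below multiplies GPU count, per-device power draw, run length, datacenter overhead (PUE), and grid carbon intensity; every figure is a hypothetical assumption chosen for illustration, not data from any real model or datacenter.

```python
# Back-of-the-envelope estimate of the energy and emissions of a large training run.
# Every input below is a hypothetical assumption for illustration, not a measured
# figure from any real model or datacenter.

def training_footprint(num_gpus, gpu_power_kw, hours, pue, carbon_kg_per_kwh):
    """Return (energy in MWh, emissions in tonnes of CO2-equivalent)."""
    energy_kwh = num_gpus * gpu_power_kw * hours * pue   # facility-level energy
    return energy_kwh / 1_000, energy_kwh * carbon_kg_per_kwh / 1_000

# Hypothetical run: 1,000 GPUs drawing 0.4 kW each for 30 days, in a datacenter
# with a PUE of 1.2, on a grid emitting 0.4 kg of CO2 per kWh.
energy_mwh, co2_tonnes = training_footprint(
    num_gpus=1_000, gpu_power_kw=0.4, hours=30 * 24, pue=1.2, carbon_kg_per_kwh=0.4
)
print(f"~{energy_mwh:.0f} MWh, ~{co2_tonnes:.0f} t CO2e")  # ~346 MWh, ~138 t CO2e
```

Under these assumptions a single month-long run consumes hundreds of megawatt-hours, and scaling up the GPU count or run length scales the footprint linearly, which is the dynamic Debbah describes.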

Brain-like computing could be the answer

“The amount of energy that is poured in today to build an exascale model is not sustainable,” Debbah said. “We need more parsimonious algorithms which take care of the energy which is consumed. The problem today is that we are training too much, and this constant machine learning has a huge impact on our carbon footprint.”

“Humans are not only intelligent, but they are also super energy efficient,” Debbah said. “The energy we consume is very low for the tasks that we do. It means that we need to restructure the kind of algorithms that we’re using.”

Many people around the world see AI as the answer to making some industries more sustainable, and Debbah doesn’t deny its capability to do this.

“Historically, when you look at the machine learning era and AI, the majority of algorithms that we’re using today were mostly invented in the 90s. Limited progress has been made at the mathematical level and in improving those algorithms,” Debbah continued.

The largest area of progress, in Debbah’s opinion, has been in the optimization of computing power. One example he puts forward is neuromorphic, or brain-like, computing, where exciting possibilities are being actively explored but where, unfortunately, progress is relatively slow. This is precisely the reason, according to Debbah, why researchers often revert to simply buying a bigger computer.
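As a rough illustration of what “brain-like” means here, neuromorphic approaches favor sparse, event-driven units that only fire when stimulated, rather than dense matrix arithmetic that updates every unit at every step. The minimal leaky integrate-and-fire neuron sketched below captures that idea; it is a toy with arbitrary parameters, not a model of any particular neuromorphic platform.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: a toy sketch of the sparse,
# event-driven style of computation that neuromorphic hardware aims for.
# All parameters are arbitrary illustrative values, not tied to any real chip.

def simulate_lif(input_current, dt=1.0, tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate input over time; emit a spike (1) whenever the potential crosses threshold."""
    v = v_rest
    spikes = []
    for i in input_current:
        # Leaky integration: the potential decays toward rest while accumulating input.
        v += dt / tau * (-(v - v_rest) + i)
        if v >= v_thresh:
            spikes.append(1)   # the neuron is only "active" when it actually fires...
            v = v_reset        # ...and then resets, keeping overall activity sparse
        else:
            spikes.append(0)
    return np.array(spikes)

rng = np.random.default_rng(0)
spike_train = simulate_lif(rng.uniform(0.0, 2.0, size=200))
print(f"{spike_train.sum()} spikes over {spike_train.size} time steps")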

GPT-3 is a famous example of an exascale model from OpenAI, the AI research company founded as a non-profit by Elon Musk and others. GPT-3 is among the largest and most powerful language models built to date, with 175 billion parameters (the values that a neural network adjusts during training). Unfortunately, with such language models, size does matter, at least for now. Debbah is hopeful that better hardware, better software, or further innovation can solve the power usage issue.
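That 175-billion figure can be roughly reproduced from GPT-3's published architecture. A common rule of thumb is that a decoder-only transformer's attention and feed-forward blocks hold about 12 × n_layers × d_model² parameters; the sketch below applies it to GPT-3's reported configuration (96 layers, width 12,288) and ignores embeddings, biases, and layer norms, so treat the result as an approximation rather than an exact count.

```python
# Rough rule of thumb for a decoder-only transformer: the attention and feed-forward
# blocks contribute roughly 12 * n_layers * d_model**2 parameters. Embeddings, biases
# and layer norms are ignored, so this is an approximation, not an exact count.

def approx_transformer_params(n_layers: int, d_model: int) -> int:
    attention = 4 * d_model * d_model      # query, key, value and output projections
    feed_forward = 8 * d_model * d_model   # two projections with a 4x hidden width
    return n_layers * (attention + feed_forward)

# GPT-3's reported configuration: 96 layers with a model width of 12,288.
print(f"~{approx_transformer_params(96, 12_288) / 1e9:.0f}B parameters")  # ~174B, close to the quoted 175B
```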

“AI can make our networks greener, but we never account for the energy that we use to calculate all those configurations to make it greener,” Debbah said. “At the moment this is increasing too much, and we may get to a breakeven point where basically we can’t go any further with the trend of using AI to make greener outcomes because already AI is proving to not be all that green.”

The semiconductor industry continues to face worldwide shortages due to supply chain issues. “This shortage is already hindering the progress because you cannot go further,” Debbah said. “We’re still relying on the classical type of von Neumann computing architectures that were built 50 years ago. With the convergence of AI and high-performance computing, it is time to go beyond that.”
