Climate conscious computing

Wednesday, October 11, 2023

Technological revolutions throughout history have harnessed energy to produce great change. The industrial revolution of the late-eighteenth century was driven by the power of water and gravity to spin cotton into cloth. Early locomotives propelled by steam were able to cross continents in days along routes that had previously taken months. The combustion engine has given us cars and airplanes that have restructured how our cities and economies are built.

We are perhaps on the cusp — or firmly in the midst — of another technological revolution, this time sparked by electricity-hungry artificial intelligence. All of this is happening at the same time that the world is working to reduce carbon emissions and slow climate change.

How can we deliver on the promise of AI while also considering its environmental impact?

Qirong Ho, Assistant Professor of Machine Learning at MBZUAI, and his colleagues at the Center for Integrative Artificial Intelligence have developed what they call an Artificial Intelligence Operating System (AIOS) for decarbonization. Their goal is to reduce the high cost of AI programs by decreasing the amount of energy expended in their development. “There’s a surprising amount of energy waste with AI that we can cut down,” Ho said. “The AI future needs to be low-carbon, economical and human empowering.”

Ho and his team are addressing AI carbon consumption in several ways.

The importance of communication

Powerful artificial intelligence platforms — for example, large language models like OpenAI’s ChatGPT and Facebook’s LLaMA-2 — require huge amounts of computing power to develop. The machines that are used to train these models are not like the computers found in our homes. “You don’t just use one computer, a laptop, or even a single server machine,” Ho said. “You need hundreds of machines to train one of these ChatGPT-style models. And that means that we need to have energy-efficient ways of using all these machines.”


Ho calls this component of the operating system AI model sustainability. The solution relates to the way machines communicate with one another. “The greatest technical challenge in training with this many machines is the way they communicate,” Ho said.

Ho offers an analogy to humans in that communication is an ever-present conundrum in any organization in which large groups of people must coordinate their work. In businesses, there is a significant amount of time and thought put into how teams exchange information because there are major losses in productivity if communication within and across teams isn’t done effectively.

When computers don’t communicate efficiently, tasks take longer, and more energy is consumed. More energy use means more carbon put into the atmosphere. “It’s not the transmission of information that’s expensive from an energy perspective, but rather the time the machine is using energy while it is waiting for a response from another machine,” Ho said. “Think about a time when you sent an email to a colleague and had to wait to receive their response before you could move forward on a project.”

“Just like it is with humans,” Ho said, “it’s very difficult for machines to communicate unless they are organized correctly.”
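The cost of waiting that Ho describes can be captured with a toy back-of-the-envelope model. The sketch below is purely illustrative and is not part of AIOS: it assumes each training step needs some compute time plus a round-trip wait to exchange results with another machine, and compares a naive blocking schedule with one that overlaps communication and computation.

```python
# Toy model of idle-wait cost in distributed training (illustrative only;
# not AIOS code). Times are in milliseconds per step.

def blocking_time(steps, compute, wait):
    # The machine sits idle during every wait, so waits add up step by step.
    return steps * (compute + wait)

def overlapped_time(steps, compute, wait):
    # Communication for one step overlaps with computation for the next,
    # so only a single wait remains fully exposed.
    return steps * compute + wait

steps, compute, wait = 1000, 100, 50
print(blocking_time(steps, compute, wait))    # 150000 ms of powered-on time
print(overlapped_time(steps, compute, wait))  # 100050 ms of powered-on time
```

In this simplified model, organizing the same machines so that communication hides behind computation cuts total powered-on time — and hence energy — by roughly a third, without changing the work done.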

Finely tuned

Once a model has been trained, developers need to improve its performance. This is called the tuning stage, and it is conducted by experts who specialize in dialing in performance: “The people who do this are like an F1 race crew who know so much about how to improve how a model works,” Ho said. That said, even with the best pit crews in the business, the tuning stage takes time, and there is a direct correlation between time spent by people working on computers, electricity being wasted, and, thus, carbon being emitted into the atmosphere.

“We need to do smarter AI performance tuning that respects what I call a carbon budget,” Ho explained.

AIOS adopts an approach called AI tuning sustainability. It works according to constraints determined by the amount of energy considered appropriate for a specific level of tuning. “According to this approach, we would run machines for no more than a certain number of hours, which caps the amount of carbon emissions. The approach automatically discovers the best level of tuning in that allotted period of time,” Ho said.
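One way to picture tuning under a carbon budget is a search loop that stops when its time allowance runs out. The sketch below is a hypothetical illustration, not AIOS’s actual tuner: it runs a simple random search over configurations and returns the best one found before the budget (here seconds, in practice hours of machine time) is spent.

```python
import random
import time

def tune_with_carbon_budget(evaluate, sample_config, budget_seconds):
    """Hypothetical sketch of budget-capped tuning: random search that
    stops once the allotted (carbon-proportional) time is spent."""
    deadline = time.monotonic() + budget_seconds
    best_config, best_score = None, float("-inf")
    while time.monotonic() < deadline:
        config = sample_config()
        score = evaluate(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Toy objective: pretend a learning rate near 0.01 is ideal.
best, score = tune_with_carbon_budget(
    evaluate=lambda cfg: -(cfg["lr"] - 0.01) ** 2,
    sample_config=lambda: {"lr": random.uniform(0.001, 0.1)},
    budget_seconds=0.1,  # a real budget would be hours, capping emissions
)
print(best, score)
```

Capping wall-clock time in this way directly caps machine-hours, which is what ties the tuning process to a fixed emissions ceiling.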

Deployment

The next phase of AIOS relates to AI data center sustainability. Just like an energy-efficient appliance, say a refrigerator, an AI data center should be programmed to use the right amount of power at the right time. In this energy-efficient analogy, an AI program that normally completes in one day might take one and a half days when using only 50% of the machines, but in exchange would have saved 25% of the power and carbon emissions.
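The arithmetic behind that trade-off is easy to verify: if energy is proportional to machines multiplied by hours at a fixed per-machine power draw (an illustrative assumption, with made-up fleet sizes), halving the fleet for 1.5x the time uses three quarters of the machine-hours.

```python
# Checking the refrigerator-analogy trade-off with illustrative numbers.
# Assumes energy is proportional to machines * hours at constant power draw.

full_fleet = 100 * 24   # 100 machines for one day: 2400 machine-hours
half_fleet = 50 * 36    # 50 machines for a day and a half: 1800 machine-hours

savings = 1 - half_fleet / full_fleet
print(f"Energy saved: {savings:.0%}")  # Energy saved: 25%
```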

AI data center sustainability is also about building software tools that reduce the time taken by developers to set machines up and get them running AI programs. “Time wasted by developers is time wasted leaving power-hungry computers switched on. Human time is correlated with machine time, which is correlated to carbon emitted,” Ho said.

A recent success

Ho’s colleagues at the Center for Integrative Artificial Intelligence have already used AIOS to develop an application that is an environmentally friendly and cost-effective alternative to popular chatbots like OpenAI’s ChatGPT. Called Vicuna, it was developed with a fraction of ChatGPT’s monetary and carbon cost and achieves 90% of the subjective quality of ChatGPT. “In terms of carbon emitted, the electricity that went into producing ChatGPT resulted in thousands of tons of carbon being emitted, while ours is measured in kilograms,” Ho said.

Vicuna is named after a South American animal that is a relative of the llama, as it was built on Facebook’s LLaMA language model. The project is a collaboration between the University of California, Berkeley, Carnegie Mellon University, Stanford University, University of California San Diego and MBZUAI.

With AIOS, “we have been ecofriendly from day one, not as an afterthought,” Ho said. “And our impact starts here at home, at MBZUAI.”
