Artifex.News

AI has a big and growing carbon footprint, but algorithms can help

Posted on March 5, 2024 By admin


Given the huge problem-solving potential of artificial intelligence (AI), it wouldn't be far-fetched to think that AI could also help us tackle the climate crisis. However, when we consider the energy needs of AI models, it becomes clear that the technology is as much a part of the climate problem as a solution.

The emissions come from the infrastructure associated with AI, such as building and running the data centres that handle the large amounts of information required to sustain these systems.

But different technological approaches to how we build AI systems could help reduce its carbon footprint. Two technologies in particular hold promise for doing this: spiking neural networks and lifelong learning.

The lifetime of an AI system can be split into two phases: training and inference. During training, a relevant dataset is used to build and tune – improve – the system. In inference, the trained system generates predictions on previously unseen data.
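The split between the two phases can be sketched with a toy model. This is purely illustrative (the article discusses large neural networks, not linear regression): parameters are adjusted against a known dataset during training, then used unchanged to make predictions during inference.

```python
import numpy as np

# Illustrative only: a tiny linear model standing in for an AI system.
# Training phase: adjust the parameters (w, b) to fit a known dataset.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * X[:, 0] + 1.0             # ground truth: y = 3x + 1

w, b = 0.0, 0.0
for _ in range(500):                # gradient descent on mean squared error
    pred = w * X[:, 0] + b
    err = pred - y
    w -= 0.1 * np.mean(err * X[:, 0])
    b -= 0.1 * np.mean(err)

# Inference phase: the trained parameters predict on previously unseen input.
x_new = 0.5
print(w * x_new + b)                # close to 3 * 0.5 + 1 = 2.5
```

Training is the expensive part (hundreds of parameter updates even in this toy case); inference reuses the fitted parameters, but at the scale of deployed systems it is repeated billions of times.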

For example, training an AI that’s to be used in self-driving cars would require a dataset of many different driving scenarios and decisions taken by human drivers.

After the training phase, the AI system can predict effective manoeuvres for a self-driving car.

Artificial neural networks (ANNs) are the underlying technology used in most current AI systems.

They have many different elements to them, called parameters, whose values are adjusted during the training phase of the AI system. These parameters can run to more than 100 billion in total.

While large numbers of parameters improve the capabilities of ANNs, they also make training and inference resource-intensive processes. To put things in perspective, training GPT-3 (the precursor AI system to the current ChatGPT) generated 502 metric tonnes of carbon, which is equivalent to driving 112 petrol-powered cars for a year.
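To see how parameter counts grow, consider a small fully connected network. The layer sizes below are illustrative (a typical image-classifier shape, not a model from the article); each layer contributes one weight per input-output pair plus one bias per output, and large language models scale this same arithmetic into the billions.

```python
# Parameter count for a small fully connected network.
# Each layer's weight matrix holds (inputs x outputs) parameters,
# plus one bias per output neuron.
layer_sizes = [784, 256, 128, 10]   # e.g. a small image classifier

total = 0
for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
    total += n_in * n_out + n_out   # weights + biases

print(total)  # 235,146 parameters even for this tiny network
```

Every one of those parameters participates in the arithmetic of both training and inference, which is why parameter count translates so directly into energy use.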

GPT-3 further emits 8.4 tonnes of CO₂ annually due to inference. Since the AI boom started in the early 2010s, the energy requirements of AI systems known as large language models – the type of technology that’s behind ChatGPT – have gone up by a factor of 300,000.

With the increasing ubiquity and complexity of AI models, this trend is going to continue, potentially making AI a significant contributor to CO₂ emissions. In fact, our current estimates could be lower than AI's actual carbon footprint due to a lack of standard and accurate techniques for measuring AI-related emissions.

Spiking neural networks

Both of these technologies, spiking neural networks (SNNs) and lifelong learning (L2), have the potential to lower AI's ever-increasing carbon footprint, with SNNs offering an energy-efficient alternative to ANNs.

ANNs work by processing and learning patterns from data, enabling them to make predictions. They work with decimal numbers (in practice, floating-point values). To make accurate calculations, especially when multiplying numbers with decimal points together, the computer needs to be very precise. It is because of these decimal numbers that ANNs require lots of computing power, memory and time.

This means ANNs become more energy-intensive as the networks get larger and more complex. Both ANNs and SNNs are inspired by the brain, which contains billions of neurons (nerve cells) connected to each other via synapses.

Like the brain, ANNs and SNNs also have components which researchers call neurons, although these are artificial, not biological ones. The key difference between the two types of neural networks is in the way individual neurons transmit information to each other.

Neurons in the human brain communicate with each other by transmitting intermittent electrical signals called spikes. The spikes themselves do not contain information. Instead, the information lies in the timing of these spikes. This binary, all-or-none characteristic of spikes (usually represented as 0 or 1) implies that neurons are active when they spike and inactive otherwise.

This is one of the reasons for energy-efficient processing in the brain.

Just as Morse code uses specific sequences of dots and dashes to convey messages, SNNs use patterns or timings of spikes to process and transmit information. So, while the artificial neurons in ANNs are always active, SNNs consume energy only when a spike occurs.

Otherwise, their energy requirements are close to zero. SNNs can be up to 280 times more energy-efficient than ANNs.
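This sparse, all-or-none behaviour can be simulated with a leaky integrate-and-fire (LIF) neuron, a standard building block of SNNs. The leak rate, threshold and input statistics below are arbitrary illustration values, not figures from the article: the neuron integrates its input, "leaks" charge over time, and emits a binary spike only when its membrane potential crosses a threshold, staying silent otherwise.

```python
import numpy as np

# A minimal leaky integrate-and-fire (LIF) neuron.
# Parameters here are illustrative, chosen so spikes are rare.
rng = np.random.default_rng(1)

leak, threshold = 0.9, 1.0
v = 0.0                                   # membrane potential
spikes = []
for t in range(100):
    v = leak * v + rng.uniform(0, 0.25)   # leaky integration of input
    if v >= threshold:
        spikes.append(1)                  # spike: the only costly event
        v = 0.0                           # reset after spiking
    else:
        spikes.append(0)                  # silent: near-zero energy

# Activity is sparse: only a handful of the 100 time steps spike.
print(sum(spikes), "spikes in", len(spikes), "time steps")
```

The output illustrates the article's point: unlike an ANN neuron, which computes on every input, this neuron is active only on the few time steps where a spike occurs.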

My colleagues and I are developing learning algorithms for SNNs that may bring them even closer to the energy efficiency exhibited by the brain. The lower computational requirements also imply that SNNs might be able to make decisions more quickly.

These properties render SNNs useful for a broad range of applications, including space exploration, defence and self-driving cars because of the limited energy sources available in these scenarios.

Lifelong learning

Lifelong learning (L2), which my colleagues and I are also working on, is another strategy for reducing the overall energy requirements of ANNs over the course of their lifetime.

Training ANNs sequentially (where the systems learn from sequences of data) on new problems causes them to forget their previous knowledge while learning new tasks, a problem known as catastrophic forgetting. As a result, ANNs require retraining from scratch when their operating environment changes, further increasing AI-related emissions.

L2 is a collection of algorithms that enable AI models to be trained sequentially on multiple tasks with little or no forgetting. L2 enables models to learn throughout their lifetime by building on their existing knowledge without having to retrain them from scratch.
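One widely used family of L2 techniques is experience replay: a small buffer of old-task examples is mixed into the updates for a new task so that earlier knowledge is not simply overwritten. The toy below is a sketch of that general idea, not the author's own algorithm; with only a single parameter, replay can only pull the solution back toward the old task rather than preserve it perfectly, but the mechanism is the same.

```python
import numpy as np

# Toy demonstration of forgetting and of experience replay.
rng = np.random.default_rng(2)

def make_task(slope):
    # Each "task" is fitting y = slope * x on fresh data.
    X = rng.uniform(-1, 1, size=(200, 1))
    return X, slope * X[:, 0]

def train(w, X, y, steps=300, lr=0.1):
    # Gradient descent on mean squared error for a single weight w.
    for _ in range(steps):
        w -= lr * np.mean((w * X[:, 0] - y) * X[:, 0])
    return w

task_a = make_task(2.0)    # old task: slope 2
task_b = make_task(-1.0)   # new task: slope -1

# Sequential training: learning task B erases what was learned on task A.
w = train(train(0.0, *task_a), *task_b)

# Replay: mix a buffer of task-A examples into the task-B training data.
X_mix = np.concatenate([task_a[0][:50], task_b[0]])
y_mix = np.concatenate([task_a[1][:50], task_b[1]])
w_replay = train(train(0.0, *task_a), X_mix, y_mix)

# w has drifted all the way to task B; w_replay retains more of task A.
print(round(float(w), 2), round(float(w_replay), 2))
```

Because no retraining from scratch is needed when old data is revisited this way, the total number of training updates over the model's lifetime, and hence its energy cost, can be much lower.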

The field of AI is growing fast and other potential advancements are emerging that can mitigate the energy demands of this technology. One example is building smaller AI models that exhibit the same predictive capabilities as larger ones.

Advances in quantum computing – a different approach to building computers that harnesses phenomena from the world of quantum physics – could also enable faster training and inference using ANNs and SNNs. The superior computing capabilities offered by quantum computing could allow us to find energy-efficient solutions for AI at a much larger scale.

The climate change challenge requires that we try to find solutions for rapidly advancing areas such as AI before their carbon footprint becomes too large.

(Shirin Dora is a lecturer in computer science at Loughborough University. This article is republished from The Conversation.)


