Artifex.News
How quantum computing can make large language models even better


Posted on September 17, 2024 By admin


In recent years, the landscape of artificial intelligence (AI), particularly within the realm of natural language processing (NLP), has undergone a remarkable transformation. We have witnessed the rise of powerful large language models (LLMs) made by OpenAI, Google, and Microsoft, among others, and generative AI (Gen-AI), characterised by its unparalleled ability to generate data based on user inputs.

These sophisticated models have revolutionised human-computer interactions, offering users experiences akin to human understanding. The advent of these cutting-edge technologies, and their wide availability, has compelled the public at large, industry stakeholders, and governmental bodies to pay attention to their implications.

Problems with current LLMs

LLMs are a cornerstone of AI and mirror the complexities of human language processing. They can classify text, answer questions, and translate between languages. But they also consume a lot of energy, both when being trained and when in use.

For example, as models go, LLMs are much larger than other AI applications such as computer vision models. An LLM's energy consumption is determined mostly by the number of parameters it has: larger models demand more computational power for both training and inference. GPT-3, for example, has 175 billion parameters and required around 1,287 MWh of electricity to train. This is roughly what an average American household consumes in 120 years.
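The household comparison above can be checked with a quick back-of-the-envelope calculation. The annual household figure used here (about 10.7 MWh) is an assumption based on US Energy Information Administration estimates, not a number from the article:

```python
# Sanity check of the comparison quoted above.
# Assumption (not from the article): an average American household
# uses roughly 10.7 MWh of electricity per year.
gpt3_training_mwh = 1_287        # reported training energy for GPT-3
household_mwh_per_year = 10.7    # assumed annual household consumption

years = gpt3_training_mwh / household_mwh_per_year
print(f"Equivalent to about {years:.0f} years of household use")  # ~120 years
```

The ratio comes out to about 120, matching the figure in the text.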

LLMs also surpass non-AI applications in this regard. Training an LLM with 1.75 billion parameters can emit up to 284 tonnes of carbon dioxide, more than the emissions from running a data centre with 5,000 servers for a year.

It’s important that we lower LLMs’ carbon footprint to ensure they are sustainable and cost-effective. Achieving these goals will give LLMs more room to become more sophisticated as well.

Another shortcoming of LLMs pertains to their pre-trained nature, which restricts the level of control users have over their functioning. These models are trained on large datasets, from which they learn word-use patterns in diverse linguistic contexts. But such training often also results in “hallucinations”: the model may generate text that is contextually coherent but factually incorrect or semantically nonsensical. This arises from limitations inherent to the training, where the model’s understanding can diverge from reality.

A third limitation revolves around the ability of current LLMs to understand syntax, the structural arrangement of words and phrases in a sentence. LLMs excel at processing the semantic (meaning-related) aspects of natural language but struggle with syntax. For example, they may overlook or misinterpret syntactic cues, which impedes their ability to generate contextually appropriate text.

In sum, we need to develop sustainable, energy-efficient approaches that yield more accurate language models.

Quantum computing

Quantum computing is a highly promising way to address these challenges. It harnesses the remarkable properties of quantum physics like superposition and entanglement for computational needs. In particular, quantum natural language processing (QNLP) has emerged as an active and burgeoning field of research with potentially profound implications for language modelling.

QNLP incurs lower energy costs than conventional LLMs by leveraging quantum phenomena. QNLP models also require far fewer parameters than their classical counterparts in order to achieve the same outcomes (on paper), thus promising to enhance efficiency without compromising performance.

This processing paradigm takes advantage of quantum correlations, an approach in which the system focuses on grammar (syntax) and meaning (semantics) together, rather than separately as conventional systems do. QNLP achieves this using a better ‘mapping’ between the rules of grammar and quantum physical phenomena like entanglement and superposition. The result is a deeper, more complete understanding of language.
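For readers unfamiliar with the two quantum phenomena named above, a minimal state-vector sketch illustrates them. This is a toy illustration in plain NumPy, not a QNLP system; the point is that an entangled state only ever produces correlated joint outcomes, which is the kind of correlation QNLP maps grammatical structure onto:

```python
import numpy as np

# Superposition: a single qubit in the state (|0> + |1>)/sqrt(2),
# i.e. an equal-weight combination of both basis states.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Entanglement: the two-qubit Bell state (|00> + |11>)/sqrt(2).
# Measuring either qubit fixes the other's outcome.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Probabilities of the four joint outcomes 00, 01, 10, 11:
probs = np.abs(bell) ** 2
print(probs)  # [0.5 0.  0.  0.5] -- only the correlated outcomes occur
```

The mixed outcomes 01 and 10 have probability zero: the two qubits behave as one correlated whole, which is the property QNLP exploits to bind syntax and semantics together.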

The approach is also expected to mitigate the “hallucinations” that plague many existing LLMs, as the resulting QNLP models are better equipped to distinguish the contexts of various pieces of information and produce more accurate outputs.

With the help of QNLP, researchers also hope to uncover the mental processes that allow us to understand and create sentences, yielding new insights into how language works in the mind.

Time-series forecasting

From the basic details of quantum mechanics, we learn that a quantum system (like an atom or a group of particles) can be described by a quantum state — a mathematical representation that keeps evolving with time. By studying this representation, we can determine the expected outcomes of an experiment involving that system. Based on the same idea, researchers have proposed a quantum generative model to work with time-series data.
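The idea of a state evolving in time, with expected measurement outcomes read off from it, can be sketched in a few lines. This is a toy single-qubit example for intuition, not the researchers' model: a rotation plays the role of time evolution, and we compute the expected value of a Pauli-Z measurement at a few "times":

```python
import numpy as np

def evolve(state, theta):
    """Rotate a single-qubit state vector; stands in for time evolution."""
    u = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return u @ state

z = np.array([[1.0, 0.0], [0.0, -1.0]])  # observable (Pauli-Z)
state = np.array([1.0, 0.0])             # start in |0>

# Expected measurement outcome <Z> at three "times"
for t in [0.0, np.pi / 4, np.pi / 2]:
    s = evolve(state, t)
    exp_z = s.conj() @ z @ s
    print(round(float(exp_z.real), 3))  # 1.0, then 0.0, then -1.0
```

The expected outcome drifts smoothly from +1 to -1 as the state evolves, which is the sense in which the state's time evolution encodes a sequence of predicted measurements, the property the quantum generative model exploits for time-series data.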

A generative model is a mathematical model that generates data, if required based on a user’s inputs. A generative model designed to run on a quantum computer is called a quantum generative model (QGen). Time-series data is data about something that has been recorded at fixed intervals. Here, the techniques of quantum computing can be used to create or analyse sophisticated time-series data that conventional computers struggle with. This new data can then be used to teach quantum algorithms to identify patterns in the data more efficiently, to solve complex problems related to forecasting (e.g. stock market trends), and/or to detect anomalies.

On May 20, 2024, researchers in Japan reported that a QGen AI model they built could successfully work with both stationary and nonstationary data.

Stationary data refers to information that doesn’t change much over time. It stays fairly constant or fluctuates around a stable average. For example, the current price of a commodity like gold or the world’s population can be considered stationary: the data doesn’t show big changes in trends over a short period and the values move within a predictable range. On the other hand, nonstationary data keeps changing over time, as with ambient temperature, stock prices, and GDP. Classical methods struggle to analyse such data accurately.
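The stationary/nonstationary distinction can be made concrete with a crude heuristic: if the mean of a series drifts between its first and second halves by much more than its typical fluctuation, it is not stationary. This is a sketch, not a formal statistical test (such as the augmented Dickey-Fuller test used in practice), and the threshold is an arbitrary assumption:

```python
import numpy as np

def looks_stationary(x, tol=0.5):
    """Crude heuristic: compare the means of the two halves of a series
    against its overall spread. Not a formal stationarity test."""
    x = np.asarray(x, dtype=float)
    half = len(x) // 2
    return abs(x[:half].mean() - x[half:].mean()) < tol * x.std()

rng = np.random.default_rng(0)
noise = rng.normal(0, 1, 400)            # fluctuates around a stable mean
trend = noise + np.linspace(0, 10, 400)  # mean keeps drifting upward

print(looks_stationary(noise))  # True
print(looks_stationary(trend))  # False
```

The pure-noise series passes the check while the trending series fails it, mirroring the examples in the text.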

In the new study, the researchers built a time-series QGen AI model and evaluated its performance by applying it to solve plausible financial problems. They wrote in their preprint paper: “Future data for two correlated time series were generated and compared with classical methods such as long short-term memory and vector autoregression. Furthermore, numerical experiments were performed to complete missing values. Based on the results, we evaluated the practical applications of the time-series quantum generation model. It was observed that fewer parameter values were required compared with the classical method. In addition, the quantum time-series generation model was feasible for both stationary and nonstationary data.”

That fewer parameters were required means the quantum model could solve the same problems as a classical computer while consuming fewer computational resources.

In sum, quantum computing holds considerable potential to revolutionise AI applications, particularly in addressing the challenges posed by current LLMs. By embracing QNLP and QGen-AI, together with advancements in time-series forecasting, we can pave the way for sustainable, efficient, and performant AI systems.

Qudsia Gani is assistant professor, Department of Physics, Government Degree College, Pattan. Rukhsanul Haq is a quantum AI scientist at IBM Bengaluru. Mohsin Ilahi is senior quantum scientist, Centre of Excellence, Chinar Quantum AI, Pvt. Ltd., Srinagar.

Published – September 17, 2024 05:30 am IST


