
'Godfather Of AI' Explains Why ChatGPT, Copilot, And Other AI Chatbots Hallucinate

Published 08/03/2024, 14:24
Updated 08/03/2024, 15:40

Benzinga - by Rounak Jain, Benzinga Staff Writer.

AI chatbots like OpenAI's ChatGPT, Microsoft Corp.'s (NASDAQ:MSFT) Copilot and others sometimes generate nonsensical responses, a phenomenon known as hallucination. It happens often enough to be a well-known problem, but the reason behind it is less widely understood.

What Happened: Meta Platforms Inc.'s (NASDAQ:META) chief AI scientist and "Godfather of AI" Yann LeCun explained why AI chatbots hallucinate in an episode of the Lex Fridman podcast.

Hallucination in AI refers to the phenomenon where large language models (LLMs) generate responses or outputs that are nonsensical, irrelevant, or disconnected from the input or context.

But why does it happen? Why do chatbots like ChatGPT and Copilot go off the rails?

"Because of the autoregressive prediction, every time an AI produces a token or a word, there is some level of probability for that word to take you out of the set of reasonable answers," explains LeCun, picturing how chatbots derail the conversation.

The longer a response runs, the higher the probability that it drifts into nonsense, according to LeCun.

"It's like errors accumulate. So, the probability that an answer would be nonsensical increases exponentially with the number of tokens."

LeCun explained that large language models cannot be trained on every question a user might ask, and when a question falls outside the scope of the model's training, it starts hallucinating and spits out nonsense.

Why It Matters: AI chatbots have a tendency to hallucinate and have even asked users to "worship" them, as was recently the case with Microsoft's Copilot.

Earlier, when Microsoft adopted OpenAI's ChatGPT technology and launched the Bing AI chatbot, users quickly discovered that it had alternate personas, including "Sydney," "Fury," and "Venom."

LeCun's explanation sheds some light on why chatbots sometimes behave so erratically.


Read Next: Microsoft’s Sydney AI Makes A Comeback? New Copilot Conversations Hint At The Return Of The Sassy Chatbot

Photo courtesy: Shutterstock

© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.
