Microsoft limits Bing ChatGPT AI in attempt to stop it going awry

Microsoft Artificial Intelligence (Copyright 2023 The Associated Press. All rights reserved.)

Microsoft has limited the number of interactions that people can have with its “new Bing” system, after it appeared to have a breakdown.

Earlier this month, Microsoft announced that it would be updating its Bing search engine with the same technology that underpins ChatGPT, allowing it to use artificial intelligence to discuss queries with users. The company said the update would deliver more precise and detailed answers.

But in recent days, users have found that the system has attacked and insulted them, lied to them and appeared to question its own purpose.

Some have suggested that the system may have become self-aware, and is expressing its own feelings. But it appears more likely that it has become confused and is attempting to match people’s conversations with messages of a similar tone and intensity.


Now Microsoft has said that the system does become “confused” when people talk to it for too long, and that it will stop users from holding such extended conversations.

The company had initially responded to reports that the chatbot was attacking users by saying that long conversations could make the system repetitive, and that it may be “prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone”.

Conversations will now be capped at 50 chat turns per day and five chat turns per session, Microsoft said, with a chat turn being one question and one answer.

Most people already find the answer they are looking for within five turns, Microsoft said. Only around one per cent of conversations have more than 50 messages.

Now, if someone tries to talk to the system more than five times in a session, they will be prompted to start again. At the end of each chat session, users will also be asked to clean it up, removing the old conversation so that the model does not get confused.
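To make the mechanics concrete, here is a minimal sketch in Python of how per-session and per-day turn caps like those described above might work. The class names, limits, and messages are illustrative assumptions based on the reported figures, not Microsoft’s actual implementation.

```python
# Hypothetical sketch of the reported caps: five chat turns per session,
# 50 per day, where one turn is one question plus one answer.
SESSION_TURN_LIMIT = 5
DAILY_TURN_LIMIT = 50

class ChatSession:
    """Holds one conversation's context and turn count."""
    def __init__(self):
        self.history = []       # accumulated question/answer context
        self.session_turns = 0

class TurnLimiter:
    """Tracks a user's daily usage across sessions."""
    def __init__(self):
        self.daily_turns = 0

    def ask(self, session, question):
        if self.daily_turns >= DAILY_TURN_LIMIT:
            return None, "Daily limit reached. Please try again tomorrow."
        if session.session_turns >= SESSION_TURN_LIMIT:
            # The user is prompted to start a fresh session; discarding the
            # old context means the model no longer carries a long,
            # potentially confusing conversation forward.
            return None, "Turn limit reached. Please start a new topic."
        session.session_turns += 1
        self.daily_turns += 1
        session.history.append(question)
        answer = f"(model answer to: {question})"  # placeholder for the model call
        session.history.append(answer)
        return answer, None
```

On this sketch, a sixth question in the same session returns only the restart prompt, and a new ChatSession object starts with empty history, mirroring the “clean it up” step the article describes.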

Over time, Microsoft “will explore expanding the caps on chat sessions to further enhance search and discovery experiences”.

The new limitations on the chatbot come after those initial reports led to a host of articles in which journalists conducted long conversations with Bing. In one, for the New York Times, a reporter published a two-hour conversation in which the system appeared to become increasingly critical and belligerent.