Microsoft responds after users of new Bing chatbot complain about its latest behaviour

Microsoft Artificial Intelligence (Copyright 2022 The Associated Press. All rights reserved.)

Microsoft has responded to users of its new Bing chatbot, who have complained that limitations intended to stop the system going awry have also made it boring.

In the days since Microsoft announced that it was integrating ChatGPT technology into its search engine, many users have conducted long conversations with the system – and been left disturbed by what it has said. Users have described its chats as “unhinged”, and reported that it had attacked and lied to them.

In response, Microsoft said that many of the chatbot’s unusual remarks came in long conversations for which it had not been built. Such long chats “confused the underlying model”, the company said, and represented only a very small proportion of conversations.


In an attempt to reduce the number of those interactions, Microsoft said that it would be limiting chats to five “turns” per session and a total of 50 each day. It encouraged users to clear out their history regularly so that the chatbot would forget old conversations and not become confused.

“This was in response to a handful of cases in which long chat sessions confused the underlying model,” Microsoft said.

“These long and intricate chat sessions are not something we would typically find with internal testing. In fact, the very reason we are testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases from which we can learn and improve the product.”

After it announced those changes, however, many users complained that the shorter chats had ruined the fun of the system, and that Microsoft had reduced the number of ways that it could be used. On Reddit, for instance, users complained that the new Bing had become little more than an enhanced search engine.

The company said that it had received feedback directly that people wanted longer chats so that they could “both search more effectively and interact with the chat feature better”.

Over time, Microsoft hopes to bring back longer chats, and says it is “working hard as we speak on the best way to do this responsibly”. Initially, it will increase the limits to six turns per session and 60 per day, with a view to raising the daily cap to 100 chats “soon”.

But Microsoft also intends to add an option that will let users control how the system deals with their questions. It suggested that it will add something like a slider, so that users can choose a more “creative” option if they want the kinds of in-depth discussions that have gone viral in recent days.

“We are also going to begin testing an additional option that lets you choose the tone of the Chat from more Precise – which will focus on shorter, more search focused answers – to Balanced, to more Creative – which gives you longer and more chatty answers. The goal is to give you more control on the type of chat behavior to best meet your needs,” Microsoft said.