DraftKings CEO rebuffs argument that AI could make sports betting more addictive: ‘There is some onus on the individual’

Artificial intelligence has been heralded for its ability to optimize tasks, from the menial to the more complex. But could it push unhealthy habits to a new extreme?

Like most organizations, DraftKings, the sports betting company, has increasingly incorporated AI into its business, amassing a trove of user data to enhance the platform.

But some argue that optimizing gambling could lead consumers down a dopamine rabbit hole by stoking the exact addictive behaviors that keep clients engaged.

The argument goes that once AI discerns what users want, it serves it up in a compelling, streamlined format. One need only look at e-commerce platforms as an example, which use AI to tailor shopping to a consumer’s exact tastes, keeping them in a loop of customized consumption.


Speaking with Fortune Executive Exchange, DraftKings CEO Jason Robins said that while the company takes gambling addiction very seriously, it shouldn’t accept full responsibility for preventing it.

“It’s not this black-and-white line,” he said. “There is some onus on the individual in these situations, too. But there’s a role we also have to play. We have to make sure that we’re both doing what we can to prevent it.”

To curb harmful fixations, Robins said the company offers tools that enable users to create limits for themselves, such as capping their monthly spending or time spent betting. DraftKings also employs a team dedicated to assessing high-risk users. If the team notices a client is betting for too long or spending a concerning amount of money, for instance, an employee reaches out to create some friction and assess the harm. Robins added that DraftKings’ advertisements list resources in its fine print, such as a help hotline for gambling addiction.
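The user-configured limits described above amount to simple enforcement logic: a cap the player sets, which the platform checks before accepting each bet. The sketch below illustrates the general idea; the class, field names, and cap values are all hypothetical and not based on DraftKings’ actual implementation.

```python
class BettingLimits:
    """Illustrative self-imposed spending cap of the kind described
    in the article. All names and semantics are hypothetical."""

    def __init__(self, monthly_cap: float):
        self.monthly_cap = monthly_cap      # user-chosen monthly spend limit
        self.spent_this_month = 0.0         # running total, reset each month

    def can_place_bet(self, amount: float) -> bool:
        # A bet is allowed only if it would not push the month's
        # total past the user's own cap.
        return self.spent_this_month + amount <= self.monthly_cap

    def record_bet(self, amount: float) -> None:
        if not self.can_place_bet(amount):
            raise ValueError("monthly spending cap reached")
        self.spent_this_month += amount


# Example: a user caps themselves at 500 per month.
limits = BettingLimits(monthly_cap=500.0)
limits.record_bet(300.0)
print(limits.can_place_bet(150.0))  # still under the cap
print(limits.can_place_bet(250.0))  # would exceed the cap
```

A time-spent cap would follow the same pattern with minutes instead of currency; the point is that the limit is chosen by the user and enforced mechanically by the platform.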

Still, using AI presents a deeper conundrum for a sports betting company that could hypothetically learn the language of addiction.

“We’re not using AI in that way,” Robins argued. “Were we ever to employ AI in a way that was trying to do things that would show certain products or whatever, it would be much more ease-of-use driven.” He said that at this point, the company uses AI mainly for chatbot functions, to write code, and to find areas in which it can boost client satisfaction rather than ramp up gambling.

“If I’m betting on a particular football team every single week, and you are continuing to make me scroll down 12 pages to find that, that’s annoying,” Robins said. “You want it to be up top. No different than Netflix or Amazon showing you you just bought socks last time. ‘Here, click it if you want to buy it again.’ So I think it’s much more convenience-oriented.”

Nevertheless, industry titans like Amazon and Netflix do use AI to gather data on consumer behavior and curate personalized featured products and content that influence their audiences to engage with their sites longer. Despite this, Robins said he remains unconvinced that AI has the power to create or augment addictions.

“People who have gambling issues, they’re going to have a gambling issue,” he said. “And the job is to help identify those people and get them the help and get them to understand they need help.” But he added: “It has to be on them to decide that they want to change that behavior.”

Robins acknowledged that it could be tricky asking users flagged as high-risk by DraftKings’ system to cut back. If a player has already lost a substantial amount of money, being asked to stop can feel upsetting. But he noted that if DraftKings draws in users with betting addictions, then it hasn’t done a good enough job of ensuring that people with an addiction don’t make it onto the platform in the first place.

“You have to try to build the product so that it’s providing value for the people who should be playing it. And the people who shouldn’t be playing it, you have to try to have them not play the product,” Robins said.

He believes AI can play a role in that. “AI can actually be helpful in that regard by having good models that identify player behavior patterns and then having means of intervening when you find that there’s somebody that you think has an issue,” Robins said. “But it’s a delicate thing, right? All we can do is flag high-risk situations and then have more manual interventions that dig into them.”
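The flag-then-intervene pattern Robins describes separates automated detection from human follow-up: a model scores behavior against risk signals, and anything flagged is routed to a review queue rather than acted on automatically. A minimal sketch of that pattern, with entirely hypothetical thresholds, field names, and rules:

```python
from dataclasses import dataclass

# Hypothetical thresholds -- illustrative only, not DraftKings' values.
MAX_SESSION_MINUTES = 180
MAX_MONTHLY_SPEND = 2000.0


@dataclass
class PlayerActivity:
    player_id: str
    session_minutes: int    # time spent betting in the current session
    monthly_spend: float    # total wagered this month


def flag_high_risk(activity: PlayerActivity) -> list:
    """Return the reasons (if any) this player should be queued for review."""
    reasons = []
    if activity.session_minutes > MAX_SESSION_MINUTES:
        reasons.append("long session")
    if activity.monthly_spend > MAX_MONTHLY_SPEND:
        reasons.append("high spend")
    return reasons


# Flagged players go to a human team for manual intervention,
# not an automated block -- mirroring the approach in the quote.
players = [
    PlayerActivity("p1", session_minutes=45, monthly_spend=120.0),
    PlayerActivity("p2", session_minutes=300, monthly_spend=5200.0),
]
review_queue = [p.player_id for p in players if flag_high_risk(p)]
print(review_queue)  # only p2 is flagged
```

In practice the detection step would be a learned model over many behavioral features rather than two fixed thresholds, but the division of labor is the same: software flags, people decide.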

This story was originally featured on Fortune.com