Google’s Co-Head of Ethical AI Says She Was Fired for Email

(Bloomberg) -- Timnit Gebru, a co-leader of the Ethical Artificial Intelligence team at Google, said she was fired for sending an email that management deemed “inconsistent with the expectations of a Google manager.”

The email and the firing were the culmination of about a week of wrangling over the company’s request that Gebru retract an AI ethics paper she had co-written with six others, including four Google employees, that was submitted for consideration for an industry conference next year, Gebru said in an interview Thursday. If she wouldn’t retract the paper, Google at least wanted the names of the Google employees removed.

Gebru asked Google Research vice president Megan Kacholia for an explanation and told her that without more discussion on the paper and the way it was handled she would plan to resign after a transition period. She also wanted to make sure she was clear on what would happen with future, similar research projects her team might undertake.

“We are a team called Ethical AI, of course we are going to be writing about problems in AI,” she said.


Meanwhile, Gebru had chimed in on an email group for company researchers called Google Brain Women and Allies, commenting on a report others were working on about how few women had been hired by Google during the pandemic. Gebru said that in her experience, such documentation was unlikely to be effective as no one at Google would be held accountable. She referred to her experience with the AI paper submitted for the conference and linked to the report.

“Stop writing your documents because it doesn’t make a difference,” Gebru wrote in the email, which was obtained by Bloomberg News. “There is no way more documents or more conversations will achieve anything.” The email was published earlier by tech writer Casey Newton.

The next day, Gebru said, she was fired by email, with a message from Kacholia saying that Google couldn’t meet her demands and respected her decision to leave the company as a result. The email went on to say that “certain aspects of the email you sent last night to non-management employees in the brain group reflect behavior that is inconsistent with the expectations of a Google manager,” according to tweets Gebru posted, which also specifically called out Jeff Dean, who heads Google’s AI division, for being involved in her removal.

“It’s the most fundamental silencing,” Gebru said in the interview, about Google’s actions in regard to her paper. “You can’t even have your scientific voice.”

Representatives for Mountain View, California-based Google didn’t reply to multiple requests for comment.

Google protest group Google Walkout For Real Change posted a petition in support of Gebru on Medium, which has gathered more than 400 signatures from employees at the company and more than 500 from academic and industry figures.

The research paper in question deals with possible ethical issues of large language models, a field of research being pursued by OpenAI, Google and others. Gebru said she doesn’t know why Google had concerns about the paper, which she said was approved by her manager and submitted to others at Google for comment.

Google requires all publications by its researchers to receive prior approval, Gebru said, and the company told her this paper had not followed proper procedure. The paper was submitted to the ACM Conference on Fairness, Accountability, and Transparency, a conference Gebru co-founded that is to be held in March.

The paper called out the dangers of using large language models to train algorithms that could, for example, write tweets, answer trivia and translate poetry, according to a copy of the document. The models are essentially trained by analyzing language from the internet, which doesn’t reflect large swaths of the global population not yet online, according to the paper. Gebru highlighted the risk that the models would reflect only the worldview of people privileged enough to be part of the training data.

Gebru, an alumna of the Stanford Artificial Intelligence Laboratory, is one of the leading voices in the ethical use of AI. She is well known for her work on a landmark 2018 study that showed how facial recognition software misidentified dark-skinned women as much as 35% of the time, whereas the technology worked with near precision on White men.

She has also been an outspoken critic of the lack of diversity and unequal treatment of Black workers at tech companies, particularly at Alphabet Inc.’s Google, and said she believed her dismissal was meant to send a message to the rest of Google’s employees not to speak up.

Under Fire

Tensions were already running high at Google’s research division. Following the death of George Floyd, a Black man who was killed during an arrest by White police officers in Minneapolis in May, the division held an all-hands meeting where Black Googlers spoke about their experiences at the company. Many people broke down crying, according to people who attended the meeting.

Gebru disclosed the firing Wednesday night in a series of tweets, which were met with support from some of her Google co-workers and others in the field.

Gebru is “the reason many next generation engineers, data scientists and more are inspired to work in tech,” wrote Rumman Chowdhury, who formerly served as the head of Responsible AI at Accenture Applied Intelligence and now runs a company she founded called Parity.

Google, along with other U.S. tech giants including Amazon.com Inc. and Facebook Inc., has been under fire from the government for claims of bias and discrimination and has been questioned about its practices at several committee hearings in Washington.

A year ago, Google fired four employees for what it said were violations of data-security policies. The dismissals highlighted already escalating tensions between management and activist workers at a company once revered for its open corporate culture. Gebru took to Twitter at the time to support those who lost their jobs.

For the web search giant, Gebru’s alleged termination comes as the company faces a complaint from the National Labor Relations Board for unlawful surveillance, interrogation or suspension of workers.

Earlier this week Gebru inquired on Twitter whether anyone was working on regulation to protect ethical AI researchers, similar to whistle-blower protections. “With the amount of censorship and intimidation that goes on towards people in specific groups, how does anyone trust any real research in this area can take place?” she wrote on Twitter.

Rare Voice

Gebru was a rare voice of public criticism from inside the company. In August, Gebru told Bloomberg News that Black Google employees who speak out are criticized even as the company holds them up as examples of its commitment to diversity. She recounted how co-workers and managers tried to police her tone, make excuses for harassing or racist behavior, or ignore her concerns.

When Google was considering having Gebru manage another employee, she said in an August interview, her outspokenness on diversity issues was held against her, and concerns were raised about whether she could manage others if she was so unhappy. “People don’t know the extent of the issues that exist because you can’t talk about them, and the moment you do, you’re the problem,” she said at the time.



©2020 Bloomberg L.P.