Timnit Gebru: Google and big tech are 'institutionally racist'
Timnit Gebru, a highly influential artificial-intelligence computer scientist, is at the centre of a race row that has engulfed Google's AI research workforce and raised passions beyond.
She says she was fired by Google after it took issue with an academic paper she had co-authored.
And Dr Gebru and her supporters believe institutional racism played a role in her departure.
The paper focused on issues related to AI language models - including structural bias against women and people belonging to ethnic minorities.
Google says the paper omitted relevant research on the topic and Dr Gebru resigned.
An open letter demanding to know why the paper was rejected has been signed by more than 6,000 people, including prominent researchers at Google and its DeepMind division as well as Microsoft, Apple, Facebook, Amazon and Netflix, among others.
Google chief executive Sundar Pichai wrote in a memo: "We need to accept responsibility for the fact that a prominent black, female leader with immense talent left Google unhappily.
"It's incredibly important to me that our black, women, and under-represented Googlers know that we value you and you do belong at Google."
He added the company would investigate its handling of the matter but stopped short of apologising for Dr Gebru's departure.
Dr Gebru is far from satisfied with that response and has now spoken to the BBC. The following interview has been edited for brevity and clarity.
What have the past couple of weeks been like for you on a personal level?
They have been exhausting.
It is not fun to be in the spotlight like this.
I feel like I've been thrown into a storm.
What was the paper about?
I did not expect it to be such a wave-making paper or anything like that.
The paper was about the ethical considerations of the research and development of large language models - one of which was bias.
Google says the paper didn't contain some of the latest research on the subject.
First of all, that's not true.
Secondly, imagine if it was true - are you really going to justify terminating someone the way they terminated me because the paper didn't contain a literature review?
But it's absolutely not true.
Do you think that Google would have treated you differently if you were a white man?
I have definitely been treated differently.
In all of the cases that I've seen in the past, they [Google] try so hard not to make it a headline.
They try so hard to make it smooth.
When it's some other person who is toxic, there are always these conversations about: "Oh, but you know, they're so valuable to the company, they're a genius, they're just socially awkward, et cetera."
My entire team is completely behind me and they're taking risks.
They're taking actual risks to stand behind me.
My manager is standing behind me.
And even still, they decided to treat me in this way.
So definitely, I feel like I've been treated differently.
I suppose if you think that, the next obvious question is do you think Google itself is institutionally racist?
Yes, Google itself is institutionally racist.
That's quite a thing to say - you were a Google employee until a short while ago.
I feel like most if not all tech companies are institutionally racist.
I mean, how can I say that they are not institutionally racist?
The Congressional Black Caucus is the one who's forcing them to publish their diversity numbers.
It's not by accident that black women have one of the lowest retention rates [in the technology industry].
So for sure Google and all of the other tech companies are institutionally racist.
Google says it cares about diversity. Do you agree?
I don't agree that Google cares about diversity.
What they need to do is be comfortable with these kinds of uncomfortable discussions - don't silence people, don't retaliate against them.
What they're doing is saying: "OK, we're going to have some random committee," or: "We're going to invest a certain number of million dollars into something."
But when it comes to someone challenging the status quo in the slightest possible way, you see what just happened?
So I think it's just very difficult for me to believe that they care about diversity.
Sundar Pichai has apologised for the circumstances around your departure but not for the departure itself. How do you feel about that?
He didn't even apologise for the company's handling of it.
He said this has sowed doubts among some in our community, that they feel like they might not belong.
And for that, he's sorry.
So it doesn't say: "I'm sorry for the way we handled this. We were wrong. I'm sorry for what we did to her," - nothing.
I don't consider it an apology whatsoever.
I consider it a statement that had to be made to make them look better.
How do you see AI going forward? Do you worry about racial discrimination in AI?
Definitely - and a lot of people have said that they think the next frontier for discrimination… is in this kind of technological realm, in AI.
And so because of that, I worry very much about it.
And I and many - especially black - women have been writing about this and even teaching classes about this.
Unless there is some sort of shift of power, where the people who are most affected by these technologies are allowed to shape them as well, to imagine what these technologies should look like from the ground up and build them according to that - unless we move towards that kind of future, I am really worried that these tools are going to be used more for harm than good.
What are you going to do now?
I need to take some time to think about what I'll be doing next.
Right now, my priority is the safety of the people who've been supporting me.
I do not want any sort of retaliation against them.
So that's my priority right now.
BBC News put the accusations in this interview to Google.
It declined to comment but directed BBC News to Mr Pichai's memo and a statement made by Google AI head Jeff Dean, raising concerns about "gaps" in Dr Gebru's research paper.
James Clayton is the BBC's North America technology reporter based in San Francisco. Follow him on Twitter @jamesclayton5.