
Timnit Gebru debuts DAIR, an AI ethics group that will center marginalized communities

The global research center is starting with $3.7 million in funding, but aims to eventually earn income via AI ethics consulting.

On December 2, 2020, Timnit Gebru tweeted that she had been “immediately fired” from her role as co-lead of Google’s AI ethics team.

Exactly one year later, she announced her new venture, met with much fanfare in the tech community: the Distributed Artificial Intelligence Research Institute (DAIR). The global institute will focus on centering the marginalized communities most vulnerable to AI's harms.

Why it matters: Through both research and—eventually—consulting, Gebru hopes to push the tech industry toward a more ethical approach to AI deployment.

To recap, in December 2020, tensions between Gebru—then co-lead of Google’s AI ethics team—and Google leadership reportedly came to a head after a dispute over her research paper on the dangers of large language models.

Gebru was ultimately fired from Google, and in the months following, Google terminated AI ethics co-lead Margaret Mitchell, announced its continued forays into large language models, and restructured its AI ethics teams.

  • Google has disputed both Gebru’s and Mitchell’s versions of events.

With DAIR, Gebru has said she wants to use her past experiences in Big Tech to create an environment where researchers can openly communicate their findings about AI’s harms, and where research subjects are acknowledged or paid.

“I’ve been frustrated for a long time about the incentive structures that we have in place and how none of them seem to be appropriate for the kind of work I want to do,” Gebru told The Washington Post.

Since the tech world skews white and male, and AI research skews Western, DAIR will recruit staff from global communities that are underrepresented and underserved in the tech world—people who can help build beneficial AI applications that may not be created otherwise.

“Technology affects the whole world, but the whole world is not getting a chance to affect technology right now,” Gebru reportedly said. “If you want community-rooted research and you need to displace people from their communities, and they have to all go to Silicon Valley ... that’s not the kind of thing I want to contribute to.”


One example: DAIR is working on a public data set and paper showing how South African apartheid still shapes land use in the country today. An AI analysis of aerial images suggested that between 2011 and 2017, in one densely populated, low-income region, most vacant land was converted into wealthy residential developments.

No strings attached

DAIR has already raised $3.7 million in funding from investors including the Ford Foundation, the MacArthur Foundation, and the Rockefeller Foundation. But moving forward, to help cement its independence, the institute will increasingly look to earn operating income from consulting work in AI ethics.

“The same big tech leaders who push out people like me are also the leaders who control big philanthropy and the government’s agenda for the future of AI research,” Gebru wrote in an op-ed for The Guardian. “If I speak up and antagonize a potential funder, it is not only my job on the line, but the jobs of others at the institute.”

Looking ahead: Gebru wrote that a system of checks and balances, as well as dedicated government funding for independent AI research, is needed to help distribute power in the tech world—e.g., “alternatives to the hugely concentrated power of a few large tech companies and the elite universities closely intertwined with them.”
