A bias bounty for AI will help to catch unfair algorithms faster

AI systems are deployed all the time, but it can take months or even years until it becomes clear whether, and how, they’re biased. The stakes are often sky-high: unfair AI systems can cause innocent people to be arrested, and they can deny people housing, jobs, and basic services.

Today (October 20, 2022), a group of AI and machine-learning experts is launching a new bias bounty competition, which they hope will speed up the process of uncovering this kind of embedded prejudice. The competition, which takes inspiration from bug bounties in cybersecurity, calls on participants to create tools to identify and mitigate algorithmic biases in AI models.

It’s being organized by a group of volunteers who work at companies like Twitter, software company Splunk, and deepfake detection startup Reality Defender. They’ve dubbed themselves the “Bias Buccaneers.”

The first bias bounty competition will focus on bias in image detection, a common problem: in the past, for example, flawed image detection systems have misidentified Black people as gorillas.
Competitors are challenged to build a machine-learning model that labels each image with its skin tone, perceived gender, and age group, making it easier to measure and spot biases in datasets. They will be given access to a dataset of around 15,000 images of synthetically generated human faces. Entries are ranked on how accurately the model tags images and how long the code takes to run, among other metrics. The competition closes on November 30.
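
In practice, the task described above amounts to multi-attribute image classification. The sketch below is a minimal, hypothetical PyTorch example of one plausible shape for such a model: a shared pretrained backbone with a separate classification head per attribute. The backbone choice (ResNet-18) and the class counts for skin tone, perceived gender, and age group are illustrative assumptions, not the competition's official label schema.

```python
# Minimal multi-head face-attribute tagger (PyTorch).
# NOTE: attribute names and class counts below are assumptions for illustration,
# not the competition's official labels.
import torch
import torch.nn as nn
from torchvision import models


class FaceAttributeTagger(nn.Module):
    def __init__(self, n_skin_tones=10, n_genders=2, n_age_groups=4):
        super().__init__()
        # Shared pretrained backbone; the final fully connected layer is replaced
        # with an identity so each attribute gets its own small head.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()
        self.backbone = backbone
        self.skin_head = nn.Linear(feat_dim, n_skin_tones)
        self.gender_head = nn.Linear(feat_dim, n_genders)
        self.age_head = nn.Linear(feat_dim, n_age_groups)

    def forward(self, x):
        feats = self.backbone(x)
        return {
            "skin_tone": self.skin_head(feats),
            "perceived_gender": self.gender_head(feats),
            "age_group": self.age_head(feats),
        }


if __name__ == "__main__":
    model = FaceAttributeTagger().eval()
    dummy_batch = torch.randn(8, 3, 224, 224)  # 8 RGB face crops
    with torch.no_grad():
        logits = model(dummy_batch)
    for attribute, out in logits.items():
        print(attribute, out.argmax(dim=1).tolist())
```

Once a model like this tags a dataset, accuracy can be broken out per demographic subgroup (for example, per skin-tone bin), which is what makes skews in the underlying data visible.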

Microsoft and the startup Robust Intelligence have committed prize money of $6,000 for the winner, $4,000 for the runner-up, and $2,000 for third place. Amazon has contributed $5,000 toward computing power for the first set of entrants.

The competition is an example of a budding industry emerging in AI: auditing for algorithmic bias. Twitter launched the first AI bias bounty last year, and Stanford University just concluded its first AI audit challenge. Meanwhile, the nonprofit Mozilla is creating tools for AI auditors.
