The US Army Research Laboratory has awarded a $1.5 million, three-year grant to two associate professors to develop an Alexa-like device, dubbed a “fully automated luxury microaggression detector,” intended to “catch implicit bias” in workplaces across America.
Excited to share that @criedl, @RichRadke, Paul Sajda and I were awarded a $1.5M grant from @ArmyResearchLab to study human-agent teams. We hope to develop tech to detect (and fix!) implicit and explicit bias in teams. https://t.co/1OYOA3XTuO
— Dr. Brooke Foucault Welles (@foucaultwelles) January 30, 2020
Despite the growing adoption of implicit bias training, some in the field of human resources have raised doubts about its effectiveness in improving diversity and inclusion within organizations.
But what if a smart device, similar to the Amazon Alexa, could tell when your boss inadvertently left a female colleague out of an important decision, or made her feel that her perspective wasn’t valued?
This device doesn’t yet exist, but Northeastern associate professors Christoph Riedl and Brooke Foucault Welles are preparing to embark on a three-year project that could yield such a gadget. The researchers will study, from a social science perspective, how teams communicate with each other, and with smart devices, while solving problems together.
“The vision that we have [for this project] is that you would have a device, maybe something like Amazon Alexa, that sits on the table and observes the human team members while they are working on a problem, and supports them in various ways,” says Riedl, an associate professor who studies crowdsourcing, open innovation, and network science. “One of the ways in which we think we can support that team is by ensuring equal inclusion of all team members.”
“One step closer to a fully automated luxury microaggression detector,” psychology professor Geoffrey Miller commented. “Coming soon to your schools, workplaces, and public spaces.”
This is beyond parody.
The pair have received a $1.5 million, three-year grant from the U.S. Army Research Laboratory to study teams using a combination of social science theories, machine learning, and audio-visual and physiological sensors.
[…] “You could imagine [a scenario] where maybe a manager at the end of a group deliberation gets a report that says person A was really dominating the conversation,” says Welles. The smart device would alert the manager to the participants whose input might have been excluded, she says, with a reminder to follow up with that individual.
As a woman, Welles says she knows all too well how it feels to be excluded in a professional setting.
“When you’re having this experience, it’s really hard as the woman in the room to intervene and be like, ‘you’re not listening to me,’ or ‘I said that and he repeated it and now suddenly we believe it,'” she says. “I really love the idea of building a system that both empowers women with evidence that this is happening so that we can feel validated and also helps us point out opportunities for intervention.”
Imagine getting a $1.5 million grant from the US Army Research Laboratory to create a mansplaining detector and thinking you’re oppressed.
The researchers expect to face some challenges. For one, says Welles, it’s anyone’s guess how knowing such a device is in the room will affect how its occupants interact with one another. Another unknown, she says, is how subjects will respond to any errors the device makes.
Indeed, it’s literally impossible to predict how men will feel having a microaggression detector spying on them at all times for crimes against wokeness!