The Signal for Help I Created Went Viral. Now It Could Be Misused

In 2020, I helped develop the Signal for Help, a hand signal that communicates to friends, family, and bystanders that “I need you to check in on me in a safe way.” Our team promoted the Signal for Help across social media, anticipating a pandemic-related rise in already high rates of gendered violence, and it went viral in November 2021 during a charged time of anxiety, stay-at-home orders, and the proliferation of video calls.

Cases of women and girls using the Signal for Help to get help in dangerous situations have made the news. For example, a woman used the Signal for Help during a traffic stop to get help with her abusive husband, and another woman used it to notify staff at a gas station that she was being held against her will by a violent ex-boyfriend. As a result, well-meaning people have been trying to integrate the Signal for Help with digital technology. A company with AI camera tools reached out to ask about building recognition of the Signal for Help into their security system, and there have been similar amateur attempts discussed on social media.

The appeal is clear: Automatic detection could be useful for a well-intentioned friend or coworker on the other side of a video call who might otherwise miss someone using the Signal for Help. It’s admirable that people want to help those who may be in danger, but these new applications of technology misunderstand the purpose and use of the Signal for Help.

Such efforts are part of a growing trend of using AI to recognize distress: Experiments identifying distress in livestock like chickens, cattle, and pigs yield promising results because AI seems to disentangle a cacophony of animal shrieks, clucks, and grunts better than the naked ear.

But humans are not chickens or cattle. The intention to abuse and control can transform Luddites into experts. In dangerous relationships, there’s always the question of who’s in charge of the tech.

The Signal for Help is an intentionally ephemeral tool, designed to help people communicate without uttering a word, and without leaving a digital trace. I’m being hurt … I can’t say it out loud … will you be there for me while I figure it out? Impermanence is an important feature, given the way abusers tend to control and manipulate. They lurk and stalk and monitor devices. Women’s shelters routinely help survivors deal with hacked smartphones, unwanted location tracking and voice recording apps, hidden cameras, and the like. Message boards, social media, and even word-of-mouth can help abusers violate the people they claim to love. In the case of the Signal for Help, abusers might use the very AI mechanisms designed for safety to alert them when the person they’re hurting tries to use the signal.

And there are other problems with AI tools to detect distress in humans, which include software to scan student emails and web searches for self-harm and violence, as well as to identify student confusion, boredom, and distraction in virtual classrooms. On top of ethical and privacy concerns, their deployment hinges on the belief that we can reliably perceive someone in trouble, and act on it in a way that will truly help them. These tools operate on a positivist belief that when a human is in distress, they express it outwardly in predictable ways. And when they express it, they desire a specific kind of intervention.

But research shows that our assumption that human facial expressions align with emotions is not one we can wholeheartedly believe. Mismatches between body and emotion may be more pronounced in unhealthy relationships. People being abused speak of dissociation, of needing to “leave their bodies” to survive. Some describe the lengths they go to in order to hide their upset, injury, and pain, and how they have to do so to placate abusers and the bystanders who back them up. They talk about how conscious they are of every inflection and twitch, of how they chew, blink, and breathe, and how they get punished when they merely exist in a way that aggravates their abusers.
