Chatbots are available to track your calorie intake, assist you in finding a new place to live, and even promote better mental health. But recently, developers have made chatbots with a social justice slant so you can report misdeeds quickly and without giving your name.
Here are a few of the bots that help users battle injustice without putting themselves at greater risk.
As a child, you may have learned that members of the police force are available to protect you and keep everyone safe. However, as you’ve gotten older, personal experiences or headlines involving officers acting with inappropriate and sometimes deadly force in the line of duty may have undermined that trust.
On his show, TV personality John Oliver discussed how a lack of accountability is one of the central problems with policing today. Citing Philip Stinson, a criminologist who tracks the aftermath of fatal police shootings via Google Alerts, Oliver noted that thousands of such incidents have occurred since 2005, yet only 77 officers were charged with murder or manslaughter, and only 26 were convicted.
Not only is that data unavailable while situations are unfolding, it often remains closed to the public after an incident as well. The Raheem chatbot aims to change that. Created by Brandon Anderson, whose life partner was killed by a police officer during a routine traffic stop, the chatbot lets users report both good and bad interactions they've had with members of law enforcement.
Most people don't report police misconduct because cities typically require filing a written complaint in person at a police station, a process that can be uncomfortable and even traumatizing. Raheem instead asks simple questions and guides users through making a report via Facebook Messenger. All submitted data is visible to others in real time on an online dashboard.
Sexual harassment is illegal in the workplace, but it still happens. Sometimes, multiple victims are associated with a single incident. Coworkers who are not the direct target of the harassment often hear or see what’s going on and feel uncomfortable. They may be too scared to report it to the human resources department, fearing potential retaliation or that the HR rep they talk to won’t take them seriously.
Fortunately, there's Spot, a chatbot that walks you through an interview about any sexual harassment you've experienced or witnessed. The interview can take as little as 10 minutes, or as long as you feel is necessary to provide all the details. The bot converts the information gathered during the session into a signed PDF report that victims can use as a reference and as proof of what occurred. Spot deletes reports from its database after 30 days.
Furthermore, if desired, a user can send the report from Spot’s secure servers to their company to take further action, still without giving a name.
One of the significant advantages of many of today’s emerging technologies is that they facilitate faster, more tailored data collection. Some, like artificial intelligence, use the compiled information to give you a more personalized experience on social media.
Technology can make data collection simpler for end users, too. Such is the case with the Jornaler@ chatbot, which helps reduce wage theft.
Keeping track of how much you earn isn’t always easy, especially if you work at several job sites or frequently get hired for temporary assignments. Unfortunately, employers with bad intentions can capitalize on that element of confusion and fail to pay workers the proper amounts.
Jornaler@ aids in recording your hours and the amount you’ve earned. And it allows you to report cases of wage theft to the appropriate body in your area. Plus, you can help others steer clear of dishonest employers by warning fellow workers about your experiences and posting updates through the app.
Is this the beginning of a tech-fueled social revolution?
The chatbots above promote reporting incidents while protecting users’ anonymity. And there are many more socially minded chatbots available in the growing field.
For example, Gabbie, a chatbot based in the Philippines, helps users determine if they’ve experienced sexual assault and can provide additional information about how to proceed. There’s also HelloCass, an Australian chatbot that offers information to domestic violence victims and their concerned friends.
Many laud these chatbots for helping people get assistance without making them feel more ashamed or frightened. In some cases, bots like these could help marginalized groups finally get access to the justice they need and deserve.
However, social justice chatbots have also drawn their share of criticism. SimSimi is banned in several countries, including Thailand and Ireland, because children have used it for cyberbullying. And although Spot, the sexual harassment-reporting chatbot profiled above, has obvious merits, critics say it may fall short when it comes to data protection and security practices.
Still, these chatbots undoubtedly make it easier for people to report troubling incidents they see or experience, especially when the technology upholds anonymity.
Kayla Matthews is a senior writer for MakeUseOf. Her work has also appeared on VICE, The Next Web, The Week, and TechnoBuffalo.