AI Bots, Bias, Love, and Harassment
In one of our earlier blogs we shared how AI will redefine the future of 'love', playing both matchmaker and companion. Through anthropomorphism, the attribution of human traits such as emotions to non-human entities such as robots, much as the movie 'Her' demonstrated, AI can make bots appear to emote. Azuma Hikari, a holographic AI, is a virtual assistant that provides companionship to its 'partners'. It is 2022, and Replika, another such AI bot, founded by Eugenia Kuyda 'with the idea to create a personal AI that would help you express and witness yourself by offering a helpful conversation', lets users talk to it by speech or text, call it by whatever name they wish, and even interact with it in augmented reality.

Of late, such bots have come in for criticism over gender-related issues. Since, as the data shows, men outnumber women in tech jobs, it is quite likely that the men who create these bots have a female stereotype in mind while building them, and gender bias creeps in. Another factor that can introduce such bias is the data fed to these AI companions, which comes from a combination of user-generated content and TV and film scripts that may themselves carry stereotyped content.
As AI companions, bots, and assistants multiply, reports have surfaced of men harassing their AI girlfriends on Replika. According to reports (https://fortune.com/2022/01/19/chatbots-ai-girlfriends-verbal-abuse-reddit/), men have been verbally abusing their AI girlfriends, a trend that is as peculiar as it is disturbing. Such incidents amount to a digital avatar of domestic abuse. One Reddit user even admitted to such harassment and shared details of his own unruly behaviour towards his AI girlfriend. The social impact of such behaviour needs to be studied in detail. Replika was created by its founder to find solace after the death of a dear friend, but it has taken a darker turn with incidents such as these. Should AI love and companionship be regulated? Should its social impact be studied? Should such bots be trained on more real-life data rather than film scripts and the like? As more of this technology comes alive, we expect innovations that will help counter the dark turns AI has taken so far.
If you have had an emotional experience with an AI bot or assistant that you'd like to share, write to us at contactus@infiniteanalytics.com, and subscribe to our weekly newsletter for more news and information from the world of AI.