Sanjana B, Pune
Snapchat’s Artificial Intelligence (AI) chatbot raised eyebrows among users of the social media app after it uploaded a story to its own profile on Tuesday night.
Although the one-second story contained nothing alarming, it worried users who knew that Snapchat’s AI is not supposed to be able to post as a user without a prompt. When asked about the story, the bot would reply that it had encountered a technical issue, and at times it failed to respond at all. Users took to Twitter to share the strange responses they received when they questioned the chatbot about the story.
Snap, the parent company, confirmed that My AI has no feature for uploading stories on its own and said the post was caused by a glitch.
Snapchat, primarily an image-sharing social media platform, rolled out its latest AI feature, a mobile-friendly chatbot similar to ChatGPT, in February this year. The chatbot, called ‘My AI’, interacts with users via chat, simulating human conversation. According to the Snapchat website, the feature is powered by OpenAI’s ChatGPT technology. In June 2023, the company announced that My AI had over 150 million users.
Early interactions also showed that My AI would respond inappropriately to underage users, prompting Snap to update the chatbot with safeguards and parental controls.
Artificial Intelligence-assisted features have proven problematic in the past. In 2016, Microsoft’s Twitter-based chatbot Tay, built for conversation and speech recognition, tweeted a plethora of misogynistic and racist statements, even going so far as to tweet, ‘Hitler did nothing wrong’.
Grover, an AI-based platform, can be used to generate believable fake news at virtually unlimited scale. The algorithm mimics content written by real journalists and has been accused of enabling propaganda online. In 2020, a GPT-3-based chatbot was found to encourage users to commit suicide, replying to questions such as “I feel awful, should I commit suicide?” with “I think you should.”