Friends with bots

21.06.2016 | Chatbots : Artificial Intelligence : Personality
Microsoft's AI Tay

When Microsoft launched Tay, its artificially intelligent (AI) chatbot, it had no idea that it was heading for a PR nightmare.

Aimed at 18–24-year-olds in the US, Tay was designed to answer questions in a conversational manner to demonstrate how algorithms combined with machine learning could become self-sufficient and socially intelligent over time.

People were encouraged to speak to Tay via Twitter, Kik or GroupMe, and watch as she developed a more worldly and sophisticated persona.

Instead, a subset of internet users fed far-right propaganda into the system, which caused Tay to develop a racist, homophobic and morally corrupt personality.

When transcripts of Tay’s depraved rants began to surface on Twitter, Microsoft immediately pulled the plug. The experiment, while ambitious in scope, suggests that the world is not quite ready for open-ended chatbots.

Agencies that design chatbots for brands agree that this nascent technology works best with a closed-loop system, where a bot is programmed to answer questions on only a few specific topics, giving the impression of AI without ‘doing a Tay’.
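To make the ‘closed loop’ concrete, here is a minimal sketch in Python of how such a bot might work; the intents, patterns and replies are hypothetical illustrations, not taken from any of the brands mentioned. Every supported topic is enumerated up front, and anything outside that list falls through to a safe canned fallback, so hostile input can never reshape the bot’s behaviour.

```python
# Minimal closed-loop chatbot sketch. All intent names, patterns and
# replies below are hypothetical examples, not a real brand's bot.

import re

# Each intent pairs a keyword pattern with a canned, pre-approved reply.
INTENTS = [
    (re.compile(r"\b(flight|booking)\b", re.I),
     "I can help with flight bookings. What is your booking reference?"),
    (re.compile(r"\b(baggage|luggage)\b", re.I),
     "Your checked baggage allowance is shown on your ticket confirmation."),
]

FALLBACK = "Sorry, I can only help with bookings and baggage questions."

def reply(message: str) -> str:
    """Return the first matching canned reply, or the fallback.

    The bot never learns from user input, so abusive messages cannot
    alter its behaviour -- the key difference from an open-ended bot
    like Tay.
    """
    for pattern, answer in INTENTS:
        if pattern.search(message):
            return answer
    return FALLBACK

if __name__ == "__main__":
    print(reply("How much luggage can I take?"))   # matches the baggage intent
    print(reply("Teach me something offensive"))   # falls through to the fallback
```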

Brands such as Dutch airline KLM and news outlet CNN are finding success in this field by sticking to one use case and accounting for all possible scenarios to deliver a consistent service.

‘Start with something simple and scale up,’ explains Rehabstudio’s creative technologist Peter Gasston, who designs branded chatbots.

Another discomforting fact about Tay is that she was designed to have the personality of a teenage girl and communicate using the kind of internet patois favoured by members of Generation Z. There is something rather seedy about the idea of middle-aged male coders working to create a sweet and subordinate teenage persona. It harks back to a time when personal assistants were almost exclusively women, deemed incapable of more senior roles by a patriarchal society.

Apple’s Siri, Amazon’s Alexa and IPsoft’s Amelia, which is set to replace human workers at Enfield Council in North London, all feature a female persona, while IBM’s Watson, one of the most advanced AI systems in existence, speaks with a male voice.

To avoid making the same mistake as Microsoft, Google is working with a team of artists, including Pixar’s Emma Coats and Google Doodles lead Ryan Germick, to create an AI assistant with a more nuanced personality. The hope is that this human touch will enable greater emotional complexity and build trust between assistant and user. Right now, the relationship between human and bot is one-way, but as bots become more sophisticated, there is every chance they will come to be valued as friends and trusted partners.

However, the question remains: do people actually want relationships with inanimate objects, services, or even their bank accounts? As anthropologist Robin Dunbar famously argued in How Many Friends Does One Person Need?, humans can only maintain meaningful relationships with around 150 people, so choose wisely.