Imagine a future in which our mirror will be able to tell us what it really thinks when we ask ‘does my bum look big in this?’
It is often said that beauty is in the eye of the beholder – a subjective experience shaped by unconscious biases and by culture at large, with definitions that are as much philosophical as physical.
While the subjective nature of beauty has long been debated, the impact of technology on our self-image is rapidly becoming apparent. More than 1m selfies are taken each day and virtual make-up tools are becoming extensions of our creative expression at a time when up to eight out of ten women are dissatisfied with their appearance.
Tapping into this obsession with self-image, brands are developing smart products powered by deep learning algorithms to help us monitor, measure and achieve our beauty goals. RYNKL is a wrinkle analysis app powered by artificial intelligence (AI) that claims to ‘care about your looks’ and ‘help you adjust your lifestyle to look younger’, while the Map My Beauty app claims to use facial zone-recognition algorithms to offer make-up tutorials based on an objective assessment of a user’s face.
But what happens when technology is allowed to judge the end result?
Alarmingly, a future in which AI systems judge our reflection might be closer than we think. Last year, Youth Laboratories launched the first beauty contest judged by AI, where smart algorithms were used to analyse and rank selfies submitted by 600,000 people from across the world. The judging panel comprised five robots programmed to analyse each photo for wrinkles, facial symmetry, pimples and blemishes, and determine the race and age of its subject.
‘In the future, I think we will always interact with robots, and it’s a good idea to know, will robots like us? What will they think about us?’ Anastasia Georgievskaya, co-founder of Youth Laboratories, explained to The Globe and Mail. ‘Everything is becoming digital and robots are becoming so smart, we should also think about how robots perceive humans.’
But how exactly do robots perceive us? Of the 44 finalists in the AI beauty contest, an overwhelming majority were white, six were Asian and just one had dark skin. While the majority of people who submitted photos were white, many photos were submitted by people of colour. The ensuing controversy sparked debate about the downsides of programming technology to analyse a person’s attractiveness. Despite using supposedly objective factors such as facial symmetry and wrinkles, the contest ended up heavily favouring contestants with white skin.
This is essentially a data problem – an algorithm trained on photos in which people of colour are under-represented will learn a skewed standard – and it could be mitigated with more representative input. But it raises a key question: If beauty is in the eye of the beholder, should technology ever be relied on to judge a person’s appearance?
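How skewed training data produces skewed judgements can be illustrated with a deliberately simplified sketch. The toy ‘scorer’ below is hypothetical – real systems use thousands of facial features, not a single number standing in for skin tone – but the mechanism is the same: a model that rates faces by similarity to the average face it was trained on will favour whichever group dominates the training set.

```python
import statistics

def train(tones):
    """'Training' here just records the average face in the data."""
    return statistics.mean(tones)

def score(model, tone):
    """Score from 0 to 1: higher means closer to the learned average."""
    return 1.0 - abs(model - tone)

# A skewed training set: 95 light-skinned faces (tone 0.9)
# and only 5 dark-skinned faces (tone 0.2).
training_tones = [0.9] * 95 + [0.2] * 5
model = train(training_tones)

# The learned 'ideal' sits close to the majority group,
# so light-skinned faces score far higher than dark-skinned ones.
print(round(score(model, 0.9), 3))
print(round(score(model, 0.2), 3))
```

Rebalancing the training data moves the learned average – which is why a more representative input set narrows, though does not by itself eliminate, the gap.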
In 2015, psychologists at Wellesley College in Massachusetts asked more than 35,000 people to rate the attractiveness of 200 faces. They found that people’s preferences are largely shaped by their individual life experiences rather than their genes or shared upbringing – even identical twins had radically different aesthetic preferences. In a series of tests, the team examined the preferences of two people selected at random and found that, on average, 50% of those preferences did not overlap.
With modern technology fuelling the rise of a culture in which people are judged by their appearance and social standing – think Tinder matches, calorie counting apps, Facebook friends and Instagram likes – should we be adding the opinion of robots to the beauty mix when we already spend enough time worrying about what other people think of us?
For more on technology's impact on the beauty industry, read our 2017 Beauty Futures report.