Has Netflix exposed a dark future for advertising?

19.12.2018 | Media : Advertising : Entertainment

Netflix was recently accused of tailoring its visuals to the race of its users. Is it clever hyper-personalisation or just plain stereotyping?


Will future advertisers deliver campaigns tailored to each gender, age group or ethnicity, maximising the amount of attention they can win?

Holly Friend, foresight writer, The Future Laboratory

Netflix’s algorithms are infamous. They entice us to watch new shows, analyse our levels of engagement and, as recently discovered, generate targeted advertising based on our perceived race.

At the end of October 2018, subscribers started to notice a pattern as they scrolled through Netflix – the promotional artwork for many films featured characters of their own ethnicity. This was first highlighted, as many controversies are, on Twitter. Audio producer Stacia L Brown posted a screenshot featuring the artwork for the film Like Father. According to Brown, the black characters who fronted its Netflix banner had ‘20 lines between them, tops’, while the film's lead actors and its wider cast were predominantly white.

Since Brown’s tweet, a number of Twitter users have claimed that they were similarly lured into watching titles such as Set It Up, Love Actually and The Good Cop by artwork that portrayed black characters. Netflix issued a cautious official statement in response, emphasising that it does not track demographic data and therefore cannot tailor artwork to the race of its users. What Netflix does track, however, is subscribers’ viewing history, which means the company can infer their ethnicity, sex and age using fairly straightforward algorithms.

This is actually part of Netflix’s long-term plan to hyper-personalise its service. In December 2017, the company outlined its plans to customise its artwork in an in-depth blog post, which demonstrated how its algorithms would decipher what makes a user click. For those who are fans of romantic comedies, the algorithm would suggest Good Will Hunting by generating a picture from the film’s romantic sub-plot. Is this tricking users into viewing content or a clever way of introducing them to new entertainment? Perhaps Netflix has found a way to algorithmically burst the entertainment filter bubble, discouraging viewers from re-watching their favourites over and over again.
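Netflix’s blog post describes this artwork personalisation only at a high level. As a rough illustration of the idea, here is a minimal, invented sketch of how a service might pick a poster variant based on a user’s genre affinity; the variant names, theme tags and scoring below are all assumptions for illustration, not Netflix’s actual system:

```python
# Toy artwork selector: invented data and scoring, for illustration only.
from collections import Counter

# Hypothetical artwork variants for one title, tagged with the themes
# each image emphasises.
ARTWORK_VARIANTS = {
    "Good Will Hunting": [
        {"id": "drama_poster", "themes": {"drama"}},
        {"id": "romance_poster", "themes": {"romance", "comedy"}},
    ]
}

def pick_artwork(title, viewing_history):
    """Choose the artwork variant whose themes best match the genres
    the user has watched most often."""
    genre_affinity = Counter(
        genre for show in viewing_history for genre in show["genres"]
    )
    # Score each variant by the summed affinity of its themes.
    return max(
        ARTWORK_VARIANTS[title],
        key=lambda v: sum(genre_affinity[t] for t in v["themes"]),
    )["id"]

history = [
    {"title": "Set It Up", "genres": ["romance", "comedy"]},
    {"title": "Love Actually", "genres": ["romance", "drama"]},
]
print(pick_artwork("Good Will Hunting", history))  # → romance_poster
```

For this rom-com-heavy history, the romance-themed poster wins; the same feedback-loop logic is what makes such systems prone to reinforcing whatever the user has already been shown.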


Netflix’s algorithms determine your TV choices

I wonder whether Netflix's intricate algorithm could be the next iteration of Amazon’s ‘Customers also purchased…’ feature, potentially signalling the future of targeted advertising. We’ve grown to accept that companies analyse our search history to sell us products. But what happens when they target customers based on the colour of their skin?

Diversity in advertising is improving, but what if digital billboards could analyse and adjust their messaging in line with who’s approaching? In fact, it’s already happening, as advertisers turn to technology to target passers-by with increasing sophistication. Translate this to TV and streaming, and future advertisers could deliver campaigns tailored to each gender, age group or ethnicity, maximising the amount of attention they can win.

But the Netflix debacle also forewarns about a darker future, in which our individual preferences are trumped by those of our collective demographic. English professor Chris Gilliard believes that using algorithms that are based on our viewing or purchase history is dangerous territory for brands, which are inadvertently encouraging racial profiling. ‘Once products and, more importantly, people, are coded as having certain preferences and tendencies, the feedback loops of algorithmic systems will work to reinforce these often flawed and discriminatory assumptions,’ he wrote in a recent article for Real Life magazine. ‘The presupposed problem of difference will become even more entrenched, the chasms between people will widen.’

Brands must acknowledge the biases that some algorithms are forcing on society, addressing and adjusting them accordingly. For more, read our macrotrend Morality Recoded.