Ed Finn on the cultural importance of algorithms

26 : 10 : 2017

The author of What Algorithms Want: Imagination in the Age of Computing discusses the magic and pitfalls of working with algorithms.

Archive Dreaming by Refik Anadol, Istanbul

What was the main question you were trying to answer in What Algorithms Want?

This project started for me with the question of the algorithm. It is a word that has been used more and more frequently in the past few years, to talk about more and more things, to the point that there is now a risk it could end up meaning nothing at all. I became very interested in that because, when you look at the history of computer science and the mathematical foundations of contemporary computation, there are very rigorous foundations for the notion of the algorithm and effective computability. But the cultural space that the algorithm occupies, and the way we think about the computer, have now become much fuzzier and increasingly expansive. So that was the question I wanted to answer. What is the cultural figure of the algorithm now? What is the gap between all of the things we think computers do and the things they really do?

Why did you feel this needed to be answered now more than ever before?

Today, computers can help us find the perfect date, the perfect spouse. They can also drive cars, assess people’s fitness for work and make billion-dollar decisions about the stock market. That, to me, means the stakes are becoming higher with algorithms. We’re investing more and more of our decision-making into these machines so it is something we need to recognise clearly and then grapple with, not just in a business sense, but socially, culturally and personally.

The most powerful combination is the human plus the algorithm… there are whole new art forms being created that use computational methods to enhance creative power.

What would your advice be for someone who wants to introduce algorithms into their offering?

The thing to understand about algorithms is that they occupy the functional role of magic. We desperately want our machines to perform these magic tricks for us and we’ll go to great lengths to ensure the magic seems real. For companies, it’s very powerful to have this kind of magic through technologies such as machine learning, personalisation and recommendations, but the same risks apply to companies as they do to consumers.

These risks come with having too much faith in these black boxes. We’re pattern-seeking animals and we’re always looking for meaning. So the more agency you give your tools, especially those that interact with your customers, the more you have to be careful about what kinds of meaning people will construe. I think the most important lesson is to understand what is going on inside the black box, and if that’s not possible, at least to understand the boundaries and framing of the box. What are the terms of the arrangement? What are consumers really giving up and what are they getting in return?

Archive Dreaming by Refik Anadol, Istanbul

In your book you talk in depth about the idea of algorithm aesthetics. How creative can algorithms be for a brand?

This is an open question in the long term because we're just beginning to explore this area. At present, some algorithms aren't just doing mathematical work but are creating and curating art and music, and writing pieces of fiction and news articles. So although we don't know in the long term where the boundaries of algorithmic creativity might be, in the short term we know that the most powerful combination is the human plus the algorithm. Wielded effectively, computational methods are giving rise to whole new art forms that enhance creative power. That's where the prospects for new compelling work lie.

Working with algorithms presents a lot of human challenges too, especially around ethics and diversity. Is this something to be aware of, and how can brands approach it?

I think it’s quite a concern and something that brands need to pay attention to, not necessarily on its own but as part of a larger reflection on what diversity means for that entity and industry. It’s clear that we are giving more power to these decision-making structures, and they reflect the beliefs of the people who create them. These people tend to be affluent white men from Silicon Valley who have certain assumptions about the world. Although those assumptions might be well intentioned, they also include assumptions about what their users know and should be able to do, and that is where problems can arise.

Algorithms are marketed as being much more objective, better-than-human solutions, but they carry many of the same flaws as humans.

The risk is that there are so many problems algorithms aspire to solve that people will simply try to solve them with more algorithms. There has been a lot of controversy about systems designed to inform sentencing decisions in courts in the US. It has been shown that these systems encode all sorts of bias in terms of race and class. They are marketed as much more objective, better-than-human algorithmic solutions, but they carry many of the same flaws as humans. Think about what diversity should be, not what it can be, and aspire to genuine diversity. It’s ultimately a human question about the people who are involved, not just a computational question.

Similarly, how important do you feel transparency is for consumers when it comes to offering data to algorithms?

Being thoughtful about transparency is very important too. It depends on the context of the problem or the product, but the word context itself is really important. If you can provide context around the decision that is being made, that can be very powerful. There is a growing recognition in the machine intelligence industry that humans need context around decisions. You have to construct plausible suggestions and a narrative around human data so that a person can assess whether it makes sense to buy into it.

Ed Finn is also the founding director of the Center for Science and the Imagination at Arizona State University

What this means to your brand

  • As we outlined in our Neo-kinship macrotrend, we are increasingly willing to share information with machines. As a result, opportunities are opening up to deliver more personalised services
  • In the same way that we urged brands to use data as a means of optimising the creative process in our Fashion Futures 2017 report, consider how algorithms can be used as part of current practices to enhance design