In Future Politics: Living Together in a World Transformed by Tech, Jamie Susskind makes accessible the often head-scratching world of technological development and how it intersects with everyday politics. In this interview, Susskind discusses some of the factors contributing to the rapid integration of tech into our daily lives and how we can take control away from market-driven tech firms by achieving a greater understanding of the technology that surrounds us.
Samantha Borek: Future Politics gets into the nitty-gritty language of both politics and the tech industry, but the book never felt inaccessible or dense. Was this a conscious effort in writing the book?
Jamie Susskind: Thank you for saying so! Absolutely — one of my main aims in writing Future Politics was to make sure that anyone could understand what lies in store for us, but without oversimplifying. Too much tech writing (and too much political theory) is so convoluted and narrow in scope that it can’t hope to reach a general readership. I believe that the changes caused by digital technology will affect all of us, so it’s essential that all of us are able to think and speak about them in an intelligent way. That’s what I hoped to achieve with Future Politics.
Do you think that advertising, especially the warm and fuzzy ads of the holidays, has played a large role in the exponential rise of technology? Do they affect people’s willingness to let potential surveillance tech like the Amazon Echo or Google Home into their lives?
It’s certainly true that consumer demand is behind a lot of technological development (although not all of it), and that advertisements contribute to consumer demand in increasingly sophisticated ways. And yes, adverts for tech products are only going to show you the personal upsides to making a purchase, rather than the societal consequences or the potential downsides. One of my arguments in Future Politics is that we have to stop seeing tech solely as consumers, and start looking at it as citizens, applying the same civic scrutiny that we would bring to any other form of power.
I should add that I believe we are all becoming more comfortable with the notion of almost constant scrutiny of our lives in the form of data-gathering. Unlike most previous generations, whose lives were forgotten almost immediately, living in the digital age means that an increasing amount of our lived experience — what we do, where we go, what we think, what we purchase, and so forth — is captured in permanent or semi-permanent form. That is a broader trend — and while advertising does little to discourage it, it can’t be called the sole cause.
In the book, you discuss technological coding as a form of power. What are the implications of that power in the hands of tech companies, and how can we put ourselves on an equal playing field?
Those who write code will increasingly write the rules by which the rest of us live our lives. So, when you take a ride in a self-driving car, you will be subject to the rules coded into that vehicle: It may refuse to exceed the speed limit, or to park in a particular spot. It may automatically pull over for the police in circumstances where you would have been inclined not to, had you been driving. In the future, digital technologies are going to be everywhere, touching every aspect of our lives. So too will the rules in those technologies.
You can’t ask a technology to do something it’s not coded to do, which is why a tweet of more than 280 characters simply won’t send, or a DVD made for use in North America simply won’t play on DVD players manufactured in Europe. As more and more of our lives — and our freedoms — are lived out through technologies, we will be increasingly subject to the rules coded into those technologies.
As I argue in Future Politics, code is therefore going to have a major impact on various aspects of our lives that we would traditionally have seen as “political” — our freedom, the health of our democracy, the justice (or otherwise) of our distributions of goods. In line with a long tradition of Western political philosophy, I call for greater transparency and accountability (yes, of tech firms) so that we never become the slaves of systems that we have no opportunity to affect or alter. I don’t believe the market mechanism is enough to regulate these systems, often because they establish monopolies through large network effects. The change needs to be more profound.
Define the concept of “re-magification” with regard to technology. How does it affect policy making, or policy makers’ understanding of technology?
In the past, advances in science and technology helped humans to strip away some of the mystery of the world. Max Weber, writing at the turn of the 20th century, identified the central feature of modernity as Entzauberung, translated as de-magification or disenchantment. This was the process by which magic and superstition were replaced by rational observation as the preferred means for explaining the mysteries of life.
I use the term re-magification to describe the opposite: a generation that increasingly finds itself surrounded by technologies of extraordinary power, subtlety and complexity — most of which we can barely understand, let alone control. This is the sense in which Arthur C. Clarke wrote that “any sufficiently advanced technology is indistinguishable from magic.”
Obviously, this has implications for policy making, as calls for “regulation” and the like will be futile, even counterproductive, if lawmakers do not have a decent grasp of the technologies they are looking to regulate. And in fairness, I do think a decent grasp is enough: You don’t need to be an engineer to understand in broad terms how a machine learning system works. And you don’t need to be Mark Zuckerberg to understand how Facebook makes money without charging its users. Sadly, many politicians are still behind the curve on even the basics. Until we have at least a basic grounding in how digital technology actually works, we are going to struggle to think sensibly about it politically.
Social media platforms like Twitter and Facebook have contributed massively to resistance organizing in recent years, but they’ve also amplified voices on the far-right. Will we need to rethink the meaning of “free speech” on these platforms, and what does that imply for the future of democracy discussed in the book?
The simple truth is that, for better or worse, the “freedom” of our speech is now increasingly reliant on the decisions taken by tech firms that host the speech platforms. They determine the forms of communication that are allowed (for example, images, audio, text, hologram, virtual reality, augmented reality, no more than 280 characters and so forth). They also determine the audience for our communications, including who can be contacted (members of the network only?) and how content is ranked and sorted according to relevance, popularity or other criteria. And, yes, they even determine the content of what may be said, prohibiting speech that they deem unacceptable.
There is no inherent reason why these platforms cannot be engineered in ways that are likely to promote truth or civility, rather than their opposite, but of course there are free-speech trade-offs in doing so. Moreover, I reckon few of us will feel comfortable with the idea that the health of our democratic discourse is increasingly going to be determined by firms whose main interest is the pursuit of profit rather than the health of democracy, and whose engineers and executives may not have the faintest idea about the philosophy of free speech, where it comes from and why it matters. Just as politicians’ pronouncements on technology can be cringe-inducing in their naivety, so can the pronouncements of tech “gurus” about politics.
There needs to be a meeting of worlds, whereby we recognize, frankly, the new political role that these platforms play; that they are not (and cannot be treated as) simply commercial entities like any other. The decisions they take strike at the heart of the way we live together. Regulation will play a part, but we also need more “philosophical engineers” worthy of the name.
Do you think younger generations are learning to be better equipped to handle the rise of technology, and should we be looking to them to guide conversations about policy regarding that tech?
Each generation brings a unique perspective. I am a millennial, and when I wrote the book, I expected that those my age and younger would be the most receptive audience for my work. But what’s been interesting to me is that folk older than me — Baby Boomers in particular — have taken to it with gusto.
Likewise, those a few years younger than me (especially those who cannot remember a time before the internet) often have different political priorities from my own. They seem less fazed by a loss of privacy but more concerned about the effects of technology on the distribution of wealth.
I have no statistical evidence for these findings — they are just what I have learned from speaking in front of lots of different crowds — but the overall point is that it cannot be any one generation that shoulders the responsibility here, just as no single generation could solve climate change on its own. It requires all of us.