
For Tech Firms, Power Lies in the Coding

Power lies with those who control technology, not necessarily those who own it.


Lenin is said to have distilled politics into two words: “Who? Whom?” If the previous four chapters are close to the mark, then we’ll need to think hard about the who and whom of power in the future. That’s the purpose of this short chapter.

Turning first to whom, it seems clear that most of us will become subject to technology’s power in two main ways. The first is when we engage technology for a particular purpose. That might be when we use a social media, communications, or shopping platform, or ride in a self-driving car. Almost everything we do will be mediated or facilitated by digital platforms and systems of one sort or another. Most of the time we won’t have a choice in the matter: in a fully cashless economy, for example, we’ll have no option but to use the digital payment platform or platforms. The second is as passive subjects — when, for instance, surveillance cameras track our progress down a street. Just going about our lives, we’ll necessarily, and often unconsciously, engage with technology. Even when we try to avoid it by switching off our personal devices, technology integrated into the world around us will often act on us in the background.

The question of who is a little more vexed. In modern times we’ve typically seen the distinction between the state and everyone else as the central cleavage in political life. This was the result of four assumptions. First, that only the state could force you to do things. Second, that it was the state (and not private firms) that did most of the work of scrutiny. Third, that the media (and not the state) properly enjoyed the power of perception-control. Fourth, that the power of perception-control was ultimately a less potent form of power than force and scrutiny. As we’ve seen, none of these assumptions are likely to hold in the digital lifeworld.

In Spheres of Justice (1983), the political philosopher Michael Walzer argues that “[d]omination is always mediated by some set of social goods” known as “dominant goods.” In a capitalist society, he explains, capital is the dominant good because it can be “readily converted” into other desirable things like power, prestige, and privilege. In the digital lifeworld, I suggest, the dominant good will be digital technology, because for those who control it, it won’t just bring convenience, amusement, or even wealth: it’ll bring power. Note that power will lie with those who control the technology, not necessarily those who own it. Your personal computer, your smartphone, your “smart” thermostat, locks, and meters, your self-driving car and your robotic assistant — you may well own these things in the future, but if today’s system is anything to go by, you’ll very rarely control the code inside them. Tech firms have control over the initial design of their products, determining their “formal and technical” properties as well as their “range of possibilities of utilisation.” And they’ll obviously retain control over platforms — like social media applications — that remain under their direct ownership. But they’ll also control the code in devices they sell. That means that technology we buy for one purpose can be reprogrammed without our consent or even our knowledge.

For tech firms, code is power.

But the state will muscle in too. Its ability to use force against us, for instance, would be greatly enhanced if it also had access to broad means of scrutiny. That’s why, although the state doesn’t own the technologies that gather data about us, it’s already tried to establish control over them — sometimes with the blessing of tech firms, sometimes against their will, and sometimes without their knowledge. To take a couple of examples, law-enforcement authorities don’t need to scan the emails of Gmail users for evidence of child pornography because Google does it for them and reports suspicious activity. Similarly, the state doesn’t need to compile public and private records of all the data collected about individuals (in the US the Constitution partly prevents it from doing so) but is perfectly able to purchase that information from data brokers who have undertaken the scrutiny themselves. Big Brother, it’s said, has been replaced by a swarm of corporate “Little Brothers.” In 2011 Google received more than 10,000 government requests for information and complied with 93 per cent of them. Tech firms comply with the government for various reasons: sometimes because they agree with the government’s aims, sometimes because they’re well paid, sometimes because they want to collaborate on cutting-edge technologies, and sometimes because it makes business sense to stay on the state’s good side. In this context, Philip Howard, professor of Internet Studies at the University of Oxford, has identified what he calls a “pact” between big tech firms and government: “a political, economic, and cultural arrangement of mutual benefit to both sides.”

As well as asking permission, the state will sometimes use the law to help it gain control over the means of scrutiny. Many European countries and the US have enacted laws requiring Internet Service Providers (ISPs) to adapt their networks so that they can be wiretapped. Sometimes, however, tech companies push back, as when Apple refused to accommodate the FBI’s demands that it unlock the iPhone of one of the San Bernardino terrorists.


But where the state wants information that it can’t buy, legislate for, or demand, it still has the illicit option of hacking the databases of those who hold it. One of the revelations made by Edward Snowden was that the National Security Agency (NSA) project MUSCULAR had compromised the cloud storage facilities of both Google and Yahoo, harvesting a vast trove of emails, text messages, video, and audio for its own purposes.

As we saw in chapter eight, in jurisdictions such as China the state has gained control not only over the means of force and scrutiny, but also over the means of perception-control: it can censor the news people receive, what they can find when they search for information, and even what they are able to say to each other using digital platforms. In the western hemisphere, too, the state has tried to muscle in on the means of perception-control, albeit in a more indirect way (people are wary of anything that looks like state control of the media). Think, for instance, of Google’s agreement to adjust its algorithm to demote sites that infringe copyright. This reduces the need for scrutiny or force by the state.

As well as the state and tech firms, less stable forms of power will also lie with hackers who temporarily assume control of given technologies. That could mean foreign governments, organized criminals, angry neighbours, naughty schoolkids, and industrial spies. More on this in chapter ten.
