
To Fight Big Tech, We Must Seize the Means of Computation

“[Elon] Musk is kind of an unsubtle example of the problem with Tech Bros,” says Cory Doctorow. 


“If you’ve never tried to organize a movement without the internet, I’m here to tell you, it’s really hard. We need to seize the means of computation, because while the internet isn’t the most important thing that we have to worry about right now, all the things that are more important, gender and racial justice, inequality, the climate emergency, those are struggles that we’re going to win or lose by organizing on the internet,” says author and activist Cory Doctorow. In this episode of “Movement Memos,” host Kelly Hayes talks with Doctorow about the lessons of his book The Internet Con: How to Seize the Means of Computation.

Music by Son Monarcas & David Celeste


Note: This is a rush transcript and has been lightly edited for clarity. Copy may not be in its final form.

Kelly Hayes: Welcome to “Movement Memos,” a Truthout podcast about organizing, solidarity and the work of making change. I’m your host, writer and organizer Kelly Hayes. Today, we are talking about how Big Tech wrecked the internet and what we can do about it. I think we all know what it’s like to want to be done with an app or a website. Whether we’re fed up with unchecked harassment, deceptive marketing, the drain on our time or attention, or the sheer number of Nazis on a platform, I think most of us know what it’s like to want out, yet still feel walled in. We may worry that we’ll lose touch with family and friends if we permanently log off, or we may need our social media reach to facilitate our activism or careers. Some of us have deleted social media apps from our phones, in an effort to curb our scrolling time, only to download them again in short order, because we just can’t kick the habit. If, like me, you continue to use platforms whose owners you despise or that you honestly don’t believe should exist, this episode is for you. Because today, we are talking to author Cory Doctorow about his new book The Internet Con: How to Seize the Means of Computation. Cory Doctorow is a science fiction author, activist and journalist. He is the author of many books, most recently The Internet Con, which he has characterized as a “Big Tech disassembly manual.” Cory’s other recent works include Red Team Blues, a science fiction crime thriller; Chokepoint Capitalism, a nonfiction book about monopoly and creative labor markets; and the Little Brother series for young adults. In 2020, Cory was inducted into the Canadian Science Fiction and Fantasy Hall of Fame.

We have been talking a lot this season on the show about how Big Tech is shaping our lives and our world, and I was really grateful for the opportunity to discuss these issues with Cory, whose book offers an invaluable resource to people who want to push back against Big Tech. As Cory writes in the book’s introduction:

This is a shovel-ready book. It explains, in nontechnical language, how to dismantle Big Tech’s control over our digital lives and devolve control to the people who suffer most under Big Tech’s hegemony: marginalized users, low-level tech workers and the people who live downstream of tech’s exhaust plume: people choking on toxic waste from the tech industry and people living under dictatorships where control is maintained with off-the-shelf cyberweapons used to hunt opposition figures.

If we want to push back against the dystopian dynamics of the tech world, we have to understand how we got here. We need to know what we’re dealing with and what changes might upend the dynamics that currently define our experience of the internet. We have been discussing tech issues a lot this season on “Movement Memos” because I think there is a serious need for political education on this topic in our movements. With this podcast, we try to make every episode a resource for people who want to take action and change the world. I have been grateful to hear from folks who have used the podcast in their classrooms, for popular education in their organizing groups, or to help shape their own analysis. Your support and your feedback mean the world to us. If you believe in what we are doing and want to support the show, you can help sustain our work by subscribing to Truthout’s newsletter or by making a donation. You can also support “Movement Memos” by subscribing to the podcast on Apple or Spotify, or wherever you get your podcasts, or by leaving a positive review on those platforms. Sharing episodes that you find useful with your friends and co-strugglers is also a big help, so if you are doing any of those things in support of the show, I want to extend our warmest thanks. I love this work, and we couldn’t do it without you, so thanks for believing in us and for all that you do. And with that, I hope you enjoy the show.

(musical interlude)

Cory Doctorow: My name is Cory Doctorow. I’m a science fiction writer and an activist, and I spent more than 20 years working on tech policy and digital human rights, mostly with an organization called the Electronic Frontier Foundation, where I’m now a special advisor, but I was formerly the European Director. The Internet Con is my latest book. I write when I’m anxious and I mostly write science fiction novels. I came out of the pandemic with eight books. This is one of the two nonfiction books that I came out of pandemic with, and this one’s from Verso. It’s called The Internet Con: How to Seize the Means of Computation. And it explains how the internet became what Tom Eastman calls five giant websites full of screenshots of text from the other four, and lays out a plan for actually doing something about it. Not just things that we think will make the tech company sad, but things that might make us happy by giving us an internet that we deserve and indeed need.

KH: So, how did the internet become such a mess? Given that the internet as we know it has only existed for a few decades, it’s easy to zero in on regulatory failures and other major events during that time when trying to understand how we got here. But to really get our heads around what Cory calls the internet con, we have to rewind further and talk about the nature of monopolies and the gutting of antitrust laws.

CD: So antitrust law begins in the late 19th century, the Robber Baron era. The first antitrust law was passed by a guy called Senator John Sherman, whose better-known brother was William Tecumseh Sherman. And he really summed up the case for antitrust in a speech he gave in 1890 on the floor of the Senate, where he was stumping for his bill, where he said, “We got rid of kings, but now we have these kings of trade. If we would not allow a king over the daily lives that we live, we shouldn’t allow an autocrat of trade to decide how we live.”

So Sherman was worried about companies that would get too big to fail and too big to jail. Companies that would be so big that it would be impossible to hold them to account, because even if they were bang to rights, and even if you could muster the political will to do something about them, getting rid of them would leave such a giant hole in society, because of all the important services they provided, that they’d still get away with it. And that was the basis on which we enforced antitrust law for the next 80 years, until the mid-1970s. We passed lots of other antitrust laws along the way, the Clayton Act and the Federal Trade Commission Act and so on.

But when you get to the mid-seventies and Jimmy Carter, he starts to toy with this idea that comes from a fringe character called Robert Bork, best known for the adjective “borked,” which comes from how badly he bungled his confirmation when Reagan tried to put him on the Supreme Court and the Senate shredded him for having been Nixon’s solicitor general. And Bork was a conspiracy theorist who said that John Sherman and all these other people who wrote antitrust laws actually loved monopolies, that they thought monopolies were really powerful and really efficient and benefited all of us as consumers, and that the last thing we should do with these laws, written by people who were very clear about what they wanted, is use them against monopolies.

And so as a result, we then spent the last 40 years… It started with Carter and accelerated under Reagan, and then every administration up to the current one has piled onto this. We spent the last 40 years tolerating monopolies, allowing these autocrats of trade to spring up so that in every sector, not just tech, we now have between one and five companies running everything that’s important to our lives.

Computers have historically been very resistant to monopolization because they have this intrinsic characteristic that is almost mystical, and certainly very technical, which is this thing called universality. The only computer we know how to make is what’s technically called a universal, Turing-complete von Neumann machine, which is a lot of words. It just means that the only computer we can make is a computer that can run every valid program. And so every program that can run on your desktop computer could also run on your printer, and could also run on your thermostat, and could also run on that little system-on-a-chip in your singing greeting card.

And what that means is that historically, if you designed a computer, say an IBM mainframe, and you were charging 10,000 percent margins on hard drives for your IBM mainframe, someone, a company like Fujitsu, say, could make what was then called a plug-compatible hard drive, one that would just plug in, and charge only a 100 percent margin, or even a 10 percent margin. And as a result, tech was very dynamic. Companies would come and companies would go, and when a new company came on the scene, it could take all of the things the old company was using to hold you prisoner, the proprietary formats for your data, the proprietary formats for storage, the proprietary architecture the programs ran on, and it could make conversion layers and tools and utilities and plug-compatible devices, and it could just set you free, so that you would always have low switching costs to go from one technology to another.

And that meant that tech companies, even though they weren’t run by people who were any better than anyone else, were always keenly aware that their customers were, as The Google Boys used to say, “one click away from going somewhere else.” And they put a lot of energy into making sure that their customers were happy, because they really understood that if their customers were sad, that those customers had a lot of options. And as tech got bigger… And the way it got bigger was by buying its competitors. Google is a company that made one good product 25 years ago, a really good search engine, and then proceeded to fail at everything else they tried to make in-house, from multiple social media platforms, to a video platform, to a smart cities tool, to wifi balloons. Even their RSS reader, they all went down in flames. Google just went out and they bought other companies, right? They went out and bought a video stack and an ad tech stack and a mobile stack and a server management stack.

Docs, maps, collaboration tools, satellite photos, you name it: it’s someone else’s idea that they bought and operationalized. And operationalizing things is very important, I’m all about the care work, but to call them innovators is a stretch. And so the way that they grew was just by buying out everyone who might someday do to them what they did to the companies that came before them.

And that’s what everyone else did too. Tim Cook in 2019 told Kara Swisher that Apple had bought 90 companies the year before, Apple’s coming home with a new company more often than you’re coming home with a bag of groceries. And as a result, we got this massive concentration in tech, which left us not only with few places to go, but also a sector that was so concentrated in chummy that it found it quite easy to capture its regulators, so that the things that other companies used to do that those companies did to attain scale, to enter the market, to take customers from the incumbents, reverse engineering bots, scraping, making compatible products, all of those things became illegal. The pirates became admirals and they said, “When we did it, that was progress. And when you do it to us, that’s theft.” And they managed to create something that we can think of, as Jay Freeman says, as felony contempt of business model.

We’re doing things that aren’t illegal, but that make their shareholders sad. Can be made illegal by mobilizing different parts of IP law that have been distorted beyond all recognition, into a charter that allows these companies to control the conduct of their competitors, their critics, and us, their customers.

KH: In The Internet Con, Cory writes:

The history of technology is one long guerrilla fight where the established giants wield network effects against scrappy upstarts, whose asymmetrical warfare weapon of choice is low switching costs.

There is perhaps no better example of a company whose dominance is propped up by high switching costs than Facebook, a platform that many of us loathe but use anyway.

CD: So when Facebook started, well first they made themselves available just to American college kids, right? You had to have a .edu address to be a Facebook user. But eventually they ran out of worlds to conquer there and they decided they were going to welcome the general public to Facebook, and they went to everyone who had a social media account and they said, “you should come over to Facebook.” But there’s a problem with leaving social media, which is that as good as a rival service might be, if your friends aren’t there, it’s not good enough. You don’t use social media because the user interface is nice or you like the graphics or it has a great app. You use social media because of the people who are there. And when a lot of people gather in one place, they get a kind of inertia. I mean, anyone who’s ever, I don’t know, tried to figure out what movie to go to with all of their friend group on a Friday night knows what this is like, right? You could be arguing for hours and still never pick a movie.

And so when you’ve got a couple of hundred friends of yours on one social media platform and you want to go to another, well, maybe everyone’s dissatisfied with the one you’re on, but they’re not all ready to leave at the same time and they don’t always agree on where they should go next. So Facebook said, “hey, we’re going to solve this collective action problem for you. We’re going to lower the switching cost of leaving the major platform.” Which was not good, right? The major platform was this decaying social media platform owned by a crapulent, evil, senescent Australian billionaire whose name was Rupert Murdoch. And the platform is called MySpace, and most people who used it hated it, but they love their friends, and so they stayed there.

And Facebook said, “okay, well here you go. Here’s a bot. You give that bot your login and your password and it will log into MySpace and pretend to be you, and it will grab any messages that are waiting for you there and it’ll stick them in your Facebook inbox, and you can reply to them there and it will push them back out to MySpace. And back and forth and back and forth you can go so that you don’t all have to agree to leave at once. You can go when you’re ready and your friends can go when they’re ready. And if they’re never ready, well that’s fine. They can still be your friends and you can still stay in touch with them.”

Now, if you tried to do that to Facebook today, they would destroy you. They would say that you’d violated Section 1201 of the Digital Millennium Copyright Act, a 1998 law that makes it a crime to reverse engineer copyright access control systems. They would say that you had engaged in tortious interference with contract, an obscure part of contract law that has now become a weapon of choice for stopping people from making interoperable products. They’d say that you violated their patents and their copyrights and their trademarks and their trade secrets. If the person who helped build it used to work at Facebook, they’d say they violated their non-compete and their non-disclosure. So up and down the stack, they would mobilize this kind of weird gnarly hairball that we call IP [intellectual property] law to destroy you before you could even get started, because the pirates all want to become admirals. And when Facebook did it to MySpace, that was progress. But if you try to do that to Facebook, that’s theft.

KH: Under capitalism, all the wrong people tend to decide what does and doesn’t constitute theft. This is a problem that comes up a lot on the internet around copyright law and intellectual property. A system called “notice and takedown” is supposed to allow people to notify a platform when their copyright has been violated so that the platform can remove the offending material or monetize it in favor of the copyright holder. But like so many features of the internet, this mechanism has often been weaponized to enact censorship and harm platform users.

CD: When the internet started, it wasn’t clear what the rules should be for intermediaries who host other people’s speech, what we call content, but which we should really think of as speech. The things that you post that are expressive and matter to you, and that we have in mind when we talk about free speech and free expression and the First Amendment and Charter rights in Canada, and all that good stuff. It was pretty clear early on that most people, at least then, were not going to be in a position to host their own speech. That you weren’t going to run your own server and your own service that other people could log into in order to talk to you.

And even today, when technically that’s much simpler, because so many people are corralled within very large platforms, standing up your own server doesn’t really matter, because no one’s going to be able to hear you or see what you have to say or speak back to you unless you’re on one of the big platforms. And so in 1998, Congress passed the Digital Millennium Copyright Act. That’s the same law that has the anti-circumvention provision, Section 1201. It also has another provision, Section 512, which says that if a platform, an intermediary, hosts a user’s speech, and that speech is later accused of infringing copyright, the intermediary will not be a party to the copyright infringement, won’t be liable for damages, won’t have to go to court. And the damages are crazy, right? The statutory damages for that kind of copyright infringement are $150,000 per download for a civil infraction and $250,000 for criminal infractions.

So even for a company the size of Facebook, it could quickly put the company out of business. And so they said, instead of making you jointly liable for everything your users post, which would lead to the circumstance where anyone who wanted to host speech for third parties would have to somehow discover whether that message you posted saying, “Hey guys, what movie should we see this Friday?” infringed copyright before they made it public, which is sort of full employment for every copyright lawyer who ever lived or could be trained, they would just say, “Look, if someone ever calls you out or sends you an email accusing one of your users of infringing their copyright, provided that you expeditiously remove that content, you won’t be a party to the copyright infringement.” And that may seem like a happy medium, and there are some ways in which it’s okay, but it has ended up turning into a kind of thermonuclear weapon that can be used against speech.

So you have things like reputation management firms, who are really reputation launderers who work for torturers and murderers and corporate criminals, and they scrub the reputations of these people in part by writing to the search engines and saying, “The material that you’ve indexed that contains a truthful account of my client’s misdeeds, infringes my copyright, and I require you to de-list it from the search engine.” Often the newspaper or the blogger or whoever has written the thing that they’re objecting to never even finds out that one of their many articles has been de-listed from Google, so that material just goes down the memory hole.

When it comes to creative workers, it’s very bad indeed. YouTube operates a kind of notice and takedown service on steroids, what’s sometimes called a notice and staydown service, where a notification that something infringes copyright can result in all copies of it being taken down immediately and all future copies being prevented from being posted.

This system that Google runs for YouTube is called Content ID, and the way it works is you upload some sound file and you say, “If this sound file is contained in any video that anyone posts to YouTube, I require you to remove it.” Or alternatively, you can say, “I require you to demonetize it,” or, “I require you to put ads on it and give the ad revenue to me.” And so this has been a disaster for all kinds of performers. Obviously it’s very hard on people who make documentary films that include excerpts from other films, or on musicologists discussing the music they study. One scholarly conference live-streamed an eight-hour event, and during the lunch break, some music was played that was licensed for playing in the hall, but not for live-streaming. The result was that the entire video’s audio was removed permanently, so that no one can hear anything that happened at that conference.

And if you’re a classical music player or performer, you have a very good chance of Sony Records taking down your performance, because Sony owns this giant library of classical music performances, and to the YouTube algorithm, anything that you play on your piano or cello or whatever is going to sound enough like a Sony recording that it will be taken down. And Sony has been prone in past and today to, when someone contests that take down, to insist no, that this really does infringe their copyright even when it doesn’t.

And then the worst of all is where you have creative workers who rely on YouTube for their income, either directing people to their Patreon or collecting ad revenue or doing something else like music teachers who drum up students by offering free lessons online. And you have predators who will send in bogus copyright claims against that person. And on YouTube, if you get three copyright claims, your account is removed permanently and you can’t open a new one. And they send two of these bogus claims so that you are one claim away from losing your livelihood on YouTube forever. And then they privately message you and demand ransom, and they say, “If you don’t pay them, they will send in that third claim, and the giant machine that is YouTube customer service will never hear your pleas, and you’ll be out of business.”

Authoritarians would like an internet where disfavored speech is not permitted, and where people who say disfavored things can be identified and punished. And this has, broadly speaking, been the goal of the entertainment companies since 1996. That’s when Bill Clinton’s copyright czar, a guy called Bruce Lehman who’d been the head of copyright for Microsoft and then rotated into government service, went to Al Gore’s information superhighway hearings, the National Information Infrastructure hearings, and proposed something very similar to this. Gore laughed him out of the room. Gore’s a kind of a mixed bag politically. I think his environmental message is fine, but there are other areas where I disagree with him. But this was a good deed he did in laughing Bruce Lehman out of the room.

And after Gore laughed him out of the room, Bruce Lehman got the last laugh. He went to Geneva, where the UN’s World Intellectual Property Organization meets. This is a thoroughly captured technical agency of the UN that has the same relationship to stupid internet law that Mordor has to evil in Middle-earth. It’s where it all originates from. And he got them to pass the internet treaties, the WIPO Copyright Treaty and the WIPO Performances and Phonograms Treaty, which were then brought back to the US as a treaty obligation and turned into the Digital Millennium Copyright Act, that law we’ve been talking about, where we find the prohibition on reverse engineering and the notice and takedown system.

KH: One of the reasons I really appreciated Cory’s book is that, while I am very concerned about the tech industry and how it’s shaping our world, I am not a tech person. It’s a subject that interests me, but when well-meaning people try to explain particular problems or concepts to me, I can easily get lost. Books have been my greatest resource in trying to understand the tech industry, and the books I have found most useful come from authors who make tech speak legible to people like me, who are nerds of a different variety. In The Internet Con, Cory talks about how most of us aren’t really wired to process the ins and outs of internet standards and regulations, and how companies exploit that disconnect. Cory writes:

All this stuff — standardization meetings and forensic examinations of firewall errors — is supremely dull. It combines the thrill of bookkeeping with the excitement of Robert’s Rules of Order. Merely paying attention to it is a trial, and many of us are literally cognitively incapable of tuning into it for more than a few minutes before our minds start to wander.

It is precisely because this stuff is so dull that it is so dangerous.

Cory calls the insulation from scrutiny that all of this dull content creates for tech companies a “shield of boringness.”

CD: That’s a phrase I stole from the wonderful comics artist Dana Claire, who writes this great series of comic books about a little girl and her unicorn. And the unicorn can be seen by grownups, but grownups don’t notice the unicorn because the unicorn exudes a thing that she calls the shield of boringness. And the shield of boringness is just this thing that makes your eyes glaze over when you contemplate it. And as a result, nobody finds it remarkable that the unicorn is there. And so much of tech policy is so eye watering and dull and technical, and difficult to understand, that over and over again, these things that are incredibly powerful and important and will have a profound impact on how you live your life, just get turned into these dull technical debates that disappear into the halls of the most technocratic weirdos and freaks like me, and normies never get to hear about it until it’s making their life miserable.

So since the earliest days of the internet, and even today, there have been proposals to ban working encryption. And encryption is a subject that’s boring even on its own, right? It’s a very abstract branch of mathematics. It’s hard to understand how it works. It’s full of technical terms like keys and ciphers and hashing, asymmetric key exchanges, all of this stuff that’s very hard to understand to begin with. But encryption is incredibly important. For one thing, it’s how you and your friends can talk to each other without being eavesdropped on. It’s how you can talk to your bank without having that information leak. It’s how you can protect the data on your computer, so if someone steals your computer, they can’t access your data. And it’s even how things like software updates for your pacemaker or your car’s anti-lock braking system can be transmitted from a server to the device, with the device able to verify that the update actually came from the real manufacturer and not from a third party who wanted to do something very bad and potentially lethal to you.

The problem with encryption is that it works. The little distraction rectangle you have in your pocket: when you take it out and aim your camera at something and press the shutter button, if you’ve got full-disk encryption turned on, in the time it takes for your phone to play that little click-click sound, that photo is scrambled so thoroughly that if every hydrogen atom in the universe were turned into a computer, and it did nothing until the end of the universe but try to guess the key needed to descramble it, we would run out of universe long, long before we ran out of possible keys that your phone could have used to scramble that photo.
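The scale Cory is gesturing at here can be checked with back-of-the-envelope arithmetic. The following sketch is illustrative only: the specific numbers (a 256-bit key, a billion attacking machines, a trillion guesses per second each, a 13.8-billion-year-old universe) are assumptions I am supplying, not figures from the conversation.

```python
# Back-of-the-envelope check on the scale of a modern keyspace.
# All numbers below are illustrative assumptions, not figures from
# the transcript: a 256-bit key (typical for full-disk encryption),
# a billion attacker machines, 1e12 guesses per second per machine,
# and a universe roughly 13.8 billion years old.

keyspace = 2 ** 256                       # possible 256-bit keys, ~1.2e77
machines = 10 ** 9                        # a billion dedicated crackers
guesses_per_sec = 10 ** 12                # per machine, very generous
universe_age_s = 13.8e9 * 3.156e7         # ~4.4e17 seconds so far

total_rate = machines * guesses_per_sec   # 1e21 guesses per second
seconds_to_exhaust = keyspace / total_rate
universe_ages = seconds_to_exhaust / universe_age_s

print(f"seconds to try every key: {seconds_to_exhaust:.2e}")
print(f"that is {universe_ages:.2e} ages of the universe")
```

Even with those wildly generous attacker assumptions, the search takes on the order of 10^38 lifetimes of the universe, which is the point of the hyperbole above.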

And so this is very good if you’re trying to keep a secret, but it’s very bad if you’re trying to find out what a secret is, like if you’re a cop who wants to spy on someone, or if you’re a spy who wants to spy on someone. And so since the Clinton years, there has been this proposal that we should make encryption illegal, or that we should make encryption legal, but only broken kinds of encryption that the authorities assure us only they can break into, and that they will only break into when they have a darn good reason. The first important case that the Electronic Frontier Foundation took on was over this.

In 1992, we represented a cryptographer, then a grad student at UC Berkeley, called Daniel J. Bernstein. Bernstein was publishing, on an early internet system called Usenet, computer programs that were stronger than the one the NSA said was all any civilian should ever need, one that no bad guy would ever be able to break, but that they could break if bad guys were using it. Bernstein argued, and we argued on his behalf, that the First Amendment protected his right to publish that source code, that computer code, and the Ninth Circuit agreed.

But in that debate, normies were far from the action, because it is such a genuinely weird and difficult-to-understand technical discussion. We tried to make it clear. One of the founders of EFF, an early technologist called John Gilmore, built a computer called Deep Crack that could brute force all the possible keys that this NSA encryption allowed, and therefore read everything scrambled with it. It cost him a quarter million dollars, and it could brute force all the keys in two and a half hours. And fun fact: as I record this with you, Deep Crack is sitting about three feet to my left, because John got tired of having it in his garage and made me its custodian. It’s a thing the size of a beer fridge. And so we tried to explain to judges how this stuff worked, but their eyes glazed over.
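For a sense of what the Deep Crack figures imply, here is a quick sketch. The 56-bit keyspace is an assumption on my part, based on DES, the cipher the EFF machine was built to attack; the two-and-a-half-hour figure comes from the transcript.

```python
# Implied key-search rate for a machine that can exhaust a 56-bit
# keyspace in two and a half hours. The 56-bit figure is an
# assumption (DES, the cipher the EFF's Deep Crack targeted); the
# 2.5-hour figure is the one quoted in the transcript.

keyspace = 2 ** 56             # ~7.2e16 possible 56-bit keys
hours = 2.5
seconds = hours * 3600         # 9,000 seconds

rate = keyspace / seconds      # implied keys tested per second
print(f"keyspace:     {keyspace:,} keys")
print(f"implied rate: {rate:.2e} keys/second")
```

That works out to roughly eight trillion keys per second, which is why a 56-bit ceiling on civilian encryption was never a serious protection against a determined, funded attacker.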

The First Amendment argument carried the day, but to this day, there are laws pending all over the world, including in the United Kingdom, that say it should be illegal to make working encryption, and that you should only be allowed to make broken encryption. And explaining the nuance between encryption that works and encryption that’s broken is very hard, and yet nobody wants to get a bad software update for their pacemaker. So it’s very salient that this stuff works, and works as well as possible. And by cloaking this discussion in the shield of boringness, something that should be a matter of urgent public debate in every area of our world becomes something banished to the back rooms where hobgoblins like me argue about it, and you only find out about it when someone ships a bum update to your anti-lock braking system and you die in a car wreck.

KH: Between the “shield of boringness” and the seemingly inescapable dominance of platforms we hate, it’s easy to throw our hands up and give up on understanding the specifics of how and why tech companies have screwed us over. But if we are going to fight for the world we want, confrontations with Big Tech are unavoidable, and while most of us will never understand as much as an expert like Cory does about that world, we need to know enough to craft demands that make sense.

CD: People who say that tech is busted, I agree with them a hundred percent. But oftentimes, if all you understand is that tech is busted but you don't understand how it got that way, which is a mix of policy and technology, then you don't know how to make it better. And so you end up with these solutions that aren't solutions. You may have heard people talk about reforming something called Section 230, another of those my-eyes-glaze-over sets of initials with a number at the end (it's technically CDA 230) that no one can make any sense of. And they'll say, "The tech platforms are publishers, and yet they get something no publisher gets, which is that they're immunized when their users post unlawful material. Therefore we should have that taken away, and then we can hold them to account for all the harassment and Nazis and whatever else on their platforms."

And this is just wrong. Most of the stuff that we're worried about, racism and harassing content and so on, is legal to post. Making the platforms jointly liable for it won't stop people from posting it; you'd have to change the First Amendment for that. What it will do is make platforms extremely gun-shy about stuff like Me Too, because they'll be worried about being made jointly liable for libelous accusations of sexual harassment and assault. And it will also stop sex workers and other marginalized groups from standing up their own servers, where they can have discussions that aren't moderated by Mark Zuckerberg and Elon Musk and decide what is fair and not fair in their own spaces, because they will never be able to manage the liability that's associated with it.

KH: All of this raises the question: how can we challenge or break up Big Tech?

CD: So Big Tech is distinctive. It's not the only concentrated industry; there are plenty of those. But it is the only concentrated industry where we can use interoperability to counter its power. Now, Big Tech has used network effects to grow. Big Tech platforms get more valuable as more people use them. Every seller on Amazon is a reason for a buyer to be there, and every buyer on Amazon is a reason for a seller to be there. Every Uber driver is a reason to become an Uber rider, and every Uber rider is a reason to become an Uber driver. Every friend you have on Facebook is a reason to join Facebook, and when you join Facebook, you're a reason for someone else to join. Network effects have driven explosive growth for all of these platforms.
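The network effects Doctorow describes are often summarized by Metcalfe's law: a network's potential value grows roughly with the number of possible connections among its users, n(n-1)/2, so each new user makes the platform more valuable to everyone already there. A quick sketch of that arithmetic (a simplification; real platform value doesn't scale this cleanly):

```python
def possible_connections(n: int) -> int:
    """Number of distinct pairs among n users: n choose 2."""
    return n * (n - 1) // 2

# Each new user adds n-1 new possible connections, so the count of
# connections grows quadratically while the user count grows linearly.
for n in (10, 100, 1000):
    print(n, possible_connections(n))
# 10 45
# 100 4950
# 1000 499500
```

This quadratic growth is why incumbents are so hard to leave: the value lives in the connections, which a departing user forfeits unless interoperability lets them keep talking across platform boundaries.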

But historically in tech, those network effects were countered by the low switching costs of interoperability: anyone could make a product or a tool or a service that made it easy to leave. You could make a tool that let Uber drivers and Uber riders sense when both of them had a third-party rideshare app installed that didn't rip off the drivers, say one owned and maintained by a driver co-op. And after you set up the ride with Uber, it would automatically cancel that Uber ride and rebook it through the drivers' rideshare app. Or you could have a tool that, when you left Facebook, would go back and scrape Facebook for all the messages your friends were sending you, put them in your inbox on a Mastodon server where you didn't have to worry about being spied on by Mark Zuckerberg, and let you reply to them and send the replies back to Facebook. All of that is possible, and it means you can leave these platforms without enduring high costs. All we have to do is clear the way, legally, for people to do this.

And so in the book I sketch out a shovel-ready, two-pronged approach to making it easy for people to leave the big tech platforms, to evacuate the big tech platforms. The first is to force the tech platforms to support interoperability. In Europe, the Digital Markets Act is going to do this. It’s going to force the biggest platforms to create these automatic gateways that new market entrants, cooperatives, nonprofits, startups, community groups, government agencies and even large tech companies can set up and connect to so that you can leave a platform but continue to talk to your friends or continue to enjoy the files that you bought, or whatever it is that the platform’s using to lock you in.

But that's really easy to cheat on. The platform can slow-walk it, they can break it selectively, they can shut it down and say later, "Oh, we shut it down because we thought someone was hacking into it and stealing our users' data." And figuring out whether they're telling the truth is really hard, because those things all do happen, right? There are hackers and people who steal users' data, and we don't want platforms to keep a gateway open if they think some harm is coming to their users. But because everyone who understands how Facebook works is a Facebook employee, it's going to take years to get to the bottom of those questions. And by that time, everyone who'd left for a new platform because they could continue to enjoy the stuff they had in the old one will have gone back, because no one wants to wait years for that stuff to work.

So the second part of this is immunizing all of these new market entrants, these reverse engineers, these bot masters and scrapers, from liability, both criminal and civil, for allowing users to leave the platforms provided that they don’t violate privacy law, consumer protection law, or labor law.

And so we need to create these interoperator's defenses that say that when the platform shuts down the official route, you can blast an unofficial route in there. Now, I think that's going to stay the platforms' hands in many cases. The platforms are very sensitive to how grueling and how potentially damaging it is to have to engage in guerrilla warfare with reverse engineers. It represents the kind of unquantifiable risk that leaves you doing your quarterly shareholder report and announcing that things are worse than you thought they would be, which is the kind of thing that causes Facebook to lose a quarter of a trillion dollars in one day, and the stock portfolios of the managers who made those decisions to be absolutely devastated, since they have mostly Facebook in their stock portfolios.

But if it doesn't stay their hand, if they're reckless, and of course no one ever lost money betting on the hubris of tech leaders, then we'll have access to reverse engineering, to adversarial interoperability. To put some concrete flesh on those bones: in 2012, the people of Massachusetts went to the ballot box and passed an automotive right-to-repair initiative that said the big three automakers would have to expose their error codes to independent mechanics, so you could take your car to any mechanic and get it fixed. It passed with a 78 percent majority. People really wanted it, and it had huge participation; normally ballot initiatives get low numbers, and this was a big one. Nobody wants their auto manufacturer to decide who can fix their car.

So the law that was eventually passed as a result of this ballot initiative had a weird loophole: it said the automakers had to give up error codes that traveled on the wired network within the car. So the big three automakers immediately retooled to send all their error messages on a wireless network in the car, so that it wasn't subject to the law. Then in 2020, Bay Staters went back to the ballot box and, with a similarly commanding majority, passed a new ballot initiative that said, basically, "For the avoidance of doubt, we meant wireless too."

Now, in the intervening eight years, independent mechanics would have their customers come in, put the car up on a lift, and say, "You know what? It turns out this is one I can't fix." And customers learned that they should just go to the dealer. And bankers learned that they shouldn't loan money to mechanics. And mechanics learned that either they worked for the big three automakers or they should change jobs. Now, Boston and Massachusetts are home to some pretty good technical universities; Cambridge has MIT, where I'm a research affiliate. If it had been legal to reverse engineer car diagnostic systems, then a couple of smart kids from MIT could have just designed a dongle with a bill of materials of, like, three bucks, had them manufactured by the container load in Guangzhou, landed them at the Port of Los Angeles, trucked them across the United States, and sold them in Boston and across the state, from which they would've leaked into every other state in America.

They could have used them as a platform to offer all kinds of other services, like warranties and parts, things that are very high margin for the automakers, and maybe that would've stopped the automakers from engaging in all this skullduggery in the first place. But if it didn't, then everyone in the Bay State, and eventually everyone, would've had access to a tool that let you fix a car even if you weren't blessed by the manufacturer. And so that's the way that a mandate, and the safety and freedom for people who want to go in on their own and do their own reverse engineering, work hand in hand to produce something that's quite durable. It's like a two-part epoxy, right? The mandate is strong, but it's brittle. And the reverse engineering, the adversarial interoperability, fills all the cracks, but it's gooey and it bends a lot. Put the two together and you get something that is strong and resilient.

KH: Circling back to the problem of continuing to use platforms that are run by terrible people whose unethical policies and practices make us increasingly miserable, I wanted to take a moment to acknowledge what’s happening at Twitter, which Elon Musk has recently renamed after his favorite letter of the alphabet. A site that had become one of the most important hubs for news and information sharing among journalists and activists alike has been devastated by a fascistic billionaire, whose actions continue to erode the platform’s utility for individuals and communities who have grown to rely on it. Many of you probably never would have found my work if it weren’t for Twitter. And yet, as important as that site has been for my organizing, my journalism, and my popular education work, I desperately want out. And I know I’m not alone, but many of us who depend on the app to reach our audiences feel unable to leave.

CD: I think Elon Musk is a demonstration of the problems of collective action on social media platforms, because people are still on Twitter even though they don't like Twitter; they're on Twitter because they like their friends. Now, Twitter was actually developed to be API-first, to be interoperable, and the original API for Twitter was designed to allow people to leave Twitter and continue to talk to Twitter from wherever they are. We could order Elon Musk, as a settlement for one of his many infractions against his existing consent decrees with the Federal Trade Commission, to simply open up those gateways that are already latent in Twitter, modernizing them as necessary. And then you could go to Mastodon or any other service and send messages to people on Twitter, and they could send messages to you. And then every time Elon Musk did something that made his users angry, they could bolt for the exits.

Musk is kind of an unsubtle example of the problem with Tech Bros. But he’s just doing more bluntly what more cultured versions do in the shadows, in the same way that there’s not a lot of difference between Mitt Romney and Donald Trump, it’s just that Donald Trump speaks his mind, and Mitt Romney only says that 46% of the population are socially useless when he thinks he’s talking to fellow ghoulish plutocrats.

KH: In addition to not wanting to part with friends who are not leaving the platform, I have also found the reach that my Twitter account affords difficult to part with, and I know this has been a major concern for a lot of journalists, educators, and activists, who have long relied on Twitter to get the word out about their work and projects. Personally, I have started a newsletter, and I have also dabbled with Mastodon and Bluesky in an effort to become less reliant on Twitter, because I believe our days on that website are numbered, or at least should be, but it is a struggle. Some people who have experimented with alternatives to Twitter have complained that moving from a walled garden like Twitter to a federated system like Mastodon, can be confusing – and I will admit to being one of those people. I did wind up setting up a Mastodon account after someone helped simplify the process for me, but initially, when I was approaching it on my own, I did find the complexity of the Fediverse intimidating.

CD: I don't think that federation is intrinsically confusing. The idea that I might have an email account at Gmail, and you might have one at Yahoo, and that we can send each other email, is just federation. And it's the same thing for social media on the Fediverse: I have an account on one Mastodon server and you have an account on another Mastodon server, and we can exchange messages that way too. We can be part of a group, or individuals, and what have you. So it's not that confusing; it's just unfamiliar. And I think we should distinguish between confusing and unfamiliar.
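The email analogy maps directly onto how federated addressing works: a Fediverse handle like @alice@mastodon.example names both a user and the server responsible for that user, just as alice@gmail.com does. Here is a minimal sketch of parsing such an address and deciding whether a message stays local or gets relayed to a peer server; the names and servers are hypothetical, and this is a conceptual illustration, not an actual ActivityPub implementation:

```python
from dataclasses import dataclass

@dataclass
class Address:
    user: str
    server: str

def parse_handle(handle: str) -> Address:
    """Parse '@user@server' or 'user@server' into its two parts."""
    user, _, server = handle.lstrip("@").partition("@")
    if not user or not server:
        raise ValueError(f"not a federated address: {handle!r}")
    return Address(user, server)

def route(sender: Address, recipient: Address) -> str:
    """Local delivery if both accounts live on the same server;
    otherwise the sender's server relays to the recipient's server."""
    if sender.server == recipient.server:
        return "deliver locally"
    return f"relay from {sender.server} to {recipient.server}"

print(route(parse_handle("@alice@mastodon.example"),
            parse_handle("@bob@pixelfed.example")))
# prints: relay from mastodon.example to pixelfed.example
```

The point of the design is that no single operator sits in the middle: any server that speaks the shared protocol can participate, just as any mail server can exchange email with any other.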

Federation, I think, offers enormous benefits. You're right that there are drawbacks in terms of reach. The reach that you can get from an algorithm and algorithmic suggestions can be very powerful for freelancers and for activists. But it's also a devil's bargain, because that reach is not guaranteed to you, and it's often the case that platforms will give you a little of it and then take it away, and then ask you for money to boost your content. That's a kind of reach I call the giant teddy bear approach. If you've ever been to a carnival in the morning, you'll see some guy wandering around with a giant teddy bear he's won by tossing five balls in a peach basket. And he didn't actually get the five balls in. What's happened is the carny has said, "Hey buddy, I like your face. Tell you what, you get one ball in, I'll give you this little key chain. You do that twice, I'll let you trade your two key chains for a giant teddy bear."

And then that guy lugs around the giant teddy bear all day and makes people think that it’s possible to win one of their own. And the same is true when Spotify gives Joe Rogan a hundred million dollars or when TikTok uses what they call their heating tool to pick an individual influencer or person whom they want to make the platform attractive to and just send tens of millions of users their way. Rather than having those users see the content because the recommendation algorithm predicts that they like it, those users see the content because the thumb has been put on the scales by someone who works at TikTok. And the idea here is to convince some random sports bro, say, that he is the Louis Pasteur of TikTok, and that the best place in the world to be a sports bro and monetize your content is TikTok.

And then he runs around like a Judas goat and tells everyone else, "Look at this giant teddy bear I got. It's so easy. You should come be a sports bro on TikTok too." So yeah, those algorithmic recommendation systems can do you good, but they're of pretty limited utility and there are lots of failure modes. Meanwhile, there's nothing that says you couldn't have an algorithmic recommendation system for Mastodon or other federated products, but it would be separate from the product itself. It would be something that goes off and looks at all the things it can find, either on your own server or on lots of servers, and then, acting for you, makes a choice or a guess about what it thinks you would like, and puts that in front of you.
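A recommender that acts for you, rather than for the platform, can be sketched very simply: fetch posts from wherever, score them locally against your own stated interests, and show the best matches. The interests and posts below are illustrative, and real recommenders use far richer signals than keyword overlap; this is only a sketch of the "separate from the product" idea:

```python
# A user-controlled recommender that ranks fetched posts locally,
# independent of any server's engagement incentives.

def score(post: str, interests: set) -> int:
    """Count how many of the user's interest terms appear in the post."""
    return len(set(post.lower().split()) & interests)

def recommend(posts: list, interests: set, top: int = 2) -> list:
    """Return the `top` highest-scoring posts, best first."""
    return sorted(posts, key=lambda p: score(p, interests), reverse=True)[:top]

interests = {"encryption", "repair", "interoperability"}
posts = [
    "new encryption law threatens device repair",
    "celebrity gossip roundup",
    "interoperability mandate passes in the EU",
]
for p in recommend(posts, interests):
    print(p)
```

Because the scoring runs on the user's side, swapping it for a different algorithm, or none at all, is the user's choice, not the platform's.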

And some of that’s going to work well, and some of it’s going to work poorly. But what is also available to people in federated networks that don’t have recommendation systems is the old-fashioned kind of recommendation system, which is saying to the people who follow you, “please tell your friends about this.” And that is an organic, social, authentic way of making these things work. It eliminates influencer as a job and replaces it with people who other people trust. And when those other people are trustworthy and they tell you that you might like something, like I post book reviews to my Fediverse feed, and lots of people read those books and they forward those book reviews around, they retoot them in order to get them to other places. And that’s a really exciting and powerful way to reach people, and it makes for a much deeper connection. And it’s one that’s more durable and less intermediatable by the people who want to step between you and the people who follow you and take your money.

Bluesky technically looks pretty interesting, but it's not federated. It is federatable. And so this is like someone saying, "I have this service. It has all kinds of features that will stop me from abusing you, and I promise that I'll turn them on later." "Well, okay, but I'd prefer that you turn them on now. Or how about you tell me when you've turned them on, and then I'll come back and start using your service?" In the meantime, the board of directors of the one Bluesky service that exists, the only one you can join, the federation of one, includes the guy who sold Twitter to Elon Musk.

And so if the idea is that Bluesky is preferable to Twitter because you're not subject to the foolish decisions of the individuals who run the service, because you can always go elsewhere, well, Bluesky is actually owned, or managed, or overseen by the same fool who made the worst decision in Twitter's history, and there's nowhere else you can go if you don't like it.

KH: One of the problems that many of us have encountered when trying to lure people toward alternatives to Facebook and Twitter, so that we can keep our digital communities intact, is that we simply cannot get around the problem of switching costs. Sure, the alternatives themselves often have problems, but in my experience, few of those problems are worse than what’s presently happening at Twitter, where Elon Musk’s fascistic, sycophantic fan base is being algorithmically amplified at the expense of content that most of us actually want to see and engage with. So when we name the various frustrations that are keeping us from fully engaging with other platforms, it’s not that those problems aren’t real, but I think the bigger issue is that those problems are not offset by the presence of the communities we have come to rely on, many of whom are still holding on at Twitter, even as the site’s deterioration makes us all miserable.

CD: This was one of the problems with the Web3 people: in addition to being kind of grounded in this speculative, hypercapitalist cryptocurrency nonsense, they also thought that the reason people kept using Facebook or Twitter is that they didn't know about better alternatives. They didn't grasp that people understood there were better alternatives, but that, as good as those were, they weren't better than being around the people they cared about. And they thought, "Okay, if we just show them something better, they will walk away from the people they love and go to the better thing and wait for the people they love to show up." And no one's going to do that. It's a bridge too far. And no one ever did do it. The way our technology grew historically was by people making interoperable layers so that the switching costs were as low as possible, not by this kind of holus-bolus exodus.

My grandmother's story is kind of relevant here. My grandmother was a Soviet refugee, a child soldier in the siege of Leningrad who was evacuated across the winter ice two years into the siege, when she was 15, and then got inducted into the Red Army, and then got knocked up by my grandfather. After the war, they fled to Canada via a displaced persons boat out of Frankfurt. And she was the only person in her family who left the Soviet Union, although all of them suffered under Soviet mismanagement and authoritarianism. And the reason for that is that in order to leave, she had to give up everything.

She didn't know if her family were alive or dead for 15 years. She didn't have anything she'd grown up with, not a photo, nothing but the clothes on her back. She gave it all up, and that cost was too high for her family members, so they stayed. And it was and remains a problem. My family in St. Petersburg are in a really bad place because they or their ancestors didn't follow my grandmother to Canada, and they didn't because the switching costs were too high. It wasn't that the situation wasn't bad; it was that the switching costs were too high.

If we want people to go somewhere better, we have to make it easier for them to leave.

KH: Given Cory’s expertise in computers, I couldn’t pass up the opportunity to get his take on a couple of the big topics we have discussed this season: AI and long-termism.

CD: Well, I think AI is wildly overhyped. I think if you add up all the things AI is good for… And it is good at some things. I just saw my friend Patrick Ball from the Human Rights Data Analysis Group talk about how they used large language models during the Truth and Reconciliation hearings in Colombia to assemble the largest human rights dataset ever, and to assign probabilities that individual accounts of murders were carried out by right-wing militias, the FARC, or the government, or were unrelated to the civil war, and how this has been very important in holding responsible parties to account in Colombia's truth and reconciliation process.

That's amazing, but if the idea is that we're going to have a multitrillion-dollar industry by automating some of the process of characterizing accounts of murder in a war zone, or that we're going to somehow make it all worthwhile by eliminating all the lowliest, most automatable illustration jobs, whose total wage bill is in the single-digit millions of dollars, then you know that they're just lying, right? On the one hand, it's wildly immoral to want to destroy the livelihood of every commercial illustrator; on the other hand, if you manage it, you won't accomplish even a tenth of a percent of the savings that would justify the valuation your company has gotten.

And for most everything else, it’s basically nonsense. All the decision support applications where it’s like, “Oh, we’re going to use it to find cancer, but we’re going to have a human in the loop, a radiographer who reviews the x-rays and makes a call about the judgment that the machine has made.” Well, if that person is actually going to investigate the machine judgment in depth and make a real call about it, then it’s going to be just as slow as not using the machine. And so nobody is going to buy a machine to use it that way. The only point of having such a machine is to allow decisions to be made faster than any human could review them. And that’s really the intent. And so the only way that this can possibly be worth the valuation that’s been assigned to it is if it’s used in incredibly harmful ways.

As to long-termism and TESCREAL and so on. Look, I’m a cyberpunk writer. It was meant as a warning, not a suggestion. The fact that these people can’t tell the difference means that they’re both very stupid and very frightening.

KH: Well, I am so grateful for this conversation and for Cory’s book, which I think is a tremendous resource for people who want to reclaim the internet’s potential and leverage our connectivity to do great things. If you’re like me, you won’t agree with everything in The Internet Con, but I think we should welcome that, and that we should be open to engaging with our disagreements constructively, to see what we might learn. I definitely learned a lot from this book and I think every activist and organizer should give it a read. Given that I think everyone who listens to this show wants to make the world a better place, I also appreciated Cory’s parting message for our listeners.

CD: So I would urge us all to be Luddites in the best sense of the word. You know that Luddite is used as a synonym for technophobe, but the Luddites weren't technophobes. To be a textile worker at the dawn of the industrial revolution, you needed to complete a seven-year apprenticeship using the most technically advanced machines of the day. A textile worker was like someone with a master's degree in engineering from MIT. And what they were angry about was not the machines; they were angry about the social arrangements of the machines. It wasn't what the machines did, it was who they did it for and who they did it to. They were angry because their bosses had these machines that they said were so easy children could use them, and they were kidnapping children from the Napoleonic War orphanages of London and forcing them to work in the mills, under 10-year indentures during which they would be beaten and starved, maimed and often killed.

Robert Blincoe survived the mills, wrote a memoir that became a bestseller and inspired Charles Dickens to write Oliver Twist, which is really best understood as Luddite fanfic. The Luddites wanted the machinery to be accountable to the workforce and the people they served, and not to the forces of capital. Computers are incredibly powerful tools for activists. I cut my teeth riding a bicycle all night around the streets of Toronto, wheat-pasting posters to telephone poles to get people to turn out for demonstrations. And if you’ve never tried to organize a movement without the internet, I’m here to tell you, it’s really hard. We need to seize the means of computation, because while the internet isn’t the most important thing that we have to worry about right now, all the things that are more important, gender and racial justice, inequality, the climate emergency, those are struggles that we’re going to win or lose by organizing on the internet.

And the internet has these foundational characteristics, the interoperability, the universality, working encryption that lets us keep secrets, that allow us to organize mass movements in ways that our forebears could only have dreamt of. And it’s up to us to take control of that technology and make it work for us.

KH: I appreciate Cory’s argument that while there are struggles of greater moral importance than what happens to the internet, we cannot win those struggles without taking on Big Tech. I wholeheartedly agree with that sentiment, which is why we will keep circling back to tech issues on this season of the show. I hope these discussions leave us all better informed and better prepared to seize the means of computation, and to wage our struggles for liberation. I want to thank Cory Doctorow for joining us today. I learned a lot from talking to Cory and from reading his book, and I am looking forward to checking out some of his sci-fi titles as well.

I also want to thank our listeners for joining us today, and remember, our best defense against cynicism is to do good and to remember that the good we do matters. Until next time, I’ll see you in the streets.
