
Chomsky and Herman’s Propaganda Model Foretells a Weaponized Facebook

Information can enlighten and democratize. It can just as well imprison and impede the interests of the common good.

In an online world of commodified speech, perceptions and opinion are easily weaponized.

The personal is now public. Consider Facebook. As the global leader in merging interpersonal interaction with public discourse across borders, Facebook enjoys a virtual monopoly on reflecting power.

Facebook’s massive global reach gives the platform immense influence to shape public perception, awareness and opinion. Notably, one of the platform’s early executives, Chamath Palihapitiya, admitted that the team “knew something bad could happen,” having “created tools that are literally ripping apart the social fabric of how society works.” Still, public awareness of this subterfuge has changed nothing.

The relevance of mediated social reality to everyday life has, for much of the industrialized world, never been as pronounced. Information technology and social media exist within political-economic contexts wherein ideas and information are routinely commodified for marketplaces. In 2001, researcher and author Edwin Black meticulously laid out how publicly traded companies can (literally and figuratively) make a killing by acquiring and managing private information for use in particular markets.

Alongside altruistic pretenses, such as its claims to respect the commons and connect the social world, Facebook sells user data to advertisers and other institutions intent on managing public perception, all while mining that personal data for private profit.

The social is also now commodified. Facebook is oriented squarely toward profit from the commodification of user data. In fact, The New York Times detailed how Facebook has allowed its Big Tech partners to breach privacy rules to gather user data.

Moreover, the very nature of corporatized social media makes users into active market actors, products to be sold. As professor and media theorist Robert McChesney points out in a C-SPAN interview, “Everything we do online is known by commercial vendors and the government to the extent it wants to know. We have no privacy at all.”

No reasonable human being really wants to be integrated into the AI singularity, experimented on, spied on, or to have their private interactions packaged and sold. Resistance to this onward march, we are conditioned to believe, is futile. So how might Facebook’s digital citizens, now 2.3 billion strong, better comprehend the company’s immense power over their ideas and their freedom to exchange them — the foundation of freedom itself? A conceptual model from the late 1980s can help pry apart the puzzling performances of this anti-social behemoth.

A Model Fashioned for the Age of Mass Media

First introduced in Manufacturing Consent: The Political Economy of the Mass Media (1988), Edward S. Herman and Noam Chomsky’s “Propaganda Model” of media performance superbly situates the technological zeitgeist of our times. The model represents how big media in a market economy sift the raw material of news fit to print or broadcast — the residue of which propagates the socioeconomic status quo in so-called democratic societies. Because “democracy” implies the civilized practice of resolving differences through public dialogue across groups and classes, Herman and Chomsky postulate that five major filters work to “manufacture consent”: the size, ownership and profit orientation of the dominant media; advertising; sourcing; flak; and dominant ideology, expressed as fear and othering.

Why are such efforts even necessary? As researcher and author Tim Coles notes in a Renegade Inc. interview, one of the aims of propaganda is to “alienate the public from their own interests,” so it’s natural that “whenever people in power are telling you that fake news is undermining democracy, they really mean that alternative sources of information are challenging their grip on power.”

As terms such as “fake news” and “controlling the narrative” have become increasingly prevalent in the age of Trump, the relevance of the Propaganda Model to today’s public discourse and unchecked commodification becomes self-evident.

Citizens with the temerity to reject officially approved mainstream narratives on Facebook are witnessing firsthand how their own online interactions are being actively shaped (or filtered) by media controllers and propagandists. Try as they may, however, those who possess the power to filter media content cannot fully control perception in a growing number of participants who boldly criticize mainstream mythologies on social networking platforms like Facebook.

The hypothesized filters that Herman and Chomsky describe in their book point to those actual mechanisms of control.

Regarding the production, flow and control of information, Facebook is the owner (first filter), and it sells to advertisers (second filter) the content its participants generate and interact with. As BuzzFeed reports, Facebook is gratuitously trafficking in users’ personal data. Beyond its “partnering” with assorted titans of propaganda, it now has a seat on the Atlantic Council, which reveals the great utility of the third filter: Facebook has been “drawn into a symbiotic relationship with powerful sources of information by economic necessity and reciprocity of interests.” Subdue the urge for another dopamine hit, scratch the visually appealing surface of the platform’s interface, and you will find Facebook’s other strange partnerships with its newest “fake news” fact-checkers, such as the Koch-funded, right-leaning website The Daily Caller.

Facebook has further “plans to capture data on everyone, whether you’re a Facebook user or not.” Facebook management also serves as the system’s built-in flak machine (fourth filter): enforcers who discipline users and alternative news outlets that run afoul of the algorithms written to filter out dissenting views.

Kim Komando, a commentator on new technologies and their uses, has distilled each of the Propaganda Model’s filters in her brief description of Facebook’s efforts. Perhaps because of the surprising origins of its clandestine funding sources, Facebook remains open to striking deals with a range of government agencies and nongovernmental intelligence services keen to sift through content and other personal data, providing pseudo-official oversight of public discourse for possible ideological threats to the system (fifth filter).

Given the economic forces exerted by the streams of funding for Facebook’s brand of socializing, the participants interacting within these landscapes serve as both producers and products.

Since one of the primary threats to personal privacy also comes from advertisers striving to track all activity on the web, it is hardly surprising that Facebook has highly questionable associations.

It turns out that WPP, the largest advertising agency on Earth, receives backing from In-Q-Tel, the CIA’s venture capital arm, which funds tools that scan social media networks for meaningful data.

Further concerns about personal privacy have been recently heightened by Facebook’s hiring of Jennifer Newstead, who helped craft ever more powerful electronic surveillance guidelines for the PATRIOT Act during the Bush administration.

Given the explanatory power of the Propaganda Model to lay bare such obscure machinations, it is possible to see why, as Komando notes, the most significant part of “this plan is to be completely secretive about what data they collect, who gets to see it, and what kind of profile they build about your life.”

Shouldn’t these mechanisms of control, as explained by Herman and Chomsky’s Propaganda Model, be a part of any public discussion of media performance today? Or, should we simply accept vague terms like “fake news” and “controlling the narrative”? Should we go on pretending that practices of legitimization and mystification are no longer in play, or that structures of ownership and advertising are no longer relevant?

In the contemporary world, which media theorist Marshall McLuhan imagined as “a global village” where “everybody gets the message all of the time,” the personal is increasingly public, and information reigns supreme.

Today, information can enlighten and democratize. It can just as well imprison and impede the interests of the common good. The processes of commodification and marketization described here are sustained by a framework directed largely by Facebook, Google, Microsoft, Apple and Amazon, which monopolize privatized and commercialized discourse. This sort of power did not emerge naturally.

But can interpersonal and public discourse be liberated from the filters of commodification? These prisons are only as secure as citizens are unaware of the echo chambers they channel us into. The Propaganda Model is a useful key for opening wider awareness of the forces shaping media content and performance. One need only use one’s powers of observation to witness the filtering and the active construction of our mass-mediated reality, a world of powerful illusions pretending to be real and meaningful.
