by Brian Awehali
[The 2016 U.S. Presidential election drew increased attention to the corrosive effects of social media-directed news and news bubbles. Regardless of your political views, consider the limits and perils of adopting or normalizing propagandistic modes of communication.]
“A man is whatever room he is in.”
Most people know a certain portion of people on the internet aren’t people at all, or aren’t the people they purport to be, especially on social networks like Google+, Twitter, and Facebook, where at least 5-6% of all profiles are fake. An estimated 97% of these imposters identify as female, and apparently attractive college-aged bisexuals lead the field. Consider just Facebook’s roughly 1 billion users, then do the math: one estimate puts the number of fictional profiles on the network at 80 million. That’s roughly the population of Germany or Egypt, a quarter of the United States, fifteen Finlands. And yet most people don’t think such fakers are among the ranks of their own online “friends.”
“[Facebook is] the most appalling spying machine that has ever been invented.” — Julian Assange, speaking to Russia Today.
* * *
If you have a blog with any overtly “political” agenda or content, chances are pretty good you have some fake followers, too, and that you’ve posted comments by them. You may have had multi-part email or comment board exchanges with them. They might even have names of people familiar to you. If you’ve ever published or edited an independent political magazine, or, say, co-moderated a politicized Facebook page, you definitely interacted with a fair number of vitriolic cognitive absolutists and disruptive personalities, but you almost surely also interacted with dozens or hundreds of deliberate fakes: bots engaged in large-scale data harvesting attacks, military or law enforcement personnel who are “doing” the internet in order to influence public opinion, or others intent on exploiting a fundamental weakness of social networks and the internet in general, humorously summed up in a 20-year-old New Yorker cartoon: “On the Internet, nobody knows you’re a dog.”
“The analysis of the fake Facebook profile experiment showed that creating and maintaining a fake profile is an easy task.”
This was one of the main findings reported in a paper published last year in the Journal of Service Science Research. This is not a new story by any means, but it’s the first (and last) time I’m focusing on it here on LOUDCANARY. The paper is fairly detailed; in short, in March and April 2012, the authors created six “socially attractive fake Facebook profiles and integrat[ed] them into existing friendship networks to simulate a data harvesting attack.”
I was intrigued by the authors’ account of how they generated profile pictures:
We algorithmically generated profile pictures. The goal was to obtain artificial images that did not represent a specific real person but that nevertheless had features of real faces. To this end, we applied image transformation algorithms recursively to average faces derived from a set of input faces. To reduce artifacts caused by the image transformation algorithms, we applied filters and various visual effects to mask them. The data was obtained from statistical evaluations of data based on the manually selected birth dates [of the profiles].
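The paper doesn’t publish its code, but the recursive averaging-and-filtering idea it describes can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the function names are mine, the “images” are toy grayscale grids, and a simple box blur stands in for the paper’s unspecified artifact-masking filters and visual effects.

```python
def average_faces(faces):
    # faces: list of equal-sized 2D grayscale images (lists of lists, values 0-255).
    # Pixel-wise mean yields an "average" face that matches no single input.
    h, w = len(faces[0]), len(faces[0][0])
    n = len(faces)
    return [[sum(f[y][x] for f in faces) / n for x in range(w)]
            for y in range(h)]

def box_blur(img):
    # 3x3 box filter: a crude stand-in for the filters the authors
    # applied to mask artifacts left by the blending step.
    h, w = len(img), len(img[0])
    def px(y, x):  # clamp coordinates at the image border
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]
    return [[sum(px(y + dy, x + dx)
                 for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
             for x in range(w)] for y in range(h)]

def synthesize(faces, rounds=2):
    # Average the inputs, then recursively filter the result,
    # as the paper sketches.
    img = average_faces(faces)
    for _ in range(rounds):
        img = box_blur(img)
    return [[round(v) for v in row] for row in img]
```

A real implementation would work on aligned photographs and use proper morphing and filtering (e.g. via an image library), but the principle is the same: the output has face-like features without depicting any one real person.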
The report also describes in detail how they went about constructing other aspects of these fake identities, and what trends they noticed.
* * *
Sadly, these aren’t just the tactics of shadowy government spooks, the CIA, NSA, or whichever billionaire is trying to buy political power. Cass Sunstein, who went on to run the White House Office of Information and Regulatory Affairs under President Obama, wrote a 2008 paper urging the government toward the “cognitive infiltration” of social media for the (stated) purpose of combating conspiracy theorizing.
Sunstein suggested that government agents
might enter chat rooms, online social networks, or even real-space groups and attempt to undermine percolating conspiracy theories by raising doubts about their factual premises, causal logic or implications for political action.
The goal isn’t to discuss the legitimacy of a theory or have anything like an intellectually honest conversation; it’s to undermine and raise doubts about it.
A fair portion of “progressive” and “radical” organizations (seemingly populist movements, labor unions, “astroturfing” practitioners, etc.) are engaged in these kinds of activities as well, perhaps (being charitable) operating on an ultimately short-sighted illogic summed up as “If you can’t beat ’em, join ’em.” Anyone who thinks critically knows that information warfare is very real. The problem with warfare, and with these kinds of tactics, is that propaganda encourages the opposite of critical thinking and reasoned response. It’s just manipulation by mass and emotion. To paraphrase historian Stuart Ewen, the core of it is that it’s not people’s brains that are in charge; it’s their spines.
Engaging in this kind of thing on a mass scale is also a deliberate attempt to undermine online media as a credible source of information, and to contaminate the democratic or populist possibilities of online discourse.
What Cass Sunstein articulated in the 2008 paper was hardly new or original: it’s Propaganda 101, same playbook, different field. A good deal of the underlying logic of Sunstein’s paper is crystallized in the words of the founder of public relations, uber-propagandist Edward Bernays, whose 1928 book Propaganda opens: “The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society.”
There’s also the serious problem of how political commentary and debate happen now, especially online. I’d started to articulate more specifics about this, but then came across a quote from a 2003 David Foster Wallace interview in The Believer that describes the “uncomplicatedly sexy delusion” of contemporary political commentary, and it applies with particular force to online media:
“…95 percent of political commentary, whether spoken or written, is now polluted by the very politics it’s supposed to be about. Meaning it’s become totally ideological and reductive: The writer/speaker has certain political convictions or affiliations, and proceeds to filter all reality and spin all assertion according to those convictions and loyalties. Everybody’s pissed off and exasperated and impervious to argument from any other side. Opposing viewpoints are not just incorrect but contemptible, corrupt, evil.
[…] Political discourse is now a formulaic matter of preaching to one’s own choir and demonizing the opposition. Everything’s relentlessly black-and-whitened.
Since the truth is way, way more gray and complicated than any one ideology can capture, the whole thing seems to me not just stupid but stupefying… How can any of this possibly help me, the average citizen, deliberate about whom to choose to decide my country’s macroeconomic policy, or how even to conceive for myself what that policy’s outlines should be, or how to minimize the chances of North Korea nuking the DMZ and pulling us into a ghastly foreign war, or how to balance domestic security concerns with civil liberties? Questions like these are all massively complicated, and much of the complication is not sexy, and well over 90 percent of political commentary now simply abets the uncomplicatedly sexy delusion that one side is Right and Just and the other Wrong and Dangerous. Which is of course a pleasant delusion, in a way—as is the belief that every last person you’re in conflict with is an asshole—but it’s childish, and totally unconducive to hard thought, give and take, compromise, or the ability of grown-ups to function as any kind of community.”
Imagine a different, less stupefying room, where you can tell if it’s a dog or not, where dialogue and hard thought can happen, where the mechanisms of manipulation and control aren’t so well automated, and all is not watched over by machines of loving grace. Sounds good to me.