
Article Review: “#Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures”

Welcome to “Article Review”, where I put on my scholarly cap to give an academic article its due in critical analysis.

This week’s article: 

“#Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures” by Adrienne Massanari

For my latest review, I’ve opted to unpack Adrienne Massanari’s “#Gamergate and The Fappening”. This 2015 article examines how Reddit’s algorithm and policies combine with its “masculine geek” culture to create the perfect hothouse for what the author calls “toxic technocultures.”

Massanari takes a deep dive into the social psychology of today’s online social spaces, introducing several concepts that could explain the fraught dynamics of online interaction. Among others, I’m particularly taken by Massanari’s use of actor-network theory, which “emphasizes the importance of considering how non-human technological agents (algorithms, scripts, policies) can shape and are shaped by human activity” (p.2). Through such lenses, Massanari succinctly conducts a brutal autopsy of the dark, slimy innards of online toxicity.

Massanari’s conclusions should be compulsory reading for anyone who utilizes an online social platform like Reddit.

The heart of Massanari’s paper examines Reddit as a case study, illustrating how that particular social platform’s design, policies, and culture allow toxic players to flourish.

The algorithmic/design problems built into Reddit’s platform stem from three design choices: the ease of creating multiple (& anonymous) accounts, its use of “karma”, and the cross-subreddit aggregation of content, with all the problems that can arise from visibility.

Moderator-specific issues (the effects of unpaid labor, few tools, and mini-fiefdoms) and the platform’s hands-off policies further tip Reddit in favor of toxicity. Massanari’s pithy conclusion about Reddit’s community governance – “remaining ‘neutral’ in [toxic] cases valorizes the rights of the majority while often trampling over the rights of others” (p.11) – cuts to the quick.

Although Massanari focuses her argument on #Gamergate and The Fappening, the same argument could be applied to an endless list of “white male nerds behaving badly”. Similarly, the Reddit design/policy/culture elements she identifies are evident in other problematic social platforms, most notably Twitter and 4chan.

Let’s pause to consider two key concepts this paper is built upon: “toxic technocultures” and “geek masculinity”.

Two core characteristics of “toxic technoculture” should be familiar to anyone who has spent time online: the use of social platforms “as both a channel of coordination and harassment”, and tactics that “rely heavily on implicit or explicit harassment of others” (p. 5).

But the roots of this phenomenon come from somewhere much darker than simple organized harassment. Toxic technocultures pivot on an eye-watering mishmash of homophobia, sexism, and racism, favorable only to those who are heterosexual, male, and white. Indeed, habitual bigotry is what makes this genre of technoculture so “toxic.” Consider these two further characteristics identified by Massanari: “valorization of masculinity masquerading as a peculiar form of ‘rationality’”, and “retrograde ideas of gender, sexual identity, sexuality, and race” (p. 5).

Those of you who fall outside toxic technoculture’s prescribed hetero-male-white ideal are probably nodding your head in recognition. Social networks are teeming with this kind of targeted bigotry — if you’re aware of it you can’t help but notice it everywhere.

I found Massanari’s commentary on “geek masculinity” eye-opening, particularly how it “both repudiates and reifies elements of hegemonic masculinity” (p. 4). Geek culture’s fraught relationship with traditional masculinity can “create a sense of cognitive dissonance” for its members, who “likely view themselves as perpetual outsiders and thus are unable or unwilling to recognize their own immense privilege” (p. 4). In that single sentence, Massanari perfectly articulates something I’ve been struggling to understand for years.

My only criticism is that, within the confines of the article, the toxicity Massanari describes remains largely abstract. From the specific examples she provides, a reader could come away with the impression that “toxic technoculture” is just people behaving badly — viewing nudes without permission, saying sexist/racist/fat-phobic things, and, in general, being jerks.

That doesn’t begin to cover how disgusting and cruel their behavior is.

Addressing this criticism would be simple: call out what the “fap” in “fappening” refers to (instead of tucking it away in a footnote – let’s not be coy), reprint some post titles from /r/fatpeopleshaming, or describe the experience of someone targeted by such toxicity. Doing so would add an additional (and, in my opinion, necessary) level of emotional engagement to the author’s argument, inciting further reflection, research, and action.

Writ large, the effects of algorithm, design, and community guidelines on online spaces demand further investigation. Massanari has only scratched the surface of an issue that permeates the web and often spills over into the real world.

Future research could contrast Reddit with other notoriously toxic social platforms (particularly Nextdoor), dovetail into algorithmic design bias, or dive further into Reddit itself: unpacking its algorithms, examining the rare subreddit with an effective moderator, exploring the use of “flair” to verify a poster’s credentials, or conducting a focused case study.

One potential case study: an activist I worked with at my previous job has her own Reddit hate group. Over the years the group’s members have propagated fake press coverage about her, built websites discrediting her (perhaps why the third Google search suggestion as of this writing is “[activist’s name] liar”), stalked her in real life, conducted email/phone spam campaigns against companies that work with her (mine included), and generally been horrible human beings.

Reddit was, is, and will continue to be their base of operations. A case study could derive actionable information from this woman’s suffering by examining the factors that brought her hate group together, Reddit’s role in facilitating their toxicity, and the intersection of platform and pathos that perpetuates the group’s survival over years and years.

As someone who sees the potential value in social platforms while simultaneously opting to abstain from most of them based on past negative experiences, I can’t help but hope that Massanari’s research is the first step on the journey toward a new social media.

(I personally would love to see analysis of educational YouTube channels, since they seem to attract more than their share of toxicity.)

Once we can identify platform design/construction that encourages vibrant and respectful interactions, and uncover design elements that permit a space to be inviting and inclusive, online social spaces might finally have the opportunity to build their members up instead of tearing some members down.

References

Massanari, A. (2015). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3). https://doi.org/10.1177/1461444815608807
Recommended Viewing

Emily Graslie’s brilliant breakdown of toxic technoculture on YouTube’s STEM channels (The Brain Scoop)
