August 7, 2017 | Features

Next Gen Gentry

In an increasingly secular age, we often look to technology as our salvation. In borrowing that narrative from religion, we treat every new development, from the steam engine to CRISPR, as either the Second Coming or the herald of an apocalypse. Social media is a complicated case of this. While it is restricted by programming and primarily dependent on advertising revenue to keep running, its uses have grown to serve journalism, democracy, media, commerce, and general social discourse. By incorporating so many domains and purposes, one might be mistaken in thinking that social media is the ultimate tool, that if it encompasses all human experience, then perhaps it can better all human experience.  

The technological saviour complex is common in Silicon Valley, and Facebook CEO Mark Zuckerberg is no exception. In February this year, Zuckerberg published a 6000-word essay to Facebook entitled “Building Global Community”. After initially scoffing at the idea that Facebook had contributed to the spread of “fake news,” Zuckerberg used the essay to come to grips, in his own distanced technocratic way, with the fact that his Frankenstein’s monster of echo chambers hadn’t brought about a glittering utopia. Zuckerberg envisions a world where Facebook functions as a “universal city,” a global community where everyone knows everyone. In his essay, he particularly wanted to address how Facebook can “encourage civic participation,” especially as democracy is “receding in many countries.” But the way Zuckerberg sees Facebook and the way we see Facebook are two different things, and he fails to acknowledge how much it is defined by its users, the actions or inactions of its moderators, and the interests of corporations. To extend the Frankenstein metaphor: he may have made the thing, but he is not master over it.

Social media expands on something that humans do naturally: we build and sustain communities and share ideas with one another. With the advent of the internet, social media was the next logical step in creating a global community, one in which any person could be in contact with any other at any time. With the creation of this new global territory came the idea that it would somehow elevate us to a new era of compassion and understanding.

That… didn’t really happen. While social media is a great way to keep in touch with friends, curate the content you consume, and gain an understanding of issues and beliefs you wouldn’t have been exposed to otherwise, the human brain is still operating on TribeThink: Fertile Crescent Edition and, as a result, our technology has rapidly outpaced our ability to comprehend its unexpected effects. As Zuckerberg puts it: “Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared. But the past year has also shown it may fragment our shared sense of reality.” In trying to make sense of the multitude of voices, issues are often simplified and polarised, with people — however complex their humanity — being reduced to either an “us” or a “them”. The way social media is programmed often perpetuates this polarisation.

This is a massive problem when you understand that Facebook operates according to a completely different ethical framework from our present social and judicial understandings of society. An article by Julia Angwin and Hannes Grassegger on ProPublica outlined how Facebook’s “secret” censorship policies operate on an incredibly specific set of rules that more often than not protects people who least need to be protected, and censors people who most need a voice. The article outlines how Facebook censors or removes any post that directly attacks what it classifies as “protected categories,” which cover all races, religions, ethnicities, and sexual orientations. However, a way around this is to frame your attack so that it targets only a “subset” of one of those groups, because subsets are not protected.

An example from the training document for Facebook’s 4,500 content reviewers, its “human censors,” asks which of three groups is protected from hate speech: women drivers, black children, or white men. The correct answer, according to the document, is white men, because both race and sex are protected categories, whereas the other two pair a protected group (“women”, “blacks”) with an unprotected subset (“drivers”, “children”). A post by a US congressman calling for the death of radicalised Muslims “for all that is good and righteous” will be ignored, as it is directed at a subset (radicals) of a protected group (Muslims). Meanwhile, a post by a Black Lives Matter activist saying “All white people are racist. Start from this reference point, or you’ve already failed” will be removed for attacking a protected category. So while Facebook seeks to treat everyone equally, it doesn’t focus on equity.
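Reduced to its mechanics, the rule ProPublica describes is a blunt boolean test: an attack is removed only if every trait of its target is a protected category, so a single unprotected modifier exempts the whole post. Here is a minimal sketch of that logic in Python — the category names and the way a target is represented are invented for illustration; this is not Facebook’s actual code:

```python
# Toy model of the moderation rule ProPublica described: a post attacking
# a target is removed only if EVERY trait the target names is a protected
# category. One unprotected modifier ("subset") exempts the whole post.
# Category names are illustrative assumptions, not Facebook's real lists.

PROTECTED = {
    "race", "ethnicity", "religion", "sex", "gender identity",
    "sexual orientation", "national origin", "serious disability",
}

def is_protected_target(traits):
    # A target like "women drivers" is {"sex", "occupation"}.
    return all(trait in PROTECTED for trait in traits)

def moderate(traits):
    return "remove" if is_protected_target(traits) else "ignore"

# The training-document quiz:
print(moderate({"sex", "occupation"}))   # women drivers  -> ignore
print(moderate({"race", "age"}))         # black children -> ignore
print(moderate({"race", "sex"}))         # white men      -> remove

# The two posts from the article:
print(moderate({"religion", "ideology"}))  # radicalised Muslims -> ignore
print(moderate({"race"}))                  # all white people    -> remove
```

Which is exactly why the quiz’s answer feels backwards: the test checks categories, not power or vulnerability, so “white men” clears the bar while “black children” does not.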

I asked a former Facebook employee, who chose to remain anonymous, about whether this was the fault of the company for not being more transparent, or whether Facebook users are not using the platform “correctly,” so to speak. “When it comes to social platforms I don’t believe there is a right or wrong way to use them,” they said. “It’s the challenge of these mediums; they can’t anticipate exactly how people will use their product and [they] can’t really be curated because users are dictating the pace.” They pointed out that Facebook frequently has to adjust. “Social platforms like Facebook are creating their policies as they go; there are no norms of behavior yet defined nor enshrined for this group.”

In his essay, Zuckerberg outlined solutions for addressing these issues. Artificial intelligence would be trained to better distinguish hate speech from posts that directly address hate speech, alongside an expanded review team of 7,500 human censors. Echo chambers would be combatted by changes to the News Feed that show users a fuller range of perspectives on an issue, rather than a single aggressively polarised one. While the approach is admirable, implementing it isn’t going to solve every problem at once, and one gets the impression that Zuckerberg has set the standard for what Facebook can actually achieve far too high.

The platform exists only to facilitate an experience for its users, and if too many changes interfere with that experience, users might jump ship to another platform that better suits them. As my contact from Facebook explained to me, “I think in our digital world people are less concerned with this rules-and-regulations view so much as what they are ‘experiencing’. Generations raised in this environment are very quickly picking and choosing their experiences (and ditching them quick if they get bored or it no longer resonates).”

“Facebook… [is] primarily used by older people; teenagers don’t see it as relevant to them and way way way too slow. Going forward, remaining relevant seems more of a challenge than whether or not everyone understands the rules sitting in behind it all.”

Some platforms have worked those experiences into their business model. Where Facebook attempts to be about connection and fostering community, Twitter thrives on polarisation. A thread by user @pookleblinky explained that the reason Twitter is slow to act on abuse is that removing Tweets that harass people is antithetical to what Twitter is. When a harassing Tweet goes viral and the post calling it out goes viral in turn, all that does is feed Twitter’s business model of generating more Tweets. Twitter can then use that newly generated content (created for free by people needing to bring attention to the event) to create spaces for advertisements, and all the followers of the accounts involved in the dispute become a new “audience” it can sell things to. That constant stream of content puts eyeballs on screens and signs more followers up. That’s why they’re never going to ban, say, that one heinous alt-right user, or the perennial national security fiasco that is Donald Trump’s account: that would mean taking away Twitter “space” and, from Twitter’s perspective, that is a Bad Thing.

Social media, or any one technology, is not going to “save” the world. That’s asking the wrong question. Salvation implies an end goal, that there is a balm for all our problems or one perfect solution. But if we’re sensitive and clever about it, we can use what we have to do good where we can. Not end all harm, but reduce it. Alternatively, we can convince ourselves that we’re doing a lot while actually doing very little. Humans thrive on their capacity for self-deception, and if there’s one thing that social media curation is good at, it’s creating an experience in which everything conforms to our expectations of how the world works. Which (and I am well aware of the irony of this statement) the accusations of Facebook’s complicity in spreading “fake news” only serve to prove.

There was a running joke through 2016 that we were living in a cyberpunk future that nobody wanted. Cyberpunk is a genre of science fiction that imagines futures where technological advancement occurs without the accompanying expected gains in social progress, and it examines the conflicts that arise out of that juxtaposition. Greg Rucka and Michael Lark’s Lazarus imagines a world where after several successive disasters, every government becomes privatised, and territory is divided up between a series of warring feudal corporate families. America is fought over between the agriculture giant Carlyle and the pharmaceutical empire Hock, whose advancements in genetically modified seeds and medicine respectively keep the world fed and healthy, but only by making the workers on their territories completely dependent on them, determined by their usefulness to the company as either “serfs” or “waste”. Imagine Game of Thrones but with Pfizer and Monsanto instead of Lannisters and Freys.

When data is a resource that people naturally generate through posts and blogs and Tweets and Snaps, the only lands left to conquer are invisible. And as these invisible lands get larger, the companies that run them are forced, half by choice and half by circumstance, into becoming a gentry to their consumers and, in a more literal sense, to their employees. Last month, the Wall Street Journal reported that Google was buying and assembling 300 prefabricated apartment units to house employees who can’t find affordable housing amid rising rents around the company’s base in Mountain View, California. Google already owns around a tenth of the real estate in Silicon Valley, and recently got the go-ahead to expand into a 370,000 square-foot office space after settling a property dispute with LinkedIn. Facebook has similar plans to build what it calls a “mixed-use village,” with 1,500 housing units, a pharmacy, a grocery store, and retail outlets, set to be completed by 2021. It also plans to invest in public transport, or to eliminate its employees’ need for it by letting them live in a corporate-owned apartment within walking distance of the office and the company store.

One Tweet put it succinctly: “Google is beta-testing feudalism.”

In the dystopian future, the internet serfs you.

