If you’re in public relations, journalism, entertainment, or another similar field, you already know that Facebook wields enormous power, probably far too much: its algorithms chiefly decide how widely any content you post will be seen. Only the people who work at Facebook know precisely how every factor is weighed, but we do know that delivery depends both on elements of the content itself and on how it’s received once it meets its first few sets of eyeballs.
You can almost think of Facebook like an acting agent.
When you sign up with the agency, you audition for the agent and show them what you’ve got. This is you initially submitting your content to Facebook. (Let’s not take this analogy too literally — obviously the agent can turn you away, whereas Facebook almost never declines to post something you submit.)
The agent then shops you around to a few producers, directors, and casting directors, and sees how you fare in the market. This is the first few moments of activity, if any, that your content sparks on Facebook.
If you (the actor) start getting hired, make a good name for yourself, and engender more interest, the agent puts you further up their priority list and shows you off to more and more muckity-mucks, for higher pay and more exposure, which leads to even better jobs. This is when your post does well and earns a bunch of likes, comments, and shares, which in turn generate more likes, comments, and shares.
Or, the acting jobs don’t turn out, the agent loses interest, and you go into career limbo. This is the fate of most actors, really, and most content on Facebook.
So what a couple of folks lately have been trying to do is game the agent, and we have examples from two different directions: as the content creator/submitter (the actor in the analogy) and as the audience for the content (the producers and casting directors and box office numbers).
On the first part, we have Caleb Garling at The Atlantic, who decided he’d package his content in a particular way to trick the Facebook algorithm into giving his content traction.
I wanted to see if I could trick Facebook into believing I’d had one of those big life updates that always hang out at the top of the feed. People tend to word those things roughly the same way and Facebook does smart things with pattern matching and sentiment analysis. Let’s see if I can fabricate some social love.
I posted: “Hey everyone, big news!! I’ve accepted a position trying to make Facebook believe this is an important post about my life! I’m so excited to begin this small experiment into how the Facebook algorithms processes language and really appreciate all of your support!”
You can guess what happened. Though totally manufactured, the post did very well, far better than some of his earlier, substantive posts.
Garling admits that he doesn’t know precisely why this worked. You’d need to do a lot more testing outside of a clever novelty stunt to understand what will and will not make Facebook lift your material. But it’s an intriguing way of thinking about how this electronic brain that makes so many decisions for us in terms of what content we see online actually works.
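To make Garling’s hunch about pattern matching a little more concrete, here’s a toy sketch of how a system might flag “big life update” phrasing. To be clear, this is purely illustrative: Facebook’s actual ranking system is not public, and every phrase and function here is invented for demonstration.

```python
import re

# Hypothetical phrases people tend to use in "life update" posts.
# These patterns and this scoring scheme are made up for illustration;
# they are not Facebook's actual signals.
LIFE_UPDATE_PATTERNS = [
    r"\bbig news\b",
    r"\baccepted a position\b",
    r"\bso excited\b",
    r"\bengaged\b",
    r"\bexpecting\b",
]

def life_update_score(post: str) -> int:
    """Count how many 'life update' phrases appear in a post."""
    text = post.lower()
    return sum(1 for pattern in LIFE_UPDATE_PATTERNS
               if re.search(pattern, text))

post = ("Hey everyone, big news!! I've accepted a position trying to make "
        "Facebook believe this is an important post about my life! "
        "I'm so excited!")
print(life_update_score(post))  # matches three of the toy phrases
```

Even this crude keyword counting would rank Garling’s fabricated announcement highly, which is the point: a system matching surface-level phrasing can be gamed by anyone who mimics that phrasing.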
On the other side, we have the brilliant Mat Honan at Wired who decided not to submit content, but to respond to it. All of it. With no particular goal in mind, Honan decided to run an experiment in which he would click “like” on almost literally everything Facebook put in front of him for 48 hours, just to see how his Facebook experience would change. And the results were varying degrees of horrifying. If you hate Facebook now (as I do), just imagine if it were always like this (and this was just in the first 60 minutes):
My News Feed took on an entirely new character in a surprisingly short amount of time. After checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore. It became about brands and messaging, rather than humans with messages.
But over time, like a great body of water, or really, a brain with moods and dispositions, things changed. As a result of a rogue like on a conservative-leaning comment, things got ugly and nutty really fast, with a swarm of frightening hard-right vitriol flooding the feed. And this led to something we now know all too well in a world of curation-by-servers:
This is a problem much bigger than Facebook. It reminded me of what can go wrong in society, and why we now often talk at each other instead of to each other. We set up our political and social filter bubbles and they reinforce themselves—the things we read and watch have become hyper-niche and cater to our specific interests. We go down rabbit holes of special interests until we’re lost in the queen’s garden, cursing everyone above ground.
And those bubbles can get very small. We tend to think of this in terms of liberals and conservatives, or maybe Apple and Android fans. But look at what it does to niche within niche, like the skepto-atheosphere, where a burgeoning movement of folks who largely all ought to be on the same side cannot seem to stop eating each other alive online, and hunkering down with those who are ideologically pure – at ever-increasing rates of purity, and therefore ever-shrinking bubbles. As Felicia Day put it on her own blog, “We’re being tricked into believing that our small worlds are much bigger than they really are in the grand scheme of things.”
Garling might have tricked the Facebook brain into thinking he had posted content that was not what it seemed. Honan definitely tricked the Facebook brain into thinking he was asking to see all manner of content he never really would.
What this tells me is that, yes, the brain is fallible, but more importantly that intention matters enormously when it comes to social media. In previous posts here, I’ve talked about how the “crisis” of this filter bubble can be mitigated by intentional self-curation, by being mindful of what you approve of, what you click, what you post, and what you seek out.
Meanwhile, you can’t allow what you see on social media, or what you post to it, to define who you are in your own eyes. So the other lesson is to be intentional in your own self-perception. An actor’s sense of self can rise or fall by the approval of their agent and the industry to which the agent presents them. But it shouldn’t. If the Facebook algorithm is a brain, it’s just one brain, and it’s not a very wise one.