Facebook Changed Its Clickbait Policy and You'll Never Guess What Happened Next
Despite its constant tinkering with the News Feed algorithm, which has at times confounded users as to which posts get seen and why, Facebook has by and large been nonprofit-friendly. Last summer, it enabled "Donate Now" buttons to (theoretically) make giving easier, and it has since continued to roll out new services for charities, including its dedicated Facebook for Nonprofits site, personal fundraising pages and more advanced options for marketers.
But Facebook's latest change is a step backward.
Yesterday, the social media behemoth announced a major overhaul of its News Feed, updating the algorithm to reduce the ranking and reach of articles with so-called "clickbait" headlines. Facebook attempted this before, in 2014, but those measures apparently weren't strong enough. Its users, the site said, want even fewer clickbait posts in their News Feeds.
Here's how Facebook described the process:
First, we categorized tens of thousands of headlines as clickbait by considering two key points: (1) if the headline withholds information required to understand what the content of the article is; and (2) if the headline exaggerates the article to create misleading expectations for the reader. For example, the headline “You’ll Never Believe Who Tripped and Fell on the Red Carpet...” withholds information required to understand the article (What happened? Who tripped?) The headline “Apples Are Actually Bad for You?!” misleads the reader (apples are only bad for you if you eat too many every day). A team at Facebook reviewed thousands of headlines using these criteria, validating each other’s work to identify a large set of clickbait headlines.
From there, we built a system that looks at the set of clickbait headlines to determine what phrases are commonly used in clickbait headlines that are not used in other headlines. This is similar to how many email spam filters work.
Our system identifies posts that are clickbait and which web domains and pages these posts come from. Links posted from or shared from pages or domains that consistently post clickbait headlines will appear lower in News Feed. News Feed will continue to learn over time—if a page stops posting clickbait headlines, their posts will stop being impacted by this change.
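To make the spam-filter comparison concrete, here's a minimal sketch of the phrase-frequency idea Facebook describes: count which short phrases show up more often in known clickbait headlines than in ordinary ones, then score new headlines by how many of their phrases lean clickbait. Everything here is invented for illustration; the real system's features, training data and thresholds have not been disclosed.

```python
from collections import Counter

# Toy training corpora -- invented examples, not Facebook's data.
CLICKBAIT = [
    "you'll never believe who tripped and fell on the red carpet",
    "this one weird trick will change your life",
    "what happened next will shock you",
]
NORMAL = [
    "city council approves new budget for road repairs",
    "researchers publish study on apple skin compounds",
    "local charity reports record donations this quarter",
]

def bigrams(headline):
    """Split a headline into overlapping two-word phrases."""
    words = headline.lower().split()
    return [" ".join(words[i:i + 2]) for i in range(len(words) - 1)]

def build_model(clickbait, normal):
    """Count phrase frequencies in each corpus, like a naive spam filter."""
    cb = Counter(p for h in clickbait for p in bigrams(h))
    ok = Counter(p for h in normal for p in bigrams(h))
    return cb, ok

def clickbait_score(headline, cb_counts, ok_counts):
    """Fraction of the headline's phrases seen more often in clickbait."""
    phrases = bigrams(headline)
    if not phrases:
        return 0.0
    hits = sum(1 for p in phrases if cb_counts[p] > ok_counts[p])
    return hits / len(phrases)

cb, ok = build_model(CLICKBAIT, NORMAL)
print(clickbait_score("you'll never believe what happened next", cb, ok))  # 0.8
```

A production system would of course use far larger corpora and demote repeat-offender pages and domains rather than individual posts, as the quoted passage notes, but the core mechanic (phrases common in one corpus and rare in the other) is the same one email spam filters have used for years.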
In theory, this is a good change. While definitions of "clickbait" vary, most of us agree that the kind of headlines described above flat-out suck. Good, reputable organizations—be they news sites or nonprofits—rarely employ headlines like these anyway. Facebook said it expects little change for most pages, so, no big deal.
The problem, though, is that Facebook hasn't made clear the parameters used to determine a clickbait headline. Those parameters may be well defined internally, for the purposes of the News Feed algorithm, but there's no way for publishers to know for sure what exactly constitutes clickbait. And there seems to be far too much ambiguity.
Even in the examples Facebook provided, things can get murky fast. The headline "Apples Are Actually Bad for You?!" is clickbait because everyone knows apples are only bad if you eat 243 of them a day, says Facebook. But what if a scientific breakthrough determined that ursolic acid, a compound in apple skins known for its obesity-fighting properties, is actually a carcinogen? Or, what if the headline was "Coffee Is Actually Bad for You?!" The general consensus is that coffee has some major health benefits, but there's a legitimate debate there. What if new information swings it the opposite way?
The obvious answer is to simply recast headlines so they're clearer and contain more information. But how specific do you have to be—and how much information is required? "Apples May Cause Cancer, Researchers Say" seems like it would easily pass the clickbait test, but would "Apples Might Not Be Good for Your Health" pass? In the hypothetical alternate universe where ursolic acid is bad for you, both headlines would be true. And both appear to meet the requirements outlined in Facebook's clickbait best practices. But there's no way to be absolutely sure that the News Feed algorithm won't flag the latter headline as clickbait for containing less information than the former.
Another issue is that a "clickbait" headline doesn't necessarily mean bad content. Take, for example, this post from Michael Rosen on his excellent blog, Michael Rosen Says. Here, we have an 1,100-word article with a clever conceit relating ice cream and fundraising, and in-depth analysis and five distinct points backing the argument. But the headline, "The #Fundraising Secret for Success You Need to Know," sure looks like it'd be flagged as clickbait.
Sure, Rosen could make it clearer—"Why Ice Cream Is the #Fundraising Secret for Success" would likely pass Facebook's clickbait test and draw readers just as effectively as the current headline. But if the content is good, who cares? Shouldn't the content matter as much as, or more than, the headline? Facebook may have inadvertently said as much in its explanation for the News Feed changes—the headline "Apples Are Actually Bad for You?!" is only misleading because the content ("apples are only bad for you if you eat too many every day," as Facebook parenthetically noted) doesn't back it up.
(The Atlantic further underscored this point in a great piece reimagining famous headlines to comply with Facebook's new policy. Our favorite is "This Is a Long, Vividly Written Story About the History and Types of Oranges That Exist," the new, Facebook-friendly headline for journalist John McPhee's 1967 classic in The New Yorker, "Oranges." McPhee was a four-time Pulitzer Prize finalist and one-time winner.)
The changes won't much affect nonprofits, especially those that already produce strong content and have established, engaged Facebook communities. For most charities, social media is a support channel, not a primary one, and the new clickbait policy is a minor change. But it's one more thing to keep an eye on. And it raises the question—given its spotty record, why does Facebook get so much say in what people see and don't see?