Democracy destroyer

For years, media outlets desperately chased the clicks promised by Facebook; now that decision threatens to destroy them


As with any toxic relationship, the possibility of a breakup sparks feelings of terror — and maybe a little bit of relief.

That’s the spot that Facebook has put much of the news business in. Last month, the social media behemoth announced it would once again alter its News Feed algorithm to show users even more posts from their friends and family, and a lot fewer from media outlets.

The move isn’t all that surprising. Ever since the 2016 U.S. election, Facebook’s been under siege for creating a habitat where fake news stories flourished. Its executives were dragged before Congress last year to testify about how the company sold ads to Russians who wanted to influence the election, and so, in some ways, it’s simply easier for Facebook to get out of the news business altogether.

But for the many news outlets that have come to rely on Facebook funneling readers to their sites, the impact of a separation sounds catastrophic. “The End of the Social News Era?” a New York Times headline asked. “Facebook is breaking up with news,” an ad for the new BuzzFeed app proclaimed. When a giant like Facebook takes a step — until recently, the social media site had been sending more traffic to news outlets than Google — the resulting quake can cause an entire industry to crumble.

Consumers, meanwhile, have grimaced as their favorite media outlets have stooped to sensational headlines to lure Facebook’s web traffic. They’ve become disillusioned by the flood of hoaxes and conspiracy theories that have run rampant on the site.

A Knight Foundation/Gallup poll released last month revealed that only one-third of Americans had a positive view of the media. About 57 percent said that websites and apps using algorithms to determine which news stories readers see were a major problem for democracy. Two-thirds believed the media being “dramatic or too sensational in order to attract more readers or viewers” was a major problem.

Now, sites that rely on Facebook’s algorithm have watched the floor drop out from under them whenever the algorithm changes — all while Facebook has gobbled up chunks of print advertising revenue.

It’s all landed many media outlets in a hell of a quandary: It sure seems like Facebook is killing journalism. But can journalism survive without it?

“Traffic is such a drug right now,” says Sean Robinson, a 53-year-old investigative reporter at the Tacoma News Tribune. “The industry is hurting so bad that it’s really hard to detox.”

[Illustration: Jeff Drew]

You won’t believe what happens next

It’s perhaps the perfect summation of the internet age: a website that started because a college kid wanted to rank which co-eds were hottest became a global goliath powerful enough to influence the fate of the news industry itself.

When Facebook first launched its “News Feed” in 2006, it ironically didn’t have anything to do with news. At least, not how we think of it. This was the website that still posted a little broken-heart icon when you changed your status from “In a Relationship” to “Single.”

The News Feed was intended to be a list of personalized updates from your friends. When Facebook talked about “news stories,” it meant, in the words of its announcement, updates like “when Mark adds Britney Spears to his Favorites or when your crush is single again.”

But in 2009, Facebook introduced its iconic “like” button. Soon, instead of showing posts in chronological order, the News Feed began showing you the popular posts first.

And that made all the difference.

Facebook didn’t invent going viral — grandmas with AOL accounts were forwarding funny emails and chain letters when Facebook founder Mark Zuckerberg was still in grade school — but its algorithm amplified it. Well-liked posts soared. Unpopular posts simply went unseen.

Google had an algorithm too. So did YouTube.

Journalists were given a new directive: If you wanted readers to see your stories, you had to play by the algorithm’s rules. Faceless, mystery formulas had replaced the stodgy newspaper editor as the gatekeeper of information.

So when the McClatchy Company — a chain that owns 31 daily papers including the Tacoma News Tribune and the Bellingham Herald — launched its reinvention strategy last year, knowing how to get Facebook traffic was central.

“Facebook has allowed us to get our journalism out to hundreds of millions more people than it would have otherwise,” says McClatchy’s Vice President of News Tim Grieve, a fast-talking former Politico editor. “It has forced us, and all publishers, to sharpen our game to make sure we’re writing stories that connect with people.”

With digital ad rates tied to web traffic, the incentives in the modern media landscape could be especially perverse: Write short, write lots. Pluck heartstrings or stoke fury.

In short, be more like Upworthy. A site filled with multi-sentence emotion-baiting headlines, Upworthy begged you to click by promising that you would be shocked, outraged or inspired — but not telling you why. (One example: “His first 4 sentences are interesting. The 5th blew my mind. And made me a little sick.”)

By November of 2013, Upworthy was pulling in 88 million unique visitors a month. With Facebook’s help, the formula spread.

The McClatchy-owned Bellingham Herald headlined a short crime story about the arrest of a carjacker this way: “Four people, two cars, one gun. What happens next?”

A short Herald story asking for tips about a recent spree of indecent exposure was headlined, “She was looking at her phone, but the man wanted her to watch him masturbate.”

Even magazines like Time and Newsweek — storied publications that sent photojournalists to war zones — began pumping out articles like, “Does Reese Witherspoon Have 3 Legs on Vanity Fair’s Cover?” and “Trump’s Hair Loss Drug Causes Erectile Dysfunction.”

Newsweek’s publisher went beyond clickbait; the magazine was actually buying traffic through pirated video sites, allegedly engaging in ad fraud.

On Monday, Feb. 5, Newsweek senior writer Matthew Cooper resigned in disgust after several Newsweek editors and reporters who’d written about the publisher’s series of scandals were fired. He heaped contempt on an organization that had installed editors who “recklessly sought clicks at the expense of accuracy, retweets over fairness” and left him “despondent not only for Newsweek but for the other publications that don’t heed the lessons of this publication’s fall.”

Mathew Ingram, who covers digital media for Columbia Journalism Review, says such tactics might increase traffic for a while. But readers hate it. Sleazy tabloid shortcuts give you a sleazy tabloid reputation.

“Short-term you can make a certain amount of money,” Ingram says. “Long-term you’re basically setting fire to your brand.”

One strategy throughout the industry is to downplay the location of a story: readers in other markets are more likely to click if they don’t know it happened thousands of miles away.

Robinson, the veteran Tacoma News Tribune reporter, says local cops have complained about crime stories from elsewhere being shared on Facebook by local TV stations without context, worried that readers were being misled into thinking the crimes happened in Tacoma.

Grieve, the McClatchy executive, says that he doesn’t ever want to sensationalize a story. But he also says that “internet and social media are noisy places,” and papers have to sell their stories aggressively to be heard over the din.

“If you’re writing stories that aren’t getting read,” Grieve says, “you’re not a journalist — you’re keeping a journal.”

[Illustration: Jeff Drew]

Clickbait and switch

Plenty of media outlets have tried to build their business on the foundation of the News Feed algorithm. But they quickly got a nasty surprise: That foundation can collapse in an instant.

As Facebook’s News Feed became choked with links to Upworthy and its horde of imitators, the social network declared war on clickbait. It tweaked its algorithms, which proved catastrophic for Upworthy.

“It keeps changing,” Ingram says. “Even if the algorithm was bad in some way, at least if it’s predictable, you could adapt.”

A 2014 Time magazine story estimated that two to three global algorithm tweaks on Facebook were happening every week.

Six years ago, for example, KHQ, a TV news station in Spokane, Washington, told readers they’d have “an ENTIRE day here on FB dedicated to positive local news” if the post got liked 500 times. It worked. The post got more than 1,200 likes, and KHQ followed through with a puppy-picture-laden “Feel Good Friday!!!”

Under the current Facebook algorithm, that tactic could get the station’s entire page demoted. So could using shameless “you-won’t-believe-what-happened-next”-style phrases.

Much of the time, Facebook and Google don’t announce their shifts up front. Media outlets often have had to reverse-engineer the changes, before issuing new commands to their troops in the field.

“Oh, they changed their algorithm again?” Robinson says. “Oh, what is it today, coach? OK, it’s 50-word [headlines] instead of 60?”

A pattern emerged. Step 1: Media outlets reinvent themselves for Facebook. Step 2: Facebook makes that reinvention obsolete.

Big publishers leaped at the chance to publish “Instant Articles” directly on Facebook, only to find that the algorithm soon changed, rewarding videos more than posts and rendering Instant Articles largely obsolete. So publishers like Mic.com, Mashable and Vice News “pivoted to video,” laying off dozens of journalists in the process.

“Then Facebook said they weren’t as interested in video anymore,” Ingram says. “Classic bait and switch.”

Which brings us to the latest string of announcements: The News Feed, Zuckerberg announced last month, had skewed too far in the direction of social video posts from national media pages and too far away from personal posts from friends and family.

Facebook was getting back to its roots.

And now, news organizations that had dumped a lot of money into eye-catching pre-recorded video would suffer the most under the latest algorithm changes, Facebook’s News Feed VP Adam Mosseri told TechCrunch last month, because “video is such a passive experience.”

Even before the announcement, news sites had seen their articles get fewer and fewer hits from Facebook. Last year, Google once again became the biggest referrer of news traffic as Facebook referrals decreased. Many sites published tutorials pleading with their readers to manually change their Facebook settings to guarantee the site’s appearance in their news feeds.

“Some media outlets saw their [Facebook] traffic decline by as much as 30 to 40 percent,” Ingram says. “Everybody knew something was happening, but we didn’t know what.”

It might be easy to mock those who chased the algorithm from one trend to another with little to show for it. But the reality, Ingram says, is that many of them didn’t really have a choice.

“You pretty much have to do something with Facebook,” Ingram says. “You have to. It’s like gravity. You can’t avoid it.”

Zuckerberg’s comment that stories sparking “meaningful social interactions” would do best on Facebook caused some to scoff.

“For Facebook, it’s bad if you read or watch content without reacting to it on Facebook. Let that sink in for a moment,” tech journalist Joshua Topolsky wrote at The Outline. “This notion is so corrupt it’s almost comical.”

In subsequent announcements, Facebook gave nervous local news outlets some better news: It would rank community news outlets higher in the feed than national ones. It was also launching an experiment, a new section called “Today In” focused on local news and announcements, beta-testing the concept in cities like Olympia.

But in early tests, the site seemed to have trouble determining what’s local.

Seattle Times reporter Joe O’Sullivan noted on Twitter that of the five stories featured in a screenshot of Facebook’s Olympia test, “NONE OF THEM ARE OLYMPIA STORIES. ZERO.”

The Seattle Times and other outlets say they’re taking a “wait-and-see” approach to the latest algorithm, analyzing how the impact shakes out before making changes. They’ve learned not to get excited.

“It just, more and more, seems like Facebook and news are not super compatible,” says Shan Wang, staff writer at Harvard University’s Nieman Journalism Lab. 

At least not for real news. For fake news, Facebook’s been a perfect match.

Faking it

There was a time Facebook was positively smug about its impact on the world. After all, it had seen its platform fan the flames of popular uprisings during the Arab Spring in places like Tunisia, Iran and Egypt.

“By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible,” Zuckerberg bragged in a 2012 letter to investors, under the header, “we hope to change how people relate to their governments and social institutions.”

And Facebook certainly has — though not the way it intended.

A BuzzFeed investigation before the 2016 presidential election found that “fake news” stories (hoaxes and hyperpartisan falsehoods) actually performed better on Facebook than stories from major trusted outlets like the New York Times.

That, experts speculated, is another reason why Facebook, despite its massive profits, might be pulling back from its focus on news.

“As unprecedented numbers of people channel their political energy through this medium, it’s being used in unforeseen ways with societal repercussions that were never anticipated,” writes Samidh Chakrabarti, Facebook’s product manager for civic engagement, in a recent blog post.

The exposure was widespread. A Dartmouth study found that about one-fourth of Americans visited at least one fake-news website — and that Facebook was the primary vector of misinformation. While researchers didn’t find that fake news swung the election — though about 80,000 votes in three states is a pretty small margin to swing — the effect has endured.

Donald Trump has played a role. He snatched away the term used to describe hoax websites and wielded it as a blunderbuss against the press, blasting away at any negative reporting as “fake news.”

By last May, a Harvard-Harris poll found that almost two-thirds of voters believed that mainstream news outlets were full of fake news stories.

The danger of fake news, after all, wasn’t just that we’d be tricked with bogus claims. It was that we’d be pummeled with so many contradictory stories, from so many different angles, that sorting truth from fiction would become exhausting.

So you choose your own truth. Or Facebook’s algorithm chooses it for you.

Every time you like, comment, chat or click on Facebook, the site uses that information to figure out what you actually want to see: It inflates your own bubble, protecting you from facts or opinions with which you might disagree.

And when it does expose you to views from the other side, it’s most likely going to be the worst examples, the trolls eager to make people mad online, or the infuriating op-ed that all your friends are sharing.

That’s partly why many of the 3,000 Facebook ads that Russian trolls bought to influence the election weren’t aimed at promoting Trump directly. They were aimed at inflaming division in American life by focusing on such issues as race and religion. In all, Russian Facebook posts reached 126 million Americans.

Facebook has tried to address the fake news problem — hiring fact checkers to examine stories, slapping “disputed” tags on suspect claims, putting counterpoints in related article boxes — but with mixed results.

The recent Knight Foundation/Gallup poll, meanwhile, found that those surveyed believed that the broader array of news sources actually made it harder to stay well-informed.

And those who grew up soaking in the brine of social media aren’t necessarily better at sorting truth from fiction. Far from it.

“Overall, young people’s ability to reason about the information on the internet can be summed up in one word: bleak,” Stanford researchers concluded in a 2016 study of over 7,800 students. More than 80 percent of middle schoolers surveyed didn’t know the difference between sponsored content and a news article.

It’s why groups like Media Literacy Now have successfully pushed legislatures in states like Washington to put media literacy programs in schools.

That includes teaching students how information is manipulated behind the scenes, says the organization’s president, Erin McNeill.

“With Facebook, for example, why am I seeing this story on the top of the page?” she asks. “Is it because it’s the most important story, or is it because of another reason?”

But Facebook’s new algorithm threatens to make existing fake news problems even worse, Ingram says. By focusing on friends and family, it could strengthen the filter bubble even further. Rewarding “engagement” can just as easily incentivize the worst aspects of the internet.

You know what’s really good at getting engagement? Hoaxes. Conspiracy theories. Idiots who start fights in comments sections. Nuance doesn’t get engagement. Outrage does.

“Meaningful social interactions” is a hard concept for algorithms to grasp.

“It’s like getting algorithms to filter out porn,” Ingram says. “You and I know it when we see it. [But] algorithms are constantly filtering out photos of women breastfeeding.”

Facebook hasn’t wanted to push beyond the algorithm and play the censor. In fact, it’s gone in the opposite direction. After Facebook was accused of suppressing conservative news sites in its Trending Topics section in 2016, it fired its human editors. (Today, conspiracy theories continue to show up in Facebook’s Trending Topics.)

Instead, to determine the quality of news sites, Facebook is rolling out a two-question survey asking whether users recognize certain media outlets, and whether they find them trustworthy. The problem, as many tech writers pointed out, is that a lot of Facebook users, like Trump, consider the Washington Post and the New York Times to be “fake news.”

The other problem? There are a lot fewer trustworthy news sources out there. And Facebook bears some of the blame for that, too.

A version of this article first appeared in the Inlander.
