Facebook will let users flag news stories as fake or hoaxes and will work with fact-checkers to vet them, the social media giant announced Thursday, in its first effort to address fake news since the United States election.
Some news articles that were widely shared on the platform in the run-up to Election Day were obviously and demonstrably false, such as claims that the Pope and Denzel Washington had endorsed Donald Trump for president (neither did). Fabricated stories are causing widespread confusion, according to a new survey by the Pew Research Center, and the propagation of one baseless conspiracy theory is being blamed for a gunman walking into a Washington, D.C., pizzeria and firing a rifle.
Facebook executives have indicated since the election that they were reviewing what changes to make, if any, to combat fake news, though none have said they believe the false news shared on the platform changed the outcome of the election. Those changes were announced at 1 p.m. ET Thursday.
News identified as fake by the fact-checking organizations, which must sign on to Poynter's International Fact Checking Code of Principles, will be marked as "disputed" and accompanied by an explainer, Facebook said. Facebook's algorithm may also rank those stories lower in users' feeds. Recode reported that ABC News, PolitiFact, FactCheck.org and Snopes are the partner organizations.
Facebook is also trying to reduce the financial incentive for creating and posting fake articles, and is testing whether a drop-off in sharing after people actually read an article indicates the story is misleading and should be ranked lower.
"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully," News Feed Vice President Adam Mosseri said in a statement. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations."
A Pew survey released Thursday found that 64 percent of U.S. adults say fabricated news stories are causing confusion about the basic facts of current events, while only 10 percent said such stories cause not much or no confusion.
Seventy-one percent of the 1,002 people surveyed between Dec. 1 and 4 said they see fake news online often or sometimes.
Fake news became a massive point of contention in the final days of the election and afterward, with Hillary Clinton calling fake news a "danger that must be addressed" quickly in a speech on Capitol Hill last week.
The fake news seemed to target Clinton more than Trump, according to analyses of the content, including one by BuzzFeed that found the top false articles generated more engagement than the top election stories posted by 19 major news outlets, including NBC News and The New York Times. Only three of the top 20 performing false stories didn't target Clinton or support Trump, it found.
Producing fake news became a cottage industry in one part of Macedonia, where NBC News spoke to a teenager who said he had earned $60,000 in six months from baseless, incendiary posts aimed mainly at followers of Donald Trump. "Nothing can beat Trump's supporters when it comes to social media engagement," he said.
Those stories appear to have had real-world effects. Edgar Maddison Welch took an AR-15 rifle and a handgun into the popular Comet Ping Pong pizzeria in D.C. in early December to investigate a rumored child sex abuse ring purportedly run by a Clinton aide, police said. The store's owner had already been receiving death threats as the hoax gained traction on Reddit and other online forums before spinning off into fake news stories.
Welch discharged his rifle, but no one was hurt, police said. He later told a New York Times reporter that his "intel on this wasn't 100 percent."
CEO Mark Zuckerberg has said he doesn't think fake news swayed the election, and Mosseri told The New York Times on Thursday that he doesn't believe the feed directly caused people to vote for a particular candidate: "the magnitude of fake news across Facebook is one fraction of a percent of the content across the network."
Americans are split on whether fake news should be limited by social media, according to a McClatchy-Marist poll of just over 1,000 adults out Thursday. Fifty-three percent said it should be up to users to determine what information is true, while 41 percent said Facebook and Twitter should be responsible for preventing false information from spreading.
A higher portion of those surveyed by Pew — 71 percent — said social networking sites and search engines bear a great deal or some responsibility for preventing their spread.
According to that poll, only 15 percent of people are not confident in their ability to spot fabricated news. But many have difficulty distinguishing fake news from real news, according to a recent Stanford study of students across the country.