Facebook’s track record for tackling misinformation on its platform is, well, you might say not exactly great. The problem isn’t limited to dumb or misleading political memes, either; it also includes seriously dangerous health misinformation and alleged “cures.” Realizing the error of its ways and the potential for harm, Facebook has finally decided to delete this information altogether so it can no longer be shared or seen.
Just kidding! Facebook says it will leave this information up on its site, instead opting to merely “minimize” its reach in News Feed.
Facebook Product Manager Travis Yeh addressed the company’s half-assed effort to curb the issue in a blog post on Tuesday, writing that “misleading health content is particularly bad for our community.” Changes to address this misinformation went into effect last month and include demoting sensationalist or misleading content—such as posts about a supposed “miracle cure”—as well as posts about “a product or service based on a health-related claim.” These changes mostly appear to apply to Pages. Yeh went on in the post:
We anticipate that most Pages won’t see any significant changes to their distribution in News Feed as a result of this update…
Posts with sensational health claims or solicitation using health-related claims will have reduced distribution. Pages should avoid posts about health that exaggerate or mislead people and posts that try to sell products using health-related claims. If a Page stops posting this content, their posts will no longer be affected by this change.
Here’s the problem with this: Anyone seeking information on, say, poisoning their children with bleach to “cure” autism or whatever other ridiculous and harmful misinformation groups are parroting can, presumably, still navigate to a Page and find it. By leaving misleading health content up, Facebook is still allowing that content to reach potentially susceptible viewers. This is the same reason Facebook recently came under fire for leaving up distorted videos of House Speaker Nancy Pelosi.
Facebook’s response to that fiasco, as with this one, was infuriating. Speaking with Anderson Cooper for an interview about the incident at the time, Facebook Head of Global Policy Management Monika Bickert said that the company thinks “it’s important for people to make their own informed choice about what to believe,” pointing to the platform’s fact-checking operation as justification for allowing misleading content to remain on its site.
“This is part of the way we deal with misinformation,” Bickert said. “We work with internationally certified fact-checking organizations that are independent from Facebook, and we think these are the right organizations to be making decisions about whether something is true or false.”
Let’s set aside for a moment that the largest social media platform on the planet appears largely indifferent to the misleading garbage that festers there, garbage that spreads in no small part because Facebook exists. Placing the moderation of its content squarely on the shoulders of its fact-checking partners, rather than taking any meaningful action to address potentially harmful or dangerous misinformation on its site, is only slightly better than doing fuck all about the problem. Even fact-checkers have asked the company for greater transparency and for metrics to gauge the value of their work and its effect.