As a disturbing number of measles outbreaks crop up around the United States, Facebook is facing challenges combating widespread misinformation about vaccinations on its platform, which has become a haven for the anti-vaccination movement.
The World Health Organization recently named “vaccine hesitancy” as one of the biggest global health threats of 2019. But on Facebook, in public pages and private groups with tens of thousands of members, false information about vaccines – largely stemming from a now-debunked 1998 study that tied immunizations to autism – is rampant and tough to pin down. In the bubble of closed groups, users warn against the dangers of vaccinations, citing pseudoscience and conspiracy theories.
Facebook has publicly declared that fighting misinformation is one of its top priorities. But when it comes to policing misleading content about vaccinations, the site faces a thorny challenge. The bulk of anti-vaccination content doesn’t violate Facebook’s community guidelines for inciting “real-world harm,” according to a spokesperson, and the site’s algorithms often promote unscientific pages or posts about the issue. Parents are left to wade through the mire, and as the viral spread of fake news has shown, many users have trouble distinguishing between reliable sources and unreliable ones.
The rise of “anti-vaxx” Facebook groups is overlapping with a resurgence of measles, a disease that was declared “eliminated” in the U.S. in 2000 because of the measles, mumps and rubella vaccine. But cases have increased in recent years, and at least 10 states have seen outbreaks this winter. Last month, Washington Gov. Jay Inslee (D) declared a state of emergency after 35 cases of measles cropped up in a single county, where nearly a quarter of kids attend school without measles, mumps and rubella immunizations. The WHO has named the highly contagious disease a leading cause of death for children.
Although the spread of misinformation about immunizations has potentially fatal repercussions, a Facebook spokesperson said the company doesn’t believe removing such content helps to increase awareness.
“While we work hard to remove content that violates our policies, we also give our community tools to control what they see as well as use Facebook to speak up and share perspectives with the community around them,” Facebook said in a statement emailed to The Washington Post. “If the content they’re posting crosses the line and violates our policies, we would remove the content as soon as we become aware of it.”
The company is considering options to make accurate information about vaccinations more accessible to users, but these efforts are in the early stages. In the meantime, Facebook sees factually accurate counter-speech by users as a possible safeguard, the spokesperson said.
Wendy Sue Swanson, a pediatrician at Seattle Children’s Hospital and spokeswoman for the American Academy of Pediatrics, recently met with Facebook strategists about dealing with public health issues, including misinformation about vaccines, on the platform. Swanson said that it’s not Facebook’s job to police the dialogue around immunizations, but rather to make sure users have ample access to scientifically valid content.
“You wouldn’t go see a pediatrician who doesn’t hold medical certification, but on the internet, you might listen to them,” Swanson said. “Facebook isn’t responsible for changing quacks but they do have an opportunity to change the way information is served up.”
But Facebook’s algorithms often promote anti-vaccination content over widely accepted, scientifically backed posts or pages about vaccinations. A recent investigation from the Guardian found that Facebook search results regarding vaccines were “dominated by anti-vaccination propaganda.” Facebook did not respond to questions from the Guardian about its plans for dealing with the issue.
“Using a new account with no friends or likes, the Guardian used Facebook’s search bar to begin typing the word ‘vaccine’,” the investigation said. “Facebook’s autofill quickly began suggesting search terms that would steer a user toward anti-vaccine misinformation, such as ‘vaccination reeducation discussion forum,’ ‘vaccine reeducation,’ ‘vaccine truth movement’ and ‘vaccine resistance movement.’”
Facebook also accepted advertising revenue from Vax Truther, Anti-Vaxxer, Vaccines Revealed and Michigan for Vaccine Choice, among others, according to another investigation from the Guardian.
A recent study from the Credibility Coalition and Health Feedback, a group of scientists that evaluates the accuracy of health media coverage, found that the majority of the most-clicked health stories on Facebook in 2018 were either fake or contained a significant amount of misleading information. The study looked at the top 100 health stories with the most engagement on social media and had a network of experts assess their credibility. Less than half were rated “highly credible.” Vaccinations ranked among the three most popular story topics.
“Considering that the number of shares for neutral and poorly-rated articles amount to almost half the total shares, this indicates that more work needs to be done to curb the spread of inaccurate health news,” experts wrote in the study. “Much of the spread is facilitated by Facebook, which accounts for 96 percent of the shares of the top 100 articles.”
Health-related content is eligible for review by Facebook’s fact-checking partners, meaning content found to be misleading or false will be demoted in users’ feeds and appear alongside related articles from fact-checkers. But this doesn’t work in groups, where the bulk of anti-vaccination material is spread.
A working paper published in November by the National Bureau of Economic Research looked at the role of Facebook in spreading false information about vaccines. The paper found that Facebook’s ban on ads that linked to fake news stories did lead to a decrease in shares of anti-vaccination content. But in anti-vaccination circles, ads aren’t the primary issue – people are. The researchers found that anti-vaccination groups on Facebook tend to pass around the same misleading links and junk science, then spread the information to the broader public through likes, shares and word of mouth.
“The majority of misinformation about vaccines is spread by individuals – and the majority of that misinformation by a few individuals – sharing the message organically,” Catherine Tucker, a professor at the Massachusetts Institute of Technology and co-author of the NBER paper, said in an email to The Post. “That is a far harder problem to solve, as trying to clamp down on that kind of social sharing has tensions with trying to preserve free speech.”