Facebook’s dilemma: How to police claims about unproven COVID-19 vaccines
Since the World Health Organization declared the novel coronavirus an international health emergency in January, Facebook has removed more than 7m pieces of content with false claims about the virus that could pose an immediate health risk to people who believe them.
The social media giant, which has long been under fire from lawmakers over how it handles misinformation on its platforms, said it had in recent months banned claims such as “social distancing does not work” because they pose a risk of “imminent” harm. Under these rules, Facebook took down a video post by US president Donald Trump in which he claimed children are “almost immune” to Covid-19.
But, in most instances, Facebook does not remove misinformation about the new Covid-19 vaccines that are still under development, according to the company’s vaccine policy lead Jason Hirsch, on the grounds that such claims do not meet its imminent harm threshold. Hirsch said the company is “grappling” with the dilemma of how to police claims about new vaccines that are as yet unproven.
“There’s a ceiling to how much we can do until the facts on the ground become more concrete,” Hirsch said in an interview with Reuters.
Tom Phillips, editor at Full Fact, one of Facebook’s fact-checking partners, sees the conundrum this way: “How do you fact-check a vaccine that does not exist yet?”
The worry, public health experts said, is that the spread of misinformation on social media could discourage people from eventually taking the vaccine.
At the same time, free speech advocates fret about increased censorship during a time of uncertainty and the lasting repercussions long after the virus is tamed.
Facebook representatives said the company has been consulting with about 50 experts in public health, vaccines, and free expression on how to shape its response to claims about the new Covid-19 vaccines.
Even though the first vaccines aren’t expected to go to market for months, polls conducted in the US show many people are already concerned about taking a new Covid-19 vaccine, which is being developed at a record pace.
Some 28% of Americans say they are not interested in getting the vaccine, according to a Reuters/Ipsos poll conducted between July 15 and 21. Among them, more than 50% said they were nervous about the speed of development. More than a third said they did not trust the people behind the vaccine’s development.
The UK-based non-profit Center for Countering Digital Hate (CCDH) reported in July that anti-vaccination content is flourishing on social media sites.
Facebook groups and pages accounted for more than half of the total anti-vaccine following across all the social media platforms studied by the CCDH.
Schneider told Reuters he is suspicious of the Covid-19 vaccine because he thinks it is being developed too fast to be safe. “I think a lot of people are freaking out,” he said.
Posts about the Covid-19 vaccine that have been labelled on Facebook as containing “false information” but not removed include one by Schneider linking to a YouTube video that claimed the vaccine will alter DNA, and another claiming the vaccine would give people the coronavirus.
Facebook said these posts did not violate its policies related to imminent harm.
“If we simply removed all conspiracy theories and hoaxes, they would exist elsewhere on the internet and broader social media ecosystem,” a spokeswoman said, adding that labelling such posts instead “helps give more context when these hoaxes appear elsewhere”.
Facebook does not label or remove posts or ads that express opposition to vaccines if they do not contain false claims.
Hirsch said Facebook believes users should be able to express such personal views and that more aggressive censorship of anti-vaccine views could also push people hesitant about vaccines towards the anti-vaccine camp.
At the crux of Facebook’s decisions over what it removes are two considerations, Hirsch said.
If a post is identified as containing false information, it will be labelled and Facebook can reduce its reach by limiting how many people will be shown the post.
If the false information is likely to cause imminent harm, then it will be removed altogether.
In March 2019, Facebook said it would start reducing the rankings and search recommendations of groups and pages spreading misinformation about any vaccines.
Facebook’s algorithms also promote links to organisations such as the WHO when people search for vaccine information on the platform.
Misinformation about other vaccines has rarely met Facebook’s threshold for risking imminent harm.
However, in Pakistan last year, the company intervened to take down false claims about the polio vaccine drive that were leading to violence against health workers.
In the Pacific island state of Samoa, Facebook deleted vaccine misinformation because the low vaccination rate was exacerbating a dangerous measles outbreak.
To combat misinformation that doesn’t meet its removal criteria, Facebook pays outside fact-checkers who can rate posts as false and attach an explanation.
The company said that, 95% of the time, people who saw fact-checkers’ warning labels did not click through to the content.
Still, the fact-checking programme has been criticised by some researchers as an inadequate response to the volume and speed of viral misinformation on the platforms.
Fact-checkers also do not rate politicians’ posts, and they do not judge posts that are exclusively in private or hidden groups.
Determining what constitutes a false claim regarding the Covid-19 shot is much harder than fact-checking a claim about an established vaccine with a proven safety record, Facebook fact-checkers said.
In a study published in May in the journal Nature, physicist Neil Johnson’s research group found that there were nearly three times as many active anti-vaccination groups on Facebook as pro-vaccination groups during a global measles outbreak from February to October 2019, and they proliferated faster.
Since the study was published, anti-vaccine views and Covid-19 vaccine conspiracies have flourished on the platform, Johnson said, adding, “It’s kind of on steroids.”