Let’s Keep the Vaccine Misinformation Problem in Perspective

Social media is not the reason the pandemic hasn’t been conquered. 

With Covid cases surging in parts of the United States and vaccinations proceeding at a crawl, all eyes are on social media platforms. Many people, particularly some Democrats in Washington, appear to believe that online misinformation is at the heart of the flagging vaccination campaign. President Joe Biden summed up the mood when he suggested that Facebook was “killing people,” kicking off weeks of frenzied coverage. (He later clarified that he was referring to the purveyors of misinformation, not Facebook itself.) Then, late last week, Senator Amy Klobuchar of Minnesota introduced a bill that would strip away Section 230 immunity for vaccine content promoted by social media algorithms.

The Klobuchar bill is an apt mascot for the confused state of discourse over misinformation. Because it would direct the government to decide what counts as misinformation and then treat content differently on that basis, it would probably violate the First Amendment. Of course, Klobuchar’s proposal will never become law; it is what’s known as a messaging bill. And the message seems to be that, in order to close the vaccination gap and finally end the pandemic, social media platforms just need to do something.

That focus, though, may misdiagnose the problem. At least in the US, vaccine hesitancy appears to be a largely top-down partisan phenomenon, in which public behavior is influenced by elite messaging. As of last month, just over 10 percent of adult Democrats had not been vaccinated, compared to nearly 50 percent of Republicans, according to multiple surveys. (And while access is still an issue for some Americans, the most-cited reasons for not getting the shot come down to willingness, not ability.) As my former colleague Daniel Engber recently pointed out in The Atlantic, this partisan gap has been remarkably stable, and it precedes Fox News’ recent turn toward constant vaccine-skeptical coverage. The underlying cause for the partisan split, Engber suggests, might therefore be the fact that Republicans have, throughout the pandemic, been much less afraid of Covid than Democrats. The country’s most influential Republican, Donald Trump, relentlessly played down the risk of the disease from the start (including during his own hospitalization), and millions followed his cue. Research has found that the biggest predictor of whether Americans view Covid-19 as a threat is not their scientific literacy or demographics, but whether they trust Fox News and Breitbart over CNN and The New York Times. Viral rumors on social platforms may have widened the divide, but it seems clear that Republican Party messaging, amplified by its traditional media architecture, created it.

It’s also a mistake to assume that all anti-vaccine sentiment is based on misinformation per se. There has lately emerged a wealth of polling and news reports providing insight into the motivations of the vaccine-hesitant. One of the most common reasons they give for their reluctance is that we don’t yet know if the vaccines have long-term side effects. Another is that the Food and Drug Administration hasn’t officially approved any of the vaccines. Both of these facts are technically true. Are they sensible reasons to refuse the vaccine, in light of all we do know? No. But they are not false.

So it goes with a great deal of anti-vaccine messaging. Yes, many people who say they’re afraid of side effects have bought into false rumors about vaccines harming fertility or altering one’s DNA. But many are responding to real, if rare, reports of serious side effects; the government really did pause administration of the Johnson & Johnson vaccine because of a rare blood-clot disorder. Drug companies really are greedy. The right fact, presented without adequate context, can be more than enough to scare people away.

The most visible vaccine-skeptical public figures, the likes of Tucker Carlson or Senator Ron Johnson (R-Wisconsin), understand this. They don’t need to spread demonstrable falsehoods. They can simply focus night after night on outlier cases of severe side effects. Or they can selectively present results of scientific studies or government communications in ways that seem to suggest something ominous about either the virus or the vaccine. Or they can skirt the scientific question entirely in favor of ranting about how the government’s vaccine push is really about social control. Like any illusionist, they know that the most powerful tool available is not misinformation, but misdirection.

That subtle distinction is often lost on members of the media and the political establishment. At times, “misinformation” becomes a catch-all term for any material used to dissuade people from getting the shot, whether or not it is objectively false. A recent New York Times article about the influential anti-vaxxer Joseph Mercola, for example, titled “The Most Influential Spreader of Coronavirus Misinformation Online,” concluded by noting that Mercola had made a Facebook post suggesting that the Pfizer vaccine was only 39 percent effective against infection by the Delta variant. Mercola was accurately relaying the findings of a real study, one that had been covered by mainstream news outlets. The Times article tweaked him, however, for not mentioning the study’s other finding, that the vaccine is 91 percent effective against serious illness.

No doubt Mercola, an osteopathic physician who has made a fortune selling “natural” health products often advertised as alternatives to vaccines, would have done his followers a service by sharing that data point. Cherry-picking true statistics to sow doubt in vaccines is dangerous. But to sweep that example under the umbrella of misinformation is to engage in concept creep. Misinterpretation is not the same thing as misinformation, and this is not merely a semantic distinction. Facebook, YouTube, and Twitter are rightly under immense pressure to do more to prevent the spread of dangerous falsehoods on their platforms. They often take their cues from established media organizations. It would be a troubling development for online free speech if, in the name of preventing real-world harm, platforms routinely suppressed as “misinformation” posts that don’t contain anything objectively false. It’s hard enough to distinguish between truth and falsity at scale. It would be reckless to ask platforms to take on the responsibility of judging whether a user’s interpretation of the facts (their opinion about a matter of public policy) is acceptable or not.

“It for sure is the case that misinformation is making things worse,” said Gordon Pennycook, a behavioral psychologist at the University of Regina. “There are people who believe things that are false, and they read those things on the internet. That for sure is happening.” But, Pennycook went on, “the more you focus on that, the less you talk about the avenues in which people come to be hesitant that have nothing to do with misinformation.”

In his research, Pennycook runs experiments to figure out how people actually respond to online misinformation. In one study, he and his coauthors tested whether people would be convinced by the claim in a fake news headline after being exposed to it online. (Sample headline: “Mike Pence: Gay Conversion Therapy Saved My Marriage.”) In one phase of the experiment, exposure to fake news headlines raised the number of people who rated the claim as accurate from 38 to 72. You could look at that and say online misinformation increases belief by 89 percent. Or, you could note that there were 903 participants overall, meaning the headlines only worked on 4 percent of them.
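To make the two framings concrete, here’s the arithmetic, using only the figures reported above (a minimal sketch in Python; the variable names are mine):

```python
# Figures reported above, from Pennycook's experiment.
believers_before = 38  # rated the fake claim accurate before exposure
believers_after = 72   # rated it accurate after exposure
participants = 903     # total participants in the study

new_believers = believers_after - believers_before  # 34 people

# Framing 1: belief rose by roughly 89 percent relative to the baseline.
relative_increase = new_believers / believers_before  # 34 / 38 ~ 0.89

# Framing 2: the headlines swayed roughly 4 percent of the whole sample.
share_of_sample = new_believers / participants  # 34 / 903 ~ 0.04

print(f"relative increase: {relative_increase:.0%}")        # 89%
print(f"share of all participants: {share_of_sample:.0%}")  # 4%
```

Same data, two honest denominators; which one you lead with determines how alarming the result sounds.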

The current debate over vaccine misinformation sometimes seems to imply that we’re living in an 89 percent world, but the 4 percent number is probably the more helpful guidepost. It would still be a serious problem if only a small percentage of Facebook or YouTube users were susceptible to vaccine misinformation. They’d be more likely to refuse to get vaccinated, to get sick, and to spread the virus (and, perhaps, their false beliefs) to others. At the same time, it’s important to keep in mind that somewhere around one third of American adults are still choosing not to get vaccinated. Even if Facebook and YouTube could erase all anti-vaxx content from their platforms overnight, that would only take one bite out of a much larger problem.

“Misinformation is not irrelevant. It’s not something to ignore, but there are other things that are going to be more important,” said Katherine Ognyanova, an assistant professor of communication at Rutgers University and a member of the Covid States Project, which studies public opinion on the pandemic. “It’s still important to combat misinformation, but it’s only one factor. It’s not going to solve everything if it went away.”

It’s tempting to think that vaccine hesitancy can be overcome simply by taking false information out of circulation. It’s especially tempting to the sorts of highly educated people who make up the national media and political landscapes. People are rational, we think, and so if they are exposed to the right facts, they will reach enlightenment. At the same time, however, much of the discussion around misinformation seems to rest on a conflicting assumption, never stated aloud, that people simply cannot be trusted; that misinformation has an almost mystical power to overcome the intellect of any average Joe exposed to it, at which point it’s no use showing him the facts. Of course, neither notion is correct. People make decisions and form beliefs for all kinds of reasons, most of which have little to do with scientific evidence: gut feelings, partisanship, religious values, and so on. But they also don’t credulously believe every false rumor that appears on their feed. There are many other factors at play.

In the case of vaccine intentions, that means different techniques will be needed to convert different slices of the population. Some people still just need easier access or assurances that the vaccines really are free. For the “wait and see” crowd, evidence that hundreds of millions of people have been vaccinated without incident could eventually wear down resistance. (So could rising case rates, as we may already be seeing.) 

But for the more stubborn holdouts and outright denialists, a harder line will be necessary. As the political scientist Brendan Nyhan put it in the pre-Covid world of 2019, apropos of measles outbreaks and school immunization requirements, “the focus on anti-vaccine content on social media can obscure the most important factor in whether children get vaccinated: the rules in their home states.” The political capital currently going toward berating social media companies would probably be better spent pressuring governments and employers to impose vaccine mandates. (Happily, there has been some momentum in that regard over the past few days, with institutions like the Veterans Health Administration, the state of California, and even a coalition of San Francisco bar owners all implementing some degree of mandatory vaccination.)

That’s not to say that social media misinformation should be ignored. But it appears to be more of a long-term threat to our overall information environment than an immediate crisis preventing the US from conquering Covid-19. As Will Oremus recently explained in The Washington Post, the underlying issue is tech companies’ use of ranking algorithms designed to show users whatever is most engaging, rather than what is most edifying. This is easily stated, but it will not be an easy problem to solve. The platforms insist that they are already going to extreme lengths to shield users from vaccine misinformation and steer them toward accurate sources, and this may well be true. A recent report by the advocacy organization Avaaz, for example, is styled as proof of how Facebook continues to funnel users toward anti-vaxx material. But by the report’s own admission, a researcher who searched “vaccine” had to scroll past dozens of reputable sources, including public health bodies, before getting to unreliable pages. As any business that has been downranked by Google’s algorithm can tell you, being buried below the first page of results is nearly as bad as being banned. These results could just as easily be taken as proof that Facebook is working hard to suppress misinformation.
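To see the tension Oremus describes in miniature, consider how an engagement-first ranker and a reliability-weighted one order the same feed (a toy sketch; the posts, scores, and weighting are invented for illustration and bear no relation to any platform’s actual system):

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    engagement: float   # predicted clicks, shares, and comments (0 to 1)
    reliability: float  # 0.0 (known misinformation) to 1.0 (authoritative)

# Hypothetical feed; real systems score millions of items on many signals.
feed = [
    Post("Shocking vaccine side-effect story", engagement=0.9, reliability=0.2),
    Post("Public health agency safety overview", engagement=0.3, reliability=1.0),
    Post("Local clinic walk-in vaccination hours", engagement=0.5, reliability=0.9),
]

# Engagement-first: whatever is most engaging rises to the top.
by_engagement = sorted(feed, key=lambda p: p.engagement, reverse=True)

# Reliability-weighted: engaging but unreliable posts sink, which is
# downranking (burying below the fold) rather than outright removal.
by_quality = sorted(feed, key=lambda p: p.engagement * p.reliability, reverse=True)

print([p.title for p in by_engagement])  # sensational post ranks first
print([p.title for p in by_quality])     # sensational post ranks last
```

The second ordering is the spirit of the downranking the platforms say they already do, and it shows why being buried can be nearly as consequential as being banned.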

The problem, as always, is that we’re generally stuck taking Facebook’s word for it, and YouTube’s, because they don’t share enough data to allow outside researchers to evaluate the performance of their algorithms.

This week, Ognyanova and the Covid States Project released a paper analyzing the link between people’s vaccine attitudes and where they get their news. People who had gotten their news about Covid in the previous 24 hours from Facebook, the team found, were even less likely to be vaccinated than people who got their news from Fox News, and more likely to say they didn’t intend to get the shot in the future. (Of the news sources they asked about, only Newsmax predicted more hesitancy than Facebook.) But, the researchers note, it’s impossible to conclude that getting pandemic news on Facebook causes vaccine skepticism. Like Pennycook, they found that both vaccine hesitancy and Facebook news consumption correlated with low trust in mainstream media.

“People who don’t trust institutions, who don’t trust the media, who don’t trust doctors and hospitals, they’re going to be the same people who avoid mainstream media and get their news from social media,” said Ognyanova. “We have to ask ourselves, are those people not getting vaccinated because they’re getting more misinformation, or because they already mistrust the system?”

Ognyanova suspects that the answer is a mix of both. But it will be very hard to get to the bottom of it without being able to take survey data about vaccine attitudes over time and match it up with the content that Facebook users are actually seeing. Facebook and other social media platforms have that information, but they rarely give researchers access to it.

“The combination of understanding people’s attitudes and behavior, on one hand, and how their attention is targeted to items on social media platforms, on the other: that’s something that’s going to be very powerful,” said Ognyanova. “But that’s difficult to get.”

As long as it remains difficult to get, social media platforms will continue to invite the abuse they’re receiving. Which is why the road to solving the online misinformation problem begins with the boring but vital concept of transparency. Until outside researchers can find out what’s really going on, the misinformation debate will remain misinformed.
