A measles outbreak is, in and of itself, shocking. Measles outbreaks in developed regions, such as those in the Pacific Northwest, New York, and the UK this winter, are all the more distressing given the relatively easy access to vaccines and reliable medical information in those areas. Worldwide, measles cases rose 50 percent last year, according to the World Health Organization, which now ranks “vaccine hesitancy” as one of the world’s top ten global health threats. The ongoing struggle against so-called anti-vaxx misinformation combines our natural inclination toward myopia, the tendency to ignore far-off consequences, with the assumption that those consequences will be borne by someone else; in this case, that someone else is the parents’ own children. Making a choice almost certain to harm one’s children would likely be considered child endangerment if it weren’t so readily explained by fields like behavioral science.

Humans, it seems, have a predictable, if not entirely rational, susceptibility to misinformation.

It’s tempting to think that the solution to ever-declining vaccination rates lies in the triumph of rationality over irrationality, and of information over propaganda. But this simple solution ignores the fact that people continue to smoke cigarettes despite the Surgeon General’s warning, use fossil fuels while icebergs melt, and fail to save for retirement despite the certainty that we’re not getting any younger.

The key difference between those examples and anti-vaxxers is that the consequences of leaving children unvaccinated are not far off in the future; they are happening right now. Even more critically, the negative downstream effects are suffered not by the parents (who were likely vaccinated themselves) but by their children. And despite the abundance of reliable medical information, anti-vaccination misinformation seems to be more powerful.

The way out of the problem of well-meaning people choosing not to vaccinate their children is the way we got into it: behavioral science. Whether they know it or not, anti-vaxxers are currently achieving informational parity through the (arguably unintentional) smart use of behavioral science. That’s the bad news.

The good news is that the path to victory lies in a more effective and intentional use of behavioral science in public health campaigns. The medical community and government bodies have done a good job getting accurate information out there, but, as the old saying almost goes, you can lead a horse to water but you can’t make him think. You can’t reason someone out of a position they didn’t use reason to get into. Public health campaigns would be more effective if they focused more on the irrational ways people behave rather than the rational ways they think.

The reality is that until public health advocates adopt behavioral tactics at scale, preventable outbreaks of measles and other diseases are a near certainty. In the U.S., parental vaccine refusals have increased by 70 percent over the last six years. A drop in vaccination rates is being blamed for a four-fold increase in measles cases in the European Union, a problem exacerbated by the U.K., where child vaccination rates have fallen for the fourth straight year.

Even more troubling is the advent of the designation “measles hot spots,” a term for major cities, including Detroit, Houston, and Phoenix, where vaccination rates are lower and future outbreaks may be brewing.

Anti-Social Media

The appeal of misinformation is nothing new to humanity, whether unintentional (thunder is not a sign of an angry deity) or intentional (see also: the Trojan Horse). And the belief that, as Jonathan Swift wrote in 1710, “Falsehood flies, and the truth comes limping after it,” is so pervasive that quotes to this effect have been falsely attributed to everyone from Winston Churchill to Mark Twain to Thomas Jefferson, unintentionally proving its own point. We have known for centuries that lies are the hare and honesty is the tortoise, except in real life the rabbit never rests.

Confirming what nearly everyone has personally experienced, in 2018 scientists from MIT demonstrated that falsehoods diffuse dramatically faster and farther than the truth. The reasons why falsehoods spread faster also make intuitive sense.

We found that false news was more novel than true news, which suggests that people were more likely to share novel information. Whereas false stories inspired fear, disgust, and surprise in replies, true stories inspired anticipation, sadness, joy, and trust. (“The spread of true and false news online,” Science, Mar. 9, 2018)

What’s noteworthy here is that the MIT researchers were not asking respondents what they thought or to explain their reasons for sharing. Instead, the study simply measured how people shared and looked to the actual triggers for those patterns of behavior. In this way, behavioral science gives a clearer understanding of the diffusion of information than something like traditional political polling.

While gossip and propaganda traveled much more slowly even 50 years ago, the misinformation of today enjoys a far more expedient vehicle: social media. The research on fake news shows that misinformation “diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information.”

“It might have something to do with human nature,” said MIT data scientist Soroush Vosoughi, the study’s lead author.

What’s more, if not worse, is that this social media advantage does not seem to extend to the spread of accurate information. It seems that our initial emotions of surprise, fear, and disgust latch onto our lower-order decision modes before we can engage higher-order processing. A reporter inadvertently demonstrated this after the Toronto terror attack in which a truck was driven into pedestrians. One tweet falsely described the driver as “wide-eyed, angry and Middle Eastern,” while another more accurately identified the attacker as white. The false tweet made it around the world (literally, if virtually) within 24 hours, before the accurate tweet had tied its shoes, albeit metaphorically.


It’s true that people have always organized, and reorganized, themselves into tribes. People have always sought out stories that confirm their suspicions or help them place blame. But until the internet, social networks only existed in what we now think of nostalgically as the “real world.”

Now many, if not most, of us organize our virtual lives on online social networks that algorithmically reward engagement and exploit the fallacy that things that are easy to comprehend must be true, as in the rhyming epigram, “If it doesn’t fit, you must acquit.” As long as humans are attracted to disgusting, scary, or surprising novelties, social media as currently structured will give misinformation an advantage over the boring, old truth, creating the most effective confirmation-bias systems in human history.

Of course, anti-vaccination movements predate social media, not to mention the internet. No sooner did smallpox vaccination begin in the 1800s than some members of the clergy denounced it as heretical. The Vaccination Act of 1853 mandated that infants be immunized, spawning the first parental-rights movement. More recently, The Lancet published a paper by a British scientist linking the MMR vaccine to autism; the journal later retracted the paper, and its claims have since been thoroughly debunked as unscientific hogwash.

How Anti-Vaxxers Use Behavioral Science

If social media had existed in early 19th century Great Britain, we might still be losing loved ones to smallpox. As we’ve seen, social media rewards engagement, and engagement gives primacy to novelties that convey fear, surprise, and disgust. And whether they realize it or not, that is exactly how anti-vaccination activists are pushing their agenda over social networks.

First, and this cannot be overstated, anti-vaxxers enjoy a structural narrative advantage over the medical establishment. Vaccine safety occupies the default position, the status quo, if you will, allowing any opinion to the contrary to appear new, even novel. A parent logging onto social media will be drawn to what appears to depart from the norm, by dint of the seeming novelty of an idea that is in fact as old as vaccines themselves.

Next, the images shared online reveal a marketing strategy perfectly calibrated to go viral over online social networks. An ostensibly health-related website posted a graphic this month claiming people are more likely to die from getting vaccinated for measles than from measles itself, eliciting fear. The image appeared in an article about Somalis who have resettled in Minnesota, who, the article contended, have the world’s highest rate of autism. This strategy not only compounds a fear of death with a fear of dark-skinned immigrants, but also exploits our irrational weakness for the fallacy that information which is easy to understand or intuitive must be accurate.

Another element of the anti-vaccination campaign is the assertion that a conspiracy is being revealed. This isn’t just honest scientists and ordinary doctors wanting to prevent your child from contracting a horrible disease. No, they claim in self-published books and YouTube videos, these vaccinations are part of a secret social experiment. This, for the gullible, combines novelty with surprise, probably a little fear, and certainly feelings of invasion. The false contention that “your child may receive up to 81 vaccines by six years of age” surprises parents.


Finally, anti-vaxxers adroitly capitalize on disgust, as well as on the power of association, in the meme that asks, “Do you know who also supported vaccination?” with an image of syringes arranged in the shape of a swastika. Never mind that one could say the same about Nazis using automobiles, bread, and shoes. Apparently, Godwin’s law is the only scientific law that anti-vaccination activists believe in.

All of these elements of the campaign against childhood immunizations employ what behavioral science has found to be best practices for communication. If the MIT study is correct, we should expect anti-vaccination messages to enjoy a disproportionate amount of online real estate relative to medically accurate information.

This is, in fact, exactly what researchers found. A 2018 study published by the National Institutes of Health found that anti-vaccination sources comprised 32 percent of immunization-related materials on YouTube and 43 percent of the first 100 websites returned by a Google search on vaccination, including all of the top 10. At the same time, these sources earned higher ratings on YouTube, and the Google results got more traction, causing them to turn up again in a search five months later. So while there is less anti-vaccination information out there, these memes and articles receive a disproportionate amount of online attention, making their message feel like it has a larger following than it actually does.

A 2013 UNICEF report about anti-vaccination movements in Eastern Europe attributed the online success of anti-vaccination messaging to the trust we place in others in our social networks, including those we do not actually know in real life.

One study found that 78% of consumers trusted social peer recommendations, while just 14% trusted advertisements. Intensive interaction and content sharing through social media means that an audience instinctively determines its own opinion leaders.

This, UNICEF concluded, is how Jenny McCarthy, an actress, model, television host, author, and screenwriter, has become best known in the social media age not for any of her eponymous television shows or any of the Hollywood movies in which she stars. Instead, her first descriptor on her Wikipedia page is that of an “American anti-vaccine activist” despite a lack of any medical credentials.

The results of this weaponization of immunization misinformation were borne out in a December 2018 report issued by the Royal Society for Public Health, which found that 91 percent of parents agree that vaccines are important for their children’s health. At the same time, a quarter of parents hold the incorrect belief that “you can have too many vaccinations” (another example of the fallacy that what is easily understood must be true). And to further testify to the disproportionate efficacy of online misinformation, half of all new parents in the study reported being exposed to misinformation about vaccines online.

An American Journal of Public Health study found a similar result in the United States. While noting that a clear majority of Americans think vaccines are not only safe but should be required, the AJPH study found that those who went online came away with the impression that a serious and ongoing debate about vaccine safety existed.

The authors of the 2018 NIH study concluded that inaccurate online messaging contributed to parents getting their children fewer vaccinations than recommended, leading to the outbreaks we are seeing now and raising the question of what can be done.

How Behavioral Science Can Prevent Future Outbreaks

Some, including the authors of the Royal Society study from last December, asked social media companies to kick anti-vaccination propagandists off their platforms, the press to more responsibly “share factual information,” schools to improve education about vaccines, and “a more diverse range of locations” to offer brochures about vaccinations. Some combination of censorship of misinformation and dissemination of accurate brochures could have some effect, but these ideas only deal with access to information, not how people act when they encounter the information.

Once people do encounter information, accurate or otherwise, we need behavioral science to help the right message get through. Here we can turn to behavioral scientists who have recently found success by changing communications tactics, offering hope for more effective public health campaigns in the future.

One method, discovered by Sander van der Linden, who leads the Social Decision-Making Laboratory at the University of Cambridge, treats viral misinformation like an actual virus. And as with an actual virus, he devised a way to inoculate people, in this case against false information about the scientific consensus on climate change. The trick, he found, was to inoculate people before they encountered the demonstrable malarkey.

“If you try to debunk it, misinformation sticks with people,” he told the BBC last November. “Once it’s integrated into the long-term memory, it becomes very difficult to correct it.”

The results were encouraging for those who believe in the Enlightenment. When people were given the truth, warned that misinformation was out there, armed with facts about why that misinformation was false, and only then shown the misinformation, they developed a measurable immunity to false information. In other words, they were vaccinated against the flu before flu season. And while one might wonder how the medical community can beat social networks to the punch, we should keep in mind that at least one major retailer can identify when customers are entering their second trimester. Beating Jenny McCarthy to the punch requires behavioral science, not rocket science.

Immunizing people against false information is one promising avenue. Another, more recent development has been the use of a new type of messenger to speak out against the anti-vaccination movement. A refreshing and relatable advocate, Ethan Lindenberger is a high school senior who had himself vaccinated despite the disapproval of his family.

Lindenberger spoke out against his anti-vaxx upbringing and articulated support for vaccinations on a national stage when he testified in front of Congress last week. The Ohio teen recounted his experience growing up in an unvaccinated household and shared a powerful message about the dangers of anti-vaccine propaganda, shining a more personal light on the social media networks that fuel misinformation and suspicion of science.

While his message is a powerful reminder of those who are truly affected by the decision not to vaccinate, children and teens, Lindenberger was unknowingly leveraging an important behavioral science tool: what we call “the messenger effect.” In behavioral science, who the messenger is can be almost as important as the message itself. Qualities like credibility, similarity, and likability all increase the power of a message, with similarity often being the most powerful.

Historically, information on and support for vaccines has come from organizations like the CDC or from doctors and other local health-care professionals. By changing the pro-vaccine messenger from an institutional recommendation to a human voice, Lindenberger has become a relatable conduit of accurate information for parents and teens on a national scale.

In addition to approaches like inoculating against misinformation and changing the messenger, we at Hill+Knowlton Strategies, an international PR consulting firm that employs behavioral scientists like myself, wanted to explore what other inexpensive and effective tactics might help influence vaccination behavior. We have found success in focusing not on misinformation but rather on overcoming educational and behavioral barriers to getting people vaccinated.

In one such instance, we worked with a pharmaceutical company to test the effectiveness of a range of messages encouraging adult vaccinations. We developed 16 new messages, each tapping into a different vaccine story theme (such as protecting older people) and a behavioral insight (such as a social norm). For the purposes of the test, the accompanying picture and every other part of the campaign remained identical. We tested the messages online and measured how often people clicked through to find out more about vaccines. Through this experiment, we identified a message that increased rates of vaccine information seeking by 31.3 percent.
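To make the mechanics concrete, below is a minimal sketch of how the read-out of a message test like this can work. The impression and click counts are hypothetical placeholders rather than data from our experiment, and the two-proportion z-test shown is just one common way to check whether an observed lift in click-through rate is bigger than chance.

```python
# Minimal sketch of reading out an online message test.
# All counts below are hypothetical placeholders, not actual campaign data.
from math import sqrt, erf

def click_through_rate(clicks, impressions):
    """Share of people shown a message who clicked to learn more about vaccines."""
    return clicks / impressions

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is variant B's click-through rate different from A's?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: a baseline message vs. one built on a social-norm insight.
baseline_clicks, baseline_impressions = 320, 10_000
norm_clicks, norm_impressions = 420, 10_000

lift = (click_through_rate(norm_clicks, norm_impressions)
        / click_through_rate(baseline_clicks, baseline_impressions) - 1)
z, p = two_proportion_z(baseline_clicks, baseline_impressions,
                        norm_clicks, norm_impressions)
print(f"Relative lift in click-through rate: {lift:.1%} (z = {z:.2f}, p = {p:.4f})")
```

Running each of the 16 messages against the same baseline in this way is what lets you pick a single winning message rather than relying on intuition about which theme "should" work.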

Understanding that we could change interest in vaccine information by changing the messaging of an advertisement, we wanted to know if we could help people overcome the common behavioral barrier of simply taking the time to get it done. Like many large organizations, H+K offers free flu shots, providing us with an ideal opportunity to use behavioral science on our own colleagues and see if we could increase vaccinations. In doing so, we took a different focus from the one most people take when looking at vaccinations. Perhaps because of humanity’s cognitive susceptibility to misinformation, questions surrounding the ongoing conflict between accurate and inaccurate information make up the overwhelming majority of what one finds written about vaccinations.

Instead, we focused on those we call “intention-gap people”: the sort who know they should eat vegetables, exercise, and save for retirement, not those who think that flossing causes autism. Though we don’t often talk about these folks, they comprise a significant portion of society and are our lowest-hanging fruit when it comes to inoculations. Late last year a think tank at the University of Chicago released a poll showing that 14 percent of adults surveyed in the U.S. wanted to get the flu vaccine but had not yet done so. Add to that the 6 percent who did not plan to get the flu vaccine because of the cost, the 5 percent who didn’t think they had the time, and some portion of the small minority who answered “I don’t know,” defying easy categorization, and you have roughly a quarter of adults who say they just need to close a narrow gap between intention and action.


Of course, that’s just what they say. According to the Centers for Disease Control and Prevention, the percentage of Americans who say they’ve been vaccinated against the flu midway through this winter’s flu season already exceeds the percentage who got the flu shot in all of the 2017-18 flu season, when 37.1 percent were vaccinated. It is possible that flu vaccination rates shot up in less than one complete winter, but it’s more likely that some of those who told the University of Chicago researchers they had been vaccinated were shading the truth to avoid embarrassment. If anything, this suggests that the pool of “intention-gap people” might be even bigger than previously known.


When it came to flu shots at H+K, cost was not a factor thanks to insurance benefits. The problem, we hypothesized, was time. We aimed to improve the company’s vaccine uptake by applying behavioral science insights to routine staff communications. In practical terms, that meant reworking the previous year’s email to “nudge” people to get vaccinated. One of the simpler insights we deployed was to change who sent the message.

The experiment was not complicated: the control message came from HR, and our revised message came from the team’s managing director. In doing so, we were again deploying the messenger effect, the insight that who delivers the message can be as important a driver of response as what the message says. The managing director was an effective choice as a person the email recipients know and respect. In our “nudge” message, we even signed off with a note from the managing director: “If I can make time, then you can, too.” This removed a barrier that people would otherwise have placed on themselves and gave them permission to take time away from their desks for an appointment.

Our control message performed reasonably, with 27 percent of recipients booking an appointment to get a flu shot. The message informed by behavioral science, however, blew the traditional approach out of the water, with an uptake of 40 percent. With just a handful of wording changes we were able to produce a 13-percentage-point increase in vaccine uptake.
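For readers wondering whether a jump like that could simply be noise, here is a rough illustration of how one might put a confidence interval around the difference in uptake. The group sizes below are hypothetical stand-ins rather than the actual recipient counts; the point is only to show the arithmetic, not to restate our result.

```python
# Rough illustration with hypothetical group sizes; the recipient counts here
# are placeholders, not the actual numbers from the H+K email experiment.
from math import sqrt

def diff_ci(p1, n1, p2, n2, z=1.96):
    """Approximate 95% confidence interval for the difference between two uptake rates."""
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p2 - p1
    return diff - z * se, diff + z * se

# Control email (sent by HR) vs. behavioral-science email (sent by the managing director).
control_rate, control_n = 0.27, 150   # hypothetical number of recipients
nudge_rate, nudge_n = 0.40, 150       # hypothetical number of recipients

low, high = diff_ci(control_rate, control_n, nudge_rate, nudge_n)
print(f"Observed difference: {nudge_rate - control_rate:.0%} "
      f"(95% CI roughly {low:.0%} to {high:.0%})")
```

With groups of that assumed size, the interval would not include zero, which is the kind of check that separates a genuine messenger effect from ordinary variation between two email blasts.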

This is all new, but it’s important to keep in mind that at one time humans had no way to prevent smallpox, measles, or the flu. We at H+K are focusing our internal and external efforts on reaching people who are otherwise pro-vaccination but who forget to get their flu shot or fail to complete all doses in a series like the HPV vaccine. Meanwhile, the Cambridge research shows promise in inoculating people against viral misinformation. All of this suggests that behavioral science can serve future public health campaigns as an extremely effective and inexpensive way to inoculate entire societies against preventable diseases, not to mention its broader implications for combating fake news online.