Social Media Is Used To Spread Coronavirus Disinformation, And Patient Groups Are Fighting Back
A study of Facebook found that misinformation about COVID-19 had been viewed far more often than messages from official health organizations.
For decades, people struggling with illnesses of all kinds have sought help in online support groups, and during 2020, such groups have been especially important for many COVID-19 patients who often must recover in isolation.
The fear and uncertainty regarding the coronavirus have made online groups targets for the spread of false information, however, and in an effort to help fellow patients, some of these groups are making a mission of stamping out misinformation.
Experts say that without such efforts, these groups could risk contributing to problems that have made fighting the coronavirus especially difficult.
“Within the case of COVID, it’s been really concerning to see how misinformation and disinformation is not only spreading quickly but really causing this undermining of trust of science and medicine,” says Elizabeth Glowacki, a Northeastern University health communications researcher.
When Matthew Long-Middleton got sick on March 12, he was desperate to find someone who would understand him.
The 36-year-old avid cyclist and training manager for America Amplified, which is based at KCUR, says his doctor dismissed his symptoms, and Long-Middleton struggled to find any medical information that would explain what he was experiencing.
Isolated in his apartment in Kansas City, Missouri, he joined a COVID-19 support group run by an organization called Body Politic on the messaging platform Slack.
“I had no idea where this road leads, and so I was looking for support and other theories and some places where people were going through a similar thing, including the uncertainty, and also the thing of like, we have to figure this for ourselves,” Long-Middleton says.
But Long-Middleton came to realize that he had opened the door to a stream of questionable information about the illness.
Much of the COVID-19 misinformation that has spread organically on social media appears to have started as inaccurate news reports, rumors or poor-quality medical research, according to Emilio Ferrara, a University of Southern California researcher who has conducted computer analyses of tens of millions of COVID-19-related tweets.
Much of the hype surrounding the use of hydroxychloroquine to treat COVID-19, for example, began as unverified medical rumors.
However, Ferrara explains that once this misinformation begins circulating through social media, it can function just the same as propaganda.
But disinformation, which is deliberately created and spread by influential groups, is common on social media, too. Ferrara’s tracking of tweets shows that disinformation began to proliferate in the U.S. during the “informational vacuum” of February.
“There was an eagerness for information about COVID-19, but not quite enough information out there,” Ferrara says. “So what we observed was that certain groups took advantage of that, in essence. They started talking about COVID-19 in the context of politics.”
Ferrara says that the conspiracy group QAnon and other alt-right organizations pushed disinformation that the virus was man-made, transmitted from products imported from China or related to 5G networks, among other false claims.
Unraveling the messaging
Concerns about disinformation led the international nonprofit human rights group Avaaz to study the problem.
Avaaz campaign director Fadi Quran says that some of these disinformation campaigns operate on the same scale and incorporate many of the same strategies that are used to undermine elections and destabilize governments across the globe.
“When we compared this to the reach of websites linking to the World Health Organization or the CDC, we found that the health misinformation spreading through networks we identify in this report had four times the reach of the official health websites, and that’s dangerous,” Quran says.
In the face of a global disinformation campaign and a still-growing pandemic, some COVID-19 patients, including Vanessa Cruz, have been fighting back.
Cruz, a 43-year-old self-described “stay-at-home mom,” has spent most of the last six months moderating the COVID-19 Facebook support group “have it / had it” on her phone from her house in the Chicago suburbs.
Cruz has been recovering from COVID-19 since late March.
“If I was not working in the group, I would just have an anxiety attack all day,” Cruz says. “My relief and my way to escape it was basically working in the group and comforting other members.”
The group has more than 27,000 members in more than 100 countries, according to its founder, Jay Sinrod, who lives in Brooklyn. It draws about 1,500 new members each week.
She and the group’s 16 other administrators and moderators, who include two nurses and a biologist, work to screen every person who wants to join the group and fact check every piece of content that is posted.
“We have to vouch for all that,” Cruz says. “We basically have to see the video through. We have to read the article. We have to check out the survey. We have to Google everything to make sure everything is legit and that it makes sense and that it can be sharable.”
Cruz says that Facebook helps. In April, the platform announced it would step up its efforts to fact check posts and label or remove false content related to the virus. Twitter has also blocked users and removed tweets that include COVID-19 misinformation.
Even with the fact checking, however, online support groups can still contain misleading or confusing information that doesn’t always align with current evidence.
Long-Middleton says that in the interest of providing a forum for patients to explain their personal experiences, support groups can inadvertently become platforms for touting unproven treatments.
“The things I found myself struggling with are like, ‘Oh, I’ve been taking this extract and doing this exercise every day, and it’s been helping so much,’” Long-Middleton says. “And it’s like, I guess I could try it, but that’s a sample size of one. It’s just your experience.”
He says that over time, the glut of information and anecdotes in the Body Politic group became overwhelming, and, unlike for Cruz, his support group became a source of anxiety that he needed to take breaks from.
Vanessa Cruz says conversation in her group can also be skewed by the "Covid fog" that seems to hamper many patients' critical thinking, as well as by polarization over certain topics, like hydroxychloroquine, which can drive the discussion into toxic territory.
Cruz takes hydroxychloroquine herself to treat her arthritis, and she says that puts her in a strong position to intervene in the frequent debates over the drug.
“It would get really heated and ugly and bullying,” Cruz says. “And it’s something where I can honestly step in and say, ‘OK, you guys just need to calm down first, and we need to respect each other’s stories, because that’s what we’re here about, above all else.’”
Not every support group has its own Vanessa Cruz to defuse tension, or the in-house expertise to weed out all the misinformation.
Knowing the facts
But Northeastern University’s Elizabeth Glowacki says there are steps individuals can take to keep support groups constructive and evidence-based.
Her research on social media messages during the 2014 Ebola outbreak tracked how public health messages were derailed by messages that aimed to stoke fear and assign blame.
She says support group members can avoid those traps by paying attention to the kind of language used in posts they engage with and share.
“Information that’s accurate, credible and informed shouldn’t try to incite a lot of fear,” Glowacki says. “So I think asking these questions of why is this information being pushed out, is there a certain agenda that this information is trying to fulfill or is it really trying to provide information in a more neutral way?”
Fadi Quran of Avaaz thinks larger changes are needed in social media platforms themselves.
He says that, while Facebook’s efforts to stamp out misinformation have helped, the platform must revise the way it prioritizes content.
“Facebook’s algorithm prefers misinformation, prefers the sensational stuff that’s going to get clicks and likes and make people angry,” Quran says. “And so the misinformation actors, because of Facebook, will always have the upper hand.”
Facebook did not respond to questions for this story.
When Matthew Long-Middleton was in need of hope during his months with symptoms, he would turn to a section of the Body Politic group called “victories,” in which people would describe milestones of overcoming the virus.
After almost half a year, he says he finally felt healthy again in August, and he’s been working on his own victory post.
Looking back on his recovery, however, he says he’s frustrated by the many false or dangerous treatments he heard about, especially the suggestion voiced by President Trump that COVID-19 patients might be treated by injecting disinfectant.
He says he understands how the confusion and fear that can accompany severe COVID-19 illness could drive someone to consider such a desperate and potentially lethal action.
“You want to find hope, but you don’t want the hope to lead you down a path that hurts you. It’s this weird balance of, like, I need something to feel like, ‘Oh, maybe that will work. Maybe that will be the thing that makes me feel better.’”