In a recent study published in Science Advances, researchers used a mathematical model to show how the general public drifted away from best-science guidance early in the coronavirus disease 2019 (COVID-19) pandemic.
Study: Losing the battle over best-science guidance early in a crisis: COVID-19 and beyond.
They empirically mapped and quantitatively analyzed the emitter-receiver network of COVID-19 guidance among online communities on Facebook, the dominant social media platform worldwide. Notably, Facebook has over three billion active users in approximately 156 countries.
Background
Distrust of guidance based on the best available science has reached dangerous levels. During the pre-vaccine period of 2020, a time of maximal uncertainty and social distancing, many individuals turned to their online communities for COVID-19 guidance on how to avoid catching the disease and on proposed cures. A 13.2% jump in social media users in 2020 pushed the total to 4.2 billion, corresponding to 53.6% of the worldwide population. Many of these people joined social media to seek information about protecting themselves and their loved ones from COVID-19.
Unfortunately, there is a substantial possibility that these members eventually become exposed to guidance that is not the best science, which, in turn, can lead to deaths from rejecting masks or drinking bleach. This raises the question of who emits and who receives guidance, and how to intervene in current and future crises beyond COVID-19 (e.g., monkeypox or climate change misinformation).
In this network, a node represents a Facebook page and a link represents one page recommending another. Each page aggregates people around some common interest, and its analysis does not require accessing personal information. A page member simply mentioning another page does not count; but when a Facebook page posts a link recommending another page to all its members, those members are automatically exposed to the recommended page's content. This is how an emitter-receiver network gets established.
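To make this mapping concrete, the minimal sketch below (not the study's code; the page names, member counts, and the convention that a directed edge runs from the recommended, emitting page to the recommending, receiving page are illustrative assumptions) shows how such a network could be encoded and queried with networkx:

```python
# Minimal sketch of an emitter-receiver network of Facebook pages.
# Page names, labels, and member counts are hypothetical; the study's
# real data set contained 1,356 interlinked pages.
import networkx as nx

G = nx.DiGraph()

# Each node is a page with a stance label and an audience size.
G.add_node("HealthyKidsParenting", stance="neutral", members=120_000)
G.add_node("VaccineScienceFacts", stance="pro", members=45_000)
G.add_node("NaturalImmunityTruth", stance="anti", members=80_000)

# A directed edge (emitter -> receiver) is added when the receiver page
# posts a link recommending the emitter page, exposing its members.
G.add_edge("NaturalImmunityTruth", "HealthyKidsParenting")
G.add_edge("VaccineScienceFacts", "HealthyKidsParenting")

# Count how many members of neutral pages are exposed to anti content.
exposed = sum(
    G.nodes[receiver]["members"]
    for emitter, receiver in G.edges
    if G.nodes[emitter]["stance"] == "anti"
    and G.nodes[receiver]["stance"] == "neutral"
)
print(f"Neutral-community members exposed to anti content: {exposed}")
```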
Although not all members necessarily pay attention to such content, a recent study showed experimentally and theoretically that as few as 25% of members can tip an online community toward an alternative viewpoint.
About the study
In the current study, researchers manually searched through Facebook pages created in 2018 and 2019 using keywords and phrases related to COVID-19 vaccines and vetted their findings via human coding and computer-assisted filters. They then indexed these pages' connections to other Facebook pages. Finally, two independent researchers classified each identified node (or Facebook page) as neutral, pro-, or anti-vaccination by reviewing its posts, the 'About' section, and the self-described category.
A pro page had content promoting best-science guidance; an anti page, by contrast, opposed this guidance; and a neutral page had community-level links with pro or anti communities. Parenting pages, for example, were considered neutral because they focus on topics such as child education, pets, and organic food.
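A compact sketch of that vetting and labelling pipeline is shown below; the keywords, page descriptions, and coder labels are hypothetical placeholders used only to illustrate the keyword filter and the agreement check between two independent coders:

```python
# Sketch of the vetting and labelling steps: a computer-assisted keyword
# filter plus reconciliation of two independent coders' labels.
# Keywords, page descriptions, and labels are hypothetical placeholders.
KEYWORDS = {"covid", "vaccine", "vaccination"}

def passes_keyword_filter(text: str) -> bool:
    """Computer-assisted filter: keep candidate pages mentioning any keyword."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in KEYWORDS)

def reconcile(label_a: str, label_b: str) -> str:
    """Accept a classification only when both independent coders agree."""
    return label_a if label_a == label_b else "needs review"

# Hypothetical candidate pages: (self-description, coder A label, coder B label).
candidate_pages = {
    "VaccineScienceFacts": ("Evidence-based vaccine information", "pro", "pro"),
    "NaturalImmunityTruth": ("Why we question the COVID vaccine", "anti", "anti"),
    "HealthyKidsParenting": ("Child education, pets and organic food", "neutral", "neutral"),
}

for page, (about, coder_a, coder_b) in candidate_pages.items():
    print(page, passes_keyword_filter(about), reconcile(coder_a, coder_b))
```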
To make the initial seed of Facebook pages as diverse as possible, the researchers repeated this process for pages posted in different languages, focused on different geographical locations, and with managers from a wide range of countries. Further, the researchers developed a mathematical model that mimicked the collective dynamics of these Facebook communities. The model's findings could be verified manually using standard calculus.
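As a rough illustration of what a calculus-checkable model looks like in practice, the sketch below uses a hypothetical linear growth equation (not the study's actual model) and compares its numerical solution with the closed-form solution derived by hand:

```python
# Hypothetical illustration only (not the study's model): a simple growth
# equation dE/dt = k * (N - E) with E(0) = 0 has the closed-form solution
# E(t) = N * (1 - exp(-k * t)), obtainable by standard calculus, which can
# be used to hand-check a numerical integration of the same equation.
import math

k, N = 0.4, 1.0          # growth rate and saturation level (arbitrary units)
dt, steps = 0.01, 1000   # simple Euler integration settings

E = 0.0
for _ in range(steps):
    E += dt * k * (N - E)

t = dt * steps
E_exact = N * (1.0 - math.exp(-k * t))
print(f"numerical E({t:.0f}) = {E:.4f}, analytic E({t:.0f}) = {E_exact:.4f}")
```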
Study findings
The classification methodology yielded a list of 1,356 interlinked Facebook pages comprising 86.7 million people. Data analysis from December 2019 to August 2020 showed that the initial conversations about COVID-19 guidance began primarily among the 501 anti communities, comprising 7.5 million individuals, well before the official declaration of the pandemic in March 2020.
Notably, there were 211 pro-vaccine communities and 644 neutral communities comprising 13 million and 66.2 million individuals, respectively. The most frequent manager locations were the USA, Canada, the UK, Australia, Italy, and France.
Nearly seven million individuals were exposed exclusively to COVID-19 guidance from non-pro communities, while 5.40 million were exposed to both pro and non-pro guidance. This imbalance was worse for people in parenting (neutral) communities, with 1.10 million exposed exclusively to COVID-19 guidance from non-pro communities. Upon randomly deleting up to 15% of COVID-19-related links from the entire network to mimic missed Facebook links, the researchers found that their results and conclusions remained robust.
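That robustness check can be pictured with a short sketch like the one below, which uses a small synthetic network and a stand-in exposure statistic (both hypothetical, not the study's data or code) to show how randomly deleting 15% of links and recomputing the statistic would work:

```python
# Sketch of the robustness check: randomly delete 15% of links to mimic
# missed Facebook links, then recompute an exposure statistic and compare
# it with the baseline. The graph and statistic are hypothetical stand-ins.
import random
import networkx as nx

def anti_to_neutral_links(G: nx.DiGraph) -> int:
    """Count links exposing neutral communities to anti-community content."""
    return sum(
        1 for emitter, receiver in G.edges
        if G.nodes[emitter]["stance"] == "anti"
        and G.nodes[receiver]["stance"] == "neutral"
    )

def delete_random_links(G: nx.DiGraph, frac: float, rng: random.Random) -> nx.DiGraph:
    """Return a copy of G with a fraction `frac` of its links removed at random."""
    H = G.copy()
    edges = list(H.edges)
    H.remove_edges_from(rng.sample(edges, int(frac * len(edges))))
    return H

# Small synthetic network standing in for the 1,356-page network.
rng = random.Random(0)
G = nx.gnp_random_graph(60, 0.08, directed=True, seed=0)
for n in G.nodes:
    G.nodes[n]["stance"] = rng.choice(["pro", "anti", "neutral"])

baseline = anti_to_neutral_links(G)
perturbed = [anti_to_neutral_links(delete_random_links(G, 0.15, rng)) for _ in range(50)]
print(f"baseline = {baseline}, mean after 15% deletion = {sum(perturbed) / len(perturbed):.1f}")
```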
Conclusions
Overall, the anti communities jumped in and dominated the conversation before the official declaration of the COVID-19 pandemic, while neutral communities (e.g., parenting communities) subsequently moved closer to extreme communities and thereby became highly exposed to their content.
Thus, parenting communities began receiving COVID-19 guidance from anti communities as early as January 2020, after which they even began adding their own guidance to the conversation. Conversely, best-science guidance from pro communities remained low throughout the study period.
The combination of network mapping and modeling suggested more possible approaches to flipping the conversation than merely removing all extreme elements from the system. Removing all extreme elements may not even be the most appropriate solution: it could come across as heavy-handed, run counter to the idea of open participation, and compromise the business model of maximizing user numbers.
Nevertheless, the study model could address the question of online misinformation more generally, beyond COVID-19 and vaccinations. It could also help predict tipping-point behavior and system-level responses to interventions in future crises.