
Scientific Engagement on Social Media

Updated 4 July 2025
  • Scientific engagement on social media is the active participation of scientists, institutions, and the public to share and discuss research online.
  • Empirical evidence shows a significant overlap between users sharing scientific publications and those spreading unreliable content, challenging conventional misinformation models.
  • Proactive engagement and robust open science practices—including clear contextualization and network monitoring—are essential to counter misinformation online.

Scientific engagement on social media refers to the active participation of scientists, institutions, and a diverse public in the dissemination, discussion, critique, and contextualization of scientific information via online platforms, especially during periods of high societal relevance such as global crises. This engagement plays a pivotal role in shaping the quality, visibility, and reliability of scientific discourse—but also exposes both science and the broader public to risks of misinformation propagation, misappropriation of research, and the politicization of uncertain or incomplete scientific findings.

1. Overlap between Science and Misinformation: Empirical Patterns

An analysis of ∼407 million COVID-19–related tweets reveals a substantial convergence between communities sharing scientific information and those propagating unreliable content. Among approximately 1.2 million users who shared scientific publications (identified by DOIs), 45% also shared unreliable sources—classified as satire, clickbait, political, fake/hoax, or conspiracy/junk science domains according to Media Bias/Fact Check. In contrast, 14.6% of users posting unreliable content also shared DOIs, a higher overlap than between sharers of reliable news and science (11.7%).
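The overlap percentages reported above reduce to simple set arithmetic over per-user sharing records. The sketch below illustrates the computation on invented toy data; the user IDs and group assignments are hypothetical (in the study, unreliable domains were classified via Media Bias/Fact Check and scientific content was identified by DOIs).

```python
# Sketch: audience overlap between science sharers and unreliable-content
# sharers. All user IDs below are invented for illustration.

def overlap_fraction(group_a: set, group_b: set) -> float:
    """Fraction of users in group_a who also appear in group_b."""
    if not group_a:
        return 0.0
    return len(group_a & group_b) / len(group_a)

# Toy data: users keyed by what they shared.
doi_sharers = {"u1", "u2", "u3", "u4"}     # shared a scientific DOI
unreliable_sharers = {"u2", "u3", "u7"}    # shared an unreliable domain

print(overlap_fraction(doi_sharers, unreliable_sharers))  # → 0.5
print(overlap_fraction(unreliable_sharers, doi_sharers))  # ≈ 0.667
```

The asymmetry of the two calls mirrors the asymmetry in the reported figures: the fraction of science sharers who also share unreliable content need not equal the fraction of unreliable-content sharers who also share science.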

Within the user base, groups most likely to also share science were those disseminating the most problematic sources: conspiracy theories (35%), satire (34.7%), and fake news (26.7%). This pattern directly challenges the “deficit” model of misinformation, which assumes that lack of scientific exposure or engagement is the primary risk factor.

2. Characteristics of Publications Shared in Unreliable Contexts

Scientific publications more frequently disseminated by users who also share unreliable content differ systematically from those primarily shared by reliable sources. Compared with reliably shared publications, they are statistically more likely to:

  • be preprints (17.2% vs. 10.1%);
  • have been retracted (0.20% vs. 0.10%);
  • accumulate fewer citations, both absolute (average 108 vs. 370) and normalized (5.5 vs. 11.7);
  • appear in journals with lower impact factors (average 61.7 vs. 90.3; normalized 3.29 vs. 4.73).

Open-access prevalence is also slightly higher: 93.6% among unreliably shared outputs versus 91.9% among reliably shared ones.

Misuse is not confined to the preprint literature: some peer-reviewed, high-profile research is also co-opted in misinformation campaigns, particularly when the underlying topic is evolving or controversial. The data nonetheless suggest that open science, while enabling legitimate democratization of knowledge, also opens avenues for premature or miscontextualized findings to circulate widely among non-expert or even adversarial communities.

3. Scientist Engagement: Mechanisms and Effects

The activity and visibility of scientists on social platforms correlate strongly with the information environment at both local and country scales. Countries with proportionally more active scientists posting—measured by user count and posting productivity—show higher ratios of scientific content relative to misinformation.

Sentiment analysis (VADER) of tweets indicates that scientists tend to express more positive sentiment when posting or discussing scientific content, while their references to untrustworthy sources are often negative/critical—suggesting debunking or explanatory intent rather than endorsement. The data also show that scientists with larger followings tend to share a smaller proportion of scientific content but a higher share of unreliable links, possibly reflecting their greater involvement in countering viral misinformation.
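The study's sentiment signal came from VADER, a rule-based, lexicon-driven analyzer. As a minimal stand-in, the sketch below shows the general lexicon-scoring idea; the word list and weights here are invented for illustration and are not VADER's (real VADER uses a large human-validated lexicon plus heuristics for negation, punctuation, and capitalization).

```python
# Toy lexicon-based sentiment scorer, standing in for VADER.
# LEXICON entries and weights are invented for illustration only.

LEXICON = {
    "great": 2.0, "promising": 1.5, "robust": 1.0,
    "false": -1.5, "hoax": -2.5, "misleading": -2.0,
}

def sentiment(text: str) -> float:
    """Mean lexicon score of recognized words; 0.0 when none match."""
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment("promising and robust results"))  # → 1.25
print(sentiment("this hoax is misleading"))       # → -2.25
```

Applied per tweet and aggregated per user, scores of this kind support the distinction drawn above between endorsing a source (positive sentiment) and debunking it (negative or critical sentiment).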

The metric for exposure in this context is E(S) = \sum_{m \in S} K_u(m), where S is a set of tweets and K_u(m) is the follower count of the user u who posted message m. This formula provides a quantitative basis for tracking the reach of scientific (and unreliable) content, and potentially for evaluating the impact of interventions by prominent accounts.
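The exposure metric translates directly into code: sum the follower count of each message's author over the message set. The sketch below assumes hypothetical record fields (`user`, `text`) and an in-memory follower map; in practice these would come from platform data.

```python
# Exposure E(S) = sum over messages m in S of the follower count K_u(m)
# of the user u who posted m. Field names and data are hypothetical.

def exposure(tweets: list[dict], followers: dict[str, int]) -> int:
    """Sum the follower counts of each tweet's author (0 if unknown)."""
    return sum(followers.get(t["user"], 0) for t in tweets)

followers = {"lab_account": 12_000, "journalist": 50_000}
science_tweets = [
    {"user": "lab_account", "text": "New preprint out ..."},
    {"user": "journalist", "text": "Covering the new study ..."},
    {"user": "lab_account", "text": "Thread on methods ..."},
]
print(exposure(science_tweets, followers))  # → 74000
```

Note that an account posting twice contributes its follower count twice: E(S) measures cumulative potential reach of a message set, not the size of the unique audience.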

4. Open Science Practices: Benefits and Pitfalls

Open science—including preprints and open-access publishing—has enabled more rapid and transparent diffusion of scientific findings during the COVID-19 pandemic. However, it also facilitates access for non-experts or those seeking to propagate falsehoods, making it easier for preliminary, retracted, or low-impact results to feature in the architecture of misinformation. Publications with high open-access rates, and preprints in particular, were disproportionately shared by user groups that also boost unreliable content.

While open science is not itself the cause of misuse (since even well-established, peer-reviewed research can be misappropriated), its structures require complementary efforts: promoting public understanding of the provisional and self-correcting nature of science, and actively monitoring for context loss or deliberate distortion.

5. Strategies for Proactive Scientific Engagement

The analysis converges on the necessity for proactive, visible, and contextually responsive engagement by the scientific community:

  1. Increase Active Scientist Participation: Promote diverse scientist involvement to ensure high-quality, contextually adapted responses that resonate with varied audiences. Engagement should go beyond publication—explaining, contextualizing, and critiquing science in accessible language, particularly when preprints or complex findings are circulating.
  2. Preemptive Contextualization and Debunking: When sharing preliminary or controversial findings, scientists and institutions should explicitly highlight limitations, methods, and status (peer review, retraction) to preempt misinterpretation. Rapid response teams or individual “explainers” can counter viral misinformation by adding nuance and clarity.
  3. Open Science with Monitoring and Guardrails: Open science must be coupled with appropriate education on source credibility, peer review, and research limitations. Journals and preprint servers, in collaboration with platforms (using metrics such as sudden spikes in sharing by unusual clusters), can monitor for misuse and flag publications drawing disproportionate unreliable engagement.
  4. Leverage Influential Channels: Major institutions, verified high-profile scientists, and organizational accounts should serve as amplifiers of reliable information, actively debunking prominent misinformation when required.
  5. Sentiment and Network-Based Intervention: Automated sentiment analysis combined with exposure/network mapping enables identification of misinformation “hot spots” for targeted intervention.
  6. Community Engagement and Digital Literacy: Building direct relationships with prolific non-scientist disseminators of scientific content facilitates tailored education and guidance. Expanding digital literacy (distinguishing reliable from unreliable) is necessary to foster more resilient, critically engaged networked publics.
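The monitoring guardrail in strategy 3 (flagging "sudden spikes in sharing by unusual clusters") can be sketched as a simple anomaly detector over daily share counts. The sketch below uses a mean-plus-k-standard-deviations threshold; the threshold value and the toy time series are illustrative assumptions, not parameters from the study.

```python
import statistics

# Sketch of a "sudden spike" guardrail: flag days whose share count
# exceeds the series mean by more than k population standard deviations.
# The threshold k=2.0 and the counts below are illustrative assumptions.

def spike_days(daily_shares: list[int], k: float = 2.0) -> list[int]:
    """Indices of days whose count exceeds mean + k * pstdev of the series."""
    mean = statistics.mean(daily_shares)
    stdev = statistics.pstdev(daily_shares)
    if stdev == 0:
        return []  # flat series: nothing to flag
    return [i for i, n in enumerate(daily_shares) if n > mean + k * stdev]

shares = [4, 6, 5, 7, 5, 6, 80, 6]  # day 6 is an anomalous burst
print(spike_days(shares))  # → [6]
```

A production system would refine this with a rolling baseline and cluster-level breakdowns (who is driving the spike), but the core signal, a sharing rate far outside a publication's historical norm, is the same.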

6. Quantitative Metrics for Monitoring and Future Directions

A set of empirical indicators is available for practitioners and researchers:

| Metric/Characteristic | Unreliable-sharing users | Reliable-sharing users |
| --- | --- | --- |
| Fraction sharing preprints | 17.2% | 10.1% |
| Retraction rate | 0.20% | 0.10% |
| Mean normalized citations | 5.5 | 11.7 |
| Mean journal impact (normalized) | 3.29 | 4.73 |
| Exposure E(S) (sum of follower counts) | formula above | formula above |

These quantitative tools support both the assessment of information diffusion risks and the efficacy of interventions aimed at strengthening the reliability of science communication.

7. Conclusion

During global crises, the boundary between science and misinformation online is permeable and dynamic. The widespread overlap between those who share science and those who share unreliable content indicates that misinformation is not driven solely by a lack of exposure to science (the "deficit model") or by insufficient scientific literacy. Instead, concerted, accessible, and context-aware scientific engagement—together with structural improvements in open science communication and the proactive monitoring of network diffusion patterns—is essential for ensuring the reliability and credibility of scientific information in the networked public sphere.

The findings emphasize that investment in both open science and open engagement by the scientific community and institutions is central to countering misinformation and maintaining a robust public understanding of science during high-stakes events.