
Older Adults' Experiences with Misinformation on Social Media (2312.09354v1)

Published 14 Dec 2023 in cs.CY

Abstract: Older adults habitually encounter misinformation on social media, but little is known about their experiences with it. In this study, we combined a qualitative survey (n=119) with in-depth interviews (n=21) to investigate how older adults in America conceptualize, discern, and contextualize social media misinformation. Because misinformation on social media has previously been used to influence voting outcomes, we were particularly interested in approaching our study from a voting intention perspective. We found that 62% of participants intending to vote Democrat saw a manipulative political purpose behind the spread of misinformation, while only 5% of those intending to vote Republican believed misinformation serves a political dissent purpose. Regardless of voting intentions, most participants relied on source heuristics combined with fact-checking to discern truth from misinformation on social media. The biggest concern about misinformation, among all participants, was that it increasingly leads to reasoning biased by personal values and feelings instead of reasoning based on objective evidence. Participants intending to vote Democrat were concerned in 74% of cases that misinformation will cause an escalation of extremism in the future, while those who intended to vote Republican, were undecided, or planned to abstain were concerned that misinformation will further erode trust in democratic institutions, specifically in the context of public health and free and fair elections. During our interviews, we found that 63% of participants who intended to vote Republican were fully aware and acknowledged that Republican or conservative voices oftentimes spread misinformation, even though those voices closely align with their own political ideology.

Authors (2)
  1. Filipo Sharevski
  2. Jennifer Vander Loop