Riding information crises: the performance of far-right Twitter users in Australia during the 2019–2020 bushfires and the COVID-19 pandemic

Pages 278–296 | Received 11 May 2022, Accepted 23 Mar 2023, Published online: 29 Apr 2023
