AI-generated misinformation in public health PR: Combatting deepfakes in vaccine advocacy
DOI: https://doi.org/10.58881/jllscs.v4i1.441
Keywords: misinformation, AI-generated misinformation, deepfakes, public health, vaccine advocacy, public health communication, misinformation detection
Abstract
This study was carried out to examine the impact of AI-generated misinformation (deepfakes) on vaccine advocacy, to investigate the sources of deepfakes, and to identify the factors contributing to their spread. The elaboration likelihood model (ELM) was employed to explain how people process and respond to AI-generated misinformation. A library research method was used, involving the collection and analysis of existing data from various secondary sources. The study revealed that social media platforms, anti-vaccine groups, malicious actors, and influencers are the primary sources of deepfakes. It was found that emotional appeal, personalization, vulnerabilities in media literacy, and confirmation bias contribute to the spread of misinformation. It was concluded that the proliferation of deepfakes has significantly eroded public trust in vaccines and health authorities, highlighting the need for a multifaceted approach to combating misinformation. It is therefore recommended that social media platforms implement robust verification mechanisms, that public health authorities develop fact-based information addressing emotional concerns, and that the public be educated in media literacy skills.
Copyright (c) 2026 Presly Obukoadata, Emmanuel Arikoro

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.