International Journal of Research and Innovation in Social Science



Fake News, Misinformation, and Its Impact on Political Decision Making in Edo State 2024 Governorship Election.

Dr. Josephine Obiajulu Omoruyi, Michael Adeniyi Thomas, Idehen Blessing Osasenaga

Department of Mass Communication, Igbinedion University, Okada, Edo State, Nigeria.

DOI: https://dx.doi.org/10.47772/IJRISS.2025.913COM0033

Received: 11 July 2025; Accepted: 19 July 2025; Published: 29 August 2025

ABSTRACT

This study examines fake news and misinformation and their impact on political decision-making in the Edo State 2024 governorship election. It primarily targeted voters in Oredo Local Government Area of Edo State, as well as political parties, candidates, and other relevant stakeholders such as media organizations and civil society groups. A mixed methods approach combining a survey and interviews was well suited to the research objectives, and agenda setting theory was applied to understand how misinformation can shape the public’s perception of which issues matter. The survey revealed a high prevalence of fake news and misinformation during the Edo State 2024 Governorship Election, a trend that points to a serious threat to informed political decision-making. The findings indicate that strategies such as fact-checking, media literacy education, and credible information campaigns are considered effective in countering fake news.

Keywords: Fake news, Misinformation, Political decision-making, Governorship, Election

INTRODUCTION

In recent years, the proliferation of fake news and misinformation has become a significant concern globally, particularly in the context of political decision-making (Allcott & Gentzkow, 2017). The advent of social media and the rapid dissemination of information through digital platforms have made it easier for false or misleading information to spread, potentially influencing public opinion and electoral outcomes (Lazer et al., 2018). Nigeria, like many other countries, has witnessed the impact of fake news and misinformation on its political landscape, with significant implications for its democratic processes (Okoro & Emmanuel, 2018).

The September 2024 Edo State Governorship Election served as a critical test case for understanding the influence of misinformation on electoral processes in Nigeria’s digital age. The election, which marked the end of Governor Godwin Obaseki’s tenure, was characterized by intense political competition and widespread use of social media platforms for campaign messaging. The electoral period witnessed an unprecedented surge in information sharing across various digital platforms, making it particularly vulnerable to the spread of false narratives and manipulated content (Armsfree, 2024).

The rise of sophisticated technologies, including artificial intelligence and deepfake videos, has further complicated the information landscape, making it increasingly difficult for voters to distinguish between authentic and fabricated content. This technological advancement, combined with the high political stakes in Edo State’s governorship race, created an environment where misinformation could potentially influence voter behavior and electoral outcomes. The situation was further exacerbated by the state’s demographic composition, with a significant young population that relies heavily on social media for political information (Ajanaku, 2024).

Understanding the impact of fake news and misinformation on political decision-making in the context of the Edo State 2024 Governorship Election is crucial for several reasons. First, it provides insights into the vulnerabilities of Nigeria’s electoral system to digital manipulation. Second, it helps identify effective strategies for combating misinformation in future elections. Finally, it contributes to the broader discourse on safeguarding democratic processes in the digital age, particularly in developing democracies where institutional mechanisms for fact-checking and information verification may be less robust.

The proliferation of false narratives and manipulated content across social media platforms creates an environment of information uncertainty. In addition, the sophisticated nature of modern misinformation campaigns, including the use of artificial intelligence and deepfake technology, poses unprecedented challenges to traditional fact-checking mechanisms. Many voters, particularly in rural areas, lack the skills needed to distinguish authentic from fabricated content, making them susceptible to manipulation through false information and, ultimately, to making ill-informed choices. This study addresses these problems as they emerge during critical electoral periods, with a view to proffering solutions.

Objectives of the study

  1. To examine the types of fake news and misinformation circulating in the context of the Edo State 2024 Governorship Election.
  2. To identify the factors contributing to the spread of fake news and misinformation in the context of the election.
  3. To explore the strategies employed by political parties, candidates, and other stakeholders to counter fake news and misinformation during the election campaign.
  4. To propose recommendations for mitigating the impact of fake news and misinformation on political decision-making in future elections in Edo State and Nigeria at large.

Research questions

  1. What is the extent and nature of fake news and misinformation circulating in the context of the Edo State 2024 Governorship Election?
  2. What are the key factors contributing to the spread of fake news and misinformation during the election campaign?
  3. What strategies are employed by political parties, candidates, and other stakeholders to counter fake news and misinformation in the context of the election?
  4. What recommendations can be made to mitigate the impact of fake news and misinformation on political decision-making in future elections in Edo State and Nigeria?

Fake News and Misinformation

Before delving into the literature, it is important to clarify what is meant by “fake news” and “misinformation.” Allcott and Gentzkow (2017) define fake news as “news articles that are intentionally and verifiably false, and could mislead readers” (p. 213). They focus on the deliberate creation and spread of false information for political or financial gain. Wardle and Derakhshan (2017) offer a broader definition of misinformation as “information that is false, but not created with the intention of causing harm,” while disinformation is “information that is false and deliberately created to harm a person, social group, organization or country” (p. 20). They argue that both misinformation and disinformation fall under the umbrella term of “information disorder.”

Freelon and Wells (2020) critique the narrow focus on verifiably false information, arguing that it “overlooks the more subtle ways in which information can be manipulated to deceive, such as through selective presentation of facts or the use of misleading framing” (p. 145). They propose a broader definition of disinformation that includes “any form of communication that is intentionally false or misleading, regardless of its specific content or form” (p. 145). This expanded definition encompasses a wider range of deceptive information practices.

Jack (2017) also critiques the term “fake news” as being too vague and politically loaded, arguing that it has been co-opted by political actors to discredit legitimate news sources and sow confusion. She proposes a more nuanced taxonomy of problematic information that includes propaganda (persuasive communication with a political agenda), disinformation (false or misleading information spread deliberately), misinformation (false or misleading information spread unintentionally), and more. This taxonomy highlights the diverse forms and motivations behind the spread of problematic information.

Farkas and Schou (2018) view “fake news” as a floating signifier that is used by different actors for different political purposes. They argue that the term has been weaponized in a discursive struggle between hegemonic and counter-hegemonic forces, with each side claiming to be the arbiter of truth. This perspective emphasizes the political dimensions of the fake news debate and the way in which the term itself has become a site of contestation.

Mejia, Beckermann, and Sullivan (2018) situate the spread of false and misleading information within the broader context of political and economic structures. They argue that the “disinformation order” is characterized by the “institutionalization of deception” and the “systemic production of misleading, false, or manipulated information” (p. 123). This view highlights the structural factors that enable and incentivize the spread of problematic information.

Prevalence of Political Misinformation

Research has documented the widespread nature of fake news and misinformation in politics, particularly in the context of high-stakes elections. One of the most comprehensive studies on this topic, conducted by Allcott and Gentzkow (2017), analyzed over 1,200 news stories about the 2016 U.S. presidential election. The researchers found that false stories related to the election were shared on Facebook over 8 million times in the final three months of the campaign alone. This staggering figure highlights the immense reach and potential impact of fake news in shaping public discourse and opinions during a crucial political event. The study also revealed that the most popular fake news stories often outperformed legitimate news articles from reputable media outlets in terms of social media engagement. This suggests that fake news can sometimes eclipse truthful reporting in terms of its ability to capture attention and go viral online. Surveys conducted around the same time period further underscore the prevalence of fake news exposure among the American public. For instance, a 2016 Pew Research Center survey found that a majority of U.S. adults reported encountering made-up news stories intended to mislead readers (Barthel et al., 2016). Taken together, these findings paint a troubling picture of the pervasiveness of political misinformation in the digital age and its ability to reach and potentially influence large segments of the population.

Building on these initial findings, Silverman and Singer-Vine (2016) delved deeper into the specific dynamics of fake news dissemination on social media during the 2016 U.S. presidential election. Their analysis revealed that in the lead-up to the election, the top 20 fake news stories generated more engagement on Facebook than the top 20 stories from major news outlets. This finding is particularly alarming, as it suggests that fake news can sometimes outperform legitimate news in terms of reach and impact. The researchers argue that the highly shareable and emotionally provocative nature of many fake news stories may contribute to their ability to spread rapidly and widely on social media platforms. Moreover, the finding that fake news can surpass legitimate journalism in terms of engagement raises concerns about the ability of truthful information to compete with and correct misinformation in the online environment. It also highlights the need for social media companies to take proactive steps to curb the spread of fake news and promote authoritative sources of information, particularly during sensitive political moments such as elections.

Guess, Nagler, and Tucker (2019) further investigated the spread of fake news during the 2016 U.S. election using a unique dataset that combined survey responses with individual-level web traffic data. This innovative approach allowed the researchers to gain a more granular understanding of who was actually consuming fake news and how much time they spent engaging with it. The study found that approximately one in four Americans visited a fake news website during the election period, suggesting that exposure to misinformation was not limited to a small fringe group but rather affected a significant portion of the population. However, the researchers also noted that the overall consumption of fake news was relatively limited, with the average person spending only a few minutes on these sites over the course of the election. This finding suggests that while fake news reached a wide audience, its ability to command sustained attention and engagement may be more limited. Importantly, the study also found that individuals who visited fake news websites were disproportionately likely to be Republican and conservative, indicating that misinformation exposure may be linked to political ideology and partisan media consumption habits. This finding has important implications for understanding the potential electoral impact of fake news and the ways in which it may interact with existing political beliefs and predispositions.

While much of the early research on fake news focused on the U.S. context, a growing body of scholarship has documented the global nature of the problem. Researchers have identified politically motivated misinformation campaigns in countries around the world, suggesting that the challenge of fake news is not limited to any one national or regional context. For instance, Machado et al. (2019) analyzed the spread of misinformation on WhatsApp during the 2018 Brazilian presidential election. They found evidence of coordinated disinformation efforts, including the use of automated accounts to mass-distribute false and misleading content. Similarly, Farooq (2018) documented the widespread use of WhatsApp to spread political misinformation during the 2018 Indian general election, including false rumors and manipulated images designed to inflame religious and ethnic tensions. In the Nigerian context, Apuke and Omar (2021) found that exposure to fake news on social media was associated with increased political polarization and decreased trust in the electoral process during the 2019 presidential election. These studies highlight the ways in which misinformation can exploit social and political fault lines, as well as the specific technological and media ecosystems of different countries.

The prevalence of fake news and misinformation in Nigerian politics has been a growing concern in recent years. During the 2019 Nigerian general election, researchers found that a significant portion of the political information shared on social media was false or misleading. Hassan and Hitchen (2020) analyzed a sample of political tweets related to the two leading presidential candidates and found that over 20% of the information shared about them was misinformation. This finding suggests that fake news played a substantial role in shaping the online discourse surrounding the election and potentially influenced public perceptions of the candidates. The researchers argue that the spread of misinformation on social media in Nigeria is facilitated by a number of factors, including the country’s highly competitive and polarized political environment, the growing use of social media as a primary source of news and information, and the presence of coordinated disinformation campaigns by political actors. They also note that the prevalence of misinformation in Nigerian politics has important implications for the integrity of the country’s democratic processes and the ability of citizens to make informed political decisions.

Wasserman and Madrid-Morales (2019) conducted a comparative study of fake news consumption in three African countries: Kenya, Nigeria, and South Africa. Using survey data, the researchers assessed the extent to which citizens in these countries were exposed to and engaged with various types of misinformation on social media platforms. They found that exposure to fake news was high across all three countries, with a significant portion of respondents reporting that they frequently encountered false or misleading information online. However, the study also revealed notable differences between the countries, with Nigerians reporting the highest levels of exposure and engagement with fake news compared to Kenyans and South Africans. The authors argue that this finding may be attributed to Nigeria’s particularly intense and competitive political environment, as well as the widespread use of social media for political communication and mobilization in the country. The study highlights the need for further research on the specific contextual factors that shape the spread and impact of fake news in different African nations, as well as the importance of developing tailored interventions to combat misinformation that take into account the unique political, social, and media landscapes of each country.

In a global context, Bradshaw and Howard (2018) conducted a comprehensive analysis of computational propaganda efforts across 48 countries. Computational propaganda refers to the use of algorithms, automated accounts, and other digital tools to manipulate public opinion and spread misinformation online. The researchers found evidence of organized social media manipulation campaigns in 28 of the countries studied, suggesting that the use of these tactics is widespread and pervasive around the world. Importantly, the study revealed that these campaigns are often conducted by domestic political actors, such as government agencies, political parties, and candidates, rather than foreign entities. This finding underscores the need to understand the role of domestic politics and power structures in driving the spread of misinformation, rather than solely focusing on external threats. The researchers also identified a range of strategies and techniques used in these campaigns, including the use of bots, fake accounts, and paid human “trolls” to amplify certain messages and drown out opposing viewpoints. These findings highlight the sophistication and diversity of computational propaganda efforts worldwide and the challenges they pose for the integrity of democratic processes and public discourse.

The research on the prevalence of fake news and misinformation in politics underscores the global nature of the problem and the need for a nuanced understanding of the specific factors that shape its manifestation in different contexts. As Wardle and Derakhshan (2017) argue, misinformation should be understood as a complex, multi-dimensional phenomenon that is deeply embedded in the social, political, and technological realities of a given society. Effectively combating fake news and misinformation requires a multidisciplinary approach that takes into account the various actors, incentives, and structural conditions that enable its creation and spread. This may include measures to increase media literacy and critical thinking skills among citizens, as well as efforts to promote transparency and accountability in the production and dissemination of news and information. It may also involve the development of technological solutions, such as algorithms to detect and flag false content, as well as regulatory frameworks to govern the responsibilities of social media platforms and other digital intermediaries. Ultimately, the prevalence of fake news and misinformation in politics around the world highlights the urgent need for policymakers, researchers, and citizens to work together to protect the integrity of democratic discourse and decision-making in the face of this growing challenge.

Impact on Political Knowledge and Attitudes

Exposure to fake news and misinformation can significantly impact citizens’ factual political knowledge, with potentially serious consequences for democratic decision-making. One experimental study conducted by Balmas (2014) demonstrated the direct effect of fake news exposure on individuals’ ability to recall accurate information about political candidates. Participants were exposed to either a fake news article or a legitimate news article about a fictional political candidate. Those who read the fake news article showed decreased accuracy in recalling the candidate’s actual positions compared to those who read the legitimate article. This finding suggests that even a single instance of exposure to misinformation can override or distort an individual’s existing knowledge about a political figure. Similarly, Kuklinski et al. (2000) analyzed panel data to investigate the relationship between belief in misinformation and overall political knowledge. They found that respondents who held misperceptions about political issues had significantly lower scores on measures of political knowledge compared to those who did not hold such misperceptions. This study highlights the cumulative impact of misinformation on individuals’ understanding of political reality over time. Together, these findings underscore the risk that fake news and misinformation pose to citizens’ ability to form accurate beliefs about political candidates and issues, which is a cornerstone of informed democratic participation.

The impact of fake news on political knowledge is not uniform across all individuals, however. Pennycook and Rand (2019) conducted a series of experiments to investigate the cognitive factors that make some people more susceptible to believing and sharing false information online. They found that individuals who engaged in more analytical thinking, as measured by cognitive reflection tests, were better able to discern between true and false news headlines. In other words, people who tend to stop and reflect on the accuracy and credibility of information before accepting it as true are less likely to fall for fake news. Conversely, those who rely more on intuitive or heuristic thinking processes are more prone to believing and spreading misinformation. Importantly, the researchers found that this effect held true regardless of individuals’ political ideology or partisan affiliations. This suggests that susceptibility to fake news is not solely a function of motivated reasoning or confirmation bias, but also depends on broader cognitive skills and habits. The findings of this study have important implications for efforts to combat fake news, as they suggest that improving individuals’ critical thinking abilities and encouraging more reflective media consumption could help to reduce the impact of misinformation on political knowledge.

Building on the idea that cognitive factors play a role in fake news susceptibility, De keersmaecker and Roets (2017) investigated the specific relationship between cognitive ability and the ability to detect false information. In a series of experiments, they presented participants with true and false news headlines and asked them to evaluate their accuracy. The researchers found that individuals with higher scores on a test of cognitive ability were better able to distinguish between true and false headlines, even when the content of the headlines aligned with their own political beliefs. This finding suggests that cognitive skills such as reasoning, problem-solving, and information processing are important tools for navigating the complex and often misleading information environment of contemporary politics. Importantly, the study also found that the impact of cognitive ability on fake news detection was independent of political ideology, suggesting that these skills can serve as a buffer against misinformation across the political spectrum. The authors argue that their findings underscore the importance of education and cognitive training as strategies for building resilience against fake news and misinformation at the individual level.

While much of the research on fake news has focused on its impact on factual political knowledge, there is also evidence that exposure to misinformation can shape political attitudes and beliefs in powerful ways. One study by Jolley and Douglas (2014) found that even brief exposure to conspiracy theories can significantly reduce individuals’ intentions to engage in politics. Participants who were asked to read a short article promoting a conspiracy theory about climate change reported lower intentions to reduce their carbon footprint or engage in political activism compared to those who read a neutral article. The authors argue that this effect is driven by the way conspiracy theories can undermine trust in political institutions and create a sense of powerlessness or disillusionment with the political system as a whole. Similarly, Thorson (2016) demonstrated the power of negative misinformation to shape attitudes towards political candidates. In an experimental study, participants who were exposed to false negative information about a politician subsequently rated that politician more negatively, even after the misinformation was corrected. This finding highlights the potential for fake news to have lasting effects on political evaluations, even in the face of contradictory evidence.

The impact of misinformation on political attitudes was further explored by Weeks (2015) in the context of the 2012 U.S. presidential election. Using a nationally representative survey sample, Weeks investigated the relationship between exposure to negative misinformation about presidential candidates and individuals’ attitudes towards those candidates. He found that participants who reported encountering more negative misinformation about a candidate subsequently expressed more negative attitudes towards that candidate, even controlling for their prior attitudes. Importantly, this effect was particularly pronounced among individuals with lower levels of political knowledge, suggesting that those with a less developed understanding of political issues may be more vulnerable to attitude change in response to misinformation. Weeks argues that these findings underscore the power of fake news to shape public opinion during election campaigns, particularly among less informed voters. He also notes that the spread of misinformation can have a polarizing effect on the electorate, as it tends to reinforce and exacerbate existing partisan divisions.

The role of political elites in spreading misinformation and shaping public attitudes is another important area of research. Van Duyn and Collier (2019) conducted an experiment in which they exposed participants to misinformation that was attributed to either a prominent political figure or an anonymous source. They found that misinformation from political elites had a significantly greater impact on participants’ attitudes and policy preferences compared to identical misinformation from an unknown source. Importantly, this effect persisted even when participants were later presented with a correction of the misinformation. The authors argue that these findings demonstrate the unique power of political elites to shape public opinion, even when they are spreading false or misleading information. They suggest that the authority and credibility associated with political leaders may lead individuals to accept their statements at face value, without subjecting them to the same level of scrutiny as other sources. The study highlights the need for greater accountability and fact-checking of elite discourse in order to mitigate the impact of misinformation on public attitudes.

Given the evident impact of fake news and misinformation on political knowledge and attitudes, it is crucial to understand the effectiveness of different strategies for counteracting these effects. One common approach is the use of corrections or fact-checks to debunk false claims. However, research by Nyhan and Reifler (2010) suggests that the effectiveness of corrections may be limited, particularly among individuals with strong prior attitudes. In a series of experiments, they found that while corrections did sometimes reduce misperceptions, they could also have a “backfire effect” in which individuals became even more entrenched in their false beliefs. This effect was particularly pronounced among conservatives who were exposed to corrections of misperceptions that aligned with their political ideology. The authors argue that this finding highlights the challenge of overcoming motivated reasoning and the tendency for individuals to reject information that contradicts their existing beliefs. They suggest that effective corrections may need to be tailored to specific audiences and framed in ways that avoid triggering defensive reactions. The study underscores the complexity of combating misinformation and the need for further research on the psychological and social factors that shape individuals’ responses to corrective information.

The research on the impact of fake news and misinformation on political knowledge and attitudes paints a troubling picture of the ways in which false and misleading information can distort public understanding and opinion. From reducing accurate recall of candidate positions to shifting evaluations of politicians and dampening political engagement, misinformation has the potential to significantly undermine the quality and integrity of democratic processes. However, the research also points to important individual-level factors that can moderate the impact of misinformation, such as cognitive skills, analytical thinking, and prior political knowledge. Understanding these factors is crucial for developing effective interventions to build resilience against fake news and misinformation at both the individual and societal levels. Moreover, the findings on the role of political elites in spreading misinformation and the limited effectiveness of corrections highlight the need for a multi-faceted approach to combating this problem that goes beyond simply debunking false claims. Ultimately, addressing the challenge of fake news and misinformation will require a sustained effort to promote media literacy, critical thinking, and responsible communication practices among citizens, journalists, and political leaders alike.

Effects on Voting Behavior

Ultimately, the impact of misinformation on political decision making is a key concern, particularly when it comes to the crucial act of voting. Several studies have sought to quantify the extent to which fake news and misleading information can sway voters’ choices at the ballot box. One notable analysis by Gunther et al. (2018) focused on the 2016 U.S. presidential election and estimated the impact of belief in three prominent fake news stories on vote share. The researchers found that aggregate belief in these false stories depressed Hillary Clinton’s vote share by 4.2 percentage points, a substantial margin in a close election. This finding suggests that fake news has the potential to significantly influence electoral outcomes by shaping voters’ perceptions and judgments of candidates. Similarly, research on the 2016 Brexit referendum in the United Kingdom by Barone et al. (2019) found that exposure to misleading statements by the Leave campaign shifted support for leaving the European Union by 1.6 percentage points. While this effect may seem relatively small, it highlights the power of misinformation to sway public opinion on consequential policy decisions, even in the context of a national referendum with high stakes and intense scrutiny.

Building on these findings, Guess, Nyhan, and Reifler (2020) investigated the impact of exposure to fake news during the 2016 U.S. presidential election using a unique combination of survey data and individual-level web traffic records. This approach allowed the researchers to measure not only the prevalence of fake news exposure but also its distribution across the population. They found that exposure to fake news was heavily concentrated among a relatively small group of people, with just 10% of Americans accounting for 60% of total fake news consumption. This finding suggests that the direct effects of fake news may be limited to a subset of the electorate, rather than uniformly distributed. However, the researchers also estimated that this level of exposure, concentrated as it was, could still have had decisive effects in a close election like that of 2016. Specifically, they calculated that fake news exposure could have shifted the overall vote share by up to 0.7 percentage points, a margin that could easily prove consequential in a tight race. This study highlights the importance of considering not just the aggregate impact of fake news but also its distribution and concentration among particular segments of the population.

The impact of misinformation on voting behavior has also been studied outside of the U.S. context. Cantarella, Fraccaroli, and Volpe (2019) examined this issue in the context of the 2018 Italian general election, using a quasi-experimental design to estimate the effect of exposure to fake news on social media. The researchers leveraged a unique feature of the Italian electoral system, in which voters in different regions go to the polls on different dates, to compare the voting intentions of individuals before and after a major fake news event on social media. They found that exposure to misinformation had a significant impact on voting intentions, particularly among undecided voters and those with lower levels of political interest. In some regions, the researchers estimated that misinformation could have shifted the vote share by up to 2.3 percentage points, a sizable effect in a fragmented, multi-party system. This study underscores the potential for fake news to influence electoral outcomes even in contexts with different political and media landscapes from the United States. It also highlights the particular vulnerability of certain groups, such as the politically disengaged or undecided, to the effects of misinformation.

While the evidence suggests that fake news and misinformation can indeed sway votes, it is important not to overstate their electoral impact or to assume that they are the sole or even primary driver of voting behavior. In their analysis of the 2016 U.S. presidential election, Allcott and Gentzkow (2017) put the influence of fake news in perspective by comparing its consumption to another common form of political persuasion: television advertising. The researchers calculated that in order for fake news to have changed the outcome of the election, the average voter would need to have placed as much weight on a single fake news article as they did on 36 television commercials. Based on this comparison, they estimated that fake news would have needed to be about six times more influential than it actually was to have decisively swayed the election. This finding underscores the importance of considering the relative power of fake news compared to other sources of information and persuasion in the political environment. While misinformation can certainly have an impact, it is likely to be one factor among many that shape voters’ ultimate choices.

The impact of misinformation on political behavior may also extend beyond the act of voting itself. Pennycook et al. (2020) investigated the effects of fake news exposure on intentions to engage in other forms of political participation, such as donating to campaigns, volunteering, or attending political events. Across a series of survey experiments, the researchers found that exposure to fake news had modest and inconsistent effects on these behavioral intentions. In some cases, exposure to false stories did increase participants’ expressed willingness to engage in certain political activities. However, these effects were generally small in magnitude and varied considerably across different studies and contexts. The authors suggest that while fake news may have the potential to influence political engagement beyond voting, its impact in this domain may be more limited and contingent on specific circumstances. They also note the importance of distinguishing between self-reported intentions and actual behavior, as the former may be more susceptible to the influence of misinformation.

Even if fake news and misinformation can be shown to influence voters’ stated beliefs, intentions, and choices, their ultimate impact on real-world behavior may be more limited than it appears. Mercier (2020) makes this argument, drawing on the concept of “preference falsification,” or the tendency for people to publicly express opinions that differ from their privately held beliefs. In the context of politics, this might manifest as individuals claiming to support a candidate or position that aligns with perceived social desirability, while actually harboring different preferences. Mercier suggests that a similar dynamic may be at play with fake news: even if exposure to misinformation leads people to express different beliefs or intentions in a survey or public setting, they may revert to their true, underlying preferences when casting their votes in the privacy of the voting booth. In other words, the apparent impact of fake news on political attitudes may not always translate into real changes in behavior. While this argument is largely theoretical, it highlights the need for research that directly measures the behavioral consequences of misinformation, rather than relying solely on self-reported beliefs or intentions.

The complex and contingent nature of misinformation’s effects on voting behavior is further underscored by the range of findings across different studies and contexts. While some analyses, like those of Gunther et al. (2018) and Cantarella et al. (2019), suggest that fake news can substantially sway vote shares, others, like that of Allcott and Gentzkow (2017), indicate that its impact may be more limited relative to other factors. Similarly, while Guess et al. (2020) find evidence of potentially decisive effects concentrated among a small subset of the population, Pennycook et al. (2020) document more modest and variable impacts on political engagement beyond voting. These divergent results suggest that the electoral consequences of misinformation are likely to depend on a range of contextual factors, such as the specific nature and content of the false stories, the channels through which they spread, the characteristics and predispositions of the individuals exposed to them, and the broader political and informational environment. Fully understanding these complexities will require further research that carefully specifies the conditions under which fake news is most likely to influence voters and that directly measures its impact on real-world political behavior. In the meantime, the existing evidence suggests that while the effects of misinformation on voting are certainly cause for concern, they should be considered as one piece of a larger puzzle of political persuasion and decision making.

Strategies to Combat Misinformation

Considering the potential of misinformation to undermine democratic decision making, researchers have devoted significant attention to examining strategies for combating its spread and influence. One prominent approach that has gained traction in recent years is fact-checking. The basic premise of fact-checking is to verify the accuracy of claims made in public discourse and to provide corrective information when false or misleading statements are identified. A growing body of research has investigated the effectiveness of fact-checking in reducing belief in misinformation and promoting more accurate understanding of political issues. For example, studies by Porter et al. (2018) and Walter et al. (2020) have found that exposure to fact-checks can indeed improve the accuracy of individuals’ beliefs about the topics covered. These findings suggest that fact-checking can serve as a valuable tool for countering the effects of misinformation by providing credible, authoritative information to the public. However, the impact of fact-checking may not be uniform across all groups. Nyhan and Reifler (2012) found that the effectiveness of fact-checks in reducing misperceptions was largely limited to individuals with higher levels of political knowledge. This suggests that for fact-checking to have broad impact, it may need to be paired with efforts to improve overall political literacy and engagement.

Building on the insights from fact-checking research, Lewandowsky et al. (2012) have proposed a more comprehensive framework for debunking misinformation. Their approach emphasizes the importance of providing detailed refutations that directly address and discredit false claims, rather than simply labeling them as incorrect. This involves presenting clear, factual information that contradicts the misinformation and explains why it is wrong. The authors also stress the value of offering alternative explanations that provide a coherent, plausible account of the issue at hand. By giving people a compelling narrative to replace the misinformation, this strategy can help to fill the cognitive void left by debunking and reduce the risk of the false beliefs persisting. Another key element of Lewandowsky et al.’s framework is fostering an open and respectful dialogue with the individuals holding misperceptions. Rather than dismissing or attacking them, the authors recommend approaching these conversations with empathy and a genuine desire to understand their perspective. This can help to build trust and create a more receptive environment for corrective information. At the same time, Lewandowsky et al. caution against excessive repetition of the misinformation itself, even in the context of debunking. Research has shown that repeated exposure to false claims, even when they are being refuted, can inadvertently reinforce them in people’s minds through the illusory truth effect. Instead, the authors recommend focusing primarily on the facts and the counter-narrative, rather than dwelling on the misinformation.

The effectiveness of different fact-checking formats and strategies in the context of social media has been another area of research interest. Vraga and Bode (2017) conducted a series of experiments to investigate how variations in the presentation of fact-checks on social media platforms can influence their impact on users’ beliefs and behaviors. They compared two common approaches: direct rebuttals, which explicitly call out and correct false claims, and more subtle “snopes” boxes, which provide additional context and information without directly challenging the misinformation. The researchers found that both formats were effective in reducing participants’ belief in the false claims presented. However, the direct rebuttals were more likely to be noticed and shared by users, suggesting that they may have greater visibility and reach on social media. At the same time, the snopes-style fact-checks were less likely to be seen as confrontational or partisan, which could make them more palatable to a broader audience. These findings highlight the importance of considering not just the content of fact-checks but also their form and delivery when designing interventions for social media contexts.

While fact-checking can be an effective tool for correcting misinformation after it has spread, some researchers have explored the potential of proactive interventions that aim to prevent false content from gaining traction in the first place. Pennycook and Rand (2019) conducted a series of experiments investigating the impact of what they call “accuracy nudges” on social media sharing behavior. In these studies, participants were shown a mix of true and false news headlines and asked about their willingness to share each story on social media. In the treatment condition, participants were simply asked to rate the accuracy of each headline before deciding whether to share it. The researchers found that this subtle prompt to consider accuracy significantly reduced participants’ likelihood of sharing false stories, by up to 50% in some experiments. Importantly, this effect was observed across the political spectrum, among both liberals and conservatives. Pennycook and Rand argue that these findings demonstrate the power of encouraging people to engage in more reflective, deliberative thinking when consuming and sharing information online. By shifting attention to the concept of accuracy, even briefly, it may be possible to reduce the impulsive, uncritical propagation of misinformation. The researchers suggest that social media platforms could implement similar accuracy prompts as a scalable intervention to combat the spread of false content.

While technological solutions and platform-level interventions have a role to play in combating misinformation, many experts argue that promoting individual media literacy skills is a crucial long-term strategy. Media literacy education aims to equip people with the knowledge and tools to critically evaluate the information they encounter online, to understand the ways in which media can influence beliefs and behaviors, and to make informed decisions about their own media consumption and creation. Research has shown that well-designed media literacy interventions can be effective in improving people’s ability to distinguish credible information from misinformation. For example, Guess et al. (2020) conducted a randomized controlled trial of a media literacy intervention delivered to participants via a social media platform. They found that individuals who received the intervention, which focused on teaching strategies for evaluating the credibility of online information, were significantly better at identifying false news stories than those in the control group. Similarly, McGrew et al. (2017) developed and tested a curriculum for teaching students how to evaluate the credibility of digital sources. They found that students who participated in the curriculum showed significant improvements in their ability to assess the reliability of information and to detect indicators of misinformation, compared to a control group.

While much of the research on media literacy has focused on teaching critical evaluation skills, some scholars argue for a more comprehensive approach. Bulger and Davison (2018) propose a framework for media literacy education that goes beyond just helping individuals spot false content, to fostering a deeper understanding of the role of media in society and the ways in which media messages shape perceptions and beliefs. Their model includes components such as understanding media structures and economics, recognizing patterns of media representation and bias, and learning to create and communicate effectively using media tools. The authors argue that this more holistic approach to media literacy is necessary to prepare individuals for active, informed engagement in the digital public sphere. By empowering people not just to critically consume media but also to thoughtfully create and share their own messages, this model aims to cultivate a more participatory and resilient information ecosystem.

Ultimately, most experts agree that effectively countering misinformation will require a multifaceted approach that addresses both the supply and demand sides of the problem (Lazer et al., 2018). On the supply side, this may involve efforts by social media platforms and other information providers to identify and label false content, to limit its spread through algorithmic interventions, and to reduce the financial incentives for creating and disseminating misinformation. It may also require stronger fact-checking networks and partnerships between platforms, news organizations, and research institutions. On the demand side, empowering individuals to be more discerning and responsible consumers of information is key. This can involve the kinds of media literacy education and technological nudges discussed above, as well as efforts to boost overall trust in reliable information sources and to foster a culture of truth-seeking and critical inquiry.

One innovative approach to building demand-side resilience to misinformation is the use of “inoculation” strategies. Roozenbeek and van der Linden (2019) developed an online game called “Bad News” which seeks to preemptively expose people to the techniques used in the creation of fake news, so that they can better recognize and resist these techniques when encountered in the real world. In the game, players take on the role of a fake news creator, learning to use tactics like emotional manipulation, polarization, and conspiracy theories to gain followers and influence. Through this simulated experience, players develop a kind of “mental antibody” against common misinformation strategies. Initial studies have found that playing “Bad News” can significantly improve people’s ability to spot and reject fake news, with effects lasting for several months. The success of this gamified inoculation approach suggests that engaging, interactive experiences can be a powerful tool for building cognitive defenses against misinformation.

Combating the complex problem of misinformation will undoubtedly require sustained effort on multiple fronts, from improving the quality and reach of fact-checking, to promoting widespread media literacy, to designing effective technological and regulatory interventions. By understanding the psychological, social, and structural factors that enable the spread of false and misleading information, and by rigorously evaluating the impact of different counter-measures, researchers and practitioners can work towards a more informed, deliberative, and resilient information ecosystem. Ultimately, the fight against misinformation is a fight for the health and integrity of democratic society itself. As such, it demands the attention, ingenuity, and commitment of all stakeholders, from individual citizens to media organizations to policymakers. While the challenges are formidable, the research reviewed here offers cause for optimism, demonstrating the potential of evidence-based strategies to reduce the influence of misinformation and to empower people to make more informed, discerning choices about the information they consume and share.

Theoretical Framework

Agenda Setting Theory

The Agenda Setting Theory was formally introduced by Maxwell McCombs and Donald Shaw in 1972 through their groundbreaking study known as “The Chapel Hill Study,” published in Public Opinion Quarterly under the title “The Agenda-Setting Function of Mass Media.” Agenda setting theory describes the media’s ability to shape the public’s perception of issue importance (McCombs & Shaw, 1972). According to this theory, the issues that receive the most media coverage are the ones that the public comes to see as the most important, regardless of their objective significance. In other words, the media has the power to set the agenda for public discourse and influence which topics are at the forefront of people’s minds.

In the context of fake news, agenda setting theory has been applied to understand how misinformation can shape the public’s perception of what issues matter (Guo & Vargo, 2018). When fake news stories gain widespread attention and coverage, even if that coverage is critical or debunking in nature, they can still have the effect of making the topics they address seem more salient and important to the public.

For example, a study by Vargo et al. (2018) found that fake news stories during the 2016 U.S. presidential election were successful in setting the agenda for mainstream media coverage. The researchers found that the topics and issues addressed by fake news stories were subsequently picked up and covered by traditional news outlets, even if the specific claims made in the fake stories were not endorsed or were actively debunked. This suggests that fake news can influence the agenda of public discourse even when it is recognized as false.

The agenda setting power of fake news has important implications for political decision making. If misinformation is able to shape the issues that the public sees as important, it can potentially distort the priorities and concerns that individuals bring to bear when evaluating political candidates or policies. This can have downstream effects on voting behavior and other forms of political participation.

Understanding the agenda setting role of fake news also highlights the importance of media literacy and critical thinking skills. If individuals are able to recognize when a story or issue is being overhyped or manipulated, they may be less susceptible to having their priorities shaped by misinformation. This underscores the value of educational interventions that aim to equip individuals with the tools to critically evaluate the information they encounter in the media.

Similarly, motivated reasoning theory posits that individuals process information in ways that conform to their pre-existing beliefs and desires (Kunda, 1990). It suggests that people are not always objective when evaluating information, but rather are motivated to reach conclusions that align with their existing attitudes and goals.

METHODOLOGY       

The study adopted a mixed methods approach combining a survey and interviews, which was well suited to the research objectives. The population of the study was the electorate of Oredo Local Government Area in Edo State, Nigeria, that is, citizens aged 18 and above who were registered to vote in the September 2024 gubernatorial election. Oredo Local Government Area was selected as the case study location. According to Independent National Electoral Commission (INEC) data from the Continuous Voter Registration exercise, there were approximately 313,553 registered voters in Oredo for the 2024 elections. The sample size of 400 was determined using Yamane’s formula with a 95% confidence level and a 0.05 margin of error.
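As a minimal sketch of the sample size calculation (not part of the original questionnaire or analysis materials), the Python lines below apply Yamane’s formula to the figures reported above; the result of roughly 399.5 is taken to have been rounded up to the 400 respondents sampled.

# Yamane's formula: n = N / (1 + N * e^2)
N = 313_553         # registered voters in Oredo Local Government Area (INEC CVR figure cited above)
e = 0.05            # margin of error at the 95% confidence level
n = N / (1 + N * e ** 2)
print(round(n, 2))  # prints 399.49, consistent with the reported sample size of 400 after rounding up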

Data presentation and analysis

Table 1: Age Range

Responses Frequency  Percentage %
18-24 95 27.1
25-34 105 30.0
35-44 80 22.9
45-54 50 14.3
55 years and above 20 5.7
Total 350 100

(Source: Field Survey, 2025)

Table 2: Gender of the Respondents

Responses Frequency Percentage (%)
Male 188 53.7
Female 162 46.3
Total 350 100

(Source: Field Survey, 2025)

Table 3: Frequency Distribution Showing Respondents’ Educational Qualifications

Response Frequency Percentage
No formal education 20 5.7
Primary 43 12.3
Secondary 102 29.1
Tertiary 185 52.9
Total 350 100

(Source: Field Survey, 2025)

Table 4: Frequency Distribution showing Respondents’ Occupation

Occupation Frequency Percentage
Student 33 9.4
Civil servant 49 14.0
Private sector employee 85 24.3
Self-employed 102 29.2
Unemployed 81 23.1
Total 350 100

(Source: Field Survey, 2025)

Table 5: Frequency Distribution showing Respondents’ Marital Status

Response Frequency Percentage
Single 192 54.9
Married 140 40.0
Divorced 5 1.4
Separated 13 3.7
Total 350 100

(Source: Field Survey, 2025)

Table 6: Types of Fake News and Misinformation (frequencies with percentages in parentheses)

6. Have you seen stories promoting falsehoods about a candidate’s background, qualifications or experience? Yes 295 (84.3%); No 50 (14.3%); Not sure 5 (1.4%)
7. Have you come across conspiracy theories or unverified allegations of corruption against any of the candidates? Yes 250 (71.4%); No 60 (17.2%); Not sure 40 (11.4%)
8. Have you encountered doctored images or videos aimed at distorting a candidate’s image or words? Yes 155 (44.3%); No 125 (35.7%); Not sure 70 (20.0%)
9. Have you seen fake opinion polls or surveys misrepresenting public sentiment about the candidates? Yes 295 (84.3%); No 50 (14.3%); Not sure 5 (1.4%)
10. Do you encounter fabricated or misleading stories about the Edo State governorship candidates on social media? Yes 300 (85.7%); No 50 (14.3%); Not sure 0 (0.0%)
11. Do you notice stories from unknown or suspicious sources posing as legitimate news about the election? Yes 260 (74.2%); No 60 (17.2%); Not sure 30 (8.6%)

(Source: Field Survey, 2025)

Table 7: Factors Contributing to the Spread of Fake News (frequencies with percentages in parentheses)

12. Partisanship and desire to promote preferred candidates contributes to people spreading fake news and misinformation. SA 154 (61.6%); A 75 (30.0%); SD 12 (4.8%); D 9 (3.6%)
13. Lack of adequate fact-checking and verification by the public enables the spread of fake election stories. SA 129 (51.6%); A 86 (34.4%); SD 24 (9.6%); D 11 (4.4%)
14. Sensationalism and the attention-grabbing nature of fake news encourages more sharing. SA 149 (59.6%); A 72 (28.8%); SD 15 (6.0%); D 14 (5.6%)
15. Distrust of mainstream media leads people to turn to and spread information from unverified alternative sources. SA 170 (68.0%); A 54 (21.6%); SD 9 (3.6%); D 17 (6.8%)
16. Low media literacy skills among the public make it hard to identify and resist fake news. SA 112 (44.8%); A 95 (38.0%); SD 29 (11.6%); D 14 (5.6%)
17. The anonymity of social media and messaging apps emboldens the sharing of misinformation. SA 98 (39.2%); A 105 (42.0%); SD 34 (13.6%); D 13 (5.2%)
18. Insufficient legal consequences or penalties for peddling political fake news encourages the practice. SA 101 (40.4%); A 79 (31.6%); SD 15 (6.0%); D 55 (22.0%)
19. Foreign interference, such as by hackers or troll farms, amplifies the fake news problem. SA 126 (50.4%); A 56 (22.4%); SD 42 (16.8%); D 26 (10.4%)

(Source: Field Survey, 2025)

Table 8: Strategies to Counter Fake News (frequency, with percentage in parentheses; SA = Strongly Agree, A = Agree, SD = Strongly Disagree, D = Disagree)

20. Fact-checking initiatives by media houses, civil society and election authorities are effective at countering fake news. SA: 109 (43.6); A: 100 (40.0); SD: 28 (11.2); D: 13 (5.2)
21. Media literacy education for the public is crucial for building resilience to political misinformation. SA: 95 (38.0); A: 125 (50.0); SD: 25 (10.0); D: 5 (2.0)
22. Credible information campaigns by election management bodies can neutralize viral fake news. SA: 156 (62.4); A: 75 (30.0); SD: 12 (4.8); D: 7 (2.8)
23. Promoting ethical journalism practices and self-regulation in the media industry reduces misinformation. SA: 134 (53.6); A: 98 (39.2); SD: 9 (3.6); D: 9 (3.6)
24. Technological solutions like artificial intelligence are useful for detecting and blocking fake content. SA: 129 (51.6); A: 94 (37.6); SD: 22 (8.8); D: 5 (2.0)
25. Investigating and prosecuting perpetrators of political misinformation can deter the practice. SA: 138 (55.2); A: 89 (35.6); SD: 12 (4.8); D: 11 (4.4)
26. Encouraging responsible social media use and self-restraint by politicians limits misinformation. SA: 146 (58.4); A: 91 (36.4); SD: 10 (4.0); D: 3 (1.2)
27. International pressure and diplomacy can check foreign sponsorship of fake election news. SA: 98 (39.2); A: 129 (51.6); SD: 13 (5.2); D: 10 (4.0)

(Source: Field Survey, 2025)

Table 9: Recommendations for Future Elections (frequency, with percentage in parentheses; SA = Strongly Agree, A = Agree, SD = Strongly Disagree, D = Disagree)

28. Political parties should commit to refraining from manipulation and focus on issue-based campaigns. SA: 126 (50.4); A: 88 (35.2); SD: 23 (9.2); D: 13 (5.2)
29. The government should strengthen laws and enforcement against political misinformation. SA: 140 (56.0); A: 87 (34.8); SD: 14 (5.6); D: 9 (3.6)
30. Media organizations must invest in fact-checking systems and uphold truth-telling standards. SA: 98 (39.2); A: 109 (43.6); SD: 8 (3.2); D: 35 (14.0)
31. The education system should prioritize digital literacy skills to empower informed voters. SA: 128 (51.2); A: 72 (28.8); SD: 39 (15.6); D: 11 (4.4)
32. Civil society groups should sustain misinformation surveillance and sensitization throughout the electoral cycle. SA: 154 (61.6); A: 75 (30.0); SD: 12 (4.8); D: 9 (3.6)
33. Electoral authorities need to proactively disseminate timely, accurate information to fill the void exploited by fake news. SA: 129 (51.6); A: 86 (34.4); SD: 24 (9.6); D: 11 (4.4)
34. Social media platforms must enhance content moderation and user verification policies during sensitive election periods. SA: 149 (59.6); A: 72 (28.8); SD: 15 (6.0); D: 14 (5.6)
35. International election observers should integrate fake news monitoring into their mandate. SA: 170 (68.0); A: 54 (21.6); SD: 9 (3.6); D: 17 (6.8)

(Source: Field Survey, 2025)

DISCUSSION OF FINDINGS

Research Question 1. What is the extent and nature of fake news and misinformation circulating in the context of the Edo State 2024 Governorship Election?

To answer this question, six items in the questionnaire were designed specifically for this purpose, and the findings reveal that fake news and misinformation circulated widely in the context of the Edo State 2024 governorship election. This is consistent with Nyhan and Reifler (2012) on the growing circulation of misinformation around elections: elections in Nigeria, like many modern elections worldwide, have seen a significant increase in the circulation of fake news and misinformation. This phenomenon, while not unique to Edo State, has gained traction due to the growing reliance on digital platforms for political discourse and information sharing. The extent and nature of misinformation during this election cycle were substantial, with several factors contributing to its spread and impact.

The findings show that the spread of fake news during the Edo State 2024 election was widespread, amplified primarily through social media channels such as Facebook, Twitter, WhatsApp, and blogs. Misinformation ranged from fabricated claims about the candidates’ backgrounds and policies to rumors intended to undermine the credibility of the election process itself. Reports indicated that false information about the voting process, such as claims of rigging, vote-buying, and electoral violence, circulated widely. These rumors often had little or no basis in reality but were effective in shaping public perception, particularly among voters who lacked access to accurate and reliable information.

The extent and nature of fake news and misinformation in the Edo State 2024 governorship election were significant. The deliberate spread of false information, often designed to manipulate public opinion or undermine trust in the electoral process, had far-reaching implications for voter behavior and election integrity. Given the growing reliance on digital platforms for political communication, addressing the spread of misinformation in future elections will require comprehensive efforts from political actors, media organizations, fact-checking bodies, and the general public to ensure the integrity of the democratic process.

Research Question 2.  What are the key factors contributing to the spread of fake news and misinformation during the election campaign?

This research question sought to identify the key factors behind the spread of fake news and misinformation during the Edo State 2024 governorship election. The findings show that the spread can be attributed to several key factors, with the rapid growth of digital media being the most significant. Social media platforms such as WhatsApp, Facebook, and Twitter played a pivotal role in amplifying false narratives due to their viral nature. These platforms allowed unverified content to be shared quickly and widely, often without proper checks on accuracy. Political campaigns, supporters, and interest groups leveraged these platforms to disseminate misleading information about opponents, voting processes, and election outcomes. With the increasing penetration of smartphones and internet access across the state, even individuals in remote areas were exposed to, and often shared, this false content, contributing to its spread. The speed and scale at which misinformation circulated on social media made it increasingly difficult for fact-checkers and credible news outlets to keep up with debunking these claims in real time.

Another key factor contributing to the spread of fake news was political polarization and the strategic use of misinformation by various political actors. As the election intensified, rival political parties and their supporters engaged in disinformation campaigns to undermine the credibility of their opponents and sway public opinion. Misinformation about candidates’ personal lives, false accusations of corruption, and exaggerated claims about their policies were common tactics used to tarnish reputations and diminish trust. Furthermore, the lack of media literacy among a significant portion of the electorate compounded the problem, as many voters were unable to critically assess the information they received. Combined with limited access to traditional, reliable news sources, voters often turned to social media as their primary source of information, making them more vulnerable to the influence of fake news. This created a perfect storm for the proliferation of false narratives during the election campaign.

Research Question 3. What strategies are employed by political parties, candidates, and other stakeholders to counter fake news and misinformation in the context of the election?

The findings in this study showed that to counter the spread of fake news and misinformation during the Edo State 2024 governorship election, political parties, candidates, and other stakeholders employed a variety of strategies aimed at both curbing misinformation and informing the public. One of the key strategies was the active engagement of fact-checking organizations. The Nigerian Fact-Checkers’ Coalition (NFC), along with other fact-checking bodies such as Africa Check and FactCheckHub, played a crucial role in monitoring and debunking false claims in real time. These organizations worked to verify the accuracy of political statements, social media posts, and news articles, ensuring that voters had access to credible information. Additionally, political parties and candidates engaged in media campaigns to directly address and refute misinformation. This included public statements, press conferences, and the use of official social media accounts to clarify issues, correct falsehoods, and promote their own policy platforms.

Another significant strategy involved collaboration between political stakeholders, media outlets, and technology companies. The Independent National Electoral Commission (INEC) worked with media organizations to ensure that election-related information was accurate and widely disseminated. INEC and other electoral bodies also issued guidelines on how to identify and report fake news, urging voters to rely on credible news sources. Political candidates and their teams worked to create educational campaigns aimed at raising awareness about the dangers of misinformation, encouraging voters to verify information before believing or sharing it. In some instances, media houses and journalists were trained on fact-checking techniques and tools to better scrutinize political content. This multi-pronged approach involving fact-checking, media collaboration, and public awareness campaigns represented a concerted effort by stakeholders to mitigate the harmful effects of misinformation during the election period.

Research Question 4. What recommendations can be made to mitigate the impact of fake news and misinformation on political decision-making in future elections in Edo State and Nigeria?

This objective aimed to establish the recommendations that can be made to mitigate the impact of fake news and misinformation on political decision-making in future elections in Oredo, Edo State. The findings suggest that a comprehensive approach involving education, regulation, and collaboration is crucial. First, enhancing media literacy across the population should be a top priority. Implementing widespread educational programs that teach citizens how to critically evaluate sources of information, identify misinformation, and distinguish between credible and unverified content is essential. These programs should target schools, universities, and communities, equipping voters with the skills necessary to make informed decisions and avoid being swayed by false narratives. Additionally, encouraging the use of fact-checking tools and platforms as part of electoral awareness campaigns can empower voters to seek out the truth and resist the spread of misinformation.

Secondly, stronger regulatory frameworks are needed to hold social media platforms and other digital content providers accountable for the spread of fake news. While many platforms have made efforts to curb misinformation, more robust policies are required to ensure that false political content is quickly flagged, reviewed, and removed. This could involve partnerships between the government, social media companies, and independent fact-checking organizations to create a unified strategy for tackling disinformation. Furthermore, stricter penalties for individuals or groups found intentionally spreading false information to influence elections could act as a deterrent. Finally, fostering collaboration among political parties, civil society organizations, media outlets, and technology companies is key to building a united front against misinformation. By working together, stakeholders can create an ecosystem of trust, transparency, and accountability that ensures the integrity of the electoral process and helps safeguard democracy in future elections.

CONCLUSION

The survey underscores the serious impact of fake news and misinformation on the Edo State 2024 Governorship Election. The widespread exposure to false claims, manipulated media, and deceptive sources highlights how misinformation can distort public perception and hinder informed political choices. This growing trend poses a significant challenge to the integrity of democratic processes. Addressing it requires greater media literacy, stronger fact-checking efforts, and increased public awareness to safeguard political decision-making and ensure a more transparent electoral environment.

The findings suggest that a multifaceted approach is essential for effectively combating fake news and misinformation in the electoral process. Strategies such as fact-checking, media literacy education, and credible information campaigns are widely recognized as impactful in building public resilience. Ethical journalism and self-regulation within the media industry, alongside technological solutions like AI, further strengthen the fight against misinformation. Holding perpetrators accountable through legal action and leveraging international cooperation to address foreign interference are also important steps. Together, these measures can help protect the integrity of elections and promote informed political participation.

Factors such as partisanship, low media literacy, lack of effective regulation, and the unchecked use of digital platforms all contributed to the problem. While efforts were made by stakeholders to counter misinformation through fact-checking and public awareness, these measures need to be scaled up and sustained. To protect the integrity of future elections in Edo State and Nigeria at large, a coordinated approach involving education, technology, regulation, and responsible political communication is essential.

RECOMMENDATION

In line with the findings above, the following recommendations are suggested:

  1. The government should implement targeted media literacy programs across schools, communities, and voter education platforms to equip citizens with the skills to critically assess and verify information, especially during election periods.
  2. Support the growth of independent fact-checking organizations and encourage collaboration with media houses and electoral bodies to swiftly identify and debunk fake news before it spreads widely.
  3. Establish and enforce clear legal consequences for individuals or groups found guilty of deliberately spreading political misinformation, while ensuring these measures do not infringe on freedom of expression.
  4. Encourage political parties, candidates, and their supporters to commit to ethical campaigning by refraining from the use of false or misleading content, and promoting transparency and accountability in their communications.

REFERENCES

  1. Ajanaku, A., Chioma, I., & Olatunji, A. (2024). Report on West Africa regional countering disinformation program: Information environment assessment ahead of 2024 Edo State off-cycle election. West Africa Regional Countering Disinformation Program [Technical Report].
  2. Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236.
  3. Apuke, O. D., & Omar, B. (2021). Fake news and COVID-19: Modelling the predictors of fake news sharing among social media users. Telematics and Informatics, 56, 101475.
  4. Balmas, M. (2014). When fake news becomes real: Combined exposure to multiple news sources and political attitudes of inefficacy, alienation, and cynicism. Communication Research, 41(3), 430-454.
  5. Barone, G., D’Acunto, F., & Narciso, G. (2019). Fake news, political scandal, and the vote: A laboratory experiment. Trinity Economics Papers, No. 0619.
  6. Barthel, M., Mitchell, A., & Holcomb, J. (2016). Many Americans believe fake news is sowing confusion. Pew Research Center, 15.
  7. Boddy, C. R. (2016). Sample size for qualitative research. Qualitative Market Research: An International Journal, 19(4), 426-432.
  8. Bradshaw, S., & Howard, P. N. (2018). Challenging truth and trust: A global inventory of organized social media manipulation. The Computational Propaganda Project. https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2018/07/ct2018.pdf
  9. Bulger, M., & Davison, P. (2018). The promises, challenges, and futures of media literacy. Journal of Media Literacy Education, 10(1), 1-21.
  10. Cantarella, M., Fraccaroli, N., & Volpe, R. (2019). Does fake news affect voting behavior? DEMB Working Paper Series, No. 146.
  11. De keersmaecker, J., & Roets, A. (2017). ‘Fake news’: Incorrect, but hard to correct. The role of cognitive ability on the impact of false information on social impressions. Intelligence, 65, 107-110.
  12. Farkas, J., & Schou, J. (2018). Fake news as a floating signifier: Hegemony, antagonism and the politics of falsehood. Javnost – The Public, 25(3), 298-314.
  13. Farooq, G. (2018). Politics of fake news: How WhatsApp became a potent propaganda tool in India. Media Watch, 9(1), 106-117.
  14. Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002.
  15. Freelon, D., & Wells, C. (2020). Disinformation as political communication. Political Communication, 37(2), 145-156.
  16. Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27), 15536-15545.
  17. Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5(1), eaau4586.
  18. Guess, A., Nyhan, B., & Reifler, J. (2018). Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign. European Research Council, 9(3), 4.
  19. Guess, A., Nyhan, B., & Reifler, J. (2020). Exposure to untrustworthy websites in the 2016 US election. Nature Human Behaviour, 4(5), 472-480.
  20. Gunther, R., Beck, P. A., & Nisbet, E. C. (2018). Fake news may have contributed to Trump’s 2016 victory. Ohio State University: Communication. https://www.documentcloud.org/documents/4429952-Fake-News-May-Have-Contributed-to-Trump-s-2016.html
  21. Guo, L., & Vargo, C. (2018). “Fake News” and emerging online media ecosystem: An integrated intermedia agenda-setting analysis of the 2016 U.S. Presidential Election. Communication Research, 47(2), 178–200.
  22. Hassan, I., & Hitchen, J. (2020). Driving division? Disinformation and the new media landscape in Nigeria. Centre for Democracy and Development.
  23. Independent National Electoral Commission. (2019). Distribution of registered voters by state and local government area. https://www.inecnigeria.org/wp-content/uploads/2019/03/Registered-Voters-by-State-and-LGA-1.pdf
  24. Jack, C. (2017). Lexicon of lies: Terms for problematic information. Data & Society. https://datasociety.net/library/lexicon-of-lies/
  25. Jolley, D., & Douglas, K. M. (2014). The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS One, 9(2), e89177.
  26. Kuklinski, J. H., Quirk, P. J., Jerit, J., Schwieder, D., & Rich, R. F. (2000). Misinformation and the currency of democratic citizenship. Journal of Politics, 62(3), 790-816.
  27. Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480–498.
  28. Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., … & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094-1096.
  29. Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.
  30. Lupia, A. (2016). Uninformed: Why people know so little about politics and what we can do about it. Oxford University Press.
  31. Machado, C., Kira, B., Hirsch, G., Marchal, N., Kollanyi, B., Howard, P. N., … & Barash, V. (2019). News and political information consumption in Brazil: Mapping the first round of the 2018 Brazilian presidential election on Twitter. Comprop Data Memo 4, 17 October 2018, Oxford, UK.
  32. McCombs, M. E., & Shaw, D. L. (1972). The agenda-setting function of mass media. Public Opinion Quarterly, 36(2), 176-187.
  33. McGrew, S., Ortega, T., Breakstone, J., & Wineburg, S. (2017). The challenge that’s bigger than fake news: Civic reasoning in a social media environment. American Educator, 41(3), 4-9.
  34. Mejia, R., Beckermann, K., & Sullivan, C. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122-139.
  35. Mercier, H. (2020). Not born yesterday: The science of who we trust and what we believe. Princeton University Press.
  36. National Health Research Ethics Committee of Nigeria (NHREC). (2018). National code of health research ethics. https://nhrec.net/nchre_2018.pdf
  37. Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.
  38. Nyhan, B., & Reifler, J. (2012). Misinformation and fact-checking: Research findings from social science. New America Foundation.
  39. Okoro, N., & Emmanuel, N. O. (2018). Fake news and hate speech: A threat to national peace and unity in Nigeria. Igwebuike: An African Journal of Arts and Humanities, 4(3), 54-69.
  40. Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116(7), 2521-2526.
  41. Porter, E., Wood, T. J., & Kirby, D. (2018). Sex trafficking, Russian infiltration, birth certificates, and pedophilia: A survey experiment correcting fake news. Journal of Experimental Political Science, 5(2), 159-164.
  42. Roozenbeek, J., & Van Der Linden, S. (2019). Fake news game confers psychological resistance against online misinformation. Palgrave Communications, 5(1), 1-10.
  43. Silverman, C., & Singer-Vine, J. (2016). Most Americans who see fake news believe it, new survey says. BuzzFeed News. https://www.buzzfeednews.com/article/craigsilverman/fake-news-survey
  44. Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769.
  45. Thorson, E. (2016). Belief echoes: The persistent effects of corrected misinformation. Political Communication, 33(3), 460-480.
  46. Van Duyn, E., & Collier, J. (2019). Priming and fake news: The effects of elite discourse on evaluations of news media. Mass Communication and Society, 22(1), 29-48.
  47. Vargo, C. J., Guo, L., & Amazeen, M. A. (2018). The agenda-setting power of fake news: A big data analysis of the online media landscape from 2014 to 2016. New Media & Society, 20(5), 2028-2049.
  48. Vraga, E. K., & Bode, L. (2017). Using expert sources to correct health misinformation in social media. Science Communication, 39(5), 621-645.
  49. Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2020). Fact-checking: A meta-analysis of what works and for whom. Political Communication, 37(3), 350-375.
  50. Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c
  51. Wasserman, H., & Madrid-Morales, D. (2019). An exploratory study of “fake news” and media trust in Kenya, Nigeria and South Africa. African Journalism Studies, 40(1), 107-123.
  52. Weeks, B. E. (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication, 65(4), 699-719.
  53. Yamane, T. (1967). Statistics: An introductory analysis (2nd ed., pp. 886-887). Harper and Row.
  54. Zaller, J. R. (1992). The nature and origins of mass opinion. Cambridge University Press.
