How AI strengthens election security and mitigates corruption

Countries around the world have held or are holding elections this year, and it is estimated that as much as half of the world’s population will have gone to the polls before the year is over, including in the United States. This has put election security front and center for many in the intelligence community.

American author David Brin has been quoted as saying, “It is said that power corrupts, but actually it’s more true that power attracts the corruptible.” This observation reflects the reality that elevating people to powerful positions raises the risk of bribery and corruption. Mitigating those risks can mean the difference between a political system that is just and one that is not.

In addition to mitigating corruption among politically exposed persons (PEPs), election security professionals are tasked with identifying and responding to cyber threats and misinformation around elections, including understanding threats that emerge across languages.

PEPs and elections

The international standard-setting and policy-making body, the Financial Action Task Force (FATF), defines a politically exposed person (PEP) as “an individual who is or has been entrusted with a prominent public function.” The FATF goes on to state that many PEPs hold positions that can be abused for the purpose of laundering illicit funds or committing other predicate offences such as corruption or bribery. The organization recommends applying anti-money laundering (AML) and countering the financing of terrorism (CFT) measures. The same measures also can be applied to election security.

Automated workflows, artificial intelligence (AI)-based screening, and advanced name matching are all examples of technologies that use AI to enhance the effectiveness and efficiency of PEP risk management and mitigation. These solutions help address some common problems with traditional PEP screening approaches.

The amount of publicly available information (PAI) on the internet is massive and growing all the time, which has increased the amount of data used for screening and often results in more false positives. When it comes to PEP screening and election security, simply turning off the screening process is not an option. Yet using large “if-then” spreadsheets to assess PEP risk based on organizational policy creates large files that require actively managed, manual updates, potentially straining workforces and leaving gaps in the intelligence.

A best-practice approach is to use software to automate PEP identification and ongoing monitoring. Advanced solutions enable risk scoring of PEPs, allowing users to configure workflows based on identified risks while expanding auto-resolve options without increasing risk. By building scoring methodology into these workflows, law enforcement can improve efficiency, free up human resources for other tasks and achieve strong returns on investment.
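
To make the idea concrete, here is a minimal sketch of how a rules-based risk score might drive workflow routing and auto-resolution. The factor names, weights and thresholds are hypothetical placeholders rather than any vendor’s actual methodology; a real deployment would derive them from organizational policy and regulatory guidance.

```python
from dataclasses import dataclass

# Hypothetical risk factors and weights; a production system would define
# these according to organizational policy and regulatory guidance.
RISK_WEIGHTS = {
    "prominent_public_function": 40,
    "high_risk_jurisdiction": 25,
    "adverse_media_hits": 20,
    "sanctioned_associates": 15,
}

AUTO_RESOLVE_THRESHOLD = 30  # below this score, a hit can be auto-resolved
ESCALATION_THRESHOLD = 70    # at or above this score, route to a senior analyst


@dataclass
class PepProfile:
    name: str
    factors: dict  # maps a risk-factor name to True/False


def score_pep(profile: PepProfile) -> int:
    """Sum the weights of the risk factors present on the profile."""
    return sum(
        weight
        for factor, weight in RISK_WEIGHTS.items()
        if profile.factors.get(factor, False)
    )


def route(profile: PepProfile) -> str:
    """Choose a workflow queue based on the configured thresholds."""
    score = score_pep(profile)
    if score < AUTO_RESOLVE_THRESHOLD:
        return "auto-resolve"
    if score >= ESCALATION_THRESHOLD:
        return "escalate-to-senior-analyst"
    return "standard-review"


if __name__ == "__main__":
    candidate = PepProfile(
        name="Example Candidate",
        factors={"prominent_public_function": True, "adverse_media_hits": True},
    )
    print(score_pep(candidate), route(candidate))  # 60 standard-review
```

The point of the pattern is not the particular weights but the separation of policy (the weights and thresholds) from the workflow logic, which lets an organization tune auto-resolve behavior without rewriting the screening pipeline.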

Mitigating PEP risk with name matching

Name matching technology can be implemented either by replacing an existing software solution or by integrating a post-processing engine, which refines the matches produced by the existing system down to a more manageable number. The technology can efficiently analyze massive volumes of data and distill them into actionable information in a timely manner.
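
As a simplified illustration of the post-processing idea, the snippet below filters and ranks a list of candidate name matches against a similarity threshold. It uses Python’s standard-library difflib as a stand-in for a production name matching engine, which would also handle transliteration, nicknames and variation across scripts; the names and threshold shown are hypothetical.

```python
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Character-level similarity ratio between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def refine_matches(
    query: str, candidates: list[str], threshold: float = 0.85
) -> list[tuple[str, float]]:
    """Keep only candidates that clear the threshold, strongest first."""
    scored = [(name, round(similarity(query, name), 3)) for name in candidates]
    kept = [pair for pair in scored if pair[1] >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    # Raw hits as they might come back from an upstream screening system.
    raw_hits = ["Jon Smith", "John Smith", "Joan Smithe", "Jane Smythe"]
    print(refine_matches("John Smith", raw_hits, threshold=0.8))
```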

Deploying AI in this way highlights its role as a force-multiplier, allowing humans to focus on more difficult tasks, such as problem-solving, crisis management and decision-making. It augments human capabilities, enabling analysis at speed and scale beyond human capacity, without replacing human involvement.

AI and election security

Chris McIsaac, a governance-focused fellow at the public policy think tank R Street, recently published a paper on this topic, “Impact of Artificial Intelligence on Elections.” In it, McIsaac underscores that AI is already having an impact on elections and political races where false information is concerned. He also points out that it “…also holds the potential to bolster the security of an election’s cyber infrastructure and improve the efficiency of election administration.”

McIsaac goes on to advocate for empowering local election officials to focus on strengthening defenses, among other tactics.

AI-powered searches can better prepare agencies to thwart cyber attackers seeking to disrupt election processes and to combat fraud or other attacks on elections. By analyzing PAI, it becomes possible to understand where threats lie across the landscape, that is, what is happening in conversations, on message boards and in other areas of online discussion, and to do so in multiple languages and scripts.
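
A minimal sketch of this kind of monitoring appears below. It scans a few sample posts against a small, hypothetical multilingual watchlist of election-threat phrases; a real system would rely on curated, analyst-maintained lexicons, language identification and machine translation rather than a hard-coded dictionary.

```python
import re

# Hypothetical watchlist of election-threat phrases in several languages.
WATCHLIST = {
    "en": ["ballot tampering", "voting machine exploit"],
    "es": ["manipulación de boletas"],
    "ru": ["взлом избирательной системы"],
}


def flag_posts(posts: list[str]) -> list[dict]:
    """Return posts that mention a watchlisted phrase, noting language and phrase."""
    flagged = []
    for post in posts:
        for lang, phrases in WATCHLIST.items():
            for phrase in phrases:
                if re.search(re.escape(phrase), post, flags=re.IGNORECASE):
                    flagged.append({"post": post, "language": lang, "phrase": phrase})
    return flagged


if __name__ == "__main__":
    sample = [
        "Anyone know more about this voting machine exploit being shared?",
        "Se habla de manipulación de boletas en el foro.",
        "An unrelated post about sports.",
    ]
    for hit in flag_posts(sample):
        print(hit)
```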

Once this information is gleaned, it’s then possible to take remedial actions to thwart disruptions. 

Technology’s growing role in mitigating PEP corruption and strengthening election security 

As elections play out globally, the emphasis on mitigating risks associated with PEPs and election security is vital. David Brin’s observation that power attracts the corruptible underscores the necessity of stringent measures to safeguard political systems from bribery and corruption. Automated workflows with AI-based screening and advanced name matching technologies are enhancing the effectiveness of PEP risk management, reducing false positives and streamlining remediation tasks. 

By leveraging AI, not only can PEP screening be made more efficient, but election security and safety can be bolstered against cyber threats, misinformation and more. This combined approach ensures a more robust defense of democratic processes: human agents focus on critical tasks while technology handles repetitive, large-scale data analysis.

As the world navigates the complexities of modern elections, integrating these technologies is emerging as a key strategy for maintaining the integrity and trust, as well as the safety, of political systems worldwide.