Election officials are warning about bad actors and tricks, but it’s not all doom and gloom
By Morgan Leason

Election experts worry that partisanship and a lack of trust in this November’s election could be a problem, especially because of the public’s susceptibility to misinformation.

Some are particularly concerned that information warfare could erupt in the final days before the election, when voters can absorb false information but there is no time left to set the record straight.

“Elections tend to be a powder keg,” said Dan Avondoglio, an expert at the federal government’s Cybersecurity and Infrastructure Security Agency, also known as CISA. There are, he said, “opportunities to exert influence at the last minute.”

Less than a month before Election Day, experts gathered last week near the University of Maryland campus to discuss these concerns with the National Consortium for the Study of Terrorism and Responses to Terrorism, a university research and education center.

The consensus was cause for concern. It is becoming increasingly difficult to conduct elections, officials said, and costs are rising. It's easier than ever to spread misinformation and harder to combat it. To make matters worse, artificial intelligence will play a new and unpredictable role.

The main cost drivers are training election workers, conducting audits and protecting databases from cyberattacks, said Ben Hovland, a member of the US Election Assistance Commission.

Securing democracy, polling station by polling station

“We are seeking an all-hands approach to this election,” Hovland said. And when it comes to polling stations, “those are even softer targets.”

States and counties where elections are particularly close are under heightened scrutiny and could become active targets for bad actors, Hovland explained.

Cody Buntain, an assistant professor who studies misinformation and disinformation, said artificial intelligence should be a primary concern. It’s the nature of social media to create filter bubbles and echo chambers, he said, limiting people’s exposure to content they disagree with.

That’s why, he said, artificial intelligence has the power to “quickly generate content that reinforces people’s existing beliefs.”

Doug Lombardi, a professor in the College of Education’s Department of Human Development and Quantitative Methodology, said the attack strategy of the 2016 presidential cycle has evolved and become more effective. Similar to Russian propaganda at the time, today’s actors are involved in “efforts to promote social grievances and social disruption,” he said.

The solution is to help people think critically, Lombardi said, which means making decisions that help strengthen people’s agency and combat confirmation bias.

Amy Pate, director of the START program, argued that in today’s environment, misinformation is not only cheap and effective, but also difficult to correct.

Pate said most people have historically voted without giving much thought to how the votes would be counted. Now, she said, people need to understand how the process works to have confidence in the election outcome.

Still, the consortium’s message is not all doom and gloom. The more skeptical and aware people become of election misinformation, the more resilient they will be, officials say.

As more people become aware of the vote certification process, “a heightened awareness helps build some resilience,” Pate said.
