May 13, 2024
Federal prosecutors will pursue tougher sentences in cases in which artificial intelligence is used to commit an election-related crime, including threatening violence against election workers and voter suppression, Deputy US Attorney General Lisa Monaco said Monday.
The Justice Department policy change is an effort to keep pace with a chaotic information environment ahead of the 2024 presidential election, as AI tools have made it much easier to mimic politicians’ voices and likenesses to spread false information. The new policy applies to cases in which AI makes the election-related crime “more dangerous and more impactful,” Monaco said.
“Our democratic process and the public servants who protect it have been under attack like never before as threats evolve and spread,” Monaco told a meeting of the Justice Department’s Election Threats Task Force on Monday afternoon. AI and other advances in technology are “emboldening those threatening election workers and the integrity of our elections,” she said.
The emergence in recent years of AI-driven software that can produce deepfakes, fabricated audio or video, has complicated the threat environment for election workers and the federal and state officials trying to protect them.
US officials focused on election security are concerned that AI tools will exacerbate this already-fraught threat environment, pointing to an incident during the Democratic primary in New Hampshire in January as an example.
An AI-generated robocall imitating President Joe Biden targeted thousands of New Hampshire voters, urging them not to vote in the primary. A New Orleans magician made the robocall at the behest of a political consultant working for Minnesota Rep. Dean Phillips, a long-shot Democratic challenger to Biden, the magician told CNN.
Federal officials have also been mapping out ways in which foreign powers might try something similar to the fake Biden robocall to try to influence voters. Senior officials from the Biden administration in December drilled for a hypothetical scenario in which Chinese operatives created an AI-generated video showing a Senate candidate destroying ballots.
The Justice Department is already under pressure from election officials to do more to investigate the flood of harassing phone calls and emails they have received in recent years. Many of those threats have been made without AI tools and by people who falsely believe there was widespread fraud in the 2020 election.
An Ohio man was sentenced in March to over two years in prison for making death threats against an Arizona election official, whom he accused of committing fraud.
The danger is persistent. Thirty-eight percent of more than 900 local election officials surveyed in February and March by the nonprofit Brennan Center for Justice reported experiencing threats, harassment or abuse. More than half of the election officials surveyed expressed concern for the safety of their colleagues and staff.
“Election offices across the country continue to deal with threats and harassment for doing their jobs, and in many places, this behavior has been nearly nonstop since mid-2020,” Amy Cohen, the executive director of the National Association of State Election Directors, told CNN. “This should be unacceptable to all Americans.”
Election offices have in recent years been working more closely with state and local law enforcement on personal safety training and “physical security best practices to help staff stay safe,” Cohen said.