US intel says AI is boosting, but not revolutionizing, foreign efforts to influence the 2024 elections

CNN
 — 

Artificial intelligence is helping to "enhance" rather than "revolutionize" influence operations from Russia and Iran aimed at November's US elections, the Office of the Director of National Intelligence said in an assessment released Monday.

"The (US intelligence community) considers AI a malign influence accelerant, not yet a revolutionary influence tool," an ODNI official told reporters.

The new US assessment is a counterpoint to some of the media and industry hype about AI-related threats. But the technology is still a top concern for US intelligence officials monitoring threats to the presidential election.

The risk to US elections from foreign, AI-generated content depends on the ability of foreign operatives to overcome restrictions built into many AI tools, to develop their own sophisticated AI models, or to "strategically target and disseminate" AI-generated content, the official said. "Foreign actors are behind in each of these three areas."

Foreign operatives are using AI to try to overcome language barriers in targeting US voters with disinformation, according to US officials.

Iran, for example, has used AI to generate content in Spanish about immigration, which Tehran perceives as a divisive US political issue, the ODNI official said. Tehran-linked operatives have also used AI to target voters across the political spectrum on polarizing issues like the Israel-Gaza war, the official said. US officials believe Tehran is trying to undercut former President Donald Trump's candidacy.

Russia has generated the most AI content related to the US election of any foreign power, according to the ODNI official. The AI-laced content, including videos, photos, text and audio, has been consistent with Moscow's efforts to boost Trump's candidacy and denigrate Vice President Kamala Harris' campaign, the official said.

China, meanwhile, is using AI "to amplify divisive U.S. political issues," but not to try to shape specific US election outcomes, the new US intelligence assessment said.

Foreign operatives have also embraced plenty of old-school influence techniques this election cycle, such as staging videos rather than generating them with AI.

US intelligence agencies believe that Russian operatives staged a video that circulated on X earlier this month falsely claiming that Harris paralyzed a young girl in a 2011 hit-and-run accident, the ODNI official said. The Russians promoted the story through a website pretending to be a local San Francisco media outlet, according to Microsoft researchers.

Another Russian-made video, which drew at least 1.5 million views on X, claimed to show Harris supporters attacking an attendee of a Donald Trump rally, according to Microsoft.

US intelligence agencies warned in July that Russia planned to "covertly use social media" to try to sway public opinion and undermine support for Ukraine in swing states.

"Russia is a much more sophisticated actor in the influence space in general, and they have a better understanding of how US elections work and where to target and what states to target," the ODNI official said.

This is not the first general US election in which foreign powers have considered deploying AI capabilities.

Operatives working for the Chinese and Iranian governments prepared fake, AI-generated content as part of a campaign to influence US voters in the closing weeks of the 2020 election campaign but chose not to disseminate it, CNN previously reported. Some US officials who reviewed the intelligence at the time were unimpressed, believing it showed China and Iran lacked the capability to deploy deepfakes in a way that could seriously affect the 2020 presidential election.
