Sun, 10 May 2026

Study warns against malicious use cases of AI in 2024

Photo by Anna Shvets: https://www.pexels.com/photo/people-on-a-video-call-4226261/

Malicious use cases of artificial intelligence (AI) will most likely emerge from targeted deepfakes and influence operations, according to the report Adversarial Intelligence: Red Teaming Malicious Use Cases for AI by Recorded Future.

Malicious use cases:

Deepfakes for impersonation: Threat actors can use publicly available short clips to generate deepfakes and clone voices in real time.

Influence operations impersonating legitimate websites: Malicious actors can leverage AI to generate disinformation and automatically curate content based on generated text, cutting the cost of content production a hundredfold compared with traditional troll farms and human content writers.

Self-augmenting malware evading YARA: Malicious actors can use generative AI to evade string-based YARA rules by augmenting the source code of malware variants and scripts, lowering detection rates.
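To illustrate why string-based rules are brittle, here is a minimal Python sketch (not actual YARA, and all string literals are hypothetical): a detector that matches on fixed literals stops firing once those literals are mechanically rewritten, even though the underlying behaviour is unchanged.

```python
# Toy illustration of string-based detection and its evasion.
# The literals below are placeholders, not real malware signatures.

SUSPICIOUS_STRINGS = [b"Invoke-BadTool", b"DumpCreds"]

def string_rule_matches(sample: bytes) -> bool:
    """Mimics a string-based YARA condition: match if any literal is present."""
    return any(s in sample for s in SUSPICIOUS_STRINGS)

def augment(sample: bytes) -> bytes:
    """Stands in for an AI model rewriting identifiers while preserving behaviour."""
    return sample.replace(b"Invoke-BadTool", b"Invoke-Bt").replace(
        b"DumpCreds", b"ExportTokens"
    )

original = b"IEX (New-Object Net.WebClient); Invoke-BadTool -DumpCreds"
rewritten = augment(original)

print(string_rule_matches(original))   # True: literals present
print(string_rule_matches(rewritten))  # False: same behaviour, rule evaded
```

This is why the report's recommendation below favours behavioural detection, which keys on what code does rather than on fixed byte patterns.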

ICS and aerial imagery reconnaissance: Threat actors can leverage multimodal AI to process public images and videos, geolocating facilities and identifying industrial control system (ICS) equipment.

Recommendations

As the voices, videos, and photos of executives have become part of an organisation's attack surface, Recorded Future analysts recommend that organisations invest in multi-layered and behavioural malware detection capabilities to prepare for threat actors developing AI-assisted polymorphic malware.

Moreover, organisations need to assess the risk of impersonation in targeted attacks. Recorded Future analysts suggest organisations use multiple alternative channels of communication and verification for sensitive transactions.

To protect sensitive data, publicly available images and videos of critical infrastructure and sensitive sectors such as defence, government, energy, manufacturing, and transportation should be scrutinised and scrubbed. 
