

HOW TO PROTECT BIOMETRIC SYSTEMS AGAINST DEEPFAKES

Mario Cesar Santos, VP Global Solutions at Aware, discusses the rise of deepfakes and how they can be exploited by malicious actors.

The rise of deepfakes, videos and images created by Artificial Intelligence, poses a significant threat to people and businesses. Fraudsters and cybercriminals are using data obtained from breaches to generate deepfakes with the potential to cause widespread damage.

By exploiting the apparent credibility and authenticity of this fraudulent media, attackers can use deepfakes to deceive, manipulate and defraud organisations and their customers. Understanding how deepfakes can be used against your customers is crucial for companies to develop effective strategies to mitigate their impact and protect against their misuse.
The term ‘deepfake’ originates from the combination of ‘Deep Learning’ and ‘fake’. Although it does not have a universally accepted definition, a deepfake generally means content in which a person is replaced by the image of another person. Essentially, a deepfake is content such as a photo, audio or video that has been manipulated by Machine Learning (ML) and Artificial Intelligence (AI) to appear as something it is not.
Although deepfakes have garnered attention for their entertainment and creative value, they also present serious risks to businesses. Below are some of the ways malicious actors can exploit this type of online fraud:
• Fraudulent content: One of the most immediate threats of deepfakes is their potential use in creating fraudulent content. Malicious actors can impersonate individuals in videos, making it seem as though they said or did things they never did. On a personal or business level, this can be used to spread false information and damage the reputations of people and brands.
• Social engineering attacks: Deepfakes can also be used in social engineering attacks, where attackers manipulate individuals into disclosing confidential information or performing harmful actions. For example, a deepfake video could impersonate a CEO instructing an employee to transfer funds to a fraudulent account.
• Disinformation campaigns: Deepfakes can be weaponised in disinformation campaigns to manipulate public opinion. By creating convincing videos of politicians or other public figures saying or doing things they never did, malicious actors can sow chaos and confusion.
• Identity theft: Deepfakes can be used to steal someone’s identity by creating fake videos or images that appear to show that individual. These could be used to access confidential accounts or commit other forms of fraud.
• Sabotage and espionage: Deepfakes can also serve purposes of sabotage or espionage. For example, a deepfake video could be used to manipulate a company’s stock price or damage its reputation.