A report from the US Treasury has called on banks to bolster their risk management frameworks in order to avoid being overwhelmed by AI-powered fraudsters and cybercriminals.
The report, which was the result of an executive order issued by Joe Biden in 2023, is based on interviews with 42 bank executives about their cybersecurity concerns.
It found that developments in AI and the emergence of so-called deep fakes are an increasing worry for US banks.
“The financial services sector is increasingly subject to costly cybersecurity threats and cyber-enabled frauds,” states the report. “As access to advanced tools becomes more widespread, it is likely that, at least initially, cyberthreat actors utilising emerging AI tools will have the advantage by outpacing and outnumbering their targets.”
The report comes amid an alarming rise in consumer complaints. The FBI’s Internet Crime Complaint Center received 880,000 complaints from victims of cyber crimes in 2023, a 22% increase on the previous year.
In addition to urging banks to update their risk management frameworks to account for developments in AI, the report also calls for more collaboration between banks and for more data sharing.
It also calls on regulators to be more "dynamic" in their rule-making. Not only would this improve defences against fraudsters, it would also help to promote the swift deployment of AI tools that adhere to more robust risk management standards.
"Such a balanced regulatory environment is crucial for empowering institutions to harness AI’s full potential in combating sophisticated threats, without being hindered by overly restrictive oversight," states the report.
“Artificial intelligence is redefining cybersecurity and fraud in the financial services sector, and the Biden administration is committed to working with financial institutions to utilise emerging technologies while safeguarding against threats to operational resiliency and financial stability,” said Nellie Liang, undersecretary for domestic finance, in a statement that accompanied the report.