In early September 2023, U.S. Securities and Exchange Commission Chair Gary Gensler said that deepfakes pose a “real risk” to markets. Deepfakes, fake videos or images generated by artificial intelligence (AI) that appear authentic at first glance, can be made to depict high-profile investors and even regulators like Gensler, seeming to show these influential figures saying things likely to sway financial markets. The creators of such deepfakes stand to profit when the deception successfully moves the market.

While the potential for market turmoil is significant, the threat of deepfakes extends well beyond it. Deepfakers have created falsified videos of celebrities, politicians, and many others, often for fun but frequently to spread misinformation and worse. Global accounting firm KPMG has pointed to a sharp increase in scams targeting businesses of all kinds with deepfake materials. These and other risks have sent cybersecurity researchers on a frantic search for ways to stop, or at least slow down, malicious actors armed with these powerful tools.

Perhaps the greatest harm from deepfakes in the technology's early days, however, has fallen on the individuals it targets. Extortion scams are proliferating across a host of areas and with various strategies. A significant proportion of these scams use deepfake technology to create sexually explicit images or videos of unwilling targets. Scammers can then demand a payment from the real-life…
