Deepfake technology is no longer used only to defame individuals; scammers are now exploiting it against ordinary people and businesses as well. One such case has recently surfaced in Hong Kong, where a multinational corporation lost $25 million to a deepfake scam.
It is believed to be the first case of its kind in which deepfake videos of multiple company employees were used to single out and defraud a staff member.
What Is The Hong Kong Case?
The con artists used deepfakes to target an employee at the company's Hong Kong branch. They created deepfake videos of the Chief Financial Officer and several other staff members, then invited the employee to a video conference in which he was instructed to transfer funds.
Apart from the victim, every participant in the video call was fake: each was a deepfake avatar of a real colleague. The scammers built these avatars from videos and other footage publicly available on open platforms, making everyone in the meeting appear genuine.
Police are investigating the case and say it is the first large-scale scam of its kind in Hong Kong. They have not released any details about the company or the employee involved. Deepfake technology has been making headlines for some time now.
The branch's finance department reported the scam to the police. According to the authorities, the victim followed the instructions given during the call, making fifteen transfers across five separate bank accounts totaling 200 million Hong Kong dollars.
The employee discovered the fraud only when he later inquired about the transfers at the company's head office. In India, the debate over deepfakes began when a deepfake video of Rashmika Mandanna surfaced; a deepfake image of pop star Taylor Swift also went viral. Since then, there has been a growing global push for strict regulations on deepfakes.