August 8, 2023
T.N. cybercrime police issue advisory on deepfake scams
The Cyber Crime Wing of the Tamil Nadu Police has issued an advisory on fake video calls made by scamsters using artificial intelligence (AI) technology.
Deepfake technology is being used to perpetrate several types of fraud by creating highly convincing and realistic fake content, often using AI to manipulate audio, video, or images. Initially, the technology was used primarily for entertainment, enabling filmmakers and content creators to seamlessly integrate actors into scenes or impersonate historical figures. However, as the technology evolved, so did its misuse by criminals seeking to exploit its power of deception, the police have said.
Additional Director General of Police, Cyber Crime Wing, Sanjay Kumar said, “The scam involving AI-generated deepfake video calls typically follows a series of carefully orchestrated steps, combining technological sophistication with psychological manipulation. The scamster creates a fake profile, often using stolen images or publicly available photographs of trusted individuals, such as friends or family members. They then use AI-powered deepfake technology to make highly realistic video calls on social media and other online platforms, impersonating someone the victim knows, such as a friend, family member, or colleague, to deceive them into thinking it is a genuine conversation. Later, they create a sense of urgency and ask the victim to transfer money to their bank accounts.”
Police said the deepfake is carefully designed to mimic the appearance and mannerisms of the impersonated person. In addition to the video manipulation, scamsters are using AI-generated voice synthesis to mimic the voice of the impersonated person, enhancing the illusion of authenticity during the video call.
Mr. Kumar said, “Though no complaint in this regard has been received from the State so far, we wish to alert citizens to be aware and vigilant about such frauds. People should stay informed about the latest scams, including those involving AI technology, and be cautious when receiving video calls from unexpected sources.”
When receiving a video call from someone claiming to be a friend or family member, make a phone call to their personal mobile number to verify their identity before transferring any money, said the ADGP.
Limit sharing of personal data
The advisory also asks citizens to limit the amount of personal data shared online, adjust privacy settings on social media platforms to restrict access to their information, and consider using multi-factor authentication and other identity verification measures to protect accounts from unauthorised access.
The Cyber Crime Wing also said that if anyone suspects they have been a victim of a deepfake video call fraud or has come across suspicious activity, it is crucial to take immediate action and report the incident by calling the cybercrime toll-free helpline number 1930 or by registering a complaint at www.cybercrime.gov.in. Citizens can also contact the platform on which the fraudulent activity took place and provide it with all relevant details, including the scamster’s profile information, messages, and any evidence collected.