Deepfakes and the Law: Addressing the Regulatory Gap and Mitigating the Risks of AI-Generated Content
DOI: https://doi.org/10.48165/msilj.2024.1.2.1

Keywords: Deepfake technology, AI, personal data privacy, GDPR, Digital Personal Data Protection Act 2023, cybersecurity, misinformation, identity theft

Abstract
The creation of fake pictures, videos, and voices with the help of Artificial Intelligence is rapidly gaining traction as an emerging threat. Initially intended for artistic and educational use, deepfake technology can be, and is, misused to serve one's own ends. Impersonating others, spreading false information, fabricating videos of an individual without their consent, and even reshaping how society perceives events are all possibilities that this technology makes easy.

Deepfakes bring their own set of problems and risks. Their core danger lies in fraud: criminals can now use AI to imitate an individual's voice and trick employees into unknowingly authorizing money transfers, causing businesses and individuals to lose large sums of money. Another blatant issue is non-consensual deepfake videos, in which an individual's likeness is superimposed onto obscene content. Research indicates that the majority of deepfake pornography targets women, making the phenomenon an online problem of stalking and abuse.

Even as these harms expand, the laws currently in place are insufficient to address deepfake crimes. Europe's General Data Protection Regulation (GDPR), as well as India's Digital Personal Data Protection Act of 2023, were designed to protect personal data but do not resolve the issue of deepfakes. As a result, victims have little legal recourse, while many perpetrators go unchecked.

This article analyzes the failure of current laws and proposes solutions. It argues that assertive legal measures, advanced deepfake detection technology, and greater awareness of artificial-intelligence abuse are needed to address this complex challenge.