Abstract
Despite the Information Technology Act, 2000 and the Digital Personal Data Protection Act, 2023, India's legislative framework remains ineffective against contemporary threats such as deepfakes, biometric spoofing, and AI-driven impersonation. This Article examines how these frameworks fail and how they permit abuse through vague definitions, weak penalties, and systemic loopholes such as inferred consent and unregulated profiling. Using a doctrinal legal research approach, it analyses statutory texts and landmark cases while drawing comparative insights from global models such as the GDPR and California's Delete Act. Elderly citizens have taken their own lives after falling victim to cyber fraud and deepfake scams, a human cost that legislative inaction continues to exact. This Article terms this failure "legislative laundering": the enactment of laws that appear protective while failing to regulate meaningfully. It concludes with urgent recommendations: criminalising AI-enabled fraud, requiring clear and informed user consent, setting strict and transparent security standards, empowering regulatory bodies, and running awareness programmes to protect vulnerable groups online. India's legal response must prioritise rights, dignity, and safety over formalistic compliance to ensure meaningful protection in an AI-driven digital era.
I. Introduction: Legislative Laundering and the Enabling Architecture of Cyber-Predation in India
India's legal framework for cyberspace is enabling the very harms it claims to prevent. In March 2025, an elderly couple in Karnataka took their own lives after losing around ₹50 lakh to scammers impersonating government authorities. Within four months, on June 7, 2025, authorities booked a 42-year-old Hindi teacher for weaponising the POCSO Act through algorithmic disinformation on social media, falsely claiming that an 11th-class student had been sexually assaulted and impregnated by a colleague. That same month, a person from Kerala lost nearly forty thousand rupees to an AI-driven scam involving a deepfake of a close relative. These are not isolated failures but symptoms of systemic legislative laundering: laws such as the DPDP Act, 2023 and the IT Act, 2000 create a facade of protection while permitting AI-driven predation through planned ambiguities, unenforceable responsibilities, and engineered loopholes. Legislative laundering occurs when laws are made to look protective on paper but quietly allow harm through vague language, soft penalties, or built-in loopholes. According to the Cyber Threat Report of India 2025, India is at a turning point, facing an overwhelming and rapidly growing surge in cyber threats. The Digital Threat Report 2025, published jointly by SISA, CERT-IN, and CSIRT-FIN, reported that India's banking, financial services, and insurance (BFSI) sector saw an astonishing 175% increase in cyberattacks in 2024, driven in large part by AI-enabled phishing, deepfake scams, and business email compromise (BEC); BEC alone accounted for 25% of all breaches. Attackers target not only third-party providers but also poorly configured cloud environments.
As 2025 begins, threats such as deepfakes, AI-assisted hacking, crypto scams, and insecure IoT devices (over 32.9 billion of them) are growing fast, yet India's cyber laws remain frozen in an analogue logic. The IT Act's definitions ignore AI impersonation and biometric theft, while the DPDP Act's consent mechanisms crumble before algorithmic manipulation, as highlighted in the Report on AI Governance Guidelines Development, 2025, released by MeitY. This Article employs a doctrinal research approach to examine the legislative framework, particularly the Information Technology Act, 2000 and the Digital Personal Data Protection Act, 2023, through Lawrence Lessig's New Chicago School regulatory lens.[i] It argues that India's legal system engages in legislative laundering: it superficially regulates cybercrime while enabling AI-based harm. It critically examines these provisions through a rights-based lens, supported by case law, government publications, and international legislative comparisons. Part II uses global threat data and demographics to critique core IT Act provisions and DPDP Act sections, exposing outdated definitions, weak penalties, consent loopholes, and poor enforcement that fuel AI-driven fraud against vulnerable groups. Part III frames this as "legislative laundering" and proposes expanding the IT Act's offences to cover AI impersonation and biometric theft.
II. A Comparative Critique of India’s Outdated Cyber Laws
The World Economic Forum's Global Cybersecurity Outlook 2025 states that cyberspace is threatened not only by AI-driven attacks but also by deepfakes and geopolitical risks. Small organisations face cyber threats that have increased roughly sevenfold, and only 37% of users take the time to properly vet AI tools before using them. Cybercrime causes losses of about $1 trillion globally. As of 2021, India had around 14 crore senior citizens, a number expected to reach 19.5 crore by 2031. Digital governance in countries like India is heavily affected by these trends. On one side stands the Information Technology Act, 2000, whose penalties are now outdated; on the other, the Digital Personal Data Protection Act, 2023 (DPDP), a contemporary law that nonetheless fails to provide real protection and instead creates what many call a "consent theatre". There is a crucial need to recognise the right to control one's data and to stay safe online under Article 21 of the Constitution of India.[ii] Several countries have taken significant steps to tackle cybercrime. Germany, Japan, and China have updated their criminal laws to address contemporary digital threats, demonstrating their seriousness about online safety and data protection. German cybercrime regulation stands out for its focus on "data espionage" under Section 202a[iii] and "computer sabotage" under Section 303b of the German Criminal Code, 1871,[iv] which prohibit tampering with sensitive data critical to an organisation.
For example, where an individual hacks a bank's security and gains access to consumer information without permission, they would be prosecuted under Section 202a even if no alteration was made.[v] If they go further and crash the bank's processing systems, that conduct would fall under Section 303b.[vi] Germany's approach is also distinctive in its recognition of ethical hacking. Japan and China, for their part, criminalise not only ransomware offences but also deepfakes and identity theft. India's Information Technology Act, 2000 is outdated and its penalties are weak: a person accused of hacking faces at most three years' imprisonment under Section 66.[vii] The Act lacks corporate accountability and fails to cover even basic modern threats such as deepfakes. The urgency of reform becomes clear when we consider the stakes. The IT Act offers very limited remedies. Sections 43[viii] and 66[ix] cap compensation for damages at ₹5 crore and jail time at three years, disproportionately low for crimes causing massive financial loss. Section 66C covers only basic identity theft involving passwords and electronic signatures,[x] leaving modern threats such as biometric spoofing and deepfakes unaddressed.[xi] The section also fails to cover AI-generated fraud. Other provisions, such as Section 66E[xii] and Section 72,[xiii] are too narrow and impose minimal fines that fail to deter abuse. The DPDP Act contains 47 sections that focus more on procedure than on actual protection. One major flaw is Section 7(a), which relies on "voluntary provision" and "implied consent".[xiv] This allows companies to assume that a user has consented simply because they clicked "I agree" on complex terms and conditions, often without reading them.
This is dangerous for vulnerable groups such as seniors. Further, Section 13(2)[xv] provides no clear timeline for redressing grievances, which allows financial fraud to go unremedied. The Data Protection Board under Section 18[xvi] has no power to take proactive action or to offer strong victim support. These laws are theoretical rather than practical, and the result is real harm. China's Social Credit System scores people based on how they behave, for example, whether they pay their bills on time or follow public rules; if someone believes they have been scored unfairly, they can challenge the score before the prescribed authority. By contrast, when an Aadhaar fingerprint or biometric face scan fails in India, essentials such as food rations, pensions, or bank savings become inaccessible, even though many people depend on them daily. The worst part is the lack of a proper mechanism to complain or to seek correction. India must adopt a framework grounded in the rights under Article 21 and recognise six essential digital rights. First, replace "implied consent" with explicit, informed, opt-in consent for sensitive data such as biometrics and health information, similar to Article 7 of the GDPR.[xvii] Second, update the IT Act to treat deepfakes and other AI-generated scams as serious crimes, with strict punishment of up to 10 years in jail. Third, give people a clear and simple right to ask companies to delete their data, similar to California's Delete Act, which makes it mandatory for data brokers to erase your information on request. Fourth, India must put a stop to profiling systems that automatically judge people based on their age, health, or financial condition.
Such bias leaves the most vulnerable open to unfair treatment; a prohibition here would align with GDPR Article 22.[xviii] Fifth, companies should be required to use strong encryption and to report any data leaks within 72 hours, facing real penalties if they fail. Finally, the Data Protection Board should have real power, including the ability to run audits and to fine large corporations up to 4% of their global income, as under Article 58 of the GDPR.[xix]
III. Conclusion: From Legislative Laundering to a Protective Legal Framework
‘Legislative laundering’ refers to the practice whereby laws such as the IT Act (2000) and the DPDP Act (2023) are designed to appear protective but incorporate deliberate ambiguities and loopholes that legitimise corporate exploitation and cyber threats, much as laundering legitimises illicit gains. In India's tech landscape, this allows AI-driven scams, deepfakes, and algorithmic biases to thrive unchecked and to disproportionately harm vulnerable groups: the elderly, who lose their savings to fraudulent schemes, and low-income individuals, who are often denied essentials because of biometric glitches or implied-consent manipulations, all while corporate interests are prioritised over genuine citizen safeguards. India's legislative framework for cyberspace is not effective; the IT Act (2000) and the DPDP Act (2023) fail to protect citizens' rights. When elderly couples take their own lives after losing all their savings to AI-assisted scams, when fraudsters rob the families of the young and the old alike, and when access to food subsidies depends on an algorithm, a brutal truth is revealed: our laws cannot keep pace with contemporary threats. They create an illusion of safety while permitting harm through deliberate weaknesses, vague definitions that ignore contemporary issues such as deepfakes, weak penalties, and loopholes such as "implied consent" that let companies exploit vulnerable groups, young and elderly alike. This is legislative laundering, not mere oversight. India needs laws that protect people rather than paperwork. Reforming the legislative framework to tackle contemporary threats is essential.
First, we need to fix the gaps: amend the IT Act to impose imprisonment of more than 10 years on offenders who commit crimes using AI, and criminalise biometric theft as part of strict regulatory measures. Second, we must protect vulnerable groups by giving them real control, such as replacing "implied consent" with clear, explicit permission, especially for seniors. A further change is to reform the law so that companies must delete compromised data immediately after a breach. This is a fight for dignity, not merely a concern about rights in cyberspace. When an elderly person clicks "agree" on a screen, she should not be signing away her savings. When a daily-wage worker scans his fingerprint for rations, he should not be denied food because of a technical glitch. India's laws must protect citizens rather than serve corporations.
[i] Lawrence Lessig, ‘The New Chicago School’ (1998) 27(2) Journal of Legal Studies 661.
[ii] Constitution of India 1950, art 21.
[iii] Strafgesetzbuch (StGB) [German Criminal Code] 1871, § 202a.
[iv] Strafgesetzbuch (StGB) [German Criminal Code] 1871, § 303b.
[v] ibid.
[vi] ibid.
[vii] Information Technology Act 2000, s 66.
[viii] Information Technology Act 2000, s 43.
[ix] Information Technology Act 2000, s 66.
[x] Information Technology Act 2000, s 66C.
[xi] Information Technology Act 2000, s 66D.
[xii] Information Technology Act 2000, s 66E.
[xiii] Information Technology Act 2000, s 72.
[xiv] Digital Personal Data Protection Act 2023, s 7(a).
[xv] Digital Personal Data Protection Act 2023, s 13(2).
[xvi] Digital Personal Data Protection Act 2023, s 18.
[xvii] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) [2016] OJ L 119/1, art 7.
[xviii] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) [2016] OJ L 119/1, art 22.
[xix] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) [2016] OJ L 119/1, art 58.