
Dark Patterns and Platform Accountability: A Regulatory Gap Analysis

[Awaneesh Kumar & Tanya Gangwar are final year law students at Gujarat National Law University, Gandhinagar.]


Introduction


In today’s digital age, e-commerce has revolutionised the way consumers access the market. It offers a system of convenience, but unfortunately it carries within it a framework of exploitation. Increasingly, e-commerce websites expose consumers to invisible manipulations embedded in digital design through dark patterns. Dark patterns are powerful strategies of behavioural influence: deceptive interface designs that mislead, nudge, or coerce consumers into choices they would not otherwise have made or that run contrary to their preferred decisions. These are not the accidental consequences of poor digital design but calculated strategies crafted to monetise user attention, often by undermining user autonomy.


Recognizing these harms, the Indian authorities have recently announced a strict crackdown on dark patterns in their varied and subtle forms. In June 2025, the Central Consumer Protection Authority (CCPA) issued an advisory to all online retailers to eliminate interfaces that compromise a fair consumer experience in the digital space, directing all e-commerce platforms to carry out self-audits and eliminate dark patterns. In this context, this article critically examines the effectiveness of the 2025 advisory, which lacks a clear penalty for violations and a certain enforcement mechanism; the behavioural mechanisms that underpin such manipulative designs; and the broader implications for legal consent, consumer autonomy, and platform accountability in the digital marketplace.


Understanding The Tricks Behind User Decisions


Digital freedom for users implies not only free access but also protection from manipulation. E-commerce platforms design interfaces to trick users into making unintended choices, such as making an uninformed purchase, sharing information, or agreeing to legal terms without intending to do so. These tricks are known as ‘dark patterns’. The term was introduced in 2010 by Harry Brignull, a user experience specialist, who sought to distinguish between a design pattern (a standardised, reusable solution to a commonly occurring problem) and an anti-pattern (bad design), and found that a dark pattern goes further, weaponising design for purposeful deception and manipulation.


Unlike traditional consumer fraud, which relies on overt deception, dark patterns flourish in the opacity of design. Their legality cannot be assessed merely through the lens of intent to deceive; it must also be considered whether consent extracted through such manipulation holds validity under Indian law. The Consumer Protection Act, 2019 has traditionally dealt with misrepresentation of goods and services, including defective products, fraudulent promises, and misleading advertisements. Dark patterns are different: they reveal a shift in focus from ‘what is sold’ to ‘how it is sold’ through manipulative design practices.


This challenges the conventional legal understanding of unfair trade practices. Platforms like Flipkart and Amazon have previously faced public criticism for leveraging “false urgency” prompts like “Hurry! Only 2 left”, and MakeMyTrip has auto-included items like donations or insurance. This evolving nature of deception requires the law to respond not just through content regulation but through interface regulation.


The Guidelines for Prevention and Regulation of Dark Patterns, 2023 (2023 Guidelines), released by the Department of Consumer Affairs (DoCA) in 2023, were India’s first attempt to regulate dark patterns. These Guidelines shifted the focus to the design itself rather than to proving individual harm or intent, which essentially means that if a pattern has the tendency to mislead, it can be actionable. The Guidelines identify and prohibit 13 specific dark patterns: false urgency, basket sneaking, confirm shaming, forced action, subscription trap, interface interference, bait and switch, drip pricing, disguised advertisement, nagging, trick question, SaaS billing, and rogue malware.


For instance, BookMyShow was issued a notice for basket sneaking, where a ₹1 contribution to “BookASmile” was pre-selected without user consent. Following the CCPA’s intervention, the platform had to revise its interface to provide a clear opt-in choice. Similarly, IndiGo Airlines was served a notice for confirm shaming for using coercive language such as “No, I will take risk” when a user declined add-ons. After the CCPA’s notice, the interface was redesigned for neutrality and transparency, including clearer disclaimers on seat selection.


The 2025 Advisory For Self‑Audit: A Welcome Step, But Not Enough


The CCPA, in its 2025 Advisory, has advised all e-commerce platforms to conduct internal self-audits within three months. The advisory is framed as a voluntary compliance mechanism requiring platforms to identify and eliminate dark patterns in their interfaces according to the categories listed under the 2023 Guidelines. It also introduces a Joint Working Group, comprising government officials, legal experts and industry stakeholders, to assess the declarations submitted by the platforms. This is a commendable step toward better regulation. However, it relies too heavily on self-audits: there is no strong legal enforcement or independent review mechanism, and the results are not disclosed to the public. This raises serious doubts regarding the advisory’s practical effectiveness.


Certain cases from ride-hailing applications illustrate the limitations of the advisory. In May 2025, the CCPA issued a notice to Uber over an “advance tipping” feature that prompts users to pay tips in advance with messages like “Add a tip for faster pickup”. Uber responded to the notice, but the feature remained widespread across similar services. Such practices qualify as confirm shaming and drip pricing, dark patterns through which users feel forced into paying to secure a basic service. Yet, despite regulatory attention, these platforms are not known to have proactively removed the feature, highlighting a gap between advisory intent and actual reform.


The advisory requires self-audits but does not mandate third-party audits, fixed timelines for corrective action, or public disclosure of results. Without these elements, it risks becoming a formal exercise rather than meaningful policy. It reflects regulatory intent, but self-reporting alone is not sufficient, especially when platforms may address only the most visible issues. The framework must include independent audits, transparent reporting, and penalty provisions. This will help move it from a voluntary checklist to a reliable tool for consumer protection.


There is no doubt that the advisory carries enforceable weight. Recently, the High Court of Delhi in National Restaurant Association of India v. Union of India held that guidelines issued by the CCPA under the power granted to it by Section 18(1) of the Consumer Protection Act, 2019 are enforceable and binding. This reinforces that the CCPA is not merely a recommendatory or advisory body, and thus its advisories must be treated with the seriousness of a statutory obligation. However, in the absence of mandatory audit disclosures and penalties for non-compliance, platforms enjoy absolute control over the disclosures mandated by the advisory. Effective implementation therefore cannot rest solely on legal enforceability; it must be accompanied by an accountability framework that motivates platforms to make serious reforms towards their obligations under the Advisory.


Other countries offer relevant lessons. The European Union’s Digital Services Act, 2022 requires very large online platforms to assess and mitigate risks arising from their design, including manipulative features, with compliance checked through audits overseen by the European Commission. In the United Kingdom, the Competition and Markets Authority has investigated harmful design practices such as subscription traps and confusing cancellation processes, and has required changes to how users move through these sites. In the United States, the Federal Trade Commission has held that design itself can constitute a deceptive trade practice, taking action against companies using hidden fees, pre-checked boxes, and hard-to-use unsubscribe options.


Conclusion


The 2025 Advisory recognizes the harms that deceptive designs pose to consumer autonomy. Such designs contravene Rule 4(9) of the Consumer Protection (E-Commerce) Rules, 2020, which mandates that the consent of the consumer be recorded through explicit and affirmative action. The Advisory also acknowledges that the 2023 Guidelines have not been fully complied with, as evidenced by the notices issued to certain platforms by the CCPA. Understandably, the 2025 Advisory was expected to deliver a robust enforcement mechanism, such as provisions for third-party independent audits, mandatory public declarations and a strict penalty structure.


The voluntary nature of the Advisory lacks binding obligations. It risks superficial compliance aimed at creating the appearance of change rather than serious and systematic reform. Consequently, the burden of identifying, avoiding and reporting dark patterns shifts to the individual user. This places an unfair burden on consumers while undermining the very regulatory intervention aimed at protecting their interests. In the absence of deep structural and regulatory reforms, user autonomy remains vulnerable to manipulative digital designs.



©2020 by The Competition and Commercial Law Review.
