Abstract
The Digital Personal Data Protection Act, 2023 (DPDPA) seeks to address the global problem of “consent fatigue” by creating a new intermediary: the Consent Manager (CM). The Consent Manager is conceived as a fiduciary agent of the Data Principal, meant to collect, manage, and operationalise consent through a transparent and interoperable dashboard. In principle, this design shifts power from data aggregators to individuals, recognised as Data Principals.
Purpose Of The Consent Manager Framework
The framework introduces a structured mechanism to empower individuals in the digital ecosystem.
- The Consent Manager acts as a fiduciary agent for the Data Principal.
- It collects, manages, and operationalises consent.
- Consent is administered through a transparent and interoperable dashboard.
- The design seeks to shift power from data aggregators to individuals recognised as Data Principals.
Concerns Under The Digital Personal Data Protection Rules, 2025
However, this paper argues that the way this framework is implemented under the Digital Personal Data Protection Rules, 2025, raises serious concerns. The Rules impose a high financial entry barrier of ₹2 crore in net worth, yet mandate no structural independence for Consent Managers. Together, these choices risk undermining the purpose of the legislation.
Financial Entry Barrier And Regulatory Structure
| Regulatory Aspect | Provision Under The Rules | Concern Raised |
|---|---|---|
| Entry Requirement | ₹2 Crore Net Worth Requirement | High Financial Barrier For New Entrants |
| Structural Independence | No Explicit Mandate | Potential Conflict Of Interest |
| Regulatory Design | Limited Structural Safeguards | Risk Of Undermining Legislative Intent |
Risk Of “Captured Gatekeepers”
The author argues that the current regulatory design may produce “captured gatekeepers” instead of independent guardians. The regime does not prohibit vertical integration between Data Fiduciaries and Consent Managers. As a result, dominant digital platforms may create their own subsidiary Consent Managers, allowing them to block competition and retain control over user data flows. This amounts to “Regulatory Capture by Design.”
Competition Law Perspective
Drawing on principles of competition law, such as vertical foreclosure and the essential facilities doctrine, the author shows how the economic incentives of the present framework support data monopoly structures.
- Vertical foreclosure risks.
- Application of the essential facilities doctrine.
- Economic incentives favouring dominant digital platforms.
- Possibility of reinforcing data monopoly structures.
Proposed Regulatory Solution
The paper advances a structured proposal calling for mandatory separation between platforms and Consent Managers. It also recommends that entities designated as Data Fiduciaries or Data Processors must not hold economic control of, or interests in, the entity that administers consent, whether directly or through third-party entities.
- Mandatory separation between digital platforms and consent managers.
- Restrictions on economic control by Data Fiduciaries or Data Processors.
- Prevention of indirect control through third-party entities.
Introduction
In India, personal data management continues to reflect a structural imbalance between platforms and users due to weak digital governance. The ‘notice-and-consent’ model, which was the cornerstone of privacy self-management, has now weakened in practically all settings. This is particularly evident where users are affected by cognitive overload or design patterns that nudge consent defaults.
The DPDP framework intends to respond to these concerns through the Consent Manager architecture, first outlined by the Justice B.N. Srikrishna Committee[1] and piloted through the Account Aggregator system in the banking and financial sector.[2] Instead of a direct, platform-level consent exchange, the model introduces the Consent Manager as a mediated interface. Ultimately, however, its effectiveness depends on implementation standards and the conduct of the independent fiduciary.
In India, the Digital Personal Data Protection Act, 2023 (DPDPA) terms this intermediary as a Consent Manager, and gives it a statutory form.[3] Under Section 6, read with Section 2(g), the Consent Manager is not treated as a mere service provider. It is required to be a registered entity with direct accountability to the Data Principal.[4]
The Consent Manager must offer a transparent, accessible, and interoperable platform to give, manage, review, and withdraw consent. The statute thus grants the Data Principal a new and significant statutory right. First, it creates a mediating technical layer that separates “consent” from “service”; second, it sets up a single-point neutral console enabling individuals to control their digital footprint.
However, the author is concerned that this newly created statutory right faces a concealed threat, created by the economic barrier in the Digital Personal Data Protection Rules, 2025.[5] Rule 4 and the First Schedule set a steep financial eligibility criterion, requiring a minimum net worth of ₹2 crore, without any discussion of structural independence. This benchmark alters the competitive landscape of the emerging sector in multiple ways, with a cascading effect.
The Paradox Of The “Corporate” Fiduciary
In examining this architecture, the author observes diverging and conflicting pressures between the fiduciary nature of the Consent Manager and the imperatives of its commercial structure. The Data Empowerment and Protection Architecture (DEPA) presumes a “data-blind” Consent Manager: one that carries consent metadata without accessing the individual’s actual data.[6] Yet the DPDP Act and its Rules do not address the corporate relationship between Consent Managers and Data Fiduciaries.
This statutory silence leaves space for market distortions. In practice, without a ban on vertical integration, a dominant Data Fiduciary—such as a large social media platform or a major telecom provider—may establish a wholly owned subsidiary to act as a Consent Manager. This creates a classic “Principal–Agent” conflict: the Consent Manager appears to serve the user, yet it is financially dependent on the Data Fiduciary.
Such an arrangement produces an “Illusion of Choice.”[7] When the entity that seeks user data also owns the entity that mediates consent, the privacy architecture will favour data extraction over data minimization. The legislated system shifts from “unmanaged consent” not to “autonomous consent” but to “captured consent.” In such a scenario, the compliance infrastructure can become a tool for market foreclosure and consumer manipulation.
Competition Law And The Risk Of Vertical Foreclosure
A lesser-known aspect of this structural flaw is that its consequences extend into competition law. Digital markets rely heavily on network effects and involve high switching costs; access to data therefore becomes a critical input. If dominant platforms direct or subtly push users toward their in-house Consent Managers, they can lock the Data Principal’s consented data within their closed ecosystems without anyone becoming aware of it.[8]
The author therefore uses the doctrine of vertical foreclosure from competition law to study this risk. When a dominant firm controls an essential downstream input, it can raise costs for competitors or limit interoperability for independent Consent Managers.[9] The concern is practical: global regulatory history shows that dominant companies often use privacy frameworks to harm competitors, as seen in the debates around “privacy sandboxes.”[10]
Roadmap Of The Argument
This article follows a structured approach as stated below:
- The Second Section – The DPDP Act 2023 And The Promise Of Data-Principal Autonomy: Explains the provisions on Consent Managers. It compares rights-based language with the technical obligations and highlights the dependence on the DEPA model.
- The Third Section – Operationalisation Under DPDP Rules 2025: Makes a critical analysis of the recently notified Rules. It presents the registration conditions in Rule 4 and the First Schedule and argues that the high net-worth requirement excludes civil society groups and non-profit groups, who are better suited to this fiduciary function.
- The Fourth Section – Systemic Risks And Practical Concerns: Forms the core of the analysis. It discusses four risks:
- The exclusion of SMEs and startups
- The vertical integration of major Data Fiduciaries and Consent Managers
- The “Independence Paradox”
- The burdening of consumers with indirect compliance costs
- The Fifth Section – Inter-Regime Comparative Insights: The author has compared the Indian model in a global context. It compares India’s Consent Manager architecture with the GDPR’s Consent Management Platforms (CMPs) and Australia’s Accredited Data Recipients under the Consumer Data Right (CDR). It identifies global lessons for us to learn from compliance market failures.
- The Sixth Section – Competition-Law Overlay And Policy Consequences: The author has mapped potential jurisdictional overlap between the Data Protection Board of India (DPB) and the Competition Commission of India (CCI) and argues that being an essential facility connected to a statutory right, consent infrastructure should operate as a public utility rather than controlled or gated by a private entity.
- The Seventh Section – Reform Recommendations: Suggestions for legislative and policy reforms are made. It includes a structural separation clause and a tiered entry framework for non-profit Consent Managers.
Conclusion Preview
Finally, the article concludes by red-flagging the concern that, without these reforms, the DPDPA may succeed only in digitising consent while failing to truly democratise the rights of the Data Principal.
The DPDP Act 2023 and the Promise of Data-Principal Autonomy
The Digital Personal Data Protection Act, 2023 (DPDPA), marks a decisive shift from the traditional “omnibus” privacy model exemplified by the EU’s GDPR. The GDPR places most of the responsibility on the Data Controller.[11] Indian law takes an entirely different route: it creates an institutional layer that aims to directly empower the Data Principal.[12] This layer is the Consent Manager. Only after examining its statutory identity, its technical origins, and the specific rights it is designed to operationalise can we understand the risks surrounding this entity.
Statutory Genesis: From Financial Aggregation to General Privacy
The concept of the Consent Manager did not originate within the DPDPA alone. Its intellectual roots lie in the Justice B.N. Srikrishna Committee Report.[13] That report identified a core problem: individuals cannot control their personal data because the cost of doing so is too high.
Its technical roots, by contrast, lie in the Account Aggregator (AA) framework governed by the Reserve Bank of India.[14] In the fintech ecosystem, AAs function as “data-blind” pipes: a user’s “explicit consent” is required to move financial information from a Financial Information Provider to a Financial Information User. This model was later expanded by NITI Aayog through the Data Empowerment and Protection Architecture (DEPA).[15] DEPA aimed to “democratize data access” by allowing individuals to port their data securely across platforms with “explicit consent”.
The DPDPA takes this narrow, sector-specific architecture and transforms it into a general rule of privacy law. In doing so, it tries to solve the familiar “notice-and-consent” failure.
- Privacy policies are too long.
- They are too complex.
- Individuals cannot manage them.
The Act attempts to fix this by shifting consent management to a specialised and interoperable intermediary.
The Statutory Identity of the Consent Manager
The DPDPA defines a Consent Manager as:
“a person registered with the Board, who acts as a single point of contact to enable a Data Principal to give, manage, review and withdraw her consent through an accessible, transparent and interoperable platform.”[16]
Read with Section 6(7), this definition gives the Consent Manager three legal characteristics.
Agency
The phrase “enable a Data Principal” reflects an agency relationship. A Data Fiduciary processes data for its own ends. A Consent Manager processes consent only for the individual’s ends. It is designed to act as a digital extension of the user’s will.
Interoperability
The Act requires an “interoperable platform.” This means consent must be portable across systems. A user cannot be locked into a single ecosystem. A consent granted through one Consent Manager must be recognised by any Data Fiduciary.
Accountability
Section 6(9) mandates that the Consent Manager “shall be accountable to the Data Principal.” This statutory responsibility has the effect of changing the Consent Manager from an entity hosting a software tool to a quasi-governmental body, assigned with a duty to safeguard a statutory right.[17]
The Technical Core: The “Consent Artefact”
To appreciate the risk and responsibilities involved in digital gatekeeping, one must understand the technical unit of consent. Under the DPDP framework, “consent” is converted into a technical document called the Consent Artefact.[18]
The DPDP Rules of 2025 and the DEPA protocol rely heavily on this concept.[19] Simply put, the Consent Artefact is a machine-readable, digitally-signed file.
Key Parameters of the Consent Artefact
| Parameter | Description |
|---|---|
| Data Fiduciary | The entity requesting the data.[20] |
| Purpose of Data | The specific purpose for which the data is sought.[21] |
| Data Life | The duration for which the data will be retained.[22] |
| Revocation Mechanism | How the user can stop the flow of data.[23] |
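The parameters above can be pictured as the fields of a machine-readable, signed document. The following Python sketch is purely illustrative: the field names and structure are assumptions for exposition, not the official DEPA/DPDP artefact schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ConsentArtefact:
    """Illustrative consent artefact; field names are assumed, not an official schema."""
    data_fiduciary: str   # the entity requesting the data
    purpose: str          # the specific purpose for which the data is sought
    data_life_days: int   # how long the data may be retained
    revocable_via: str    # the mechanism the user can invoke to stop the flow

# A hypothetical artefact for one consent transaction.
artefact = ConsentArtefact(
    data_fiduciary="ExampleBank Ltd.",
    purpose="credit-worthiness assessment",
    data_life_days=90,
    revocable_via="consent-manager dashboard",
)

# The artefact travels between systems as a machine-readable document.
print(json.dumps(asdict(artefact), indent=2))
```

In an actual deployment the serialised document would additionally carry a digital signature binding it to the parties, which is omitted here for brevity.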
Technically, this revocation is instantaneous. Unlike earlier systems, where ‘unsubscribe’ requests face delays of 24 to 48 hours,[24] the Consent Artefact operates through a real-time API call. When a user initiates revocation on the Consent Manager console or dashboard, the cryptographic key assigned to that specific artefact is invalidated. Any subsequent attempt by the Data Fiduciary to fetch data using that token returns an authentication error, enforcing a ‘hard stop’ on data processing at the architectural level.[25]
Similarly, whenever a user clicks “Allow” on the Consent Manager interface, this is not merely an express agreement. The click generates the cryptographic token without which the Data Fiduciary cannot process the requested data.[26]
This system addresses the classic “Clickwrap” problem. In today’s online environment, consent is usually a static checkbox. Under this framework, by contrast, consent becomes a dynamic token that the user can revoke at any time. When revoked, the token is invalidated and the Data Fiduciary can no longer process the data.[27]
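The dynamic-token idea described above can be sketched as a small registry: clicking “Allow” issues a token, revocation invalidates it, and any later data fetch with that token fails. All names below are invented for illustration and do not correspond to any prescribed API.

```python
import secrets

class ConsentRegistry:
    """Toy model of consent as a revocable token rather than a static checkbox."""

    def __init__(self):
        self._valid = set()

    def grant(self) -> str:
        # User clicks "Allow": a fresh cryptographic token is issued.
        token = secrets.token_hex(16)
        self._valid.add(token)
        return token

    def revoke(self, token: str) -> None:
        # Revocation on the dashboard invalidates the token immediately.
        self._valid.discard(token)

    def fetch_data(self, token: str) -> str:
        # The Data Fiduciary can process data only while the token is valid.
        if token not in self._valid:
            raise PermissionError("authentication error: consent token revoked or unknown")
        return "personal-data-payload"

registry = ConsentRegistry()
token = registry.grant()
assert registry.fetch_data(token) == "personal-data-payload"

registry.revoke(token)
try:
    registry.fetch_data(token)  # the 'hard stop': access fails at the architectural level
except PermissionError as err:
    print(err)
```

The point of the sketch is the asymmetry it makes visible: once the token is gone, no goodwill or policy on the fiduciary’s side is involved — the fetch simply fails.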
The Fiduciary Gap: Lawyer or App?
However, despite these technical safety measures, a central ambiguity remains unresolved because Section 6 does not clearly define the nature of the Consent Manager’s role.[28] The question of whether the Consent Manager is a legal representative or a neutral utility remains unanswered.
Though the Act frames the Consent Manager as a “utility” by calling it a “platform”, it simultaneously suggests a fiduciary role by making it “accountable to the Data Principal.”[29] This “Gatekeeper vs. Instrument” dilemma lies at the heart of the confusion.
If the Consent Manager Acts Like a Lawyer
- It should guide the user.
- It should warn the user about risky Data Fiduciaries.
- It should flag unnecessary data requests.
- It should act as a protective filter.
If the Consent Manager Acts Like a Utility
- It simply transmits a signal.
- It does not judge fairness.
- It only verifies the validity of consent.
- It functions as a passive pipe.[30]
Furthermore, the Act does not give the Consent Manager the power to assess the fairness of a consent request. Though it can validate the signal, it cannot assess the underlying terms. This suggests that the Consent Manager only promotes autonomy and not necessarily protection.[31]
The “Grievance Redressal” Mechanism as a Market Feature
Section 13 gives Data Principals the right to grievance redressal.[32] Consent Managers must also provide grievance redressal systems. This creates a marketplace for trust. Users should, in theory, gravitate toward Consent Managers with better dispute resolution services.
But this model assumes:
- Perfect information for users.
- Easy switching between Consent Managers.
If a user cannot see that their Consent Manager is failing them, they cannot exit. If a Consent Manager is owned by a Data Fiduciary, the user may be unable to escape the conflict of interest. The promise of Section 6 becomes hollow.
The Act builds a complex system for user autonomy. Yet it leaves the governance of Consent Managers to the Rules. As the next section shows, the DPDP Rules, 2025, may have unintentionally handed control of this system to the very incumbents the Act was meant to check.
Operationalisation under DPDP Rules 2025
The DPDPA provides only the basic structure for Consent Managers. The detailed rules on eligibility, conduct, and functioning are set out in the Digital Personal Data Protection Rules, 2025 (DPDPR). These Rules convert the abstract idea of data autonomy into concrete regulatory conditions.
A close reading of Rule 4, along with the First Schedule, shows a technocratic approach. It places financial strength above the independence required in a fiduciary role.
Rule 4 and the First Schedule: The Registration Firewall
According to Rule 4 of the DPDP Rules, 2025, every applicant for registration as a Consent Manager is required to meet the conditions listed in the First Schedule. The most significant among these is the financial solvency requirement:
“The entity shall have a minimum net worth of Rupees Two Crore as per the preceding financial year’s audited balance sheet.”
At first glance, this financial firewall appears to be an innocuous and prudent safeguard. This is because the State has a legitimate interest in ensuring system stability while at the same time preventing the entry of “fly-by-night” operators or potential scam entities that often pop up in the digital service sector.
As Consent Managers will be expected to handle a large volume of sensitive transactions, the regulator aims to ensure that only entities with:
- “Skin in the game”
- Operational longevity
- Financial reliability
are entrusted with this responsibility. In this view, a net-worth requirement acts as a form of risk control to build trust in a nascent ecosystem.
While this logic is sound for banking or insurance, it does not fully fit a fiduciary environment such as, say, the education sector. Consent Managers, as envisaged under the DPDP Act, do not hold user funds; they only hold permissions.
A financial entry barrier of ₹2 crore therefore functions as a crude filter. It resembles a licensing obstacle of the pre-liberalisation period, effectively blocking a large category of potential and dedicated participants who are rich in trust but poor in capital, such as:
- Civic-tech non-profit organisations that rely on grants instead of accumulated capital.
- Academic consortia that can build credible and neutral platforms but cannot meet laid-down fiscal/commercial net-worth norms.
- Bootstrapped startups that may be innovative and reliable but cannot raise the required venture capital at the outset.
Duties, Accountability, and Technical Interoperability
Beyond registration, the Rules impose several operational obligations. These obligations focus mainly on technical compliance, such as:
- Interoperability with other consent managers or fiduciaries according to prescribed technical standards.[33]
- Maintenance of audit logs that record consent grants and withdrawals.[34]
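The audit-log obligation can be pictured as an append-only record of every consent grant and withdrawal. The sketch below is a minimal illustration; the log format shown is an assumption, not the prescribed technical standard.

```python
import datetime

class ConsentAuditLog:
    """Append-only log of consent events; the entry format is illustrative only."""

    def __init__(self):
        self._entries = []

    def record(self, principal: str, fiduciary: str, action: str) -> None:
        # Each grant or withdrawal is time-stamped and appended; entries are never edited.
        self._entries.append({
            "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "principal": principal,
            "fiduciary": fiduciary,
            "action": action,  # "grant" or "withdraw"
        })

    def history(self, principal: str) -> list:
        # A Data Principal (or an auditor) can replay the full consent trail.
        return [e for e in self._entries if e["principal"] == principal]

log = ConsentAuditLog()
log.record("user-123", "ExampleBank Ltd.", "grant")
log.record("user-123", "ExampleBank Ltd.", "withdraw")
print(len(log.history("user-123")))  # 2
```

The append-only property matters for the argument in this section: an auditable trail is only trustworthy if the party operating it has no incentive to rewrite it.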
Though these requirements strengthen the technical reliability of the ecosystem, the Rules remain silent on deeper structural safeguards.
- There is no guideline or mandate to separate the Consent Manager’s technical operations from the commercial interests of any parent company.
- There is no restriction or prohibition on exclusive arrangements that may privilege particular Data Fiduciaries.[35]
The Rules, therefore, despite securing the technical flow of consent, do not assure the neutrality of that flow. It is like ensuring the integrity of the pipes without addressing who controls the direction of data flow inside those pipes.
Systemic Risks And Practical Concerns
The DPDPA’s silence on vertical integration, combined with the DPDPR’s high financial entry barriers, prepares fertile ground for market failure or manipulation. This paper examines four systemic risks that may transform the Consent Manager ecosystem from a tool of empowerment into a mechanism of control and manipulation.
Net-Worth Threshold As Entry Barrier: Impact On SMEs And Start-Ups
Instead of creating a level playing field, the high entry barriers create a ‘doctored pitch’ that reduces market contestability among interested and otherwise qualified players and pushes the industry toward oligopolistic structures. The requirement of a ₹2 crore net worth also signals that privacy management is reserved for established players.
As a result, instead of a diverse market that could support specialized Consent Managers (such as one focused on health data or one dedicated to children’s safety), users are left with homogenized offerings, and the “market for lemons” effect follows.[36] The high capital requirement excludes specialized entities, which are usually smaller and mission-driven, and admits only large, generic players. The resulting homogenized market leaves users with limited real choice, served by entities offering standardized, risk-averse services. This information asymmetry drives out high-quality entrants before they ever get a fair opportunity.
This requirement also adversely affects Data Fiduciaries, particularly SMEs. Firms that cannot build their own consent infrastructure are compelled to rely on dominant Consent Managers. This creates a dependency loop: small innovators end up paying a “compliance tax” to large incumbents simply to participate in the legally mandated consent system, distorting the market even further.
Entry Barrier Market Impact Summary
| Factor | Impact |
|---|---|
| ₹2 Crore Net-Worth Requirement | Limits entry to large established firms |
| Exclusion Of Smaller Players | Specialized or mission-driven consent services are discouraged |
| Market Homogenization | Users face limited choices and standardized services |
| SME Dependency | Small firms must rely on dominant Consent Managers |
| Compliance Tax | Innovative firms pay additional compliance costs to incumbents |
Vertical Integration — Big Fiduciary Subsidiaries, Consent Managers, And Regulatory Capture
The most critical vulnerability in the current system is the absence of any prohibition on vertical integration. Vertical integration occurs when one entity controls both the demand for data and the gateway through which consent is processed.
A hypothetical “super-app” scenario demonstrates this risk. Assume the existence of “MegaTech,” a dominant digital platform. Under the existing rules, MegaTech can establish a subsidiary, “MegaConsent,” and have it registered as a Consent Manager. It can then distort the market by:
- Nudging: Users onboarding with MegaTech are pushed toward MegaConsent as their preferred Consent Manager through offers or inducements.
- Friction: Users who try independent Consent Managers may face subtle barriers, such as delayed OTPs, security warnings, or degraded interoperability.
- Data Harvesting: MegaConsent cannot view data content but can access metadata. This includes the apps a user engages with, the frequency of consent revocations, and the user’s risk behaviour. This metadata itself becomes a form of competitive intelligence to bottleneck competitors of MegaTech.
This system creates an illusion of choice. Users have a Consent Manager, and Data Fiduciaries comply with formal requirements. Yet the fiduciary agent, i.e. the Consent Manager, is owned by the very entity it is meant to oversee. This is regulatory capture by design: compliance mechanisms serve corporate goals, and subservience replaces oversight.
The Independence Paradox — Who Protects The Data Principal?
The DPDPA positions the Consent Manager as a fiduciary accountable to the Data Principal. Yet the Act leaves its business model to market forces.
- If the user pays, privacy becomes a luxury service. Wealthier users will be able to afford independent, high-security Consent Managers. Others will depend on “free” versions.
- If the Data Fiduciary pays, the Consent Manager becomes financially dependent on the entity requesting the data.
This produces an independence paradox. When a Consent Manager earns revenue through transaction fees paid by Data Fiduciaries, it benefits from higher rates of consent grant and lower levels of revocation. A Consent Manager that advises users to deny or revoke consent harms its own revenue stream. Without regulation of this financial structure, economic incentives conflict directly with the duty of data minimization.
Cost Burden Transfer To Consumers And Market Distortion
Compliance costs will eventually shift to consumers. If the market consolidates into an oligopoly of major players, pricing power increases.
Data Fiduciaries will pass consent-related fees to consumers. This may take the form of subscription increases or increased advertising load. The Consent Manager system, designed as a protective mechanism, takes the form of an economic burden. This becomes a charge imposed by intermediaries who add procedural friction without providing any form of meaningful autonomy.
These systemic risks are real and intrinsic to the corporate world. Similar failures exist in the GDPR’s Consent Management Platform market, where most providers serve the interests of advertising firms rather than Data Principals. India risks repeating this pattern on a much larger scale unless Rule 4’s structural flaws are addressed and safeguards against vertical integration are incorporated at the earliest.
Inter-Regime Comparative Insights
To understand the likely trajectory of India’s Consent Manager ecosystem, it is necessary to look beyond the DPDPA to other jurisdictions that have already experimented with similar intermediaries. India’s fiduciary model is unique. However, market behaviour around consent intermediaries in the European Union and Australia offers important warnings, as the same patterns can be expected to manifest in the Indian context. These examples show how legislative intent, however varied, can remain far from market reality.
GDPR and the Market for Consent Management Platforms (CMPs)
The GDPR created a large market for Consent Management Platforms, which turned out to be software vendors for Data Controllers. Although intended to operationalise lawful consent, their main purpose became protecting the corporation from liability by serving as a required compliance system.
In the Indian context, the Consent Manager is defined by statute to be an agent of the Data Principal.[37] This is a significant conceptual difference, but it is also fragile.
The Vendor Trap
In the EU, CMPs such as OneTrust and Cookiebot have designed interfaces that increase “opt-in” rates for their clients. This has led to the widespread use of dark patterns: rejecting cookies becomes harder than accepting them in order to obtain access.[38]
| Aspect | Observed Practice in EU CMPs | Implication |
|---|---|---|
| User Interface Design | Interfaces structured to increase opt-in rates | Encourages user consent even when not intended |
| Cookie Controls | Rejecting cookies made difficult compared to accepting | Use of dark patterns |
| Compliance Purpose | Focus on protecting corporations from liability | Consent becomes a legal shield rather than user control |
The IAB Europe Case
The Belgian Data Protection Authority found IAB Europe’s Transparency and Consent Framework to be unlawful. The ruling showed that even the consent framework itself could become a source of illegal data processing.[39]
Lesson For India
If Indian Consent Managers rely on Data Fiduciaries for revenue, they will move toward the vendor model. Without structural independence, an Indian Consent Manager becomes a functional equivalent of a GDPR CMP. It becomes a compliance shield for the fiduciary while appearing to act for the user.
- Revenue dependency may influence neutrality.
- Consent managers may prioritize fiduciary interests.
- The system risks turning into a compliance mechanism rather than a user-empowerment tool.
Lessons from Australia’s Consumer Data Right (CDR)
Australia’s Consumer Data Right introduced Accredited Data Recipients.[40] These entities receive data on a user’s behalf. Their role is similar to the “data-blind” transfer role envisioned for Indian Consent Managers.
The Compliance Cost Barrier
The CDR regime faces low adoption because of very high accreditation costs. Banks and large fintech firms are the only entities that can meet the security and insurance requirements.
Market Concentration
As a result, incumbents dominate the system. Small innovators cannot enter and must instead partner with large intermediaries.
| Factor | Impact in Australia’s CDR System |
|---|---|
| Accreditation Costs | Very high regulatory compliance costs |
| Eligible Participants | Mainly banks and large fintech companies |
| Innovation Barrier | Small startups forced to partner with large players |
| Market Structure | Dominance of incumbents |
Relevance To India
The ₹2 crore net-worth requirement in Rule 4 mirrors the Australian problem. It risks creating a top-heavy ecosystem. Only well-capitalized entities can participate. This undermines the goal of a more democratic data environment.
- High entry barriers discourage small innovators.
- Market may become dominated by large technology firms.
- Competition and innovation could be limited.
The Global Privacy Control (GPC) Alternative
A different model exists in the United States under the CCPA and CPRA.[41] The Global Privacy Control is a browser-level signal. It is not a corporate intermediary. It functions as a decentralized and non-commercial tool.
This model is less feature-rich than the dashboard-based approach in India. However, it avoids the principal–agent problem entirely. India’s decision to adopt an institutional model rather than a protocol-based signal creates a higher responsibility. The institutions must remain independent and uncorrupted for the system to work as intended.
| Model | Structure | Advantages | Limitations |
|---|---|---|---|
| India – Consent Manager | Institutional intermediary | Feature-rich dashboards and centralized management | Risk of principal–agent conflict |
| EU – CMPs | Corporate compliance tools | Widely adopted for regulatory compliance | Potential misuse through dark patterns |
| US – Global Privacy Control | Browser-level protocol | Decentralized and non-commercial | Less feature-rich functionality |
Competition-Law Overlay & Policy Consequences
The risks identified in this paper—vertical integration, entry barriers, and market foreclosure—are not only privacy harms. They are also antitrust harms. The intersection between the DPDPA and the Competition Act, 2002, is the space where the next major regulatory disputes will arise.
Market Concentration Risk and Barriers to Entry
Section 4(2)(c) of the Competition Act, 2002,[43] prohibits a dominant enterprise from engaging in conduct that results in the denial of market access. The high capital requirement for Consent Managers under Rule 4 and the First Schedule of the DPDPR naturally produces a concentrated market.[44]
If the market consolidates around three or four dominant Consent Managers, likely controlled by telcos or Big Tech firms, these entities will acquire oligopolistic gatekeeping power over Data Fiduciaries. A small startup that needs to request user consent will have no real choice. It will be forced to integrate with these dominant Consent Managers and accept whatever fee structure and data-sharing terms they impose. This creates a bottleneck in the digital economy. Innovation becomes subject to a compliance tax imposed by the gatekeepers of consent.
Key Market Risks Identified
- High capital requirements creating barriers to entry.
- Market consolidation around a few dominant Consent Managers.
- Oligopolistic gatekeeping power over Data Fiduciaries.
- Compliance costs acting as a tax on digital innovation.
Relevant Competition Law Provisions
| Provision | Legal Principle | Implication for Consent Managers |
|---|---|---|
| Section 4(2)(c), Competition Act | Prohibits denial of market access by dominant enterprises | Dominant Consent Managers may restrict access for startups |
| Section 4(2)(e), Competition Act | Prohibits leveraging dominance in one market to enter or protect another | Big Tech firms may extend OS dominance into consent infrastructure |
Vertical Foreclosure and the Essential Facilities Doctrine
The best analytical framework to detect this risk is the Essential Facilities Doctrine. A facility is essential when it is controlled by a monopolist, cannot reasonably be duplicated, and is critical for participation in the market.[45]
In a fully digitized economy under the DPDPA, the Consent Manager infrastructure can be termed as an essential facility because no entity can process data legally without it.[46]
Characteristics of an Essential Facility
- Controlled by a monopolist or dominant enterprise.
- Technically or economically impossible to duplicate.
- Critical for participation in the market.
Illustrative Scenario
If a company such as Google or Meta operates a dominant Consent Manager and refuses to interoperate with a rival Consent Manager, uses dilatory tactics during interoperation, or slows down API performance for a specific Data Fiduciary, this amounts to denying access to an essential facility.
Leveraging dominance: Section 4(2)(e) of the Competition Act prohibits a firm from using dominance in one market to protect or extend dominance in another.[47] Proving such conduct is difficult, however, because it requires defining the relevant market and gathering evidence that sits in the hands of the dominant firm itself, both of which are especially complex in multi-sided digital platforms.
Indian Jurisprudence on Digital Platform Abuse
This existential risk of vertical foreclosure finds distinct echoes in Indian jurisprudence. The Competition Commission of India (CCI) has previously recognized ‘search bias’ and ‘self-preferencing’ as abuses of dominance.[48]
In Matrimony.com Ltd. v. Google LLC, the CCI held that a dominant platform cannot rig its search algorithms to privilege its own downstream services (Google Flights) over competitors.[49]
Similarly, in the Android case (2022), the Commission penalized the mandatory pre-installation of proprietary apps, viewing it as a leveraging strategy that denied market access to rival developers.[50]
Self-Preferencing Risk Under DPDPA
In the context of the DPDPA, a dominant operating system or ‘super-app’ pushing users toward its subsidiary Consent Manager constitutes similar harm. Worse, it is given effect under the garb of providing a statutory right. This is a form of ‘self-preferencing’ where the platform leverages its technical control over the user interface (the OS) to foreclose competition in the adjacent market of consent management.
If a Big Tech company uses its operating system dominance, such as Android, to push its own Consent Manager (“Use Android Consent Manager for one-tap approval”) while creating friction for third-party Consent Managers (“Enter your 16-digit ID manually”), this is vertical foreclosure.
Limitations of Antitrust Enforcement
Applying the Essential Facilities Doctrine in the Indian context nonetheless faces significant hurdles. The CCI imposes a high evidentiary burden for this doctrine: a plaintiff must prove that the facility is technically impossible to duplicate, and mere commercial inconvenience is not sufficient to establish a claim.
Furthermore, digital platforms are often multi-sided, which complicates the definition of a relevant market.
Why Ex-Post Litigation Is Insufficient
- High evidentiary burden to prove essential facility.
- Difficulty defining relevant markets in digital ecosystems.
- Evidence often controlled by dominant firms.
- Slow legal process risks irreversible market tipping.
Therefore, relying solely on ex-post antitrust litigation is risky. The legal process is often too slow to prevent irreversible market tipping. This reality necessitates specific ex-ante regulations within the DPDP framework itself. The current DPDP Rules do not include neutrality mandates to prevent such OS-level self-preferencing.
Coordination Between DPB and CCI for Oversight
This leads to the institutional tension between the Data Protection Board of India and the Competition Commission of India.
Regulatory Overlap
The overlap: A privacy breach by a Consent Manager, such as sharing data with its parent Fiduciary, is a violation of the DPDPA. But if that sharing also confers a competitive advantage, it may simultaneously amount to an abuse of dominance under the Competition Act.
International Precedent: Facebook v. Bundeskartellamt
The Facebook v. Bundeskartellamt precedent: The German Federal Court of Justice held that competition authorities may consider data protection violations when assessing abuse of dominance.[51]
Supreme Court Guidance: Bharti Airtel Case
The jurisdictional overlap between the Data Protection Board (DPB) and the Competition Commission of India (CCI) requires legal clarity. The Supreme Court of India addressed a similar conflict in Competition Commission of India v. Bharti Airtel Ltd.[52]
The Court held that the sectoral regulator possesses the primary jurisdiction to decide technical mandates. The competition authority’s jurisdiction arises only after the sectoral regulator makes its factual findings. The legal sequence is therefore that the DPB must first determine whether a Consent Manager has violated its fiduciary duties; only then can the CCI investigate the anti-competitive effects of that violation.
Regulatory Sequence
| Step | Authority | Action |
|---|---|---|
| 1 | Data Protection Board (DPB) | Determine violation of fiduciary duties under DPDPA |
| 2 | Competition Commission of India (CCI) | Assess anti-competitive effects of the violation |
This sequential approach prevents conflicting rulings but delays effective enforcement against dominant players.
Regulatory Arbitrage Risk
The Indian gap: The Competition Act includes a consultation mechanism under Sections 21 and 21A. However, the DPDPA does not require similar coordination.
A vertically integrated Consent Manager may claim that its data-sharing practices are technically compliant under the DPDPA because consent was obtained properly, and then use this argument to evade competition-law scrutiny of the quality or voluntariness of that consent. This is regulatory arbitrage: the entity exploits differences or gaps between regulatory regimes to avoid proper compliance or to gain a competitive advantage.
Institutional Gap
- No statutory coordination mechanism between DPB and CCI.
- Risk of firms exploiting regulatory gaps.
- Privacy compliance used as a strategic shield against competition scrutiny.
Without a formal Memorandum of Understanding or a statutory link between the DPB and the CCI, many forms of compliance capture will go unchallenged. Firms may use privacy law as a strategic tool to exclude competitors while appearing fully compliant.
Reform Recommendations: Realigning Law with Policy Goals
The structural flaws identified in this paper—vertical foreclosure, the exclusion of civic-tech actors, and the illusion of choice—are not permanent. They arise from regulatory design. They can therefore be corrected through regulatory redesign. To ensure that the Digital Personal Data Protection Act (DPDPA) remains a tool for empowerment rather than control, the Central Government and the Data Protection Board (DPB) must adopt four specific reforms.
The “Structural Separation” Amendment
The silence on vertical integration remains the regime’s most serious weakness. To address this, the DPB should take guidance from the SEBI (Investment Advisers) Regulations, 2013, which impose strict separation between advisory and distribution functions.[53]
A new Rule 4A should be introduced into the DPDP Rules, 2025, to mandate an arm’s-length relationship.
Draft Proposal: Rule 4A – Independence and Conflict of Interest
- No Data Fiduciary designated as a Significant Data Fiduciary shall hold more than 10 per cent beneficial ownership,[54] directly or indirectly, in a registered Consent Manager.
- Where a Data Fiduciary and a Consent Manager belong to the same group entity, the Consent Manager shall:
- Maintain a distinct legal identity with a separate Board of Directors, with at least 50 per cent Independent Directors.
- Maintain segregated IT infrastructure to ensure that no data or metadata flows to the Data Fiduciary except through the standard consent artefact protocol.
- Avoid exclusive routing agreements that restrict the Data Principal’s right to choose an alternative Consent Manager.
- Breach of this Rule shall constitute a violation of Section 6(9) of the Act.
This “Chinese Wall” ensures that even if Big Tech companies own Consent Managers, they cannot operationalise them in a manner that harms user autonomy.[55]
The “Public Choice” Dashboard (The UPI Model)
Platforms currently design their own onboarding flows. This allows them to hide independent Consent Managers behind complex menu structures. To counter this, the DPB should mandate a Common Consent Interface similar to the UPI payment flow.
- The precedent: UPI allows a user to choose any payment app, regardless of which bank holds their account. In the same way, a Data Fiduciary’s login screen must offer a neutral “Select Your Consent Manager” dropdown.
- The registry: The DPB must maintain a public registry, exposed through an API, of all registered Consent Managers, similar to the DigiLocker Issuer Directory or the NPCI Member List.
- The mechanism: When a user logs into a new app, the app queries this Registry. The user selects their preferred Consent Manager once. That choice then becomes their default “digital guardian” across the internet. This separates the app from the consent layer.
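The registry-and-selection flow above can be sketched in a few lines. Everything in this sketch is hypothetical: the DPB has published no such API, and the entity names, registration IDs, and endpoints are invented purely for illustration (a real implementation would fetch the list from the DPB registry over HTTPS rather than hold it in memory):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ConsentManager:
    cm_id: str         # registration ID issued by the DPB (hypothetical format)
    name: str
    api_endpoint: str  # where the app routes consent requests


# Stand-in for the DPB's public registry; a Data Fiduciary's app would
# query the live registry API instead of this local list.
REGISTRY = [
    ConsentManager("CM-001", "CivicConsent Trust", "https://api.civicconsent.example/v1"),
    ConsentManager("CM-002", "TelcoConsent Ltd", "https://api.telcoconsent.example/v1"),
]


def select_consent_manager(registry: list[ConsentManager], cm_id: str) -> ConsentManager:
    """Resolve the user's one-time choice of default Consent Manager."""
    for cm in registry:
        if cm.cm_id == cm_id:
            return cm
    raise LookupError(f"Consent Manager {cm_id!r} is not registered with the DPB")
```

The design point is that the Data Fiduciary's app never hard-codes a Consent Manager: it resolves the user's stored choice against the neutral registry, which is what separates the app from the consent layer.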
Differential Regulatory Thresholds for Non-Profits
The ₹2 crore net-worth threshold under Rule 4 and the First Schedule excludes civic-tech actors. This can be addressed through a tiered licensing system.
| Tier | Category | Requirements | Operational Model |
|---|---|---|---|
| Tier 1 | Commercial Consent Managers | Subject to the full net-worth requirement of ₹2 crore | Allowed to charge fees to Data Fiduciaries |
| Tier 2 | Non-Profit or Public Interest Consent Managers | Exempt from net-worth norms but required to follow stricter governance rules such as operating as a Section 8 Company or Trust | Allowed limited data volumes initially, with expansion based on audit performance.[56] |
To sustain non-profit Consent Managers, funding must look beyond philanthropic grants: such grants are sporadic and unpredictable, and insufficient to maintain high-security infrastructure. A distinct revenue model is therefore required to ensure their viability. Mirroring the UPI model, the regulator should authorise a nominal “administrative fee”, operated on a no-profit, no-loss basis or with a marginal profit, paid by the Data Fiduciary rather than the Data Principal for every valid consent artefact generated.[57] The fee caps must be strictly standardised by the Data Protection Board, much as the RBI caps the Merchant Discount Rate, so that the fee covers operational costs such as server maintenance and audit compliance without creating a profit incentive to maximise data sharing.[59] While these non-profits must remain barred from monetising data insights, this structure secures their economic viability without profit-driven incentives for data maximisation. Such entities must be registered as Section 8 companies under the Companies Act, 2013.[58]
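The capped-fee arithmetic described above is simple to state precisely. The sketch below is purely illustrative: the fee figures and the clamping rule are assumptions, since the DPB has not prescribed any actual caps:

```python
def artefact_fee_total(base_fee_paise: int, dpb_cap_paise: int, artefact_count: int) -> int:
    """Total fee (in paise) a Data Fiduciary owes a Tier 2 Consent Manager.

    The per-artefact fee is clamped to the DPB-set cap, so a non-profit
    Consent Manager cannot raise revenue by inflating its quoted fee;
    all figures here are hypothetical.
    """
    per_artefact = min(base_fee_paise, dpb_cap_paise)
    return per_artefact * artefact_count
```

Because revenue scales only with the number of valid consent artefacts at a capped rate, the model covers costs without rewarding data maximisation.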
Within the DPDP framework, such Section 8 companies will enable universities, educational institutes, NGOs, and rights-based groups to participate without imposing heavy overheads on Data Fiduciaries. This also creates a public-interest alternative that competes with commercial players.
Transparency And Fee Structure Disclosures
If Consent Managers appear free to users, it is highly likely that the users themselves become the product. The DPB must therefore mandate a clear Business Model Disclosure on every console.
- “We are funded by [X].”
- “We charge Data Fiduciaries [Y] per consent.”
This transparency will allow users and regulators to detect Consent Managers that function as proxies for data aggregators. It also acts as a check and balance on market forces.
Conclusion
The Digital Personal Data Protection Act, 2023, is a significant shift in India’s privacy framework. It presents a model of mediated autonomy which differs from a model of implied permission. By introducing the Consent Manager, the legislature has, in principle, offered the Data Principal a shield against the surveillance-driven digital economy.
However, this paper has shown that the current operational rules may weaken this protection. The DPDP Rules, 2025, place financial strength above structural independence. As a result, they risk creating a class of captured gatekeepers. Without the safeguards proposed in this article, this risk will deepen. These safeguards include a bar on vertical integration and lower entry barriers for civic actors. If these measures are absent, the Consent Manager ecosystem may consolidate into an oligopoly. It may then fall under the control of the same telecom and technology giants it was meant to regulate.
The future of India’s data democracy depends on the strength of its market design. The statute alone cannot secure this goal. If the Consent Manager remains an independent fiduciary, it can transform the privacy landscape. But if it becomes an arm of Big Tech, it will only automate the exploitation it seeks to prevent. The outcome will depend on the regulator’s commitment to enforcing the fiduciary promise that the law sets out.
End-Notes:
- Justice B N Srikrishna Committee, White Paper of the Committee of Experts on a Data Protection Framework for India (2017) ch 2; Justice B N Srikrishna Committee (n 2) 55–59 (recommending consent dashboards and intermediaries).
- Reserve Bank of India, Non-Banking Financial Company–Account Aggregator (Reserve Bank) Directions, 2016 (2 September 2016) para 3; see also Ministry of Electronics and Information Technology, Electronic Consent Framework for Enabling Data Sharing for Open Banking and Beyond (referenced in IAPP, ‘Consent Manager Framework under India’s Personal Data Protection Bill’ (IAPP, 25 September 2022)).
- Digital Personal Data Protection Act 2023 (India) s 2(g) (defining ‘Consent Manager’).
- Digital Personal Data Protection Act 2023 (India) s 6(7)–(9) (permitting consent to be given, managed, reviewed or withdrawn through a Consent Manager, which must be registered with the Board and act on behalf of the Data Principal).
- Digital Personal Data Protection Rules 2025 (India), notified under s 40 of the Digital Personal Data Protection Act 2023.
- NITI Aayog, Data Empowerment and Protection Architecture: A Secure Consent-Based Data Sharing Framework (2020) 7–8 (describing Consent Managers as “data blind” entities that do not access or store user data); NITI Aayog, Data Empowerment and Protection Architecture: Executive Summary (2020) 3.
- For the concept of perceived versus effective choice in consent-based architectures, see World Bank, Data Empowerment and Protection Architecture: Operationalising Trust in Data Sharing (World Bank workshop paper, 2023).
- OECD, ‘Competition and Competition Policy in a Data-Driven Economy’ (2018) 8–11 (explaining that control over user data can create closed ecosystems and entry barriers in digital markets).
- Michael H Riordan and Steven C Salop, ‘Evaluating Vertical Mergers: A Post-Chicago Approach’ (1995) 63 Antitrust LJ 513; see also Carl Shapiro, ‘Vertical Mergers and Input Foreclosure: Lessons from the AT&T/Time Warner Case’ (2019).
- Movement for an Open Web, ‘The Role of Consent Following the UK CMA’s Privacy Sandbox Commitments’ (2022); Brave, ‘Privacy and Competition Concerns with Google’s Privacy Sandbox’ (25 January 2022); and Econsultancy, ‘Google’s Privacy Sandbox: What Are the Latest Concerns?’ (21 August 2024).
- Regulation (EU) 2016/679 (General Data Protection Regulation) art 24; see also GDPR Recital 74 (‘The responsibility and liability of the controller for any processing of personal data carried out by the controller or on the controller’s behalf should be established’).
- Digital Personal Data Protection Act 2023 (India) s 2(i), s 6(7) (defining a Consent Manager as a registered person acting as a single point of contact to enable the Data Principal to give, manage, review or withdraw consent through a transparent, accessible and interoperable platform); see also Sahamati, ‘Empowering Digital India: Consent-based Sharing and Data Protection’ (16 August 2023).
- Justice B N Srikrishna Committee (n 1) ch 2.
- Reserve Bank of India (n 2) para 3.
- NITI Aayog (n 6) 7–8.
- Digital Personal Data Protection Act 2023 (India) (n 3) s 2(g).
- Digital Personal Data Protection Act 2023 (India) (n 12) ss 6(7), 6(9).
- Digital Personal Data Protection Rules 2025 (India) r 5(1) (defining Consent Artefact as the technical representation of consent).
- NITI Aayog (n 6) 12–15; Digital Personal Data Protection Rules 2025 (India) First Schedule, Part B.
- Digital Personal Data Protection Act 2023 (India) s 2(h); Digital Personal Data Protection Rules 2025 (India) r 5(2)(a).
- Digital Personal Data Protection Act 2023 (India) s 6(1)–(3); Digital Personal Data Protection Rules 2025 (India) r 5(2)(b).
- Digital Personal Data Protection Rules 2025 (India) r 5(2)(c) (Data Life as retention period parameter).
- ibid r 5(3); NITI Aayog (n 6) 16.
- Digital Personal Data Protection Rules 2025 (India) r 5(4) (mandating real-time revocation).
- NITI Aayog (n 6) 17–18.
- Digital Personal Data Protection Rules 2025 (India) r 5(5) (token generation upon consent affirmation).
- Reserve Bank of India (n 2) Annex I (technical specifications for dynamic consent tokens, addressing clickwrap limitations).
- Digital Personal Data Protection Act 2023 (India) s 6(7)–(9) (describing Consent Manager functions without clearly delineating legal representative versus neutral utility status).
- Digital Personal Data Protection Act 2023 (India) s 6(8) (‘acting on behalf of the Data Principal’); s 2(g) (defining as registered entity providing platform services).
- Digital Personal Data Protection Rules 2025 (India) r 5(1)–(5) (limiting Consent Manager to technical validation and transmission of Consent Artefacts).
- ibid; NITI Aayog (n 6) 14 (emphasising autonomy through revocable consent over substantive fairness assessment).
- Digital Personal Data Protection Act 2023 (India) s 13.
- Digital Personal Data Protection Rules 2025 (India) First Schedule, Part A(1)(d) (requiring Consent Managers to ensure ‘interoperability with other Consent Managers and Data Fiduciaries’).
- ibid Part A(1)(f) (mandating ‘maintenance of audit logs recording all consent grants, reviews, and withdrawals’).
- ibid (setting technical and financial conditions but silent on corporate separation, ownership restrictions, or exclusivity prohibitions).
- George A Akerlof, ‘The Market for “Lemons”: Quality Uncertainty and the Market Mechanism’ (1970) 84 QJ Econ 488.
- Digital Personal Data Protection Act 2023 (India) s 6(8) (‘The Consent Manager shall be accountable to the Data Principal and shall act on her behalf’).
- SecurePrivacy, ‘Dark Pattern Compliance: How to Stop Manipulative Cookie Banners’ (17 June 2025); Vice, ‘Companies Use “Dark Patterns” to Mislead Users About Privacy Law, Study Shows’ (2022).
- Belgian Data Protection Authority, Decision on IAB Europe’s Transparency and Consent Framework (Litigation Chamber, 2 February 2022); IAPP, ‘Belgian DPA Fines IAB Europe 250K Euros Over Consent Framework GDPR Violations’ (4 September 2024).
- Competition and Consumer Act 2010 (Cth) sch 7 (Treasury Laws Amendment (Consumer Data Right) Act 2019 (Cth)) ch 3; Consumer Data Right Rules 2020 (Cth) ch 2.
- California Consumer Privacy Act of 2018, Cal Civ Code §§ 1798.100–1798.199 (West 2020), as amended by California Privacy Rights Act of 2020, Proposition 24 (effective 1 January 2023).
- Competition Act 2002 (India) s 4(2)(c) (‘No enterprise or group shall abuse its dominant position’ including ‘limiting or restricting the provision of goods or services or market access’).
- Digital Personal Data Protection Rules 2025 (India) r 4, First Schedule Part A(1)(a) (requiring minimum net worth of ₹2 crore for Consent Manager registration).
- Competition Act 2002 (India) s 4.
- ibid; see also CUTS, Essential Facilities Doctrine (CUTS-CCIER Working Paper).
- ibid s 4(2)(e).
- CBLTR, ‘Big Tech Bias and Antitrust Law: India’s Approach to Self-Preferencing’ (6 September 2025).
- Matrimony.com Ltd v Google LLC (Competition Commission of India, Case No 07 & 30 of 2012, 8 February 2018).
- Google LLC (Competition Commission of India, Case No 39/2018, 20 October 2022).
- Facebook v Bundeskartellamt (Bundeskartellamt, Case B6-22/16, 6 February 2019).
- Competition Commission of India v Bharti Airtel Ltd (2019) 2 SCC 521.
- SEBI (Investment Advisers) Regulations 2013 regs 21–22.
- Companies (Significant Beneficial Owners) Rules 2018 r 2(1)(h); Companies Act 2013 s 90(1).
- SEBI (Investment Advisers) Regulations 2013 reg 9(3)–(4).
- Companies Act 2013 (India) s 8; RBI, Master Directions on Unified Payments Interface (UPI) (2024).
- SEBI (Investment Advisers) Regulations 2013 reg 24.
- Companies Act 2013 (India) s 8(1)(c).
- Reserve Bank of India, Guidelines on Regulation of Payment Aggregators and Payment Gateways (17 March 2020); NPCI, UPI Product Overview and Participant Guidelines (2025).