Countries once looked across their borders and measured danger through armies, missiles, and military alliances. Today the threat looks different. It comes from companies that hold more information about citizens than governments ever did. These platforms do not only sell ads or deliver content. They shape daily life. They know where we travel, who we talk to, our interests, our fears, and our habits. They build systems that watch and listen, then package that information into influence. They decide what we see, what we buy, and sometimes how we vote. They look less like businesses and more like private states.

This is the rise of the data-state. This is how a few technology firms turned vast piles of personal data into political power, economic control, and social influence. It is also what happens when laws try to catch up.

This didn’t happen by accident. The business model depends on prediction. These companies make most of their money from advertising. The better they can predict what we will click, buy, or watch, the more advertisers will pay. To predict well, they collect details about us: searches, likes, messages, purchases, locations, and more. Over time these details become a deep profile that can be used to sell things, shape opinions, or favor some businesses over others. Eventually we become a digital version of ourselves inside the company’s databases. That version is used to influence our behavior and sometimes to influence society.

The Cambridge Analytica scandal opened many eyes. A simple quiz app collected the personal details of millions of Facebook users and passed them to a political consultancy. The data was gathered through an app called “This Is Your Digital Life”, which accessed not only each user’s Facebook data but also the data of all their Facebook friends. In this way the app harvested up to 87 million Facebook profiles. Cambridge Analytica was alleged to have used this data to assist Donald Trump’s 2016 presidential campaign, to interfere with the UK’s Brexit referendum, and to interfere in elections in the Caribbean country of Trinidad and Tobago, among others1. That information was used to target voters, guide campaigns, and even change public opinion. Data that once looked harmless became a weapon for persuasion. Many people realized that private power had entered democratic space.

The numbers reveal the scale. Courts and regulators have found that Google controlled almost the entire desktop search market in some regions, meaning a single company could steer access to information for much of the world2. Amazon has faced investigation for using data from independent sellers to build its own competing products3. Such a move gives the company an unfair advantage and can hurt the smaller businesses that depend on its platform. When a company becomes the gateway to online life, the power of choice slowly shifts from people to the platform.

Governments have started reacting. Courts in different countries have begun treating data collection and surveillance as constitutional questions, not just commercial problems. Europe passed the Digital Markets Act (DMA)4 to stop the largest technology companies from favoring their own services and locking out competition. The DMA entered into force in 2022 and began to apply in 2023. It targets the big “gatekeepers”: the platforms that control access to digital markets and can shut competitors out.

The European Union’s General Data Protection Regulation (GDPR)5 is built on the idea that data belongs to the individual. It gives users the right to data portability, meaning a person can require a company to hand over their personal data so it can be moved elsewhere. That idea flips the power equation. Data is not a company asset. It is tied to autonomy and control.

The United States has filed major antitrust cases against firms accused of using unfair tactics to maintain dominance6. Regulators are starting to treat these companies as something more than private businesses; they are treated like public infrastructure. The U.S. Supreme Court took a similar turn in Carpenter v. United States (2018)7. The case involved police accessing historical cell-site location data without a warrant. The Court held that accessing such detailed location records was a search under the Fourth Amendment, which prohibits the government from conducting unreasonable searches and seizures8. Even though a private service provider held the data, the surveillance implications demanded constitutional protection.

This rights-based reading grew stronger in the Schrems I and Schrems II decisions of the Court of Justice of the European Union. The Court did not treat Facebook’s cross-border data transfers as a purely commercial matter. It held that American surveillance laws risked violating European fundamental rights to privacy and data protection, and it struck down the Safe Harbor and Privacy Shield arrangements9.

The Indian Supreme Court used the same rights language in Justice K.S. Puttaswamy v. Union of India (2017)10. The Court declared privacy a fundamental right and warned against letting data regimes create “invisible architectures of surveillance.” It was an explicit recognition that the threat does not come only from the State. Private companies collecting and analyzing data also have constitutional consequences. The judgment marked a shift in Indian law: away from treating data as property, and toward treating it as a condition for personal freedom, consent, and dignity.

China passed the Personal Information Protection Law (PIPL) in 202111. On paper it is strict. Companies must minimize the data they collect and get clear consent. It limits what private businesses can do with personal information. But the same system allows wide-scale government surveillance and has weak limits on state access. The law protects people from companies more than it protects people from the State.

India’s Digital Personal Data Protection Act (2023)12 follows a similar pattern. The law gives citizens new rights to access and correct their personal data and requires companies to follow safety rules. Brazil’s general data protection law, the Lei Geral de Proteção de Dados (LGPD), follows the European GDPR model: it treats privacy as a fundamental right and sets up an independent authority to supervise data protection.

Even then, the law struggles to keep pace. Technology changes faster than legislation. Companies redesign their systems, launch new products, and gather more data far faster than a new law can take shape.

This shift affects daily life. A small seller that depends on a platform might vanish if the platform favours its own product. A journalist can lose reach if an algorithm decides their stories are less important. A voter can see a stream of customized messages designed to persuade silently. All this shifts power from public institutions (e.g., courts, parliaments, local governments) to private corporations that are not always accountable in the same way.

Regulation may fix these issues, or at least slow this dominance. Some progress is visible. Courts are challenging monopoly tactics. New laws demand transparency. Audits, regulations, and fines have forced companies to adjust. But nothing about this battle is easy. Technology giants have resources that rival those of countries, and they can fight for years in court while adapting their business models13.

The current system waits for violations and then imposes fines. That model worked for traditional companies, but it may not work the same way for platforms that change their code every week and influence people every hour. We may need structural remedies, not punishment after the harm. Platforms should face routine algorithmic audits so their decision tools are not hidden. They should owe fiduciary duties, similar to lawyers or doctors, requiring them to act in the user’s best interests. There should be transparency-by-design rules so the public does not depend on leaks or scandals to understand how a platform works. The aim is to set limits in advance, before the damage spreads.

Young people need to understand what is at stake. Every search, like, and click feeds a profile that companies use to decide what you see and who sees you. These platforms shape culture, politics, and markets. Laws matter, but so does public awareness and everyday choices.

The data-state didn’t arrive with a declaration. It arrived quietly, through convenience, efficiency, and habit. The real danger is not when a platform becomes bigger than a government. The danger is when people trust it more than public institutions. This is not a battle between security and privacy. It is a contest between democracy and private governance.

The rise of a data-state shows how private tech can become a public problem. It is a new kind of power that needs new forms of accountability. That does not mean shutting down technology. It means designing rules and choices so that people, not a handful of companies, decide how we live together in a digital world.


By Bhaskar Dutta, 3rd Year LL.B., Faculty of Law, The Maharaja Sayajirao University of Baroda, Vadodara.

1 Christopher Wylie, “What role did Cambridge Analytica play in the Brexit vote?” DW.com, Mar. 27, 2018, available at dw.com.

2 Purdue Global Law School, U.S. v. Google: A Landmark Case and Warning Shot to Big Tech, Purdue Global Law School Blog (Sept. 3, 2025), https://www.purduegloballawschool.edu/blog/news/google-landmark-case (last visited Dec. 3, 2025).

3 Reuters, “SEC probes Amazon’s handling of employees’ use of sellers’ data for private labels – WSJ,” Reuters (6 Apr. 2022).

4 About the Digital Markets Act, European Commission, https://digital-markets-act.ec.europa.eu/about-dma_en (last visited Dec. 3, 2025).

5 GDPR.eu, What is GDPR, the EU’s new data protection law?, https://gdpr.eu/what-is-gdpr/ (last visited Dec. 3, 2025).

6 Federal Trade Commission v. Facebook, Inc. (now FTC v. Meta Platforms, Inc.), Civil Action No. 20-cv-3590 (D.D.C.) (filed Dec. 9, 2020).

7 Carpenter v. United States, 585 U.S. 296 (2018).

8 Constitution.findlaw.com, “Fourth Amendment (Amendment IV) to the United States Constitution”, available at https://constitution.findlaw.com/amendment4.html (last visited Dec. 3, 2025).

9 Steven Peers and Max Schrems, “EU-US Data Transfer, Safe Harbour & Privacy Shield” (European Papers, 2024), https://www.europeanpapers.eu/e-journal/eu-us-data-transfer-safe-harbour-privacy-shield (last visited Dec. 3, 2025).

10 Vrinda Bhandari, Amba Kak, Smriti Parsheera & Faiza Rahman, “An Analysis of Puttaswamy: The Supreme Court’s Privacy Verdict,” IndraStra Global 3(4), ISSN 2381-3652.

11 Personal Information Protection Law of the People’s Republic of China, promulgated by the Standing Committee of the 13th National People’s Congress on 20 August 2021.

12 Digital Personal Data Protection Act, 2023 (Act No. 22 of 2023).

13 Department of Justice, “Department of Justice Wins Significant Remedies Against Google,” Justice.gov, available at https://www.justice.gov/opa/pr/department-justice-wins-significant-remedies-against-google (last visited Dec. 3, 2025).