imagine having rights (or, digital privacy in the united states vs. the eu)
For my final paper for my course on digital surveillance, I chose to analyze the legal landscape of the United States and European Union. The course was rooted in the U.S. legal system, and really hammered home that private corporations are spying on you all the time, that data brokers sell all their data to the U.S. government, meaning that the U.S. government is effectively spying on your digital activities all the time, and that it is perfectly legal for the U.S. government to do so,1 whether you live in the U.S. or not, as long as the surveillance is in the name of national security.2
Nearly every lecture strengthened my urgent desire to get out of this country. My professor would mention the few privacy safeguards Americans have3 and then add "...which was dissolved on January 21." But it's got to be better in other countries, right? Now that I know exactly how bad things are with digital surveillance over here, what kinds of rights do Europeans have? It's comforting to know that, under their legal system at any rate, the answer is "a lot."
The paper that I actually turned in was done as a partner project, but I will publish here only the parts that I wrote (slightly revised so that they stand better on their own).
Digital Privacy in the US vs. the EU: A Side-by-Side Comparison
"Digital privacy" is something that has a completely different meaning depending on the legal system one lives under. As the internet has grown both in complexity and surveillance capacity, new laws have sprung up to regulate it — but the protections they offer are not the same everywhere. This paper investigates the rights that individuals in the European Union and the United States do and do not have when it comes to defending their privacy.
The American Legal Landscape: Digital Privacy
The United States Constitution establishes a limited right to privacy insofar as privacy can be defined as "the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures[.]" Under the Fourth Amendment, intrusions into one's personal domicile or searches of one's possessions require a warrant to be issued on "probable cause" that the findings will lead to the discovery of evidence of a serious crime. However, the Fourth Amendment, ratified in 1791, says nothing about the digital sphere.
Security in one's devices?
The 2014 case Riley v. California extended "papers and effects" to include the files stored on a locked cell phone. In this case, which made its way to the Supreme Court, Chief Justice John G. Roberts declared that "officers must generally secure a warrant" before conducting a search of the data on cell phones.4 Notably, the Court allowed for exceptions in the case of an emergency, a provision which in the year 2025 has led to numerous examples of warrantless searches which would normally be considered unlawful.5
Basic Principles of American Privacy Law
To understand the landscape of American digital privacy, it's critical to look at the fundamental legal principles around privacy. The 1967 case Katz v. United States, which determined whether it was legal for the police to remotely eavesdrop without a warrant on Charles Katz while he was in a phone booth, expanded the Fourth Amendment's protections of "security" to any place where a person has a "reasonable expectation of privacy."6 While encouraging, this standard is, in practice, too subjective to reliably offer much protection.
The year after Katz was decided, the Federal Wiretap Act (also known as Title III) was passed. This act outlaws the intentional interception, use, or disclosure of any wire, oral, or electronic communication, regardless of whether the interception attempt succeeds.7 The only exceptions to this rule are telecommunication service providers and officers of the law who have a warrant. This is one of the only restrictions on the information that private companies can collect on individuals—they cannot literally listen in. Yet there's practically no limit on the other information companies can collect on individuals in America.
A public-private distinction
Law in America primarily restricts the government in its surveillance of its citizens: while wiretaps are illegal when performed by private companies or individuals, the amount of information that can be collected on individuals' digital activities by private companies is practically limitless under third-party doctrine.
Third-party doctrine states that "a person has no legitimate expectation of privacy in information he voluntarily turns over to third parties." The case which decided this, Smith v. Maryland, ruled that individuals do not have a "reasonable expectation of privacy" in the phone numbers that they dial when placing a call, since they voluntarily provide that information to the phone company. Under third-party doctrine, this was extended to mean that such information voluntarily held by a third party can legally be recovered by the U.S. government, without a warrant.
This notion of "voluntary," however, is very weak. Generally, all cases short of certain death should the information not be shared (i.e., if an individual is bleeding to death and will be denied medical care without signing a document first) are considered "voluntary" by American legal standards.8 Although it is well known that users of online services rarely, if ever, read the terms and conditions of use of the platforms for which they are signing up, selecting a box marked "Accept" is considered legal consent, even though there is no way to access those services without "consenting" to the processing of personal data. In the American system, therefore, "consent" carries very little meaning.
Content vs. Metadata
It's important to realize the full implications of third-party doctrine. What it means is that while the content of a phone call is protected, all metadata associated with it is legally up for grabs. Since the numbers that a person dials are considered metadata, not content, they aren't protected — even though an individual has no way to make a call without specifying a phone number. The numbers one dials can still reveal a great deal of sensitive information — for example, dialing the number of an abortion clinic could be considered evidence of a crime in Texas, where so much as "aiding and abetting" a person seeking an abortion is illegal. But there's literally no way to conceal that information from Texas law enforcement, who under third-party doctrine can get that information from the phone company simply by asking.
The European Digital Privacy Landscape
European law, however, takes a far more protective approach to guarding individuals' privacy, as can be seen in EU-wide legislation such as the General Data Protection Regulation (GDPR).
GDPR
The GDPR, which went into effect on May 25, 2018, is an EU-wide law that governs the processing of the personal data of persons in the EU. The GDPR's definitions emphasize that its regulations are far-reaching, binding any organization or entity that processes the personal data of persons in the EU, even if that processing happens on foreign soil9 and even when those organizations are foreign.10 Entities collecting data are referred to as "controllers." However, member states' activities that fall under the EU's common foreign and security policy are explicitly excluded from its scope.11 Data processing by the EU's own institutions, bodies, offices, and agencies is instead regulated by Regulation (EU) 2018/1725, which will be discussed in a later section.12
Importantly, the GDPR is explicitly intended to be technology-neutral, meaning that it will continue to apply even to technologies that were not invented at the time of its framing.
The GDPR lays out several core rights, which will be discussed in later sections:
- The right to be informed
- The right of access
- The right to erasure ("right to be forgotten")
- The right to restrict processing
- The right to data portability
- The right to object
When is it legal to collect user data?
Article 6 of the GDPR stipulates that "processing shall be lawful only if and to the extent that at least one of the following applies:
a. the data subject has given consent to the processing of his or her personal data for one or more specific purposes;
b. processing is necessary for the performance of a contract to which the data subject is party [...]
c. processing is necessary for compliance with a legal obligation to which the controller is subject;
d. processing is necessary in order to protect the vital interests of the data subject or of another natural person;
e. processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
f. processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child."
As can be seen, there are circumstances under which it is permissible to collect data about an individual without their consent. In particular, "in the public interest" and "legitimate interests" are rather fuzzy terms. However, most businesses prefer not to risk violating the GDPR, and so they secure consent from their EU users with a simple cookie banner.
Definition of consent
The GDPR defines consent as "any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."13 Furthermore, the GDPR stipulates that in order to determine whether consent was "freely given," "utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract."14
What kind of user data can legally be collected?
Sensitive data
Article 9 of the GDPR lays out restrictions on when organizations can collect "special categories" of sensitive data, meaning "personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation." The GDPR has a blanket ban on the collection of any of this data, unless the data subject has given "explicit consent to the processing of those personal data for one or more specified purposes" (emphasis added). However, additional carveouts are given for a variety of other circumstances, such as if the data was "manifestly made public" by the individual in question, or when processing is necessary for the establishment of legal claims.
There's also an exception for data that's "of public interest in the area of public health," so information about the vaccination status of a person in the EU is not protected under this regulation. Significant room is given for individual Member States to adjust this section.
What happens when user data is collected?
Reading you your rights
Upon collecting personal data from a data subject, the collecting organization is required to tell the subject the purpose and legal basis for collecting that data, how long it will be stored, and who it will be shared with. Furthermore, the organization must provide the subject with its contact information, as well as the contact details of a data protection officer.15 The organization must inform the subject of their right to object, their right to request access to their data or ensure its erasure, their right to data portability, and their right to lodge a complaint with a supervisory authority. As written, this section is rather reminiscent of an American police officer reading a suspect their rights and providing a lawyer once they've been taken into custody — but in this case, it's the data that's being held, rather than a person.
Right of access
The GDPR includes a provision allowing individuals to obtain from companies a copy of their data. However, this creates an opening for abuse: an unauthorized individual can claim to be someone else in order to obtain that person's information. It's up to companies to verify a user's identity when they request a copy of their data — and many companies fail to authenticate customers properly, leaving a gaping security vulnerability.16
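The verification gap can be made concrete with a small sketch. Everything below is hypothetical — the GDPR does not prescribe a verification mechanism, it only makes the controller responsible for getting it right — but it shows the kind of check that Pavur and Knerr found many companies skip.

```python
# Hypothetical sketch of handling a GDPR "right of access" request.
# All names and the data model are invented for illustration.

def handle_access_request(requester_email, verified, records):
    """Return the requester's data only if their identity is verified.

    requester_email: address the request claims to come from
    verified: True once the requester has proven control of that address
              (e.g. by clicking a link sent to it)
    records: mapping of registered email -> stored personal data
    """
    if not verified:
        # Releasing data on an unverified claim is exactly the
        # vulnerability Pavur and Knerr demonstrated.
        return {"status": "pending", "detail": "identity not verified"}
    if requester_email not in records:
        return {"status": "no_data"}
    return {"status": "ok", "data": records[requester_email]}


records = {"alice@example.com": {"name": "Alice", "orders": 3}}

# An impersonator who merely claims to be Alice gets nothing:
print(handle_access_request("alice@example.com", False, records))
# Only a verified requester receives the export:
print(handle_access_request("alice@example.com", True, records))
```

The point of the sketch is that the burden sits entirely on the controller's implementation — a company that skips the `verified` check is still "complying" with the right of access while handing anyone's data to anyone.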
Individuals also have the right to know whether they are being subjected to automated decision-making, including profiling. If they are, they are entitled to "meaningful information about the logic involved."17 It will be interesting to see how companies manage to comply with this, as automated decision-making by AI is becoming increasingly common, despite AI notoriously offering little transparency or even consistency in its decision-making.
Right to data portability
Individuals can request that the data they have provided to a controller be sent to them in a "structured, commonly used and machine-readable format." This is empowering for customers who would like to switch to a different provider of a service without losing the information built up over their relationship with the original provider. Users of a messaging app, for example, can request a copy of their entire message history (often delivered as a .csv file), so they can keep their messages if they choose to delete their account.
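As a sketch of what a "structured, commonly used and machine-readable format" might look like in practice, the snippet below writes a made-up message history to CSV using Python's standard library. The sample data is invented; the point is that a standard format lets any other tool read the export back.

```python
import csv
import io

# Invented sample data standing in for a user's message history.
messages = [
    {"timestamp": "2024-01-05T09:12:00", "to": "bob", "body": "See you at noon"},
    {"timestamp": "2024-01-05T09:15:30", "to": "carol", "body": "Running late"},
]

# Write the history in a structured, machine-readable form (CSV).
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["timestamp", "to", "body"])
writer.writeheader()
writer.writerows(messages)
export = buffer.getvalue()
print(export)

# Because the format is standard, any other program can restore it:
restored = list(csv.DictReader(io.StringIO(export)))
assert restored == messages
```

This interoperability is exactly what makes portability meaningful: a competing service can ingest the same file without any cooperation from the original provider.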
Right to be forgotten
Individuals possess the right to force companies and other entities to delete the data they've stored about them if they withdraw their consent or object to the data being processed; when this happens, the data controller must comply. However, there are numerous exceptions where the right to erasure does not apply, such as "to the extent that processing is necessary … for the establishment, exercise, or defence of legal claims."18 There's also an exception for when data is of public interest in the area of public health. Information about the vaccination status of a person in the EU, for example, would not need to be erased even if that person had requested its erasure.
Right to object
Additionally, if a subject objects to the processing of their personal data, the entity collecting the data must stop collecting it, unless they can demonstrate compelling legitimate grounds for not doing so.19 An individual can specifically object to data being used for direct marketing purposes, which includes profiling.20 In fact, a data subject can even set up a script to automatically object to marketing on their behalf, according to Article 21(5), which permits a subject to "exercise his or her right to object by automated means using technical specifications."21
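Article 21(5) does not define a concrete protocol, so the following is purely illustrative: a script that composes a machine-readable objection to direct marketing under Article 21(2). The payload shape, field names, and the idea of a controller-side endpoint are all invented for this sketch.

```python
import json

def build_objection(subject_email, controller):
    """Compose a machine-readable objection to direct marketing.

    The payload shape is hypothetical; Article 21(5) only says the
    right may be exercised "by automated means using technical
    specifications" and does not standardize a format.
    """
    return {
        "type": "objection",
        "legal_basis": "GDPR Article 21(2)",
        "scope": "direct marketing, including profiling",
        "data_subject": subject_email,
        "controller": controller,
    }

payload = json.dumps(build_objection("alice@example.com", "Example Corp"), indent=2)
print(payload)
# A real script would now submit this to the controller's published
# objection endpoint, if it offers one.
```

In other words, a privacy-conscious data subject could loop such a script over every controller that markets to them — the regulation explicitly blesses objecting in bulk by automated means.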
Regulation 2018/1725: GDPR, but for EU institutions
Regulation 2018/1725 is the version of the GDPR that applies to the EU's own institutions, bodies, offices, and agencies. In large part, it is virtually identical to the GDPR in the rights that individuals enjoy over their data: it includes a right to erasure, with the same exceptions for when that right does not apply; and like the GDPR, when EU institutions collect a subject's data, they are required to notify that subject of the information to be collected, how long it will be stored, who it will be shared with, and all the other information stipulated in the GDPR. EU institutions must document the information they keep in a centralized, publicly accessible register.22
Like in the GDPR, personal data must be collected for "specified, explicit, and legitimate purposes,"23 as well as "lawfully, fairly, and in a transparent manner in relation to the data subject."24 As in the GDPR, however, lawfulness does not always require the data subject's consent, so long as the processing is necessary for a contract, a legal obligation, the subject's vital interests, or a task carried out in the public interest or in the controller's official authority.
However, Regulation 2018/1725 includes a chapter on the processing of personal data by EU agencies when investigating criminal matters, which is much more lax in its restrictions on those agencies. Processing of sensitive data is allowed where it is "strictly necessary for operational purposes," and discrimination is prohibited. Police officers do not even need to read a suspect their rights, à la GDPR Article 13, if doing so would endanger the public or the national security of member states.25 Police and other government agents can also deny a suspect the right of access in order to protect the national security of member states or the rights and freedoms of others (such as victims and witnesses).26
Legacy of the GDPR: The California Consumer Privacy Act
The GDPR was a groundbreaking document when it was first adopted in 2016, and inspired California to pass similar legislation. While the GDPR's breadth has caused many companies to extend its protections to all of their customers out of fear of lawsuits, the CCPA's jurisdiction does not extend anywhere outside of California: the "consumer" protected by the California Consumer Privacy Act is defined as "a natural person who is a California resident."
The California Consumer Privacy Act grants California residents:
- The right to delete personal information
- The right to know what personal information is being collected
- The right to opt out of the sale or sharing of personal information
Definition of consent
Like the GDPR, the CCPA requires "consent" to be "freely given, specific, informed, and unambiguous"; unlike the GDPR, consent must be "for a narrowly defined particular purpose."27 The CCPA specifically excludes acceptance of a general terms-of-use or similar document from qualifying as "consent," as well as excluding "agreement obtained through use of dark patterns." However, a business does not need to get a user's consent before collecting their data—consent only comes into play once a user sends an opt-out or deletion request.
Right to know what personal information is being collected
The CCPA stipulates the information that businesses must provide California consumers if they are collecting their personal information: the "categories of personal information to be collected," the purposes for which they will be collected or used, and whether that information is sold or shared. However, "categories" is a rather broad term: businesses can merely label information as "marketing cookies" or "statistical cookies," which tells the consumer very little about what information is actually being collected.
Unlike the GDPR, there is no requirement for companies to state a specific purpose for the information they are collecting — and yet, consent can only be "for a narrowly defined particular purpose." It's difficult for a customer to meaningfully consent without full knowledge of what they are consenting to.
What this looks like in everyday life
The CCPA offers a clear solution for how companies can comply with the right to be informed – by "providing the required information prominently and conspicuously on the homepage of its internet website." In practice, this usually takes the form of a banner at the bottom of the screen, with a link that users can click on to access this information. Consumers do see this notice: according to a study performed by Consumer Action, 86% of survey respondents said they saw the notice of their rights on websites they visited at least some of the time.28
Right to delete personal information
California consumers can "request that a business delete any personal information about the consumer which the business has collected from the consumer." However, this doesn't include information that the business has collected about the consumer from other entities such as data brokers. Therefore, it's rather limited in what it can accomplish.
Businesses that receive a "verifiable consumer request" to delete a consumer's information "shall delete the consumer's personal information from its records" and must also notify all third parties with which the information has been shared of the CCPA request – "unless this … involves disproportionate effort."29 Companies, then, have explicit permission not to comply by complaining that it would be too much work.
Furthermore, unlike in the GDPR, nothing stipulates a time frame within which companies must delete this data. Companies can delay the deletion of personal data almost indefinitely.
While it is possible for privacy-conscious individuals to claim that they are from California in an endeavor to pressure a business to delete their personal data, the term "verifiable" precludes companies from having to do anything at all if they cannot be sure that the individual is, in fact, from California. This leads to an unfortunate paradox where, in order to protect their privacy, a California user has to give up their privacy to prove their state of residence. On the other hand, companies with improper verification procedures risk granting a third party unauthorized access to personal information that could be used for identity theft or stalking.30
Exceptions to the right to delete personal information
Even if a user is who they claim to be, businesses are free from an obligation to delete that user's data if it is "reasonably necessary … in order to:
- Complete the transaction for which the personal information was collected, fulfill the terms of a written warranty or product recall conducted in accordance with federal law, provide a good or service requested by the consumer, or reasonably anticipated by the consumer within the context of a business’ ongoing business relationship with the consumer, or otherwise perform a contract between the business and the consumer.
- Help to ensure security and integrity to the extent the use of the consumer’s personal information is reasonably necessary and proportionate for those purposes.
- Debug to identify and repair errors that impair existing intended functionality.
- Exercise free speech, ensure the right of another consumer to exercise that consumer’s right of free speech, or exercise another right provided for by law.
- Comply with the California Electronic Communications Privacy Act […]
- Engage in public or peer-reviewed scientific, historical, or statistical research [...] when the business’ deletion of the information is likely to render impossible or seriously impair the ability to complete such research, if the consumer has provided informed consent.
- To enable solely internal uses that are reasonably aligned with the expectations of the consumer based on the consumer’s relationship with the business and compatible with the context in which the consumer provided the information.
- Comply with a legal obligation."31
These exceptions are rather far-reaching. How much information about the specifications of an individual's computer can be captured under the guise of "debugging?" What kind of "solely internal uses" might be deemed "reasonably aligned with the expectations of the consumer?" Companies have a great deal of leeway to interpret these caveats to suit their business purposes.
Right to Opt Out of Sale or Sharing of Personal Information
The CCPA requires businesses to have a "clear and conspicuous" link on their homepage titled “Do Not Sell or Share My Personal Information,” where customers can opt out of their data being shared by clicking on it. Consumers have the right to opt out at any time. Furthermore, businesses are not legally allowed to sell or share the personal information of a consumer if the business knows that the consumer is less than 16 years of age — and a business that willfully disregards the consumer's age is deemed to have known it.32 Violations where the business sells the data of minors result in fines of up to $7,500 per violation — a considerable incentive for companies to actually comply.
Equal protection for Californians who exercise their rights
The CCPA does have teeth, so to speak: it prohibits businesses from retaliating or discriminating against customers who exercise their CCPA rights, and it forces companies found to be violating the CCPA to "cease and desist" and pay fines of up to $7,500 for each intentional violation of the CCPA.33
Conclusion
The differences between the legal landscapes of digital privacy in the European Union and the United States boil down to their fundamentally diverging philosophies. Third-party doctrine has been the crux of the American landscape of digital surveillance — without the notion that users cannot expect privacy in data they have "voluntarily" shared, the industry in which Google Analytics operates could never have flourished.
Whereas the EU treats privacy as a fundamental human right, embedding transparency, user control, and informed consent into the design of digital interactions, the United States leaves it to states like California to establish (limited) versions of the GDPR, whose protections apply only to residents of those states.
These contrasting approaches have profound implications for everyday users' online realities. From the ubiquity or absence of cookie banners to the meaning or lack thereof of "consent" to store one's data, users face very different experiences depending on their jurisdiction. It's up to users to know their rights — and exercise them.
Some exceptions where it's illegal for the U.S. government to spy on you: when you are talking on your phone in a private setting. This is called a "wiretap," and it is only legal if law enforcement has obtained a warrant beforehand to wiretap you specifically. You're also secure in your own house and "in your person" under the Fourth Amendment — but almost everything in the digital sphere is fair game.↩
See the Foreign Intelligence Surveillance Act, which permits U.S. surveillance of basically anyone in any foreign country, and especially the Patriot Act, which greatly expanded the information the U.S. gov't is allowed to collect on its citizens and also reduced the rights of everyone passing through the border. (Its section on border security really set things up for the abuses of power we're seeing today in 2025 along the U.S. border...)↩
Namely, the PCLOB (Privacy and Civil Liberties Oversight Board), a committee appointed to oversee FBI/CIA surveillance of Americans and make sure that people's rights aren't violated, was gutted one week after Trump took office.↩
Riley v. California, 573 U.S. 373 (2014), p. 10.↩
Breeden, Aurelien. "U.S. Turned Away French Scientist Over Views on Trump Policies, France Says." The New York Times, Mar. 20, 2025. https://www.nytimes.com/2025/03/20/world/europe/us-france-scientist-entry-trump-messages.html.↩
18 U.S.C. § 2511 – "Interception and disclosure of wire, oral, or electronic communications prohibited." Legal Information Institute, Cornell Law School. https://www.law.cornell.edu/uscode/text/18/2511.↩
Citing my professor here, who I'm not going to name in order to preserve my privacy.↩
GDPR, Article 3 (2, 3)↩
GDPR, Article 2(2): "This Regulation does not apply to the processing of personal data: ... (b) by the Member States when carrying out activities which fall within the scope of Chapter 2 of Title V of the TEU." I'll save you some googling and tell you that Chapter 2 of Title V of the Treaty on European Union is entitled "Specific provisions on the common foreign and security policy."↩
GDPR, Article 2 (3): "For the processing of personal data by the Union institutions, bodies, offices and agencies, Regulation (EC) No 45/2001 applies." However, Regulation (EC) No 45/2001 was repealed in 2018 and replaced by Regulation (EU) 2018/1725. (Yes, I was very meticulous with this paper :)↩
GDPR, Article 4(11)↩
GDPR, Article 7(4)↩
GDPR, Article 13(1, 2)↩
Pavur, James, and Knerr, Casey. "GDPArrrrr: Using Privacy Laws to Steal Identities." Blackhat USA, 2019. https://i.blackhat.com/USA-19/Thursday/us-19-Pavur-GDPArrrrr-Using-Privacy-Laws-To-Steal-Identities-wp.pdf.↩
GDPR, Article 15(1)(h)↩
Regulation (EU) 2018/1725, Article 19(3)(e)↩
GDPR, Article 21(1)↩
GDPR, Article 21(2, 3)↩
GDPR, Article 21(5): The data subject may "exercise his or her right to object by automated means using technical specifications."↩
Regulation 2018/1725, Article 31↩
Regulation 2018/1725, Article 4(1)(b)↩
Regulation 2018/1725, Article 4(1)(a)↩
Regulation 2018/1725, Article 79(3)↩
Regulation 2018/1725, Article 81(1)↩
California Consumer Privacy Act, 1798.140(h)↩
"Executive Summary: Survey Shows Too Many Californians are Still Unaware of Their Rights." The California Privacy Initiative, 2022. www.consumer-action.org/downloads/english/CCPA-Privacy-Rights-Survey-Summary.pdf.↩
CCPA, 1798.105(c)(1)↩
Bai, Annie, and Peter McLaughlin. "Why the CCPA’s ‘verified Consumer Request’ Is a Business Risk." IAPP, 14 Aug. 2019, iapp.org/news/a/verified-consumer-request-dont-naively-slip-into-the-crack-or-is-it-a-chasm/.↩
CCPA, 1798.105(d)↩
CCPA, 1798.120(c)↩
CCPA, 1798.155(a)↩