General Data Protection Regulation

The General Data Protection Regulation (EU) 2016/679 (GDPR) is a regulation in EU law on data protection and privacy in the European Union (EU) and the European Economic Area (EEA). It also addresses the transfer of personal data outside the EU and EEA areas. The GDPR’s primary aim is to give individuals control over their personal data and to simplify the regulatory environment for international business by unifying the regulation within the EU. Superseding the Data Protection Directive 95/46/EC, the regulation contains provisions and requirements related to the processing of personal data of individuals (formally called data subjects in the GDPR) who are located in the EEA, and applies to any enterprise—regardless of its location and the data subjects’ citizenship or residence—that is processing the personal information of individuals inside the EEA.

Controllers and processors of personal data must put in place appropriate technical and organizational measures to implement the data protection principles. Business processes that handle personal data must be designed and built with these principles in mind and must provide safeguards to protect data (for example, using pseudonymization or full anonymization where appropriate). Data controllers must design information systems with privacy in mind, for instance by using the highest possible privacy settings by default, so that datasets are not publicly available by default and cannot be used to identify a subject. No personal data may be processed unless this processing is done under one of the six lawful bases specified by the regulation (consent, contract, public task, vital interest, legitimate interest, or legal requirement). When the processing is based on consent, the data subject has the right to revoke it at any time.

Data controllers must clearly disclose any data collection, declare the lawful basis and purpose for data processing, and state how long data is being retained and whether it is being shared with any third parties or outside of the EEA. Firms are obliged to protect the data of employees and consumers so that only the necessary data is collected, with minimal interference in the data privacy of employees, consumers, or third parties. Firms should also have internal controls and policies for departments such as audit, internal control, and operations. Data subjects have the right to request a portable copy of the data collected by a controller in a common format, as well as the right to have their data erased under certain circumstances. Public authorities, and businesses whose core activities consist of regular or systematic processing of personal data, are required to employ a data protection officer (DPO), who is responsible for managing compliance with the GDPR. Businesses must report data breaches to national supervisory authorities within 72 hours if they have an adverse effect on user privacy. In some cases, violators of the GDPR may be fined up to €20 million or up to 4% of the annual worldwide turnover of the preceding financial year in the case of an enterprise, whichever is greater.

The GDPR was adopted on 14 April 2016, and became enforceable beginning 25 May 2018. As the GDPR is a regulation, not a directive, it is directly binding and applicable, but does provide flexibility for certain aspects of the regulation to be adjusted by individual member states.

The regulation became a model for many national laws outside the EU, including in Chile, Japan, Brazil, South Korea, Argentina, and Kenya. The California Consumer Privacy Act (CCPA), adopted on 28 June 2018, has many similarities with the GDPR.

Contents

The GDPR 2016 has eleven chapters, concerning general provisions, principles, rights of the data subject, duties of data controllers or processors, transfers of personal data to third countries, supervisory authorities, cooperation among member states, remedies, liability or penalties for breach of rights, and miscellaneous final provisions.

General provisions

The regulation applies if the data controller (an organisation that collects data from EU residents), or processor (an organisation that processes data on behalf of a data controller, such as cloud service providers), or the data subject (person) is based in the EU. Under certain circumstances, the regulation also applies to organisations based outside the EU if they collect or process personal data of individuals located inside the EU. The regulation does not apply to the processing of data by a person for a “purely personal or household activity and thus with no connection to a professional or commercial activity.” (Recital 18)

According to the European Commission, “Personal data is information that relates to an identified or identifiable individual. If you cannot directly identify an individual from that information, then you need to consider whether the individual is still identifiable. You should take into account the information you are processing together with all the means reasonably likely to be used by either you or any other person to identify that individual.” The precise definitions of terms such as “personal data”, “processing”, “data subject”, “controller”, and “processor” are stated in Article 4 of the Regulation.

The regulation does not purport to apply to the processing of personal data for national security activities or law enforcement of the EU; however, industry groups concerned about facing a potential conflict of laws have questioned whether Article 48 of the GDPR could be invoked to seek to prevent a data controller subject to a third country’s laws from complying with a legal order from that country’s law enforcement, judicial, or national security authorities to disclose to such authorities the personal data of an EU person, regardless of whether the data resides in or out of the EU. Article 48 states that any judgement of a court or tribunal and any decision of an administrative authority of a third country requiring a controller or processor to transfer or disclose personal data may not be recognised or enforceable in any manner unless based on an international agreement, like a mutual legal assistance treaty in force between the requesting third (non-EU) country and the EU or a member state. The data protection reform package also includes a separate Data Protection Directive for the police and criminal justice sector that provides rules on personal data exchanges at national, European, and international levels.

A single set of rules applies to all EU member states. Each member state establishes an independent supervisory authority (SA) to hear and investigate complaints, sanction administrative offences, etc. SAs in each member state co-operate with other SAs, providing mutual assistance and organising joint operations. If a business has multiple establishments in the EU, it must have a single SA as its “lead authority”, based on the location of its “main establishment” where the main processing activities take place. The lead authority thus acts as a “one-stop shop” to supervise all the processing activities of that business throughout the EU (Article 56 of the GDPR). A European Data Protection Board (EDPB) co-ordinates the SAs. The EDPB thus replaces the Article 29 Data Protection Working Party. There are exceptions for data processed in an employment context or in national security that still might be subject to individual country regulations (Articles 2(2)(a) and 88 of the GDPR).

Principles

Unless a data subject has provided informed consent to data processing for one or more purposes, personal data may not be processed unless there is at least one other legal basis for doing so. Article 6 states that the lawful bases are:

  • (a) If the data subject has given consent to the processing of his or her personal data;
  • (b) To fulfill contractual obligations with a data subject, or for tasks at the request of a data subject who is in the process of entering into a contract;
  • (c) To comply with a data controller’s legal obligations;
  • (d) To protect the vital interests of a data subject or another individual;
  • (e) To perform a task in the public interest or in the exercise of official authority;
  • (f) For the legitimate interests of a data controller or a third party, unless these interests are overridden by the interests of the data subject or his or her rights according to the Charter of Fundamental Rights (especially in the case of children).

If informed consent is used as the lawful basis for processing, consent must have been explicit for data collected and each purpose data is used for (Article 7; defined in Article 4). Consent must be a specific, freely-given, plainly-worded, and unambiguous affirmation given by the data subject; an online form which has consent options structured as an opt-out selected by default is a violation of the GDPR, as the consent is not unambiguously affirmed by the user. In addition, multiple types of processing may not be “bundled” together into a single affirmation prompt, as this is not specific to each use of data, and the individual permissions are not freely-given. (Recital 32)

Data subjects must be allowed to withdraw this consent at any time, and the process of doing so must not be harder than it was to opt in (Article 7(3)). A data controller may not refuse service to users who decline consent to processing that is not strictly necessary in order to use the service (Article 7(4)). Consent for children, defined in the regulation as being under 16 years old (although member states may individually lower this to as low as 13 years old (Article 8(1))), must be given by the child’s parent or custodian, and must be verifiable (Article 8).
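As a rough illustration of these requirements, the following sketch (hypothetical field and function names, not anything prescribed by the regulation) records consent per subject and per purpose, never pre-selects opt-in, and makes withdrawal as simple as the original grant:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """One record per data subject *and* per purpose (no bundling)."""
    subject_id: str
    purpose: str                             # e.g. "newsletter", "analytics"
    granted_at: Optional[datetime] = None    # None => never opted in (the default state)
    withdrawn_at: Optional[datetime] = None

    def grant(self) -> None:
        # Consent must be an affirmative act; it is never pre-selected.
        self.granted_at = datetime.now(timezone.utc)
        self.withdrawn_at = None

    def withdraw(self) -> None:
        # Withdrawal must be as easy as giving consent (Article 7(3)).
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def is_valid(self) -> bool:
        return self.granted_at is not None and self.withdrawn_at is None

# Separate prompts for separate purposes: consenting to one does not imply the other.
newsletter = ConsentRecord(subject_id="u-123", purpose="newsletter")
analytics = ConsentRecord(subject_id="u-123", purpose="analytics")
newsletter.grant()
print(newsletter.is_valid, analytics.is_valid)  # True False
```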

If consent to processing was already provided under the Data Protection Directive, a data controller does not have to re-obtain consent if the consent was documented and obtained in compliance with the GDPR’s requirements (Recital 171).

Rights of the data subject

Transparency and modalities

Article 12 requires that the data controller provides information to the “data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child.”

Information and access

The right of access (Article 15) is a data subject right. It gives people the right to access their personal data and information about how this personal data is being processed. A data controller must provide, upon request, an overview of the categories of data that are being processed (Article 15(1)(b)) as well as a copy of the actual data (Article 15(3)); furthermore, the data controller has to inform the data subject on details about the processing, such as the purposes of the processing (Article 15(1)(a)), with whom the data is shared (Article 15(1)(c)), and how it acquired the data (Article 15(1)(g)).

A data subject must be able to transfer personal data from one electronic processing system to another, without being prevented from doing so by the data controller. Data that has been sufficiently anonymised is excluded, but data that has been only de-identified yet remains possible to link to the individual in question, such as by providing the relevant identifier, is not. In practice, however, providing such identifiers can be challenging, such as in the case of Apple’s Siri, where voice and transcript data is stored with a personal identifier that the manufacturer restricts access to, or in online behavioural targeting, which relies heavily on device fingerprints that can be challenging to capture, send, and verify.

Both data being ‘provided’ by the data subject and data being ‘observed’, such as about behaviour, are included. In addition, the data must be provided by the controller in a structured and commonly used standard electronic format. The right to data portability is provided by Article 20 of the GDPR.
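A hypothetical sketch of what such a portable export could look like, covering both ‘provided’ and ‘observed’ data in a structured, commonly used format such as JSON (the field names below are illustrative assumptions, not mandated by Article 20):

```python
import json
from datetime import datetime, timezone

def export_subject_data(provided: dict, observed: list) -> str:
    """Bundle provided and observed personal data into a machine-readable export."""
    package = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "format_version": "1.0",
        "provided_data": provided,   # e.g. profile fields the subject entered
        "observed_data": observed,   # e.g. behavioural events logged about them
    }
    return json.dumps(package, indent=2, ensure_ascii=False)

print(export_subject_data(
    provided={"name": "Alice Example", "email": "alice@example.org"},
    observed=[{"event": "login", "timestamp": "2018-05-25T09:00:00Z"}],
))
```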

Rectification and erasure

A right to be forgotten was replaced by a more limited right of erasure in the version of the GDPR that was adopted by the European Parliament in March 2014. Article 17 provides that the data subject has the right to request erasure of personal data related to them on any one of a number of grounds within 30 days, including noncompliance with Article 6(1) (lawfulness), which includes the case (f) where the legitimate interests of the controller are overridden by the interests or fundamental rights and freedoms of the data subject, which require protection of personal data (see also Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González).

Right to object and automated decisions

Article 21 of the GDPR allows an individual to object to the processing of personal information for marketing, sales, or non-service-related purposes. This means the data controller must allow an individual the right to stop or prevent the controller from processing their personal data.

There are some instances where this objection does not apply. For example if:

  1. Legal or official authority is being carried out;
  2. “Legitimate interest”, where the organisation needs to process data in order to provide the data subject with a service they signed up for;
  3. A task is being carried out in the public interest.

The GDPR is also clear that the data controller must inform individuals of their right to object from the first communication the controller has with them. This should be clear and separate from any other information the controller is providing, and should give them their options for how best to object to the processing of their data.

There are instances where the controller can refuse a request, namely in circumstances where the objection request is “manifestly unfounded” or “excessive”, so each case of objection must be looked at individually.

Controller and processor

To be able to demonstrate compliance with the GDPR, the data controller must implement measures which meet the principles of data protection by design and by default. Article 25 requires data protection measures to be designed into the development of business processes for products and services. Such measures include pseudonymising personal data, by the controller, as soon as possible (Recital 78). It is the responsibility and the liability of the data controller to implement effective measures and be able to demonstrate the compliance of processing activities even if the processing is carried out by a data processor on behalf of the controller (Recital 74).

When data is collected, data subjects must be clearly informed about the extent of data collection, the legal basis for processing of personal data, how long data is retained, if data is being transferred to a third-party and/or outside the EU, and any automated decision-making that is made on a solely algorithmic basis. Data subjects must be informed of their privacy rights under the GDPR, including their right to revoke consent to data processing at any time, their right to view their personal data and access an overview of how it is being processed, their right to obtain a portable copy of the stored data, their right to erasure of their data under certain circumstances, their right to contest any automated decision-making that was made on a solely algorithmic basis, and their right to file complaints with a Data Protection Authority. As such, the data subject must also be provided with contact details for the data controller and their designated data protection officer, where applicable.

Data protection impact assessments (Article 35) have to be conducted when specific risks occur to the rights and freedoms of data subjects. Risk assessment and mitigation are required, and prior approval of the data protection authorities is required for high risks.

Article 25 requires data protection to be designed into the development of business processes for products and services. Privacy settings must therefore be set at a high level by default, and technical and procedural measures should be taken by the controller to make sure that the processing, throughout the whole processing lifecycle, complies with the regulation. Controllers should also implement mechanisms to ensure that personal data is not processed unless necessary for each specific purpose.

A report by the European Union Agency for Network and Information Security elaborates on what needs to be done to achieve privacy and data protection by default. It specifies that encryption and decryption operations must be carried out locally, not by remote service, because both keys and data must remain in the power of the data owner if any privacy is to be achieved. The report specifies that outsourced data storage on remote clouds is practical and relatively safe if only the data owner, not the cloud service, holds the decryption keys.
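A minimal sketch of that recommendation, assuming the widely used Python cryptography package (an assumption for illustration; the report does not prescribe any particular library): data is encrypted locally with a key that stays with the data owner, and only the ciphertext is handed to the remote storage service.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# The key is generated and kept locally by the data owner; it is never
# shared with the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"medical record of data subject 42"
ciphertext = cipher.encrypt(plaintext)   # performed locally, before upload

# Only `ciphertext` is uploaded; the provider cannot read it without the key.
# upload_to_cloud(ciphertext)            # hypothetical storage call

# Decryption likewise happens locally, after download.
assert cipher.decrypt(ciphertext) == plaintext
```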

Pseudonymisation

According to the GDPR, pseudonymisation is a required process for stored data that transforms personal data in such a way that the resulting data cannot be attributed to a specific data subject without the use of additional information (as an alternative to the other option of complete data anonymisation). An example is encryption, which renders the original data unintelligible in a process that cannot be reversed without access to the correct decryption key. The GDPR requires for the additional information (such as the decryption key) to be kept separately from the pseudonymised data.

Another example of pseudonymisation is tokenisation, which is a non-mathematical approach to protecting data at rest that replaces sensitive data with non-sensitive substitutes, referred to as tokens. While the tokens have no extrinsic or exploitable meaning or value, they allow for specific data to be fully or partially visible for processing and analytics while sensitive information is kept hidden. Tokenisation does not alter the type or length of data, which means it can be processed by legacy systems such as databases that may be sensitive to data length and type. This also requires much fewer computational resources to process and less storage space in databases than traditionally-encrypted data.
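The following simplified sketch (illustrative only, not a production tokenisation design) replaces a sensitive value with a random token of the same length and character class, and keeps the token-to-value mapping in a separate vault, which corresponds to the “additional information” that the GDPR requires to be stored separately:

```python
import secrets
import string

# The vault mapping tokens back to real values is the "additional information"
# that must be stored separately from the pseudonymised records.
_vault: dict[str, str] = {}

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random token of the same length and type."""
    token = "".join(
        secrets.choice(string.digits if ch.isdigit() else string.ascii_uppercase)
        for ch in value
    )
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    # Only callable where access to the separately stored vault is authorised.
    return _vault[token]

card = "4111111111111111"
token = tokenize(card)
print(token)                      # e.g. "8302944710583321" – same length, still digits
assert detokenize(token) == card
```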

Pseudonymisation is a privacy-enhancing technology and is recommended to reduce the risks to the concerned data subjects and also to help controllers and processors to meet their data protection obligations (Recital 28).

Records of processing activities

According to Article 30, records of processing activities have to be maintained by each organisation matching one of the following criteria:

  • employing more than 250 persons;
  • the processing it carries out is likely to result in a risk to the rights and freedoms of data subjects;
  • the processing is not occasional;
  • processing includes special categories of data as referred to in Article 9(1) or personal data relating to criminal convictions and offences referred to in Article 10.

Such requirements may be modified by each EU country. The records shall be in electronic form, and the controller or the processor and, where applicable, the controller’s or the processor’s representative shall make the record available to the supervisory authority on request. A minimal illustrative record structure is sketched after the lists below.

Records of the controller shall contain all of the following information:

  • the name and contact details of the controller and, where applicable, the joint controller, the controller’s representative and the data protection officer;
  • the purposes of the processing;
  • a description of the categories of data subjects and of the categories of personal data;
  • the categories of recipients to whom the personal data have been or will be disclosed including recipients in third countries or international organisations;
  • where applicable, transfers of personal data to a third country or an international organisation, including the identification of that third country or international organisation and, in the case of transfers referred to in the second subparagraph of Article 49(1), the documentation of suitable safeguards;
  • where possible, the envisaged time limits for erasure of the different categories of data;
  • where possible, a general description of the technical and organisational security measures referred to in Article 32(1).

Records of the processor shall contain all of the following information:

  • the name and contact details of the processor or processors and of each controller on behalf of which the processor is acting, and, where applicable, of the controller’s or the processor’s representative, and the data protection officer;
  • the categories of processing carried out on behalf of each controller;
  • where applicable, transfers of personal data to a third country or an international organisation, including the identification of that third country or international organisation and, in the case of transfers referred to in the second subparagraph of Article 49(1), the documentation of suitable safeguards;
  • where possible, a general description of the technical and organisational security measures referred to in Article 32(1).
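As a hypothetical illustration of how a controller might keep such an Article 30(1) record internally (the regulation prescribes the content, not the format; all field names below are assumptions):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ProcessingRecord:
    """Illustrative Article 30(1) record kept by a controller (field names are not prescribed)."""
    controller: str                           # name and contact details
    dpo: Optional[str]                        # data protection officer, where applicable
    purposes: list[str]                       # purposes of the processing
    data_subject_categories: list[str]        # e.g. ["customers", "employees"]
    personal_data_categories: list[str]       # e.g. ["contact details", "payment data"]
    recipient_categories: list[str]           # incl. recipients in third countries
    third_country_transfers: list[str] = field(default_factory=list)  # incl. safeguards
    erasure_time_limits: Optional[str] = None # where possible
    security_measures: Optional[str] = None   # general description, Article 32(1)

record = ProcessingRecord(
    controller="Example GmbH, privacy@example.com",
    dpo="dpo@example.com",
    purposes=["order fulfilment", "newsletter"],
    data_subject_categories=["customers"],
    personal_data_categories=["name", "email", "order history"],
    recipient_categories=["payment provider", "parcel carrier"],
)
```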

Security of personal data

Article 33 states the data controller is under a legal obligation to notify the supervisory authority without undue delay unless the breach is unlikely to result in a risk to the rights and freedoms of the individuals. There is a maximum of 72 hours after becoming aware of the data breach to make the report. Individuals have to be notified if a high risk of an adverse impact is determined (Article 34). In addition, the data processor will have to notify the controller without undue delay after becoming aware of a personal data breach (Article 33).

However, the notice to data subjects is not required if the data controller has implemented appropriate technical and organisational protection measures that render the personal data unintelligible to any person who is not authorised to access it, such as encryption (Article 34).

Data protection officer

Article 37 requires appointment of a data protection officer. If processing is carried out by a public authority (except for courts or independent judicial authorities when acting in their judicial capacity), if processing operations involve regular and systematic monitoring of data subjects on a large scale, or if processing of special categories of data and of personal data relating to criminal convictions and offences occurs on a large scale (Articles 9 and 10), a data protection officer (DPO)—a person with expert knowledge of data protection law and practices—must be designated to assist the controller or processor in monitoring their internal compliance with the Regulation.

A designated DPO can be a current member of staff of a controller or processor, or the role can be outsourced to an external person or agency through a service contract. In any case, the processing body must make sure that there is no conflict of interest in other roles or interests that a DPO may hold. The contact details for the DPO must be published by the processing organisation (for example, in a privacy notice) and registered with the supervisory authority.

The DPO is similar to a compliance officer and is also expected to be proficient at managing IT processes, data security (including dealing with cyberattacks) and other critical business continuity issues associated with the holding and processing of personal and sensitive data. The skill set required stretches beyond understanding legal compliance with data protection laws and regulations. The DPO must maintain a living data inventory of all data collected and stored on behalf of the organization. More details on the function and the role of data protection officer were given on 13 December 2016 (revised 5 April 2017) in a guideline document.

Organisations based outside the EU must also appoint an EU-based person as a representative and point of contact for their GDPR obligations (Article 27). This is a distinct role from a DPO, although there is overlap in responsibilities that suggests this role can also be held by the designated DPO.

Remedies, liability and penalties

Besides the definitions of criminal offences according to national law, the following sanctions can be imposed under Article 83 of the GDPR:

  • a warning in writing in cases of first and non-intentional noncompliance
  • regular periodic data protection audits
  • a fine up to €10 million or up to 2% of the annual worldwide turnover of the preceding financial year in case of an enterprise, whichever is greater, if there has been an infringement of the following provisions: (Article 83, Paragraph 4)
    • the obligations of the controller and the processor pursuant to Articles 8, 11, 25 to 39, and 42 and 43
    • the obligations of the certification body pursuant to Articles 42 and 43
    • the obligations of the monitoring body pursuant to Article 41(4)
  • a fine up to €20 million or up to 4% of the annual worldwide turnover of the preceding financial year in case of an enterprise, whichever is greater, if there has been an infringement of the following provisions: (Article 83, Paragraph 5 & 6)
    • the basic principles for processing, including conditions for consent, pursuant to Articles 5, 6, 7, and 9
    • the data subjects’ rights pursuant to Articles 12 to 22
    • the transfers of personal data to a recipient in a third country or an international organisation pursuant to Articles 44 to 49
    • any obligations pursuant to member state law adopted under Chapter IX
    • noncompliance with an order or a temporary or definitive limitation on processing or the suspension of data flows by the supervisory authority pursuant to Article 58(2) or failure to provide access in violation of Article 58(1)

Exemptions

Some cases are not specifically addressed in the GDPR and are thus treated as exemptions:

  • Personal or household activities
  • Law Enforcement
  • National Security

The GDPR was created strictly to regulate personal data that goes into the hands of companies. Non-commercial information and household activities are not covered by the GDPR; an example of such a household activity would be emails between two high-school friends.

In addition, the GDPR does not apply when data is potentially linked to a police investigation.

Conversely, an entity or more precisely an “enterprise” has to be engaged in “economic activity” to be covered by the GDPR. Economic activity is defined broadly under European Union competition law.

Applicability outside of the European Union

The GDPR also applies to data controllers and processors outside of the European Economic Area (EEA) if they are engaged in the “offering of goods or services” (regardless of whether a payment is required) to data subjects within the EEA, or are monitoring the behaviour of data subjects within the EEA (Article 3(2)). The regulation applies regardless of where the processing takes place. This has been interpreted as intentionally giving GDPR extraterritorial jurisdiction for non-EU establishments if they are doing business with people located in the EU.

EU Representative

Under Article 27, non-EU establishments subject to the GDPR are obliged to have a designee within the European Union, an “EU Representative”, to serve as a point of contact for their obligations under the regulation. The EU Representative is the controller’s or processor’s contact person vis-à-vis European privacy supervisors and data subjects, in all matters relating to processing, to ensure compliance with the GDPR. A natural person (individual) or a legal person (corporation) can play the role of an EU Representative. The non-EU establishment must issue a duly signed document (letter of accreditation) designating a given individual or company as its EU Representative; the designation can only be given in writing.

An establishment’s failure to designate an EU Representative is considered ignorance of the regulation and its obligations, which is itself a violation of the GDPR subject to fines of up to €10 million or up to 2% of the annual worldwide turnover of the preceding financial year in the case of an enterprise, whichever is greater. The intentional or negligent (wilfully blind) character of the infringement (failure to designate an EU Representative) may constitute an aggravating factor.

An establishment does not need to name an EU Representative if they only engage in occasional processing that does not include, on a large scale, processing of special categories of data as referred to in Article 9(1) of GDPR or processing of personal data relating to criminal convictions and offences referred to in Article 10, and such processing is unlikely to result in a risk to the rights and freedoms of natural persons, taking into account the nature, context, scope and purposes of the processing. Non-EU public authorities and bodies are equally exempted.

Third countries

Chapter V of the GDPR forbids the transfer of the personal data of EU data subjects to countries outside of the EEA — known as third countries — unless appropriate safeguards are imposed, or the third country’s data protection regulations are formally considered adequate by the European Commission (Article 45). Examples of appropriate safeguards include binding corporate rules, standard contractual clauses for data protection issued by a DPA, and schemes of binding and enforceable commitments by the data controller or processor situated in a third country.

United Kingdom implementation

The applicability of GDPR in the United Kingdom is affected by Brexit. Although the United Kingdom formally withdrew from the European Union on 31 January 2020, it remained subject to EU law, including GDPR, until the end of the transition period on 31 December 2020. The United Kingdom granted royal assent to the Data Protection Act 2018 on 23 May 2018, which augmented the GDPR, including aspects of the regulation that are to be determined by national law, and criminal offences for knowingly or recklessly obtaining, redistributing, or retaining personal data without the consent of the data controller.

Under the European Union (Withdrawal) Act 2018, existing and relevant EU law was transposed into local law upon completion of the transition, and the GDPR was amended by statutory instrument to remove certain provisions no longer needed due to the UK’s non-membership in the EU. Thereafter, the regulation will be referred to as “UK GDPR”. The UK will not restrict the transfer of personal data to countries within the EEA under UK GDPR. However, the UK will become a third country under the EU GDPR, meaning that personal data may not be transferred to the country unless appropriate safeguards are imposed, or the European Commission performs an adequacy decision on the suitability of British data protection legislation (Chapter V). As part of the withdrawal agreement, the European Commission committed to perform an adequacy assessment.

In April 2019, the UK Information Commissioner’s Office (ICO) issued a proposed code of practice for social networking services when used by minors, enforceable under GDPR, which also includes restrictions on “like” and “streak” mechanisms in order to discourage social media addiction and on the use of this data for processing interests.

In March 2021, Secretary of State for Digital, Culture, Media and Sport Oliver Dowden stated that the UK was exploring divergence from the EU GDPR in order to “[focus] more on the outcomes that we want to have and less on the burdens of the rules imposed on individual businesses”.

Reception

As per a study conducted by Deloitte in 2018, 92% of companies believe they are able to comply with GDPR in their business practices in the long run.

Companies operating outside of the EU have invested heavily to align their business practices with GDPR. The area of GDPR consent has a number of implications for businesses who record calls as a matter of practice. A typical disclaimer is not considered sufficient to gain assumed consent to record calls. Additionally, when recording has commenced, should the caller withdraw their consent, then the agent receiving the call must be able to stop a previously started recording and ensure the recording does not get stored.

IT professionals expect that compliance with the GDPR will require additional investment overall: over 80 percent of those surveyed expected GDPR-related spending to be at least US$100,000. The concerns were echoed in a report commissioned by the law firm Baker & McKenzie that found that “around 70 percent of respondents believe that organizations will need to invest additional budget/effort to comply with the consent, data mapping and cross-border data transfer requirements under the GDPR.” The total cost for EU companies is estimated at around €200 billion, while for US companies the estimate is $41.7 billion. It has been argued that smaller businesses and startup companies might not have the financial resources to adequately comply with the GDPR, unlike the larger international technology firms (such as Facebook and Google) that the regulation is ostensibly meant to target first and foremost. A lack of knowledge and understanding of the regulations has also been a concern in the lead-up to its adoption. A counter-argument to this has been that companies were made aware of these changes two years prior to them coming into effect and, therefore, should have had enough time to prepare.

The regulations, including whether an enterprise must have a data protection officer, have been criticized for potential administrative burden and unclear compliance requirements. Although data minimisation is a requirement, with pseudonymisation being one of the possible means, the regulation provides no guidance on how or what constitutes an effective data de-identification scheme, with a grey area on what would be considered inadequate pseudonymisation subject to Section 5 enforcement actions. There is also concern regarding the implementation of the GDPR in blockchain systems, as the transparent and fixed record of blockchain transactions contradicts the very nature of the GDPR. Many media outlets have commented on the introduction of a “right to explanation” of algorithmic decisions, but legal scholars have since argued that the existence of such a right is highly unclear without judicial tests and is limited at best.

The GDPR has garnered support from businesses who regard it as an opportunity to improve their data management. Mark Zuckerberg has also called it a “very positive step for the Internet”, and has called for GDPR-style laws to be adopted in the US. Consumer rights groups such as The European Consumer Organisation are among the most vocal proponents of the legislation. Other supporters have attributed its passage to the whistleblower Edward Snowden. Free software advocate Richard Stallman has praised some aspects of the GDPR but called for additional safeguards to prevent technology companies from “manufacturing consent”.

Impact

Academic experts who participated in the formulation of the GDPR wrote that the law “is the most consequential regulatory development in information policy in a generation. The GDPR brings personal data into a complex and protective regulatory regime. That said, the ideas contained within the GDPR are not entirely European, nor new. The GDPR’s protections can be found – albeit in weaker, less prescriptive forms – in U.S. privacy laws and in Federal Trade Commission settlements with companies.”

Despite having had at least two years to prepare, many companies and websites changed their privacy policies and features worldwide directly prior to the GDPR’s implementation, and customarily provided email and other notifications discussing these changes. This was criticised for resulting in a fatiguing number of communications, while experts noted that some reminder emails incorrectly asserted that new consent for data processing had to be obtained when the GDPR took effect (any previously obtained consent to processing remains valid as long as it meets the regulation’s requirements). Phishing scams also emerged using falsified versions of GDPR-related emails, and it was also argued that some GDPR notice emails may have actually been sent in violation of anti-spam laws. In March 2019, a provider of compliance software found that many websites operated by EU member state governments contained embedded tracking from ad technology providers.

The deluge of GDPR-related notices also inspired memes, including those surrounding privacy policy notices being delivered by atypical means (such as an Ouija board or Star Wars opening crawl), suggesting that Santa Claus’s “naughty or nice” list was a violation, and a recording of excerpts from the regulation by a former BBC Radio 4 Shipping Forecast announcer. A blog, GDPR Hall of Shame, was also created to showcase unusual delivery of GDPR notices, and attempts at compliance that contained egregious violations of the regulation’s requirements. Its author remarked that the regulation “has a lot of nitty gritty, in-the-weeds details, but not a lot of information about how to comply”, but also acknowledged that businesses had two years to comply, making some of its responses unjustified.

Research indicates that approximately 25% of software vulnerabilities have GDPR implications. Since Article 33 emphasizes breaches, not bugs, security experts advise companies to invest in processes and capabilities to identify vulnerabilities before they can be exploited, including coordinated vulnerability disclosure processes. An investigation of Android apps’ privacy policies, data access capabilities, and data access behaviour has shown that numerous apps display somewhat more privacy-friendly behaviour since the GDPR was implemented, although they still retain most of their data access privileges in their code. An investigation by the Consumer Council of Norway (called Forbrukerrådet in Norwegian) into the post-GDPR data subject dashboards on social media platforms (such as the Google dashboard) concluded that large social media firms deploy deceptive tactics in order to discourage their customers from sharpening their privacy settings.

On the effective date, some international websites began to block EU users entirely (including Instapaper, Unroll.me, and Tribune Publishing-owned newspapers, such as the Chicago Tribune and the Los Angeles Times) or redirect them to stripped-down versions of their services (in the case of National Public Radio and USA Today) with limited functionality and/or no advertising so that they will not be liable. Some companies, such as Klout, and several online video games, ceased operations entirely to coincide with its implementation, citing the GDPR as a burden on their continued operations, especially due to the business model of the former. Sales volume of online behavioural advertising placements in Europe fell 25–40% on 25 May 2018.

In 2020, two years after the GDPR’s implementation began, the European Commission assessed that users across the EU had increased their knowledge about their rights, stating that “69% of the population above the age of 16 in the EU have heard about the GDPR and 71% of people heard about their national data protection authority.” The Commission also found that privacy has become a competitive quality for companies which consumers take into account in their decision-making processes.

Enforcement and inconsistency

Facebook and subsidiaries WhatsApp and Instagram, as well as Google LLC (targeting Android), were immediately sued by Max Schrems’s non-profit NOYB just hours after midnight on 25 May 2018, for their use of “forced consent”. Schrems asserts that both companies violated Article 7(4) by not presenting opt-ins for data processing consent on an individualized basis, and by requiring users to consent to all data processing activities (including those not strictly necessary) or be forbidden from using the services. On 21 January 2019, Google was fined €50 million by the French DPA for showing insufficient control, consent, and transparency over the use of personal data for behavioural advertising. In November 2018, following a journalistic investigation into Liviu Dragnea, the Romanian DPA (ANSPDCP) used a GDPR request to demand information on the RISE Project’s sources.

In July 2019, the British Information Commissioner’s Office issued an intention to fine British Airways a record £183 million (1.5% of turnover) for poor security arrangements that enabled a 2018 web skimming attack affecting around 380,000 transactions. British Airways was ultimately fined the reduced amount of £20m, with the ICO noting that they had “considered both representations from BA and the economic impact of COVID-19 on their business before setting a final penalty”.

In December 2019, Politico reported that Ireland and Luxembourg — two smaller EU countries that have had a reputation as tax havens and (especially in the case of Ireland) as a base for European subsidiaries of U.S. big tech companies — were facing significant backlogs in their investigations of major foreign companies under GDPR, with Ireland citing the complexity of the regulation as a factor. Critics interviewed by Politico argued that enforcement was also being hampered by varying interpretations between member states, the prioritisation of guidance over enforcement by some authorities, and a lack of cooperation between member states.

While companies are now subject to legal obligations, there are still various inconsistencies in the practical and technical implementation of the GDPR. As an example, according to the GDPR’s right to access, companies are obliged to provide data subjects with the data they gather about them. However, in a study on loyalty cards in Germany, companies did not provide the data subjects with exact information on the articles purchased. One might argue that such companies do not collect the information about the purchased articles at all, but this would not conform with their business models; data subjects therefore tend to see this as a GDPR violation. As a result, studies have suggested better controls by the authorities.

According to the GDPR, end-users’ consent should be valid, freely given, specific, informed, and active. However, the lack of enforceability regarding the obtaining of lawful consent has been a challenge. As an example, a 2020 study showed that Big Tech, i.e. Google, Amazon, Facebook, Apple, and Microsoft (GAFAM), use dark patterns in their consent-obtaining mechanisms, which raises doubts regarding the lawfulness of the acquired consent.

In March 2021, EU member states led by France were reported to be attempting to modify the impact of the privacy regulation in Europe by exempting national security agencies.

Influence on international laws

Mass adoption of these new privacy standards by international companies has been cited as an example of the “Brussels effect”, a phenomenon wherein European laws and regulations are used as a global baseline due to their gravitas.

The U.S. state of California passed the California Consumer Privacy Act on 28 June 2018, taking effect on 1 January 2020; it grants rights to transparency and control over the collection of personal information by companies in a manner similar to the GDPR. Critics have argued that such laws need to be implemented at the federal level to be effective, as a collection of state-level laws would have varying standards that would complicate compliance.

The Republic of Turkey, a candidate for European Union membership, adopted the Law on the Protection of Personal Data on 24 March 2016, in compliance with the EU acquis.

Timeline

  • 25 January 2012: The proposal for the GDPR was released.
  • 21 October 2013: The European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) had its orientation vote.
  • 15 December 2015: Negotiations between the European Parliament, Council and Commission (Formal Trilogue meeting) resulted in a joint proposal.
  • 17 December 2015: The European Parliament’s LIBE Committee voted for the negotiations between the three parties.
  • 8 April 2016: Adoption by the Council of the European Union. The only member state voting against was Austria, which argued that the level of data protection in some respects falls short compared to the 1995 directive.
  • 14 April 2016: Adoption by the European Parliament.
  • 24 May 2016: The regulation entered into force, 20 days after its publication in the Official Journal of the European Union.
  • 25 May 2018: Its provisions became directly applicable in all member states, two years after the regulation entered into force.
  • 20 July 2018: the GDPR became valid in the EEA countries (Iceland, Liechtenstein, and Norway), after the EEA Joint Committee and the three countries agreed to follow the regulation.

EU Digital Single Market

The EU Digital Single Market strategy relates to “digital economy” activities related to businesses and people in the EU. As part of the strategy, the GDPR and the NIS Directive both apply from 25 May 2018. The proposed ePrivacy Regulation was also planned to be applicable from 25 May 2018, but its application has been delayed. The eIDAS Regulation is also part of the strategy.

In an initial assessment, the European Council has stated that the GDPR should be considered “a prerequisite for the development of future digital policy initiatives”.