
Do you work with Big Data? How to conduct an impact assessment


The General Data Protection Regulation (GDPR), applicable since 25 May 2018, introduced the obligation to carry out a Data Protection Impact Assessment (DPIA) for processing operations that may entail a high risk to the rights and freedoms of individuals.

Any company or ecommerce platform that processes personal data should understand what this preventive analysis involves and when it is mandatory. At LETSLAW, we have previously addressed the key aspects of DPIAs, and today we update this information in light of the most recent European and national regulatory developments.

The European Data Protection Board (EDPB), the successor to the Article 29 Working Party, has issued several guidelines interpreting the GDPR and setting out practical criteria on when a DPIA should be carried out and what it should include. Moreover, recent European regulations on Artificial Intelligence and data governance have reinforced the relevance of DPIAs in Big Data and automation environments.

Privacy risks in Big Data

Big Data involves the large-scale, continuous, and automated processing of massive amounts of information, often using predictive analytics, profiling, or artificial intelligence techniques. These processing activities can seriously compromise privacy if their risks are not properly assessed.

Article 35 of the GDPR establishes that a DPIA must be carried out whenever processing operations are “likely to result in a high risk to the rights and freedoms of natural persons.” There are three specific scenarios in which this obligation always applies:

  1. Large-scale processing of sensitive data, such as health data, political opinions, sexual orientation, or biometric data.
  2. Systematic large-scale monitoring of publicly accessible areas.
  3. Automated evaluations or decisions producing legal or similarly significant effects on individuals (for example, profiling or credit scoring systems).


The GDPR also clarifies that this list is not exhaustive. Any processing operation that poses significant risks must be evaluated.

The EDPB and the Spanish Data Protection Authority (AEPD) identify several criteria that help determine when a processing activity is considered high-risk, including:

  • Evaluation or scoring of individuals (profiling or prediction).
  • Automated decision-making with legal or economic consequences.
  • Systematic monitoring or surveillance.
  • Processing of special categories of data or other confidential information.
  • Large-scale data processing.
  • Combination of datasets from different sources.
  • Processing data of vulnerable individuals (minors, employees, patients, etc.).
  • Use of innovative or disruptive technologies, such as AI, biometrics, or advanced geolocation.
  • International transfers of data outside the EU.
  • Processing operations that restrict the exercise of rights or access to services or contracts.


As a general rule, if the processing meets two or more of these criteria, conducting a DPIA is considered advisable and, in most cases, mandatory.
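
By way of illustration, this threshold rule can be expressed as a simple screening checklist. The Python sketch below mirrors the EDPB and AEPD criteria listed above and flags a DPIA when two or more of them are met; the labels, helper function, and two-criteria threshold are illustrative assumptions, not an official tool.

    # Illustrative screening of a processing activity against the EDPB/AEPD
    # high-risk criteria listed above. Not an official EDPB or AEPD tool.
    EDPB_CRITERIA = {
        "evaluation_or_scoring",
        "automated_decision_with_legal_effect",
        "systematic_monitoring",
        "special_categories_or_confidential_data",
        "large_scale_processing",
        "dataset_combination",
        "vulnerable_data_subjects",
        "innovative_technology",
        "international_transfers_outside_eu",
        "restriction_of_rights_or_services",
    }

    def dpia_recommended(matched_criteria: set[str], threshold: int = 2) -> bool:
        """Return True when the activity meets `threshold` or more criteria."""
        unknown = matched_criteria - EDPB_CRITERIA
        if unknown:
            raise ValueError(f"Unrecognised criteria: {unknown}")
        return len(matched_criteria) >= threshold

    # Example: a Big Data platform that profiles users at large scale
    activity = {"evaluation_or_scoring", "large_scale_processing", "innovative_technology"}
    print(dpia_recommended(activity))  # True: a DPIA should be carried out

A positive result from this kind of screening is only a trigger to carry out the full assessment described below, never a substitute for it.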

When a data protection impact assessment is mandatory

A DPIA must be conducted before starting the processing and forms part of the accountability principle required by the GDPR.

Even when it is uncertain whether a DPIA is mandatory, both the EDPB and the AEPD recommend carrying it out, as it serves as an effective tool to identify and mitigate risks and to demonstrate regulatory compliance in the event of an inspection.

From 2025 onwards, this assessment gains even greater importance in certain contexts:

  • When the processing involves high-risk Artificial Intelligence systems under Regulation (EU) 2024/1689 (the AI Act), the DPIA must be coordinated with the conformity assessment required for those systems.
  • In processing operations involving data sharing or reuse under the Data Act (Regulation (EU) 2023/2854) or the Data Governance Act (Regulation (EU) 2022/868), the DPIA should include an additional analysis of data access conditions, anonymisation, and traceability controls.
  • In Spain, the AEPD updated its Practical Guide on DPIAs in 2023, adding new examples of high-risk processing operations, such as:
    • Intelligent video surveillance or facial recognition.
    • AI systems for assessing behaviour or employee performance.
    • Big Data platforms for large-scale user analytics.
    • Processing of biometric or genetic data.

What a data protection impact assessment should include

According to the EDPB Guidelines and the AEPD Practical Guide (2023), a complete DPIA should contain at least the following elements:

Detailed description of the processing

  • Nature, scope, context, and purpose of the processing.
  • Categories of personal data and recipients.
  • Data retention periods.
  • Technical and organisational measures in place.
  • Compliance with approved codes of conduct or certifications.

Assessment of necessity and proportionality

  • Lawfulness of the processing.
  • Adequacy, relevance, and data minimisation.
  • Reasonable retention and processing period.
  • Measures to guarantee data subject rights: information, access, rectification, portability, objection, restriction, and erasure.

Identification and management of risks

  • Analysis of the origin, nature, likelihood, and severity of the risks (see the sketch after this list).
  • Potential effects on rights and freedoms (unauthorised access, alteration, loss, misuse).
  • Security and mitigation measures adopted.
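
In practice, the likelihood-and-severity analysis is often recorded as a qualitative risk matrix. The Python sketch below shows one common way of combining the two dimensions into an overall risk level; the three-point scales and thresholds are illustrative assumptions, not values prescribed by the AEPD guide.

    # Illustrative qualitative risk matrix: likelihood and severity are each
    # rated 1 (low), 2 (medium) or 3 (high); scales and thresholds are
    # assumptions for illustration only.
    VALID_RATINGS = {1, 2, 3}

    def risk_level(likelihood: int, severity: int) -> str:
        if likelihood not in VALID_RATINGS or severity not in VALID_RATINGS:
            raise ValueError("likelihood and severity must be 1, 2 or 3")
        score = likelihood * severity  # ranges from 1 to 9
        if score >= 6:
            return "high: mitigate and consider prior consultation with the authority"
        if score >= 3:
            return "medium: apply mitigation measures and review periodically"
        return "low: document and monitor"

    # Example: unauthorised access judged likely (3) with significant impact (2)
    print(risk_level(3, 2))  # high: mitigate and consider prior consultation with the authority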

Involvement of relevant parties

  • Consultation with the Data Protection Officer (DPO).
  • Consultation, where appropriate, with data subjects or their representatives.

Follow-up and review

  • Periodic review of the DPIA whenever changes occur in the processing activity or risk level.
  • Documentation and traceability of all decisions made.
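
Teams that maintain DPIA documentation alongside their records of processing activities sometimes capture the elements above in a structured template. The Python sketch below is a minimal, hypothetical record mirroring those sections; neither the GDPR nor the AEPD prescribes this format, and the field names and example values are illustrative only.

    # Hypothetical structured DPIA record mirroring the elements listed above.
    from dataclasses import dataclass, field

    @dataclass
    class DPIARecord:
        # Detailed description of the processing
        purpose: str
        data_categories: list[str]
        recipients: list[str]
        retention_period: str
        security_measures: list[str]
        # Necessity and proportionality
        legal_basis: str
        minimisation_notes: str
        # Risk identification and management (origin, likelihood, severity, mitigation)
        risks: list[dict] = field(default_factory=list)
        # Involvement and follow-up
        dpo_consulted: bool = False
        data_subjects_consulted: bool = False
        next_review_date: str = ""

    record = DPIARecord(
        purpose="Large-scale user analytics on an ecommerce platform",
        data_categories=["browsing behaviour", "purchase history"],
        recipients=["analytics provider"],
        retention_period="24 months",
        security_measures=["pseudonymisation", "role-based access controls"],
        legal_basis="legitimate interest, Art. 6(1)(f) GDPR",
        minimisation_notes="Only aggregated segments are shared with recipients",
    )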


When the measures identified are insufficient to reduce the risk to an acceptable level, the controller must consult the competent supervisory authority (the AEPD in Spain) before starting the processing.

Consequences of non-compliance

Failure to carry out a DPIA when required, or conducting it incorrectly, may result in administrative fines of up to €10 million or 2% of the organisation’s total worldwide annual turnover, whichever is higher.

Additionally, in high-risk AI projects, failing to conduct a DPIA may be accompanied by further infringements under the AI Act, where penalties can reach up to 7% of the organisation’s global annual turnover.

Big Data, Artificial Intelligence, and data interconnectivity significantly increase privacy risks. A Data Protection Impact Assessment is not only a legal requirement but also a strategic tool to anticipate legal issues, strengthen transparency, and build user trust.

At Letslaw, a law firm specialising in digital law, data protection, and artificial intelligence, we help organisations determine whether their processing activities require a DPIA and guide them in conducting it in accordance with the latest GDPR standards, AEPD recommendations, and the evolving European data and AI regulatory framework.
