
AEPD bans the use of facial recognition technologies for surveillance in online exams
In a context in which distance education has become a fundamental tool, especially after the COVID-19 pandemic, digital tools for remotely monitoring online exams have proliferated. Among them, facial recognition technologies stand out: some platforms use them to verify the student's identity and monitor their behavior during the test.
Why has the AEPD ruled on the matter?
Faced with this reality, the Spanish Data Protection Agency (AEPD) has issued a firm resolution prohibiting the use of facial recognition technologies in the context of online exams. The decision rests on the finding that such processing of biometric data constitutes a disproportionate intrusion into the fundamental rights of students.
The AEPD considers that there are less intrusive alternatives that achieve the same purpose, namely guaranteeing the student's identity and preventing academic fraud, without resorting to automated surveillance based on facial recognition, which entails the massive collection and processing of particularly sensitive data.
Legal and ethical risks of this use in education
Facial recognition relies on biometric data that uniquely identify a person, so it falls into the category of special-category data protected under Article 9 of the General Data Protection Regulation (GDPR). Processing such data is permitted only under very specific circumstances, such as reasons of substantial public interest or the explicit consent of the data subject, and even then strict safeguards must be met.
In the field of education, the application of this type of technology raises serious ethical and legal conflicts. Firstly, the principle of proportionality is called into question: is it justifiable to process biometric data to prevent possible fraud in an exam? The AEPD has considered that it is not, especially when there are less intrusive alternative means.
In addition, these technologies can produce discriminatory effects, such as false positives or false negatives in identification that disproportionately affect certain groups, jeopardizing the principle of educational equity. Moreover, subjecting students to constant surveillance can create a climate of distrust and discomfort that negatively affects their performance and emotional well-being.
From a legal standpoint, the use of facial recognition could amount to a violation of the right to privacy, especially if it is imposed as mandatory, without truly free consent and without a valid alternative.
Lawful practices for online exam proctoring
In light of the AEPD’s pronouncement, it is essential for educational institutions to review their assessment practices in digital environments and comply with current regulatory requirements.
Online exam proctoring can be carried out legally and ethically if the GDPR's principles are respected: data minimization, proportionality, transparency and purpose limitation.
There are viable alternatives that ensure academic integrity without compromising the fundamental rights of students. Some of these practices include:
- Live video call monitoring, with informed consent and under proportionality criteria.
- Connection and delivery time control systems, to detect irregularities without the need to invade the student’s privacy.
- Open or project-based assessments, which reduce the need for monitoring by focusing on the student’s analysis, reflection or creativity.
- Prior verification of identity through official documents, but without continuous storage of biometric data.
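As an illustration of the second practice above, a minimal, hypothetical sketch of a connection and delivery-time check might look like the following. It flags submissions completed implausibly fast or delivered after the deadline using only timestamps the platform already holds, so no biometric or otherwise sensitive data is processed. The field names and thresholds are assumptions for illustration, not part of any AEPD guidance:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Submission:
    # Hypothetical record of what an exam platform already logs.
    student_id: str
    started_at: datetime
    submitted_at: datetime

def flag_irregularities(subs, deadline, min_duration):
    """Return {student_id: [reasons]} for submissions that look irregular.

    Only timestamps are inspected; no biometric data is involved.
    """
    flags = {}
    for s in subs:
        reasons = []
        if s.submitted_at - s.started_at < min_duration:
            reasons.append("completed implausibly fast")
        if s.submitted_at > deadline:
            reasons.append("delivered after the deadline")
        if reasons:
            flags[s.student_id] = reasons
    return flags

# Illustrative data: one too-fast submission, one normal, one late.
deadline = datetime(2024, 6, 1, 12, 0)
subs = [
    Submission("a1", datetime(2024, 6, 1, 10, 0), datetime(2024, 6, 1, 10, 5)),
    Submission("b2", datetime(2024, 6, 1, 10, 0), datetime(2024, 6, 1, 11, 30)),
    Submission("c3", datetime(2024, 6, 1, 11, 0), datetime(2024, 6, 1, 12, 30)),
]
flags = flag_irregularities(subs, deadline, timedelta(minutes=10))
```

A check like this produces a list of cases for a human reviewer rather than an automated verdict, which is consistent with the proportionality and data-minimization principles discussed above.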
In addition, it is essential to have clear privacy policies, to adequately inform students about data processing and to ensure that any tools used comply with legal standards on data protection.
In conclusion, the AEPD’s decision reinforces the idea that technology must be used appropriately and be at the service of people. Digital education must evolve in harmony with fundamental rights, not at their expense. Committing to responsible, transparent and privacy-friendly assessment practices is not only a legal obligation, but also an ethical commitment to students.

Midiala Fernández is a lawyer specializing in intellectual property, new technologies law and data protection.
Since 2019 she has advised on matters such as e-commerce, digital marketing, advertising, unfair competition and cybersecurity. She holds a law degree from the Universidad Complutense de Madrid and postgraduate training in ICT law and compliance from the Universidad Camilo José Cela.
