
The use of facial recognition surveillance systems: legal limits and privacy risks
In recent years, the deployment of surveillance systems equipped with facial recognition technology has sparked considerable legal and social debate. While these tools may enhance security and efficiency in certain settings, they also raise serious concerns regarding privacy, proportionality, and legal compliance. This is particularly relevant in Europe and Spain, where the regulatory framework for personal data protection is especially strict. As such, the use of this technology must be approached with the utmost caution.
Facial recognition involves the processing of biometric data capable of uniquely identifying an individual. Under the European Union’s General Data Protection Regulation (GDPR), this type of data is considered a special category, subject to enhanced protection. As a general rule, its processing is prohibited unless a specific legal basis justifies it. The GDPR permits exceptions in cases where the data subject has given explicit consent, where there is a substantial public interest, or when necessary for reasons of public security—provided these are supported by appropriate legal provisions.
In Spain, the Spanish Data Protection Agency (AEPD) has repeatedly emphasized that the use of facial recognition systems must be fully justified, proportionate to the intended purpose, and carried out with all the guarantees established by law. Moreover, any project involving this technology requires a Data Protection Impact Assessment (DPIA), a formal evaluation that examines potential risks to the rights and freedoms of affected individuals.
One of the most frequently asked questions is whether a private company can implement facial recognition surveillance. Generally speaking, the answer is no—except in highly specific and exceptional cases. A company cannot rely solely on efficiency or general security concerns to justify the deployment of such systems. When consent is required, it must be freely given, informed, and specific—something difficult to ensure in work environments where a power imbalance exists. The AEPD has made it clear that the convenience or modernity of biometric technologies does not in itself justify their use over less invasive alternatives.
The indiscriminate use of these systems also presents serious legal risks for individuals. First, it can significantly infringe upon the right to privacy and data protection, particularly when facial images are collected or stored without the data subject’s knowledge or consent. In addition, numerous studies have shown that facial recognition algorithms may exhibit biases that disproportionately affect certain racial groups, age ranges, or genders, potentially leading to indirect discrimination or misidentification.
Another critical issue is the lack of transparency. Often, people are not adequately informed that facial recognition technology is being used, which limits their ability to exercise rights such as access, objection, or data deletion. Furthermore, the improper or unlawful use of these technologies may result in substantial administrative fines, as both the GDPR and Spanish law impose significant penalties for violations of data protection rules.
Ultimately, while facial recognition surveillance systems can offer value in highly specific and tightly regulated contexts, their widespread use poses serious risks to fundamental rights. The principles of legality, necessity, and proportionality—as well as respect for personal privacy—must be central to any decision involving this type of technology. Companies and organizations seeking to adopt such systems should proceed with extreme caution, ensure full legal compliance, and above all, guarantee that the rights of individuals are not compromised in the name of technological advancement.

Letslaw is an international law firm specialized in business law.
