On the 22nd of last month, a São Paulo court ruled that the São Paulo Metrô must stop operating the facial recognition system it had been deploying on its network. The decision was handed down in a public civil action filed by the São Paulo State Public Defender's Office, the Federal Public Defender's Office, and a number of civil society organizations (IDEC, CADHu, Intervozes, and Article 19), which seeks to prevent the Metrô from capturing users' biometric data and to obtain collective moral damages of more than 42 million reais.
The Metrô, in turn, argues that using the technology to ensure users' safety is legitimate, and that it also increases the chances of locating missing children.
While these arguments may, at first glance, suggest that installing facial recognition cameras serves the safety of children and adolescents, promising greater security and the location of missing persons, a deeper analysis of the problem reveals that the measure violates the most basic principles and rights guaranteed by law and by international standards.
Child protection, therefore, cannot be invoked in favor of these practices; on the contrary, it serves as an additional argument against them. To make this clear, it is important to note the high discriminatory potential of facial recognition technology, especially when applied to Black populations and other ethnic minorities.
It has been widely documented that the development of these technologies is permeated by unconscious biases (generalizations based on stereotypes) that undermine their accuracy when applied to non-white faces. Because the image sets used to train facial recognition systems consist mostly of white male faces, these systems are far more likely to incorrectly identify a Black person as a criminal, for example.
Examples of this phenomenon, called algorithmic racism, abound. Last year, the case of a Black American teenager gained notoriety: facial recognition technology barred her from a skating rink after wrongly matching her to someone who had previously been in a fight at the establishment, a place she had in fact never entered. More worryingly, several Black people have been wrongfully arrested in the United States because of facial recognition errors.
Applying this technology to public transport therefore has the potential to greatly deepen the already unbearable racial inequality that plagues Brazil. It is not an exaggeration to imagine, for example, a Black teenager being wrongly accused of an offense because of a false match, a situation that would certainly become more frequent in the Brazilian context if legitimized by a technology wrongly regarded as neutral and highly accurate.
It is important to remember that non-discrimination is one of the structural principles of the Convention on the Rights of the Child, to which Brazil is a signatory, and is also one of the pillars of our Federal Constitution, which lists among the Republic's fundamental objectives promoting the good of all "without prejudice as to origin, race, sex, color, age or any other forms of discrimination" (Art. 3, IV) and, in Article 227, guarantees children, adolescents, and young people protection against every form of discrimination and oppression, a provision echoed by the sole paragraph of Article 3 of the ECA (the Child and Adolescent Statute).
This same statute guarantees children and adolescents a series of rights that are wholly incompatible with the development of a social model driven by mass surveillance of the population. The freedoms of association, expression, development, and movement are all rights guaranteed by child-protection rules which, unless the use of facial recognition technology is promptly halted, will find ever less room to be fully exercised in an increasingly surveilled environment.
It is also important to highlight the serious threats that installing facial recognition systems on public transport poses to children's and adolescents' privacy and personal data security. The leaking or misuse of these individuals' data can have harmful consequences for a number of their rights, which is why the handling of this data must always be guided by the highest standards of protection and security.
Moreover, the lack of transparency about how the collected data is processed violates what the LGPD (Brazil's General Data Protection Law) prescribes for the processing of children's and adolescents' personal data, raising concerns that the data could be used for commercial gain.
To make matters worse, studies indicate that facial recognition technology is also less accurate at correctly identifying children and adolescents, whose faces are still changing over time. The argument that these technologies can be used to find missing children therefore falls apart.
For all of these reasons, General Comment No. 25 of the UN Committee on the Rights of the Child (a document detailing how the Convention on the Rights of the Child should be interpreted and applied in the digital environment), whose Portuguese-language version was launched through a partnership between Instituto Alana and the São Paulo Public Defender's Office, provides that any digital surveillance of children, together with any associated automated processing of personal data, must respect the child's right to privacy and must not be carried out routinely or indiscriminately, without the child's knowledge or, in the case of young children, that of their parent or guardian, nor without the right to object to such surveillance. As is clear, the surveillance intended by the Metrô is entirely inconsistent with what the UN establishes.
In short, although installing facial recognition cameras may seem tempting from a child-safety perspective, that conclusion does not survive deeper reflection on its practical effects and consequences. Children and adolescents have the right to grow up in a free, egalitarian society that prioritizes their rights and best interests, and the presence of these cameras on public transport, without a doubt, points in the opposite direction.
* João Francisco de Aguirre Coelho is a lawyer with the children's rights program at Instituto Alana.