July 26, 2021

Tri-City Weekly


Rights Defender warns of “significant risks” of biometric technologies

Posted on Tuesday, July 20, 2021 at 08:28 AM.

Can you be wrongly imprisoned because of a facial recognition system, or rejected after a job interview because a computer judged you nervous? Faced with the development of biometric technologies, the Rights Defender warns, in a report released on Tuesday, of the “significant risks” they pose.

This independent authority, headed by Claire Hédon, warns that “the improvements biometric technologies allow must not come at the expense of part of the population, nor at the price of generalized surveillance.”

Over the years, these technologies have become pervasive: they analyze a person’s biometric data, such as facial features, voice or behavioral characteristics, to identify them or even assess their personality.

But they are “particularly intrusive” and carry “significant risks of violating fundamental rights,” the report says.

They first expose people to “security breaches with particularly serious consequences,” the Rights Defender explains: while a password can be changed after a hack, fingerprints cannot be changed once stolen from a database. These technologies also pose a “danger to anonymity in public space by enabling a form of generalized surveillance,” the institution says.

Beyond the protection of privacy, the report specifically points to the “unparalleled potential of biometric technologies to amplify and automate discrimination,” owing to the way they are trained. Behind their apparent mathematical neutrality, they can in fact be discriminatory, notably because of inadvertently biased databases.

“In the United States, three Black men have already been wrongly imprisoned because of facial recognition errors,” the report recalls.

Security, human resources, education: many fields are involved.

Some recruitment companies already market software that scores candidates during job interviews. These technologies, which claim to assess emotions, “make many errors.”

The document also points to the “chilling effect” of biometric technologies, which may discourage some people from exercising their rights.

To improve oversight, the Rights Defender makes a number of recommendations. First, it calls on all private and public actors to renounce emotion-assessment techniques.

As for facial recognition, already banned by the Constitutional Council for police drones, the Rights Defender believes the ban should be extended to other devices such as video surveillance or body-worn cameras.

Finally, the institution calls for establishing “external and independent audits” of biometric identification devices, modeled on the “monitoring of adverse drug effects,” with regular testing of these systems’ impacts.