On Biometrics

Three reasons why we do not employ facial recognition technology

Signatrix considers itself an ethical company. We want to improve the retail experience for human beings, not turn retail stores into the Brave New Shopping World, where fully surveilled shopping drones go get their fix of Soylent Green. As an ethical company, we are bound not only by our own moral code but also by law and by some very pragmatic considerations to refrain from using facial recognition or any other biometrics in the development and deployment of our solutions.

Obviously, we have to adhere to GDPR standards for our products to be deployable in customers' stores. The hard part is figuring out what exactly those standards are, given vague wording and a lack of precedents regarding data for AI applications. There is broad consensus that it is illegal to keep personal records on customers, so using facial recognition or other biometrics to identify customers is out of the question. Even keeping analogue pictures of banned customers to ensure that certain people won't (re-)enter the store is technically illegal in Europe. That is why some stores offer shoplifters a deal: the store will not press charges in exchange for their consent to be added to an in-house list of potential repeat offenders, complete with pictures that enable security staff to identify potential threats easily.

One thing to note here is that retail stores are private property and that, upon entering the store, customers are notified of and consent to the fact that they will be video monitored and that the video data will be evaluated. The fact that this notification is usually a simple sign that is easily overlooked, and that most customers may not be aware of giving their consent, does not change the legal situation. It does, however, affect what we consider an appropriate way of handling the recorded video data.

Before ethical or legal considerations even come into play, there is the issue of computational cost, on both an economic and a technical level. While facial recognition via computer vision works quite well these days, it requires high-resolution input, which in turn demands more computational power to process. Employing it would greatly increase the cost of training our AI models and thus the cost of our products. It is therefore not only in the interest of our customers' customers but also in our customers' own interest to avoid it.
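To make the cost argument concrete, here is a rough back-of-envelope sketch in Python. The layer shape and resolutions below are illustrative assumptions, not a description of our actual models; the point is simply that per-frame compute for a convolution grows linearly with the pixel count, so the jump from shelf-monitoring resolution to face-recognition resolution multiplies the cost many times over.

```python
# Back-of-envelope: how input resolution drives per-frame compute for a
# single 3x3 convolution layer. All numbers are illustrative assumptions.

def conv_macs(height, width, in_channels, out_channels, kernel=3):
    """Multiply-accumulate operations for one conv layer over a full
    frame (stride 1, 'same' padding)."""
    return height * width * in_channels * out_channels * kernel * kernel

# A resolution sufficient for detecting products and people on shelves.
low = conv_macs(height=480, width=640, in_channels=3, out_channels=64)

# A resolution at which faces across a store aisle become recognisable.
high = conv_macs(height=2160, width=3840, in_channels=3, out_channels=64)

print(f"low-res frame:  {low / 1e9:.2f} GMACs")
print(f"high-res frame: {high / 1e9:.2f} GMACs")
print(f"cost factor:    {high / low:.0f}x")  # ~27x for these frame sizes
```

And that factor applies to every frame, at training time and at inference time, for every camera in every store.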

Let’s reiterate the three reasons at the risk of being redundant:

  1. Our own moral code
  2. GDPR standards
  3. The hardware cost of analysing biometrics

A critical mind is bound to realise that the third reason might well fall prey to technological progress. Luckily, we have ethics and the law to rely on should that be the case.
