Microsoft’s Azure Cognitive Services Face API is a well-known service that provides several algorithms for the detection, recognition, and analysis of human faces in images.
Many businesses find such capabilities valuable for scenarios like biometric security and image content analysis. However, before relying on a facial recognition service, it is important to check whether it can provide sufficiently accurate results for your particular use case.
Several research papers have examined facial recognition in depth, but they have focused mainly or exclusively on high-quality images. Difficulties may therefore arise when high-quality images are not available, or when there is a large age gap between different images of the same person.
In this context, an application that compares small, low-quality, or blurry faces confronts a major challenge: such images often lack the local details that are essential for facial recognition. So before developing software that processes images from identity documents (such as passports and ID cards), it is important to test and validate whether the software can reliably distinguish between different people and their images.
As a result, here at Belatrix we conducted a few experiments comparing low-quality images from identity documents against selfies taken with an Android smartphone. Unfortunately, the results of our analysis indicate that Azure’s Face service is not yet ready for situations that require very high accuracy.
The following sections will detail the sample application and describe the experiments that we conducted. Finally, we will present the results and some final conclusions.
Many applications have a problem with fake users, since it’s very easy for one person to register multiple accounts. In this context, an interesting security feature is to validate if the account information belongs to a real person.
One possible solution is to obtain the required information from an identity document. However, you then also need to verify that the document is not fake and does not belong to a different person. A good approach to this problem is to compare the photo in the identity document with a selfie of the account owner, to validate that the application is dealing with the same person.
However, in most cases, the images in identity documents are of low quality or fairly old. This presents a problem, since most facial recognition solutions are designed for medium- to high-quality images. Thus, we set out to see if Azure’s Face service is ready to handle this kind of situation.
We conducted several experiments to determine what to consider when using the Azure Cognitive Services Face API to identify people in low-quality images. The photos were taken with a Xiaomi Mi A3 smartphone running Android 9 (Android One edition).
We completed two sets of experiments. The first aimed to validate whether it is possible to recognize people from their identity documents. The second aimed to estimate the rate of false positives.
For the experiments, we asked six people (two women and four men from Peru) to take three photos of their identity documents and some selfies (see figure 1). Their ages ranged from 24 to 51 years old, and the issue dates of their documents ranged from 2013 to 2018. For the initial experiments, we compared each person’s selfies with their corresponding identity document images and obtained the following results.
In table 1, the six people are listed in the first column by gender, while columns DNI 1, DNI 2, and DNI 3 show the average confidence results for each photo of their identity document. It is important to mention that the Azure service considers that two images contain the same person when the confidence is over 0.50.
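For reference, the comparison behind these results can be sketched with the Face service’s Python SDK (`azure-cognitiveservices-vision-face`). This is a minimal illustration, not our exact test harness; the endpoint, key, and file paths are placeholders you would supply yourself.

```python
# Sketch: verifying whether an ID-document photo and a selfie show the
# same person with the Azure Face API. Endpoint/key are placeholders.

def same_person(confidence, threshold=0.50):
    """The service reports a match when confidence exceeds 0.50."""
    return confidence > threshold

def verify_document_vs_selfie(document_path, selfie_path, endpoint, key):
    # Requires: pip install azure-cognitiveservices-vision-face
    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    client = FaceClient(endpoint, CognitiveServicesCredentials(key))

    # Detect one face in each image and keep the transient face IDs.
    with open(document_path, "rb") as doc, open(selfie_path, "rb") as selfie:
        doc_faces = client.face.detect_with_stream(doc)
        selfie_faces = client.face.detect_with_stream(selfie)

    # Ask the service whether the two detected faces belong to the same person.
    result = client.face.verify_face_to_face(
        doc_faces[0].face_id, selfie_faces[0].face_id
    )
    return result.confidence, same_person(result.confidence)
```

In our tables, a cell counts as a successful recognition only when the returned confidence clears that 0.50 bar.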
Thus, in most cases the service failed to match the owners to their own identity documents. Nevertheless, it is important to note that the average confidence values were quite close to the minimum accepted value. Due to license limitations, the tests were executed with just four selfies from each person.
For the second set of experiments, our intention was to evaluate the error rate when comparing the identity documents with faces of other people (not their owners). With that in mind, we used images from the UTKFace data set, restricted to people whose race is listed as “Others” (which in the dataset includes, for example, Hispanic, Latino, and Middle Eastern people) and to a similar age range. Finally, six random images were selected from this subset for each test and compared with each photo of the identity documents, giving the following results.
The results in table 2 show that the service correctly determined that none of the selfies belonged to any of the identity documents. Furthermore, it is interesting to note that the confidence values were very small in most cases, a stark contrast with the failure cases of table 1. Finally, another important observation is that the cases with older identity documents had worse detection results.
Nowadays, security is more important than ever. Most applications are looking to improve their security layers by adding new ways to identify users. Among different solutions, more app developers are looking to use facial recognition, as there is a lot of research on the technology, and it continues to improve its accuracy.
Nevertheless, in cases where images don’t meet the required characteristics (for example, they are low quality or blurry), solutions like Azure’s Face API may not be effective out of the box. That is not to say it cannot be used, only that you may need to lower the confidence threshold in order to recognize the person. For example, you might accept that an identity document belongs to a person when the confidence is over 0.4. However, be aware that lowering the threshold may increase the number of false positives, making the application less secure than it could be.
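One way to reason about that trade-off is to count both kinds of error at a candidate threshold before adopting it. The sketch below assumes you have collected confidence scores from genuine pairs (owners vs. their own documents) and impostor pairs (other people vs. the documents); the sample scores in the test are illustrative, not our experimental data.

```python
# Sketch: quantifying the threshold trade-off. Lowering the threshold
# reduces false negatives (owners rejected) but can admit false
# positives (strangers accepted).

def error_counts(genuine, impostor, threshold):
    """Return (false_negatives, false_positives) at the given threshold."""
    false_negatives = sum(1 for c in genuine if c < threshold)    # owner rejected
    false_positives = sum(1 for c in impostor if c >= threshold)  # stranger accepted
    return false_negatives, false_positives
```

Running this over your own score distributions at, say, 0.50 versus 0.40 makes the security cost of a relaxed threshold explicit before you commit to it.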