Analysing images for a safer India

The Centre is planning to use face recognition technology to modernise the police force. But before that, fears around it must be dispelled
The Automated Facial Recognition System (FRS), for which the National Crime Records Bureau (NCRB) has invited bids for procurement, has come under serious debate. The technology, which has the potential to act as an investigation enhancer, is being criticised mainly for two reasons. First, it is said that the technology discriminates against dark-skinned populations and has a gender bias; owing to these issues, the application has been branded inefficient, as it might implicate innocents as suspects. Second, law enforcement agencies would be able to track, and hence intrude into the privacy of, any person. Doubts have also been raised that the police would have free access to CCTV footage in private and public spaces, which may further impinge upon the privacy of individuals. It is therefore imperative to understand how the FRS works and to allay these fears before discrediting the benefits of this technology.

Much of the criticism is based on a study conducted more than a year ago by Joy Buolamwini, a young African-American computer scientist at the Massachusetts Institute of Technology (MIT) Media Lab, on the performance of the artificial intelligence (AI)-based face recognition systems of three leading companies. An FRS algorithm is developed by ‘training’: a facial dataset (a collection of photos of human faces, along with some photos of animal faces and face-like objects) is appended with metadata that (in)validates the guesses of a learning facial recognition algorithm. Once the algorithm is ‘trained’, it is tested and deployed in real-world scenarios at some acceptable confidence level. Thus, AI software is only as smart as the data used to train it. Buolamwini found all three systems under study wanting on dark-skinned populations, and more so on black women. However, two of the three companies claim to have already improved their software for better accuracy and have expressed commitment to “unbiased” and “transparent” services. This shows that racial bias can be minimised, or even eliminated, by training on diverse datasets that best reflect the population a system will serve. Buolamwini is now an advocate in the field of “algorithmic accountability”, which seeks to make automated decisions more transparent, explainable and fair.
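The link between training-data composition and per-group accuracy can be illustrated with a toy sketch. The classifier below is a deliberately simple nearest-centroid model on synthetic one-dimensional “face features” (all numbers and the model itself are assumptions for illustration, not the NCRB system or Buolamwini’s methodology): trained on data skewed towards one demographic group, it performs visibly worse on the other; trained on a balanced dataset, the gap narrows.

```python
# Toy illustration (not any real FRS): how skewed vs. diverse training
# data changes per-group accuracy of a simple classifier.
import random

random.seed(0)

def make_samples(n, offset):
    """Synthetic 1-D 'face features' for one demographic group.
    Class 0 is centred at `offset`, class 1 at `offset + 4`."""
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        x = offset + 4 * label + random.gauss(0, 0.8)
        data.append((x, label))
    return data

def train_centroids(samples):
    """'Training' here is just averaging features per class."""
    sums, counts = {0: 0.0, 1: 0.0}, {0: 0, 1: 0}
    for x, y in samples:
        sums[y] += x
        counts[y] += 1
    return {y: sums[y] / counts[y] for y in (0, 1)}

def accuracy(centroids, samples):
    """Classify each sample by its nearest class centroid."""
    correct = sum(
        1 for x, y in samples
        if min(centroids, key=lambda c: abs(x - centroids[c])) == y
    )
    return correct / len(samples)

# Two demographic groups with systematically shifted features.
test_a = make_samples(500, offset=0.0)
test_b = make_samples(500, offset=2.0)

# Skewed training set: 95% group A, 5% group B.
c_skewed = train_centroids(make_samples(950, 0.0) + make_samples(50, 2.0))

# Balanced training set: 50% from each group.
c_balanced = train_centroids(make_samples(500, 0.0) + make_samples(500, 2.0))

print("skewed   A/B accuracy:", accuracy(c_skewed, test_a), accuracy(c_skewed, test_b))
print("balanced A/B accuracy:", accuracy(c_balanced, test_a), accuracy(c_balanced, test_b))
```

Under the skewed training set, the class centroids sit almost entirely where group A’s features lie, so group B’s samples fall near or beyond the decision boundary and are misclassified far more often; balancing the dataset moves the centroids to serve both groups comparably.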

Further, the New York-based Institute of Electrical and Electronics Engineers (IEEE), a large professional organisation in computing, is working to create standards for accountability and transparency in facial recognition software. In the meantime, protocols can be developed to avoid rushing to judgement based solely on the results of the FRS, applying a secondary check instead. The apprehensions of the critics are therefore not sufficient ground to reject the use of the FRS entirely, if it can help investigating agencies narrow down on the real culprit or help recover missing persons.

As far as the issue of intrusion into privacy is concerned, enforcement agencies are not above the law; they are always required to comply with constitutional provisions. Though state agencies were exempted from the penal provisions of the Data Protection Bill (which has lapsed and is yet to be reintroduced in Parliament), the state cannot eschew its duty to protect the ‘right to privacy’ of its citizens, which the Supreme Court, in the Puttaswamy case (2017), held to be embedded in the fundamental right to life and personal liberty.

Law enforcement agencies therefore do not have unfettered powers to misuse the personal data of citizens. The ‘biometric information’ (including ‘facial pattern’) of a person is classified as ‘sensitive personal data’ under the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, and any violation of the rules may be held against the body corporate in possession of such data, which would be liable to pay damages by way of compensation. Thus, the SC’s directions and existing riders in law already safeguard the interests of individuals.

The fears of the critics can therefore be dispelled by incorporating provisions for diverse training datasets and making it mandatory for each bidder to successfully demonstrate a ‘proof of concept’ in order to qualify for the next stage of procurement. Similarly, protocols may be developed by the Bureau of Police Research and Development (BPR&D) to streamline the use of the technology across the states and to ensure that law enforcement agencies are not only able to harness its advantages but are also held accountable for protecting citizens’ privacy.

The home ministry has already clarified that only the Crime and Criminal Tracking Network and Systems (CCTNS) database of suspects and criminals, recorded missing persons and unidentified dead bodies will be used for the FRS. Indian society must not, therefore, be put at a disadvantage by forbidding its enforcement agencies from exploiting the benefits of this emerging technology to safeguard society’s interests.

R K Vij
The author is a senior IPS officer in Chhattisgarh.
Views expressed are personal
Email: vijrk@hotmail.com

The New Indian Express
www.newindianexpress.com