In the wake of the Bengaluru Rameshwaram Cafe blast, investigators have turned to facial recognition technology (FRT) — a vital asset for law enforcement agencies, offering expedited investigations and enhanced crime-resolution capabilities — to identify the suspect.
Law enforcement officials investigating the explosion at the cafe on Kundalahalli Road in Whitefield used AI-powered facial recognition technology to identify the individual who placed a bag containing the explosive device at the site. Soon, surveillance footage surfaced, showing a man entering the cafe with the bag.
The city police later stated that the individual’s facial features were captured by CCTV cameras, and that efforts were underway to match them using facial recognition systems for tracking. Amid the National Investigation Agency (NIA) probe, the delay in apprehending the suspect, despite the use of AI-powered FRT, raised questions.
However, one key challenge emerged in the initial probe — the individual captured on CCTV cameras was wearing a mask, a precautionary measure adopted by many amid the COVID-19 pandemic.
FACE MASK HURDLE: THE HONG KONG EXAMPLE
Identifying individuals, even with FRT, becomes challenging when face masks are worn. This was notably observed during the 2019 Hong Kong pro-democracy protests, where participants, predominantly students, wore masks both to evade FRT-equipped surveillance cameras and as a safeguard against tear gas. In response, Carrie Lam, the Chief Executive of Hong Kong at the time, imposed a temporary ban on wearing masks during public assemblies.
According to tech experts, FRT can identify individuals wearing masks under specific conditions, chiefly by placing greater emphasis on visible areas such as the eyes, the forehead and the bridge of the nose. They also claim that newer systems are becoming more sophisticated and can work better with partial facial data.
However, the technology becomes less accurate when individuals wear masks, which is what prompted the Hong Kong authorities to enforce the ban. The limitation arises because FRT systems rely on analysing the entire face: masks obstruct the lower face, reducing the data points available for identification. Human brains recognise faces best when the encounter conditions (masked vs. unmasked) match those under which the memory of the face was first formed; similarly, FRT may struggle if the reference image was unmasked and the person is now masked.
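The upper-face emphasis the experts describe can be illustrated with a toy sketch: treat each face as a feature vector, discard the lower-face dimensions a mask would hide, and match on the visible ones by cosine similarity. The vectors, the 4-dimension "upper face" split, and the 0.8 threshold below are all illustrative assumptions, not details of any real FRT system.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def match_masked(probe_upper, gallery, upper_dims, threshold=0.8):
    """Match a masked probe against unmasked gallery faces using only
    the upper-face dimensions (eyes, forehead, nose bridge).
    Dimension split and threshold are toy values."""
    best_id, best_score = None, -1.0
    for face_id, vec in gallery.items():
        score = cosine(probe_upper, vec[:upper_dims])
        if score > best_score:
            best_id, best_score = face_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Toy gallery of 8-dim "face vectors"; the first 4 dims stand in for
# the upper face that stays visible above a mask.
gallery = {
    "person_A": [0.9, 0.1, 0.4, 0.7, 0.2, 0.8, 0.5, 0.3],
    "person_B": [0.1, 0.9, 0.6, 0.2, 0.7, 0.3, 0.8, 0.5],
}
probe = [0.85, 0.15, 0.45, 0.65]  # masked probe: upper-face features only
result = match_masked(probe, gallery, upper_dims=4)
```

With fewer dimensions to compare, scores between different people drift closer together, which is the toy analogue of the accuracy loss the article describes.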
FRT USAGE AND THE DEBATE
It needs to be highlighted that CCTV cameras with in-built FRT software for real-time analysis and matching recorded CCTV footage against a separate FRT system are two different approaches; the latter is considered traditional. Face masks make either approach more difficult. While they may not completely prevent identification, they prolong the process.
Real-time analysis of CCTV video feeds with FRT allows for comparisons with databases of known individuals. Moreover, FRT’s ability to track individuals on watchlists entering sensitive areas offers proactive intervention opportunities, potentially preventing crimes before they occur.
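The watchlist idea above can be sketched as a simple screening step: each frame yields a face embedding, which is compared against stored embeddings of watchlisted individuals, and an alert fires above a similarity threshold. The watchlist entries, the tiny 3-dimension embeddings, and the 0.85 threshold are hypothetical stand-ins for what a trained face model would produce.

```python
import math

def screen_frame(frame_embedding, watchlist, threshold=0.85):
    """Return (identity, similarity) alerts for watchlist entries whose
    stored embedding is close to the face seen in the current frame.
    Watchlist contents and the 0.85 threshold are illustrative."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    alerts = []
    for identity, ref in watchlist.items():
        sim = cosine(frame_embedding, ref)
        if sim >= threshold:
            alerts.append((identity, round(sim, 3)))
    return alerts

# Toy watchlist with 3-dim embeddings; a real system would use
# high-dimensional embeddings from a trained face-recognition model.
watchlist = {
    "watchlisted_1": [1.0, 0.0, 0.0],
    "watchlisted_2": [0.0, 1.0, 0.0],
}
alerts = screen_frame([0.95, 0.05, 0.0], watchlist)
```

Running such a check on every frame, for every camera, is what makes real-time deployment both powerful and contentious: the same loop that flags a suspect also scans everyone else.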
In this context, human rights and privacy concerns take the discussion on the mass usage of FRT systems to another level. Many believe that striking the right balance is difficult. Despite the undeniable utility of FRT systems in bolstering global security measures, concerns regarding privacy violations, accuracy, and bias dampen enthusiasm for widespread deployment.
However, responsible FRT usage requires robust governance structures, including transparent data-handling protocols. Moreover, transparency regarding FRT implementation and stakeholder involvement is imperative to cultivate public trust in its ethical deployment. Striking the right balance between security needs and privacy concerns could allow FRT deployed alongside city CCTV cameras to significantly enhance law enforcement’s capabilities.