Orlando police department scraps Amazon’s facial recognition software after a year struggling to fix glitches and bandwidth problems that made it unreliable

A police department in Orlando has ended its pilot of Amazon's facial recognition software after being unable to get its system working properly.

James Pero | The Daily Mail | Source URL

A police department in Orlando has terminated its trial of Amazon’s AI-powered facial recognition for the second time, citing costs and complexity.

According to a report from Orlando Weekly, the department ended its trial of the technology, called Rekognition, after 15 months of glitches and concerns over whether the technology was actually working. 

‘At this time, the city was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing,’ Orlando’s Chief Administrative Office said in a memo to City Council, as reported by Orlando Weekly.  

The decision marks the second time in just 10 months that the department has decided not to proceed with the technology. A previous decision to end the pilot came after pressure from critics, but the program was restarted in October.

Rekognition is sold to police departments as a means of turning camera streams like the commonplace CCTV feeds into surveillance networks that are capable of reading people’s faces and matching them against a database of mugshots or other sources. 

According to Amazon, the tool, which is being deployed by police departments across the country, is designed to help track and identify criminals.

In Orlando, however, police officials say that’s not exactly how Rekognition has worked out. 

Among the issues for the department were cameras that lacked high enough resolution to read subjects’ faces and an inability to set up a reliable video stream, presumably because of limited network bandwidth.

‘We haven’t even established a stream today. We’re talking about more than a year later,’ Rosa Akhtarkhavari, Orlando’s chief information officer, told Orlando Weekly. 

‘We have not, today, established a reliable stream.’

An anonymous official quoted by Orlando Weekly told the outlet that even when the department was able to establish a stream, it would often disconnect at random.

As a result, Akhtarkhavari told Orlando Weekly that the department has ‘no immediate plans regarding future pilots to explore this type of facial recognition technology.’

While the Orlando Police Department’s decision to end its use of the software doesn’t appear to be an ideological one, other municipalities across the country have begun to make their stance against facial recognition known.

Recently, Oakland, California, became the third city in the country to ban the use of facial recognition software, following the major metro area of San Francisco and the much smaller Somerville, Massachusetts. 

Critics say that use of the technology carries a number of risks, including the potential to falsely accuse someone of a crime.

The AI systems driving facial recognition software have been shown to have greater difficulty reading the faces of people of color, having been trained mostly on white faces.

Rekognition purports to be able to read the faces of people and match them against a database. 

HOW DOES FACIAL RECOGNITION TECHNOLOGY WORK?


Facial recognition is increasingly used as a way to access your money and your devices.

When it comes to policing, it could soon mean the difference between freedom and imprisonment.

Faces can be scanned at a distance, generating a code as unique as your fingerprints. 

This is created by measuring the distance between various points, like the width of a person’s nose, distance between the eyes and length of the jawline.

Facial recognition systems check more than 80 points of comparison, known as ‘nodal points’, combining them to build a person’s faceprint.

These faceprints can then be used to search through a database, matching a suspect to known offenders.
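To make the idea concrete, here is a minimal, illustrative Python sketch of that matching step (not Rekognition’s actual pipeline): it turns a handful of hypothetical landmark coordinates into a ‘faceprint’ of pairwise distances and finds the closest entry in a small database. The landmark values, database names and the simple distance-based match are all assumptions for illustration.

```python
# Illustrative only: a toy faceprint built from pairwise landmark distances,
# matched against a database by smallest Euclidean difference.
from itertools import combinations
import numpy as np

def faceprint(landmarks: np.ndarray) -> np.ndarray:
    """Build a feature vector from the distance between every pair of landmarks
    (eye corners, nose width, jawline points and so on)."""
    return np.array([np.linalg.norm(a - b) for a, b in combinations(landmarks, 2)])

def best_match(probe: np.ndarray, database: dict) -> tuple:
    """Return the database entry whose faceprint lies closest to the probe."""
    return min(
        ((name, np.linalg.norm(probe - fp)) for name, fp in database.items()),
        key=lambda item: item[1],
    )

# Hypothetical 2D landmarks; real systems use 80+ nodal points.
suspect = faceprint(np.array([[0, 0], [4, 0], [2, 3], [2, -5]], dtype=float))
database = {
    "offender_A": faceprint(np.array([[0, 0], [4.1, 0], [2, 2.9], [2, -5.2]], dtype=float)),
    "offender_B": faceprint(np.array([[0, 0], [3.0, 0], [1.5, 2.0], [1.5, -4.0]], dtype=float)),
}
print(best_match(suspect, database))  # prints the closest faceprint and its distance
```

In a real deployment the faceprint would come from a learned embedding rather than raw distances, and a match would typically only count if the distance fell below a tuned threshold.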

Facial scanning systems used on personal electronic devices function slightly differently, and vary from gadget to gadget.

The iPhone X, for example, uses Face ID via a 7MP front-facing camera on the handset, which has multiple components.

One of these is a Dot Projector that projects more than 30,000 invisible dots onto your face to map its structure.

The dot map is then read by an infrared camera and the structure of your face is relayed to the A11 Bionic chip in the iPhone X, where it is turned into a mathematical model.

The A11 chip then compares your facial structure to the facial scan stored in the iPhone X during the setup process.  

Security cameras use artificial intelligence-powered systems that can scan for faces, then re-orient, skew and stretch them, before converting them to black-and-white to make facial features easier for computer algorithms to recognise.
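The following short sketch, assuming the OpenCV library is available, shows roughly what that normalisation step looks like: detect a face in a frame, crop and resize it, and convert it to black-and-white. The file name and output size are placeholders, and this is not any vendor’s actual pipeline.

```python
# Illustrative pre-processing sketch: detect, crop, resize and grey-scale a face.
import cv2

def normalise_face(image_path: str, size: int = 160):
    """Return a grayscale, fixed-size face crop, or None if no face is found."""
    img = cv2.imread(image_path)                  # placeholder input frame
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # black-and-white simplifies features
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                         # take the first detected face
    return cv2.resize(gray[y:y + h, x:x + w], (size, size))

crop = normalise_face("cctv_frame.jpg")           # "cctv_frame.jpg" is hypothetical
```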

Error rates with facial recognition can be as low as 0.8 per cent. While this sounds low, in the real world it means that eight in every 1,000 scans could falsely identify an innocent party.

One such case, reported in The Intercept, details how Steven Talley was falsely matched to security footage of a bank robber.
