Steven Musil | CNET
The tech giant warns the technology could be used to create a 1984-like dystopia.
Microsoft is urging governments to enact legislation next year that requires facial-recognition technology to be independently tested to ensure accuracy, prevent unfair bias and protect individuals’ rights.
“The facial recognition genie, so to speak, is just emerging from the bottle,” Microsoft President Brad Smith wrote in a blog post published Thursday. “Unless we act, we risk waking up five years from now to find that facial recognition services have spread in ways that exacerbate societal issues. By that time, these challenges will be much more difficult to bottle back up.”
Smith advocated for human review of facial recognition results rather than leaving them to computers.
“This includes where decisions may create a risk of bodily or emotional harm to a consumer, where there may be implications on human or fundamental rights, or where a consumer’s personal freedom or privacy may be impinged,” he wrote.
He added that those deploying the technology must “recognize that they are not absolved of their obligation to comply with laws prohibiting discrimination against individual consumers or groups of consumers.”
Facial-recognition technology is commonly used for everyday tasks such as unlocking phones and tagging friends on social media, but privacy concerns persist. Advances in artificial intelligence and the proliferation of cameras have made it increasingly easy to watch and track what individuals are doing.
Law enforcement agencies frequently rely on the technology to help with investigations, but the software isn’t without its flaws. Software used by the UK’s Metropolitan Police was reported earlier this year to produce incorrect matches in 98 percent of cases.
Microsoft isn’t alone in raising concerns over the technology’s use. In May, the ACLU revealed that Amazon was selling its facial recognition technology, Rekognition, to law enforcement agencies in the US, including the Orlando Police Department. An ACLU test of Rekognition in July found that the system falsely matched 28 members of Congress with people who had been arrested.
Smith also cautioned that government use of the technology could encroach on democratic freedoms and human rights.
“When combined with ubiquitous cameras and massive computing power and storage in the cloud, a government could use facial recognition technology to enable continuous surveillance of specific individuals,” Smith wrote.
“We must ensure that the year 2024 doesn’t look like a page from the novel 1984.”