By Paul Kiefer
In early November, a blogger’s public records request turned up evidence that a Seattle police officer had used a widely criticized facial recognition tool called Clearview AI for over a year, possibly violating Seattle Police Department policy and raising questions from privacy advocates about the use of prohibited surveillance technology within SPD.
On Wednesday, the ACLU of Washington responded to the revelation by calling for Mayor Jenny Durkan to issue a specific ban on the use of facial recognition software by city agencies, as well as for a city council hearing to question SPD representatives about their use of surveillance tools.
As PubliCola first reported in November, the ACLU sounded the alarm after the department released roughly 200 emails containing references to Clearview AI, a search engine for faces that enables law enforcement agencies to identify unknown people—protest participants, for example—by matching their photos to online images, allowing police to arrest or interrogate them.
Clearview AI has been the subject of harsh condemnation from privacy and police accountability advocates since it first drew national attention last year. The company’s business model relies on scraping billions of images from across the internet without permission; as a result, Clearview AI’s database of faces includes untold numbers of people with no criminal background whatsoever.
Most of the emails SPD released were promotional offers sent from Clearview AI to SPD officers of all ranks, including former Police Chief Carmen Best. But one officer—Detective Nicholas Kartes of the South Precinct’s burglary unit—accepted the company’s offer, opening an account with his work email in September 2019. In the past year, Kartes corresponded with a Clearview AI representative about his experiences “experimenting” with the application, and login alerts sent to Kartes’ work email indicated that the account was used on at least two desktop computers. Both computers’ IP addresses place them in Seattle city government buildings, and one IP address belongs to a secure city network.
The revelation was alarming enough to prompt Office of Police Accountability Director Andrew Myerberg to launch an investigation into Kartes’ use of Clearview AI. However, Myerberg told PubliCola in November that merely opening an account with Clearview AI might not constitute a policy violation, though using the account for law enforcement purposes would be a clear violation of department policy. He added that there is no precedent for that kind of misconduct.
But the city council’s 2018 surveillance ordinance, which restricts SPD’s use of surveillance technologies, might not cover Kartes’ use of unapproved software. Mary Dory, a public safety auditor working with the Office of the Inspector General on the case, told PubliCola in November that the ordinance was designed to address the use of surveillance technologies by SPD itself, not the behavior of an individual officer using surveillance software without the department’s knowledge.
That dilemma is now at the center of the ACLU’s disagreement with Interim Police Chief Adrian Diaz. Jennifer Lee, the manager of the ACLU of Washington’s Technology and Liberty Project, told PubliCola that her organization sees Kartes’ use of Clearview AI as a violation of the surveillance ordinance, and believes that SPD is liable for Kartes’ infractions. She cited Kartes’ use of his work email—and, possibly, his work computer—as evidence that the detective opened a Clearview AI account for law enforcement purposes.
Lee says that the ACLU of Washington is calling for Durkan to issue a targeted ban on facial recognition technology. “We have a surveillance ordinance which is supposed to prevent exactly what happened: SPD secretly using a surveillance technology,” she told PubliCola. “But it’s clear that without an explicit prohibition on facial recognition use, there are risks that remain.”
A press release from the ACLU sent out on Wednesday morning also called on council members Lisa Herbold and Alex Pedersen, who chair the council’s public safety committee and its transportation and public utilities committee, respectively, to hold a public hearing to “get answers from SPD about its use of Clearview AI and other surveillance tools.”
In a response sent to the ACLU of Washington on Wednesday afternoon, Diaz categorically denied that SPD has sanctioned the use of Clearview AI by its officers. “We have no intention or interest in pursuing a partnership with Clearview AI or acquiring the use of any facial recognition technology,” he wrote. He also challenged the ACLU’s assertion, included in its press release, that multiple SPD detectives have used Clearview AI since September, pointing out that the emails only clearly point to Kartes’ use of the technology. (In November, Lee told PubliCola that the login alerts from multiple desktop computers point to the possibility of multiple detectives using Kartes’ account.)
Diaz also made a passing reference connecting the Clearview AI promotional emails to a possible phishing attempt involving city of Seattle email addresses; PubliCola has reached out for clarification.
Because Diaz’s response dismisses the ACLU’s assertion that the department is liable for Kartes’ conduct, the ACLU’s call for Durkan to issue a specific ban on facial recognition software is effectively dead in the water.