Tag: Clearview AI

ACLU Calls on Durkan to Ban Facial Recognition Software After Possible SPD Violation

Clearview AI Software Logo (Source: Creative Commons)

By Paul Kiefer

In early November, a blogger’s public records request turned up evidence that a Seattle police officer had been using widely criticized facial recognition software called Clearview AI for over a year, possibly violating Seattle Police Department policy and raising questions from privacy advocates about the use of prohibited surveillance technology within SPD.

On Wednesday, the ACLU of Washington responded to the revelation by calling on Mayor Jenny Durkan to issue a specific ban on the use of facial recognition software by city agencies, and for the city council to hold a hearing to question SPD representatives about the department’s use of surveillance tools.

As PubliCola first reported in November, the ACLU sounded the alarm after the department released roughly 200 emails containing references to Clearview AI, a search engine for faces that enables law enforcement agencies to identify unknown people—protest participants, for example—by matching their photos to online images, allowing police to arrest or interrogate them.

Clearview AI has been the subject of harsh condemnation from privacy and police accountability advocates since it first drew national attention last year. The company’s business model relies on scraping billions of images from across the internet without permission; as a result, Clearview AI’s database of faces includes untold numbers of people with no criminal background whatsoever.

Most of the emails SPD released were promotional offers sent from Clearview AI to SPD officers of all ranks, including former Police Chief Carmen Best. But one officer—Detective Nicholas Kartes of the South Precinct’s burglary unit—accepted the company’s offer, opening an account with his work email in September 2019. Over the past year, Kartes corresponded with a Clearview AI representative about his experiences “experimenting” with the application, and login alerts sent to Kartes’ work email indicated that the account was used on at least two desktop computers. Both computers’ IP addresses place them in Seattle city government buildings, and one IP address belongs to a secure city network.

The revelation was alarming enough to prompt Office of Police Accountability Director Andrew Myerberg to launch an investigation into Kartes’ use of Clearview AI. However, Myerberg told PubliCola in November that merely opening an account with Clearview AI might not constitute a policy violation, though using the account for law enforcement purposes would be a clear violation of department policy. He added that there is no precedent for that kind of misconduct.

But the city council’s 2018 surveillance ordinance, which restricts SPD’s use of surveillance technologies, might not cover Kartes’ use of unapproved software. Mary Dory, a public safety auditor working with the Office of the Inspector General on the case, told PubliCola in November that the ordinance was designed to address the use of surveillance technologies by SPD itself, not the behavior of an individual officer using surveillance software without the department’s knowledge.

That dilemma is now at the center of the ACLU’s disagreement with Interim Police Chief Adrian Diaz. Jennifer Lee, the manager of the ACLU of Washington’s Technology and Liberty Project, told PubliCola that her organization sees Kartes’ use of Clearview AI as a violation of the surveillance ordinance, and believes that SPD is liable for Kartes’ infractions. She cited Kartes’ use of his work email—and, possibly, his work computer—as evidence that the detective opened a Clearview AI account for law enforcement purposes.

Lee says that the ACLU of Washington is calling for Durkan to issue a targeted ban on facial recognition technology. “We have a surveillance ordinance which is supposed to prevent exactly what happened: SPD secretly using a surveillance technology,” she told PubliCola. “But it’s clear that without an explicit prohibition on facial recognition use, there are risks that remain.”

A press release from the ACLU sent out on Wednesday morning also called for council members Lisa Herbold and Alex Pedersen, the chairs of the council’s public safety and transportation and public utilities committees, respectively, to hold a public hearing to “get answers from SPD about its use of Clearview AI and other surveillance tools.”

In a response sent to the ACLU of Washington on Wednesday afternoon, Diaz categorically denied that SPD has sanctioned the use of Clearview AI by its officers. “We have no intention or interest in pursuing a partnership with Clearview AI or acquiring the use of any facial recognition technology,” he wrote. He also challenged the ACLU’s assertion—included in its press release—that multiple SPD detectives have used Clearview AI since September, pointing out that the emails only clearly point to Kartes’ use of the technology. (In November, Lee told PubliCola that login alerts from multiple desktop computers pointed to the possibility that multiple detectives had used Kartes’ account.)

Diaz also made a passing reference connecting the Clearview AI promotional emails to a possible phishing attempt involving city of Seattle email addresses; PubliCola has reached out for clarification.

Because Diaz’s response dismisses the ACLU’s assertion that the department is liable for Kartes’ conduct, the ACLU’s call for Durkan to issue a specific ban on facial recognition software is effectively dead in the water.

SPD Detective’s Use of Prohibited Facial Recognition Software Raises Questions About Surveillance Oversight

Image by FlitsArt from Pixabay

By Paul Kiefer

Over the past year, more than a dozen Seattle Police Department officers have received promotional emails advertising controversial artificial intelligence software called Clearview AI, which bills itself as a kind of Google search for faces. Clearview enables law enforcement agencies to identify unknown people—protest participants, for example—by matching their photos to online images, and to arrest or interrogate them after the fact.

In March, one of the promotional emails made its way into then-Chief Carmen Best’s inbox, along with the inboxes of numerous other SPD officers of varying ranks. But only one officer—Detective Nicholas Kartes of the South Precinct’s burglary unit—appears to have taken the company up on its offer, opening an account with his official Seattle email address more than a year ago.

Under most circumstances, an individual detective’s subscription to questionable surveillance software would go unnoticed. But Clearview AI is uniquely reviled by privacy advocates: its business model, which relies upon billions of images scraped without permission from every corner of the internet, has prompted horrified coverage from outlets as prominent as the New York Times. In fact, Kartes’ subscription to Clearview AI came to light because of an episode of HBO’s Last Week Tonight With John Oliver on the subject.

The episode prompted Seattle-area blogger Bridget Brululo to submit a public records request to SPD in June to determine whether anyone at SPD was using the service. Earlier this month, the department fulfilled the request, providing Brululo with a collection of roughly 200 emails to or from SPD officers mentioning Clearview AI. Most of the emails were promotional, but they also included evidence that Kartes had communicated with the software company and possibly used its service earlier this year.

Controversy aside, SPD officers aren’t currently allowed to use Clearview AI for law enforcement purposes. The surveillance ordinance passed by the city council in 2018 requires city departments to submit new surveillance technologies to a review process that ends with a council vote to approve or prohibit each technology’s use.

Clearview AI—which first attracted widespread attention late last year—is not on the council’s list of approved technologies. But according to Mary Dory, a public safety auditor currently working on the Kartes case with the Office of the Inspector General (OIG), that ordinance doesn’t address the use of surveillance technology by individual officers. “If the department is caught using something outside the bounds of the ordinance, the city can take it away from them,” she said. “It isn’t focused on individual officers who have gone rogue or made a mistake.”

That leaves the city’s accountability partners responsible for investigating Kartes’ use of Clearview AI—namely, the Office of Police Accountability (OPA) and the OIG—in an unfamiliar position. “We’ve seen instances in which officers just didn’t know that they were breaking the rules,” Dory said. “But that points to something systemic—why didn’t the department make sure their officers knew the rules? Or did the officer just ignore them?”

It’s also unclear whether Kartes violated department policy. To Office of Police Accountability Director Andrew Myerberg, the revelation that an SPD detective had been using Clearview AI was alarming enough to prompt his office to launch an investigation, but he told PubliCola that the act of creating an account might not itself constitute a policy violation. “If they used the account for an investigation,” he added, “that would be a clear violation of policy.”

Randall Huserik, a public information officer for SPD, didn’t deny that Kartes used his Clearview AI account within the past year. However, he told PubliCola that the detective downloaded the application onto his personal phone to “experiment with its capacities—not in the course of his duties.”