
ACLU Calls on Durkan to Ban Facial Recognition Software After Possible SPD Violation

Clearview AI Software Logo (Source: Creative Commons)

By Paul Kiefer

In early November, a blogger’s public records request turned up evidence that a Seattle police officer had been using a widely criticized facial recognition tool called Clearview AI for over a year, possibly violating Seattle Police Department policy and raising questions from privacy advocates about the use of prohibited surveillance technology within SPD.

On Wednesday, the ACLU of Washington responded to the revelation by calling for Mayor Jenny Durkan to issue a specific ban on the use of facial recognition software by city agencies, as well as for a city council hearing to question SPD representatives about their use of surveillance tools.

As PubliCola first reported in November, the ACLU sounded the alarm after the department released roughly 200 emails containing references to Clearview AI, a search engine for faces that enables law enforcement agencies to identify unknown people—protest participants, for example—by matching their photos to online images, allowing police to arrest or interrogate them.

Clearview AI has been the subject of harsh condemnation from privacy and police accountability advocates since it first drew national attention last year. The company’s business model relies on scraping billions of images from across the internet without permission; as a result, Clearview AI’s database of faces includes untold numbers of people with no criminal background whatsoever.


Most of the emails SPD released were promotional offers sent from Clearview AI to SPD officers of all ranks, including former Police Chief Carmen Best. But one officer—Detective Nicholas Kartes of the South Precinct’s burglary unit—accepted the company’s offer, opening an account with his work email in September 2019. In the past year, Kartes corresponded with a Clearview AI representative about his experiences “experimenting” with the application, and login alerts sent to Kartes’ work email indicated that the account was used on at least two desktop computers. Both computers’ IP addresses place them in Seattle city government buildings, and one IP address belongs to a secure city network.

The revelation was alarming enough to prompt Office of Police Accountability Director Andrew Myerberg to launch an investigation into Kartes’ use of Clearview AI. However, Myerberg told PubliCola in November that merely opening an account with Clearview AI might not constitute a policy violation, though using the account for law enforcement purposes would be a clear violation of department policy. He added that there is no precedent for that kind of misconduct.

But the city council’s 2018 surveillance ordinance, which restricts SPD’s use of surveillance technologies, might not cover Kartes’ use of unapproved software. Mary Dory, a public safety auditor working with the Office of the Inspector General on the case, told PubliCola in November that the ordinance was designed to address the use of surveillance technologies by SPD itself, not the behavior of an individual officer using surveillance software without the department’s knowledge.

That dilemma is now at the center of the ACLU’s disagreement with Interim Police Chief Adrian Diaz. Jennifer Lee, the manager of the ACLU of Washington’s Technology and Liberty Project, told PubliCola that her organization sees Kartes’ use of Clearview AI as a violation of the surveillance ordinance, and believes that SPD is liable for Kartes’ infractions. She cited Kartes’ use of his work email—and, possibly, his work computer—as evidence that the detective opened a Clearview AI account for law enforcement purposes.

Lee says that the ACLU of Washington is calling for Durkan to issue a targeted ban on facial recognition technology. “We have a surveillance ordinance which is supposed to prevent exactly what happened: SPD secretly using a surveillance technology,” she told PubliCola. “But it’s clear that without an explicit prohibition on facial recognition use, there are risks that remain.”

A press release from the ACLU sent out on Wednesday morning also called for council members Lisa Herbold and Alex Pedersen, the chairs of the council’s public safety and transportation and public utilities committees, respectively, to hold a public hearing to “get answers from SPD about its use of Clearview AI and other surveillance tools.”

In a response sent to the ACLU of Washington on Wednesday afternoon, Diaz categorically denied that SPD has sanctioned the use of Clearview AI by its officers. “We have no intention or interest in pursuing a partnership with Clearview AI or acquiring the use of any facial recognition technology,” he wrote. He also challenged the ACLU’s assertion—included in its press release—that multiple SPD detectives have used Clearview AI since September, pointing out that the emails only clearly point to Kartes’ use of the technology. (In November, Lee told PubliCola that the login alerts from multiple desktop computers point to the possibility of multiple detectives using Kartes’ account.)

Diaz also made a passing reference connecting the Clearview AI promotional emails to a possible phishing attempt involving city of Seattle email addresses; PubliCola has reached out for clarification.

Because Diaz’s response rejects the ACLU’s assertion that the department is liable for Kartes’ conduct, the group’s call for Durkan to issue a specific ban on facial recognition software now faces an uphill battle.

Morning Crank: Incongruous With Their Fundamental Mission

Futurewise logo

1. For years, environmental advocates who support urban density as a tool against sprawl have grumbled about the fact that the anti-sprawl nonprofit Futurewise has two men on its board who make a living fighting against the foundational principles of the organization—attorneys Jeff Eustis and David Bricklin. Both men were ousted from the Futurewise board last month, after the board voted to limit members to no more than three successive terms.

Eustis and Bricklin are at odds with Futurewise on a number of high-profile local issues, including the question of whether Seattle should allow more people to live in single-family areas, which occupy 75 percent of the city’s residential land but house a shrinking fraction of Seattle’s residents. Eustis is currently representing the Queen Anne Community Council, headed by longtime anti-density activist Marty Kaplan, in its efforts to stop new rules that would make it easier to build backyard cottages and basement apartments in single-family areas. Bricklin represents homeowner activists working to stop the city’s Mandatory Housing Affordability (MHA) plan, which would allow townhouses and small apartment buildings in 7 percent of the city’s single-family areas.

To get a sense of how incongruous this work is with Futurewise’s primary mission, consider this: Futurewise is one of the lead organizations behind Seattle For Everyone, the pro-density, pro-MHA, pro-housing group. Bricklin co-wrote an op-ed in the Seattle Times denouncing MHA and calling it a “random” upzone that fails to take the concerns of single-family neighborhoods into account.

Bricklin’s firm also represents the Shorewood Neighborhood Preservation Coalition, a group of homeowners who have protested a plan by Mary’s Place to build housing for homeless families on Ambaum Blvd. in Burien on the grounds that dense housing (as opposed to the existing office buildings) is incompatible with their single-family neighborhood. The Burien City Council approved the upzone, 4-3, after a heated debate this past Monday night at which one council member, Nancy Tosta, suggested that instead of allowing homeless families to live on the site, the city should preserve it as office space, since “part of the way of dealing with homelessness is to have people make more money.”

Bricklin is still on the boards of Climate Solutions, the Washington Environmental Council, and Washington Conservation Voters.


2. Seattle City Council members reached no resolution this week on a proposal from the mayor’s office to approve the city’s purchase of GrayKey, a technology that enables police to easily (and cheaply) unlock cell phones and review their contents, including location data, without putting the technology through a privacy assessment under the city’s stringent surveillance ordinance. If the city determines that a technology is a form of surveillance, it has to prepare a surveillance impact report that “include[s] an in-depth review of privacy implications, especially relating to equity and community impact,” according to the ordinance. The process includes public meetings, review by a special advisory group, and approval by the council at a meeting open to the public. In contrast, technologies that intrude on privacy but aren’t considered surveillance only require a “privacy impact analysis” that is not subject to formal public process or council approval. Previous examples of technologies the city has deemed to be surveillance include license-plate readers (used to issue traffic tickets) and cameras at emergency scenes.

The city’s IT department, which answers to the mayor, determined that GrayKey is not a “surveillance technology” after the company submitted answers to a list of questions from the city suggesting that the technology would only be used if the Seattle Police Department obtained a warrant to search a person’s phone. In an email appended to that report, Seattle’s chief privacy officer, Ginger Armbruster, wrote, “If phones are acquired either under warrant or with suspect[‘]s knowledge then this is not surveillance by ordinance definition.” In other words, Armbruster is saying that as soon as SPD gets a warrant to break into someone’s phone and scrape their data, the surveillance rules, by definition, no longer apply.

ACLU Technology and Liberty Project Director Shankar Narayan disagrees with this interpretation, noting that the surveillance law doesn’t include any exemption for warrants. “The ordinance is about the entire question of whether it’s an appropriate technology for an agency to have, and encompasses a much broader set of concerns. If the warrant serves the same function as a surveillance ordinance”—that is, if anything the police do after they get a warrant is de facto not surveillance—”then why do we need a surveillance ordinance? The intent of the council was to put scrutiny on technologies that are invasive—as, clearly, a technology that allows police to open your cell phone and download data about the intimate details of your life is.” It’s the technology, in other words—not how the city claims it will be used—that matters.

The city’s initial privacy assessment is brief and unilluminating. GrayKey skipped many of the city’s questions, answered others with perfunctory one-word responses, and addressed many of the questions it skipped with the same all-purpose sentence: “this solution is used for Police case forensic purposes only.”

Proponents of GrayKey’s technology (and GrayKey itself) say that the police will limit its use to child sexual abuse cases—the kind of crimes that tend to silence concerns about privacy because of their sheer awfulness. Who could possibly object to breaking into the phones of child molesters? Or terrorists? Or murderers? As council member Bruce Harrell, who said he does not consider GrayKey a surveillance technology, put it Tuesday, “No one has a right to privacy when they are visiting child pornography sites.”

The problem is that in the absence of review under the surveillance ordinance, even if police claim they will only use GrayKey to investigate the worst kinds of crimes, there will be no way of knowing how they are actually using it. (Narayan says police departments frequently claim that they will only use surveillance technology to hunt down child molesters or terrorists, to create political pressure to approve the technology or risk looking soft on crime.) The council can state its preference that the technology be limited to certain types of especially heinous crimes, but if the phone-cracking technology isn’t subject to the ordinance, which allows the city council to place legally binding limits on the use of surveillance tools, the decision facing the city is essentially binary: approve (and purchase) the technology and hope for the best, or don’t.

This is why privacy advocates consider it so important to look at surveillance technology thoroughly, and to give the public real opportunities to weigh in on granting the city sweeping authority to review people’s movements and access their data. Harrell said Tuesday that he didn’t want to “jump every time the ACLU says [a technology] raises issues,” and that he was confident that additional review by the executive would resolve any questions the council might have. But, as council member Lisa Herbold pointed out, there’s no requirement that the mayor’s office present the results of any future internal privacy assessment to the council—they can run it through a privacy impact assessment, reach the same conclusions they’ve already reached, and post it on the website with all the others without any additional input from the council or the public. The only way to ensure that concerns are daylighted before the city buys this, or any other, technology that could invade people’s privacy is to determine that GrayKey is surveillance, and put it through the process.

At the end of Tuesday’s meeting, the council’s governance, equity, and technology committee had made no decision on whether to subject GrayKey to additional scrutiny or wait to see what the mayor’s office does next. The city currently plans to purchase the phone-cracking technology sometime in the third quarter of next year.