Archive for May, 2014

Promising Seattle budget workshop – privacy in the works

We had a promising conversation at last night’s Public Safety and Civil Rights budget workshop. In addition to letters of support from the Citizens Technology and Telecommunications Board (CTTAB) and the Human Rights Commission, it was a great help to have the support of both the West Seattle Blog and, maybe most of all, Brendan Kiley at the Stranger, whose article was cited to me by one legislative aide as the impetus for more than one briefing yesterday. Can’t appreciate or fight for our free press enough.

It sounds like privacy oversight is in the works, although what form it will take remains to be determined, and we’ll be keeping a close eye on it. With the right combination of luck and annoying persistence, we’ll have a role in shaping it. Councilmember O’Brien is taking the lead, Councilmember Harrell told us tonight, but Harrell himself made a strong statement of support for the formation of a citizens’ advisory board. Councilmember Burgess pointed us in the direction of Redlands, CA, as an example of privacy activists having input into policing, so we’ll check that out.

FWIW, here’s what I MEANT to say tonight to Councilmembers Harrell, Burgess, Rasmussen, and Licata, at the Public Safety and Civil Rights budget workshop meeting. As to what I actually said — that’s God’s own private mystery, as Sailor would say.

“Seattle needs a strong privacy process. Today, there’s none whatsoever, to the detriment of all. There’s no Chief Privacy Officer, there’s no privacy oversight board, there’s no formal consideration of privacy, and there is no documentation of that non-consideration.

This lack comes with a cost. For example, the surveillance cameras that the City installed and then removed from Cal Anderson Park cost the City $145,800 to deploy and then remove, according to the fiscal note with Ordinance 123411.

The City’s inability to address privacy concerns also prevents full deployment of the City’s already installed mesh network, designed to assist emergency first responders.
The lack of oversight resulted in the surprise purchase of drones, later boxed due to privacy concerns. It has resulted in the installation of 28 unusable surveillance cameras along Seattle waterways from Alki Beach to Alaskan Way to Ballard. The cameras are reportedly disabled, but their fate is in limbo, and they have already cost taxpayers $5 million.

We can’t go on this way. Please, make privacy protection a funding priority this year.”


ShotSpotter (SST, Inc.) Fact Sheet prepared for City of Seattle

We are about to send a letter to the City to propose a privacy oversight board. Accompanying it will be a fact sheet about ShotSpotter, the gunfire location service. Why ShotSpotter? Because the company has been mentioned as a possible solution to the recent shootings in the Central District and elsewhere. In general, Seattle Privacy is skeptical of technological quick fixes for deep social problems, so we did some digging about ShotSpotter, and here are the results:

ShotSpotter (SST, Inc.) Fact Sheet


But also: A ShotSpotter Gallery

The company doesn’t like to reveal what their expensive and dubiously effective equipment looks like, so here’s a remedy for that!


[Image: shotspotter1]

(Source: http://www.milwaukeecriminallawyerblog.com/2014/01/bill-would-increase-funding-for-milwaukee-pds-shotspotter-program.shtml)


Watch out, bird. We are listening. And looking. And probably irradiating you.

[Image: shotspotter2]

(Source: http://cedarposts.blogspot.com/2012/08/cmpds-shot-spotter-goes-live-in-uptown.html)


Whoa! This is a nice shot.


[Image: shotspotter3]

“In the last three years, gunshot detection sensors in Newark went off 3,632 times, and 17 shooters were arrested on scene. But for more than half of the sensors in Newark, there is no accompanying camera for several blocks. That leaves officers with insufficient information to act. ‘So you might get a vehicle taking off, you might pick up somebody discharging a weapon,’ Carpenter said. But catching the person who fired the weapon? ‘Very rare, because you would have to have cameras in every corner of the city in order for that to actually work.’ It costs Newark taxpayers about $80,000 a year to maintain the current system. But critics argue the total cost is much more than that, given the way police respond when a detector goes off. Since 2010, 75 percent of the gunshot alerts have been false alarms. But police are often deployed to the location anyway, just in case there is a shooter.”

(Source: http://cedarposts.blogspot.com/2012/08/cmpds-shot-spotter-goes-live-in-uptown.html)
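For scale, the Newark figures quoted above can be combined into a rough back-of-the-envelope calculation. (The per-arrest maintenance figure is our own division of the cited numbers, not a statistic from the article.)

```python
# Back-of-the-envelope using only the figures quoted above.
activations = 3632           # sensor activations over three years in Newark
arrests_on_scene = 17        # shooters arrested at the scene
false_alarm_rate = 0.75      # share of alerts that were false alarms since 2010
annual_maintenance = 80_000  # reported yearly maintenance cost to taxpayers

arrest_rate = arrests_on_scene / activations
false_alarms = activations * false_alarm_rate
cost_per_arrest = 3 * annual_maintenance / arrests_on_scene  # three years of maintenance

print(f"On-scene arrest rate: {arrest_rate:.2%}")          # ~0.47% of activations
print(f"Estimated false alarms: {false_alarms:.0f}")       # ~2724 alerts
print(f"Maintenance cost per arrest: ${cost_per_arrest:,.0f}")  # ~$14,118
```

In other words, fewer than one activation in two hundred led to an on-scene arrest, which is the kind of ratio the City should weigh before buying.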

Sample Code of Practice for Privacy Impact Assessments (PIAs)

We’ve been looking around for tools and practices that Seattle city government could leverage to make progress in protecting the privacy of Seattleites, and we’ve run across Privacy Impact Assessments, or PIAs, and Chief Privacy Officers, or CPOs.

We’re still researching them, but PIAs seem like they could be applied to municipal departments and programs in much the same way the Racial Equity Toolkit is used in Seattle now.

The Racial Equity Toolkit is a tool from the Seattle Race and Social Justice Initiative (RSJI), a citywide effort to end institutionalized racism and race-based disparities in City government. RSJI builds on the work of the civil rights movement and the ongoing efforts of individuals and groups in Seattle to confront racism. The Initiative’s long-term goal is to change the underlying system that creates race-based disparities in Seattle and to achieve racial equity. The Toolkit itself is designed to assist departments in analyzing the racial equity impact of policies, programs, initiatives, and budget issues.

You can download a PDF of the Racial Equity Toolkit here, and see the worksheet designed to help city departments evaluate their programs. We’re proud of Seattle City government’s commitment to social justice and the City’s commitment of time and resources to addressing this issue.

We’d love to see a similar spirit of problem-solving brought to the ever-growing problem of privacy protection, and we’re wondering whether adopting PIAs might be a step in that direction.

A privacy impact assessment is a tool organizations can use to identify and reduce privacy risks for their projects.

Here are a couple of useful links for examples of PIAs, pointed to by Adam Shostack in his new book Threat Modeling: Designing for Security:

And more examples we’ve found online:

Here’s a PDF of a code of practice for conducting privacy impact assessments, published by the UK’s Information Commissioner’s Office.

Here’s an excerpt where the code defines physical and informational privacy and explains the concept of privacy risk:

“Privacy, in its broadest sense, is about the right of an individual to be let alone. It can take two main forms, and these can be subject to different types of intrusion:

  • Physical privacy – the ability of a person to maintain their own physical space or solitude. Intrusion can come in the form of unwelcome searches of a person’s home or personal possessions, bodily searches or other interference, acts of surveillance and the taking of biometric information.
  • Informational privacy – the ability of a person to control, edit, manage and delete information about themselves and to decide how and to what extent such information is communicated to others. Intrusion can come in the form of collection of excessive personal information, disclosure of personal information without consent and misuse of such information. It can include the collection of information through the surveillance or monitoring of how people act in public or private spaces and through the monitoring of communications whether by post, phone or online and extends to monitoring the records of senders and recipients as well as the content of messages.”

The code is mostly designed to address informational privacy. Here’s a nice summary of privacy risk — the risk of harm arising through an intrusion into privacy.

“Some of the ways this risk can arise is through personal information being:

  • Inaccurate, insufficient or out of date;
  • Excessive or irrelevant;
  • Kept for too long;
  • Disclosed to those who the person it is about does not want to have it;
  • Used in ways that are unacceptable to or unexpected by the person it is about; or
  • Not kept securely.

Harm can present itself in different ways. Sometimes it will be tangible and quantifiable, for example financial loss or losing a job. At other times it will be less defined, for example damage to personal relationships and social standing arising from disclosure of confidential or sensitive information. Sometimes harm might still be real even if it is not obvious, for example the fear of identity theft that comes from knowing that the security of information could be compromised. There is also harm which goes beyond the immediate impact on individuals. The harm arising from use of personal information may be imperceptible or inconsequential to individuals, but cumulative and substantial in its impact on society. It might for example contribute to a loss of personal autonomy or dignity or exacerbate fears of excessive surveillance.”

Visit the UK Information Commissioner’s Office website for more information about the agency’s code of practice here.

For in-depth discussion and description of Chief Privacy Officers, see Chief Privacy Officer, US Department of Education, or Authorities and Responsibilities of the Chief Privacy Officer, Department of Homeland Security, or Chief Privacy Officer, IT Law Wiki.