
Shotspotter CEO Answers Questions on Gunshot Detectors in Cities

Photo of microphone on street pole next to house
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
May 5, 2015

Gunshot detectors in cities have been in the news in recent weeks as a result of their adoption by New York City, the largest city in the country. These systems consist of a network of microphones installed around a city that listen for gunshots, pinpoint their location, and alert police. I wrote about this issue two years ago; while noting that there were open questions and that transparency was important, I concluded that “gunshot detection in a city does not implicate any significant privacy interests.”

The key question remains whether the technology can record voices. Activists in Oakland, CA pointed out to me that audio from the devices has been used in several court cases, and that some of the system’s microphones are placed on the edge of private property (they also produced this video critical of the technology).

I sat down recently in our New York offices with the CEO of Shotspotter, Ralph Clark, to ask him questions about his company’s system. He outlined for me how the system works:

  • Shotspotter is based on the placement of 15-20 “sensors” per square mile, each containing a microphone, GPS for clock data, memory and processing, and cell capability to transmit data.
  • The sensors are placed as low as 20 feet above the ground, though Clark told me “It’s better for us the higher they are. We don’t like ambient noise, it complicates our lives…. And the higher up the sensors are, the fewer we have to deploy.”
  • The sensors constantly record audio and monitor it for impulsive, explosion-like sounds such as gunshots.
  • The audio is recorded and locally stored by the sensor for “hours or days, not weeks,” Clark said. I got the sense that it was 48 hours. The audio is overwritten on a rolling basis, and this spool cannot be used as a live audio stream. The reason for the retention, Clark said, is so “if we miss a gunshot, we can remotely pull the data off” to analyze what went wrong.
  • When the sensor detects a gunshot-like noise, it sends a report (just a timestamp) to a centralized “LocServer.” When the LocServer gets reports from three or more sensors that line up in time and space, that is considered a “possible gunshot.” The LocServer then contacts the sensors and downloads audio of the sound, including 2 seconds before and 4 seconds after the shot or shots. (A rough sketch of this correlation step follows the list.)
  • Those audio snippets are transmitted to a review center, where humans analyze them (listen to them, review the waveforms, etc.) to decide whether they think it was a real gunshot. If such a determination is made, an alert is sent to local police, along with the audio. The audio goes to the local patrol car, so that the officers know what they’re dealing with. For example, if the gunfire is bursts from a fully automatic weapon, Clark said, “you don’t send just one officer into that situation.”
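To make that pipeline concrete, here is a minimal sketch in Python of how a central server might implement the rule Clark describes: each sensor reports only its position and a timestamp, and three or more reports that line up in time and space are treated as a possible gunshot, triggering a pull of audio from 2 seconds before to 4 seconds after the impulse. This is not Shotspotter's code; the class names, thresholds, and the simple consistency check are all my own illustrative assumptions.

```python
"""Illustrative sketch only; all names and thresholds are assumptions."""
import math
from dataclasses import dataclass

SPEED_OF_SOUND_M_S = 343.0   # rough speed of sound at street level
TIME_TOLERANCE_S = 0.25      # illustrative slack for clock/propagation error
MIN_SENSORS = 3              # "three or more sensors that line up"
SNIPPET_BEFORE_S = 2.0       # audio pulled from 2 s before the impulse...
SNIPPET_AFTER_S = 4.0        # ...to 4 s after it


@dataclass
class ImpulseReport:
    """What a sensor sends upstream: an ID, its known position, a timestamp."""
    sensor_id: str
    x_m: float                # position in a local metric grid (metres)
    y_m: float
    detected_at_s: float      # GPS-disciplined timestamp of the impulse


def reports_line_up(reports: list[ImpulseReport]) -> bool:
    """Check whether the reports are mutually consistent with one sound source.

    For any two sensors hearing the same event, the difference in arrival
    times can be no larger than the time sound needs to travel between them.
    This is a necessary (not sufficient) condition; a real system would
    locate the source by multilateration.
    """
    for i, a in enumerate(reports):
        for b in reports[i + 1:]:
            dist_m = math.hypot(a.x_m - b.x_m, a.y_m - b.y_m)
            max_dt = dist_m / SPEED_OF_SOUND_M_S + TIME_TOLERANCE_S
            if abs(a.detected_at_s - b.detected_at_s) > max_dt:
                return False
    return True


def possible_gunshot(reports: list[ImpulseReport]) -> list[tuple[str, float, float]]:
    """If enough sensors agree, return (sensor_id, snippet_start, snippet_end)
    requests so the audio can be pulled and sent to human reviewers."""
    if len({r.sensor_id for r in reports}) < MIN_SENSORS:
        return []
    if not reports_line_up(reports):
        return []
    return [
        (r.sensor_id,
         r.detected_at_s - SNIPPET_BEFORE_S,
         r.detected_at_s + SNIPPET_AFTER_S)
        for r in reports
    ]


if __name__ == "__main__":
    # Three sensors a few hundred metres apart hearing one impulse.
    demo = [
        ImpulseReport("A", 0.0, 0.0, 100.00),
        ImpulseReport("B", 300.0, 0.0, 100.60),
        ImpulseReport("C", 0.0, 400.0, 100.95),
    ]
    print(possible_gunshot(demo))
```

In this toy version the human review step is simply everything downstream of `possible_gunshot`: the returned snippet windows stand in for the audio that reviewers would listen to before alerting police.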

The company also offers some details on its operation in its privacy policy here.

Clark told me the company has evolved over time, partly in response to privacy concerns. They moved from just selling the sensors to a managed-services model in which they sell the gunshot information as a data subscription, introduced a privacy policy, and, about three years ago, standardized the length of the audio snippets, making the process “less loosey-goosey than it used to be.” The court cases in which Shotspotter audio has been used were ones in which relevant voices happened to be recorded just after the gunshots.

Although Clark was very open and forthcoming with me, and his company has clearly become more privacy-aware, I have several concerns about the situation. The biggest may be that audio from live microphones is stored for days. Storage of any data always raises the specter of security vulnerabilities, and we just don’t know what uses or abuses of such data may emerge down the road. Importantly, Clark did say that Shotspotter had never received a legal subpoena for such recordings. First, he argued, police agencies know that “the cost involved and the likelihood of getting anything useful” make for a bad equation, and that in most cases it wouldn’t work because of the shortness of the spool. “If there were gold in those hills, we’d have many examples,” he argued. Second, he told me, “We’d be fighting them every step of the way. We’d say you’re wasting our time, and you’re jeopardizing our efforts if you make it look like something more than it is, which is a gunshot detection system.” Still, I’d like to see them shorten their retention periods, the shorter the better.

Second, if a microphone were stationed outside my house, I don’t think I would love that, no matter how many assurances I was given. Clark’s answer to that was,

If you’re really worried about that, what about your cellphone? If you’re worried about NSA boogeymen, they’re not going to be using our sensors, they’ll be using your phone. It’s in your pocket and has a better microphone.

That’s likely accurate at the moment. But our job is to worry about every surveillance vector, and the existence of worse surveillance programs is no reason not to worry about lesser ones. Besides, the technology and policy landscape may shift in a few years, for example if NSA surveillance is rolled back. Our concern is a long-term one.

Third, I’d like to see the company release its source code. As a general principle, any computers that serve important public functions should be required to have their source code made public (I have previously discussed this with regard to cars and drones). Placing microphones in public spaces is a very sensitive thing, and while I personally believe what Clark tells me about the focus of his system, and his company’s incentives to keep the technology narrowly focused on gunshot detection make sense, that’s not enough. Not everybody may believe Clark, and it’s fair to want proof. And once the microphones become an embedded, accepted part of our urban landscape, they’re not going to disappear. The assurances Clark is providing, on the other hand, may disappear over time if they’re not rigorously entrenched in law, practice, and expectation. Clark will not be the only CEO, and Shotspotter may not be the only company overseeing such microphones.

Clark did offer to give us at the ACLU a confidential (NDA-protected) review of the company’s source code. Our technologist Daniel Kahn Gillmor tells me that while the offer itself is somewhat confidence-inspiring, (a) we don’t have the bandwidth to undertake such a project, and (b) it would not be sufficient because, among other reasons, the code could be completely replaced and changed through upgrades at any time. A broader systems audit would also be necessary. Perhaps there is some other auditing mechanism that would work for such technologies, such as having an accounting firm (paid for by the city) do the kind of review and monitoring that would assure populations of the narrow focus of these microphones.

The Oakland activists have also questioned Shotspotter’s terms of service (sample here), in which the company reserves the right to own and sell the data it collects. Clark said the intent there is to “develop a ‘Big Data’ play that is monetized with a large federal agency or a research institution” buying their data. He envisions the FBI or Urban Institute (which has already done some narrow work on this) doing work combining gunshot data with data such as school dropout rates, incidence of hypertension, business formation, and the like. “These are traumatic events, these gunfire events,” he said, pointing out that guns are fired a hundred times for every actual gunshot victim. “They’re not being measured. What does it do to a child’s emotional and mental health?” (As an African-American man who grew up in East Oakland, Clark repeatedly spoke with passion about the problems faced in high-crime neighborhoods.)

Overall, given the parameters of the system as Clark describes them, his seeming realization that his company sits on the edge of controversy (thanks in part to those Oakland activists), and thus his incentive not to allow the technology to be turned toward broader ends, I am not losing sleep over it at this time. But I am concerned about the precedent of allowing our cities to be sprinkled with live microphones that are not subject to transparent operation, and about where that will lead in coming years and decades.
