Police in New Mexico said they recently received a life-saving 911 call from an unexpected source — a smart home device.
According to ABC News, an armed man, threatening to kill his girlfriend, asked, “Did you call the sheriffs?” A nearby voice-activated device (Amazon’s Echo, which has a virtual assistant named Alexa) mistook the question for a command and called 911, prompting a concerned dispatcher to send over the cops. The girlfriend and her daughter escaped physically unharmed, and the man was arrested after an hours-long standoff with police.
“The unexpected use of this new technology to contact emergency services has possibly helped save a life,” said Bernalillo County Sheriff Manuel Gonzales III, in a statement to ABC News.
Later reports disputed the role the smart home device played in the scenario. Amazon says its Echo devices currently can’t call numbers like 911 because they aren’t connected to a mobile phone network, as the iPhone’s Siri is, for instance. Alexa can call another Alexa-powered device, assuming that device is in the caller’s contact list, but that didn’t happen here. Google Home can’t call 911 either.
While the specifics of this recent incident still aren’t clear (the police report said the victim could be heard on the phone saying “Alexa, call 911,” which doesn’t make much sense, since she was already on the phone with law enforcement), it’s possible that in the near future, smart home devices will be able to interpret ambient conversations and act on them. Later iterations of these devices may be able to call 911 if they detect signs of an emergency (cries of “Help me, I’m hurt!” or “The house is on fire!”) or even a crime in progress, for instance overhearing a gunman say, “Put your hands up!”
Smart home devices are always “listening” for voice commands, even recording and storing snippets of audio to improve their search algorithms and artificial intelligence. If police had access to these recordings — both on the device and in the cloud — that could provide critical evidence for investigators.
In a high-profile case, Arkansas police investigating a 2015 murder tried to compel Amazon to turn over cloud-based recordings from an Echo recovered from the defendant’s home, but the company fought back, citing First Amendment protections. The case highlights the ongoing struggle between efforts by law enforcement to ensure public safety and constitutional protections against unreasonable surveillance, search and seizure.
Natasha Duarte is a policy analyst at the Center for Democracy & Technology, a nonprofit that advocates for digital rights and privacy. She says that the home is where Americans have the greatest expectation of privacy.
“I don’t think that having a smart home device should be seen as changing that expectation,” Duarte says. “It’s a complex space that we’re entering where we have these recordings being made in the place where people expect to have the most privacy.”
But as the artificial intelligence capabilities of smart home technology improve, it’s easy to imagine a device that goes beyond basic voice commands and acts on its own to “serve and protect.”
James Baker is a 40-year veteran of the Vermont State Police who’s now director of advocacy for the International Association of Chiefs of Police. His organization believes in balancing privacy concerns with providing public safety, but also thinks that technology should be used more proactively to safeguard citizens. And that includes future smart devices that would call the police in an emergency.
“From our viewpoint, anything that passes constitutional muster that provides public safety to citizens would be a good thing,” Baker says. “I expect the knee-jerk reaction to that would be, ‘Oh my God, that’s violating people’s privacy.’”
Not necessarily, says Duarte at the Center for Democracy & Technology. There are already some smart devices on the market, like voice-activated panic detectors for the elderly, that contact emergency dispatchers when triggered by pre-selected voice commands. These “wake words,” says Duarte, provide an important privacy protection.
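The privacy value of a wake word is architectural: nothing leaves the device unless a trigger phrase was just heard. A minimal Python sketch of that gating logic (all names, phrases, and functions here are illustrative assumptions, not any vendor’s real firmware or API) might look like this:

```python
# Hypothetical sketch of wake-word gating: audio is transcribed and
# checked locally, and discarded unless a trigger phrase has just
# been detected. Names and phrases are illustrative only.

WAKE_WORDS = {"alexa", "help me now"}  # pre-selected trigger phrases

def handle_audio(transcribed_chunks):
    """Return only the chunks that immediately follow a wake word;
    everything else is dropped on-device and never stored."""
    armed = False
    sent = []
    for chunk in transcribed_chunks:
        if armed:
            sent.append(chunk)  # forwarded (e.g., to a dispatcher)
            armed = False
        elif chunk.lower() in WAKE_WORDS:
            armed = True        # next utterance is treated as a command
    return sent

# An ordinary conversation passes through untouched; only the
# utterance after the wake word is acted on.
print(handle_audio(["dinner is ready", "alexa", "call 911", "goodnight"]))
# → ['call 911']
```

The point of the design is that the decision to transmit is made by a fixed, user-known trigger rather than by the device’s own judgment about what it overhears.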
The bigger privacy and safety concerns come when we let the machine make decisions for itself. Duarte cites the limited accuracy of COMPAS, a risk-assessment tool that uses big data analysis to predict whether an individual with a criminal record will commit another crime within two years. A ProPublica study found the system was wrong 40 percent of the time.
“Considering that machines are not very good at actually predicting crime yet, you increase the risk of false positives,” says Duarte. “So, you increase the risk of law enforcement being called or having access to information when there was no criminal activity. And, of course, that’s a privacy issue.”
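Duarte’s point about false positives follows from simple base-rate arithmetic. A short Python sketch, using entirely hypothetical numbers (90 percent sensitivity, 95 percent specificity, and a genuine emergency in 1 of every 1,000 overheard exchanges), shows how even a seemingly accurate detector would produce mostly false alarms:

```python
# Illustration with hypothetical numbers: when the event being detected
# is rare, most alerts from an imperfect detector are false positives.

def false_alarm_fraction(sensitivity, specificity, base_rate):
    """Fraction of all alerts that are false positives (Bayes' rule)."""
    true_alerts = sensitivity * base_rate            # real emergencies caught
    false_alerts = (1 - specificity) * (1 - base_rate)  # ordinary talk misflagged
    return false_alerts / (true_alerts + false_alerts)

# 90% sensitivity, 95% specificity, emergencies in 1 of 1,000 exchanges:
print(round(false_alarm_fraction(0.90, 0.95, 0.001), 3))
# → 0.982
```

Under these assumed numbers, roughly 98 percent of alerts would be false alarms, simply because real emergencies are so rare relative to ordinary conversation.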
For Baker, who ran the Vermont State Police and its police academy, the defense of privacy can stand in the way of critical police work. His organization is particularly concerned with strong encryption on smartphones and apps that allow criminals and terrorists to easily “go dark.” Even when police have a lawful warrant from a judge to search a device, they often can’t get access without a password. Companies like Apple and Google have refused to build in backdoors for law enforcement.
Baker believes that if a smart home device could help solve a crime or prevent a tragedy, we should use it.
“These devices can hear what’s going on in the house and the technology allows 911 to pinpoint where the device is and get police resources there. That’s public safety,” Baker says. “That’s saving people’s lives.”