The Washington Post
What Crime Looks Like Before It Happens
Text by Ana Swanson, February 9, 2016

Stopping criminal activity before it happens is usually the domain of science fiction – as in “Minority Report,” where police officers in 2054 use the ability to see into the future to catch murderers before they kill. But some security experts believe a version of that future is much closer than 2054.
Increasingly, smart surveillance cameras are monitoring public places in search of suspicious cues, a high-tech version of “if you see something, say something.” By reviewing massive volumes of ordinary surveillance tape, algorithms can “learn” what type of behavior is typical right before a crime or terrorist attack is committed – like a person suddenly breaking into a run or abandoning a suitcase on a subway platform – and alert authorities.
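One rough way to picture that "learning" step, assuming each frame of footage has already been reduced to a single numeric motion score (a hypothetical simplification; real systems use far richer features than this), is to fit a statistical baseline from ordinary footage and flag anything that deviates sharply from it. A minimal sketch in Python:

```python
# Illustrative sketch only; not the system described in the article.
# Assumes each video frame has been reduced to one "motion score"
# (roughly, how much movement the camera saw). All names are hypothetical.
import statistics

def learn_baseline(normal_scores):
    """Learn what 'typical' motion looks like from hours of ordinary footage."""
    return statistics.mean(normal_scores), statistics.stdev(normal_scores)

def flag_anomalies(new_scores, mean, stdev, threshold=3.0):
    """Flag frames whose motion deviates strongly from the learned norm."""
    flagged = []
    for i, score in enumerate(new_scores):
        z = (score - mean) / stdev
        if abs(z) > threshold:
            flagged.append((i, score))  # candidate alert for a human to review
    return flagged

# Baseline footage: people walking at a steady pace.
baseline = learn_baseline([1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 0.8, 1.1])
# New footage: someone suddenly breaking into a run spikes the motion score.
print(flag_anomalies([1.0, 1.1, 4.5, 1.2], *baseline))
```

The same logic extends to other cues the article mentions, such as an object that stops moving while its owner walks away; the point is that the system only knows "typical versus atypical," not "innocent versus criminal."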
In a photo exhibition called “False Positives,” photographer Esther Hovers captures some examples of the seemingly deviant behavior that these cameras pick up. The photographs, taken in Brussels, the de facto capital of Europe, are montages, partly natural and partly staged, which Hovers created by combining images from several minutes of video.
If some of these behaviors seem relatively innocuous, that is partly Hovers' point. While smart cameras offer real security benefits, they also extend surveillance to behavior that is merely unusual, not criminal in any way. Like many new technologies, smart surveillance systems may carry worrying consequences for privacy and public freedom.
For one, these cameras currently make a lot of mistakes. Hovers says that nine out of 10 alerts that these systems issue today are “false positives” – what the industry calls false alarms.
Part of the problem is that algorithms are much worse than humans at recognizing context, Hovers says. For example, a smart surveillance system might alert authorities if foot traffic on a street reaches much higher volumes than normal – but a human would be able to figure out that the cause is a newly opened market or a town festival.
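The kind of rule Hovers describes can be pictured as a simple threshold on pedestrian counts. The sketch below, with hypothetical numbers rather than any real deployment, shows why such a rule fires on a festival crowd just as readily as on anything genuinely suspicious:

```python
# Illustrative sketch only, assuming hourly pedestrian counts for one street.
# The numbers and names are hypothetical.
TYPICAL_HOURLY_COUNT = 120   # learned from past footage of this street
ALERT_MULTIPLIER = 3         # alert when traffic is several times normal

def check_foot_traffic(hourly_count):
    """A threshold rule with no notion of context: it cannot tell a
    newly opened market or a town festival apart from a real anomaly."""
    if hourly_count > ALERT_MULTIPLIER * TYPICAL_HOURLY_COUNT:
        return "alert: unusually heavy foot traffic"
    return "normal"

print(check_foot_traffic(110))  # ordinary afternoon -> normal
print(check_foot_traffic(450))  # festival day -> alert, a false positive
```

A human operator supplies the missing context; the camera only supplies the deviation.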
Hovers says her project is more about future possibilities than the current state of security, since the vast majority of cameras in use today are not yet smart cameras. But Washington, Boston, Chicago, Amsterdam and other cities have begun testing out smart surveillance technology.
While everyone wants security, Hovers says she is concerned about the kind of judgments such a system imposes on a society, and whether it would restrain some types of “abnormal” public expression – of which art could be considered one.
“Not every type of deviant behavior is criminal behavior, and I’m happy about that, actually,” she says.