Paranoia has never been so social.
A slew of community apps meant to inform locals of crime in their area are fostering a false sense of rising danger in America, even though crime rates have plummeted over the past two decades, according to the Bureau of Justice Statistics and the FBI.
“If you see more coverage of crime, you think it’s more of an issue, even if real-world statistics say it isn’t,” David Ewoldsen, a professor in Michigan State University’s Department of Media and Information, tells Vox.
The apps Citizen (previously known by the problematic name Vigilante), Nextdoor, and Amazon Ring’s Neighbors all crowdsource reports of local crime and notify users of nearby incidents, serving as personalized alert platforms. They rank among the most downloaded social and news apps in the US on both the App Store and Google Play.
The crowdsourcing means many reports are inaccurate or more bizarre than criminal, producing laughably viral internet fodder and a Reddit group dedicated to the most ridiculous Citizen posts. These include alerts like “table on highway,” “man spitting on MetroCard machines” and “swan in subway station.”
In other cases, the apps have fulfilled their purported purpose of alerting users to nearby incidents, as when Citizen notified one man in February that there was a fire in his own building.
The decline of local news outlets could be one factor bolstering the apps’ popularity. But there are downsides: in addition to sowing fear, the apps could be exacerbating racial bias.
“There’s very deep research saying if we hear about or read a crime story, we’re much more likely to identify a black person than a white person [as the perpetrator],” Ewoldsen says.
“So the biases baked into the decisions around who is suspicious and who is arrested for a crime end up informing future policing priorities and continuing the cycle of discrimination,” according to Steven Renderos, senior campaigns director at the Center for Media Justice.
He says users can try to avoid the apps’ negative side effects by reducing the radius within which they receive notifications, but in the long run it is the app creators who will have to implement change, ideally by vetting reports more thoroughly to weed out inaccurate ones.
“You go on because you’re afraid and you want to feel more competent, but now you’re seeing crime you didn’t know about,” Ewoldsen says. “The long-term implication is heightened fear and less of a sense of competence … It’s a negative spiral.”