The Facebook Killer is dead, but the questions for Facebook itself are only growing.
Steve Stephens ended the three-day manhunt Tuesday when, under police pursuit, he killed himself near Erie, Pa. Still with us is the challenge of a social-media world that increasingly pushes users to post provocative, unfiltered video.
Last Sunday, Stephens posted a video on Facebook promising to murder someone. Two minutes later, he posted video of himself killing a 74-year-old grandfather, apparently chosen at random.
Eleven minutes after that, he broadcast live on Facebook for five minutes, confessing to the crime. Yet it was two hours before the videos were reported to Facebook, and another 23 minutes before the company took down Stephens’ accounts.
While Facebook execs, including CEO Mark Zuckerberg, bemoaned all this and promised to “review our reporting flows,” this is nothing new. In recent years, Facebook has broadcast murders, rapes, suicides and the racially motivated brutalizing of a mentally disabled teen, both live and on tape.
Though eventually taken down, the images were seen — and saved — by millions.
These are precisely the kind of “emotional and raw and visceral” images Zuckerberg pleaded for when he announced Facebook’s shift in priority to streaming video in a bid for more lucrative advertising.
Yet much as he’d like to pretend otherwise, Facebook is not a disinterested platform for any and all images: It’s a full-fledged media company that creates news, spreads it globally and pushes its users to engage without limits.
But with that comes responsibility — or else Facebook and other social media are little more than modern purveyors of snuff films.
And whatever Facebook claims to be doing (it says it’s been looking at the problem for over a year), it’s not working.
Zuckerberg & Co. need to get serious about defining Facebook’s standards and building mechanisms to enforce them. If they don’t, it’ll be an open invitation for Congress to step in.