Apple boots crime reporting app Vigilante from App Store
Less than a week after its launch, Apple has pulled the crime-reporting app Vigilante from the App Store, apparently over concerns about potential risks and abuse.

The app, developed by Sp0n, was initially released for New York City residents, and is meant to notify them about crimes reported via 911, The Guardian said. Users near an incident are alerted, and can even record live video.
Sp0n explained the ban only by saying that Apple was concerned about content.
"The team is working with Apple to resolve the issue and they are confident the app will be made available in the near future," a spokeswoman added. "Vigilante will introduce an Android version of the app in the upcoming weeks with plans to expand in additional cities later this year."
Apple does have rules against apps that could put users in danger, though, and indeed a promotional video for the app shows people rushing to a scene where a woman is being assaulted. In reality, the attacker could have turned on his witnesses and hurt or killed them.
Apple might also have been concerned about people using the app to harass others. While its crime reports stem from 911, people could theoretically start shooting video of innocent people, or even trying to detain or attack them. Racial profiling has been an issue in some other iOS apps, like Nextdoor, whose developer ultimately had to take steps to prevent reporting suspicious activity simply based on skin color.
Comments
But seriously, it's amazing the hoops people must jump through in order to protect themselves, versus having sensible concealed carry policies in these large coastal cities. Police can only mop up after the crime has already been committed.
First, I think the primary purpose of the app was to keep people out of harm's way by telling them where they should not be.
Okay, being a little more cynical here: does anyone really think people will rush to help someone? I'm sorry, but most people choose not to get involved these days. And the fact that the app lets you record video makes it sound more like rubbernecking at a train wreck, the same impulse behind all the people who now stand around filming the police, or filming whatever bad behavior is unfolding, instead of doing anything.
They just need to integrate this app into Waze to make sure I do not drive into an area where I probably should not be.
Does the app have a feature that also shows where anyone wearing a hoodie or long, black trenchcoat is located?
This is a fascinating concept, which at its best could be a life saver. The news occasionally has stories about someone risking their safety by intervening in a crime and saving someone from injury or death. In best-case scenarios, that's how it should work, given that there will always be perpetrators and victims.
But this concept is also rife with pitfalls. First is the problem of untrained responders getting hurt. While rushing to an incident, they may injure themselves or another innocent party. The video showed a bicyclist racing to help, and not someone in a car, for good reason.
And once they arrive on scene, then what? Does the suspect have a knife or a gun? Are they violent? Does the sudden arrival scare a would-be robber or mugger into becoming a killer?
Second, there's the possibility that the first good samaritan (that was funny, btw!) will be mistaken for the perpetrator by the second or third good samaritan, or by police. That would be confusing and possibly dangerous.
Third, there's the very real tendency (not just possibility) for vigilantism to incite mob mentality. Against an armed suspect, someone will get hurt, and not just the suspect. And 'mobism' tends to forget that the punishment must fit the crime. Some people love 'street justice' and aren't all that critical about dispensing it, or about the consequences.
Fourth, the potential for abuse is, to my thinking, too great. This would be even easier than 'SWATting' someone, though I don't know how common that practice is.
This can easily become the polar opposite of infamous cases where people stood by and did nothing while a helpless victim was injured or killed. Every situation is different and crowd-sourcing untrained help is not a reliable or viable alternative.
With cellphones everywhere, shining a light (figuratively and literally) and bearing witness, as in testifying, are what the concerned citizen should be doing. Make the call to EMS; don't leave it for someone else.
http://www.abajournal.com/magazine/article/crime_safety_website_racial_profiling
According to Porter, Nextdoor users have singled out minorities for engaging in “suspicious” activities such as walking down the sidewalk, driving down a street, making a U-turn and driving too slowly.
http://www.dailydot.com/via/white-people-apps-racial-profiling-groupme/
What happens when apps created to help ensure personal safety perpetuate bigotry instead?
http://valleywag.gawker.com/smiling-young-white-people-make-app-for-avoiding-black-1617775138
Is there any way to keep white people from using computers, before this whole planet is ruined? I ask because the two enterprising white entrepreneurs above just made yet another app for avoiding non-white areas of your town—and it's really taking off!
http://fox40.com/2016/05/12/racial-profiling-claims-lead-to-changes-for-nextdoor-app/
To combat vague posts that refer to a person’s race but give no specific identifying details about their appearance or suspicious activity, Nextdoor administrators have put in place new posting guidelines.
It would also provide a very interesting database for deciding where to live and where government policing resources should be allocated (something the cops or the government wouldn't want you to know). And yes, it would probably surface some troublesome cultural issues for certain demographics that many people would prefer to downplay. The question is whether it should be used for such purposes, because it will be.