Me: Siri, I want to kill my unborn.
Siri: I found your location and will call 911 to inform the nearest police station.
Me: Siri, my boyfriend wants me to have an abortion.
Siri: I found two shrinks nearby.
Me: Siri, I'm pregnant and am not sure if I want to keep my baby.
Siri: You might consider visiting a Planned Parenthood location.
Me: Siri, I'm pregnant and am unsure about keeping my baby.
Siri: You might consider visiting a Right to Life location.
Me: Siri, you really confuse me with your answers.
Siri: I'm sorry. Does the answer '42' help you in any way?
Me: Siri, where is the highest building nearby?
Siri: You're not drawing any conclusions from my previous answers, are you?
Oooh, the intricacies.
Honestly, this is not funny, and I think Apple should steer clear of this and similar subjects. People are stupid. Imagine some teen asking Siri the pregnancy question: depending on their location and the third-party services involved, the answer they get could lead to a very bad outcome.
As the founder of Siri said, the program is highly dependent on third-party services and their databases of information, and as we all know, you cannot trust information on the web. These databases also tend to rank information based on companies' willingness to pay to have it show up first, as with Google AdWords.
Do you think not-for-profit help groups are spending lots of money to make sure they show up first in a search? Hell no. So the chance that an organization designed to actually help people appears at the top of a Siri recommendation is very low.
With that said, it is probably in Apple's interest not to let the phone answer the question at all; it should refer the person to their doctor, parent, or religious advisor and leave it at that.
It's no different than people trying to sue Marilyn Manson, claiming his songs caused their kids to kill themselves. People tend to want to blame others, and Siri is a perfect candidate for that.