tenly
About
- Username: tenly
- Joined
- Visits: 19
- Last Active
- Roles: member
- Points: 258
- Badges: 1
- Posts: 710

Reactions
Apple correcting Siri "abortion" search issue uncovered in 2011
tallest skil said:
tenly said: Don't be an idiot. One is legal and the other is not.
So the murder part of it wasn't the problem for you, just the legality?

Yes, exactly. The murder part wasn't a problem because murder is not involved. The poster incorrectly used the word in the first half of their example.
"Murder" - "the unlawful, premeditated killing of one human being by another"
There are plenty of circumstances where terminating a pregnancy is the right thing to do - and there are also plenty of times when it is the wrong thing to do.
It's a complicated issue. But for now - it's legal. What is disgusting is the way those of you who dislike the law treat people who support the law or choose to terminate a pregnancy. It's nothing but bullying and many of you are participating in what should be classified and prosecuted as hate crimes. Try respecting the life that is already on the planet even just half as much as the unborn life you are trying to protect.
If you disagree with the law, lobby the government or elect a government that will change the law - but stop targeting your fellow law-abiding citizens for having an opinion that is different from yours.
Apple correcting Siri "abortion" search issue uncovered in 2011
tallest skil said:
tenly said: Don't be an idiot. One is legal and the other is not.
The job of a maps product is to help guide you to the place you want to go. Omitting or censoring some destinations because the author doesn't believe in what goes on there is ludicrous and just ends up alienating customers while creating an inferior and incomplete product.
Now I guess I should take a small step back, since this article is not about Maps - it's about Siri. Siri has been marketed as a personal assistant and is clearly meant to have a personality of her own. So, as long as the addresses and directions are available in the actual Maps application, perhaps it's not that far-fetched to have Siri exert her own personality with a response such as: "I really don't believe in abortion. Have you considered other options such as adoption? I can't in good conscience help you with an abortion. You'll have to search the Maps application on your own if you insist on pursuing that option."

But can you imagine how many people a response like that would alienate? And if Siri is supposed to emulate a real person - a real assistant - consider what would happen if a real-life personal assistant you employed took that sort of stance with you. You'd likely fire her ass for not doing her job. So why isn't it reasonable to expect the digital her to do her job competently and accurately, without any back talk? Or why can't I fire her and hire an assistant who has views and beliefs similar to my own?
This could end up setting a dangerous precedent where Siri refuses to help you with anything that she (Apple) doesn't agree with. Suppose you wanted information about a Trump rally - Siri could send you to the wrong address or claim she knows nothing about it - or just redirect you to information about the candidate that she is backing! And if Siri decided that she is against alcohol - good luck finding that bar your friends are meeting at. Microsoft store? What Microsoft store? I know nothing about any Microsoft stores! You want to donate money to the "American Kidney Fund"? Sorry. Can't help you. Perhaps you'd like to donate to "Action Against Hunger" instead. Siri eventually ends up being nothing more than a shill for Apple and Apple's interests.
To alienate and offend the fewest people possible - Siri needs to practice complete impartiality (within the law). Her mandate is to "assist" you and she should do so competently and without attitude or unsolicited advice.
The abortion issue is an issue for government and the courts to decide - not Apple and Siri.
And the poster obviously chose the term "murder" to maximize dramatic effect at the expense of accuracy. Abortion clinics - in jurisdictions where abortion is legal - do not "murder" anyone. The word "murder" is defined as an "unlawful killing". So no matter how strongly you feel as an individual, "murder" is the wrong word to use. "Killing" is the word that the poster should have used - but then, it's not quite as dramatic as "murder".
Remember that at Siri's inception (until recently) she could be relied on to assist us all in finding good places to stash a corpse. It seems she's been updated recently, though - now when I ask her where to hide a body, she just replies, "I used to know the answer to this..."
Apple culture hinders recruitment and talent retention efforts, report says
tallest skil said:
gargravarr said: "compounded by a failure to introduce ground-breaking products"

Imagine if motor racing teams were judged as harshly for their failure to win races.
Apple correcting Siri "abortion" search issue uncovered in 2011
sirozha said: If a woman should be entitled to get a list of service providers that would murder her child for money, should Siri return appropriate information for a request from a man for the service providers that would murder his wife or girlfriend for money?
Apple correcting Siri "abortion" search issue uncovered in 2011
1983 said: This article was bound to trigger a bit of a shit storm in the comments section and, lo and behold, it has. Still a nasty but also sad subject to tackle.
People don't use "maps" to research their options for making important life choices - they use search engines like Google and Bing. Only after they've done their research and made their decision will they turn to maps to help them get to the place at which they already have an appointment.