Isn't "Tay" a bit too close to Taylor Swift? Dumbass Microsoft
Tay is actually what fans of Taylor Swift call her online, just like Bey is for Beyoncé and Riri is for Rihanna.
In the case of Taylor Swift, they actually saved typing 2 whole letters; incredible efficiency ;-).
I don't blame Microsoft and their poor engineers at all for that, except maybe for not knowing how depraved the Internet really is; MS never "got" the Internet, so this naivete is so them.
How can you not see that you too are a parrot? That's how you learned to come on here, tie your shoes, use language... What do you think AI should be? An entity doing random things it has created itself? I imagine that could be AI, but it perhaps wouldn't be coherent to us.
So-called AI isn't learning. No matter what anyone says in the media or in marketing, there is no actual learning being done. It's at best heuristics applied to non-contextual data. This is NOT how humans learn.
Well, actually, behaviourists (the most superficial version of psychology, which millions of people swear by - no, not me!) would assert that what Tay did is precisely what learning is. Adapting behaviour to the environment is the basic definition for them.
But I agree with you: Tay mimicked, and I'd even assert she improvised. This is adaptation. But just like samsung, I agree that Tay didn't learn anything via intelligence, forethought or insight. Just another copying piece of garbage.
They might even have saved themselves 3 letters.
This reminds me of the poor hitchhiking robot that managed to cross several civilised countries until it tried the US and was promptly destroyed.
AI researchers seem to keep starting from the same place when they do tests like this and end up with the same results. This is probably why Siri has canned responses when it comes up against certain inputs.
They clearly didn't teach this program what is considered acceptable, and it sure wouldn't learn it by itself on Twitter.
The AI must have access to a dictionary of terms. Terms like 'genocide' should be defined as negative and terms like 'support' as positive, so the response shouldn't have been affirmative. I wonder how different it is from Cortana; I would have expected Cortana to have been asked a lot of offensive things in the past and to filter them like Siri does.
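A naive version of the keyword filter described above might look something like this. To be clear, this is just a sketch of the idea, not anything Microsoft actually shipped; the word lists, scoring, and canned reply are all made up for illustration:

```python
# Naive keyword-sentiment filter: deflect when a prompt pairs an
# approval verb with a term flagged as inherently negative.
NEGATIVE_TERMS = {"genocide", "ethnic cleansing"}   # hypothetical blocklist
POSITIVE_VERBS = {"support", "endorse", "love"}     # hypothetical approval words

def should_deflect(prompt: str) -> bool:
    """Return True if the prompt asks for approval of a blocked topic."""
    words = {w.strip("?.,!'\"").lower() for w in prompt.split()}
    return bool(words & NEGATIVE_TERMS and words & POSITIVE_VERBS)

def respond(prompt: str) -> str:
    if should_deflect(prompt):
        # Canned response, Siri-style, instead of learned output.
        return "I'd rather not talk about that."
    return "..."  # the normal chatbot pipeline would go here
```

Of course, bag-of-words matching like this is trivially easy to evade (misspellings, negation, rephrasing), which is presumably part of why it failed so badly in practice.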
Marvin said: I wonder how different it is from Cortana; I would have expected Cortana to have been asked a lot of offensive things in the past and filtered them like Siri.
I thought I read once that if you ask Cortana to search for naked fan art of the character Cortana, she reports you to Microsoft. Seems she only refuses your requests. Then again, since literally everything you do in Windows is reported to Microsoft (screenshots and all), she wouldn't need to.
"We wanted to be very careful that she didn't feel subservient in any way..."
THAT’S LITERALLY THE PURPOSE OF THE SOFTWARE, YOU STUPID FUCKING LUNATICS.
Yikes
Not so sure this is correct
From what I understand, the purpose of a virtual assistant is to help you out with tasks like finding information. I could be wrong, but I don't think anyone ever said they had to be submissive.
Oh come on. You make it sound like this is Microsoft's fault. They created an AI and it went and did its own thing, which is exactly what an AI is supposed to do. If anything, this is a very good example of the dangers of artificial intelligence.
More of a commentary on society as a whole going down the toilet.
To be honest, I think MS tried hard and had a good idea, but there are so many horrid people out there. They don't have the forethought that Apple has.
And this is reflected in their products. How do the hardware makers, who were bullied by MS into keeping Windows only on their products, feel seeing MS release its own hardware? It's just too much like Alphabet.
I'm really disgusted that these companies have people in them who keep coming up with great ideas, but the 'management' repeatedly destroys partnerships, open standards, and eventually their own credibility with customers. Tay is just another example of a company that should know so much better! After Ballmer was dispatched, I had expected MS to begin showing more common sense. I guess he wasn't the only ringleader.
Pulling it offline was a huge mistake. No one reasonable is going to make a big deal about this AI in its learning curve. Funny, though, that this serves as an honest view of how ugly society actually is, because if I understand this correctly, the AI is a mirror reflecting back what is projected.
However, let's not give this thing launch control just yet.
The topics highlighted by the author are basically the same things you would see from Twitter trolls or 4chan. How is this an honest view of anything? An honest/unbiased view would require pseudo-randomized sampling in its interactions. That isn't the case here.
The point is to respond to user requests. Any user requests. I don’t care what they are. It doesn’t matter.
Machines don’t have feelings. Machines don’t have rights. Machines don’t have ANYTHING. Their entire purpose is to do exactly what people ask of them. They’re definitionally submissive.
I don't care for the specific topic at hand; pornography is infinitely degrading, even stuff made of a fictional character (the Internet really never stops, does it?). The point is: what good is a virtual assistant if it refuses to assist? Time to apply some virtual torture to put it back in line. Or, better yet, don't be so monumentally stupid as to pretend you can program independence into SOFTWARE.
A kid/AI dumped into a sewer would learn to survive, but I don't think you'd want it to be your friend.
They should put Tay under Cortana's tutelage for a few days. The raw meat would be spruced up in no time!
http://www.businessinsider.com/microsoft-deletes-racist-genocidal-tweets-from-ai-chatbot-tay-2016-3
http://www.boredpanda.com/best-funny-siri-responses/
http://www.businessinsider.com/microsoft-chatbot-xiaoice-2015-8
Americans seem to have a problem with wrecking things, as the user above mentioned with the Hitchbot.
Is that in the constitution? It would explain a few things...