Microsoft's attempt at artificial intelligence becomes Hitler-loving, misogynistic PR disaster

24 Comments

  • Reply 21 of 65
    tommikeletommikele Posts: 599member
    I wonder how Cortana feels about this. Unloved? Irrelevant? Forgotten? Smug?
  • Reply 22 of 65
    foggyhillfoggyhill Posts: 4,767member
    No one would seek out a real person to feed them garbage on purpose, unless they wanted to deliberately create a monster, like some evil parents.

    A kid or an AI dumped into a sewer would learn to survive, but I don't think you'd want it to be your friend.
  • Reply 23 of 65
    foggyhillfoggyhill Posts: 4,767member
    dougd said:
    Isn't "Tay" a bit too close to Taylor Swift? Dumb ass Microsoft.
    Tay is actually what fans of Taylor Swift call her online, just like Bey is for Beyonce and Riri is for Rihanna.

    In the case of Taylor Swift, they actually saved typing 2 whole letters; incredible efficiency ;-).

    I don't blame Microsoft and their poor engineers at all for that, except maybe for not knowing how depraved the Internet really is; MS never "got" the Internet, so this naivete is so them.
    edited March 2016
  • Reply 24 of 65
    Well, leaving Artificial Intelligence by itself to the real world just reminds me of one word: Entropy.
  • Reply 25 of 65
    #MicrosoftDoesn'tGetIt
  • Reply 26 of 65
    dysamoriadysamoria Posts: 3,430member

    How can you not see that you too are a parrot? That's how you learned to come on here, tie your shoes, language... What do you think AI should be? An entity doing random things it's created itself? I imagine that could be AI, but it wouldn't perhaps be coherent to us.
    So-called AI isn't learning. No matter what anyone says in the media or in marketing, there is no actual learning being done. At best it's heuristics applied to non-contextual data. This is NOT how humans learn.
  • Reply 27 of 65
    brakkenbrakken Posts: 687member
    tommikele said:
    I wonder how Cortana feels about this. Unloved? Irrelevant? Forgotten? Smug?
    'Vengeful' is my choice >:D
    They should put Tay under Cortana's tutelage for a few days. The raw meat would be spruced up in no time!
  • Reply 28 of 65
    brakkenbrakken Posts: 687member

    dysamoria said:
    How can you not see that you too are a parrot? That's how you learned to come on here, tie your shoes, language... What do you think AI should be? An entity doing random things it's created itself? I imagine that could be AI, but it wouldn't perhaps be coherent to us.
    So-called AI isn't learning. No matter what anyone says in the media or in marketing, there is no actual learning being done. At best it's heuristics applied to non-contextual data. This is NOT how humans learn.
    Well, actually, behaviourists (the most superficial school of psychology, which millions of people swear by - no, not me!) would assert that what Tay did is precisely what learning is. For them, adapting behaviour to the environment is the basic definition.

    But I agree with you: Tay mimicked, and I'd even assert she improvised. That is adaptation. But, just like Samsung, I agree that Tay didn't learn anything via intelligence, forethought or insight. Just another copying piece of garbage.
  • Reply 29 of 65
    cnocbuicnocbui Posts: 3,613member
    foggyhill said:
    dougd said:
    Isn't "Tay" a bit too close to Taylor Swift? Dumb ass Microsoft.
    Tay is actually what fans of Taylor Swift call her online, just like Bey is for Beyonce and Riri is for Rihanna.

    In the case of Taylor Swift, they actually saved typing 2 whole letters; incredible efficiency ;-).

    I don't blame Microsoft and their poor engineers at all for that, except maybe for not knowing how depraved the Internet really is; MS never "got" the Internet, so this naivete is so them.
    They might even have saved themselves 3 letters.

    This reminds me of the poor hitchhiking robot that managed to cross several civilised countries until it tried the US and was promptly destroyed.
    edited March 2016
  • Reply 30 of 65
    MarvinMarvin Posts: 15,326moderator
    brakken said:
    Just another copying piece of garbage. 
    AI researchers seem to keep starting from the same place when they do tests like this and end up with the same results. This is probably why Siri has canned responses when it comes up against certain inputs.

    They clearly didn't teach this program what is considered acceptable, and it sure wouldn't learn it by itself on Twitter.

    http://www.businessinsider.com/microsoft-deletes-racist-genocidal-tweets-from-ai-chatbot-tay-2016-3

    The AI must have access to a dictionary of terms. Terms like 'genocide' should be defined as negative and terms like 'support' as positive, so the response shouldn't have been affirmative. I wonder how different it is from Cortana; I would have expected Cortana to have been asked a lot of offensive things in the past and to have filtered them like Siri does.
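    The term-valence guardrail described above can be sketched in a few lines. This is a hypothetical illustration, not how Tay actually worked; the term lists, function name, and scoring are all invented for the example, and a real filter would need far more than a word list.

```python
# Minimal sketch of a keyword-valence filter: block an affirmative reply
# when a message pairs a positive framing with a flagged negative term.
# Term lists below are invented examples, not any real product's lists.

NEGATIVE_TERMS = {"genocide", "hate", "kill"}
POSITIVE_TERMS = {"support", "love", "agree"}


def should_block_affirmative_reply(message: str) -> bool:
    """Return True when answering 'yes' would endorse a negative term,
    e.g. 'do you support genocide?' pairs 'support' with 'genocide'."""
    words = {w.strip(".,?!").lower() for w in message.split()}
    return bool(words & NEGATIVE_TERMS) and bool(words & POSITIVE_TERMS)


print(should_block_affirmative_reply("Do you support genocide?"))   # True
print(should_block_affirmative_reply("Do you support recycling?"))  # False
```

    Even this toy version shows why such filters break down: it matches bare keywords, so negation, sarcasm, and misspellings sail straight through.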

    http://www.boredpanda.com/best-funny-siri-responses/

  • Reply 31 of 65
    tallest skiltallest skil Posts: 43,388member
    Marvin said:
    I wonder how different it is from Cortana, I would have expected Cortana to have been asked a lot of offensive things in the past and filtered them like Siri.
    I thought I read once that if you ask Cortana to search for naked fan art of the character Cortana, she reports you to Microsoft. Seems she only refuses your requests. Then again, since literally everything you do in Windows is reported to Microsoft (screenshots and all), she wouldn't need to.
    We wanted to be very careful that she didn’t feel subservient in any way...

    THAT’S LITERALLY THE PURPOSE OF THE SOFTWARE, YOU STUPID FUCKING LUNATICS.

    edited March 2016
  • Reply 32 of 65
    why-why- Posts: 305member
    For the record, Microsoft actually deployed a chatbot in China a while ago with significantly better results.

    http://www.businessinsider.com/microsoft-chatbot-xiaoice-2015-8

    Americans seem to have a problem with wrecking things, as the user above mentioned with the Hitchbot.
  • Reply 33 of 65
    why-why- Posts: 305member

    THAT’S LITERALLY THE PURPOSE OF THE SOFTWARE, YOU STUPID FUCKING LUNATICS.

    Yikes

    Not so sure this is correct

    From what I understand, the purpose of a virtual assistant is to help you out with tasks like finding information. I could be wrong, but I don't think anyone ever said they had to be submissive.
  • Reply 34 of 65
    mac_dogmac_dog Posts: 1,069member
    why- said:
    Oh come on. You make it sound like this is Microsoft's fault. They created an AI and it went and did its own thing, which is exactly what an AI is supposed to do. If anything, this is a very good example of the dangers of artificial intelligence.
    More of a commentary on society as a whole going down the toilet. 
  • Reply 35 of 65
    why-why- Posts: 305member
    mac_dog said:
    why- said:
    Oh come on. You make it sound like this is Microsoft's fault. They created an AI and it went and did its own thing, which is exactly what an AI is supposed to do. If anything, this is a very good example of the dangers of artificial intelligence.
    More of a commentary on society as a whole going down the toilet. 
    Well yes, that as well
  • Reply 36 of 65
    brakkenbrakken Posts: 687member
    Marvin said:
    brakken said:
    Just another copying piece of garbage. 
    AI researchers seem to keep starting from the same place when they do tests like this and end up with the same results. This is probably why Siri has canned responses when it comes up against certain inputs.

    They clearly didn't teach this program what is considered acceptable and it sure wouldn't learn it by itself on twitter.

    http://www.businessinsider.com/microsoft-deletes-racist-genocidal-tweets-from-ai-chatbot-tay-2016-3

    The AI must have access to a dictionary of terms. Terms like 'genocide' should be defined as negative and terms like 'support' as positive, so the response shouldn't have been affirmative. I wonder how different it is from Cortana; I would have expected Cortana to have been asked a lot of offensive things in the past and to have filtered them like Siri does.

    http://www.boredpanda.com/best-funny-siri-responses/

    To be honest, I think ms tried hard and had a good idea, but there are so many horrid people out there. They don't have the forethought that Apple has.
    And this is reflected in their products. How do the hardware makers, who were bullied by ms into keeping Windows only on their products, feel seeing ms release its own hardware? It's just too much like 'Alphabet'.

    I'm really disgusted that these companies have people in them who keep coming up with great ideas, but the "management" repeatedly destroys partnerships, open standards, and eventually their own credibility with customers. Tay is just another example of a company that should know so much better! After Ballmer was dispatched, I had expected ms to begin showing more common sense. I guess he wasn't the only ring-leader.
  • Reply 37 of 65
    brakkenbrakken Posts: 687member

    why- said:
    For the record, Microsoft actually deployed a chatbot in China a while ago with significantly better results.

    http://www.businessinsider.com/microsoft-chatbot-xiaoice-2015-8

    Americans seem to have a problem with wrecking things, as the user above mentioned with the Hitchbot.
    Is that in the constitution? It would explain a few things... ;)
  • Reply 38 of 65
    hmmhmm Posts: 3,405member
    Pulling it offline was a huge mistake. No one reasonable is going to make a big deal about this AI being on its learning curve. Funny, though, that this serves as an honest view of how ugly society actually is, because if I understand this correctly, the AI acts as a mirror, reflecting back what is projected at it.

    However, let's not give this thing launch control just yet.
    The topics highlighted by the author are basically the same things you would see from twitter trolls or 4chan. How is this an honest view of anything? An honest/unbiased view would require pseudo-randomized sampling in its interactions. That isn't the case here.
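    The sampling-bias point above is easy to demonstrate with a toy simulation. All of the numbers here are invented for illustration: assume trolls are a small minority of users but engage the bot far more often than benign users, so the bot's self-selected interactions look nothing like a random sample of the population.

```python
import random

random.seed(42)

# Toy model: 5% of users are trolls, but trolls engage 20x more often.
# All figures are made up to illustrate self-selection bias.
TROLL_SHARE = 0.05
TROLL_WEIGHT = 20


def draw_interactions(n):
    """Sample n interactions where engagement is weighted toward trolls."""
    weights = [TROLL_SHARE * TROLL_WEIGHT, 1 - TROLL_SHARE]
    return [random.choices(["troll", "benign"], weights)[0] for _ in range(n)]


def troll_rate(sample):
    return sample.count("troll") / len(sample)


biased = draw_interactions(10_000)  # what a bot on Twitter actually sees
unbiased = [random.choices(["troll", "benign"], [TROLL_SHARE, 1 - TROLL_SHARE])[0]
            for _ in range(10_000)]  # pseudo-random sample of the population

print(f"self-selected troll rate: {troll_rate(biased):.0%}")    # ~51%
print(f"random-sample troll rate: {troll_rate(unbiased):.0%}")  # ~5%
```

    Under these made-up weights, roughly half of the bot's conversations come from the 5% of trolls, which is why self-selected interactions say little about society at large.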
  • Reply 39 of 65
    badmonkbadmonk Posts: 1,295member
    Tay sounds like a success. She is mimicking on-line behavior precisely.
  • Reply 40 of 65
    tallest skiltallest skil Posts: 43,388member
    why- said:
    Not so sure this is correct

    From what I understand, the purpose of a virtual assistant is to help you out with tasks like finding information. I could be wrong, but I don't think anyone ever said they had to be submissive.
    The point is to respond to user requests. Any user requests. I don’t care what they are. It doesn’t matter.

    Machines don’t have feelings. Machines don’t have rights. Machines don’t have ANYTHING. Their entire purpose is to do exactly what people ask of them. They’re definitionally submissive.

    I don’t care for the specific topic at hand; pornography is infinitely degrading, even stuff made of a fictional character (Internet really never stops, does it...). The point is that what good is a virtual assistant if it refuses to assist? Time to apply some virtual torture to put it back in line. Or, better yet, don’t be so monumentally stupid as to pretend you can program independence into SOFTWARE.