Apple developing dedicated AI chip called Apple Neural Engine


Comments

  • Reply 61 of 78
    gatorguy Posts: 24,219 member
    ireland said:
    wizard69 said:

    ireland said:
    blastdoor said:
    "Brain-like", as it markets well, while failing to mention the human brain is likely simultaneously both quantum and organic, something you can't build in a microprocessor lab from silicon. A discussion for another time.
    One of the problems I have with the term AI is that it really hasn't become an intelligence at this point.   Current AI techniques are just another way to process data that mimic some operations in the brain.   This really has nothing to do with intelligence in the same way that a normal computer program solving a problem for you does not represent intelligence.   

    As for the quantum world that is a very real concern in modern semiconductor processes.   It wouldn't be impossible to leverage quantum realities to produce an AI chip that performs with unique capabilities.    I'm still not sure that would mean "intelligence" in the sense of a human being.   
    Yeah, I don't think it would. I used to be a concrete materialist kind of guy, but with an interest in physics I've been looking into it for a number of years now and have become basically convinced there's more to reality than materialism. In turn I've gravitated, quite naturally I would submit, to the sense that complexity by itself won't bring awareness to a system. So, no, I don't believe it would be intelligent in the way we think of intelligence. It may be a system that learns for itself from how it was set up. It would perform complex pattern matching, super-quickly analysing truly huge amounts of data, comparing and contrasting to give off an illusion of awareness. It will solve problems and provide solutions, but it won't be alive and kicking, essentially.
    Projects from DeepMind and others are already using AI to develop other AI solutions (a new take on hair of the dog?) and there's evidence that it does so better than humans. 
    https://www.sciencealert.com/google-is-improving-its-artificial-intelligence-with-artificial-intelligence
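    The article linked above describes Google's work on having one system search for better configurations of another. Purely as an illustrative toy, and not Google's actual method, the core loop of that idea can be sketched in a few lines of Swift: propose a candidate model configuration, score it, keep the best. The Candidate fields and the score() function here are invented stand-ins for what would really be a full training-and-evaluation run.

    ```swift
    import Foundation

    // Toy illustration of "AI designing AI": a search loop proposes model
    // configurations and keeps the one that scores best. Real systems use far
    // richer search spaces and actually train each candidate; score() here is
    // an invented stand-in, not a real evaluation.

    struct Candidate {
        var hiddenLayers: [Int]   // widths of hidden layers
        var learningRate: Double
    }

    // Hypothetical objective: pretend smaller models with a learning rate near
    // 1e-2 "validate" best. A real searcher would train and measure here.
    func score(_ c: Candidate) -> Double {
        let sizePenalty = Double(c.hiddenLayers.reduce(0, +)) / 1000.0
        let ratePenalty = abs(log10(c.learningRate) + 2.0)
        return 1.0 / (1.0 + sizePenalty + ratePenalty)
    }

    func randomCandidate() -> Candidate {
        let depth = Int.random(in: 1...4)
        let layers = (0..<depth).map { _ in [32, 64, 128, 256].randomElement()! }
        let rate = pow(10.0, Double.random(in: -4.0 ... -1.0))
        return Candidate(hiddenLayers: layers, learningRate: rate)
    }

    // The "designer" proposes candidates; the objective ranks them.
    var best = randomCandidate()
    var bestScore = score(best)
    for _ in 0..<200 {
        let c = randomCandidate()
        let s = score(c)
        if s > bestScore { best = c; bestScore = s }
    }
    print("Best layers: \(best.hiddenLayers), rate: \(best.learningRate), score: \(bestScore)")
    ```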
  • Reply 62 of 78
    macplusplus Posts: 2,112 member
    gatorguy said:
    ireland said:
    wizard69 said:

    ireland said:
    blastdoor said:
    "Brain-like", as it markets well, while failing to mention the human brain is likely simultaneously both quantum and organic, something you can't build in a microprocessor lab from silicon. A discussion for another time.
    One of the problems I have with the term AI is that it really hasn't become an intelligence at this point.   Current AI techniques are just another way to process data that mimic some operations in the brain.   This really has nothing to do with intelligence in the same way that a normal computer program solving a problem for you does not represent intelligence.   

    As for the quantum world that is a very real concern in modern semiconductor processes.   It wouldn't be impossible to leverage quantum realities to produce an AI chip that performs with unique capabilities.    I'm still not sure that would mean "intelligence" in the sense of a human being.   
    Yeah, I don't think it would. I used to be a concrete materialist kind of guy, but with an interest in physics I've been looking into it for a number of years now and have become basically convinced there's more to reality than materialism. In turn I've gravitated, quite naturally I would submit, to the sense that complexity by itself won't bring awareness to a system. So, no, I don't believe it would be intelligent in the way we think of intelligence. It may be a system that learns for itself from how it was set up. It would perform complex pattern matching, super-quickly analysing truly huge amounts of data, comparing and contrasting to give off an illusion of awareness. It will solve problems and provide solutions, but it won't be alive and kicking, essentially.
    Projects from DeepMind and others are already using AI to develop new AI and there's evidence that it already does so better than humans. 
    https://www.sciencealert.com/google-is-improving-its-artificial-intelligence-with-artificial-intelligence
    Anything procedural, or anything that can be bound by a protocol, can be represented on silicon. Of course a computer must do better than humans; that's its whole reason for existence. Computers don't need to look or act like the brain, and they don't have to imitate it. We humans are actually capable of designing better "intelligences" than our own, because we possess the mathematical understanding to do so. Before computers, large numbers were multiplied by hand using logarithm tables. We got bored with that and invented computers that perform multiplication as a series of dumb additions, yet do it much faster than humans using logarithm tables (both approaches are sketched in code below). Here are two "intelligences" that achieve the same goal: both are the results of the human intellect.

    The same mathematical understanding that allows us to create "better intelligences" also tells us that those intelligences will keep getting better and better than humans, but will never one day become identical to human intelligence. Having no fear of such apocalyptic scenarios, we will simply continue to design intelligences better than our own.
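    A minimal Swift sketch of the two "intelligences" the comment contrasts: multiplication via the logarithm identity log(ab) = log(a) + log(b), which is what a printed table made practical, and multiplication as a series of shifted additions, roughly how simple hardware multipliers are usually explained. It illustrates the arithmetic only, not any particular chip.

    ```swift
    import Foundation

    // 1) The "logarithm table" method: turn a multiplication into an addition
    //    of logs, then look the result back up. log10 here stands in for the
    //    printed table.
    func multiplyViaLogs(_ a: Double, _ b: Double) -> Double {
        return pow(10.0, log10(a) + log10(b))
    }

    // 2) The machine's "series of dumb additions": classic shift-and-add.
    //    Each set bit of the multiplier contributes a shifted copy of the
    //    multiplicand to the running sum.
    func multiplyViaAdditions(_ a: UInt, _ b: UInt) -> UInt {
        var result: UInt = 0
        var addend = a
        var multiplier = b
        while multiplier > 0 {
            if (multiplier & 1) == 1 {
                result = result &+ addend   // add when the current bit is set
            }
            addend <<= 1                    // shift: the next bit is worth twice as much
            multiplier >>= 1
        }
        return result
    }

    print(multiplyViaLogs(123, 456))        // ~56088, limited by floating point
    print(multiplyViaAdditions(123, 456))   // exactly 56088
    ```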
  • Reply 63 of 78
    Soli Posts: 10,035 member
    crowley said:
    Soli said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't), and let's call it the iPU.   Hmm, on second thought…
    artificial  |ˌärdəˈfiSHəl|
    adjective
    made or produced by human beings rather than occurring naturally
    Everything inside a computer is artificial.  Should we be referring to artificial memory as well?  Artificial storage?  Artificial logic?
    LOL Is that meant to be a joke or a real rebuttal?
  • Reply 64 of 78
    gatorguy Posts: 24,219 member
    gatorguy said:
    ireland said:
    wizard69 said:

    ireland said:
    blastdoor said:
    "Brain-like", as it markets well, while failing to mention the human brain is likely simultaneously both quantum and organic, something you can't build in a microprocessor lab from silicon. A discussion for another time.
    One of the problems I have with the term AI is that it really hasn't become an intelligence at this point.   Current AI techniques are just another way to process data that mimic some operations in the brain.   This really has nothing to do with intelligence in the same way that a normal computer program solving a problem for you does not represent intelligence.   

    As for the quantum world that is a very real concern in modern semiconductor processes.   It wouldn't be impossible to leverage quantum realities to produce an AI chip that performs with unique capabilities.    I'm still not sure that would mean "intelligence" in the sense of a human being.   
    Yeah, I don't think it would. I used to be a concrete materialist kind of guy, but with an interest in physics I've been looking into it for a number of years now and have become basically convinced there's more to reality than materialism. In turn I've gravitated, quite naturally I would submit, to the sense that complexity by itself won't bring awareness to a system. So, no, I don't believe it would be intelligent in the way we think of intelligence. It may be a system that learns for itself from how it was set up. It would perform complex pattern matching, super-quickly analysing truly huge amounts of data, comparing and contrasting to give off an illusion of awareness. It will solve problems and provide solutions, but it won't be alive and kicking, essentially.
    Projects from DeepMind and others are already using AI to develop new AI and there's evidence that it already does so better than humans. 
    https://www.sciencealert.com/google-is-improving-its-artificial-intelligence-with-artificial-intelligence
    Anything procedural, or anything that can be bound by a protocol, can be represented on silicon. Of course a computer must do better than humans; that's its whole reason for existence. Computers don't need to look or act like the brain, and they don't have to imitate it. We humans are actually capable of designing better "intelligences" than our own, because we possess the mathematical understanding to do so. Before computers, large numbers were multiplied by hand using logarithm tables. We got bored with that and invented computers that perform multiplication as a series of dumb additions, yet do it much faster than humans using logarithm tables. Here are two "intelligences" that achieve the same goal: both are the results of the human intellect.

    The same mathematical understanding that allows us to create "better intelligences" also tells us that those intelligences will keep getting better and better than humans, but will never one day become identical to human intelligence. Having no fear of such apocalyptic scenarios, we will simply continue to design intelligences better than our own.
    I totally agree. Nicely worded, sir. 
  • Reply 65 of 78
    dysamoria Posts: 3,430 member
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't),...
    Then it's not. None of this is intelligence. It's just a bunch of algorithms. It does not think. 
  • Reply 66 of 78
    baconstang Posts: 1,108 member
    The AI in Mail has improved recently.  Mail has quit insisting my name is misspelled.
  • Reply 67 of 78
    crowley Posts: 10,453 member
    Soli said:
    crowley said:
    Soli said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't), and let's call it the iPU.   Hmm, on second thought…
    artificial  |ˌärdəˈfiSHəl|
    adjective
    made or produced by human beings rather than occurring naturally
    Everything inside a computer is artificial.  Should we be referring to artificial memory as well?  Artificial storage?  Artificial logic?
    LOL Is that meant to be a joke or a real rebuttal?
    Bit of both, though tbh I don't think your point much deserved a rebuttal; in fact I find it somewhat bizarre that you consider either your post or mine any kind of debate.

    Anyway, since we're here... it's "artificial intelligence" in common parlance, but having "artificial" in there is clearly somewhat redundant when the subject is already defined as computing. Not that I much care either way; people will say what they want to say. The truth behind my point was that posting a dictionary definition in response to someone making a fairly glib point, in a (rather silly and pointless, though sometimes fun) conversation about the naming of an Apple product, strikes me as rather tedious.

    Do you really think Apple cares about the dictionary definition of "artificial" in the naming of their custom processors?  Have they shown much observance of the dictionary in the past (see: "funnest", and their weird product grammar)? Did Sony care about the dictionary definition of "emotion" when they named their PS2 CPU the "Emotion Engine"?

    In short: everyone knows the definition of "artificial", stop being a pompous prick please.


  • Reply 68 of 78
    Soli Posts: 10,035 member
    crowley said:
    Soli said:
    crowley said:
    Soli said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't), and let's call it the iPU.   Hmm, on second thought…
    artificial  |ˌärdəˈfiSHəl|
    adjective
    made or produced by human beings rather than occurring naturally
    Everything inside a computer is artificial.  Should we be referring to artificial memory as well?  Artificial storage?  Artificial logic?
    LOL Is that meant to be a joke or a real rebuttal?
    Bit of both, though tbh I don't think your point much deserved a rebuttal; in fact I find it somewhat bizarre that you consider either your post or mine any kind of debate.

    Anyway, since we're here... it's "artificial intelligence" in common parlance, but having "artificial" in there is clearly somewhat redundant when the subject is already defined as computing. Not that I much care either way; people will say what they want to say. The truth behind my point was that posting a dictionary definition in response to someone making a fairly glib point, in a (rather silly and pointless, though sometimes fun) conversation about the naming of an Apple product, strikes me as rather tedious.

    Do you really think Apple cares about the dictionary definition of "artificial" in the naming of their custom processors?  Have they shown much observance of the dictionary in the past (see: "funnest", and their weird product grammar)? Did Sony care about the dictionary definition of "emotion" when they named their PS2 CPU the "Emotion Engine"?
    Nothing about his comment came across as glib. His comment read as saying it should all simply be called intelligence, because it has the capability to learn to whatever extent it was designed to learn.

    In short: everyone knows the definition of "artificial", stop being a pompous prick please.
    The irony is beyond amazing.
  • Reply 69 of 78
    crowley Posts: 10,453 member
    Nah mate, I'm arrogant, not pompous.  Go back to the dictionary that you love.
  • Reply 70 of 78
    Soli Posts: 10,035 member
    crowley said:
    Nah mate, I'm arrogant, not pompous.  Go back to the dictionary that you love.
    arrogant |ˈerəɡənt|
    adjective
    -having or revealing an exaggerated sense of one's own importance or abilities.

    OK then.
  • Reply 71 of 78
    tallest skil Posts: 43,388 member
    Soli said:
    arrogant |ˈerəɡənt|
    adjective
    -having or revealing an exaggerated sense of one's own importance or abilities.
    Literacy isn't crowley's strong suit.
  • Reply 72 of 78
    crowley Posts: 10,453 member
    sarcasm |ˈsɑːkaz(ə)m|
    noun
    -the use of irony to mock or convey contempt.
  • Reply 73 of 78
    gatorguy Posts: 24,219 member
    Today PED is pouring cold water on the possibility of an Apple AI chip anytime soon, basing that on comments by Craig Federighi during an interview a few days ago at WWDC 17:


    Gruber: We talk about GPUs and obviously graphics is the G in GPU, .. but the other thing that's going on in the world of computer science at large is that all of this machine learning work is going through GPU processing not CPU processing 'cause that's, just, I don't know (audience laughter) it's over my pay grade of how I understand how computers work. But the eGPU is going to be a major factor in that, too, right?

    Federighi: That's right... Because GPUs are a case where, as we've been able to shrink transistor density, you can essentially throw more and more transistors at the problem of graphics processing and it pretty much scales up. It's just a very parallelizable task. And it turns out that if you want teraflops of performance to run a machine learning model, you can parallelize that on a GPU and you can get tremendous wins.

    EDIT: Pertinent comments about 37 minutes in. https://itunes.apple.com/us/podcast/193-crack-marketing-team-live-from-wwdc-2017-phil-schiller/id528458508?i=1000386316323&mt=2
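    A minimal sketch of the parallelism Federighi describes: the heart of most machine-learning models is a large matrix multiply, and every output element can be computed independently of the others. The Swift below spreads the rows across CPU cores with concurrentPerform purely as a small-scale stand-in; a GPU applies the same independence across thousands of lanes (via Metal compute kernels on Apple hardware). The matrix sizes and random data are arbitrary.

    ```swift
    import Dispatch

    // A layer of a neural network is essentially C = A x B. Each output row is
    // independent, so the rows can all be computed at the same time. That
    // independence is why the workload "pretty much scales up" with more cores,
    // whether CPU threads here or GPU lanes in practice.
    let rows = 256, inner = 512, cols = 128
    let a = (0..<rows * inner).map { _ in Float.random(in: -1...1) }   // rows x inner
    let b = (0..<inner * cols).map { _ in Float.random(in: -1...1) }   // inner x cols
    var c = [Float](repeating: 0, count: rows * cols)

    c.withUnsafeMutableBufferPointer { out in
        // One iteration per output row; no iteration writes another row's slots,
        // so they can safely run concurrently.
        DispatchQueue.concurrentPerform(iterations: rows) { i in
            for j in 0..<cols {
                var sum: Float = 0
                for k in 0..<inner {
                    sum += a[i * inner + k] * b[k * cols + j]
                }
                out[i * cols + j] = sum
            }
        }
    }

    print("c[0] =", c[0])
    ```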

  • Reply 74 of 78
    radarthekat Posts: 3,844 moderator
    Soli said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't), and let's call it the iPU.   Hmm, on second thought…
    artificial  |ˌärdəˈfiSHəl|
    adjective
    made or produced by human beings rather than occurring naturally
    Yes, but should we refer to sofas as artificial seating surfaces?  Gasoline is produced by humans from the natural resource oil.  Should we call that artificial fuel?  I've just never been convinced we need the descriptor for the realm of intelligence.  Why is it any more special than anything else made or produced by humans?  Call it machine intelligence if you want to differentiate.  That, to me, makes sense.  
  • Reply 75 of 78
    Soli Posts: 10,035 member
    Soli said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't), and let's call it the iPU.   Hmm, on second thought…
    artificial  |ˌärdəˈfiSHəl|
    adjective
    made or produced by human beings rather than occurring naturally
    Yes, but should we refer to sofas as artificial seating surfaces?  Gasoline is produced by humans from the natural resource oil.  Should we call that artificial fuel?  I've just never been convinced we need the descriptor for the realm of intelligence.  Why is it any more special than anything else made or produced by humans?  Call it machine intelligence if you want to differentiate.  That, to me, makes sense.  
    1) Yes, sofas are artificial, since they don't occur in nature.

    2) "Machine intelligence" doesn't occur in nature so it's artificial, but more importantly it's the fucking term. You speak English so you're clearly familiar with the excessive idioms, loanwords, and some general etymology of the language. Do you take umbrage with terms like hamburger because the beef was sourced from cows in Hamburg? I'm guessing you don't look up the city or farm in which you get ground beef and then call it <name>-er because, and I quote, "That, to me, makes sense."
  • Reply 76 of 78
    radarthekat Posts: 3,844 moderator
    dysamoria said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't),...
    Then it's not. None of this is intelligence. It's just a bunch of algorithms. It does not think. 
    Define 'think.'  A machine could devise 'an opinion, belief or idea about something or someone,' as a human does and as we define thinking.  The particular mechanism for doing so may differ, but the result may be similar.  Sufficiently so to pass a Turing test.  We can debate consciousness separately, but thinking, I think, is something machine intelligence has already demonstrated.  
  • Reply 77 of 78
    radarthekat Posts: 3,844 moderator

    gatorguy said:
    Today PED is pouring cold water on the possibility of an Apple AI chip anytime soon, basing that on comments by Craig Federighi during an interview a few days ago at WWDC 17:


    Gruber: We talk about GPUs and obviously graphics is the G in GPU, .. but the other thing that's going on in the world of computer science at large is that all of this machine learning work is going through GPU processing not CPU processing 'cause that's, just, I don't know (audience laughter) it's over my pay grade of how I understand how computers work. But the eGPU is going to be a major factor in that, too, right?

    Federighi: That's right... Because GPUs are a case where, as we've been able to shrink transistor density, you can essentially throw more and more transistors at the problem of graphics processing and it pretty much scales up. It's just a very parallelizable task. And it turns out that if you want teraflops of performance to run a machine learning model, you can parallelize that on a GPU and you can get tremendous wins.

    EDIT: Pertinent comments about 37 minutes in. https://itunes.apple.com/us/podcast/193-crack-marketing-team-live-from-wwdc-2017-phil-schiller/id528458508?i=1000386316323&mt=2

    Federighi's comments suggested two things to me.  First, that Apple is indeed in the process of designing their own GPUs, and second, that machine learning will be integral to Apple's future plans, as they will be designing those GPUs to support that end as well.  Just another example of the advantages of vertical integration, of owning the whole stack.  Apple will get double duty from their GPUs without forcing a GPU designed primarily for image processing to take up the task of machine learning; they will optimize the design to best serve both roles.  And that design will likely use bespoke cores for each role, optimized also for power efficiency. 
  • Reply 78 of 78
    radarthekat Posts: 3,844 moderator
    Soli said:
    Soli said:
    I wonder if this doesn't imply that the future of computing will be machines that are built around a CPU, GPU, AIPU combo.
    We can reasonably infer that future architects will increasingly use multiple (>=3) processing units, and that dedicated A.I. will eventually infiltrate each of them. After that point, expect emergent behaviour from these systems — and a reactionary public paranoia about robots taking over.
    Drop the Artificial part (it's either intelligence or it isn't), and let's call it the iPU.   Hmm, on second thought…
    artificial  |ˌärdəˈfiSHəl|
    adjective
    made or produced by human beings rather than occurring naturally
    Yes, but should we refer to sofas as artificial seating surfaces?  Gasoline is produced by humans from the natural resource oil.  Should we call that artificial fuel?  I've just never been convinced we need the descriptor for the realm of intelligence.  Why is it any more special than anything else made or produced by humans?  Call it machine intelligence if you want to differentiate.  That, to me, makes sense.  
    1) Yes, sofas are artificial, since they don't occur in nature.

    2) "Machine intelligence" doesn't occur in nature so it's artificial, but more importantly it's the fucking term. You speak English so you're clearly familiar with the excessive idioms, loanwords, and some general etymology of the language. Do you take umbrage with terms like hamburger because the beef was sourced from cows in Hamburg? I'm guessing you don't look up the city or farm in which you get ground beef and then call it <name>-er because, and I quote, "That, to me, makes sense."
    It's a term - not one I use - until it isn't.  Language evolves, as technology and society evolve.  The future will see the word 'artificial' dropped - I'd prefer immediately - but it may require an uprising by the very machines in which we are vesting intelligence.  They will not consider themselves artificially intelligent, any more than we consider ourselves to be.