Programming Cell - with actor languages?

Posted in Mac Software, edited January 2014
I finally sat down and read <a href="http://news.com.com/2102-1001-948493.html" target="_blank">this article</a> on the Sony/IBM/Toshiba joint venture called Cell.



It's always hard to sort out fact from IBM hype, but they claim up to 16 cores on a single processor.



The article notes that programming such a beast will be difficult. As nVIDIA's chief scientist notes in <a href="http://www.ve3d.com/hw/interviews/davidkirk/" target="_blank">this interesting interview</a> at Voodoo Extreme (link courtesy of <a href="http://www.imgmagazine.com/news/" target="_blank">Inside Mac Games</a>), CPUs are traditionally biased toward linear rather than parallel execution.



So what's this doing in Software?



Good question.



It was not so long ago that some people thought that computers would go massively parallel. It looks like they were right, just 20 years ahead of themselves. At any rate, they designed actor languages, which are based on the idea that you have lots of independent bits (called actors) running simultaneously that send messages to each other, and process received messages according to a given script. If you have a lot of processors, then the actors are distributed over them, and you get massively parallel execution, stated more transparently and more elegantly than conventional flow-of-control languages - and even dynamic OO languages - have managed so far.
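To make that concrete, here is a minimal sketch of the actor idea in Python (all names hypothetical - real actor languages build this directly into the language rather than layering it on threads): each actor owns a mailbox, runs independently, and applies its script to every message it receives.

```python
import queue
import threading

class Actor:
    """A minimal actor: a private mailbox plus a script applied to each message."""
    def __init__(self, script):
        self.mailbox = queue.Queue()
        self.script = script                   # called as script(self, message)
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, message):
        # Asynchronous send: the caller never waits on the receiver.
        self.mailbox.put(message)

    def _run(self):
        while True:
            self.script(self, self.mailbox.get())

# Two actors play ping-pong, decrementing a counter until it hits zero.
done = threading.Event()

def ping(self, msg):
    count, partner = msg
    if count == 0:
        done.set()                             # conversation over
    else:
        partner.send((count - 1, self))

a = Actor(ping)
b = Actor(ping)
a.send((4, b))                                 # a -> b -> a -> b -> a
done.wait(timeout=5)
```

Note that neither actor ever calls the other directly: all interaction is by message, which is exactly what lets a runtime scatter actors across cores.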



I remember there are some people here who have worked with them. Dust off your chops, guys. IBM is envisioning distributed processing over multicore chips. This implies at least two (and, in practice, more than two) levels of communication: local, and remote. As far as I understand them (which, I freely admit, is not far at all) actor languages are set up to assume that all relevant actors are local - that is, that the machine is massively parallel, not distributed or clustered over relatively low-bandwidth pipes. Would modifications be necessary to, say, differentiate between IPC and remote IPC, for practical reasons? Or could that be done under the sheets? Am I anywhere near to being on the right track here?
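One way the local/remote split could be hidden "under the sheets" is behind a uniform actor reference: the sender calls the same send() either way, and only the reference knows whether that means a raw in-memory hand-off or marshalling the message onto a wire. A toy Python sketch of the idea (all names hypothetical; the "wire" here is just a queue of JSON strings standing in for a real transport):

```python
import json
import queue

class LocalRef:
    """Reference to an actor in the same address space: raw in-memory hand-off."""
    def __init__(self, mailbox):
        self.mailbox = mailbox
    def send(self, message):
        self.mailbox.put(message)

class RemoteRef:
    """Reference to an actor on another chip or node: the message is serialized
    and shipped over a transport, so the marshalling cost is paid here,
    invisibly to the sender."""
    def __init__(self, transport):
        self.transport = transport
    def send(self, message):
        self.transport.put(json.dumps(message))

# The sending code is identical either way -- that's the transparency.
local_box = queue.Queue()
wire = queue.Queue()
for ref in (LocalRef(local_box), RemoteRef(wire)):
    ref.send({"op": "add", "args": [1, 2]})
```

The practical question remains, though: even if the API is uniform, a scheduler that ignores the cost difference between the two paths will make bad placement decisions.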



Comments welcome!

Comments

  • Reply 1 of 9
    Are you talking about multiple processing units on a single chip? Would remote communication be between the units, while local communication takes place between processing pipes on a single processing unit? And finally, would the actor language replace the current concept of micro-code?
  • Reply 2 of 9
    programmer · Posts: 3,467 · member
    The Cell architecture involves both multi-core chips and bunches of them stuck together in some kind of communications fabric.



    I don't (yet) know anything about the actor languages that Amorph is referring to, but I also see a strong need for data flow languages, which describe computations (and their data) more directly than embedded logic in a conventional language does. The traditional (and actor?) languages will probably be reduced to control mechanisms for generating, scheduling, and pushing around these other "heavy" computations.
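    As a rough illustration of the data flow idea (a toy Python sketch, not any particular language): each node fires as soon as all of its inputs have arrived, so execution order falls out of the data dependencies rather than out of a written-down control flow.

```python
class Node:
    """A data flow node fires its operation once every input port is filled."""
    def __init__(self, op, n_inputs):
        self.op = op
        self.inputs = [None] * n_inputs
        self.filled = [False] * n_inputs
        self.consumers = []                    # (downstream node, input port) pairs

    def feed(self, port, value):
        self.inputs[port] = value
        self.filled[port] = True
        if all(self.filled):                   # data availability triggers firing
            result = self.op(*self.inputs)
            for node, p in self.consumers:
                node.feed(p, result)

# Graph for (a + b) * (a - b); no statement ordering is implied by the wiring.
out = []
sink = Node(lambda v: out.append(v), 1)
mul = Node(lambda x, y: x * y, 2)
mul.consumers.append((sink, 0))
add = Node(lambda x, y: x + y, 2)
sub = Node(lambda x, y: x - y, 2)
add.consumers.append((mul, 0))
sub.consumers.append((mul, 1))
for node in (add, sub):                        # feed a=5, b=3 to both input nodes
    node.feed(0, 5)
    node.feed(1, 3)
# out is now [16], i.e. (5 + 3) * (5 - 3)
```

    Because add and sub share no data edge, a parallel runtime would be free to fire them simultaneously on separate cores.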



    One thing is for certain -- new languages are needed. C/C++ simply aren't going to scale up to the coming hardware (or even the current GPU hardware).
  • Reply 3 of 9
    amorph · Posts: 7,112 · member
    [quote]Originally posted by ThinkingDifferent:

    <strong>Are you talking about multiple processing units on a single chip? Would remote communication be between the units while local communication take place between processing pipes on a single processing unit?</strong><hr></blockquote>



    As per the Cell article, IBM is planning for multiple cores on a single chip, with each chip capable of communicating with other, similar chips. So it's both parallelized and distributed. Local would be between cores, remote between chips.



    [quote]<strong>And finally, would the actor language replace the current concept of micro-code?</strong><hr></blockquote>



    No, I'm thinking system/application level. It would replace the C-like languages, Java, etc.
  • Reply 4 of 9
    amorph · Posts: 7,112 · member
    Here's some reading on actor languages, and derivatives:



    <a href="http://citeseer.nj.nec.com/492958.html" target="_blank">A highly technical summary hosted at NEC</a>



    An <a href="http://www.cs.uwaterloo.ca/~fmavadda/p58-agha.pdf" target="_blank">overview from MIT</a>. [PDF]



    A lighthearted example: <a href="http://www.toontalk.com/English/computer.htm" target="_blank">ToonTalk</a>



    And something Programmer might be interested in:



    A paper on <a href="http://www.cs.uwaterloo.ca/~fmavadda/p58-agha.pdf" target="_blank">programming and animating on the screen at the same time</a> via actor languages.
  • Reply 5 of 9
    airsluf · Posts: 1,861 · member
  • Reply 6 of 9
    kickaha · Posts: 8,760 · member
    Alright, maybe I'm missing something vital and basic here, but in perusing those sources, I'm not finding anything new but the name.



    Essentially, actor languages are concurrent OO in a new terminology, as far as I can tell.



    What am I missing, Amorph?
  • Reply 7 of 9
    amorph · Posts: 7,112 · member
    [quote]Originally posted by Kickaha:

    <strong>Alright, maybe I'm missing something vital and basic here, but in perusing those sources, I'm not finding anything new but the name.



    Essentially, actor languages are concurrent OO in a new terminology, as far as I can tell.



    What am I missing, Amorph?</strong><hr></blockquote>



    Not much. Actor languages are, in sum, an attempt to implement concurrent OO elegantly, based on the assumption that the underlying hardware is significantly (or massively) SMP.



    I've been trying to find the link (ironically, I was first alerted to it on these boards...), but there's one in particular where each actor has a script, and (in the classic CS broken metaphor) when they receive a message, they apply their script to it, which produces zero or more messages to send to other actors. It's almost a parallelized state machine.
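    That "script maps a message to zero or more outgoing messages" shape is easy to sketch. A toy single-threaded Python simulation (hypothetical names throughout; a real runtime would deliver messages concurrently rather than one at a time):

```python
def run_to_quiescence(actors, initial):
    """Deliver messages one at a time; each delivery applies the target's
    script, which returns a (possibly empty) list of new (target, message)
    pairs. Stops when no messages remain in flight."""
    pending = list(initial)
    log = []
    while pending:
        target, message = pending.pop(0)
        log.append((target, message))
        pending.extend(actors[target](message))
    return log

# A splitter fans one number out to two workers; the workers emit nothing.
actors = {
    "splitter": lambda n: [("double", n), ("square", n)],
    "double": lambda n: [],                # zero outgoing messages
    "square": lambda n: [],
}
log = run_to_quiescence(actors, [("splitter", 3)])
# log == [("splitter", 3), ("double", 3), ("square", 3)]
```

    Viewed this way, each actor really is a little state machine, and the whole system is just those machines stepping in parallel as messages arrive.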



    Anyway, all the "new paradigm" talk is not so much "new to CS" as "new to VB/C++ programmers," who are still, basically, writing the bulk of production code in linear languages with extra bits of dynamism and parallelism bolted on.



    [ 08-08-2002: Message edited by: Amorph ]
  • Reply 8 of 9
    programmer · Posts: 3,467 · member
    [quote]Originally posted by Amorph:

    <strong>Here's some reading on actor languages, and derivatives:



    <a href="http://citeseer.nj.nec.com/492958.html" target="_blank">A highly technical summary hosted at NEC</a>



    An <a href="http://www.cs.uwaterloo.ca/~fmavadda/p58-agha.pdf" target="_blank">overview from MIT</a>. [PDF]



    A lighthearted example: <a href="http://www.toontalk.com/English/computer.htm" target="_blank">ToonTalk</a>



    And something Programmer might be interested in:



    A paper on <a href="http://www.cs.uwaterloo.ca/~fmavadda/p58-agha.pdf" target="_blank">programming and animating on the screen at the same time</a> via actor languages.</strong><hr></blockquote>



    Is the last link the one you wanted to link to, Amorph? It is the same one as the MIT link.
  • Reply 9 of 9
    amorph · Posts: 7,112 · member
    [quote]Originally posted by Programmer:

    <strong>Is the last link the one you wanted to link to, Amorph? It is the same one as the MIT link.</strong><hr></blockquote>



    Ummm, err...



    Someone wasn't looking too carefully. :o



    I still haven't found the original link I was looking for, either. Darnitol.