Project "Wolf" and "Presley" in July?


Comments

  • Reply 21 of 185
    [quote]Originally posted by keyboardf12:

    Yes, I don't buy it either, but scotphd said the whole concept of it was a stupid waste.[/quote]



    Oh, alright, I'll buy that. If there is an already-in-place base of computers, then yes, setting them up with this new software would be a way of milking a little more usability out of them. I can also concede that a design firm or some such group could use this. But I wonder what the cost of creating this is to target such a small market? Most Apple technologies (like iTunes and iPhoto) are not meant to be downloaded onto older systems; they're meant to add value to new systems. So that brings us back to what I said: if you were starting from the ground up, why not get a cluster to start with? I don't think Apple would spend time and money on a project only to prolong the life of their legacy systems (even if they are G4s); they want new sales, not to drag out the death of older setups. What do you think?
  • Reply 21 of 185
    mattbr Posts: 27
    [quote]Originally posted by allenmcjones:

    - A demo of an 8 Mac "Wolf" cluster ripped an entire 120 minute movie in under 15 seconds.

    -Allen[/quote]



    Now, there are two ways to take this.

    Either MP4 encoding just got a whole lot faster (it would take 8 x 15 seconds, which in my book means a single box from the cluster would need two minutes to encode the example movie... I'd buy it if it were 15 minutes, though), or Apple just invented the ultimate optical drive (if by "ripping" you mean reading at maximum optical-drive speed to the hard drive).
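
    A quick sanity check on that arithmetic (nothing here comes from Apple; it's just the numbers in allenmcjones' claim, worked out in a minimal Python sketch):

    [code]
    # Back-of-the-envelope check of the "8 Macs, 120-minute movie, under 15 seconds" claim.
    movie_minutes = 120
    cluster_nodes = 8
    cluster_seconds = 15

    # If the work split perfectly across the 8 nodes, one node alone would need:
    single_node_seconds = cluster_nodes * cluster_seconds              # 8 * 15 = 120 s, i.e. 2 minutes
    speedup_vs_realtime = (movie_minutes * 60) / single_node_seconds   # 7200 / 120 = 60x real time

    print(f"One node alone: {single_node_seconds / 60:.0f} minutes")
    print(f"That is {speedup_vs_realtime:.0f}x faster than real time per machine")
    [/code]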
  • Reply 23 of 185
    keyboardf12 Posts: 1,379
    Yes. My argument is that if you already have a base of computers, this is a great way to get more out of them for "free".



    [quote]But I wonder what the cost of creating this is to target such a small market?[/quote]



    I forgot to mention biometrics and all that gene research stuff, which is getting to be a BIG market (Apple has the head of Genentech on their board, and they bought a bunch of iMacs).





    I think the whole selling point for Apple would be: you can buy a bunch of machines that run Photoshop, Word, and Mail, and BTW they will cut the search times for your DNA-searching software at the same time.



    Way cheaper than Sun's and other workstations.
  • Reply 24 of 185
    keyboardf12 Posts: 1,379
    [quote]A demo of an 8 Mac "Wolf" cluster ripped an entire 120 minute movie in under 15 seconds.[/quote]



    That's fuzzy math. That's why I don't believe him.
  • Reply 24 of 185
    [quote]Originally posted by keyboardf12:

    I think the whole selling point for apple would be you can buy a bunch of machines that can run photoshop and word and mail and BTW they will cut your search times for your DNA searching software at the same time.



    way cheaper than sun's and other workstations[/quote]



    VERY interesting. I like the way you think. I will retire from posting for the afternoon. Thanks for the discussion!
  • Reply 26 of 185
    davegee Posts: 2,765
    Is it just me...



    After reading that post, allenmcjones reminds me of dorsal... Maybe it's the way the descriptions are given, but there is something very 'dorsal-like' about that post. Then again, maybe it's just me.



    But since I don't want to be totally off topic here: how does this fit in with the following screenshot of iShake?







    More photos for iShake are here: http://www.clubarne.com/pics/



    [edit]

    Hmm, maybe it's just a 'co-ink-e-dinks', but in that screenshot the # of networked systems is 8, just as in the demo allenmcjones wrote about that was ripping 120 minutes of video...





    Dave



    [ 06-22-2002: Message edited by: DaveGee ]
  • Reply 27 of 185
    blackcat Posts: 697
    There is a project going on in the UK by EPSRC called The Grid. Its goal is to build a cluster with enough power that any one user can get 32 teraflops. Amongst others, Apple, IBM and Sun are involved. M$ aren't, because it will be OS-independent.



    Wolf fits in well there.
  • Reply 27 of 185
    firelark Posts: 57
    [quote]If this is true it will be completely useless to almost everyone. People who actually need that kind of power just buy a real cluster. If Apple is spending money on this then it's a waste of time and money.[/quote]



    There are a lot of freelance people and small businesses in the 3D/video area who would love to get this feature. Besides, what's a "real cluster"?
  • Reply 29 of 185
    xool Posts: 2,460
    Some old sources who were very familiar (and to some extent involved) in the Mac Clustering scene tend to corroborate this info. I wonder if Dean Dauger is involved?



    What I'm talking about is OS-level clustering, similar to what was mentioned above. This is like project AppleSeed, but taken to the next stage.



    Now, a quick response to the detractors, about stealing processor cycles from the secretaries and all: get real! If there's this level of clustering built into Mac OS X, Apple will easily add a new pane to the Sharing preferences. One could easily switch sharing of one's processor on or off, or opt out of participating in the CPU network entirely.



    I don't know about you, but I see something like this in the future... eventually. The real question is how far off we're talking about.



    [ 06-22-2002: Message edited by: Xool ]
  • Reply 30 of 185
    buonrotto Posts: 6,368
    I could see Apple developing a clustering framework for the scientific and video markets they seem to be going after, no? Makes sense to me at least, but I'm no cluster guru. I doubt they plan to uproot the heavy-duty cluster supercomputers, but this could have a healthy market, kind of like medium-scale web servers.
  • Reply 31 of 185
    cinder Posts: 381
    The specs sound like marketing blab.

    But by use of hyperbole, it does get the point across.



    I love these code names, legit or not



    So



    Now you get a new 1.2GHz G4 with Wolf,

    and you've got a 'Glove' license,

    so you install X with Wolf right onto your old G3,

    plug him into your network,

    hot damn.





    I don't remember if anyone said this before, but this 'Glove' term - it sounds like a reference to

    "Taking the gloves off"



    and slapping around some computer/software manufacturers.



    This is fun.
  • Reply 32 of 185
    davegee Posts: 2,765
    [quote]Originally posted by Xool:

    Some old sources who were very familiar (and to some extent involved) in the Mac Clustering scene, tend to corroborate this info. I wonder if Dean Dauger is involved?



    What I'm talking about is OS-level clustering, similar to what was mentioned above. This is like project AppleSeed, but taken to the next stage.



    Now, a quick response to the detractors, about stealing processor cycles from the secretaries and all. Get real! If there's this level of clustering built in to Mac OS X, Apple will easily add a new pane to the Sharing preference panel. One could easily switch on/off sharing of one's processor or participating at all in the CPU network.



    I don't know about you, but I see something like this in the future... eventually. The real question, is how far off are we talking about.[/quote]



    I agree; a new System Pref could specify WHEN it is okay to give up cycles and exactly how many.



    Imagine:



    9am - 5pm would allow for a max of 5%

    6pm - 10pm would allow for a max of 40%

    10pm - 5am would allow for a max of 90%

    5am - 9am would allow for a max of 40%



    Well something like that....
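
    A minimal sketch (Python, purely hypothetical; the windows and percentages are just the example numbers above, nothing Apple has announced) of how such a time-based donation cap could be looked up:

    [code]
    from datetime import datetime

    # Hypothetical donation policy: (start_hour, end_hour, max_cpu_percent).
    # End hour is exclusive; the 5pm-6pm hour is left out, as in the example above.
    POLICY = [
        (9, 17, 5),    # 9am - 5pm: give up at most 5%
        (18, 22, 40),  # 6pm - 10pm: up to 40%
        (22, 24, 90),  # 10pm - midnight: up to 90%
        (0, 5, 90),    # midnight - 5am: up to 90%
        (5, 9, 40),    # 5am - 9am: up to 40%
    ]

    def max_cluster_share(now=None):
        """Return the maximum CPU percentage this box will donate right now."""
        hour = (now or datetime.now()).hour
        for start, end, cap in POLICY:
            if start <= hour < end:
                return cap
        return 0  # outside every window: donate nothing

    if __name__ == "__main__":
        print(f"Donating at most {max_cluster_share()}% of the CPU right now")
    [/code]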



    Where I work we have HUNDREDS of G4 boxes. Research... Anyway, many are in the labs, but even then not ALL of them are being used 24/7; after all, these guys do still work at 'the bench' much of the time. The computers are used for everything from email to surfing to burning cycles on screen savers.



    All of that wasted CPU time could be put to good use, but here is the issue.



    Apple *IS* a hardware company and needs to continue to sell hardware! Upgrading boxes to the "LATEST AND GREATEST" is a ritual for some companies (graphics and video), and Apple makes a boatload of cash from it. Now imagine if this stuff did exist... Would a company do 'mass upgrades' when faster CPUs came out? I dunno... This could kill hardware sales in some sectors.



    Dave



    [ 06-22-2002: Message edited by: DaveGee ]
  • Reply 33 of 185
    Didn't the old NeXT boxen and NeXTSTEP (now Cocoa) have the ability to do this already? This sounds like a NeXT generation of that.
  • Reply 34 of 185
    keyboardf12 Posts: 1,379
    Yes, NeXT did, but I forgot the name of it too.
  • Reply 35 of 185
    [quote]Originally posted by DaveGee:

    Would a company do 'mass upgrades' when faster CPUs came out? I dunno... This could kill hardware sales in some sectors.[/quote]



    This is one of my points from above. I don't see Apple making this software (even with "glove" IF THAT'S real) and undercutting new hardware sales. It doesn't make business sense.
  • Reply 36 of 185
    bigc Posts: 1,224
    rumours are rumours, ain't this a rumour board?
  • Reply 37 of 185
    kidred Posts: 2,402
    [quote]Originally posted by DaveGee:

    Is it just me...



    After reading that post allenmcjones reminds me of dorsal... Maybe it's they way the descriptions are given but there is something very 'dorsal like' about that post. Then again maybe it's just me.



    But since i don't want to be totally off topic here. How does this fit in with the following screen shot of iShake.







    More photos for iShake are here: http://www.clubarne.com/pics/



    [edit]

    Hmm maybe it's just a 'co-ink-e-dinks' but in that screen shot the # of network systems is 8 just as allenmcjones wrote about some demo that was ripping 120 minutes of video...





    Dave



    [ 06-22-2002: Message edited by: DaveGee ][/quote]



    Well, this pic is almost the exact thing he was describing. Eight people on a network... interesting. Where'd you get the pic?
  • Reply 38 of 185
    [quote]Originally posted by DaveGee:

    Hmm maybe it's just a 'co-ink-e-dinks' but in that screen shot the # of network systems is 8 just as allenmcjones wrote about some demo that was ripping 120 minutes of video...[/quote]



    Coincidence. Besides, network rendering already exists in other programs (it's just the dual-processor technology applied to multiple systems). I don't see any kind of giant movement to this technology. 120 minutes ripped that fast? Yah, and if you believe that, I have a 'real' copy of Photoshop that I paid for. And my copy of Unreal Tournament is real too. Want it also?



    But really: NETWORK RENDERING does NOT equal cluster functionality. This cluster technology with the code name "Wolf" doesn't make any more sense than Microsoft incorporating a similar technology. It plain doesn't work. Here is why.



    Person A sends a job to 5 people. One of those 5 suddenly has a crash ($hit happens!), so that job gets sent out to a new person. Another of those original 5 has a job of its own and sends some of the work back to Person A's computer (try job swapping? any sense in that?). Now another of those computers just finished its part (since it's a nice dual-processor machine) and sends its work back. But uh-oh, mid-transmission the router dies. Its workload just disappeared. Person A never knows that the router connecting him to that computer died, so after a while of 'no responses' his machine has to send the task out to yet another user.

    So who is managing all of this job swapping? Is HIS computer doing it? Then THAT taxes the processor too. Also, with those limited hours, do folks now have to leave their computers on 24 hours a day to really 'make those horses work'? How do you think the privacy is gonna work? Sure, I understand you can make control settings and so on, but what I mean is: how do people FEEL about giving up some processing ability? Suddenly every glitch in the system is going to be blamed on 'somebody using my computer'.

    Also... does Person A's computer then query all the computers it's connected to and evenly hand out work assignments? Or does the 350 MHz B&W G3 get the same work as the Dual 800? So now they are swapping specs over the network? What if I receive one of the 'job orders' and suddenly decide that I want ALL of my processor back NOW? Can I simply click "stop working" and have my processor back for ripping the CD I was copying?
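
    For what it's worth, the bookkeeping being described here isn't mysterious: the usual answer is a coordinator that tracks outstanding jobs and re-queues anything that goes silent. A minimal, purely hypothetical sketch (not anything Apple has shown) of that timeout-and-reassign loop:

    [code]
    import time
    from collections import deque

    JOB_TIMEOUT = 60.0  # seconds of silence before we assume the worker (or its router) died

    class Coordinator:
        def __init__(self, jobs):
            self.pending = deque(jobs)   # work not yet handed out
            self.outstanding = {}        # job -> (worker, time it was assigned)
            self.results = {}            # job -> finished result

        def assign(self, worker):
            """Hand the next pending job to a worker asking for work."""
            if not self.pending:
                return None
            job = self.pending.popleft()
            self.outstanding[job] = (worker, time.time())
            return job

        def complete(self, job, result):
            """A worker sent back a finished job; duplicates from re-issued jobs are ignored."""
            if job in self.outstanding:
                del self.outstanding[job]
                self.results[job] = result

        def reap_stragglers(self):
            """Re-queue any job whose worker went quiet (crash, dead router, user bailing out)."""
            now = time.time()
            for job, (_worker, started) in list(self.outstanding.items()):
                if now - started > JOB_TIMEOUT:
                    del self.outstanding[job]
                    self.pending.append(job)
    [/code]

    Whether that overhead is worth it on desktop-grade machines is exactly the question being argued here.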



    There are just too many problems with all of this. I cannot agree with anyone who would accept this lame attempt at yet another scam. IF such stuff exists, it is NOT OS-wide. It is application-specific in the extreme and would not be part of any "Glove" strategy. You don't go to bat with a popsicle stick, and you don't sew on a button with a railroad spike. Misapplication of a good idea.



    AND, FYI for the other post: a REAL cluster is exactly that, powerful computers devoting all their time to computing, connected in such a way as to eliminate bottlenecks. Most successful 3D companies use them. I don't think a 'hole in the wall' company that implemented this imaginary OS upgrade would take George Lucas by storm by rendering their new movie on 8 G4s. Sheesh. Remember, there ARE professionals out there, and they are successful because they know what they are doing. If this was so simple and radical, why doesn't Microsoft do it? Simple: it doesn't work right, not the way it is described in this forum.



    Take your best shot. I can stand the pain.
  • Reply 39 of 185
    amorph Posts: 7,112
    Transparent, system-level clustering has been a solved problem for almost two decades. VMS bowed in with full support in 1985, and it currently has the most efficient clustering support in the industry. Compaq let go of a lot of DEC engineers when they acquired the company, and they nearly killed VMS before they realized what a valuable property it was, so some of that old team could be at large, or now at Apple.



    Beowulf clusters and the like obviously exist, and applications exist that handle clustering themselves (e.g.: Oracle 9i), but the hardware is very difficult to set up, upgrade and maintain, and the software is difficult to write and upgrade. With system-level clustering, multiple machines look like one machine at the application level, so any well-threaded application will automatically be clustered (with the OS handling load balancing and failover) just as it's automatically distributed across multiple processors. This is extremely important for Apple, because it means that they can offer hardware that scales up to big iron performance without offering big iron.
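
    To make the "well-threaded application" point concrete, here is a minimal sketch (Python, purely illustrative; nothing Apple-specific is assumed) of a worker-pool split. The claim is that under transparent, system-level clustering this same code would spread across the pack, because scheduling, load balancing, and failover live below the application:

    [code]
    from concurrent.futures import ProcessPoolExecutor

    def encode_segment(segment):
        # Stand-in for the expensive per-chunk work (e.g. one chunk of a movie encode).
        return sum(i * i for i in range(segment * 100_000))

    def main():
        segments = list(range(32))  # the job, split into independent chunks

        # The application only says "do these in parallel"; whether the workers land on
        # local CPUs or (with OS-level clustering) on other boxes is not its problem.
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(encode_segment, segments))

        print(len(results), "segments done")

    if __name__ == "__main__":
        main()
    [/code]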



    This is not network rendering (in the pic above, Shake knows which machines are working on what; with system-level clustering it would have no idea that it was being run on more than one machine). It's not about iMacs, or even PowerMacs. This is about the Xserve and subsequent rack-mountable efforts, which have 2 Gbps Fibre Channel on board; the Xserve RAID has two such buses. That's the kind of bandwidth that clustering needs, at minimum.



    Wolf is an apt codename for a clustering solution. The pun on "Beowulf" aside, a wolf by itself is a fairly effective predator, but a "cluster" (pack) of wolves is deadly. What this means for Apple, if it is indeed true, is that you can buy an XServe to run your software, and keep buying more as your needs grow - but running the same software, and with the clustering transparent and plug-and-play. This would not be an industry first, but (in true Apple style) it would be an industry first at a price point within two or three orders of magnitude of Apple's price range.
  • Reply 40 of 185
    [quote]Originally posted by DaveGee:

    Apple *IS* a hardware company and needs to continue to sell hardware! Upgrading boxes to the "LATEST AND GREATEST" is a ritual to some companies (Graphics and Video) and Apple makes a boat load of cash from it. Now imagine if this stuff did exist... Would a company do 'mass upgrades' when faster CPUs came out? I dunno... This could kill hardware sales in some sectors.[/quote]

    I suppose it might, but I suspect it would increase sales more than decrease them. Let's say you work in a business that is a mixed computing environment, say an editing studio. Your editors are all on Macs, but the accounting department, scheduling, and perhaps the reception desk use PCs for cost savings (not arguing TCO here, I've just noticed this occasionally). So it's coming time to buy some new computers for accounting. You could either buy another replacement PC, or you could buy an iMac G4, plug it into the system, and contribute clock cycles back to the network, thus getting your paying work done faster for your clients. Which computer would you choose? Now apply that to an ad agency with hundreds of seats, of which probably only 20% are Macs (creative and studio). Buy an iMac for an account exec and you make the studio run faster. Buy 300 of them and you're contributing a significant amount of time savings to the work at hand. Quite smart if you ask me.



    [ 06-22-2002: Message edited by: admactanium ]