Stevenote, PCI Express, and SLI

I'm sure by now you've all watched the Steve Jobs keynote from WWDC.



He made a big deal out of leveraging the GPU, and how well the new Mac OS X Tiger, Xcode, Quartz Extreme, Core Image, and Core Video could presumably take advantage of the GPU compared with the competition.

Now the whole PC side will seemingly take a graphics advantage over the Mac by doubling up GPUs with SLI and offering ultra-fast PCIe interconnects.



As I said, Steve Jobs made leveraging the GPU a centerpiece of this past keynote.



So what will Apple do?



The promised land is staring them right in the face. Even without IBM producing a faster CPU, Apple could seemingly take a huge performance lead for the first time in over a decade, if Apple were to realize this and go for the gold. By the time Apple could bust out a new PowerMac motherboard using these technologies, IBM may very well have a 3GHz CPU ready to accompany it. I'm thinking within 5-6 months, and Apple could just shock the pants off all of us, PC and Mac users alike.



So here is my question.



What will Apple do?



My thought is that Apple has a huge chance to offer that one high-end PowerMac in the lineup that just screams. It's pretty early for PowerMac discussion, I agree, but recent events have discombobulated my perception of what I had expected out of graphics from Macs and PCs.



I'm not even going to bring into this discussion that Nvidia has PCIe Quadro FX 3400s and 4000s on the way.



Along the same line of thinking: Apple has been saying two brains are better than one by offering dual CPUs for the past few years, and they are right to do so. Would you ever switch back to a single-CPU machine? I wouldn't.



I think they have to offer at least that one dual-CPU/dual-GPU configuration. It would give Apple big-time bragging rights, and, I presume, the fastest personal computer you can buy. That in itself would also take away one of the biggest reasons the Mac is constantly overlooked before buyers even look at it: the assumption that a machine which starts out slower will end up feeling slow sooner than the PC they could have bought instead.

That's one of the many piled-on reasons the Mac is totally overlooked before the purchasing decision is even made. The Macintosh just never enters the mind because of it.



If Apple already has something in the works for the PowerMac line, they may very well be better off scrapping it. (I'm using the PowerMac as an example.)



It's early, but how will Apple address the motherboard in upcoming systems? I think we are playing a whole new ball game here, and the ball is actually in Apple's court. All they have to do is take it.

Comments

  • Reply 1 of 22
    existence Posts: 991
    My prediction:



    The first taste of what's to come will be the all-new iMac due in September. It will feature PCIe and a fast DX9-compliant graphics card.



    PowerMacs with PCIe will soon follow in the October/November time frame. Apple's already got the NVidia 6800 (hint: Apple didn't go the x800 route!). With PCIe, SLI will only require the drivers, which is relatively trivial since NVidia already has a working model.



    /end prediction/



    I do disagree that Apple will take a performance lead with this technology; it's a crossover technology from x86 and so workstations on the other side will also have this functionality (instead of Core*, they have DX9 as the API).



    Apple will merely achieve parity.

  • Reply 2 of 22
    hobbit Posts: 532
    Playing devil's advocate here.



    I'd say that most of these advances in Tiger are not geared towards a new high-end PowerMac but rather towards the lower end iMac.



    Apple has so far never tried very hard to compete in the high-end workstation market, so I really doubt they will start now.

    However, Apple has always tried to deliver good value (at least in their own perception) to the average user. That's Steve's real goal, his 'hippy legacy' if you will: a great computer 'for the rest of us', more so than for the high-end user.



    Add to that the fact that Apple, or rather IBM, still has some catching up to do in the GHz department, at least in the mindshare of the average consumer.



    Therefore I conclude that most of the new advances utilizing the GPU more and more are not in fact geared towards creating high-end PowerMacs, but rather towards getting better performance out of the average iMac, making it more attractive to the normal user and helping to level the GHz gap.

    If this also benefits some PowerMac users, great, but that would only be a fringe benefit, not the main goal.
  • Reply 3 of 22
    Well, here are my questions for those who know more about this than I do. Would a dual-dual-processor "PowerStation" Mac be an alternative to the Xserve? My understanding is that the Xserve is really just a very space-efficient way of packing in a lot of processors, and the GPU isn't a big factor, or a factor at all (correct?). However, a workstation with a great GPU may be a different beast with its own strengths not addressed by the Xserve. I've read lots of complaints about Apple's lack of GPU offerings and how this is holding them back in 3D, CAD, etc. Has this been improved with the new GPU announced at WWDC? Apple is clearly trying to own the video market with all their high (and low) end software. Does a rack full of Xserves keep Pixar happy, or would they benefit from more GPU strength?
  • Reply 4 of 22
    onlooker Posts: 5,252
    Being that it's 4:45 am and I have to be at work at 5, I only have time to say that I think Apple getting DX9 to full compliance, or even running it on Macs, seems unlikely. I think they have matched DX9, its shading engines, and its shader language with OpenGL equivalents of their own, it seems. But I wouldn't turn down full DX9 compliance if somebody handed it to me on a Mac. I'll have to get back to this later; I have to go.
  • Reply 5 of 22
    tedndi Posts: 1,921
    I think that the built-in Xserve capability in Tiger might also be a clue. If the entry-level consumer buys a "nice" Mac, totally falls in love with the platform, and buys more (presumably for other family members, or as the technology improves), then the cluster concept will give this consumer added leverage on a networked system. For the high-end user, the addition of Xserve support will improve heavy number-crunching right out of the box.



    Can anyone answer this question: can a dedicated CPU be configured to drive pure video?



    I ask because I wonder if a clustered group of Macs could perform the CPU functions while the odd box out directs the work units and drives the graphics.



    Assume that the interconnect problem has been solved.



    If the answer is yes, then I predict this is the future! Supercomputers for the rest of us! Talk about bragging rights!
  • Reply 6 of 22
    wizard69 Posts: 13,377
    Not to burst your bubble, but people have been building supercomputers in their basements for some time now. Linux and Beowulf is the way. Frankly, there is nothing to keep anybody from building similar systems with other OSes.



    In any event the problem is not the hardware; it is a software issue. If you can program up an application that uses multiple computers and threads, and that the average person would want to run, then hop to it.



    By the way, I'm not sure whether this would be considered average or not, but raytracing does get implemented on these systems. I'm not sure people doing such work can be considered average.
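
    To make the work-unit idea concrete, here's a toy Python sketch (purely my own illustration, not anything Apple or IBM ships) where a coordinating box chops a raytrace-style render into row bands and farms them out to a pool of workers. On a real cluster you would swap the local pool for a scheduler such as Xgrid or MPI, but the shape of the program is the same.

        import multiprocessing as mp

        WIDTH, HEIGHT = 320, 240

        def shade(x, y):
            # stand-in for a real raytracer: one cheap calculation per pixel
            return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)

        def render_band(band):
            # one "work unit": a horizontal band of rows
            y0, y1 = band
            return [shade(x, y) for y in range(y0, y1) for x in range(WIDTH)]

        if __name__ == "__main__":
            # the coordinating box hands out bands and gathers the results
            bands = [(y, min(y + 16, HEIGHT)) for y in range(0, HEIGHT, 16)]
            with mp.Pool() as pool:  # local CPUs; a cluster scheduler would slot in here
                results = pool.map(render_band, bands)
            pixels = [px for band in results for px in band]
            print("rendered %d pixels in %d work units" % (len(pixels), len(bands)))

    The hard part, as I said, is writing something in that shape that the average person actually wants to run.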



    In a similar vein, it is also possible to build large applications across a network of computers; in effect you have many instances of GCC building modules for your program.



    DAVe







    Quote:

    Originally posted by TednDi

    I think that the built-in Xserve capability in Tiger might also be a clue. If the entry-level consumer buys a "nice" Mac, totally falls in love with the platform, and buys more (presumably for other family members, or as the technology improves), then the cluster concept will give this consumer added leverage on a networked system. For the high-end user, the addition of Xserve support will improve heavy number-crunching right out of the box.



    Can anyone answer this question: can a dedicated CPU be configured to drive pure video?



    I ask because I wonder if a clustered group of Macs could perform the CPU functions while the odd box out directs the work units and drives the graphics.



    Assume that the interconnect problem has been solved.



    If the answer is yes, then I predict this is the future! Supercomputers for the rest of us! Talk about bragging rights!




  • Reply 7 of 22
    hmurchison Posts: 12,419
    Nvidia SLI is interesting. Rumor has it that PCI Express SLI configs are potential speed demons.



    I love the idea of SLI; however, I'm hoping to see some changes in future incarnations.



    1. Better cooling methods that don't require two fans eating up internal space and forcing PCI slots to be forfeited. Something like strapping two cards into the case with a liquid-cooling setup between them that would keep both cards safe, and quiet to boot.



    2. Either a way to tap into each frame buffer contiguously, or models shipping with a shared frame buffer to save $$$$.



    3. Dual GPUs on one card, so that GPU density increases in each SLI configuration. I'm thinking that ATI may bring back the MAXX dual-GPU technology eventually.



    SLI is definitely exciting and goes beyond just gameplay. Pixel and vertex shader programmability is now being tapped directly by the OS in OS X, and eventually in Longhorn as well, so our future is now tied closely to the ATIs and Nvidias of today. The GPU is now more than just a game component; it's a co-processor with far more benefits than cranking out gonzo frame rates. The future tasks will be engineering them to run cool (now that they will be going full bore all day, heat dissipation is definitely an issue) and dealing with space considerations.



    I never realized that adding programmability to graphics would yield such a shift in how we use our computers. It's likely that in the near future web standards will evolve to the point where your browser talks directly to the GPU with very little CPU interaction. This would enable web interfaces with true high-quality 3D and 2D graphics sent as low-bandwidth code, with the GPU drawing the graphics locally at speed. Imagine going to a site and downloading its textures into a texture library (in a standardized format), then seeing the graphics as the site intended while using only a pittance of bandwidth after the initial texture download.



    The mind boggles at the potential.
  • Reply 8 of 22
    kim kap sol Posts: 2,987
    I just can't wait for people to calm down with the 3D games and go back to some 2D games. 3D is cool and all, but 2D should not be neglected as much as it has been for the past 6 years.



    Remakes of old games with high-definition 32-bit graphics, alpha blending, OpenGL layering, and some of the CoreImage effects or shaders would be really, really cool.



    Imagine Dark Castle in 32-bit color at 1920x1200 using 2D photorealistic sprites.



    There was a time when 2D just couldn't be piped to the screen fast enough. But with OpenGL, current video cards, and the upcoming PCI Express, I think that kind of stuff is possible.
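
    Just to sketch what piping 2D through the 3D card means in practice, here is a minimal PyOpenGL/GLUT toy of my own (purely illustrative, not tied to any Apple API): one alpha-blended sprite drawn as a textured quad, so the video card does the scaling and compositing instead of the CPU.

        import numpy as np
        from OpenGL.GL import *
        from OpenGL.GLUT import *

        def checker_sprite(size=64):
            # procedural stand-in for a sprite: a fine checker of opaque white
            # pixels over a fully transparent background
            img = np.zeros((size, size, 4), dtype=np.uint8)
            img[::2, ::2] = (255, 255, 255, 255)
            img[1::2, 1::2] = (255, 255, 255, 255)
            return img

        def display():
            glClear(GL_COLOR_BUFFER_BIT)
            glBegin(GL_QUADS)              # one sprite = one textured quad
            glTexCoord2f(0, 0); glVertex2f(-0.5, -0.5)
            glTexCoord2f(1, 0); glVertex2f( 0.5, -0.5)
            glTexCoord2f(1, 1); glVertex2f( 0.5,  0.5)
            glTexCoord2f(0, 1); glVertex2f(-0.5,  0.5)
            glEnd()
            glutSwapBuffers()

        if __name__ == "__main__":
            glutInit()
            glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA)
            glutInitWindowSize(512, 512)
            glutCreateWindow(b"2D sprite on the video card")
            sprite = checker_sprite()
            glBindTexture(GL_TEXTURE_2D, glGenTextures(1))
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST)
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST)
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, sprite.shape[1], sprite.shape[0],
                         0, GL_RGBA, GL_UNSIGNED_BYTE, sprite)
            glEnable(GL_TEXTURE_2D)
            glEnable(GL_BLEND)             # alpha blending, as with classic 2D sprites
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
            glClearColor(0.2, 0.3, 0.4, 1.0)
            glutDisplayFunc(display)
            glutMainLoop()

    A real game or a CoreImage-backed remake would go much further, but even this little bit shows why high-resolution 2D stops being a bandwidth problem once the card is holding the textures.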
  • Reply 9 of 22
    hmurchison Posts: 12,419
    KKS I agree. 2D games are no less fun than 3D. Hell even the fun side scroller still has life for the creative developer.
  • Reply 10 of 22
    kim kap sol Posts: 2,987
    Quote:

    Originally posted by hmurchison

    KKS I agree. 2D games are no less fun than 3D. Hell even the fun side scroller still has life for the creative developer.



    Prince of Persia HD

    Dark Castle HD

    Maybe some Sierra-style or Lucas Arts-style adventure games.



    If these kinds of games made a comeback using the 2D technology we have today...wow! They would be jaw-dropping.
  • Reply 11 of 22
    Quote:

    Originally posted by kim kap sol

    I just can't wait for people to calm down with the 3D games and go back to some 2D games. 3D is cool and all, but 2D should not be neglected as much as it has been for the past 6 years.



    Remakes of old games with high-definition 32-bit graphics, alpha blending, OpenGL layering, and some of the CoreImage effects or shaders would be really, really cool.



    Imagine Dark Castle in 32-bit color at 1920x1200 using 2D photorealistic sprites.



    There was a time when 2D just couldn't be piped to the screen fast enough. But with OpenGL, current video cards, and the upcoming PCI Express, I think that kind of stuff is possible.




    Yes yes yes!!!!

    A Chrono Trigger or Secret of Mana remake would be awesome!
  • Reply 12 of 22
    kendoka Posts: 110
    Best 2D game ever (and obviously in need of a remake):

    "Moonstone - a hard days Knight" (Amiga 500).
  • Reply 13 of 22
    onlooker Posts: 5,252
    Ok, whatever. Go here -> LINK if you want to play 2D games like crazy, but please start another thread about it in General Discussion or something.



    I'm still curious as to what people's takes are on the next Mac motherboard, and what we may expect.
  • Reply 14 of 22
    Quote:

    Originally posted by onlooker

    I'm still curious as to what people's takes are on the next Mac motherboard, and what we may expect.



    My bet is definitely on PCIe for the next PowerMacs, with SLI possibilities, and SLI already built in on the higher-end models. But I think that point is really obvious.



    Now, the iMac is harder to guess. SLI? Of course not. PCIe? Why not... It seems that Apple has decided to create a whole new design that will be used as the starting point for the iMac line for the next 2-3 years, so waiting that long before adding PCIe support would be foolish.
  • Reply 15 of 22
    hmurchison Posts: 12,419
    Hmmm next motherboard.





    Well, PCI Express is definitely in, but I have my doubts about SLI configurations being standard. Two cards take up a lot of space, but Apple may include two x16 slots.



    New K3 Controller:



    SATA II - up to 300MB/s throughput and NCQ support (4 channels)

    FireWire - 2 FW400 and 2 FW800 ports on separate buses.

    Gigabit - 2 Gigabit Ethernet ports... yes, 2, supporting link aggregation with Tiger Server.

    AirPort - no more cards necessary; AirPort is now easy enough to include in the controller, leaving the appropriate base stations to sell.

    USB 2.0 - 2 ports



    PCI Express - bridge, but no memory controller. I think the next PowerMacs will have on-die memory controllers on the 97x CPUs. Each CPU will have HT links to the other using HT 2.0, but that may be overkill.



    Personally I think they should add HDMI connections as well, so it would be easy to pump video and full-bandwidth audio into stereo gear. S/PDIF is nice, but it never carries full-bandwidth multichannel audio; HDMI does, and its signal is compatible with DVI.
  • Reply 16 of 22
    hobbit Posts: 532
    I fully expect the new iMac to leapfrog the current PowerMacs in some respects, either by introducing PCI Express, better standard graphics cards, and/or a new system controller design.



    Consequently I think that this was the reason iMacs were not introduced at WWDC: to avoid disrupting sales of the low-end PowerMac, as these new features would outweigh dual CPUs for some buyers.



    If this is the case, then it would also mean that PowerMacs could get another revision this year. With new iMacs shipping in September, I cannot believe Apple would wait until January to close the gap.



    But in the end perhaps it really is down to IBM after all.



    Initially there were reports that the next-generation 9xx chip, the one derived from IBM's Power5 architecture, was due to start shipping in July/August. Maybe that slipped.

    It could easily be that the new system controller chip, the one which goes with the new 9xx chip, is not backwards compatible with the 970 or 970fx; that's especially likely if the new 9xx chip comes with an on-chip memory controller.

    At that point Apple would have to delay any introduction of machines based on the new system controller, as they couldn't use the 970fx instead.



    Perhaps what Apple really intended was to introduce new PowerMacs and new iMacs at WWDC, both based on the new 9xx chip. Once they learnt that the chips wouldn't be shipping by July, they decided to do a 'stop-gap' upgrade for the PowerMacs but still left an iMac introduction at WWDC open, only to decide later that it would hurt PowerMac sales, and hence delayed the iMac intro until September.
  • Reply 17 of 22
    concord Posts: 312
    The new iMac won't have PCI-E - it's too new and more expensive than AGP, and Apple will want to keep the prices down on their consumer desktop. We'll first see PCI-E on the PowerMac, and don't hold your breath: it will be at least the end of the year before we see that implemented.



    I hope Apple comes up with a dual-x16 high-end PM in relatively short order, but Apple is (usually) annoyingly slow to adopt cool new tech. I fully expect that they will adopt it, provided it performs as well as it claims, but I can't see it in PMs until sometime next year. They may even wait until they can showcase Tiger with it. I hope not, but I've got that feelin'...



    Cheers,



    C.
  • Reply 18 of 22
    onlooker Posts: 5,252
    As for the iMac, I'm totally with Concord. I don't expect to see PCIe in a new iMac, or in some sort of iMac replacement. I just don't think Apple had it in time. My guess is the mobo design was finished and in production before PCIe was readily available. Which brings me to my next point, which will probably end up in the iMac thread also: if the new iMac is a replacement, and not an all-in-one machine, there could possibly be a G5 at a lesser speed (1.2-1.6GHz) under the hood.

    If it is a cost-effective model, lower in price than the current iMac, the reason Apple may have decided to post that unusual statement may have been the Army's Xserve dual-G5 order. The Army may have wanted that ASAP, and when you're buying in that type of volume, you can usually get what you want - when you want it.

    Plus, nobody wants "The Man" crawling up their @ss.



    Now, with the PowerMac I would expect PCIe hopefully right before Christmas. I hate it when they wait until Macworld. I would hope Apple plans to stay competitive by offering 3 PCIe PowerMac configurations, and one more thing... a PowerMac with 2 x16 PCIe slots, to stay in the game with an SLI machine. I don't think they can afford not to offer it and still please Mac users and would-be switchers.



    Siggraph is like 33 days away. Probably too early to expect a PowerMac, but I think Apple has a chance to be the platform of choice for Maya users within a year if they, Pixar (RenderMan Artist Tools), and IBM can get it firing on all cylinders. Siggraph should be an interesting show nevertheless.
  • Reply 19 of 22
    telomar Posts: 1,804
    Quote:

    Originally posted by Concord

    The new iMac won't have PCI-E - it's too new and more expensive than AGP, and Apple will want to keep the prices down on their consumer desktop.



    Actually PCI Express is around the same cost to implement as AGP. In the end unless Apple's going to offer upgradable slots it doesn't matter a bit what they use in the iMac though.
  • Reply 20 of 22
    concord Posts: 312
    Quote:

    Originally posted by Telomar:

    Actually PCI Express is around the same cost to implement as AGP. In the end unless Apple's going to offer upgradable slots it doesn't matter a bit what they use in the iMac though.



    I didn't mean the slot on its own (unless you factor in development costs), but the PCI-E solution as a whole, including the graphics card.



    And yes, unless they offer some upgrade path, the point is pretty much moot. [And on the subject of the new iMac] After some consideration, I suspect the new iMac will continue to be an AIO solution. I'm sure Apple still loves the idea of being able to sell you a whole new package every time you upgrade. And if Apple were to make a headless unit with a new form factor - is it really an "iMac" any more?



    No, they would have branded it differently. Upgradability? Limited, I suspect, but I could be wrong. I think there will be some other new hook for the consumer, which may not be as obvious as a G5, etc.



    My 2 bits,



    C.