Apple's nForce

Posted in Future Apple Hardware, edited January 2014
I was just thinking about that "graphics strategy" Spymac is reporting on.



Isn't NVidia selling their graphics chips to graphics card makers along with a reference board design? So the card makers actually make their own designs. NVidia also made their own motherboard chipset, the nForce. But basically they still "only" sell GPUs to other manufacturers.



So, since Apple actually makes its own motherboards, would it be possible to, say, have Apple put 1, 2, or 4 NVidia GPUs onto its board and write a "multitasking" OpenGL driver to make use of the programmable GPUs? Since the newest GPUs are programmable, they could in theory be used to offload computation tasks (maybe rendering or video?) from the CPU. Also, would it be possible to have some connection other than AGP between the GPUs and the CPU?
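
Roughly the kind of dispatch I'm imagining - a totally hypothetical sketch, none of these class names are real Apple or nVidia APIs, and the GPU path is just stubbed out:

[code]
// Hypothetical CPU/GPU task dispatch -- illustration only, not a real API.
#include <cstdio>
#include <vector>

struct ComputeBackend {
    virtual ~ComputeBackend() {}
    // Run some image-processing kernel over a pixel buffer.
    virtual void process(std::vector<float>& pixels) = 0;
};

struct CpuBackend : public ComputeBackend {
    void process(std::vector<float>& pixels) {
        for (size_t i = 0; i < pixels.size(); ++i)
            pixels[i] *= 0.5f;  // stand-in for the real work
    }
};

// A GPU backend would upload the buffer, run a programmable shader,
// and read the result back; here that path is only pretended.
struct GpuBackend : public ComputeBackend {
    void process(std::vector<float>& pixels) {
        for (size_t i = 0; i < pixels.size(); ++i)
            pixels[i] *= 0.5f;  // pretend a shader did this
    }
};

int main() {
    std::vector<float> frame(640 * 480, 1.0f);
    GpuBackend gpu;  // the "driver" would pick CPU or GPU here
    gpu.process(frame);
    std::printf("first pixel: %f\n", frame[0]);
    return 0;
}
[/code]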



I was just wondering what tasks those GPUs could be programmed to do, since it could make even the speed of the main CPU not that huge an issue. In the pro market, software developers don't mind writing software for specialised processors if there is a noticeable speed improvement, and if Apple manages to make a good (as in faster than nForce 1 & 2) implementation of a CPU/GPU motherboard, it could be quite revolutionary.



Would a dual GPU in such a configuration even make sense? From what I know, OpenGL software could simply send every second operation to the second GPU, but would it bring a noticeable performance increase at higher polygon counts?
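
Something like this naive "alternate frames" scheme is what I mean - again a made-up sketch, GpuQueue is not a real API:

[code]
// Hypothetical alternate-frame dispatch between two GPUs.
#include <cstdio>

struct GpuQueue {
    int id;
    void renderFrame(int frame) {
        std::printf("GPU %d renders frame %d\n", id, frame);
    }
};

int main() {
    GpuQueue gpus[2] = { {0}, {1} };
    for (int frame = 0; frame < 8; ++frame)
        gpus[frame % 2].renderFrame(frame);  // even/odd frame split
    return 0;
}
[/code]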

Comments

  • Reply 1 of 23
    I would not be surprised if Apple will, at some point, have onboard video. It would probably only work in a consumer machine, because prosumers would want the option of upgrading their video. Maybe if they used a new type of slot...



    I'm sorry. This is all my imagination. Maybe someone with more direct knowledge could provide some insight?
  • Reply 2 of 23
    That sounds like a novel idea! Hmm... Offloading the video work onto the video card where it belongs. (I like that!) Would that extend to DVD playback, but NOT with dedicated chips? Like keeping with true 'software decoding' but without making it a proprietary issue? If that could be done on a 4x AGP card I'd be glad to buy it. I just don't see onboard happening, though. All the 'low end' systems that they'd put it in already have monitors built in (i.e. the eMac and iMac).
  • Reply 3 of 23
    crusadercrusader Posts: 1,129member
    You mean this Spymac report:

    [quote] Apple plans on making big waves in the graphics industry, and at Macworld New York, CEO Steve Jobs will shed some light on a long-planned project that will showcase Apple's commitment to the high-end graphics market.



    As we reported at the beginning of April -- before the announcement of Apple-branded servers, the purchase of Silicon Grail, or even articles suggesting Pixar's move to OS X -- Apple will announce a strategy that will be mutually beneficial to both Pixar and Apple.



    To protect our source, we are not able to provide any further details on the new technology. However, a quote from our original story stands true:



    "[Steve Jobs] could be stepping on a lot of feet... I have a feeling [Jobs] does not want anyone to know of this strategy." <hr></blockquote>



    To me it seems that Spymac is saying very little about this because they just made it up. Sorry for the evil cynicism, but it's Spymac.
  • Reply 4 of 23
    stoostoo Posts: 1,490member
    Apple might use a more conventional nForce chipset, like the PC ones: dual-channel DDR, with an onboard graphics chipset "occupying" one channel (IIRC).



    This may be the co-operation that Dorsal M mentions in his latest thread.
  • Reply 5 of 23
    eugeneeugene Posts: 8,254member
    [quote]Originally posted by speechgod:

    I would not be surprised if Apple will, at some point, have onboard video. It would probably only work in a consumer machine, because prosumers would want the option of upgrading their video. Maybe if they used a new type of slot...



    I'm sorry. This is all my imagination. Maybe someone with more direct knowledge could provide some insight?[/quote]



    Apple already has onboard video on every one of its machines except the Power Mac and Xserve.



    [ 06-15-2002: Message edited by: Eugene ]
  • Reply 6 of 23
    zazzaz Posts: 177member
    Ok, this is just goofy but my noodle is boiling now...



    Funny you bring this up, I was musing about it this AM to a buddy.



    The nVidia nForce isn't that old... pretty new design really.



    If the controller were made to use PPC chips rather than x86, the board would be almost ready as-is. Subtract the serial and PS/2 garbage and the PCI/AGP slots, add FireWire... perfect for the iMac lines.



    For a pro model, take out only a PCI slot or two, slip in FireWire and Gigabit Ethernet, and it's a done deal. (nVidia makes a board sans the onboard GPU, so that would likewise be the case on a pro model.)



    Does anyone know how technically different the chipsets are from PC to Mac? Is this a feasible venture?



    I mean, it does seem like things are pretty similar inside the box aside from the CPU... and they are all part of the HyperTransport group, right?



    nVidia and Apple do keep dropping hints about 'stellar' stuff to come. Maybe the GF5 hot air is really the first rendition of an nForce 2.0 mobo for Apple. It would be a good rollout of the architecture for nVidia.



    Uh oh.. now I am all giddy again.....



    Thoughts?
  • Reply 7 of 23
    nathan22tnathan22t Posts: 317member
    I find it extremely unlikely, to the point of impossibility, that Apple would begin to rely on one company so heavily for something as important as motherboard design. We can certainly see how Apple's reliance on Motorola has played out over the past few years. Luckily, Apple has the option of moving to IBM.



    Rubenstein and his teams are perfectly capable of designing powerful products. Perhaps more important is the high level of integration they are able to achieve with Ive's design team. Throwing a second company like Nvidia into the mix would just confuse and slow down the development cycle of new products. Apple needs the level of control that comes from being the primary developer of its hardware. In adopting nForce, Apple would be committing itself to long-term mediocrity compared to the nearly identical hardware available to PC users. It would be quite difficult to develop a reputation for high-end video workstations in this situation.



    Regardless of these concerns, the technical feasibility of modifying the PC-centric nForce is questionable at best. Even if it's possible, it would probably be far easier for Apple to do its own work.
  • Reply 8 of 23
    zazzaz Posts: 177member
    [quote]Originally posted by nathan22t:

    Regardless of these concerns, the technical feasibility of modifying the PC-centric nForce is questionable at best. Even if it's possible, it would probably be far easier for Apple to do its own work.[/quote]



    nathan, um... why do you say that? Beyond the CPU and the 'northbridge', the design of a logic board is pretty similar across both platforms.



    Both use PCI, AGP, Ethernet, USB, etc.



    Additionally, nForce is an actual implementation of HyperTransport, of which Apple is a founding charter member. Why sink dollars into it just to help the competition?



    Now, I don't have the 1337ty Mc1337 sKiLz to say "it is possible and here is how you do it"... but before we just disregard this, there will have to be a better reason than that.



    Also, your suspicion that Apple wouldn't want to put too much at stake with a single company is a little paranoid. They already depend on several. Besides, nForce is only the chipset... anyone could make the board itself. And unless Apple develops its own fab plant, it will always be subject to third-party vendors.



    [ 06-15-2002: Message edited by: zaz ]
  • Reply 9 of 23
    programmerprogrammer Posts: 3,457member
    I've been saying all this for months.



    Apple doesn't have to give up control over its motherboard or chipset design; it just has to adopt industry-standard chip-to-chip interconnects... like HyperTransport, for example, of which it just happens to be a consortium member. So are nVidia and ATI. If Apple builds a chipset with HyperTransport ports, it can then hook up HT-compliant devices.



    The new generation of graphics chips supports full floating point internally, in frame buffers, and in source textures, which makes them useful for things other than graphics. The programmability of these chips is carefully constrained so that they can be highly superscalar and extremely deeply pipelined... but they can still be used to do an awful lot of cool stuff. As with SIMD, I think it'll take a while for developers to figure that out, but once they do (and once there are enough of these machines out there) we'll see some very cool applications. 2D/3D graphics and video are just the beginning.
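
    To illustrate the point - my own sketch, in plain C++ rather than a real shader language: a shader is basically a pure function applied independently to every vertex or fragment, which is exactly what makes it so easy to parallelize:

[code]
// Why shaders parallelize: the kernel touches one fragment at a time,
// with no shared state, so any number of pipes can split the loop.
#include <vector>

struct Fragment { float r, g, b; };

// A shader-like kernel: a pure function of a single fragment.
Fragment shade(const Fragment& in) {
    Fragment out = { in.r * 0.8f, in.g * 0.8f, in.b * 0.8f };
    return out;
}

int main() {
    Fragment white = { 1.0f, 1.0f, 1.0f };
    std::vector<Fragment> framebuffer(1024 * 768, white);
    // Every iteration is independent -- a GPU runs thousands of these
    // in parallel, in any order, much like SIMD but far wider.
    for (size_t i = 0; i < framebuffer.size(); ++i)
        framebuffer[i] = shade(framebuffer[i]);
    return 0;
}
[/code]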



    If a G4 w/ HyperTransport arrives in the 1.25 - 1.5 GHz range, I could see a 100% performance improvement even without any other processor enhancements, and without considering any computational capability the graphics card has.



    Of course whether any of this comes true is entirely another matter.
  • Reply 10 of 23
    bartobarto Posts: 2,246member
    [quote]Originally posted by Eugene:




    Apple already has onboard video on every one of its machines except the Power Mac and Xserve.



    [ 06-15-2002: Message edited by: Eugene ][/quote]



    No, onboard video is where the graphics sub-system is part of the northbridge, sharing the system RAM.



    Every consumer/portable Mac basically has a graphics card as part of the motherboard.



    Barto
  • Reply 11 of 23
    airslufairsluf Posts: 1,861member
  • Reply 12 of 23
    eugeneeugene Posts: 8,254member
    [quote]Originally posted by Barto:




    No, onboard video is where the graphics sub-system is part of the northbridge, sharing the system ram.



    Every consumer/portable Mac has basically a graphics card as part of the motherboard.



    Barto[/quote]



    On-board means on-board. You're describing a chipset with an integrated graphics processor (ATI and nVidia like the term IGP). That's not the definition of 'on-board' video, though it is a type of on-board video.
  • Reply 13 of 23
    xypexype Posts: 672member
    [quote]Originally posted by Programmer:

    I've been saying all this for months.[/quote]



    Yep, your ideas were the ones that got me thinking when I read about the "new graphics strategy".



    [quote]Originally posted by Programmer:

    The new generation of graphics chips supports full floating point internally, in frame buffers, and in source textures, which makes them useful for things other than graphics. The programmability of these chips is carefully constrained so that they can be highly superscalar and extremely deeply pipelined... but they can still be used to do an awful lot of cool stuff. As with SIMD, I think it'll take a while for developers to figure that out, but once they do (and once there are enough of these machines out there) we'll see some very cool applications. 2D/3D graphics and video are just the beginning.[/quote]



    Well, I was thinking about the "how long it takes for developers to make use of it" part as well. I think what really motivates developers is seeing a product that uses it blowing away their own software. And with all the acquisitions Apple has been making recently, I'm sure that IF such a GPU thing happens, we might have a few titles at the very launch to impress people and motivate developers.



    I'm still wondering whether it would be hard to make a design using 2 or more GPUs, or using different GPUs (not necessarily integrated). If you had two, I guess load balancing would be easy to do, right?
  • Reply 14 of 23
    airslufairsluf Posts: 1,861member
  • Reply 15 of 23
    g-newsg-news Posts: 1,107member
    Are you guys really wasting your time discussing posts from Spymac?

    I mean, is there ANY site in the world that has less credibility? Dude, I'd even give www.sex.com more credibility than Spymac if they suddenly came up with Mac rumors.



    It's not even a month from now, and we'll know.

    We'll know if moki and Dorsal M are full of **** or not, and Apple will know if they get our cash or not.



    I personally have saved so much cash it's almost embarrassing, and I'm really ready to spend it.





    Give me a reason to, Apple.





    G_News
  • Reply 16 of 23
    xypexype Posts: 672member
    [quote]Originally posted by G-News:

    Are you guys really wasting your time discussing posts from Spymac?[/quote]



    I am just discussing possibilities here. It seems an interesting idea - as OpenGL Quartz did, back when most people thought it was plain useless. Since Mac OS X is quite graphics-intensive, an onboard GPU would make sense - even more so if every Power Mac has it, so developers can depend on their software running on every such Power Mac.
  • Reply 17 of 23
    rogue27rogue27 Posts: 607member
    Well, I would never say moki is full of asterisks, but he's also being vague enough to stay out of trouble.
  • Reply 18 of 23
    programmerprogrammer Posts: 3,457member
    [quote]Originally posted by AirSluf:

    Twin GPUs are not overly difficult. There have been high-end workstation cards that are twin-GPU. Cost and size of market are the bigger issues. Imagine what a twin GF4 with 128MB each would go for once you amortize the board design over the small market. Currently there are $2500 GFX cards that are high-end single-GPU. Not sure who manufactures those, but a compatriot of mine just bought 3 to run a wide-screen 3-display system.



    Load balancing duals is trivial. Processor A works odd frames, processor B works even frames. Keeping track of unchanged portions from one frame to the next in shared memory is much more difficult, unless you decide to bag that whole optimization.



    [ 06-16-2002: Message edited by: AirSluf ][/quote]



    That's the old way of load balancing, from before GPUs were designed to share the work. Even 3dfx had one processor do even lines and the other do odd lines. The great thing about the shader execution model these new processors use is that the GPU or driver takes care of when a shader executes, and it executes on a very atomic piece of the scene (i.e. a vertex or a fragment). This allows a lot of flexibility in how execution is managed, so going from one graphics pipe to two, four, eight, or more is really straightforward. A pair of chips (each of which is a large number of execution units) can share the work on a much more intimate level than before.
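
    In other words, roughly the difference between these two toy schedulers (illustration only - real hardware does this internally):

[code]
#include <cstdio>

const int kPipes = 4;

// Old scheme: whole frames (or scanlines) alternate between two chips.
int pipeForFrame(int frame) { return frame % 2; }

// New scheme: each fragment is an atomic work item, so the work can be
// dealt out to any number of pipes at a much finer grain.
int pipeForFragment(int fragment) { return fragment % kPipes; }

int main() {
    for (int f = 0; f < 8; ++f)
        std::printf("fragment %d -> pipe %d\n", f, pipeForFragment(f));
    return 0;
}
[/code]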





    The current "on-board" GPUs still go through the AGP bus, they just happen to be hard wired to the motherboard. That limits their performance to AGP bus speeds. If HyperTransport (or something else really fast) were used then performance gets a lot better. Quartz Extreme on something like this would be spectacular, but its the 3D demos that will knock your socks off. Graphics companies have been talking about doing Final Fantasy in real-time, and we are going to get pretty darn close this year.
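
    A quick back-of-the-envelope on why the bus matters - the bandwidth figures here are from memory (AGP 4x around 1.07 GB/s, a 32-bit HyperTransport link around 6.4 GB/s per direction), so treat them as rough assumptions:

[code]
// Rough frames-per-second each bus could carry for raw 1024x768
// 32-bit frames. Bandwidth numbers are assumptions from memory.
#include <cstdio>

int main() {
    const double frameBytes = 1024.0 * 768.0 * 4.0;  // 32 bits/pixel
    const double agp4x = 1.07e9;                     // bytes/sec
    const double hyperTransport = 6.4e9;             // bytes/sec

    std::printf("AGP 4x:         %.0f frames/s\n", agp4x / frameBytes);
    std::printf("HyperTransport: %.0f frames/s\n",
                hyperTransport / frameBytes);
    return 0;
}
[/code]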



    Writing the shaders in their assembly language isn't really that hard, but nVidia's Cg language is certainly a good way to dive in and do some really cool stuff much more easily. I hope Apple brings it to the Mac OS X platform along with the hardware -- they could make this the leading platform for shader development, and that would be a really big win for Apple.
  • Reply 19 of 23
    kidredkidred Posts: 2,402member
    [quote]Originally posted by G-News:

    Are you guys really wasting your time discussing posts from Spymac?

    I mean, is there ANY site in the world that has less credibility? Dude, I'd even give www.sex.com more credibility than Spymac if they suddenly came up with Mac rumors.



    It's not even a month from now, and we'll know.

    We'll know if moki and Dorsal M are full of **** or not, and Apple will know if they get our cash or not.



    I personally have saved so much cash it's almost embarrassing, and I'm really ready to spend it.





    Give me a reason to, Apple.





    G_News[/quote]





    Moki isn't, hasn't, and won't predict anything, but he will speculate. If he's in the know, he won't say a word; trust me on that, he won't get his friends in trouble and risk his/their NDA.



    If he's speculating, then he's not full of crap even if it doesn't come true. He never said "xxx will happen because of a source I know..."
  • Reply 20 of 23
    xypexype Posts: 672member
    [quote]Originally posted by Programmer:

    The current "on-board" GPUs still go through the AGP bus; they just happen to be hardwired to the motherboard. That limits their performance to AGP bus speeds. If HyperTransport (or something else really fast) were used, performance would get a lot better. Quartz Extreme on something like this would be spectacular, but it's the 3D demos that will knock your socks off. Graphics companies have been talking about doing Final Fantasy in real time, and we are going to get pretty darn close this year.[/quote]



    That's what I had in mind - since Apple is "lacking" in the 3D department, this might be a way to truly put Power Macs ahead there. In bigger companies rendering is offloaded to cheap x86 boxes anyway, so rendering speed differences might not be that big an issue. But if Apple can make a HyperTransport link to the GPU, and even make the GPU unit itself swappable with other or double/quadruple GPUs, the performance difference should be very noticeable.



    Right now a GeForce 4 is considered to be within some 80% of the "pro" cards' speed range, and two of them might well give Apple the edge. And if Apple then releases QuickTime APIs that make use of the GPU, along with a general "computation on the GPU" API, that'd be great, and Steve-o would most likely be doing more Photoshop bake-offs.



    What I'm wondering about is that most of this stuff is in theory doable with today's technology, but no company has the guts to actually assemble such a machine. Maybe Apple does :o )