Future of Graphics Processors? (Make it simple)

Posted in Future Apple Hardware
Could some of you GP mavens postulate as to where graphics processors are headed in the next year or two, but do it in simple terms with lots of real-world examples?



That would be swell.

Comments

  • Reply 1 of 37
    pscates Posts: 5,847member
  • Reply 2 of 37
    elizaeffect
    ATI will own the market in terms of raw power, while Nvidia will have the most DirectX- and OpenGL-compatible features. They're already moving in their separate directions and creating their own markets. Nvidia has long held onto the OEM (mobile) market, while ATI's starting to break in with its Radeon series for laptops, but those are going to need some serious revisions before they're even considered "high-end."
  • Reply 3 of 37
    programmer Posts: 3,457member
    Quote:

    Originally posted by elizaeffect

    ATI will own the market in terms of raw power, while Nvidia will have the most DirectX- and OpenGL-compatible features. They're already moving in their separate directions and creating their own markets. Nvidia has long held onto the OEM (mobile) market, while ATI's starting to break in with its Radeon series for laptops, but those are going to need some serious revisions before they're even considered "high-end."



    It's debatable whether ATI will "own the market in terms of raw power". They do at the moment, but that could change quickly. The feature set is a tricky business since the two companies tend to leapfrog each other continually. ATI was first to DX9.0 but nVidia went straight to DX9.1. The next core from ATI may very well leapfrog nVidia again. I don't expect either company to maintain a lead in either power or features.



    nVidia's mobile GPU is a relative newcomer to a scene ATI has been in for a long time. ATI's mobile 3D performance was weak until recently, but its 2D and video performance has been superior. I think ATI's latest mobile products outshine nVidia's, and given the power-hungry nature of nVidia's new GeForce FX it'll be a while before they get that technology into a mobile form. It will be a tight race there too for the foreseeable future.



    I expect ATI and nVidia to continue battling for supremacy in the GPU markets, and the rest of the would-be contenders will be sidelined in niche markets unless a big money player weighs in with a new technology or buys up one of the smaller players (like 3DLabs). Matrox is still kicking, but it seems more like a last gasp or convulsive twitching.



    As for future parts, we will see the current top-of-the-line programmable parts migrate down into the lower end of the market. Rather than being stripped of their features this time around, however, they will be full-featured but lower-performance versions of the high end. Combine this with DX's HLSL, nVidia's Cg, and OpenGL's own upcoming HLSL, and we will hopefully see broader adoption of programmable pipelines by software vendors. Currently the high-end parts are extremely under-utilized and their potential is as yet untapped. Over the next couple of years the thing to watch for is software which finally takes advantage of these awesomely programmable GPUs and does some really amazing graphics.
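
    To make "programmable pipeline" a bit more concrete, here's a rough C sketch of loading a per-pixel program. It uses the OpenGL 2.0-style shader entry points (glCreateShader and friends), which are essentially what OpenGL's upcoming HLSL work points at; the GLSL source and the desaturation effect are purely illustrative:

        /* Sketch only: assumes a GL context and a driver exposing the
         * GL 2.0-style shader API.  On OS X the header would be
         * <OpenGL/gl.h>; error handling is minimal. */
        #include <stdio.h>
        #include <GL/gl.h>

        static const char *frag_src =
            "uniform sampler2D tex;\n"
            "void main() {\n"
            "    vec4 c = texture2D(tex, gl_TexCoord[0].st);\n"
            "    /* per-pixel program: simple desaturation */\n"
            "    float g = dot(c.rgb, vec3(0.299, 0.587, 0.114));\n"
            "    gl_FragColor = vec4(g, g, g, c.a);\n"
            "}\n";

        GLuint load_fragment_program(void)
        {
            GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
            glShaderSource(sh, 1, &frag_src, NULL);  /* upload source      */
            glCompileShader(sh);                     /* driver compiles it */

            GLint ok = GL_FALSE;
            glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
            if (!ok) {
                char log[512];
                glGetShaderInfoLog(sh, sizeof log, NULL, log);
                fprintf(stderr, "shader error: %s\n", log);
            }

            GLuint prog = glCreateProgram();
            glAttachShader(prog, sh);
            glLinkProgram(prog);
            return prog;    /* glUseProgram(prog), then draw as usual */
        }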
  • Reply 4 of 37
    elizaeffect
    Quote:

    Originally posted by Programmer

    It's debatable whether ATI will "own the market in terms of raw power". They do at the moment, but that could change quickly. The feature set is a tricky business since the two companies tend to leapfrog each other continually. ATI was first to DX9.0 but nVidia went straight to DX9.1. The next core from ATI may very well leapfrog nVidia again. I don't expect either company to maintain a lead in either power or features.



    nVidia's mobile GPU is a relative newcomer to a scene ATI has been in for a long time. ATI's mobile 3D performance was weak until recently, but its 2D and video performance has been superior. I think ATI's latest mobile products outshine nVidia's, and given the power-hungry nature of nVidia's new GeForce FX it'll be a while before they get that technology into a mobile form. It will be a tight race there too for the foreseeable future.



    I expect ATI and nVidia to continue battling for supremacy in the GPU markets, and the rest of the would-be contenders will be sidelined in niche markets unless a big money player weighs in with a new technology or buys up one of the smaller players (like 3DLabs). Matrox is still kicking, but it seems more like a last gasp or convulsive twitching.



    As for future parts, we will see the current top-of-the-line programmable parts migrate down into the lower end of the market. Rather than being stripped of their features this time around, however, they will be full-featured but lower-performance versions of the high end. Combine this with DX's HLSL, nVidia's Cg, and OpenGL's own upcoming HLSL, and we will hopefully see broader adoption of programmable pipelines by software vendors. Currently the high-end parts are extremely under-utilized and their potential is as yet untapped. Over the next couple of years the thing to watch for is software which finally takes advantage of these awesomely programmable GPUs and does some really amazing graphics.




    Yeah, I'd say that you're right too. I interpreted the question as "3d power." I'm writing this from my Wintel, so I'm still "in the zone" in terms of gaming hardware. Honestly, to gamers, 2d is an afterthought that comes far after pushing polygons.



    I think that ATI will definitely continue to rule the mid-range mobile market (those who aren't concerned with running Doom 3, for example). Nvidia is obviously shooting for something else with its GeForce Gos. I don't doubt that ATI's years of experience in the mobile market will allow it to come back with a good answer to Nvidia, though.



    I'm betting the speed crown will continue to be held by ATI over the next few years in the desktop market, and eventually it will rule the laptops as well. Of course, they're on different product cycles, so it will appear that they are constantly "one-upping" each other, but the real shake-out is the real-world performance in games, feature changes in revisions by 3rd parties, etc.



    Longer-term, I think that Nvidia will end up being the real winner in the Doom 3 test, as their relationship with id Software is pretty close. However, they may not be there when it hits the shelves, while people who are upgrading BEFORE Doom 3's launch will have to go with ATI.
  • Reply 5 of 37
    drewprops Posts: 2,321member
    <ahem>

    I said make it "simple" you guys!

    (not as simple as the Etch-a-Sketch though)



    Talk to me in terms of applications and what they'll do for me. Is this all ONLY about games, or will I see performance increases in other applications? How would these team up with the rumored 970 chip?
  • Reply 6 of 37
    Well, I'm not that familiar with support for Macs, but given the way OS X renders windows (as textures), things will only get better as cards get more RAM and less latency (RAM and bus). The real factor will be the bus on the mobo, though: you can have as much GPU power as you like, but if the data is waiting in line, you're not going to see any performance increase. This is the sort of thing that will most likely come with a further revision of the mobo for a new processor (not necessarily first with the 970, but that will happen, as they'll obviously need a new mobo for it).
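
    To illustrate the "windows as textures" idea: each window's contents get cached in VRAM as a texture, and the GPU redraws the desktop as a pile of textured quads. Here's a rough sketch in plain OpenGL -- the Window struct and names are made up, since Apple's actual compositor code is private:

        /* Illustrative only: assumes an ortho 2D projection, blending
         * enabled, and the default GL_MODULATE texture environment. */
        #include <GL/gl.h>

        typedef struct {
            GLuint tex;        /* window contents cached as a texture */
            float  x, y, w, h; /* position and size on the desktop    */
            float  alpha;      /* per-window transparency             */
        } Window;

        void composite_window(const Window *win)
        {
            glBindTexture(GL_TEXTURE_2D, win->tex);
            glColor4f(1.0f, 1.0f, 1.0f, win->alpha);
            glBegin(GL_QUADS);   /* one textured quad per window */
            glTexCoord2f(0, 0); glVertex2f(win->x,          win->y);
            glTexCoord2f(1, 0); glVertex2f(win->x + win->w, win->y);
            glTexCoord2f(1, 1); glVertex2f(win->x + win->w, win->y + win->h);
            glTexCoord2f(0, 1); glVertex2f(win->x,          win->y + win->h);
            glEnd();
            /* Moving or fading a window just redraws this quad; the app
             * never repaints its contents, which is why more VRAM and a
             * faster bus help even outside of games. */
        }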
  • Reply 7 of 37
    programmer Posts: 3,457member
    Quote:

    Originally posted by drewprops

    <ahem>

    I said make it "simple" you guys!

    (not as simple as the Etch-a-Sketch though)



    Talk to me in terms of applications and what they'll do for me. Is this all ONLY about games, or will I see performance increases in other applications? How would these team up with the rumored 970 chip?




    Currently the GPUs are tremendously under-utilized by anything except games and 3D modelling packages (although only the latest 3D modellers even attempt to use OpenGL and thus the GPU). Frankly I don't know what to expect from applications since they've shown so little imagination in the past when it comes to using advanced graphics techniques. Part of the fault has been a lack of OS support for integrating OpenGL into the GUI, but AFAIK that has been solved by Apple and will eventually be solved by MS (i.e. Longhorn). Even then, however, I'm skeptical that your average application developer will do anything interesting with it.
  • Reply 8 of 37
    kroehl Posts: 164member
    Quote:

    Originally posted by Programmer

    Currently the GPUs are tremendously under-utilized by anything except games and 3D modelling packages (although only the latest 3D modellers even attempt to use OpenGL and thus the GPU). Frankly I don't know what to expect from applications since they've shown so little imagination in the past when it comes to using advanced graphics techniques. Part of the fault has been a lack of OS support for integrating OpenGL into the GUI, but AFAIK that has been solved by Apple and will eventually be solved by MS (i.e. Longhorn). Even then, however, I'm skeptical that your average application developer will do anything interesting with it.



    Part of the problem when it comes to 3D is that OpenGL is really only useful in real-time render modes, e.g. panning, dollying, etc. When it comes to "production" renders OpenGL is (as yet) not really useful. Not that I wouldn't want a real-time render mode supporting full raytrace, texturing, area lights or even some sort of radiosity, mind you. I'm really waiting for that day to come.
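
    (To give a feel for the gap, here's a rough C sketch of the single most basic operation in a raytracer, a ray/sphere intersection test -- no shading, no recursion, made-up names. A production render does this sort of branching arithmetic millions of times per frame, which is why it still lives on the CPU:)

        #include <math.h>

        typedef struct { double x, y, z; } Vec3;

        static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
        static Vec3   sub(Vec3 a, Vec3 b) { Vec3 r = {a.x-b.x, a.y-b.y, a.z-b.z}; return r; }

        /* Distance along the ray to the nearest hit, or -1.0 on a miss.
         * The ray direction is assumed to be normalized. */
        double ray_sphere(Vec3 origin, Vec3 dir, Vec3 center, double radius)
        {
            Vec3   oc   = sub(origin, center);
            double b    = 2.0 * dot(oc, dir);
            double c    = dot(oc, oc) - radius * radius;
            double disc = b * b - 4.0 * c;       /* quadratic discriminant  */
            if (disc < 0.0) return -1.0;         /* ray misses the sphere   */
            double t = (-b - sqrt(disc)) / 2.0;  /* nearer of the two roots */
            return (t > 0.0) ? t : -1.0;
        }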



    However, your statement that only the latest 3D modelling packages utilize OpenGL isn't entirely true. It has been around for quite some years (at least in PC-land). Apple was sorta Johnny-come-lately to OpenGL in any case, because they were hanging on to the actually far superior, open-platform QuickDraw 3D -- another innovation which got drowned by an inferior product with marketing and user base behind it. Anyone remember Betamax?



    Consequently the only programs which are able to utilize real-time 3D rendering in a meaningful way are games. For the rest of them it's just a sidenote preview mode.



    Kroehl
  • Reply 9 of 37
    programmer Posts: 3,457member
    Quote:

    Originally posted by kroehl

    Part of the problem when it comes to 3D is that OpenGL is really only useful in real-time render modes, e.g. panning, dollying, etc. When it comes to "production" renders OpenGL is (as yet) not really useful. Not that I wouldn't want a real-time render mode supporting full raytrace, texturing, area lights or even some sort of radiosity, mind you. I'm really waiting for that day to come.



    However, your statement that only the latest 3D modelling packages utilize OpenGL isn't entirely true. It has been around for quite some years (at least in PC-land). Apple was sorta Johnny-come-lately to OpenGL in any case, because they were hanging on to the actually far superior, open-platform QuickDraw 3D -- another innovation which got drowned by an inferior product with marketing and user base behind it. Anyone remember Betamax?



    Consequently the only programs which are able to utilize real-time 3D rendering in a meaningful way are games. For the rest of them it's just a sidenote preview mode.




    Yes, I was only referring to Mac 3D modellers. Also, the original question was what we will see the GPUs used for, so there isn't much point in talking about raytracing, etc., which the GPUs cannot do (yet). That is a very small market as well, whereas the really interesting uses for GPUs have yet to be discovered (aside from games). The general user base is never going to get into 3D modelling, but that doesn't mean there aren't interesting things that 3D hardware can do for them -- the real issue is whether non-3D application designers will have the expertise and inventiveness to do it.
  • Reply 10 of 37
    onlooker Posts: 5,252member
    Quote:

    Originally posted by elizaeffect

    ATI will own the market in terms of raw power, while Nvidia will have the most DirectX- and OpenGL-compatible features. They're already moving in their separate directions and creating their own markets. Nvidia has long held onto the OEM (mobile) market, while ATI's starting to break in with its Radeon series for laptops, but those are going to need some serious revisions before they're even considered "high-end."







    I don't see ATI owning anything. It's been all Nvidia for the past year until lately when ATI claimed a lead for a while, but now it looks like Nvidia OWNZ again. Read This!
  • Reply 11 of 37
    lemon bon bon Posts: 2,383member
    Ouch. ATI's Mobility looks on top... but the 5900 from Nvid' brushes the 9800 Pro aside like a child in some of the benches.



    Incredible. I can't wait until we have graphics cards that can help with LW rendering...



    Wait till the autumn, folks! You'll have a card from ATI or Nvidia that will be able to throw Doom III around! Golly. Doom III looks like it's going to be incredible. It makes the latest Nvidia card huff and puff. No more 300 fps à la Quake III, eh? We're all equal, Mac and PC, on the next ground floor up?



    (I'm feeling confident of the 970's gaming powers if the Dooby benches are anything to go by...)



    Lemon Bon Bon
  • Reply 12 of 37
    programmer Posts: 3,457member
    Quote:

    Originally posted by Lemon Bon Bon

    Ouch. ATI's Mobility looks on top... but the 5900 from Nvid' brushes the 9800 Pro aside like a child in some of the benches.



    While eloquent, I don't agree with the sentiment. Sure the new nVidia is a bit faster (especially at really high resolutions, where fill rate is king), but it's hardly equal to the sound thrashing that ATI gave the GeForce4 last year. The two are roughly on par and I expect them to stay competitive with each other, barring some serious development or marketing screw-up. We'll see what the next cores look like, but there isn't any particular reason to think that they won't both grow at roughly the same rate. They use about the same fab technology and they are both limited by the same RAM chip technologies.
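
    To put "fill rate is king" in numbers, here's the back-of-envelope arithmetic (all figures illustrative):

        #include <stdio.h>

        int main(void)
        {
            long width = 1600, height = 1200; /* a "really high" resolution */
            int  fps = 60;                    /* target frame rate          */
            int  overdraw = 3;                /* each pixel drawn ~3x/frame */

            long long pix_per_sec = (long long)width * height * fps * overdraw;
            printf("Required fill rate: %.1f Mpixels/s\n", pix_per_sec / 1e6);
            /* ~345.6 Mpixels/s -- and anti-aliasing or multi-texturing
             * multiplies the cost further.  The requirement scales directly
             * with pixel count, which is why high resolutions reward raw
             * fill rate more than geometry power. */
            return 0;
        }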



    Competition is good.
  • Reply 13 of 37
    bunge Posts: 7,329member
    Quote:

    Originally posted by Programmer



    Sure the new nVidia is a bit faster....




    I believe these benchmarks were supplied by NVIDIA....
  • Reply 14 of 37
    hmurchison Posts: 12,419member
    Quote:

    Originally posted by onlooker

    I don't see ATI owning anything. It's been all Nvidia for the past year until lately when ATI claimed a lead for a while, but now it looks like Nvidia OWNZ again. Read This!



    Don't believe the hype. The ATI 9800 is a mere tweak. These two will leapfrog each other, but I don't expect to see nVidia break out to a clear-cut lead anytime soon. The 5900 is nice but it's more expensive and louder. I'm sure ATI has some neat stuff coming from their next core.
  • Reply 15 of 37
    onlooker Posts: 5,252member
    Quote:

    Originally posted by hmurchison

    Don't believe the hype. The ATI 9800 is a mere tweak. These two will leapfrog each other, but I don't expect to see nVidia break out to a clear-cut lead anytime soon. The 5900 is nice but it's more expensive and louder. I'm sure ATI has some neat stuff coming from their next core.



    But for whom? ATI does not make Mac cards the same way nVidia does. nVidia's processors are Mac- and PC-compatible right off the press. ATI has to adapt their cards to work on a Mac, and after that we have to deal with their drivers. By the time we see ATI cards come to the Mac there are already two newer ATI cards released for the PC. Then what? We wait for drivers that suck. Pullleeease.
  • Reply 16 of 37
    applenut Posts: 5,768member
    Quote:

    Originally posted by onlooker

    But for whom? ATI does not make Mac cards the same way nVidia does. nVidia's processors are Mac- and PC-compatible right off the press. ATI has to adapt their cards to work on a Mac, and after that we have to deal with their drivers. By the time we see ATI cards come to the Mac there are already two newer ATI cards released for the PC. Then what? We wait for drivers that suck. Pullleeease.



    1.) Not sure what you're talking about, since the only nVidia cards for the Mac are from Apple. ATI's chips are also Mac/PC, just like nVidia's. I don't see nVidia (or any OEM) releasing nVidia cards for the Mac at all, never mind in a timely manner.



    2.) ATI makes Mac-specific cards with drivers. Same chips. No adapting. Different strategy, I guess. But Apple can use whatever they want; they don't have to wait for an ATI retail card. The Apple graphics card market is minuscule. I mean, how many Mac users are there, how many of those have an AGP slot, and of those how many are willing to spend 250-400 dollars to replace the GeForce2 MX, GeForce4 MX, GeForce4 Ti, or Radeon already in there? Not many.



    3.) ATI drivers have proven to be better on the Mac. Obviously the case has traditionally been different on the PC side.
  • Reply 17 of 37
    programmer Posts: 3,457member
    Quote:

    Originally posted by onlooker

    But for whom? ATI does not make Mac cards the same way nVidia does. nVidia's processors are Mac- and PC-compatible right off the press. ATI has to adapt their cards to work on a Mac, and after that we have to deal with their drivers. By the time we see ATI cards come to the Mac there are already two newer ATI cards released for the PC. Then what? We wait for drivers that suck. Pullleeease.



    You're right -- nVidia doesn't make Mac cards at all. Apple uses nVidia chips to make Mac cards. The ATI chips are just as Mac-compatible as the nVidia chips, and ATI makes Mac retail boards. ATI's drivers have traditionally been problematic, but I was told that the new 9700 drivers were a complete rewrite (hence the delay) and are substantially better. No doubt a bit of time to mature is required as well, but frankly I've also heard plenty of complaints about nVidia drivers. Hopefully Apple reinvigorating the PowerMac line will inspire ATI to improve their retail lineup (a low-end slotted Mac would help that greatly as well).
  • Reply 18 of 37
    drewprops Posts: 2,321member
    And so, for tricking out my machine for Photoshop and Illustrator I've no worries about the GPU, right? My main concerns there are fast CPU and big pipes on the Mobo....si?



    If I bump up to doing a lot of Desktop Movie Production then a GPU might be more important to me? Yes?



    Ta.
  • Reply 19 of 37
    RAM.
  • Reply 20 of 37
    programmer Posts: 3,457member
    Quote:

    Originally posted by drewprops

    And so, for tricking out my machine for Photoshop and Illustrator I've no worries about the GPU, right? My main concerns there are fast CPU and big pipes on the Mobo....si?



    If I bump up to doing a lot of Desktop Movie Production then a GPU might be more important to me? Yes?




    As mentioned above, RAM is going to be a big win for you. A good GPU w/ Quartz Extreme will speed up your system overall, but you don't need a killer top-end video card. Also, while Adobe currently doesn't use the GPU at all in Photoshop, I'd be surprised if somebody doesn't try it. The new programmable vertex and fragment units can run fairly large programs on 32-bit-per-channel pixels, and this makes them much more useful for this kind of work. So one day soon the GPU might become very useful in a Photoshop-like app, but until that day I wouldn't blow a lot of money on one because they are still getting better really fast.
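
    As a sketch of what that might look like, here's a Photoshop-style brightness/contrast adjustment written as a fragment program. The GLSL source is illustrative, not anybody's shipping code, and it would be loaded with the same glCreateShader path sketched earlier in the thread:

        /* Illustrative fragment program: the per-pixel math runs on the
         * GPU in a single pass, instead of in a CPU loop over every
         * pixel of the image. */
        static const char *adjust_src =
            "uniform sampler2D image;\n"
            "uniform float brightness;   /* -1.0 .. 1.0 */\n"
            "uniform float contrast;     /*  0.0 .. 2.0 */\n"
            "void main() {\n"
            "    vec4 c = texture2D(image, gl_TexCoord[0].st);\n"
            "    c.rgb = (c.rgb - 0.5) * contrast + 0.5 + brightness;\n"
            "    gl_FragColor = clamp(c, 0.0, 1.0);\n"
            "}\n";

    Applying it is just a matter of drawing one textured quad over the image, so the per-pixel loop disappears from the CPU entirely.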