Future of Graphics Processors? (Make it simple)


Comments

  • Reply 21 of 37
    eupfhoriaeupfhoria Posts: 257member
    Quote:

    Originally posted by drewprops

    And so, for tricking out my machine for Photoshop and Illustrator I've no worries about the GPU, right? My main concerns there are fast CPU and big pipes on the Mobo....si?



    If I bump up to doing a lot of Desktop Movie Production then a GPU might be more important to me? Yes?



    Ta.




    If by desktop video you mean iMovie, then I am pretty sure your GPU does not matter.



    If by desktop video you mean Final Cut Pro, then you might want to look into Matrox video cards. They are finally porting their RTMac drivers to OS X, and they lowered the cost of their chip to around $600. It will save you a massive amount of render time.
  • Reply 22 of 37
    dfilerdfiler Posts: 3,420member
    Quote:

    Originally posted by drewprops

    Could some of you GP mavens postulate as to where graphics processors are headed in the next year or two, but do it in simple terms with lots of real-world examples?



    Most posters seem to approach this topic from a "how will hardware change" standpoint. In the past, this is what I would have been most interested in. Yet, ever since the original Voodoo boards, it seems that software utilization of GPUs has been lagging farther and farther behind hardware potential. A well-written, promoted, and adopted API stands to transform our user experience far more than any speed gain or new hardware capability.



    Thus, I will approach this from the opposite side of the equation. How will we make use of the currently untapped power in consumer level GPUs?



    Quartz Extreme is one of the coolest things to happen to the computer industry since... well, since OS X.



    Even better, QE has only scratched the surface. I envision dynamic, colored, and directional lighting at the window-server level. Granted, additional eye candy could be ill-used. But hopefully Apple will think long and hard about this and deliver subtle yet informative cues via dynamic lighting.



    Aqua's raised and throbbing default buttons could actually be subtle and diffused sources of light. Text fields, buttons, scroll thumbs, window edges, and other screen elements could glint slightly with reflected light.



    Reflectivity will become just one more tool in the GUI toolbox. Just as textured areas were once used to indicate regions from which windows could be dragged, reflectivity could provide additional subconscious cues.



    Drag and drop is one such domain in which additional GUI hints would be useful. Currently, it is difficult to discern just where a particular object can be dropped. Perhaps shimmering but diffused lighting could be used to suggest valid drop regions?
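


    To make the idea concrete, here's a rough sketch of how such a cue could be driven. This is just plain C with stock OpenGL blending -- every coordinate, color, and timing value is invented purely for illustration, and it assumes a GL context already exists:

        #include <math.h>
        #include <OpenGL/gl.h>   /* <GL/gl.h> on other platforms */

        /* Draw a soft, pulsing highlight over a valid drop region.
         * 't' is elapsed time in seconds; the rectangle is hypothetical. */
        void draw_drop_hint(float t, float x, float y, float w, float h)
        {
            /* Pulse the alpha between roughly 0.1 and 0.3 so the cue stays subtle. */
            float alpha = 0.2f + 0.1f * sinf(t * 3.0f);

            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            glColor4f(1.0f, 1.0f, 0.8f, alpha);   /* warm, diffused tint */
            glBegin(GL_QUADS);
                glVertex2f(x,     y);
                glVertex2f(x + w, y);
                glVertex2f(x + w, y + h);
                glVertex2f(x,     y + h);
            glEnd();

            glDisable(GL_BLEND);
        }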



    These ideas may seem trivial or somewhat useless, but I think they are inevitable. Excess GPU power is bringing many forms of real-world visual feedback to graphical user interfaces. Status lights on dashboards (or just about any electronic device) are immensely useful. It is possible to notice real-world warning lights even when they are not in a direct line of sight. The same could be true on our virtual desktops. Status indicators in obscured windows would no longer go unheeded...



    And yes... Microsoft will probably find a way to use the technology in a tacky manner. Oooo! Internally lit Fisher-Price.



    Dynamic lighting, shadows, reflections, depth-of-field, shimmering heat waves, glowing objects, iridescent objects...



    Let's hope Apple uses Panther and the 970 to tactfully but completely blow everyone's minds.
  • Reply 23 of 37
    matsumatsu Posts: 6,558member
    This actually came up in a seminar I took two years ago. Apple once had a prototype for a completely 3D UI, navigated in a representation of 3D space. Think Doom OS. Well, the pic didn't look like that, but you get the idear. Testers found it very confusing and intrusive. 2D definitely makes a better desktop. But this is not to say a few well-chosen 3D effects couldn't make a better desktop out of our 2D windows. For instance: our windows seem to float, but they don't really float. They ooze and melt, or become transparent, but they still use only the x and y axes to do this. What if windows also used the z axis? I imagine a pane flipping, like flipping a sheet over, and on the back could be useful info about the document you've just opened, mebbe a preference panel, yadda yadda.



    Or, howzabout flipping the whole desktop. In effect having two desktops, one facing you and the other facing away. You grab the edge of the screen and pull, and the whole thing rotates on the z axis to show another desktop waiting behind. This might have Safari, Sherlock, and mail/contact apps going, while on the other face, Photoshop or Excel or some other 'work' is open. Rather than having windows all over, or elements zooming into and out of the Dock as you switch, you have two desktops at the ready, each less cluttered. Or you can arrange pairings that make sense to you. Like launching calendar and having contacts automatically open behind it. Search one side, search the other, kinda like paper.



    I dunnaknow how any of it works, but I agree that writing a better way to take advantage of the power we have will be more significant to the user experience.



    On the hardware side though, I see the GPU rising in importance as it becomes more like a CPU, that is, programmable. The few things it does, it does VERY VERY quickly, so a little creative/judicious application of programming could lead to an approach where the CPU does less calculating per se, but rather prepares things into the sorta chunks the GPU wants and then hands them off. Mebbe the GPU can even assist some calculations not related to "graphics"?
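


    Something like this is what I'm picturing -- only a toy sketch, in C with ordinary OpenGL 1.1 vertex arrays, with the triangle data made up and a GL context assumed. The CPU's job becomes laying the data out the way the GPU wants it, then handing the whole batch off in one call:

        #include <OpenGL/gl.h>   /* <GL/gl.h> on other platforms */

        /* CPU-side work: pack the geometry into the flat layout the GPU
         * wants (x,y,z triples), then hand the whole batch over at once. */
        static const GLfloat triangle_verts[] = {
             0.0f,  0.5f, 0.0f,
            -0.5f, -0.5f, 0.0f,
             0.5f, -0.5f, 0.0f,
        };

        void submit_batch(void)
        {
            glEnableClientState(GL_VERTEX_ARRAY);
            glVertexPointer(3, GL_FLOAT, 0, triangle_verts);

            /* One call; the driver and GPU take it from here. */
            glDrawArrays(GL_TRIANGLES, 0, 3);

            glDisableClientState(GL_VERTEX_ARRAY);
        }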



    This would take a lot of heat off Apple to have competitive CPUs -- if the GPU could do more of the "work", then it's not so important to match up Hz for Hz or flop for flop, or whatever is in vogue ATM. Less pressure to have a single mainstream CPU, so long as your platform has a mainstream graphics port and a CPU that can talk to it efficiently.



    I mean, just look at an Xbox. A 733MHz P3 (pseudo-Celeron?) isn't exactly a tonking fast chip, but it doesn't matter; it puts out some darned impressive visuals thanks to its ability to get in there and really use that GF3+ (NV22/2A or something like that, between a GF3 Ti and a GF4, I imagine). I guess this is a tad easier on an Xbox since it's closed and the programmers know exactly what they're writing for. But if we had the APIs that let the GPU shoulder more of the burden, and do it with a greater degree of flexibility -- not just in terms of calculations, but also in terms of supporting a set that runs across multiple GPUs of a given baseline spec -- then GPUs could play a much more prominent role in overall performance.



    I imagine a QE just for encoding and decoding video, for example, or a QE routine that specifically assisted Photoshop functions, yadda yadda. Not that each app would do that, but that each dev, knowing that said API was at the ready at the OS level, could write to take advantage of it.



    Does that make sense?
  • Reply 24 of 37
    Quote:

    Originally posted by Matsu



    I imagine a QE just for encoding and decoding video, for example, or a QE routine that specifically assisted Photoshop functions, yadda yadda. Not that each app would do that, but that each dev, knowing that said API was at the ready at the OS level, could write to take advantage of it.



    Hey Matsu, I thought that is what OpenGL was supposed to be??
  • Reply 25 of 37
    matsumatsu Posts: 6,558member
    Like I said, iDunno how it works, just thinking out loud. I think that was what OpenGL was about, but I dunno; it seems, however, to be underutilized. Not that I know whether or not it is, I just hear people complaining about it all the time, that they need some high-end card or another to really speed things up. I was talking more in terms of an appreciable boost courtesy of new revs -- not that it isn't done to some degree or another already, just that it would be done to an even greater degree as the months go by. OpenGL wasn't used to draw the UI elements, for instance, but they found a way to do that, so who knows, mebbe they can find a way to use OpenGL more extensively and flexibly as time goes by?
  • Reply 26 of 37
    amorphamorph Posts: 7,112member
    Quote:

    Originally posted by Mike Eggleston

    Hey Matsu, I thought that is what OpenGL was supposed to be??



    OpenGL to date has not provided much in the way of programmable functionality because there's been no hardware support for it. All it is is a (largely) platform-agnostic interface to a (logical) GPU. It doesn't do magic.



    The last generation of cards allowed for small programmable filters that couldn't use any conditional branches (i.e., if this pixel is red, do this; otherwise, do that). The latest generation seems to allow for longer filters that can have branch logic, which makes them far more general purpose.



    You still want to offload work to the GPU that will end up going directly to an output device, just because every write back across the AGP (or whichever) bus is bandwidth that could have been used to feed the GPU. But in Apple's case, this will allow them to feed a lot of very basic information to the GPU (= less traffic across the AGP bus) and have the GPU spit out a very slick looking interface. In effect, the GPU becomes a powerful graphical terminal served over a massively high-bandwidth connection by a server (the CPU/main RAM). This, incidentally, is a variation on the way UNIX has always done user interfaces of all descriptions.
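


    A rough way to picture that relationship -- just a sketch using plain old OpenGL display lists rather than anything Quartz-specific, with the function names invented -- is that the drawing gets captured once on the driver/GPU side, and after that each frame costs the CPU a single, tiny command:

        #include <OpenGL/gl.h>   /* <GL/gl.h> on other platforms */

        static GLuint window_chrome;   /* handle to the cached drawing */

        void build_chrome(void)
        {
            /* Record the expensive drawing once, on the driver/GPU side. */
            window_chrome = glGenLists(1);
            glNewList(window_chrome, GL_COMPILE);
                /* ...all the glBegin/glVertex calls that draw a window frame... */
            glEndList();
        }

        void draw_frame(void)
        {
            /* Each frame, only this one small command crosses the bus;
             * the heavy lifting was captured in build_chrome(). */
            glCallList(window_chrome);
        }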



    If Apple can offload all of Quartz - or even all of the grunt work of Quartz - to the GPU, they'll have solved an efficiency problem that's dogged Macs since the very beginning, and freed the rest of the machine to devote the vast bulk of its bandwidth and power to more interesting tasks than metal interfaces and transparent windows. And, since the GPUs will themselves be almost powerful enough to be general-purpose in their own right (and, as now, far more powerful than a CPU at certain specialized tasks), Apple will be able to do this and still supply a highly sophisticated interface.



    Of course, if there are other ways to use the GPU as an auxiliary processor, Apple should build them into their libraries and frameworks behind the scenes. That'll be about the only way to get application developers to avail themselves of the GPU.
  • Reply 27 of 37
    programmerprogrammer Posts: 3,458member
    Quote:

    Originally posted by Amorph

    OpenGL to date has not provided much in the way of programmable functionality because there's been no hardware support for it. All it is is a (largely) platform-agnostic interface to a (logical) GPU. It doesn't do magic.



    The last generation of cards allowed for small programmable filters that couldn't use any conditional branches (i.e., if this pixel is red, do this; otherwise, do that). The latest generation seems to allow for longer filters that can have branch logic, which makes them far more general purpose.





    Even without branches, many amazing things can be done. Even the earliest vertex shaders support predication -- i.e., compute both and choose one. You can also just divide your draws into multiple groups, some using this shader and some using that shader. The main restriction is the per-vertex model, and the very limited fragment shaders in earlier hardware. The R300+ and NV30+ can do some amazing things per pixel, and even the R200 is underappreciated for its available flexibility (the earlier nVidia GPUs are just too weird at the fragment level).
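


    For anyone wondering what "compute both and choose one" looks like, here it is written as ordinary C -- just an analogy, since on the GPU it's a per-vertex or per-fragment select rather than an if:

        /* Branch-free select: evaluate both candidate results up front,
         * then pick one with a 0-or-1 predicate.  This is the same trick
         * predication performs per vertex or per pixel. */
        float select_no_branch(float pred, float if_true, float if_false)
        {
            /* pred is expected to be exactly 0.0f or 1.0f */
            return pred * if_true + (1.0f - pred) * if_false;
        }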



    The stuff Matsu is talking about above is very doable on the programmable graphics hardware (that's all ATIs above the 8000 series and all nVidia's beyond the GeForce2, except the 4MX). With some effort even the earlier boards can do quite a few tricks, but they typically don't have the fill rate to burn in order to accomplish it seamlessly enough to process a whole GUI's high-res display. Dynamic lighting, real specular highlights, glows, etc. Right now is probably a really fun time to be a UI programmer at Apple (well... maybe not right now, since they are finalizing Panther).
  • Reply 28 of 37
    thai moofthai moof Posts: 76member
    This is an interesting thread, everyone. When do you experts think these advances will be useful in an everyday sense? One year? Five years? When can I have my own 'Minority Report' computer interface? 8)
  • Reply 29 of 37
    dfilerdfiler Posts: 3,420member
    This type of prediction is exceedingly difficult to make. Progress can be affected by just a few individuals and their interpersonal skills at corporate meetings. Many of the advances that make OS X so amazing have been sitting around for a decade or two. NeXT made an amazing OS and accompanying development APIs, yet it is only now that these technologies are finding their way onto most people's desks.



    What we are talking about here is the application of existing technologies rather than pure science or research into uncharted territory. Good system architects and hard-core code monkeys wouldn't have to do anything too revolutionary to implement the next-generation GUI.



    Of more significance is how to apply the technology from these "known" domains in a manner that is actually beneficial to users. Determining what comprises a "good" application of visual effects is a subjective process. Or, rather, it is difficult to explicitly prove the relative value of various GUI designs.



    For example: warning dialogs could contain a glowing red badge, subtly casting light on other parts of the interface even when obscured by other windows. This could be implemented via the Cocoa APIs such that application programmers need not know anything about vertex lighting. However, the utility of such a feature is impossible to accurately assess until it is possibly too late. Will developers make good use of the capability, or will they abuse and overuse it? It's almost like trying to predict cultural development or drift.
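


    To be clear about what I mean by "via the Cocoa APIs": something shaped roughly like the sketch below -- which is entirely made up, no such call or type exists -- would let an application request the cue without ever touching vertex lighting:

        /* Purely hypothetical sketch -- not a real Cocoa or Quartz API.
         * The application only describes the cue it wants; the framework
         * would do the actual lighting work at the window-server/GPU level. */
        typedef struct {
            float r, g, b;     /* glow color */
            float intensity;   /* 0.0 .. 1.0 */
            float pulse_hz;    /* how fast the badge "breathes" */
        } WarningGlow;

        /* Imaginary framework call: attach a glowing badge to a window. */
        void AttachWarningGlow(void *window, WarningGlow glow)
        {
            /* (stub -- a real implementation would live inside the framework) */
            (void)window;
            (void)glow;
        }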



    My prediction?



    Keep an eye on the utilization of transparency at the public-API and window-server level. If companies like Adobe start making use of Quartz and Quartz Extreme instead of rolling their own solutions, Apple will have successfully introduced GPU-accelerated GUI APIs to the field of application development. If this happens, then it would only be logical for them to continue extending Quartz Extreme. However, if QE's initial functionality isn't readily embraced by large developers, it could be left to languish in the same manner that QuickDraw GX was abandoned.



    I suspect that FCP and similar software from Apple will be the first to truly make use of Quartz Extreme. The video production market might be enough to single-handedly spur further development.



    Finally, in wild speculation mode: I predict that Panther's Dock will wow us as much as the original Dock did. I'm envisioning something even more mind-warping than the genie effect...
  • Reply 30 of 37
    chychchych Posts: 860member
    ...I think graphics cards of the future will use 2 AGP slots and 3 PCI slots! They will also require their own 600W power supply!



    ...Seriously, does anyone think Apple will put nVidia's offerings in their towers at this time? And given their revisions... I think we'll be out of the latest and greatest (directly from Apple anyway) for a while.
  • Reply 31 of 37
    programmerprogrammer Posts: 3,458member
    Quote:

    Originally posted by chych

    ...I think graphics cards of the future will use 2 AGP slots and 3 PCI slots! They will also require their own 600W power supply!



    ...Seriously, does anyone think Apple will put nVidia's offerings in their towers at this time? And given their revisions... I think we'll be out of the latest and greatest (directly from Apple anyway) for a while.




    I could see Apple positioning the AGP slot far enough away from the other slots, and ensuring good airflow around it. This would mitigate some of the problems with nVidia's offerings, and in their latest version it sounds like they have addressed the noise issues.
  • Reply 32 of 37
    whisperwhisper Posts: 735member
    Quote:

    Originally posted by Programmer

    I could see Apple positioning the AGP slot far enough away from the other slots, and ensuring good airflow around it. This would mitigate some of the problems with nVidia's offerings, and in their latest version it sounds like they have addressed the noise issues.



    Can they do that? I think there are some cards that really do require a PCI slot and therefore wouldn't fit if the AGP slot wasn't where it is now. AGP 8x supports more than one slot anyway. Use the slot farthest from the PCI slots if you need the extra room, and the one closest if you've got one of those weird AGP/PCI cards.
  • Reply 33 of 37
    programmerprogrammer Posts: 3,458member
    Quote:

    Originally posted by Whisper

    Can they do that? I think there are some cards that really do require a PCI slot and therefore wouldn't fit if the AGP slot wasn't where it is now. AGP 8x supports more than one slot anyway. Use the slot farthest from the PCI slots if you need the extra room, and the one closest if you've got one of those weird AGP/PCI cards.



    I'm not aware of any cards that need both an AGP slot and a PCI slot. Even if there were such a monster, it would probably use a ribbon-cable connector between the boards for each. I doubt it though -- AGP should give a card all the bandwidth it's going to get, since AGP and PCI usually compete for bandwidth in the chipset.
  • Reply 34 of 37
    audiopollutionaudiopollution Posts: 3,226member
    Quote:

    Originally posted by Programmer

    I'm not aware of any cards that need both an AGP slot and a PCI slot. Even if there were such a monster, it would probably use a ribbon-cable connector between the boards for each. I doubt it though -- AGP should give a card all the bandwidth it's going to get, since AGP and PCI usually compete for bandwidth in the chipset.



    The GeForce FX 5800 Ultra requires a PCI slot, not for the PCI connection, but rather for the actual slot in the case. That is where the cooling fan for the GPU exhausts.
  • Reply 35 of 37
    macroninmacronin Posts: 1,174member
    Quote:

    Originally posted by audiopollution

    The GeForce FX 5800 Ultra requires a PCI slot, not for the PCI connection, but rather for the actual slot in the case. That is where the cooling fan for the GPU exhausts.



    Moot point though, since I doubt the GeForce FX 5800 Ultra will ever be offered as BTO... Chances are we might never see a Mac version of this card...



    The 5900 though, that might be available as BTO when the 970 debuts...



    ;^p
  • Reply 36 of 37
    amorphamorph Posts: 7,112member
    Quote:

    Originally posted by audiopollution

    The GeForce FX 5800 Ultra requires a PCI slot, not for the PCI connection, but rather for the actual slot in the case. That is where the cooling fan for the GPU exhausts.



    Right, and so that fits into what Programmer was thinking of: a big AGP slot with a blank, nonfunctional space above it the size of a PCI slot, so that the machine can accommodate cards like that without actually using up a real, functional slot.



    I think it's a great idea.
  • Reply 37 of 37
    keyboardf12keyboardf12 Posts: 1,379member
    Didn't the rumors mention 6 slots on the new mobo?



    If true, maybe 5 real, 1 non-functional?