10.6.4 degrades nVidia graphics performance significantly

Posted in macOS · edited January 2014
In a totally unsurprising turn of events, the always completely fair and unbiased AppleInsider staff has accidentally forgotten to report on Apple's latest endeavor into the gaming scene on its Mac lineup: 10.6.4 degrades the performance of nVidia-based graphics cards:



http://forums.steampowered.com/forum....php?t=1314406



Preliminary tests by users running 10.6.3 and 10.6.4 on the same machine show a frame-rate drop of over 30 percent in games like Half-Life 2.
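
For scale, here's the arithmetic behind a figure like that, with hypothetical before/after numbers of the kind benchmarkers were posting (not actual results from the Steam thread):

    // fps_drop.cpp -- percent drop between two frame-rate measurements,
    // as benchmarkers computed it. The sample values are hypothetical
    // stand-ins, not actual results from the Steam thread.
    #include <cstdio>

    int main() {
        double fps_before = 60.0;  // e.g. Half-Life 2 under 10.6.3
        double fps_after  = 40.0;  // same machine after the 10.6.4 update
        double drop = (fps_before - fps_after) / fps_before * 100.0;
        std::printf("FPS drop: %.1f%%\n", drop);  // prints: FPS drop: 33.3%
        return 0;
    }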



Many readers of AppleInsider have reacted with spontaneous outbursts of manic behavior upon being made aware of this news, ranging from violence to complete nervous breakdowns, as it is believed that the human mind can only take a certain amount of doublethink before it snaps.

Comments

  • Reply 1 of 20
    Marvin · Posts: 15,309 · moderator
    Quote:
    Originally Posted by Erunno


    In a totally unsurprising turn of events, the always completely fair and unbiased AppleInsider staff has accidentally forgotten to report on Apple's latest endeavor into the gaming scene on its Mac lineup: 10.6.4 degrades the performance of nVidia-based graphics cards:



    Seems to be affecting some and not others:



    http://forums.steampowered.com/forum....php?t=1316523

    http://forums.steampowered.com/forum....php?t=1311095

    http://forums.steampowered.com/forum....php?t=1315523



    Apple definitely need some quality control on the drivers though. They only support 5-10 chipsets versus the hundreds, if not thousands, that Microsoft supports. With so few hardware options, they should be able to ensure that every single Mac user gets good GPU drivers.



    Apple also aren't in the habit of issuing graphics driver updates separate from an OS update, so users may have to wait until 10.6.5. The same sort of thing happened with 10.4.7, I think it was, and machines that shipped with that OS were basically screwed for 2-3 months.



    I imagine the developers who tested 10.6.4 somehow skipped the graphics drivers when declaring it fit for release, even though they were specifically asked to test them. These drivers were also rumored to improve performance significantly, but they haven't. They just need to play through any of the Valve games to see that there are issues to fix.
  • Reply 2 of 20
    erunno · Posts: 225 · member
    It's actually doubly embarrassing, as Apple knew that Valve was going to release Steam around the time 10.6.4 was due. Valve has been doing quite a lot of advertising for the Mac, praising Apple's commitment to making OS X a first-rate gaming platform. I consider it really shitty behavior that they pretty much stabbed Valve in the back with this. As you rightfully mentioned, Apple has only a few configurations to test. A company with that many resources should be able to establish the necessary test infrastructure, even if the focus is currently on the iPhone/iPad and iOS.
  • Reply 3 of 20
    spoton · Posts: 645 · member
    Apple is a moving target; expecting their OS to remain stable long enough for third-party software to build a lasting, decent product is unrealistic.



    Steve likely heard of Steam and decided to kill it.



    If you want third-party software to keep working, you have to hold off on OS upgrades, and then you live with the poor security of earlier OS versions.



    Heck, this time next year all Macs could be sporting A4 processors, then what?



    "Roads? Where we're going, we don't need roads." - Dr. Emmett Brown in Back to the Future.
  • Reply 4 of 20
    erunno · Posts: 225 · member
    Quote:
    Originally Posted by SpotOn


    Heck, this time next year all Macs could be sporting A4 processors, then what?



    Well, at least Windows 7 is supported on all Intel Macs. The PowerPC people were in a far worse situation: they had only Linux as an alternative. Out of the frying pan, into the fire.
  • Reply 5 of 20
    1337_5l4xx0r · Posts: 1,558 · member
    Last I checked, Linux's drivers beat Mac drivers for the same hardware by any metric. Which is either painful or laughable, depending on how you look at it.

  • Reply 6 of 20
    1337_5l4xx0r · Posts: 1,558 · member
    I'd REALLY like to see Apple invest a LOT more effort into BOTH their OpenGL implementation and graphics drivers. And I'd like to see them use the latest and greatest GPUs and VRAM configurations before they hit the mainstream.



    I'd like to see them on the bleeding edge of OpenGL specs, adopting them rapidly like HTML5 in Safari.



    I'd like to see OS X as the go-to OS for 3D creative work.
  • Reply 7 of 20
    hiro · Posts: 2,663 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R


    I'd like to see OS X as the go-to OS for 3D creative work.



    It is. And almost none of that work is rendered in realtime, meaning the card is mostly irrelevant.



    Recent updates to preview functionality in Maya may finally be changing that. The new preview mode makes a near photo-real frame in a few seconds rather than over an hour, FAR better quality than the crap it was churning out for previews previously. But cutting that time down much more runs into diminishing returns; as long as previews run in less than a minute or so, the artists will be absolutely frakking ecstatic. High-quality final renders will always be done on special-purpose dedicated Linux clusters, at least until there is a pervasive reason to change that. I don't see OS X ever moving into that particular nano-niche; it's much more effective to sell to profitable creative professionals' seats than to furnish free OSes and razor-thin-margin render boxes to the render farm folks.
  • Reply 8 of 20
    erunno · Posts: 225 · member
    Did Apple actually issue any statement concerning this problem?
  • Reply 9 of 20
    1337_5l4xx0r · Posts: 1,558 · member
    Quote:
    Originally Posted by Hiro


    It is. And almost none of that work is rendered in realtime, meaning the card is mostly irrelevant.



    Recent updates to preview functionality in Maya may finally be changing that. The new preview mode makes a near photo-real frame in a few seconds rather than over an hour, FAR better quality than the crap it was churning out for previews previously. But cutting that time down much more runs into diminishing returns; as long as previews run in less than a minute or so, the artists will be absolutely frakking ecstatic. High-quality final renders will always be done on special-purpose dedicated Linux clusters, at least until there is a pervasive reason to change that. I don't see OS X ever moving into that particular nano-niche; it's much more effective to sell to profitable creative professionals' seats than to furnish free OSes and razor-thin-margin render boxes to the render farm folks.



    No, and no. There's modeling, texturing, and these days lighting, all done in realtime. Not to mention 100+ million-polygon sculpting for normal maps and displacement maps. I'm not talking post-production here.



    Hiro, I'm not sure what you are talking about. Feeble graphics cards + feeble OpenGL performance = feeble performance in Mudbox, Maya, etc etc, which use said graphics cards and OpenGL.



    Preview functionality in Maya? I'm not sure what you are referring to. Viewport 2.0? That requires good graphics cards, lots of VRAM and a decent OpenGL implementation. It is realtime. Or rather, it is supposed to be realtime.
  • Reply 10 of 20
    hiro · Posts: 2,663 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R


    No, and no. There's modeling, texturing, and these days lighting, all done in realtime. Not to mention 100+ million-polygon sculpting for normal maps and displacement maps. I'm not talking post-production here.



    Hiro, I'm not sure what you are talking about. Feeble graphics cards + feeble OpenGL performance = feeble performance in Mudbox, Maya, etc etc, which use said graphics cards and OpenGL.



    Preview functionality in Maya? I'm not sure what you are referring to. Viewport 2.0? That requires good graphics cards, lots of VRAM and a decent OpenGL implementation. It is realtime. Or rather, it is supposed to be realtime.



    You object and you don't even know what you want to talk about?



    Dude, we can't even render 100 million polygons in an hour yet unless they are flat-shaded! Go back to school, let the rest of us write the software you fantasize is already here!
  • Reply 11 of 20
    aquatic · Posts: 5,602 · member
    Quote:
    Originally Posted by Hiro


    You object and you don't even know what you want to talk about?



    Dude, we can't even render 100 million polygons in an hour yet unless they are flat-shaded! Go back to school, let the rest of us write the software you fantasize is already here!



    Anyone else notice that the latest Flash stutters on full-screen video? At least on my 2010 MBP with 10.6.4. Is that because of this issue, or is it Flash? I tried every single archived build of Flash 10.0 and 10.1 and it happened on all of them... even on Intel-only graphics using gfxCardStatus! It just seemed like it might be related to the issue in this thread.
  • Reply 12 of 20
    hiro · Posts: 2,663 · member
    Quote:
    Originally Posted by Aquatic


    Anyone else notice that the latest Flash stutters on full-screen video? At least on my 2010 MBP with 10.6.4. Is that because of this issue, or is it Flash? I tried every single archived build of Flash 10.0 and 10.1 and it happened on all of them... even on Intel-only graphics using gfxCardStatus! It just seemed like it might be related to the issue in this thread.



    Flash isn't GPU-accelerated on OS X yet. Talk to Adobe, although they are still crying about not liking Apple's video acceleration APIs.
  • Reply 13 of 20
    1337_5l4xx0r · Posts: 1,558 · member
    Hiro, your comment doesn't make sense to me. Perhaps address the fact that OpenGL and realtime 3D are part and parcel of creating 3D content? Do you disagree with this? Do you actually have working knowledge of Maya? Mudbox? Other 3D content creation apps?



    It appears you are arguing about a topic you don't have working knowledge of, and vehemently at that. Why?



    Quote:

    You object and you don't even know what you want to talk about?



    Dude, we can't even render 100 million polygons in an hour yet unless they are flat-shaded! Go back to school, let the rest of us write the software you fantasize is already here!



    The first sentence I don't understand. Clarify?



    Second, are you familiar with modern graphics cards? Are you familiar with Mudbox and ZBrush? And how they do the "heavy lifting" on the graphics cards for high-poly sculpting? (And yes, when sculpting and modeling, one tends to work flat-shaded.)



    Hiro, I'm not lambasting you. I'm a 3D content creator, who is speaking from personal experience. Chill on the 'dissing' and maybe actually address my previous post and take it apart point by point. I'll be glad to rebut, if you require it.
  • Reply 14 of 20
    hiro · Posts: 2,663 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R


    Hiro, your comment doesn't make sense to me. Perhaps address the fact that OpenGL and realtime 3D are part and parcel of creating 3D content? Do you disagree with this? Do you actually have working knowledge of Maya? Mudbox? Other 3D content creation apps?



    It appears you are arguing about a topic you don't have working knowledge of, and vehemently at that. Why?







    The first sentence I don't understand. Clarify?



    Second, are you familiar with modern graphics cards? Are you familiar with Mudbox and ZBrush? And how they do the "heavy lifting" on the graphics cards for high-poly sculpting? (And yes, when sculpting and modeling, one tends to work flat-shaded.)



    Hiro, I'm not lambasting you. I'm a 3D content creator, who is speaking from personal experience. Chill on the 'dissing' and maybe actually address my previous post and take it apart point by point. I'll be glad to rebut, if you require it.



    You are telling somebody who writes tools in the realtime 3D and rendering field that they don't know anything about those tools and how they work or are used. Huh. That's interesting.



    As for clarification:

    Quote:

    "No, and no. There's modeling, texturing, these days lighting, done in realtime. Not to mention 100+ million polygon sculpting for normal maps and displacement maps. I'm not talking Post Production here."



    Your concept of reality is definitely not how the part of the world that makes money at this does things. Especially the part that's all hyperbole trying to win a point that it just lost miserably. Any scene can be done graphically in realtime, but not very well. Realtime is 100% faked effects. Period. Fast, but completely unconvincing. Nothing outside of games that's worth paying for is rendered in realtime, and it won't change for quite a few years yet.



    I'm not saying Apple's drivers are the shiznitz. But I am saying if you make serious money in 3D, you aren't playing on the GPU in realtime for final output, unless you are doing games.
  • Reply 15 of 20
    1337_5l4xx0r · Posts: 1,558 · member
    Hiro, that 3D content, as it is created, is created in realtime 3D. One example is modeling any object that is seen. Another is texturing, which, as a professional in 3D, you know (?) means displacement map creation and normal maps, not just diffuse textures.



    Mudbox, as you know (?), subdivides models to such a fine level of detail (over 135 million polys on year-old PCs with gamer cards) that things like wrinkles around a character's eyes, cloth fabric, etc. are sculpted. We are talking hundreds of millions of polygons, rendered and manipulated in realtime. Yes, realtime. As you know (?), modern GPUs with, say, 2GB of VRAM are more than up to the task.
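
    (If you want to sanity-check a number like 135 million, the arithmetic is simple: quad subdivision quadruples the face count at each level. A minimal sketch; the base-mesh size is an invented example:)

        // subdiv_count.cpp -- how subdivision levels explode polygon counts.
        // Each quad subdivision step (Catmull-Clark style) turns one quad into
        // four, so level n holds base * 4^n faces. The base size is made up.
        #include <cstdio>

        int main() {
            long long faces = 33000;  // hypothetical base mesh of ~33k quads
            for (int level = 0; level <= 6; ++level) {
                std::printf("level %d: %lld faces\n", level, faces);
                faces *= 4;  // one subdivision step quadruples the face count
            }
            // level 6 of a 33k-quad base is about 135 million faces
            return 0;
        }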



    I say this as a professional 3D artist who works with these tools for a living. As someone who creates in 3D on a daily basis, my concern with Apple's GPU choices and poor OpenGL implementation is 'in my face' on the Mac.



    Quote:

    Your concept of reality is definitely not how the part of the world that makes money at this does things.



    I work in that part of the world that 'makes money at this'. I've shipped AAA games, I've taught 3D for years.



    Quote:

    Especially the part that's all hyperbole trying to win a point that it just lost miserably.



    This isn't a contest, it's a discussion. A discussion in which you refuse to rebut the points I made three posts ago. Now why would that be?



    And you still haven't clarified:
    Quote:

    Recent updates to preview functionality in Maya may finally be changing that.



    What preview functionality are you referring to? Viewport 2.0? Some funky Mental Ray development? Please, answer this. I'm curious.



    Instead of 'clarifying' as asked, you downplay me as a 'student' and tell me I must not make money doing what I do. You call my points 'hyperbole' without explaining why or making any attempt to prove them otherwise. With, say, links. Facts. Stuff like that.



    You then add the line
    Quote:

    you aren't playing on the GPU in realtime for final output, unless you are doing games.



    When I've clearly been talking about creation (assets) and modeling and texturing the entire time. Like in this post, in fact.



    I'm an instructor. If you'd like me to educate you on OpenGL, graphics cards, realtime 3D and their effects on modeling, texturing, etc., I'll be happy to give you an overview. Programming is a very different discipline, and I doubt you have the time or inclination to research another field. Just as I have no clue how to add displacement map rendering code to a game engine.



    I made this (mostly) on a Mac: (warning: 2560x1600, 1.7MB!)







    And let me tell you, it was a complete nightmare on a VRAM-starved older GPU. Rendering was not the issue (I command-line rendered in Mental Ray for the 2560x1600 output; it took half an hour, which is 'fast'). The issue was a Mac that crashed every time I made textures viewable on ANY of the assets alone. I.e.: JUST the tank. Just the character. Just the Paris blocks. Trying to put together a piece like this in said environment was beyond frustrating. My only window on this scene WAS Mental Ray renders (and occasionally the PCs at work, which, while older, fared better), with all the waiting and slow iteration that such a workflow implies.
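
    (For a sense of how quickly full-resolution textures exhaust a small VRAM pool, here's a rough budget; the card size and texture dimensions are illustrative, not my actual assets:)

        // vram_budget.cpp -- rough texture-memory budget for a small GPU.
        // Uncompressed RGBA8 is 4 bytes per texel; a full mip chain adds ~1/3.
        #include <cstdio>

        int main() {
            const double vram_mb = 256.0;  // hypothetical older-GPU VRAM pool
            const int dim = 4096;          // one 4K x 4K color/normal/spec map
            double tex_mb = double(dim) * dim * 4 / (1024.0 * 1024.0); // 64 MB
            double with_mips = tex_mb * 4.0 / 3.0;                     // ~85 MB
            std::printf("one %dx%d RGBA8 texture: %.0f MB (%.1f MB with mips)\n",
                        dim, dim, tex_mb, with_mips);
            std::printf("such textures fitting in %.0f MB of VRAM: ~%.1f\n",
                        vram_mb, vram_mb / with_mips);
            return 0;
        }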



    So let me again state: Apple's very poor GPUs, combined with an industry-worst OpenGL implementation in terms of performance, do not make for a "go-to" platform for 3D content creators, as you said a few posts back. On the contrary, it means 3D content creators (such as myself) are forced to move to other, more capable platforms, like Windows or even Linux.



    I'm awaiting a thoughtful, reasoned reply.
  • Reply 16 of 20
    hiro · Posts: 2,663 · member
    Quote:
    Originally Posted by 1337_5L4Xx0R


    Hiro, that 3D content, as it is created, is created in realtime 3D. One example is modeling any object that is seen. Another is texturing, which, as a professional in 3D, you know (?) means displacement map creation and normal maps, not just diffuse textures.



    That creation isn't considered realtime. Yes, you are looking at your views in real time, but they aren't fully rendered scenes. You cannot push a term meant to describe the timeframe of final rendered content to suddenly mean something convenient when talking about working views.



    Quote:

    Mudbox, as you know (?), subdivides models to such a fine level of detail (over 135 million polys on year-old PCs with gamer cards) that things like wrinkles around a character's eyes, cloth fabric, etc. are sculpted. We are talking hundreds of millions of polygons, rendered and manipulated in realtime. Yes, realtime. As you know (?), modern GPUs with, say, 2GB of VRAM are more than up to the task.



    Mudbox doesn't push 135 million polys. Mudbox displays the highest subdivision level in the closest portion of the frustum you are examining, so you get the effect of looking at a model that would need that many polys if you could see the whole thing at full resolution. The parts that are out of view are not computed to the same level. And the subdivided polys aren't even all there, just the rules for how to compute them; they are algorithmically described, so they only actually exist as the render needs them. It's a useful trick, a very clever one, but it isn't rendering even 10% of that many actual polys in a scene. Not having to keep all those polys around all the time is one of the reasons subdivision schemes were developed in the first place.
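
    (A toy sketch of the idea, for the curious: store only the base mesh plus the refinement rule, and expand faces on demand for whatever sits close to the camera. The thresholds and counts are invented for illustration:)

        // lazy_subdiv.cpp -- toy view-dependent subdivision: patches near the
        // camera get refined, distant ones stay coarse. Only the *rule*
        // (base * 4^level) is stored; fine faces exist only while displayed.
        #include <cstdio>

        // Pick a subdivision level from distance: closer means more levels.
        int levelFor(double distance) {
            if (distance < 1.0) return 6;  // right in front of the camera
            if (distance < 4.0) return 3;
            return 0;                      // background stays at the base mesh
        }

        int main() {
            const long long baseFaces = 1000;  // faces per patch, illustrative
            const double patchDistance[] = {0.5, 2.0, 10.0};
            long long displayed = 0;
            for (double d : patchDistance) {
                int lvl = levelFor(d);
                long long faces = baseFaces << (2 * lvl);  // base * 4^level
                displayed += faces;
                std::printf("patch at %.1f units -> level %d, %lld faces\n",
                            d, lvl, faces);
            }
            std::printf("faces actually generated: %lld\n", displayed);
            return 0;
        }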



    Quote:

    I say this as a professional 3D artist, who works with these tools for a living. As someone who creates in 3D on an every day basis, my concern with Apple GPU choice and poor OpenGL implementation is 'in my face' on the Mac.



    The money renderers, unlike game engines, never hit the OpenGL pipeline as it is designed; they use the GPU shader engines independently of the pipeline. You are complaining about something that doesn't affect final output speed in any appreciable manner. You even allude to that farther down in your post! I also can't speak to one modeling package's implementation over another, as there are lots of things, actually most things, in modeling packages that can suck without being OpenGL-related at all.



    Quote:

    I work in that part of the world that 'makes money at this'. I've shipped AAA games, I've taught 3D for years. This isn't a contest, it's a discussion. A discussion in which you refuse to rebut the points I made three posts ago. Now why would that be?



    You complain about non-game problems completely in game terms. The technical underpinnings of game and non-game graphics are two entirely different beasts that have had very little code-level overlap until VERY recently, and that's really still research and demo stuff. I'm not answering what you want to hear, which is different from not answering your question. That's not my problem.



    Quote:

    And you still haven't clarified:

    What preview functionality are you referring to? Viewport 2.0? Some funky Mental Ray development? Please, answer this. I'm curious.



    I don't know every vendor's name for every feature. When I have coffee with other gents at conferences and catch their technical presentations, the talk isn't about trade names but how we did neat new stuff.



    Quote:

    Instead of 'clarifying' as asked, you downplay me as a 'student' and tell me I must not make money doing what I do. You call my points 'hyperbole' without explaining why or making any attempt to prove them otherwise. With, say, links. Facts. Stuff like that.



    I didn't downplay you as a student. I told you to go back to school to study something you displayed you knew little about. Yes, that would make you a student, and if you think that's a dirty term, I feel for your students. I still audit classes from colleagues and mentors; I'm happy that I can still be a student at times, and I get to learn new things that allow me to come up with new techniques. When you decide you are the last word and don't want to be a student and learn new stuff, it's time to start thinking about a new field, because you will be on the fast track to obsolescence.



    Quote:

    You then add the line When I've clearly been talking about creation (assets) and modeling and texturing the entire time. Like in this post, in fact.



    Go back to paragraph 1; you are munging the terminology again. I can't help that.



    Quote:

    I'm an instructor.



    You teach content creation. I build and teach the guts that make your line of work possible. Both of those levels are critical, but they have different skillsets and technical knowledge requirements without a lot of hard overlap. I don't expect everyone to know the deep dark secrets of how we fake things on your behalf. But when you start trying to tell us HOW the tools work under the hood, and you are incorrect, you had better go back to square one. If you want to talk about that level, go back to school and learn it.





    Quote:

    If you'd like me to educate you on OpenGL, graphics cards, realtime 3D and their effects on modeling, texturing, etc., I'll be happy to give you an overview. Programming is a very different discipline, and I doubt you have the time or inclination to research another field. Just as I have no clue how to add displacement map rendering code to a game engine.



    Thanks, but no. If you can't write that displacement map rendering code for a game engine, you really don't have the handle on OpenGL you think you do. You can use it and understand what it can do for you visually, but were you ever fluent in the Red Book and Blue Book? The tools hide those implementation details and make the visualizations manageable, and using them at a high level is a talent that frankly I don't have, nor desire to learn at this point; I have plenty of other new areas to work in that will eventually make new tools for folks like you.



    Quote:

    And let me tell you, it was a complete nightmare on a VRAM-starved older GPU. Rendering was not the issue (I command-line rendered in Mental Ray for the 2560x1600 output; it took half an hour, which is 'fast'). The issue was a Mac that crashed every time I made textures viewable on ANY of the assets alone. I.e.: JUST the tank. Just the character. Just the Paris blocks. Trying to put together a piece like this in said environment was beyond frustrating. My only window on this scene WAS Mental Ray renders (and occasionally the PCs at work, which, while older, fared better), with all the waiting and slow iteration that such a workflow implies.



    So let me again state: Apple's very poor GPUs, combined with an industry-worst OpenGL implementation in terms of performance, do not make for a "go-to" platform for 3D content creators, as you said a few posts back. On the contrary, it means 3D content creators (such as myself) are forced to move to other, more capable platforms, like Windows or even Linux.



    I'm awaiting a thoughtful, reasoned reply.



    I have no idea if that was an application-level snafu or an OS/driver/card-level snafu. Most likely application-level though; it almost always is. That just happens to be a longstanding tenet of software engineering: 98% of the time it's the application programmer's fault.
  • Reply 17 of 20
    1337_5l4xx0r · Posts: 1,558 · member
    Hiro, thank you for actually responding point by point with your assertions.

    Quote:
    Originally Posted by Hiro


    That creation isn't considered realtime. Yes, you are looking at your views in real time, but they aren't fully rendered scenes. You cannot push a term meant to describe the timeframe of final rendered content to suddenly mean something convenient when talking about working views.



    We seem to be divided by a common language. http://en.wikipedia.org/wiki/3D_rendering is a good start for learning the industry terminology as I'm using it.



    Quote:

    "Rendering methods

    Main article: Rendering (computer graphics)



    Rendering is the final process of creating the actual 2D image or animation from the prepared scene. This can be compared to taking a photo or filming the scene after the setup is finished in real life. Several different, and often specialized, rendering methods have been developed. These range from the distinctly non-realistic wireframe rendering through polygon-based rendering, to more advanced techniques such as: scanline rendering, ray tracing, or radiosity. Rendering may take from fractions of a second to days for a single image/frame. In general, different methods are better suited for either photo-realistic rendering, or real-time rendering.



    Rendering for interactive media, such as games and simulations, is calculated and displayed in real time, at rates of approximately 20 to 120 frames per second. In real-time rendering, the goal is to show as much information as possible as the eye can process in a 30th of a second (or one frame, in the case of 30 frame-per-second animation). The goal here is primarily speed and not photo-realism. In fact, here exploitations are made in the way the eye 'perceives' the world, and as a result the final image presented is not necessarily that of the real-world, but one close enough for the human eye to tolerate. Rendering software may simulate such visual effects as lens flares, depth of field or motion blur. These are attempts to simulate visual phenomena resulting from the optical characteristics of cameras and of the human eye. These effects can lend an element of realism to a scene, even if the effect is merely a simulated artifact of a camera. This is the basic method employed in games, interactive worlds and VRML. The rapid increase in computer processing power has allowed a progressively higher degree of realism even for real-time rendering, including techniques such as HDR rendering. Real-time rendering is often polygonal and aided by the computer's GPU."



    (emphasis mine)



    Quote:

    You cannot push a term meant to describe the timeframe of final rendered content to suddenly mean something convenient when talking about working views.



    Hardware/Viewport/Realtime rendering is the accepted industry term. Without getting too metaphysical (neither of us has any acid or weed handy, anyway), any view of my creations is a simulacrum, whether realtime and "low quality," or finalgathered/GI/HDRI/Photons/caustics/AAed in Mental Ray and "high quality"(Aka Software Rendered). There is no 'one true view' of a bunch of points in space represented on a screen. It's all relative, man... *smokes bong hit*





    Quote:
    Originally Posted by Hiro


    Mudbox doesn't push 135 million polys. Mudbox displays the highest subdivision level in the closest portion of the frustum you are examining, so you get the effect of looking at a model that would need that many polys if you could see the whole thing at full resolution. The parts that are out of view are not computed to the same level.



    Agreed. What it does is allow seamless, real-time interaction with those 135 million polys. That's very clever programming. Now try Mudbox on a Mac. Go on, nobody's watching. See how many subdivision levels you get to. 4? Now orbit the object. Now subdivide again. You're in for a surprise I don't want to spoil.



    Quote:
    Originally Posted by Hiro


    And the subdivided polys aren't even all there, just the rules for how to compute them; they are algorithmically described, so they only actually exist as the render (my edit: AKA GRAPHICS CARD rendering?) needs them. It's a useful trick, a very clever one, but it isn't rendering even 10% of that many actual polys in a scene. Not having to keep all those polys around all the time is one of the reasons subdivision schemes were developed in the first place.



    And Mudbox leverages GPUs very heavily, and thus OpenGL implementations as well, to do it. Crap GPUs, crap OpenGL, crap Mudbox performance.



    Quote:
    Originally Posted by Hiro


    The money renderers, unlike game engines, never hit the OpenGL pipeline as it is designed; they use the GPU shader engines independently of the pipeline. You are complaining about something that doesn't affect final output speed in any appreciable manner. You even allude to that farther down in your post! I also can't speak to one modeling package's implementation over another, as there are lots of things, actually most things, in modeling packages that can suck without being OpenGL-related at all.



    Quite possibly. But the things Apple actually has control over are: 1. the GPUs in the "Pro" Macs; 2. their OpenGL implementation; and 3. to some extent, the optimization of the drivers for those GPUs, as Apple has taken the reins on those.



    Quote:
    Originally Posted by Hiro


    You complain about non-game problems completely in game terms. The technical underpinnings of game and non-game graphics are two entirely different beasts that have had very little code-level overlap until VERY recently, and that's really still research and demo stuff. I'm not answering what you want to hear, which is different from not answering your question. That's not my problem.



    The new Viewport 2.0 in Maya, i.e. the realtime 3D, hardware-render portion, was finally brought up to modern game-level performance: support for normal maps, spec maps, primitive hardware lighting, etc. You can view a vast scene, and as you dolly back, the textures are mip-mapped down in resolution; as you zoom in, they are up-rezzed. All this with no performance penalty compared to Viewport 1.0; in fact, it's faster. Why? They used modern game-engine techniques. Too bad it doesn't run at all on my current GPU.
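
    (The mip chain driving that behavior is easy to picture; the base texture size here is an arbitrary example:)

        // mip_chain.cpp -- the resolutions a renderer swaps between as you
        // dolly in and out; each mip level halves each dimension.
        #include <cstdio>

        int main() {
            int w = 2048, h = 2048;  // illustrative base texture size
            long long total = 0;
            for (int level = 0; w >= 1 && h >= 1; ++level) {
                total += (long long)w * h;
                std::printf("mip %d: %dx%d\n", level, w, h);
                w /= 2;
                h /= 2;
            }
            // The whole chain costs only about 1/3 more than the base level.
            std::printf("total texels: %lld (base: %lld)\n",
                        total, 2048LL * 2048LL);
            return 0;
        }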



    Quote:
    Originally Posted by Hiro


    I don't know every vendor's name for every feature. When I have coffee with other gents at conferences and catch their technical presentations, the talk isn't about trade names but how we did neat new stuff.



    No worries, I just thought something might have slipped past me in that regard in Maya 2011.



    Quote:
    Originally Posted by Hiro


    I didn't downplay you as a student. I told you to go back to school to study something you displayed you knew little about. Yes, that would make you a student, and if you think that's a dirty term, I feel for your students. I still audit classes from colleagues and mentors; I'm happy that I can still be a student at times, and I get to learn new things that allow me to come up with new techniques. When you decide you are the last word and don't want to be a student and learn new stuff, it's time to start thinking about a new field, because you will be on the fast track to obsolescence.



    I got into this field precisely because I'd always be learning, always be challenged. The more I know, the more I realize how little I know.



    Quote:
    Originally Posted by Hiro


    You teach content creation. I build and teach the guts that make your line of work possible. Both of those levels are critical, but they have different skillsets and technical knowledge requirements without a lot of hard overlap. I don't expect everyone to know the deep dark secrets of how we fake things on your behalf. But when you start trying to tell us HOW the tools work under the hood, and you are incorrect, you had better go back to square one. If you want to talk about that level, go back to school and learn it.



    It's precisely because of your claims (and I believe them) that I cannot believe you are/were of the opinion that Macs are 'go-to' for 3D work. There's no rendering advantage at all, in your software-render sense. In fact, in light of CUDA, etc., there are huge advantages for other platforms right now. The CPUs in Macs are (at best) the same as on Linux and Windows.



    In my hardware, real-time sense, I look at this $2499-$3299 Mac, with its 512MB GT 120, and laugh.



    Thus, all else being equal, I'd posit Macs are either a poor choice for 3D, or an extremely poor choice for 3D. I'd love to think otherwise, but I am slapped with Apple's decisions in this regard every single day.





    Quote:
    Originally Posted by Hiro


    Thanks, but no. If you can't write that displacement map rendering code for a game engine, you really don't have the handle on OpenGL you think you do. You can use it and understand what it can do for you visually, but were you ever fluent in the Red Book and Blue Book? The tools hide those implementation details and make the visualizations manageable, and using them at a high level is a talent that frankly I don't have, nor desire to learn at this point; I have plenty of other new areas to work in that will eventually make new tools for folks like you.



    Great. And no, I don't know OpenGL programming, just a few creative software packages.



    Quote:
    Originally Posted by Hiro


    I have no idea if that was an application-level snafu or an OS/driver/card-level snafu. Most likely application-level though; it almost always is. That just happens to be a longstanding tenet of software engineering: 98% of the time it's the application programmer's fault.



    There's a mix of variables in there, no doubt. But again, decent GPUs would be one way to alleviate texture-memory shortages and speed up realtime interaction frame rates, which is what 3D creators need. And Apple could also work on making their OpenGL faster, to compensate for their crap GPUs. But instead, they do neither. Those are the two things they have direct control over. Either one is a deal-breaker; both together are unforgivable, to me. So I'm looking elsewhere for my next machine.
  • Reply 18 of 20
    aquatic · Posts: 5,602 · member
    Hiro, I thought 10.1 added acceleration on OS X?



    And why would all versions exhibit this issue if it were related only to the most current release of Flash? I installed every single old archived version of 10.0 as well. They all stutter on full-screen Flash video on my 2010 i5 MBP. Nothing else running. 8GB of RAM. 10.6.4. This didn't happen on 10.6.3.
  • Reply 19 of 20
    erunno · Posts: 225 · member
    Quote:
    Originally Posted by Aquatic


    Hiro, I thought 10.1 added acceleration on OS X?



    Nope, only the Windows version of 10.1 features hardware acceleration, as Apple made the necessary APIs available too late in Flash's development cycle. It will be added in a later version. There's an official development build with hardware acceleration turned on, called Gala.