QUARTZ HARDWARE ACCEL!!


Comments

  • Reply 161 of 191
    aslan Posts: 97 member
    Ooo Ooo



    Sorry for the dblpost, but I forgot to address this one...



    [quote]

    But 10.2 should be a robust OS that recent models can use efficiently.

    [/quote]



    Really? What makes you think that?



    Just where was there evidence that it wasn't? Aren't we jumping the gun here, steve? Have you even USED 9.2 yourself yet? Not that I have, but I have been keeping myself as informed as possible, which includes reading such things as "JAG SCREAMS!"



    Is naysaying without grounds your hobby?
  • Reply 162 of 191
    steve666 Posts: 2,600 member
    [quote]BTW, steve666, how exactly do you consider targeting old, aging computers with a bleeding-edge OS good business practice? How is it the wrong attitude? How is it foolish? How is costly development for legacy systems going to pay for itself without new hardware sales? How is catering to you, the I-don't-want-to-buy-a-better-computer folk, going to gain them marketshare? You are already standing in their small percentage...

    I am sorry. You are either shortsighted and angry 'cause you have an old computer (join the club on the old computer bit), or you just don't understand the not-so-subtle points involved in making money as a business, I guess... but which?

    Didn't mean this as a flame, but seriously, WTF?[/quote]



    My computer is a year old. Even Microbloat doesn't require hardware improvements that fast, and if they did, there are cheap cards out there for PC users, but not for Mac users. There is a niche of Mac users, mostly yuppies, with the dough to keep on buying new machines or overpriced graphics cards, but the market Apple has been trying to target for the last few years is not in this camp. I am happy that Apple has found a way to speed up the new OS. I am not happy that my video card will not be fully supported. If Apple sold me the Nvidia GeForce2 MX for $100, I would be happy. Alienating customers is NOT good business practice; you don't need an MBA to know that...
  • Reply 163 of 191
    steve666 Posts: 2,600 member
    [quote]Originally posted by Aslan:

    Ooo Ooo



    Sorry for the dblpost, but I forgot to address this one...







    Really? What makes you think that?



    Just where was there evidence that it wasn't? Aren't we jumping the gun here, steve? Have you even USED 9.2 yourself yet? Not that I have, but I have been keeping myself as informed as possible, which includes reading such things as "JAG SCREAMS!"



    Is naysaying without grounds your hobby?[/quote]



    I'm using 9.2 right now, although I assume you made a typo. Perhaps you should remove your lips from Steve Jobs' butt while you type.



  • Reply 164 of 191
    kidred Posts: 2,402 member
    [quote]Originally posted by steve666:




    I'm using 9.2 right now, although I assume you made a typo. Perhaps you should remove your lips from Steve Jobs' butt while you type.



    [/quote]



    Well, he's right; jag-wire is very fast.
  • Reply 165 of 191
    scott_h_phd Posts: 448 member
    You all still buying Macs? Why? They took your money and gave you crap beta software. Then they nailed you for the final version, which is beta quality. Now they drop support for 2-month-old hardware? When will you learn?



    [ 05-11-2002: Message edited by: scott_h_phd ]
  • Reply 166 of 191
    zapchud Posts: 844 member
    It's the hardware being unable to support the future, not the future dropping support for the hardware.



    Today's 2-month-old hardware is NOT unsupported in Jag-wire!
  • Reply 167 of 191
    scott_h_phd Posts: 448 member
    [quote]Originally posted by r-0X#Zapchud:

    It's the hardware being unable to support the future, not the future dropping support for the hardware.



    Today's 2-month-old hardware is NOT unsupported in Jag-wire![/quote]



    It's the company being unable (not willing) to support the hardware. Simple.
  • Reply 168 of 191
    buonrotto Posts: 6,368 member
    Back to trolling, Scott?
  • Reply 169 of 191
    guitarbloke Posts: 125 member
    [quote]Originally posted by scott_h_phd:

    You all still buying Macs? Why? They took your money and gave you crap beta software. Then they nailed you for the final version, which is beta quality. Now they drop support for 2-month-old hardware? When will you learn?



    [ 05-11-2002: Message edited by: scott_h_phd ][/quote]

    You troll like a girl.
  • Reply 170 of 191
    belle Posts: 1,574 member
    I'm with scott_h_phd (Congrats on the doctorate, by the way) on this one.



    I wonder what percentage of Mac owners have a computer with a Quartz Extreme-supported video card? I wonder what that percentage will be, say, twelve months after the release of Jaguar?



    It's a nice feature for future users, for sure, but it's a bit of a slap in the face for current Mac users, even if there is some improvement in other aspects of performance.



    [ 05-11-2002: Message edited by: Belle ]
  • Reply 171 of 191
    amorph Posts: 7,112 member
    Would you prefer that they just axed the technology altogether? Or mothballed it for another 5 years (the average lifetime of a Mac) so that Apple could be 3 years behind MS, failing to exploit their own hardware to its fullest capacity, and all so that some people wouldn't feel left behind?



    Did anyone here feel left behind when Apple adopted AltiVec?
  • Reply 172 of 191
    buonrotto Posts: 6,368 member
    [quote]Originally posted by Belle:

    It's a nice feature for future users, for sure, but it's a bit of a slap in the face for current Mac users...[/quote]



    You mean like the G4 with Velocity Engine was?



    [ 05-11-2002: Message edited by: BuonRotto ]
  • Reply 173 of 191
    belle Posts: 1,574 member
    [quote]Originally posted by Amorph:

    Would you prefer that they just axed the technology altogether? Or mothballed it for another 5 years (the average lifetime of a Mac) so that Apple could be 3 years behind MS, failing to exploit their own hardware to its fullest capacity, and all so that some people wouldn't feel left behind?



    Did anyone here feel left behind when Apple adopted AltiVec?[/quote]

    I knew Amorph would make a hard taskmaster when he became an administrator!

    No, but surely some of the performance tweaks could be implemented on other cards that support OpenGL?

    [quote]Originally posted by BuonRotto:

    You mean like the G4 with Velocity Engine was?[/quote]

    My card supports OpenGL, though.



    While I'm being a bit of an antagonist (not deliberately, obviously), it just seems probable that, since the performance enhancements in Quartz Extreme were gained using OpenGL, at least a couple of the features would offer performance gains on lesser graphics hardware.



    [ 05-11-2002: Message edited by: Belle ]
  • Reply 174 of 191
    buonrottobuonrotto Posts: 6,368member
    [quote]Originally posted by Belle:


    My card supports OpenGL, though.



    [ 05-11-2002: Message edited by: Belle ][/quote]



    Whew! We're all posting at the same time. Anyway, the question is: does it support this use of OpenGL [i]adequately[/i]? That is, could it be a case of putting 10 lbs. of you-know-what in a 5 lb. bag? Hence, I imagine, the VRAM "recommendation." At some point I imagine that the CPU would do a faster, more efficient job of rendering all this stuff than the GPU would. It depends on what each is and the bandwidth for each. They obviously can both handle the screen drawing; the fact that they even made QE proves that. But which approach produces the better, faster, more responsive display on which systems? I think you could force-feed QE into any GPU that handles OpenGL, but would it be worth the trouble? I don't know for sure, obviously, but I have seen the content these things are dealing with. Perhaps I am being naive to think that Apple might have a better idea than I do.



    I agree with your last edit, Belle. If there is some advantage, then they should milk it. I guess it's a matter of whether you believe they can, and whether they would if they could. It's trying to get into their heads a bit too much IMO and at that point it becomes an emotional argument.



    [ 05-11-2002: Message edited by: BuonRotto ]
  • Reply 175 of 191
    belle Posts: 1,574 member
    [quote]Originally posted by BuonRotto:

    [quote]Originally posted by BuonRotto: I think you could force-feed QE into any GPU that handles OpenGL, but would it be worth the trouble? I don't know for sure, obviously, but I have seen the content these things are dealing with. Perhaps I am being naive to think that Apple might have a better idea than I do.[/quote]

    There are most certainly parts of Quartz Extreme that could be implemented on lesser cards and offer improvements.



    I wouldn't question Apple's technical abilities, just the decision-making process that led them not to take the time to offer the performance gains on "old" machines.
  • Reply 176 of 191
    belle Posts: 1,574 member
    I'm thinking we need to stop cross-posting, BuonRotto.
  • Reply 177 of 191
    ghost_user_name Posts: 22,667 member
    Seeing as I fall into the category of top-of-the-line hardware (PBG4 500) now unsupported by Quartz Extreme (can we just call it QuartzGL from now on?), I'm a bit miffed at the lack of support for my card.



    However, it's my understanding (having read this on many forums and in some dev notes, which I will be sure to find and reference here) that the QuartzGL acceleration leverages the T&L unit of the GPU at the lowest levels, thereby leaving cards without a hardware T&L unit unsupported. The Rev. B PowerBooks with the 16 MB RADEON Mobility chips have the T&L unit, so they will benefit from QGL, though not optimally, as 32 MB is needed to keep all the Aqua textures in memory. Or so my understanding goes.



    Currently, it seems to be a win-lose situation for people with Rage 128s: although there will be a speed boost in the short term due to the other optimizations in Jaguar, long-term additions of more eye candy (as all current HW offerings will support QGL) will slow our legacy machines to a crawl.



    If only laptop video cards were upgradable... [Hmmm]





    There is the possibility that Apple could eventually support the Rage 128; it's just that they're delaying that work to get QGL out the door for a September release. That'd be my hope.



    Although, if you really think about it, by the time Jaguar comes out, my computer will be two years old... by the next point release of OS X (when more eye candy would potentially be added, slowing computers like mine to a crawl), it should be time to upgrade anyway.





    I think we were spoiled for way too long with old hardware being sufficient to run the latest OS releases with 9.x. I mean, the 6400/180 I bought in 8th grade would run 9.2 pretty well... heh.



    I see that I've begun to ramble. Apologies.
  • Reply 178 of 191
    nostradamus Posts: 397 member
    Jonathan, you have it all wrong.

    From www.xlr8yourmac.com:



    [quote]And the lack of QE Rage128 AGP support was not an arbitrary decision. (The Rage128 chip does not support texture sizes that are not a power of 2. That's why there's no QE support for the Rage128/Rage128 Mobility chips.)

    [/quote]
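
    To make the quoted restriction concrete, here is a rough sketch (mine, not Apple's or xlr8yourmac's) of what a power-of-two-only chip forces a compositor to do: round every window's backing texture up to the next power of two, paying for all the padding pixels.

    ```c
    #include <stdio.h>

    /* Illustrative only: the cost of power-of-two-only texturing,
       as on the Rage 128, for window backing stores. */
    static unsigned next_pow2(unsigned n) {
        unsigned p = 1;
        while (p < n) p <<= 1;
        return p;
    }

    int main(void) {
        unsigned w = 1025, h = 770;          /* an arbitrary window size */
        unsigned tw = next_pow2(w), th = next_pow2(h);
        printf("window %ux%u -> texture %ux%u (%.1fx the pixels)\n",
               w, h, tw, th, (double)(tw * th) / (w * h));
        /* prints: window 1025x770 -> texture 2048x1024 (2.7x the pixels) */
        return 0;
    }
    ```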



    32 MB of video memory is recommended, but I posit that if one limits oneself to 16-bit color depth and lower resolutions, one could get by with 16 MB. The only 16 MB chipset that supports Quartz Extreme is the Radeon Mobility chipset found in the Winter 2001 PowerBooks and Summer 2002 iBooks.
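
    The back-of-the-envelope arithmetic behind that 16-bit guess, as a small illustrative program (the figures are generic, not Apple's):

    ```c
    #include <stdio.h>

    /* Rough VRAM math: the compositor has to hold the frame buffer
       plus each window's texture in video memory. */
    int main(void) {
        const double MB = 1024.0 * 1024.0;
        int w = 1024, h = 768;

        double fullscreen16 = w * h * 2 / MB;   /* 16-bit: 2 bytes/pixel */
        double fullscreen32 = w * h * 4 / MB;   /* 32-bit: 4 bytes/pixel */

        printf("one full-screen surface at 16-bit: %.1f MB\n", fullscreen16);
        printf("one full-screen surface at 32-bit: %.1f MB\n", fullscreen32);
        /* ~1.5 MB vs ~3.0 MB: halving the depth (and dropping the
           resolution) roughly doubles how many window textures fit
           in a 16 MB card alongside the frame buffer. */
        return 0;
    }
    ```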



    [ 05-11-2002: Message edited by: Nostradamus ]
  • Reply 179 of 191
    buonrotto Posts: 6,368 member
    Well, jonathan certainly knows more than I do about why QE is at work on some chips and not others; I'm ignorant about what's going on down there in technical terms. Now, this could be just version 1 of QE, and a major revision in 2003 might leverage more, uh, "standard" or rather common OpenGL implementations, thus bringing at least some benefit to the Rage 128 crowd. Having said that, I doubt they'll go to the trouble of getting Rage IIs to support QE.



    But it's all gravy, IMO. To me, the most frustrating aspect of OS X, where the performance issue pops up the most, isn't the system itself; it's these crappy "dirty" Carbon ports. If these get more polished both inside and out (and this obviously applies to the Finder), I think everyone's OS X experience will be a lot more pleasurable, regardless of whether the screen is being drawn by the GPU or the CPU. These poor apps are the weakest link in the chain in terms of the apparent performance of the OS. Fix them and you've fixed more than QE can ever fix.
  • Reply 180 of 191
    amorph Posts: 7,112 member
    [quote]Originally posted by Belle:


    There are most certainly parts of Quartz Extreme that could be implemented on lesser cards and offer improvements.[/quote]



    Not according to, for example, Arshad from ATi.



    The main issue seems to be how efficiently the card can handle gigantic textures of arbitrary size (i.e., windows). Textures in video games are sized and compressed and optimized for high performance, but on-screen windows obviously cannot be. Games also break things down into lots of polygons, exploiting the card's processing power, but a window has a grand total of one polygon, mooting the GPU's pipeline for the most part.
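
    To picture that, here's a hedged sketch of the one-polygon case: compositing a window whose backing store has already been uploaded as an arbitrary-size rectangle texture. The function name is made up, and it assumes a live GL context plus the GL_EXT_texture_rectangle extension; this is not Apple's actual compositor code.

    ```c
    #include <OpenGL/gl.h>

    #ifndef GL_TEXTURE_RECTANGLE_EXT
    #define GL_TEXTURE_RECTANGLE_EXT 0x84F5
    #endif

    /* Draw one window's backing store as a single textured quad.
       Rectangle textures take pixel coordinates (0..w, 0..h),
       not the usual normalized 0..1 range. */
    void draw_window(GLuint tex, float x, float y, float w, float h) {
        glEnable(GL_TEXTURE_RECTANGLE_EXT);
        glBindTexture(GL_TEXTURE_RECTANGLE_EXT, tex);
        glBegin(GL_QUADS);                    /* the window's one polygon */
        glTexCoord2f(0, 0); glVertex2f(x,     y);
        glTexCoord2f(w, 0); glVertex2f(x + w, y);
        glTexCoord2f(w, h); glVertex2f(x + w, y + h);
        glTexCoord2f(0, h); glVertex2f(x,     y + h);
        glEnd();
        glDisable(GL_TEXTURE_RECTANGLE_EXT);
    }
    ```

    With the whole window in one quad, almost none of a gaming GPU's triangle-setup muscle gets exercised; the load is nearly all texture bandwidth, which is exactly the brute-force problem described above.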



    [the following paragraph has been corrected]



    Nostradamus just noted a flaw with the Rage 128, in that it requires textures to be sized in powers of two, with height equal to width, but Programmer has pointed out that although there would be a penalty associated with pulling a lot of extra data across the bus, such a texture could still be easily clipped and displayed. The real problem seems to be that the Rage 128, despite being an AGP card/chipset, doesn't actually support AGP's ability to read directly from system RAM. Whoops. (T&L and programmability have been ruled out as reasons.)
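
    For illustration, a speculative sketch of how software might probe for the two capabilities at issue. GL_EXT_texture_rectangle (arbitrary-size textures) and GL_APPLE_client_storage (texturing out of client/system memory rather than copying everything into VRAM) are real OpenGL extensions of this era, but whether Apple's check looks anything like this is pure guesswork.

    ```c
    #include <string.h>
    #include <OpenGL/gl.h>

    /* Assumes a current GL context. strstr on the extension string
       is the idiomatic (if crude) check of the period. */
    static int has_ext(const char *name) {
        const char *all = (const char *)glGetString(GL_EXTENSIONS);
        return all != NULL && strstr(all, name) != NULL;
    }

    int card_could_composite(void) {
        /* arbitrary-size window textures (the Rage 128's first strike) */
        int rect_textures = has_ext("GL_EXT_texture_rectangle");
        /* texturing straight out of app/system RAM over AGP, instead of
           copying into VRAM (reportedly the real dealbreaker) */
        int client_storage = has_ext("GL_APPLE_client_storage");
        return rect_textures && client_storage;
    }
    ```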



    I'm going to note, again, that what Apple ended up announcing is better than the graphics-knowledgeable people here thought they could offer. It would sure be nice if it worked on my friend's original iMac, but Arshad said that Apple and ATi (and probably nVIDIA, but he's not going to mention them) tried getting QE to work on things like PCI video cards, and it sucked. It works on everything they could get it to work on. Given the brute-force nature of the problem (huge textures of arbitrary size), I can't imagine where support for older cards will come from. The next step, I suppose, will be turning some or all of the actual in-window compositing that Quartz does over to the GPU, but that will shut out even more cards; then, programmability becomes a real issue.



    [quote]

    Originally posted by BuonRotto:



    But it's all gravy, IMO. To me, the most frustrating aspect of OS X, where the performance issue pops up the most, isn't the system itself; it's these crappy "dirty" Carbon ports. If these get more polished both inside and out (and this obviously applies to the Finder), I think everyone's OS X experience will be a lot more pleasurable, regardless of whether the screen is being drawn by the GPU or the CPU. These poor apps are the weakest link in the chain in terms of the apparent performance of the OS. Fix them and you've fixed more than QE can ever fix.[/quote]



    Amen.



    But how much do you want to bet that Microsoft will ever clean up Office X?



    [ 05-11-2002: Message edited by: Amorph ]