Intel unleashes Mac-bound "Woodcrest" server chip


Comments

  • Reply 221 of 565
    melgrossmelgross Posts: 33,580member
    Quote:

    Originally posted by Placebo

    It's properly designed 2D apps that can benefit heftily from these cards. Have you ever played around in Core Image Funhouse with a 9800 or better graphics card? Compared to performing the same blurs and distortions in Photoshop or Fireworks, it's ridiculously fast. Literally as fast as you can drag the blur slider.



    Sure. I have 9800's in my older G4 towers. But those cards are really slow by today's standards.
  • Reply 222 of 565
    melgrossmelgross Posts: 33,580member
    Quote:

    Originally posted by a_greer

    unless there is a chance of the infamous 23 inch pro-ish iMac surfacing...



    Infamous is right. At that size, I would expect it to topple over.
  • Reply 223 of 565
    melgrossmelgross Posts: 33,580member
    Quote:

    Originally posted by a_greer

    I bet that they will just call the damn thing "sex". It may attract the Alienware buyers; they have never had "sex"!



  • Reply 224 of 565
    sunilramansunilraman Posts: 8,133member
    Well, someone's in a good mood
  • Reply 225 of 565
    melgrossmelgross Posts: 33,580member
    Quote:

    Originally posted by sunilraman

    Well, someone's in a good mood



    Yeah. I've been working on a big project for a while in my shops downstairs, and it's in the living room, finally. That's why I've been here erratically lately. So, I'm feeling less stressed, except for the audio component that just went south.
  • Reply 226 of 565
    placeboplacebo Posts: 5,767member
    Quote:

    Originally posted by melgross

    Sure. I have 9800's in my older G4 towers. But those cards are really slow by today's standards.



    Therefore, Core Image acceleration has even more potential now with a 7900GTX than it did with a 9800 Pro.
  • Reply 227 of 565
    imacfanimacfan Posts: 444member
    Quote:

    Originally posted by Placebo

    Therefore, Core Image acceleration has even more potential now with a 7900GTX than it did with a 9800 Pro.



    But surely a 9800 is better than a 7900?!?!?



    Man, these stupid names make even the 'Model year' sound sensible.



    David
  • Reply 228 of 565
    jeffdmjeffdm Posts: 12,953member
    Quote:

    Originally posted by Placebo

    Therefore, Core Image acceleration has even more potential now with a 7900GTX than it did with a 9800 Pro.



    Definitely more potential, but I thought the debate was about the ability to tap that potential. If it goes unused, the difference between the two cards is moot for 2D tasks.
  • Reply 229 of 565
    placeboplacebo Posts: 5,767member
    Quote:

    Originally posted by iMacfan

    But surely a 9800 is better than a 7900?!?!?



    Man, these stupid names make even the 'Model year' sound sensible.



    David




    Yeah, that struck me as I was writing that post. And now ATI has significantly crippled their products by reducing their mighty 9800 to an X1800! 1800 is smaller than 9800! 9800 must be fast fast fast!
  • Reply 230 of 565
    vineavinea Posts: 5,585member
    Quote:

    Originally posted by melgross

    Infamous is right. At that size, I would expect it to topple over.



    Heh... having switched to dual 24" monitors, it's painful going back to dual 20"s (or a single 19" at home).



    A 23" iMac Conroe probably handles most power desktop users (non-video). These folks that need more HD can buy an external multi-TB drives or SANs to park under their desks...



    Currently the 20" is just too small. Given the real estate the 2x24"s take up on my desk the next upgrade is a single 30" which I have but am not using at the moment. I could go 1x30" + 1x21WS" in portrait mode but the iMac won't do portrait will it?



    I don't really need expansion for the most part. I do need a decent vid card but decent...not top of the line. Just needs to drive the 30".



    Vinea
  • Reply 231 of 565
    jeffdmjeffdm Posts: 12,953member
    Quote:

    Originally posted by vinea

    Currently the 20" is just too small. Given the real estate the 2x24"s take up on my desk the next upgrade is a single 30" which I have but am not using at the moment. I could go 1x30" + 1x21WS" in portrait mode but the iMac won't do portrait will it?



    That depends on what you mean. Unless Apple changed it, the iMac's built-in display can't be told to operate in rotated mode, and even then it would be awkward unless you had a custom mount - the iSight iMacs don't have an available VESA mount. It might work on the external display port, but I don't have an iSight iMac (or an iMac anymore), so I don't know.



    Quote:

    I don't really need expansion for the most part. I do need a decent vid card but decent...not top of the line. Just needs to drive the 30".



    Unfortunately, I think dual-link DVI is one feature Apple would try to reserve for the pro units, and I can understand that; it's definitely not something I'd expect from a consumer computer just yet.
  • Reply 232 of 565
    sunilramansunilraman Posts: 8,133member
    Quote:

    Originally posted by iMacfan

    But surely a 9800 is better than a 7900?!?!? ... Man, these stupid names make even the 'Model year' sound sensible... David




    GPUs have almost a cult-like following in the computer gaming and hardware enthusiast scene. Placebo was being a smartass; it's useful to ignore him sometimes



    GPU 101:



    1. Intel integrated graphics is fine for almost any task except playing 3D games, regardless of whatever marketing terms and numbers Intel attaches to it. Puzzle and Flash-type games are no problem, but forget about any current, year-old, or upcoming 3D games, Windows or Mac.



    2. nVidia GPUs. These are quite easy to understand. There is the 6 series (http://www.nvidia.com/page/geforce6.html) and the 7 series (http://www.nvidia.com/page/geforce7.html); the 7 series is newer. Within each series, higher-numbered models are generally better: e.g., in the 6 series a 6600 is better than a 6150. Further reading is required for the letters after the numbers: a 6600GT is better than a 6600, which is better than a 6600LE. The 7 series is similar: a 7600 is better than a 7300, and a 7950 is better than a 7900, which is better than the 7600 and 7300. Making sense so far? Again, the letter designations matter too: the 7900GTX is better than the 7900GT. (See the code sketch after this list for these heuristics in a nutshell.)



    3. nVidia GPUs, overlap between the 6 series and 7 series: note that at some level the high-end 6 series can outperform the low-end 7 series. This requires further reading. Personally, for example, I think the nVidia 7300 is a total waste of money - it has some newer "features" but is not fast enough to make the most of them. AFAIK you are better off with a 6600GT.



    4. Okay, ATI GPUs. The old designations made sense in that higher numbers beat lower numbers: a Radeon 9800 is better than a 9600, which is better than the now hopelessly outdated 9200. Those old designations are all outdated now anyway; don't even worry about them. ATI has moved on to the X1000-series designations, which work similarly: the X1900 is better than the X1800, which is better than the X1600, which is better than the X1300. Within a given number, further reading is required; from the top down, the X1900 comes as X1900 XTX, X1900 XT, and X1900 GT.



    5. ATI vs. nVidia. This is an area that DEFINITELY requires further reading to make sense of. For example, the ATI X1600 comes in around the nVidia 6600/6800 range. At the top end, the nVidia 7900 battles it out with the ATI X1900.



    6. GPU clocking. Clock speeds also matter within a given range of cards. The iMac and MacBook Pro sport an ATI X1600 and X1600 Mobility; these run at slower clock speeds than ATI's factory settings, which shows up as poorer scores in 3DMark05, a highly regarded GPU benchmarking tool:

    [3DMark05 comparison chart originally embedded here]

    7. Various sites such as tomshardware.com, anandtech.com, and GPU enthusiast sites and forums have further info on a whole range of benchmarks for different games, and on the visual quality you get with different cards. For example, one newer feature is HDR (High Dynamic Range) lighting, which is commonly discussed on GPU forums these days.



    8. What Apple has chosen. The Mac mini and MacBook have Intel integrated graphics. The iMac and MacBook Pro have the ATI X1600 and X1600 Mobility, as noted above. The Power Mac G5 comes with the nVidia 6600LE or 6600, which, ironically, perform roughly on par with the X1600 in the iMac and MacBook Pro, depending on how you adjust the X1600's clock speed when running Windows via Boot Camp. Thankfully, the Power Mac G5 also has the option of an nVidia 7800GT, which is a pretty decent card: roughly 1.8x faster than the iMac Core Duo's ATI X1600 as Apple clocks it, or about 1.5x faster if the X1600 is run at factory clocks under Boot Camp. These estimates are based on 3DMark05.



    9. nVidia Quadro. These are not so much for gaming; they are targeted at people who do 3D graphics work.
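
    To put points 2-4 in a nutshell, here is a rough Swift sketch of the naming heuristics as a toy comparator. The tier ordering and the types are mine and purely illustrative: real rankings need benchmarks (point 5), and cross-series or cross-vendor comparisons aren't handled.

        // Suffix tiers per point 2: GTX > GT > (no suffix) > GS > LE.
        enum Suffix: Int, Comparable {
            case le, gs, plain, gt, gtx
            static func < (a: Suffix, b: Suffix) -> Bool { a.rawValue < b.rawValue }
        }

        struct NvidiaGPU {
            let model: Int     // e.g. 6600, 7900
            let suffix: Suffix

            // Within one series a higher model number wins; same model, the
            // suffix decides. Across series the heuristic breaks down (point 3),
            // so we return nil and punt to benchmarks.
            func roughlyFaster(than other: NvidiaGPU) -> Bool? {
                guard model / 1000 == other.model / 1000 else { return nil }
                if model != other.model { return model > other.model }
                return suffix > other.suffix
            }
        }

        let gtx7900 = NvidiaGPU(model: 7900, suffix: .gtx)
        let gt7600  = NvidiaGPU(model: 7600, suffix: .gt)
        if let faster = gtx7900.roughlyFaster(than: gt7600) {
            print(faster ? "7900GTX wins" : "7600GT wins")   // prints "7900GTX wins"
        } else {
            print("different series; check benchmarks")
        }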
  • Reply 233 of 565
    sunilramansunilraman Posts: 8,133member
    In our discussion of leveraging the GPU to streamline how the GPU and CPU share work, it is interesting to consider nVidia's Gelato:

    http://www.nvidia.com/page/gelato.html



    Check out this render made with Gelato enabled:

    [Gelato sample render originally embedded here]

    "NVIDIA® Gelato® rendering software brings high-quality rendering to the masses. A hardware-accelerated, non-real-time renderer, Gelato allows you to use your NVIDIA GPU to create stunning images fast."



    "Gelato is a software program that leverages the NVIDIA GPU as a floating point math processor. This allows Gelato to render images faster than comparable renderers, but without the quality limitations traditionally associated with real-time graphics processing on the GPU."



    It is interesting that the system requirements go as low as an nVidia GeForce 5200, all the way up to a Quadro.



    System Requirements

    * NVIDIA GPU, one of the following:
      * NVIDIA Quadro FX
      * NVIDIA GeForce® 5200 or higher
    * Microsoft® Windows® XP or Linux 2.4 kernel or later
    * 1 GB RAM (recommended)



    ......................................



    In the Mac world, I think it is interesting how Core Image leverages the GPU, in this case via pixel shaders. The promise of nVidia's Gelato suggests that quite interesting results can come from really maximising the GPU and then "handing off" to the CPU for clean-up and further accuracy.



    For developers working with Core Image and Image Units in their applications, it will be interesting to see how much accuracy they can get via GPU-driven pixel shaders and how that work is "handed off" to the CPU in the best possible way for final renders, both at the 2D and 3D levels.
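
    For the curious, here is a minimal present-day Swift sketch of that Core Image pattern (Swift obviously postdates this thread; the shape of the API is the point). The blur runs as a GPU pixel shader, and rendering to a CGImage is where the result is "handed off" back to the CPU. The input path is a placeholder.

        import CoreImage
        import Foundation

        let url = URL(fileURLWithPath: "input.png")   // placeholder path
        guard let input = CIImage(contentsOf: url) else { fatalError("couldn't load image") }

        let blur = CIFilter(name: "CIGaussianBlur")!  // built-in filter, like an Image Unit
        blur.setValue(input, forKey: kCIInputImageKey)
        blur.setValue(8.0, forKey: kCIInputRadiusKey) // like dragging the blur slider

        let context = CIContext()                     // GPU-backed by default
        if let output = blur.outputImage,
           let rendered = context.createCGImage(output, from: input.extent) {
            // createCGImage forces the render and reads the GPU result back.
            print("rendered \(rendered.width)x\(rendered.height): GPU shader, CPU hand-off")
        }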
  • Reply 234 of 565
    sunilramansunilraman Posts: 8,133member
    Finally, let's review the nVidia cards out there as compared to the iMac Core Duo's ATI X1600 as clocked by Apple. Estimates based on 3DMark05:



    7900GTX: 2.1x faster

    7900GT: 1.9x faster

    7800GT: 1.8x faster

    7600GT: 1.8x faster

    7600GS: 1.1x faster

    6600GT: 0.9x faster



    This leads me to believe that Mac Pros will ship with the 7600GS on the low end, the 7600GT on the mid and high end, and the 7900GTX and Quadros as BTO options.
  • Reply 235 of 565
    onlookeronlooker Posts: 5,252member
    Quote:

    Originally posted by sunilraman

    Finally, let's review the nVidia cards out there as compared to the iMac Core Duo's ATI X1600 as clocked by Apple. Estimates based on 3DMark05:



    7900GTX: 2.1x faster

    7900GT: 1.9x faster

    7800GT: 1.8x faster

    7600GT: 1.8x faster

    7600GS: 1.1x faster

    6600GT: 0.9x faster



    This leads me to believe that Mac Pros will ship with the 7600GS on the low end, the 7600GT on the mid and high end, and the 7900GTX and Quadros as BTO options.




    They usually don't offer that many, but I think you're close. The two high-end ones should be there.
  • Reply 236 of 565
    ZachPruckowski
    Let's look at prices:



    $600: 7950GX2

    $500: 7900GTX

    $400: 7900GT

    $300: 7800GT

    $200: 7600GT

    $150: 7600GS



    I think Apple will be looking for a $200-300 card in a $2k-ish dual-Woodcrest, so I'm going to say the low end will come with a 7600GT.



    The mid and high ends could start with a 7800GT or a 7900GT, with one of the top two as a BTO option, plus a Quadro BTO.
  • Reply 237 of 565
    jeffdmjeffdm Posts: 12,953member
    Quote:

    Originally posted by ZachPruckowski

    Let's look at prices:



    $600: 7950GX2

    $500: 7900GTX

    $400: 7900GT

    $300: 7800GT

    $200: 7600GT

    $150: 7600GS



    I think Apple will be looking for a $200-300 card in a $2k-ish dual-Woodcrest, so I'm going to say the low end will come with a 7600GT.



    The mid and high ends could start with a 7800GT or a 7900GT, with one of the top two as a BTO option, plus a Quadro BTO.




    Given sunilraman's numbers and yours, it looks to me like anyone getting anything other than the 7600GT is basically signing up for the sucker tax. $200 more for roughly 6% faster, or $300 more for roughly 17% faster, is hard to justify.
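
    A quick back-of-envelope check of that, pairing Zach's newegg prices with sunilraman's 3DMark05 multipliers. Both inputs are this thread's estimates, not authoritative benchmarks; the sketch just does the division.

        import Foundation

        // (price in USD, 3DMark05 speedup vs the Apple-clocked X1600), per this thread.
        let cards: [(name: String, price: Double, speedup: Double)] = [
            ("7600GS",  150, 1.1),
            ("7600GT",  200, 1.8),
            ("7800GT",  300, 1.8),
            ("7900GT",  400, 1.9),
            ("7900GTX", 500, 2.1),
        ]

        let base = cards[1]  // 7600GT as the value baseline
        for card in cards where card.name != base.name {
            let extraCost = card.price - base.price
            let extraPerf = (card.speedup / base.speedup - 1) * 100
            print(String(format: "%@: $%.0f extra buys %+.0f%% performance",
                         card.name, extraCost, extraPerf))
        }
        // 7900GT: $200 extra buys +6%; 7900GTX: $300 extra buys +17%.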
  • Reply 238 of 565
    ZachPruckowski
    Quote:

    Originally posted by JeffDM

    Given sunilraman's numbers and yours, it looks to me like anyone getting anything other than the 7600GT is basically signing up for the sucker tax. $200 more for roughly 6% faster, or $300 more for roughly 17% faster, is hard to justify.



    Assuming my newegg-based numbers are right, people who buy the high-end cards do so because it's that extra 5-17% of performance that gets you the highest settings in games. A 7600GT is gonna get you the high-res stuff, but with the cool smoothing, anti-aliasing, whatever features off, and a lower framerate in the ultra-new games. Whereas a 7900GTX in theory gets you the high res, and the AA and the HDR and everything else, at a decent framerate. So the theory is that if you're gonna buy these $60 games and spend hours on them on a $1500-2000 system, you may as well have all the special effects turned up.
  • Reply 239 of 565
    jeffdmjeffdm Posts: 12,953member
    Quote:

    Originally posted by ZachPruckowski

    Assuming my newegg-based numbers are right, people who buy the high-end cards do so because it's that extra 5-17% of performance that gets you the highest settings in games. A 7600GT is gonna get you the high-res stuff, but with the cool smoothing, anti-aliasing, whatever features off, and a lower framerate in the ultra-new games. Whereas a 7900GTX in theory gets you the high res, and the AA and the HDR and everything else, at a decent framerate. So the theory is that if you're gonna buy these $60 games and spend hours on them on a $1500-2000 system, you may as well have all the special effects turned up.



    That doesn't make sense, though, unless the 7600GT isn't capable of those features.
  • Reply 240 of 565
    sunilramansunilraman Posts: 8,133member
    Well, to clarify: first, I took 3DMark05 numbers to get comparisons with the X1600.



    The 7900GT's and 7900GTX's advantage in 3DMark05 over, say, the 7600GT does get bigger the higher the resolution you go, e.g. from 1024x768 up to 1600x1200 or so.



    I took 3DMark05 numbers at 1024x768 with no anti-aliasing and no anisotropic filtering, as a simple baseline comparison.



    So yeah, the numbers I put out do look weird when you compare the 7900GT and 7900GTX against the 7600GT.



    The first thing is what you refer to as the "sucker tax". Yes, gaming enthusiasts are willing to put in the extra cash to get the latest and greatest. A 7600GT seems mid-range to them; if they have the cash, they want a 7900GT or 7900GTX to get the best available.



    Secondly, there are aspects beyond 3DMark05 to consider when actually playing games. Things like 4x anti-aliasing, 16x anisotropic filtering, and high dynamic range, all at smooth, playable frame rates, are more achievable on the 7900GT and 7900GTX. Think of it as greater "headroom" and higher overall visual quality in actual play, beyond what 3DMark05 shows.



    The 7600GT is capable of a lot of the latest features, but again, think of the 7900s as having greater "headroom": all these features at higher resolutions, running smoother.



    For example, my 6600GT runs Half-Life 2: Episode One smoothly with 2x anti-aliasing, 16x anisotropic filtering, and HDR enabled. Water reflections when shining the flashlight at things do start to slow it down a bit, though. Additionally, on the 6600GT it is well documented that anything higher than 2x anti-aliasing, i.e. 4x and above, cuts frame rates drastically, whereas a 6800 probably does not have this issue.



    Going back to 3DMark05 though, the 7950 GX2 offers some incredible value. By 3DMark05 scores it is about 3.5x faster than the iMac Core Duo's GPU as Apple clocks it. With two GPU cores on one card, you're getting the extra GPU core for only about $100 more than a 7900GTX.



    Yeah, if you do the math, one 7900-ish core is about 2x faster than the iMac Core Duo's GPU, so two 7900-ish cores should be about 4x faster. But of course, in 3DMark05 and in realistic game workloads, you're going to see the 7950 GX2 come in at 2.5x-3.5x faster than the iMac's GPU.
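
    As a sanity check on that scaling math, using only the thread's own estimates:

        // One 7900-ish core ≈ 2.1x the Apple-clocked X1600 (3DMark05, per above).
        let singleCore = 2.1
        let perfectScaling = 2 * singleCore        // 4.2x if two cores scaled perfectly
        let observedBestCase = 3.5                 // thread's upper estimate for the GX2
        let efficiency = observedBestCase / perfectScaling
        print("perfect \(perfectScaling)x, observed \(observedBestCase)x, " +
              "scaling efficiency ≈ \(Int(efficiency * 100))%")   // ≈ 83%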