Briefly: Apple UK blunder hints at Mac Pro update


Comments

  • Reply 121 of 155
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by Marvin View Post


    Sure but I'm not talking about serious gaming - even the X1600 can't really do serious gaming. I just mean that Apple's lowest end shouldn't have such a huge difference in performance compared to the next level up because that doesn't give people any choice. The X1300 will play Half-Life 2 at maximum settings with no AA. The GMA can't do that.



3DMark06

    X1600 = 1800

    X1400 = 900

    X1300 = 560

    GMA 950 = 170



So the jump from the Mac Mini to the iMac with X1600 gives you an order-of-magnitude jump in graphics performance. I would be content if they used X1400s, because that's only a factor of 2. More than a factor of 10 is taking the piss.



    According to this forum, the X3000 should definitely be a good solution:



    http://forums.vr-zone.com/archive/in.../t-101174.html



But their benchmarks are coming out at around 750 in 3DMark05 for the X3000, while the X1300 gets 1000, so the X3000 is roughly double the GMA. When the drivers are better optimized and we get it in the Mac machines, I'm sure it will perform a bit better. Behold the future spec of the Mini:



    http://www.newegg.com/Product/Produc...82E16883227007



    As always, already available in a PC.



You'll notice it has a TV tuner and two optical drives, so that would equal a Mini + Elgato + external HD, as I'm sure you can replace one of the optical drives with an HD. And they are nice, quiet, mini-CD compatible tray-loading drives.



    I think you have made some interesting points.



Could it be that even for not-so-serious gaming with 2007 and 2008 titles, we're gonna need a GPU that can push at least 1500 in 3DMark06?* I'm trying to keep a more open mind because I would consider myself a mid-range-mainstream but very casual (1 hour per day?) gamer, though "aesthetically biased" with regard to the minimum "visual experience" one should have at 1024x768 or 1280x960 gaming resolutions. Biased, for example, in that I love the Source ("HDR" in HL2 + Episode 1, etc.), LithTech (FEAR), and NFS: Most Wanted graphics engines... Doom 3/Quake 4, I don't likey.



*I've got to go back and check my 3DMark06 score for my nVidia 6600GT with 128MB VRAM.
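As a quick sanity check on the "order of magnitude" claim, here is a small sketch using the approximate 3DMark06 scores quoted in Marvin's post (the exact figures will vary from system to system):

```python
# Approximate 3DMark06 scores quoted earlier in the thread.
scores = {"X1600": 1800, "X1400": 900, "X1300": 560, "GMA 950": 170}

# Gap between the iMac's X1600 and the Mini's GMA 950.
mini_to_imac = scores["X1600"] / scores["GMA 950"]
print(f"GMA 950 -> X1600: {mini_to_imac:.1f}x")  # roughly a factor of 10

# If the Mini had an X1400 instead, the gap would shrink to a factor of 2.
x1400_gap = scores["X1600"] / scores["X1400"]
print(f"X1400 -> X1600: {x1400_gap:.1f}x")
```

1800/170 comes out to about 10.6, which is the "more than a factor of 10" gap complained about above; 1800/900 is exactly 2.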
  • Reply 122 of 155
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by Marvin View Post


    ...............

3DMark06

    X1600 = 1800

    X1400 = 900

    X1300 = 560

    GMA 950 = 170

    ..................



Yeah, I pulled about 1826 in 3DMark06 for a 6600GT with 128MB VRAM, on an AMD64 single-core at 2.18GHz with 1GB RAM.





    Edit:

So yeah... a 3DMark06 score of 1800 would give you 1280x960-ish resolution, 2xAA, 8x or 16xAF, and "HDR" graphics on close-to-max settings for quite a number of games. HL2: Episode 1: yeah, you can max it out, but keep AA to 2x only; 4x is nice but sluggish. FEAR and FEAR: Extraction Point: 2xAA, 4x or 16xAF, most settings maxed, but textures on medium for 128MB VRAM cards and all shadows turned off (some shadows are okay but don't look so nice without "soft shadows" on, which kills framerates). Need For Speed: Most Wanted: 2x-ish AA, 8x or 16xAF, "overbright" "visual treatment", medium to max "model detail".



So that's my bias: 1800 in 3DMark06 is just nice, which dovetails smoothly into the X1600 [Mobility]; nonetheless just right for 2005-2008 titles at just about what I would describe as the "6600GT" standard of visual experience in 3D gaming. Looking briefly at your links on the X3000: yeah, kinda passable, but as they said in the linked thread, Unreal Tournament 2007 will eat it alive.



    Give me at least 1500 3DMark06's or Give ME DEATH! muah haha ah hahhahah
  • Reply 123 of 155
nvidia2008 Posts: 9,262 member
I think Apple could put X3000 integrated graphics into the Mac Mini and MacBooks, and a number of people would complain that it "can't really play games" or that "teh graphix sucks" or "is sooo lagggy...".



    If you'll excuse me now, I've got a 6600GT to run at 10% hard OC and redline my AMD64, 200+ miles an hour (NeedForSpeed:MW)... Aww yeah. Peace Out.
  • Reply 124 of 155
melgross Posts: 33,510 member
    Quote:
    Originally Posted by nvidia2008 View Post


I think Apple could put X3000 integrated graphics into the Mac Mini and MacBooks, and a number of people would complain that it "can't really play games" or that "teh graphix sucks" or "is sooo lagggy...".



    If you'll excuse me now, I've got a 6600GT to run at 10% hard OC and redline my AMD64, 200+ miles an hour (NeedForSpeed:MW)... Aww yeah. Peace Out.



By relying on the 9xx chipsets, Apple is trying to avoid having people say, "Since they put a separate GPU in, why didn't they put a ---- in instead of what they used?"



    This just isn't a gaming machine.
  • Reply 125 of 155
jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by melgross View Post


By relying on the 9xx chipsets, Apple is trying to avoid having people say, "Since they put a separate GPU in, why didn't they put a ---- in instead of what they used?"



    This just isn't a gaming machine.



I believe that the X3000 is being built into the next mobile chipset, coming in the next few months; it's just a matter of when Apple updates its notebooks.
  • Reply 126 of 155
melgross Posts: 33,510 member
    Quote:
    Originally Posted by JeffDM View Post


I believe that the X3000 is being built into the next mobile chipset, coming in the next few months; it's just a matter of when Apple updates its notebooks.



Whatever is being built into the chipset is fine. It's the same concept as the 950 or the new 965. So it's the same thing, not a separate part.
  • Reply 127 of 155
    Quote:
    Originally Posted by melgross View Post


By relying on the 9xx chipsets, Apple is trying to avoid having people say, "Since they put a separate GPU in, why didn't they put a ---- in instead of what they used?"



    This just isn't a gaming machine.



    But then aging there is not an mid-range head less desktop
  • Reply 128 of 155
melgross Posts: 33,510 member
    Quote:
    Originally Posted by Joe_the_dragon View Post


    But then aging there is not mid-range head less desktop



    I'm sorry Joe, but I didn't quite understand that.
  • Reply 129 of 155
applebook Posts: 350 member
I'm sorry, but Apple has never been a true gaming platform and probably never will be.



    The X1600 in the iMac and MBP is GARBAGE. Sure, it can run Doom III, but that game is like, what, 3 years old? Cutting edge is FEAR and Oblivion.



The disappointing thing about the MacBook for me is that I use video spanning, and there is some lag with Exposé; this just shouldn't happen on a laptop with a rather fast CPU.



    Bring on the X3000, but by the time that it's released and optimized, nVidia will roll out its new mid-range DX10 cards (ATi as well), and the X3000 will then be considered just as poor as the 950 was a year ago.
  • Reply 130 of 155
jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by applebook View Post


The disappointing thing about the MacBook for me is that I use video spanning, and there is some lag with Exposé; this just shouldn't happen on a laptop with a rather fast CPU.



    I don't think it is the chip - the ATI card in the Mac Pro can show some lag for Exposé too.
  • Reply 131 of 155
    Quote:
    Originally Posted by applebook View Post


    The X1600 in the iMac and MBP is GARBAGE. Sure, it can run Doom III, but that game is like, what, 3 years old? Cutting edge is FEAR and Oblivion.



    Get your hand out of the candy jar. What part of midrange don't you understand?



MacBook Pros aren't pizza ovens.
  • Reply 132 of 155
applebook Posts: 350 member
    Quote:
    Originally Posted by gregmightdothat View Post


    Get your hand out of the candy jar. What part of midrange don't you understand?



MacBook Pros aren't pizza ovens.



Mid-range would be something like the $150 7900 GS. The X1600 was always a low-end card and has been discontinued for quite a while on the PC side.



The interesting thing is that Apple continues to use ATi when the Radeon X1000 series runs significantly slower and hotter than the GeForce 7000 cards. Good for Apple.



    I don't have a huge problem with the MacBooks and Minis' 950 because almost all PC laptops in the $1000 range and virtually all small PCs use the same chip.



    A $1500+ PC laptop with the X1600 is one year ago. Today, it's plain
  • Reply 133 of 155
    Quote:
    Originally Posted by applebook View Post


Mid-range would be something like the $150 7900 GS. The X1600 was always a low-end card and has been discontinued for quite a while on the PC side.



    No, the 7900 is last generation's high-end card.



    For both NVIDIA and ATI cards, __300 is low end, __600 is mid-range, and __800+ is high end.



    The difference is important. Even though the 7900 is slower by today's high-end standards, it still runs incredibly hot—too hot to be used in a 1" laptop.



    Quote:

The interesting thing is that Apple continues to use ATi when the Radeon X1000 series runs significantly slower and hotter than the GeForce 7000 cards. Good for Apple.



    You seem to confuse series with models. A high-end card is never going to be cooler than a mid-range card. Besides, I don't know where you got that statistic from, but NVIDIA chips have always been way hotter and consumed more power than ATI.



    Quote:

    I don't have a huge problem with the MacBooks and Minis' 950 because almost all PC laptops in the $1000 range and virtually all small PCs use the same chip.



    A $1500+ PC laptop with the X1600 is one year ago. Today, it's plain



    Funny, looking at a $1600 Dell Inspiron, it only goes up to a NVIDIA Quadro 350 Mobile, which is less than half of the speed of an X1600.



An HP Pavilion can only go up to a GeForce Go 7600, which is still a dip below the X1600.



    So, the fastest shipping graphics card of any mainstream laptop is a no, huh? Or were you holding out for a $4,000 Alienware monstrosity?
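The __300 / __600 / __800 naming rule of thumb described in this post can be written as a toy heuristic. This is just a sketch of the convention as stated in the thread, not an official taxonomy; it reduces a model name like X1600 or 7900 to its last three digits:

```python
def gpu_tier(model_number: int) -> str:
    """Rough tier from the era's naming convention: the hundreds digit
    of the model number signals low (__300), mid (__600), or high (__800+)."""
    hundreds = model_number % 1000  # X1600 -> 600, 7900 -> 900
    if hundreds >= 800:
        return "high end"
    if hundreds >= 500:
        return "mid-range"
    return "low end"

# Examples from the thread:
print(gpu_tier(1300))  # X1300 -> low end
print(gpu_tier(1600))  # X1600 -> mid-range
print(gpu_tier(7900))  # 7900  -> high end
```

The cutoffs at 500 and 800 are assumptions chosen so that __300 lands in "low end", __600 (and the X1650) in "mid-range", and __800/__900 in "high end".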
  • Reply 134 of 155
    Quote:
    Originally Posted by gregmightdothat View Post


    No, the 7900 is last generation's high-end card.



    For both NVIDIA and ATI cards, __300 is low end, __600 is mid-range, and __800+ is high end.



The difference is important. Even though the 7900 is slower by today's high-end standards, it still runs incredibly hot—too hot to be used in a 1" laptop.



    You seem to confuse series with models. A high-end card is never going to be cooler than a mid-range card. Besides, I don't know where you got that statistic from, but NVIDIA chips have always been way hotter and consumed more power than ATI.







    The pathetically inept X1650 XT runs hotter than the 7900 GS:







    "A high-end card is never going to be cooler than a mid-range card."



False, refer to the chart: the high-end X1950 GT is much cooler than the X1650, though I maintain that it's a low-end card.



    If you want to look only at series as the indicator of range, then my point about the X1600 is even more relevant, seeing as it was a lower-mid (at best) mainstream card that has been discontinued. Why does Apple continue to supply this, and how can any Mac buyer justify this?



    Quote:

    Funny, looking at a $1600 Dell Inspiron, it only goes up to a NVIDIA Quadro 350 Mobile, which is less than half of the speed of an X1600.



An HP Pavilion can only go up to a GeForce Go 7600, which is still a dip below the X1600.



    The 7600 is not slower than the X1600. A simple search would verify this: http://www.notebookcheck.net/Mobile-...ist.844.0.html



    Quote:

    So, the fastest shipping graphics card of any mainstream laptop is a no, huh? Or were you holding out for a $4,000 Alienware monstrosity?



    Speaking of Alienware, check this out:



[1] Area-51® m5790

Processor: Intel® Core™ 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB

Display: 17" WideUXGA 1920 x 1200 LCD - Saucer Silver

Motherboard: Alienware® Intel® 945PM + ICH7 Chipset

Memory: 1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB

System Drive: 160GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache

8x Dual Layer CD-RW/DVD±RW w/ Nero Software

Video/Graphics Card: 256MB ATI Mobility™ Radeon® X1800

Sound Card: Intel® 7.1 High-Definition Audio

Free Alienware® T-Shirt - Black

SubTotal: $1,939.00


15.4" WideUXGA 1920 x 1200 LCD

Intel® Core™ 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB

1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB

120GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache

Intel High Definition 7.1 Audio

8x Dual Layer CD-RW/DVD±RW w/ Nero Software

256MB NVidia® GeForce™ Go 7600

$1,894.00



    At the end of the day, these are Windoze laptops, so I wouldn't even touch them, but there's no doubt that Windoze PCs are generally a step ahead in terms of GPU, display, and other hardware.



Most $1000 notebooks come with 100GB+ hard drives and 1GB RAM as well.



  • Reply 135 of 155
melgross Posts: 33,510 member
    Quote:
    Originally Posted by applebook View Post






    The pathetically inept X1650 XT runs hotter than the 7900 GS:







    "A high-end card is never going to be cooler than a mid-range card."



False, refer to the chart: the high-end X1950 GT is much cooler than the X1650, though I maintain that it's a low-end card.



    If you want to look only at series as the indicator of range, then my point about the X1600 is even more relevant, seeing as it was a lower-mid (at best) mainstream card that has been discontinued. Why does Apple continue to supply this, and how can any Mac buyer justify this?







    The 7600 is not slower than the X1600. A simple search would verify this: http://www.notebookcheck.net/Mobile-...ist.844.0.html







    Speaking of Alienware, check this out:



[1] Area-51® m5790

Processor: Intel® Core™ 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB

Display: 17" WideUXGA 1920 x 1200 LCD - Saucer Silver

Motherboard: Alienware® Intel® 945PM + ICH7 Chipset

Memory: 1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB

System Drive: 160GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache

8x Dual Layer CD-RW/DVD±RW w/ Nero Software

Video/Graphics Card: 256MB ATI Mobility™ Radeon® X1800

Sound Card: Intel® 7.1 High-Definition Audio

Free Alienware® T-Shirt - Black

SubTotal: $1,939.00


15.4" WideUXGA 1920 x 1200 LCD

Intel® Core™ 2 Duo Processor T7400 2.16GHz 4MB Cache 667MHz FSB

1GB Dual Channel DDR2 SO-DIMM at 667MHz - 2 x 512MB

120GB Serial ATA 1.5Gb/s 5,400 RPM w/ 8MB Cache

Intel High Definition 7.1 Audio

8x Dual Layer CD-RW/DVD±RW w/ Nero Software

256MB NVidia® GeForce™ Go 7600

$1,894.00



    At the end of the day, these are Windoze laptops, so I wouldn't even touch them, but there's no doubt that Windoze PCs are generally a step ahead in terms of GPU, display, and other hardware.



Most $1000 notebooks come with 100GB+ hard drives and 1GB RAM as well.







    Unfortunately, Apple seems to have other concerns.



    Actually, Apple has always been slow to move to the latest systems. They may start out with a great system, but they upgrade too slowly. Other companies may start out behind, but they end up ahead.



    It's something we suffer with for the OS.



I'm hoping that will change, at least with the Mac Pro, and then, if we are lucky (very lucky), the concept will filter down to the other lines.
  • Reply 136 of 155
    Quote:
    Originally Posted by melgross View Post


    Unfortunately, Apple seems to have other concerns.



    They may start out with a great system, but they upgrade too slowly. Other companies may start out behind, but they end up ahead.



    Absolutely true when it comes to CPU and GPU trends. The G4 was years ahead of Intel and AMD when first released, but IBM lost all of its advantage within a few years. The same could be said about the G5, though its performance wasn't as big a step forward as the G4 was many years earlier.



    Notice now though that in terms of CPU, Apple is right there with PC vendors. The next step is not cheaping out on RAM and HD space for the lower and mid-range machines.



I think that the 7600GT (a very solid card) should be standard on the 20" and 24" iMacs and optional on the 17".
  • Reply 137 of 155
melgross Posts: 33,510 member
    Quote:
    Originally Posted by applebook View Post


    Absolutely true when it comes to CPU and GPU trends. The G4 was years ahead of Intel and AMD when first released, but IBM lost all of its advantage within a few years. The same could be said about the G5, though its performance wasn't as big a step forward as the G4 was many years earlier.



    Notice now though that in terms of CPU, Apple is right there with PC vendors. The next step is not cheaping out on RAM and HD space for the lower and mid-range machines.



I think that the 7600GT (a very solid card) should be standard on the 20" and 24" iMacs and optional on the 17".



One of the problems Apple has is that if sales of a product are making them happy, they see no need to quickly update it, and if sales are slow, they may not want to bother with it.



    They try to get as much mileage as they can from a configuration.
  • Reply 138 of 155
nvidia2008 Posts: 9,262 member
    And I thought *I* was getting all intense and PC-gamer-y on y'all asses talking about my poor old, now apparently "low end" 6600GT... ...I'm stepping off this hamster-treadmill discussion.



I will be trying out a bit of Ghost Recon and, ZOMFG, FEAR: Extraction Point. Speaking of FEAR: surprisingly, with a 3DMark06 score almost similar to an X1600's, I seem to have enjoyed it... along with HL2, HL2: Lost Coast, and HL2: Episode 1.



I'm not wasting money at this stage on an nVidia 7-series, and for frack's sake, nVidia's 8-series and ATI's next-gen stuff need to get down to 65nm. The heat and power are just ridiculous, and in any bloody case they have totally stalled on delivering A POWERFUL, AFFORDABLE MOBILE DISCRETE GPU. ...The X1600 Mobility, or a bit higher than that, is it for the first half of 2007, peoples; let's accept that and move on.
  • Reply 139 of 155
nvidia2008 Posts: 9,262 member
Feel free to call me an Apple apologist. ...I'd be happy to see a Go 7600 or Mobility X1800 in iMacs, MacBook Pros, and perhaps MacBooks, but meh. PC gaming is a weird and terribly cruel world when it comes to graphics. And yes, Apple is "behind" because they are milking the profits.
  • Reply 140 of 155
kukito Posts: 113 member
    Quote:
    Originally Posted by nvidia2008 View Post


    I'm not wasting money at this stage on a nVidia 7-series, and for frack's sake nVidia's 8 series and ATI's next gen stuff needs to get down to 65nm.



It's done: the R600 will be 65nm. The source for the story is the not-very-reliable Inquirer, but they're reporting from CeBIT in Hanover and they're also very close to AMD.



    http://www.theinquirer.net/default.aspx?article=38292



    Hopefully it's true.