Mac OS X 10.5.7 may have Nehalem, Radeon HD 4000 support

Comments

  • Reply 41 of 56
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by WickedRabbit View Post


    So I'm potentially about to become a new Mac customer, but because of the rumors everyone has been telling me to wait at least another month or two before purchasing a system. Primarily, I have my eyes on what the Mac Pros might be getting, because if these two rumors pan out (Core i7 and a Radeon HD 4800 series card), doesn't that also make the system FINALLY suitable for gaming (so long as you run Windows in Boot Camp, anyway)?



    If you want a Mac Pro and you can wait, then wait. The systems were last updated about a year ago. But don't expect i7; the new Mac Pros should be getting Nehalem-based Xeons. Perhaps overkill for most, but considerably more powerful than i7.
  • Reply 42 of 56
    outsider Posts: 6,008 member
    Quote:
    Originally Posted by WickedRabbit View Post


    So I'm potentially about to become a new Mac customer, but because of the rumors everyone has been telling me to wait at least another month or two before purchasing a system. Primarily, I have my eyes on what the Mac Pros might be getting, because if these two rumors pan out (Core i7 and a Radeon HD 4800 series card), doesn't that also make the system FINALLY suitable for gaming (so long as you run Windows in Boot Camp, anyway)?



    One thing that's always kept me away from Apple is their limited (or should I say ZERO?) interest in gaming. Granted, the Macs are "bad" at gaming, as the graphics are pretty budget for what you get and you can't really run anything at medium-to-max settings or at high resolution.



    I have zero interest in using a Mac Pro for what Apple probably intends, other than occasionally editing a few videos here and there for my website, but I was about to either buy a Mac or just build a new Core i7 PC and call it a day.



    People have told me that even though the current Mac Pros are pretty powerful, the CPU is not optimized or designed for gaming, and I can look at the graphics card and tell it's a budget choice for gaming. But a Core i7 and a 4800 (not the highest-end card on the market currently, but among the top) would make for a good system, unless Apple limits it somehow, right?



    You should build yourself a nice PC. Apple getting a better graphics card is not going to get developers to port games to the Mac. You're only going to get frustrated and pissed off.
  • Reply 43 of 56
    italiankid Posts: 279 member
    Quote:
    Originally Posted by BigE View Post


    I'm disappointed that, by all appearances, machines carrying the ATI 2600 will not be addressed by SL and Grand Central. I've got a perfectly good 24" Aluminum iMac with an ATI HD 2600 and would love to tap into Grand Central's power...



    And who is to blame for this? APPLE!



    They introduced these cards and are now screwing ALL the customers who bought an iMac in 2007/8 by not supporting them...



    Makes you wonder sometimes. I guess they are doing what MS did with VISTA! Good Job Apple.
  • Reply 44 of 56
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by italiankid View Post


    They introduced these cards and are now screwing ALL the customers who bought an iMac in 2007/8 by not supporting them...



    Huh? Updating HW is a bad thing?
  • Reply 45 of 56
    Quote:
    Originally Posted by Outsider View Post


    You should build yourself a nice PC. Apple getting a better graphics card is not going to get developers to port games to the Mac. You're only going to get frustrated and pissed off.



    Well I knew that devs weren't just going to immediately jump on board. In fact, that thought never crossed my mind. I just figured that at least I'd be able to play games in windows via bootcamp if they did actually put a nice graphics card in the machine.



    However, after doing some research, it turns out that Xeons - while über powerful - are not optimized for common desktop applications, and that C2Ds and quad cores are better suited for gaming than a Xeon. Can anyone shed some light on that?
  • Reply 46 of 56
    Quote:
    Originally Posted by solipsism View Post


    If you want a Mac Pro and you can wait, then wait. The systems were last updated about a year ago. But don't expect i7; the new Mac Pros should be getting Nehalem-based Xeons. Perhaps overkill for most, but considerably more powerful than i7.



    If the Pros will be sticking with Xeons, will any model be getting i7s? I'm hoping they're feeling a bit of heat from PC manufacturers offering i7 mid-range consumer desktops and may finally offer a mid-range tower again. Or perhaps they redesigned the iMac to handle the heat of the i7s. Can't imagine them pushing the Core 2 Duo again with their mid-range $1500-$2300 desktop line (iMac).
  • Reply 47 of 56
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by italiankid View Post


    And who is to blame for this? APPLE!



    No, it is a reality of the technology we use. If Apple implements quad-core CPUs with Hyper-Threading tomorrow, would you complain that the old machines only support two threads while the new ones have eight threads available? Of course not; you should be thankful that something in this world actually improves year to year.
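    As a quick aside, the "two threads versus eight threads" comparison above is just what the OS reports as logical processors. A minimal Python sketch to see that number on your own machine (the count obviously varies by hardware):

    ```python
    import os

    # Number of hardware threads ("logical processors") the OS scheduler sees.
    # On a quad-core CPU with Hyper-Threading this would report 8; on an older
    # dual-core without it, 2.
    logical_cpus = os.cpu_count()

    print(f"Logical processors visible to the scheduler: {logical_cpus}")
    ```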

    Quote:

    They introduced these cards and are now screwing ALL the customers who bought an iMac in 2007/8 by not supporting them...



    That statement is just plain ignorant.



    Besides that, there is no sign indicating that support has stopped. Rather, Apple has indicated that OpenCL is apparently not possible on them.

    Quote:

    Makes you wonder sometimes. I guess they are doing what MS did with VISTA! Good Job Apple.



    Actually, Apple is doing a good job. They are moving forward with very innovative technology that could impact computing for a decade or more. If Apple can pull this off well and get developer interest, it could turn the Mac into one impressive workstation.



    Now, this may not make much difference to a casual Mac/web user, but for many professionals it really means supercomputer performance, or at least what once passed for a supercomputer. Putting that much power into the average user's hands in an accessible way is a very good job indeed.







    Dave
  • Reply 48 of 56
    Quote:
    Originally Posted by WickedRabbit View Post


    Well I knew that devs weren't just going to immediately jump on board. In fact, that thought never crossed my mind. I just figured that at least I'd be able to play games in windows via bootcamp if they did actually put a nice graphics card in the machine.



    However, after doing some research, it turns out that Xeons - while über powerful - are not optimized for common desktop applications, and that C2Ds and quad cores are better suited for gaming than a Xeon. Can anyone shed some light on that?



    The processors aren't "optimized" for certain applications at the expense of others. In fact, Intel's desktop and server/workstation processors are identical, except that they use different sockets (no longer true with Nehalem) and the Xeons are dual- or quad-socket enabled.



    The Mac Pro now uses FB-DIMMs, a high-latency memory that hurts game performance a tiny bit, although it sounds like Nehalem Xeons will be able to use normal desktop DDR3.
  • Reply 49 of 56
    Quote:
    Originally Posted by winterspan View Post


    Yeah, I forgot they killed the 45nm dual-core + GPU and moved up its 32nm replacement. About the mainstream quad cores -- I can't imagine why they would do that! Like you said, if anything, Clarksfield would stand to benefit the most from a newer process.

    What do you think they'll do with the MB Pro, then? Surely they can't keep using Core 2 Duos until Arrandale? I hope they would re-engineer it to accept a C2Q or Clarksfield.



    Lynnfield and Clarksfield's introduction has been slow, and a Q4 2009 introduction with Sandy Bridge coming in H2 2010 means there isn't enough time to justify designing a 32nm shrink for the mainstream quad cores. The 45nm dual cores were already cancelled in favour of 32nm versions for this reason. But I guess Intel didn't do the same for Lynnfield and Clarksfield since they didn't want to delay further.



    Some quad-core Clarksfield will probably fit in the MacBook Pro, but it certainly wouldn't be at very high clock speeds.



    Quote:
    Originally Posted by winterspan View Post


    I've always wondered how that was going to be done. I assume the CPU can tell the kernel which threads fall on which core so it can negate that issue? Will this be a major modification to OS X?



    It's a difficult problem because the OS should be determining which threads go to which core. But the OS can't really tell which resources a thread needs, and whether it can double up with another thread on the same Hyper-Threaded core or needs a dedicated core of its own. The best way would probably be for developers to determine scheduling themselves, but there's no way they can define it for all the various CPU configurations and core counts. I believe part of the point of Grand Central and the LLVM just-in-time compiler is that the OS gains a better ability to organize code at run time based on the real-world CPU configuration it's running on.
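    The programming model being described - the developer expresses work as many small independent tasks and lets the runtime map them onto whatever cores exist - can be sketched with an ordinary task pool. This is a hedged illustration in Python, standing in for Grand Central (which is a C-level Apple API), not Apple's actual implementation:

    ```python
    import os
    from concurrent.futures import ThreadPoolExecutor

    def work(n):
        """A small, independent unit of work (here: a sum of squares)."""
        return sum(i * i for i in range(n))

    def run_tasks(sizes):
        # The developer only describes the tasks; the pool (standing in for
        # the OS/runtime) decides when and on which worker each one runs.
        # Note that Python threads share the GIL, so this illustrates the
        # queueing model rather than a true parallel speedup.
        with ThreadPoolExecutor(max_workers=os.cpu_count() or 1) as pool:
            return list(pool.map(work, sizes))

    if __name__ == "__main__":
        print(run_tasks([10, 100, 1000]))
    ```

    The key design point is that the task count is independent of the core count, so the same program scales from a dual-core to an eight-thread machine without the developer hard-coding a configuration.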



    Quote:
    Originally Posted by Maury Markowitz View Post


    I'm curious if you might point me to a non-geekish comparison of my current 8800 GT vs. the HD4870? If this is the new mid-range card that will be offered BTO, and there's a version for my first-gen MacPro, I'd be interested to see if it's "worth it" in performance terms.



    Maury



    I don't really know what would count as non-geeky benchmarks.



    http://www.techreport.com/articles.x/15651/1



    You can look at the charts on Tech Report, though. They compare the GTX 260 Core 216 and the 9800GT. The 9800GT is just a rebranded 8800GT; the clocks and performance are the same. The GTX 260 Core 216 is definitely noticeably faster than the 9800GT/8800GT, 50% faster in many cases.



    Quote:
    Originally Posted by wizard69 View Post


    Besides that, there is no sign indicating that support has stopped. Rather, Apple has indicated that OpenCL is apparently not possible on them.



    Can someone point me to official statements that Apple isn't supporting the HD2xxx series with OpenCL? ATI already supports GPGPU operation on the HD2xxx series through their Stream SDK and proprietary Brook+ language. The hardware interface is there, and ATI just needs to provide a translator to connect the OpenCL language to that hardware interface. ATI's driver support on OS X has been better than nVidia's as of late, so ATI may well be interested in doing an OpenCL driver for OS X, even if Apple isn't actively pushing it themselves.
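    For anyone unclear on what such a "translator" would target: in OpenCL the developer writes a tiny kernel in OpenCL's C dialect, and the vendor's driver compiles it for whatever GPU (or CPU) is present at run time. A rough sketch of the model - the kernel string is illustrative OpenCL C, and the Python function is just a CPU reference for the same element-wise operation, not GPU execution:

    ```python
    # Illustrative OpenCL C kernel: each work-item scales one array element.
    # A driver (the "translator" discussed above) would compile this string
    # for the actual hardware at run time.
    KERNEL_SOURCE = """
    __kernel void scale(__global const float *src,
                        __global float *dst,
                        const float factor)
    {
        int i = get_global_id(0);   /* this work-item's index */
        dst[i] = src[i] * factor;
    }
    """

    def scale_reference(src, factor):
        """CPU reference for the kernel above: one 'work-item' per element."""
        return [x * factor for x in src]

    if __name__ == "__main__":
        print(scale_reference([1.0, 2.0, 3.0], 2.0))
    ```

    Because the kernel is compiled by the driver rather than shipped as GPU machine code, supporting an older chip like the HD2xxx is largely a question of whether the vendor writes that compiler back-end for it.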
  • Reply 50 of 56
    Quote:
    Originally Posted by ltcommander.data View Post


    It's a difficult problem because the OS should be determining which threads go to which core. But the OS can't really tell which resources a thread needs, and whether it can double up with another thread on the same Hyper-Threaded core or needs a dedicated core of its own. The best way would probably be for developers to determine scheduling themselves, but there's no way they can define it for all the various CPU configurations and core counts. I believe part of the point of Grand Central and the LLVM just-in-time compiler is that the OS gains a better ability to organize code at run time based on the real-world CPU configuration it's running on.



    Yeah, I guess that makes sense. Perhaps that's one reason why the work on LLVM could be crucial to many-core scaling on new CPUs. With scheduling, I didn't mean that the CPU determines which (software) threads go on which core, but that the CPU can hopefully communicate to the kernel which of the 8 "logical processors" are actual cores versus just SMT "virtual processors".
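    That logical-versus-physical distinction is in fact exposed by the CPU's topology enumeration, and operating systems surface it; on Linux, for example, it shows up under sysfs. A hedged sketch that counts both, falling back gracefully where that Linux-specific interface doesn't exist:

    ```python
    import glob
    import os

    def core_topology():
        """Count logical processors and, where the OS exposes topology
        (Linux sysfs), the distinct physical cores behind them."""
        logical = os.cpu_count() or 1
        core_ids = set()
        # Linux-specific: each logical CPU reports which physical core it
        # belongs to; two SMT siblings share the same (package, core) pair.
        for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/core_id"):
            pkg_path = os.path.join(os.path.dirname(path), "physical_package_id")
            try:
                with open(path) as f:
                    core = f.read().strip()
                with open(pkg_path) as f:
                    pkg = f.read().strip()
                core_ids.add((pkg, core))
            except OSError:
                pass
        physical = len(core_ids) or logical  # fall back if sysfs is absent
        return logical, physical

    if __name__ == "__main__":
        logical, physical = core_topology()
        print(f"{logical} logical processors on {physical} physical cores")
    ```

    On a Hyper-Threaded Nehalem this would report twice as many logical processors as physical cores, which is exactly the information a scheduler needs to avoid stacking two hungry threads onto one core.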





    Quote:
    Originally Posted by ltcommander.data View Post


    Can someone point me to official statements that Apple isn't supporting the HD2xxx series with OpenCL? ATI already supports GPGPU operation on the HD2xxx series through their Stream SDK and proprietary Brook+ language.



    I agree here. Although the 2xxx series was crap, architecturally its GPU core is very similar to the 3xxx and 4xxx series, right? And given that it already supports the same GPGPU interface, there isn't a reason why it wouldn't work with OpenCL. The only real differences are shader count and memory interface/type.
  • Reply 51 of 56
    pb Posts: 4,255 member
    Quote:
    Originally Posted by poxonyou View Post


    If the Pros will be sticking with Xeons, will any model be getting i7s? I'm hoping they're feeling a bit of heat from PC manufacturers offering i7 mid-range consumer desktops and may finally offer a mid-range tower again. Or perhaps they redesigned the iMac to handle the heat of the i7s. Can't imagine them pushing the Core 2 Duo again with their mid-range $1500-$2300 desktop line (iMac).



    The problem here is that Apple already has a record of ignoring a significant part of the desktop Intel CPU lineup commonly used by other manufacturers.
  • Reply 52 of 56
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by PB View Post


    The problem here is that Apple already has a record of ignoring a significant part of the desktop Intel CPU lineup commonly used by other manufacturers.



    I wonder if Apple will address the issue this time around. The Xeons will cost more than the previous Xeons (though I'm told the overall cost for the system will still be around the same), and Apple will most likely use those new low-power desktop C2Qs in the iMac. This widens the gap even more than before, which will increase the forum complaining about an elusive xMac.
  • Reply 53 of 56
    I hope they include the Radeon 4870 as standard on the Mac Pro. Not that a 4850 would be a disaster; it's also a nice card. But I hope they include the newer 4870 version with 1GB of VRAM, which handles higher resolutions better.



    It's a great card. Here's hoping for BTO options for the X2 Radeon and the Nvidia 280 cards.



    And as for the 4850 and 4870, I hope to see them as options in the iMac. But they seem to run hotter than the Nvidia equivalents...



    March 24th. I guess we'll see then. It's looking like they'll do a clean-sweep update of the desktop line then.



    Bated breath. I hope they introduce a mid-tower desktop as a 'surprise'.



    Lemon Bon Bon.
  • Reply 54 of 56
    outsider Posts: 6,008 member
    In related news, ATI to slash Radeon 4800 prices.



    Quote:
    Originally Posted by Fudzilla on March 2, 2009


    4870 goes down to $149, 4850 to $129



    ATI is cutting Radeon HD4800 series pricing in an effort to put together a welcome party for Nvidia's soon-to-launch GTS 250 rebrand.



    According to several sites, the HD4870 will get a $50 price cut, and should retail in the US for $149, while the HD4850 will drop to $129. The cuts are expected to kick in this week, and at these prices, Nvidia's new cards will face an uphill struggle against ATI's cheap mid-range.



    However, there's still no word on price cuts for ATI's dual-GPU HD4800 series cards, the HD4870 X2 and HD4850 X2. As we reported earlier, we are expecting them to come down as well, but we are not sure if it will happen this week. Also, there's no word on a price cut for the HD4830, but it is already dirt cheap, and it will soon be replaced by the RV740 anyway.



    We will keep our eyes open for any changes in the US and EU markets.



    More here.



  • Reply 55 of 56
    Quote:
    Originally Posted by Outsider View Post


    In related news, ATI to slash Radeon 4800 prices.



    Another ATI rumor is that they'll launch their first 40nm GPU this month or next. This is probably off-topic for Apple news, but based on the leaked info, this card will be fantastic: performance between the Radeon 4830 and 4850 for less than $100.
  • Reply 56 of 56
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by ltcommander.data View Post


    Can someone point me to official statements that Apple isn't supporting the HD2xxx series with OpenCL? ATI already supports GPGPU operation on the HD2xxx series through their Stream SDK and proprietary Brook+ language. The hardware interface is there, and ATI just needs to provide a translator to connect the OpenCL language to that hardware interface. ATI's driver support on OS X has been better than nVidia's as of late, so ATI may well be interested in doing an OpenCL driver for OS X, even if Apple isn't actively pushing it themselves.



    Apple does have a document, or at least I think they did, that lists OpenCL-compatible GPUs. Unfortunately I go to work in a few minutes, so I won't be able to dig it up. Of course, I could have the GPU chipset confused with older models, as I don't have the doc in front of me right now.



    Even that doesn't really mean anything, as the document would be very old and could simply reflect what they had developed against at that point. I do remember concern being raised at the time, though. In any event, I do believe that OpenCL will be more about future hardware than hardware from the past. Oh, by the way, that doesn't mean I don't want OpenCL support and video acceleration for my old MBP; it's just that I would expect future systems to better leverage the technology.





    Dave