Apple to use Intel's Sandy Bridge without Nvidia GPUs in new MacBooks


Comments

  • Reply 101 of 126
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by melgross View Post


    Neither are most PC users.



    At least based on percentages. There are far more people who don't have a clue as to what a PC is or how it does what it does. All they care about is the web and e-mail.



    Apple, on the other hand, attracts a lot of UNIX holdouts and others skilled in using the computer as a lever to advance their businesses.



    Dave
  • Reply 102 of 126
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by Marvin View Post


    Same here, it would be nice if SB can run ME3 sufficiently. One day I'm sure games will look like the good trailers too:



    http://www.youtube.com/watch?v=qZGFjBmD41Q



    I know that contradicts the near-term plateau I mentioned but it has to happen somewhere between 2014-2018 because they are reaching the fabrication limits for electronic components. They can make the die bigger/stacked perhaps, switch to optical transistors or find ways to ramp up the clock speeds without overheating the chips but they will run into issues.



    It'll be interesting to see what happens when we reach that point because regardless of demand, they might not be able to make better chips.



    Long term there will likely be a transition to another fabrication method. However, there is enough potential in the short term to give us significant improvements out to 2020. I've been seeing some really impressive numbers for the 28 nm node, like a 40% drop in static power and a 50% improvement in performance for a given power level. This is most interesting in the context of Apple's ARM-based hardware, in my mind; it is just delightful that we may see more than a 2X improvement in iPhone/iPad processors in the very near future, and 2X again a generation later.



    There is lots of interesting technology coming down the pipe to truly enable a new generation of portable devices. Just yesterday I was reading an interesting article in Photonics about a way to get rid of the color dyes in LCD panels. As nice as the current generation of LCDs is, the future is still very bright, with many competing paths to even better displays. Likewise I don't see many dead ends in the field of semiconductor fabrication; once lithography meets its limit we will likely transition to something different, like self-assembling systems built with proteins.

    Quote:

    I was thinking the same thing. When the GMA 950 came to replace the Radeon 9200 etc, Intel and partners were trying to claim that IGPs weren't like IGPs of the past, which turned out to be wrong. They were slow, incompatible, had poor feature support, and were far outperformed by the competition. This has happened with every GPU they've ever made.



    This is all true, but why do people think Intel will continue to screw up GPUs? After a while it has to be embarrassing to Intel, so eventually they will have to fix the issue. Indeed they will have to, or give up the market to AMD. The reality is the GPU is more important than the CPU these days.

    Quote:

    With CPUs, they certainly outperform the competition though so I guess that's something but they have a lot of smoke and mirrors like their vanishing act with Larrabee. So much hype and it didn't amount to what they said it would (when they said it would anyway).



    They made good moves with SSD but still under-delivered on sequential write. Light Peak will probably be good though.



    Actually I was under the impression that Intel's SSDs are well respected. No product can be expected to lead in every performance category every time.

    Quote:

    Web-based software certainly requires more capable hardware to be able to run Javascript and other content quickly enough so that will drive hardware forward.



    Well, that wasn't what I was thinking about, as that is today's technology. I was thinking more along the lines of intelligent agents and AIs, sort of like what was seen in that old video of the Apple Knowledge Navigator, or whatever that little tablet was called.

    Quote:

    They tend to double every 2 years so in 10 years, it's only 32x faster and the manufacturers will hit the manufacturing difficulties before then. It will fall short of current render farms:



    http://www.slashfilm.com/cool-stuff-...s-renderfarms/



    but it should be plenty for photoreal rendering in real-time. The important drive needs to be on results and needs both software and hardware to be implemented smartly, not just with speedups on both sides. That's one impressive part about SB putting very fast fixed function architecture in there. Once people decide on the best algorithms, they should just put them in silicon to make them thousands of times faster. At first it seems like general purpose solutions would be better as those setups are more flexible but post-production rendering and games and mathematical algorithms are being refined to a point where people know what algorithms to run, they just need it done quickly.



    As long as they don't exclude general purpose capability, I think that will end up being the best design.



    The problem with fixed functions is that you are locked in to what is current technology. A good example here is the video decoders seen in modern hardware. All is great until a new codec becomes the poster child for video. Then you are back to software decoding.
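
    As a concrete illustration of that trap, here is a tiny C sketch; the codec names and helper functions are invented for illustration and don't correspond to any real driver API:

        #include <stdbool.h>
        #include <stdio.h>

        typedef enum { CODEC_H264, CODEC_VP8, CODEC_NEWFANGLED } codec_t;

        /* Pretend capability query: the fixed-function block's codec list
         * was frozen the day the chip taped out. */
        static bool hw_decoder_supports(codec_t c) {
            return c == CODEC_H264;  /* only what was current at design time */
        }

        static void decode_in_hardware(codec_t c) { printf("hw decode %d\n", (int)c); }
        static void decode_in_software(codec_t c) { printf("sw decode %d\n", (int)c); }

        static void decode(codec_t c) {
            if (hw_decoder_supports(c))
                decode_in_hardware(c);   /* fast and low power */
            else
                decode_in_software(c);   /* the fallback described above */
        }

        int main(void) {
            decode(CODEC_H264);        /* hits the silicon */
            decode(CODEC_NEWFANGLED);  /* back to software decoding */
            return 0;
        }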



    As to all this special-purpose or fixed-function hardware, it does have me wondering what Apple is up to with the iPad processor. They have patented over the years a lot of ideas that can only be realized in hardware that we have yet to see. It would be very interesting to see such ideas put into silicon, thus I'm wondering if part of the reason for going to ARM was to have a platform upon which to realize some of their ideas in hardware.
  • Reply 103 of 126
    melgross Posts: 33,702 member
    Quote:
    Originally Posted by wizard69 View Post


    As to all this special-purpose or fixed-function hardware, it does have me wondering what Apple is up to with the iPad processor. They have patented over the years a lot of ideas that can only be realized in hardware that we have yet to see. It would be very interesting to see such ideas put into silicon, thus I'm wondering if part of the reason for going to ARM was to have a platform upon which to realize some of their ideas in hardware.



    This is a long, complex discussion I would love to get involved in, but it's just too much overall.



    But I would like to comment on this last. It seems interesting that Unix-based (Mac OS X and iOS) and Linux-based (Android being the most important one right now) systems can be made so much more efficient than Microsoft-designed systems. Even their mobile OSes, despite not being Windows based and being written with low-power CPUs in mind, seem to be much less efficient. Windows is a dog. It requires much more RAM and much more powerful CPUs, but doesn't perform well on netbooks either. Getting it to perform with an ARM chip might prove impossible.



    So Apple has chosen ARM, because it works very well there. Of course, OS X is designed to work best with at least two cores. I don't know how that relates to iOS, but if that's true there as well, we can expect some substantial improvements next year. Throw in a VR 740 or so, and things could really fly. With their own optimizations of the chips, battery life should remain good, as these two-core chips are already supposed to be equal in that respect to the older Cortex-A8 single core.



    My iPad is already as powerful as my 700 MHz G4 Audio in many respects, and I got a lot of work done with that. A two-core model with out-of-order execution will be a lot more powerful.



    Things are going well!
  • Reply 104 of 126
    backtomac Posts: 4,579 member
    Mel.



    It's clear you're posting from an iPad. The typos give it away.
  • Reply 105 of 126
    wizard69 Posts: 13,377 member
    There is no ending because technology is constantly changing.

    Quote:
    Originally Posted by melgross View Post


    This is a long, complex discussion I would love to get involved in, but it's just too much overall.



    But I would like to comment on this last. It seems interesting that Unix-based (Mac OS X and iOS) and Linux-based (Android being the most important one right now) systems can be made so much more efficient than Microsoft-designed systems. Even their mobile OSes, despite not being Windows based and being written with low-power CPUs in mind, seem to be much less efficient. Windows is a dog. It requires much more RAM and much more powerful CPUs, but doesn't perform well on netbooks either. Getting it to perform with an ARM chip might prove impossible.



    Considering UNIX and the like had its roots way back at the dawn of the minicomputer revolution, it shouldn't be a surprise that these systems have less of a demand on hardware. In Apple's case, though, they are not afraid to drop backwards compatibility over time. Microsoft, on the other hand, has until recently tried to maintain backwards compatibility going way back.



    I know lots of people complain about Apple dropping Carbon and support for other software, but the reality is it keeps Mac OS slim.

    Quote:



    So Apple has chosen ARM, because it works very well there



    Yes, of course, for the portable market ARM is your only choice. But that isn't exactly what I was getting at. Apple could just as easily purchase ARM SoCs from any number of suppliers. Instead they chose to roll their own. Even that isn't all that exciting, as lots of companies license ARM cores. Apple, on the other hand, has gone a step further from what I understand, with a much more involved license giving them access to the IP to morph the processor to their will.



    It is my understanding that only a very few companies go to the trouble of licensing an ARM core to this extent. The obvious thought (in my vivid imagination) is that they are about to customize the processor to an extent not seen lately.

    Quote:

    Of course, OS X is designed to work best with at least two cores. I don't know how that relates to iOS, but if that's true there as well, we can expect some substantial improvements next year.



    A lot of the infrastructure is already there to leverage more cores.

    Quote:

    Throw in a VR 740 or so, and things could really fly. With their own optimizations of the chips, battery life should remain good, as these two-core chips are already supposed to be equal in that respect to the older Cortex-A8 single core.



    The 28nm node is becoming very, very interesting. Some reports have indicated a 45% drop in static power and much faster circuitry. I'm not sure that Apple can field a 28nm SoC next year, but it is clear that there is a long way to go as far as dropping power while upping performance.

    Quote:



    My iPad is already as powerful as my 700 MHz G4 Audio in many respects, and I got a lot of work done with that. A two-core model with out-of-order execution will be a lot more powerful.



    Things are going well!



    As an old owner of a Mac Plus I understand 100% what you are saying, as my iPhone impresses the heck out of me every day. Functionally that iPhone is light years ahead of the Mac Plus and a bunch of other computers I've had over the years. In this case we are talking about a 3G, so it isn't even up to date performance-wise at all. It isn't too much to expect that iPad 2 will come close to effectively outperforming my current 2008 MBP. Maybe not in raw CPU performance, but it certainly has the potential to be a better machine than my MBP for many tasks.



    As a side note, I was watching the video of the RIM exec demoing a PlayBook earlier today. Some have dismissed RIM already, but I was left with the impression that they will have a very good product that will be leveraging dual-core processors and other advancements. The product demo was good enough to draw my interest in the PlayBook, which is more than any of the Android platforms have done. In any event the PlayBook highlights where all these low-power technologies are taking us.
  • Reply 106 of 126
    Quote:
    Originally Posted by wizard69 View Post


    More interesting will be how Apple implements the architecture of this new SoC.



    Will the GPU be an equal partner to the CPU?



    It pretty much is already.



    Quote:

    Will the GPU support threads or other partitioning?



    Not for a generation or two.



    Quote:

    Will they implement an on board cache or video RAM buffer?



    What makes you think there isn't some already?



    Quote:

    Lots of questions, but really this is what interests me about iPad 2: just what does Apple have up its sleeve with respect to the next iPad processor? Considering all the recent patents it could be a major advancement, or it could be another run-of-the-mill ARM SoC.



    Keep in mind that hardware development pipelines are years long. No telling when patents will materialize in product.



    Fortunately we know that iOS 4 is based on Snow Leopard and includes GCD and blocks, and therefore has many of the pieces required for OpenCL. The current GPU hardware is just not flexible enough; the next-generation Imagination parts are. And the next-generation ARM cores are multi-core capable (and faster individually).
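
    In case it helps to picture those pieces, here is a minimal sketch of GCD and blocks in plain C; this is my own illustration, not Apple sample code. On Mac OS X it builds with clang (clang -fblocks demo.c), since blocks are a clang extension and libdispatch ships with the system:

        #include <dispatch/dispatch.h>
        #include <stdio.h>

        int main(void) {
            dispatch_queue_t global =
                dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

            /* dispatch_apply fans iterations out across the available cores,
             * which is why extra cores help without apps hand-rolling
             * their own thread pools. */
            dispatch_apply(4, global, ^(size_t i) {
                printf("iteration %zu\n", i);
            });

            /* Blocks capture surrounding variables; __block allows mutation. */
            __block long sum = 0;
            dispatch_group_t group = dispatch_group_create();
            dispatch_group_async(group, global, ^{ sum += 42; });
            dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
            printf("sum = %ld\n", sum);

            dispatch_release(group);
            return 0;
        }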



    RIM might be bragging about how they'll really fly when dual core arrives (sounds like an excuse to me), but they had better be careful touting that one, because Apple has far more experience (and shipping code) with this parallelism stuff than they do.
  • Reply 107 of 126
    Quote:
    Originally Posted by AppleInsider View Post


    Future MacBooks set to arrive in 2011 will rely on Intel's forthcoming Sandy Bridge processor, which means Nvidia's graphics processors will not be included in at least some models 13 inches and under, according to a new report.



    Citing anonymous sources, Cnet on Thursday said that MacBook models with screen sizes of 13 inches and under will switch to Sandy Bridge-only graphics. Apple's larger, higher-end MacBooks, with screen sizes of 15 and 17 inches, will allegedly rely on GPUs from AMD.



    "Adoption of Sandy Bridge in popular small MacBook designs would constitute one of the strongest endorsements of Intel technology since Apple made the seminal transition from IBM-Motorola PowerPC chips to Intel back in 2005," the report said. "And a recognition that Intel's graphics technology, while maybe not the best, now offers the best price-performance for low-end MacBooks."



    Starting in 2010 with its Arrandale processors, Intel began building major northbridge components, including the memory controller, into its CPUs. The architectural changes in Arrandale, along with a lawsuit, forced Nvidia to halt the development of future chipsets.



    Previously, Apple has not typically relied on Intel's graphics solutions for its notebooks. This year, in its updated MacBook Pro line, Apple introduced a proprietary automated graphics switching solution that dynamically switches between Intel's integrated graphics processor and Nvidia's discrete graphics chip.



    For the new MacBook Air and 13-inch MacBook Pro, Apple relies on older Core 2 Duo processors because Nvidia is still capable of creating chipsets for use with those processors. But if Nvidia loses its legal battle with Intel, it would not be able to make chipsets for the current Core i series or the forthcoming Sandy Bridge line of processors.



    Nathan Brookwood, principal analyst at Insight64, told Cnet he believes that Apple's lower-end MacBooks are "sitting ducks" for AMD's Fusion technology, which combines the company's central processors and graphics processors. In April, AppleInsider reported that Apple and AMD were in advanced discussions to potentially adopt AMD processors in at least some of its MacBook line.



    Intel will formally unveil its Sandy Bridge processors at the Consumer Electronics Show on Jan. 5, 2011. The company's chief executive, Paul Otellini, has said that he is "more excited by Sandy Bridge" than any other product the company has launched in years.



    Is this Sandy Bridge processor only for graphics? Is it also a data processor? A little confused.
  • Reply 108 of 126
    Marvin Posts: 15,570 moderator
    Quote:
    Originally Posted by gerald apple View Post


    Is this Sandy Bridge processor only for graphics? Is it also a data processor? A little confused.



    Yes, it's both, like AMD's Fusion chips. It has some parts from a CPU and some parts from a GPU inside the same chip.
  • Reply 109 of 126
    nht Posts: 4,522 member
    Quote:
    Originally Posted by wizard69 View Post


    The 28nm node is becoming very, very interesting. Some reports have indicated a 45% drop in static power and much faster circuitry. I'm not sure that Apple can field a 28nm SoC next year, but it is clear that there is a long way to go as far as dropping power while upping performance.



    In the spirit of doubling down after making a rash prediction that might actually come true, I will predict that the 2011 A5 is a 32nm HKMG Cortex-A9 fabbed in Samsung's spanking-new 32nm HKMG fab.



    That's pushing things a little, but I can see Apple liking a process advantage over current-gen competitors using the 45nm Tegra 2 or OMAP4, to beat them over the head with far longer battery life and/or performance.
  • Reply 110 of 126
    backtomac Posts: 4,579 member
    Quote:
    Originally Posted by nht View Post


    In the spirit of doubling down after making a rash prediction that might actually come true, I will predict that the 2011 A5 is a 32nm HKMG Cortex-A9 fabbed in Samsung's spanking-new 32nm HKMG fab.



    That's pushing things a little, but I can see Apple liking a process advantage over current-gen competitors using the 45nm Tegra 2 or OMAP4, to beat them over the head with far longer battery life and/or performance.



    That would be tremendous if your prediction is correct. Battery life and performance would easily kill anything out there.
  • Reply 111 of 126
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by nht View Post


    In the spirit of doubling down after making a rash prediction that might actually come true, I will predict that the 2011 A5 is a 32nm HKMG Cortex-A9 fabbed in Samsung's spanking-new 32nm HKMG fab.



    It might be slightly expensive, but the die is reasonably small and thus might be practical.



    I don't expect huge power savings but rather a lot more capability. Well, that is what I want; Apple though has its own balancing act to perform. I could see them going extremely low power to trim battery size. I think that is a mistake though, as the iPad needs a performance boost.

    Quote:

    That's pushing things a little, but I can see Apple liking a process advantage over current-gen competitors using the 45nm Tegra 2 or OMAP4, to beat them over the head with far longer battery life and/or performance.



    Plus it would be pretty silly to have your own chip design team if they can't stay a step ahead of the competition.
  • Reply 112 of 126
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by Programmer View Post


    It pretty much is already.



    Well not exactly. GPUs don't share the same address space as the CPU.
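
    To make that concrete, here is a minimal host-side sketch against the standard OpenCL C API (assuming an OpenCL runtime and a GPU device are present): nothing is shared, so the host has to allocate a buffer in the device's address space and copy into and out of it explicitly.

        #include <stdio.h>
        #include <CL/cl.h>  /* <OpenCL/opencl.h> on Mac OS X */

        int main(void) {
            cl_platform_id platform;
            cl_device_id device;
            cl_int err = clGetPlatformIDs(1, &platform, NULL);
            err |= clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
            if (err != CL_SUCCESS) { fprintf(stderr, "no GPU device\n"); return 1; }

            cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
            cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);

            float host_buf[1024];
            for (int i = 0; i < 1024; i++) host_buf[i] = (float)i;

            /* Allocate memory in the GPU's own address space... */
            cl_mem dev_buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                            sizeof(host_buf), NULL, &err);
            /* ...and copy across explicitly: a CPU pointer means nothing
             * to the GPU. */
            err = clEnqueueWriteBuffer(q, dev_buf, CL_TRUE, 0,
                                       sizeof(host_buf), host_buf, 0, NULL, NULL);

            /* (a kernel would be enqueued here) */

            /* Results come back the same way. */
            err = clEnqueueReadBuffer(q, dev_buf, CL_TRUE, 0,
                                      sizeof(host_buf), host_buf, 0, NULL, NULL);

            clReleaseMemObject(dev_buf);
            clReleaseCommandQueue(q);
            clReleaseContext(ctx);
            return 0;
        }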

    Quote:

    Not for a generation or two.



    Yes, but at least AMD has indicated that they are going this route. When we can run GPU threads without having to worry about the impact on the display, GPU computing will be here to stay.

    Quote:

    What makes you think there isn't some already?



    On Sandy Bridge or AMD's Fusion? As far as I know there is no cache or RAM dedicated to the GPU on the SoC. I should qualify that: cache or RAM for a frame buffer. It is a lot of memory to put on board, but in things like an iPhone or iPad it should be doable; at 1024x768 and 32 bits per pixel, a frame is only about 3 MB. The goal is to reduce or eliminate off-chip trips to drive the video. I would think this would save considerable power.

    Quote:

    Keep in mind that hardware development pipelines are years long. No telling when patents will materialize in product.



    This is very true, but Apple needs a payoff for their investment in chip development. The A4 is a pretty run-of-the-mill processor.

    Quote:

    Fortunately we know that iOS 4 is based on Snow Leopard and includes GCD and blocks, and therefore has many of the pieces required for OpenCL. The current GPU hardware is just not flexible enough; the next-generation Imagination parts are. And the next-generation ARM cores are multi-core capable (and faster individually).



    Juicy! This is important for people to understand: if iPad 2 gets the processor update we are hoping for, there should be an immediate benefit.

    Quote:

    RIM might be bragging about how they'll really fly when dual core arrives (sounds like an excuse to me), but they had better be careful touting that one, because Apple has far more experience (and shipping code) with this parallelism stuff than they do.



    This is possibly the point I disagree with the most. RIM is using QNX here, which has a pretty long history, and it has had an excellent reputation in the industry. Admittedly this isn't the "PC" industry, but their background is strong and diverse. Obviously this is a crash program on their part, but let's face it, it took Apple years to get iOS to the point where it is today. The PlayBook OS really needs to be evaluated a year after release.
  • Reply 113 of 126
    nht Posts: 4,522 member
    Quote:
    Originally Posted by melgross View Post


    But I would like to comment on this last. It seems interesting that Unix-based (Mac OS X and iOS) and Linux-based (Android being the most important one right now) systems can be made so much more efficient than Microsoft-designed systems.



    You can actually make fairly efficient and secure Windows installs. There's nothing about Windows itself that makes it particularly more or less efficient than the Unix core. MS is less capable in providing tailored power management beyond the lowest common denominator because it doesn't control the hardware and depends on OEMs to provide tailored power schemes.



    OEMs are hampered because they don't control the software stack and can't tweak Windows power management code.



    MS added more power management support in Win7, and the general feeling is that Windows power management has typically been better than Linux power management. It depends on the drivers quite a bit, so YMMV.



    "Going into this power consumption testing we figured Microsoft Windows 7 would likely be more power efficient than Ubuntu Linux due to Windows having better ACPI support and more hardware vendors catering to Windows, but we did not expect to see such a wide difference with the ASUS Eee PC. With the "out of the box" experience for each operating system, Ubuntu 10.04 LTS was consuming 56% more power than Windows 7 Professional!"



    "Fortunately, with the Lenovo ThinkPad T61 the power consumption between Windows and Ubuntu Linux were not nearly as large as the Atom 330 + ION-based netbook. The ThinkPad T61 with Ubuntu 10.04 LTS was consuming 14% more power than Windows 7, but when both were loaded with NVIDIA's binary driver that leverages PowerMizer and other power-savings techniques, Ubuntu 10.04 LTS averaged to consume just 4% more power."



    http://www.phoronix.com/scan.php?pag...ws_part2&num=2



    Vista posted opposite results where Linux was more power efficient.



    Quote:

    Even their mobile OSes, despite not being Windows based and being written with low-power CPUs in mind, seem to be much less efficient. Windows is a dog. It requires much more RAM and much more powerful CPUs, but doesn't perform well on netbooks either. Getting it to perform with an ARM chip might prove impossible.



    Given that WP7 is working admirably on ARM chips this seems like a very strange assertion to make.





    Quote:
    Originally Posted by wizard69 View Post


    Considering UNIX and the like had its roots way back at the dawn of the minicomputer revolution, it shouldn't be a surprise that these systems have less of a demand on hardware.



    8-bit microcomputers were not more powerful than the minis of the same time period; far less. It is hard to argue that the Unix kernel has some mystical advantage over the NT kernel.
  • Reply 114 of 126
    melgross Posts: 33,702 member
    Quote:
    Originally Posted by nht View Post


    You can actually make fairly efficient and secure Windows installs. There's nothing about Windows itself that makes it particularly more or less efficient than the Unix core. MS is less capable in providing tailored power management beyond the lowest common denominator because it doesn't control the hardware and depends on OEMs to provide tailored power schemes.



    OEMs are hampered because they don't control the software stack and can't tweak Windows power management code.



    MS added more power management support in Win7, and the general feeling is that Windows power management has typically been better than Linux power management. It depends on the drivers quite a bit, so YMMV.



    "Going into this power consumption testing we figured Microsoft Windows 7 would likely be more power efficient than Ubuntu Linux due to Windows having better ACPI support and more hardware vendors catering to Windows, but we did not expect to see such a wide difference with the ASUS Eee PC. With the "out of the box" experience for each operating system, Ubuntu 10.04 LTS was consuming 56% more power than Windows 7 Professional!"



    "Fortunately, with the Lenovo ThinkPad T61 the power consumption between Windows and Ubuntu Linux were not nearly as large as the Atom 330 + ION-based netbook. The ThinkPad T61 with Ubuntu 10.04 LTS was consuming 14% more power than Windows 7, but when both were loaded with NVIDIA's binary driver that leverages PowerMizer and other power-savings techniques, Ubuntu 10.04 LTS averaged to consume just 4% more power."



    http://www.phoronix.com/scan.php?pag...ws_part2&num=2



    Vista posted opposite results where Linux was more power efficient.







    Given that WP7 is working admirably on ARM chips this seems like a very strange assertion to make.



    OS X is considered to be better at power management than Windows. iOS is considered to be very power efficient. Comparing Windows to Linux based systems, or to other Unix based systems is irrelevant to this. Apple does its own power management.
  • Reply 115 of 126
    solipsism Posts: 25,726 member
    Quote:
    Originally Posted by melgross View Post


    OS X is considered to be better at power management than Windows. iOS is considered to be very power efficient. Comparing Windows to Linux based systems, or to other Unix based systems is irrelevant to this. Apple does its own power management.



    Here is an AnandTech article that backs up your comment.
    Apple claims 10 hours of battery life for the MBP13 when running OS X, and Anand hit pretty close to that mark when testing it out with his light web browsing test. Now, we've shown before that OS X is more optimized for mobile power consumption than all versions of Windows, so going into this test the expectations were a fair bit lower.



    And for good reason; the MBP13 showed fairly similar battery life to some of the older Core 2-based systems. With its 63.5 Wh lithium polymer battery, the MBP hits 5.5 hours on our ideal-case battery test, and exactly 5 hours on the web browsing test. While this is decent for the average Core 2 notebook, it's pretty woeful compared to the OS X battery life of the MBP. If you have no reason to run Windows (program compatibility, gaming, etc) you're better off in OS X just so that you can get about double the battery life.



    This reduction of battery life in Windows is pretty much along the same lines that Anand saw with the MacBook Air he tested under both OS X and Windows. This is a problem that's been noted in both Vista and 7, and doesn't look to go away anytime soon (though we'll see if Microsoft can fix it in Windows 8).
  • Reply 116 of 126
    melgross Posts: 33,702 member
    Quote:
    Originally Posted by solipsism View Post


    Here is an AnandTech article that backs up your comment.
    Apple claims 10 hours of battery life for the MBP13 when running OS X, and Anand hit pretty close to that mark when testing it out with his light web browsing test. Now, we've shown before that OS X is more optimized for mobile power consumption than all versions of Windows, so going into this test the expectations were a fair bit lower.



    And for good reason; the MBP13 showed fairly similar battery life to some of the older Core 2-based systems. With its 63.5 Wh lithium polymer battery, the MBP hits 5.5 hours on our ideal-case battery test, and exactly 5 hours on the web browsing test. While this is decent for the average Core 2 notebook, it's pretty woeful compared to the OS X battery life of the MBP. If you have no reason to run Windows (program compatibility, gaming, etc) you're better off in OS X just so that you can get about double the battery life.



    This reduction of battery life in Windows is pretty much along the same lines that Anand saw with the MacBook Air he tested under both OS X and Windows. This is a problem that's been noted in both Vista and 7, and doesn't look to go away anytime soon (though we'll see if Microsoft can fix it in Windows 8).



    The tech sites have been reporting this for years.
  • Reply 117 of 126
    Marvin Posts: 15,570 moderator
    Now that the Sandy Bridge processors are on sale in China/Malaysia etc, someone has kindly posted solid benchmarks:



    http://en.inpai.com.cn/doc/enshowcon...48&pageid=7730



    The version tested is the 6 EU desktop processor and contains the HD Graphics 2000 GPU. The comparison GPU, called HD Graphics, is the Arrandale one we have now (i.e. when you turn the NVidia GPU off in the MBP).



    It's generally only slightly faster than the current one and around half the speed of the 5450. With 12 EUs, it will be in the region of the 5450. This to me suggests that Anand's version was in fact 12 EU, though it may or may not have been turbo-enabled.



    The 12 EU model should be fine but it doesn't make sense to even make a 6 EU model if it's not that much faster than what we have. This suggests a new entry iMac will stick with a dedicated card.



    They have CPU benchmarks of the i5 and i7:



    http://en.inpai.com.cn/doc/enshowcon...47&pageid=7714

    http://en.inpai.com.cn/doc/enshowcon...44&pageid=7685



    The H.264 test won't hold much weight as it doesn't look like they are using the hardware encoder there.



    It would be nice if Apple announced full support for the highest H.264 profile with the SB encoder: all the features to allow the best quality at a given bitrate. Their QuickTime encoders are way behind open-source ones in quality and speed.
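
    For comparison, here is a short sketch against libx264's public C API (one of the open-source encoders in question) requesting exactly that: High profile with its quality features enabled, at a target bitrate. The resolution and bitrate are placeholder numbers, and only encoder setup is shown:

        #include <stdint.h>
        #include <stdio.h>
        #include <x264.h>  /* link with -lx264 */

        int main(void) {
            x264_param_t param;

            /* The "slow" preset enables the quality-oriented features
             * (CABAC, 8x8 DCT, B-frames, more reference frames). */
            if (x264_param_default_preset(&param, "slow", NULL) < 0)
                return 1;

            param.i_width  = 1280;
            param.i_height = 720;
            param.rc.i_rc_method = X264_RC_ABR;  /* average-bitrate mode */
            param.rc.i_bitrate   = 5000;         /* target, in kbit/s */

            /* Clamp everything to High profile, the tier a capable
             * hardware encoder path would also need to expose. */
            if (x264_param_apply_profile(&param, "high") < 0)
                return 1;

            x264_t *enc = x264_encoder_open(&param);
            if (!enc) return 1;
            printf("encoder opened: High profile at %d kbit/s\n",
                   param.rc.i_bitrate);
            x264_encoder_close(enc);
            return 0;
        }
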
  • Reply 118 of 126
    melgross Posts: 33,702 member
    Quote:
    Originally Posted by Marvin View Post


    Now that the Sandy Bridge processors are on sale in China/Malaysia etc, someone has kindly posted solid benchmarks:



    http://en.inpai.com.cn/doc/enshowcon...48&pageid=7730



    The version tested is the 6 EU desktop processor and contains the HD Graphics 2000 GPU. The comparison GPU, called HD Graphics, is the Arrandale one we have now (i.e. when you turn the NVidia GPU off in the MBP).



    It's generally only slightly faster than the current one and around half the speed of the 5450. With 12 EUs, it will be in the region of the 5450. This to me suggests that Anand's version was in fact 12 EU, though it may or may not have been turbo-enabled.



    The 12 EU model should be fine but it doesn't make sense to even make a 6 EU model if it's not that much faster than what we have. This suggests a new entry iMac will stick with a dedicated card.



    They have CPU benchmarks of the i5 and i7:



    http://en.inpai.com.cn/doc/enshowcon...47&pageid=7714

    http://en.inpai.com.cn/doc/enshowcon...44&pageid=7685



    The H.264 test won't hold much weight as it doesn't look like they are using the hardware encoder there.



    It would be nice if Apple announced full support for the highest H.264 profile with the SB encoder: all the features to allow the best quality at a given bitrate. Their QuickTime encoders are way behind open-source ones in quality and speed.



    What is interesting for portable use is that NVidia has stated that Apple worked with them on their chipsets, and will be using them for some time to come.
  • Reply 119 of 126
    Marvin Posts: 15,570 moderator
    Quote:
    Originally Posted by melgross View Post


    What is interesting for portable use is that NVidia has stated that Apple worked with them on their chipsets, and will be using them for some time to come.



    I don't think it makes much sense to use them for a long time. Even though there is a statement about Apple using them in future, NVidia aren't making chipsets any more:



    http://www.macrumors.com/2010/12/20/...idia-chipsets/



    This means no NVidia IGP ever again to succeed the 320M. If Intel matches or exceeds the 320M with this generation, they will double it next year. I imagine that if that comment about Apple using them was true, it would apply to the MacBook Air line and nothing else.



    Even with the MBA though, a ULV Sandy Bridge chip would likely be a better option.



    Given that Apple has wiped out NVidia GPUs from every model, including BTO, besides the laptop models, it doesn't look good for them. The GT 330M uses at least 15W, whereas the 6550M will outperform it and scale down to 11W, matching Intel's own IGP.



    I think Apple has dropped enough hints that the MBA will replace the entry-level lineup, the simple reason being that they are fast enough for that demographic in every way except the CPU, and the SSD makes up for that somewhat. Also, they would likely design the 13" MB and MBP in a similar way, so either the Air goes or the 13" MB and MBP go.



    If they don't, they are going to have a thin, light Sandy Bridge MB with a good GPU and 13" screen at the same price point as the 11" MBA, which is just a slower version of the same thing and no one will buy it.



    I guess they can leave the 13" MBP but if they redesign it, the 13" Air might be a tough sell as it would be $100 more.



    There are actually a number of ways Apple can choose to go with this. It will be interesting to see which way they choose.



    I think the white MacBook has reached the end of its life at this point and I just don't see another version being made. I think they will have to get 128GB of storage in the entry level though, and Light Peak to supplement it with fast external storage.



    Given that they just redesigned the 13" MBA, they won't discontinue it, so what happens with the 13" MBP? I don't think it can be discontinued in favour of the Air just now, but it only has a Core 2 Duo anyway, so I see it as a possibility.



    Then the 15" will get the i5 + Radeon but also with a redesign.



    This kind of switch will feel bad right now, but by next year it won't. The same thing happened with the iMac. They switched the mid-range towers and the Cube to AIO, and at first it sucked because they had single-processor G4s and G5s; then came the Intel chips, and now we have 27" screens with quad-core chips.



    The MBA will start out with a slow C2D but have other compelling features (instant-on, very light, very thin), and within 1-2 years it will be quad-core with 4GB RAM, a 256GB SSD and a GPU 2x the 320M it has now, and no one will care about the decision.
  • Reply 120 of 126
    nht Posts: 4,522 member
    Quote:
    Originally Posted by melgross View Post


    OS X is considered to be better at power management than Windows. iOS is considered to be very power efficient. Comparing Windows to Linux based systems, or to other Unix based systems is irrelevant to this. Apple does its own power management.



    What? You wrote:



    Quote:
    Originally Posted by melgross View Post


    But I would like to comment on this last. It seems interesting that Unix-based (Mac OS X and iOS) and Linux-based (Android being the most important one right now) systems can be made so much more efficient than Microsoft-designed systems.



    The point is that the key to power efficiency is in the power management software and drivers, not what the base OS is. That iOS/OS X is power efficient has zero to do with any Unix heritage, and everything to do with the attention that Apple has given to power management. There is no inherent power advantage in being Unix based vs. NT or WinCE based.