Yeah, that's my question too. If the i7 isn't much better than the i5 (according to the Notebookcheck.com article), does the i7 MBP perform better overall due to the doubled graphics memory?
There might be some benefit in the gaming department for people who dual-boot Windows. There are almost certainly benefits for scientific applications written in CUDA or OpenCL, so if you are "Folding @ Home" you'll probably see a difference. If you are manipulating huge image files it might matter. For most people there won't be any noticeable difference.
The main differences between the i7 and the i5 seem to be that the i7 has more on-chip memory, which can make a difference (though benchmarks don't seem to be finding one), and a higher "Turbo" frequency for running single-threaded applications. Wikipedia doesn't think that the 2.4 GHz i5s have working AES units (http://en.wikipedia.org/wiki/AES_instruction_set). I can't imagine that this matters.
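Whether a given chip actually exposes AES instructions is easy to check from the OS's CPU-feature report. Here's a minimal sketch that parses a Linux-style /proc/cpuinfo dump (on OS X you'd look at `sysctl machdep.cpu.features` instead); the helper name is mine, not from any library:

```python
def has_aes_ni(cpuinfo_text):
    """Return True if an 'aes' flag appears in a /proc/cpuinfo-style dump."""
    for line in cpuinfo_text.splitlines():
        # Feature flags live on lines like "flags : fpu vme ... aes ..."
        if line.lower().startswith("flags"):
            return "aes" in line.split()
    return False

print(has_aes_ni("flags\t\t: fpu vme aes sse2"))  # → True
```

If the flag is missing, encryption still works; it just falls back to a software implementation, which is why the difference is hard to notice in everyday use.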
I came really close to buying one of these but remembered the golden rule---don't buy any new Apple product for the first 3 months. They will undoubtedly have software/driver problems for a while, and may even have other issues (yellowing plastic, etc.). I'll get one closer to "back-to-school" time when I can probably get a free iPod out of it.
How do you sift through the realities of the different chips and video card memory options?
Is the i7 better to an average user than the i5? Will double the video memory mean that much to an occasional gamer?
Depends on what is meant by average user and what kind of games.
For the average user - if the definition is web browsing, email, document processing, and typical multimedia like iTunes, iPhoto, and movie watching - the Core 2 Duo (C2D) is plenty of power, and certainly the lower-end i-series is more than adequate. The real value of the i-series CPUs is in multi-core multi-threading, and most average-user applications don't exploit multi-threading that much. Thus the two cores you get in either the C2D or the i3 or i5 are sufficient for the average user.
Re. games: for the kind of games that I play, sudoku, mahjong and the like, the C2D is plenty. It's even OK for the role-playing games that my daughter likes. But if you are into fast-action shooters, then the question is not as simple. (I have no inclination in that direction, but high-end gamers seem to want high-end machines, if the articles and comments posted are any guide.)
Until there are 4 physical cores in a MacBook Pro, these are off my list. My 8-core Mac Pro is just too good! Maybe next year we will see 4-core MacBook Pros (probably just in the 17-incher). Good update, though; it should be a nice upgrade for professionals. For consumers this would make no difference whatsoever. It sounds like the graphics cards are auto-switching - a huge step up from the previous versions, where you'd have to log out and log back in after switching cards in System Preferences. WWDC this year? All about iPhone and iPad, it looks like. No 10.7 until 2011, I'm assuming.
Yeah, that's my question too. If the i7 isn't much better than the i5 (according to the Notebookcheck.com article), does the i7 MBP perform better overall due to the doubled graphics memory?
We'll likely have to wait for websites to come out with their reviews for that. The extra graphics memory might make a decent difference in everyday processing thanks to OpenCL, but I'm not positive.
Depends on what is meant by average user and what kind of games.
For the average user - if the definition is web browsing, email, document processing, and typical multimedia like iTunes, iPhoto, and movie watching - the Core 2 Duo (C2D) is plenty of power, and certainly the lower-end i-series is more than adequate. The real value of the i-series CPUs is in multi-core multi-threading, and most average-user applications don't exploit multi-threading that much. Thus the two cores you get in either the C2D or the i3 or i5 are sufficient for the average user.
Re. games: for the kind of games that I play, sudoku, mahjong and the like, the C2D is plenty. It's even OK for the role-playing games that my daughter likes. But if you are into fast-action shooters, then the question is not as simple. (I have no inclination in that direction, but high-end gamers seem to want high-end machines, if the articles and comments posted are any guide.)
Perfect, thank you.
Had a buddy chime in on the video portion. His contention is that even with these offerings, the PC cards available are much faster. If you're a gamer, you probably have a PC sitting next to your Mac for that. Or, in his case, buy a gaming console - way better value for the dollar than trying to turn your laptop into a high-end gaming platform.
Has anyone seen any benchmarks comparing the past version of the MBP (running at 2.8 GHz) to the new line? I am tempted to get the i5 at 2.4, thinking that overall optimization will push it a bit faster than what I use now, but I would love some stats on this.
It's mind-boggling to me that my 2007 SR-MBP has the same amount of VRAM as some of these new machines. What is happening here? I'm very, very disappointed. I originally planned to upgrade, but I think I'll just get a new 500GB HD now to give my MBP a new lease on life.
Maybe next year we will see 4-core MacBook Pros (probably just in the 17-incher).
The current quad-core mobile chips are 45nm. These generate too much heat and drain the battery too quickly to seriously consider putting in a high-volume laptop. Only the dual-core Arrandales are 32nm now. Next year Intel will release quad-core 32nm mobile chips. Those will find their way into the 15" and 17" MacBooks Pro and perhaps the 13" MacBook Pro too.
Yes, it's been suggested. OWC has an eSATA adapter for $40.
I use the OWC eSATA ExpressCard in my MBP and it works well. I think having the ExpressCard slot is more versatile than putting in an SD card slot. I already have an external SD card reader that is multi-functional for the different kinds of storage I am using. At least they have one in the 17" now if I have to replace my MBP. When they eventually add USB 3 to all their laptops & desktops, I don't think I will be using the eSATA as often.
I expected there to be an option for a 1440x900 13" MacBook. The 1280x800 resolution just doesn't cut it for most of us. Ah well, at least I can always get the 1680x1050 Core i5 15" MacBook Pro.
Has anyone seen any benchmarks comparing the past version of the MBP (running at 2.8 GHz) to the new line? I am tempted to get the i5 at 2.4, thinking that overall optimization will push it a bit faster than what I use now, but I would love some stats on this.
Never mind - it was as if Apple Insider was listening; they just posted the benchmark.
LOL! The GeForce 330M is just a renamed GeForce what, 9600? Which was a renamed GeForce 8600... Is that 17" actually slower than the previous MacBook Pro?
330M is a rebrand of the 240M. 9600M GT was a rebrand of the 8600M GT. The rebranding doesn't mean identical though - the 9600M GT was faster than the 8600M GT.
The 330M is double the speed of the 9600M GT.
Quote:
Originally Posted by gotApple
EDIT: I was hoping that this refresh would bring the graphics department up to date. It didn't. I need to look elsewhere for a new laptop.
They could have put in a faster GPU and you'd get lower battery life and more heat. The 330M plays Crysis on High Quality:
http://www.youtube.com/watch?v=YyAVNJa5d-E
I doubt you need more than that in a laptop with an 8 hour battery.
Quote:
Originally Posted by gotApple
HAHAHA My old 8800GTX (sold it away ages ago) from 2006 had 128 cores. 330M (same family BTW) has 48 cores... Not buying that old tech again, sorry Steve.
You're comparing a desktop GPU that draws 130 Watts to a laptop GPU that draws 35 Watts. The fastest non-SLI mobile GPU from NVidia draws 75W. Sure it has 128 cores like the 8800 but your battery would last less than half the time and your fan would have to run at maximum strength (Crysis reference) all the time when it was maxed out.
Apple only really had a few realistic options for keeping graphics performance high enough while maintaining the battery life. The fastest they probably could have gone for was the Radeon 5830 from ATI or the GTS 250M from NVidia. But even then, you're talking about a 25-40% speed increase over the 330M. That only matters when you get a game the GPU can't handle and it's the difference between playable and unplayable. When it's down to high graphics versus enthusiast graphics, it's really a non-issue.
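The power-draw argument above is easy to put rough numbers on. A back-of-envelope sketch, assuming the 15" model's roughly 77.5 Wh battery and a hypothetical ~15 W draw for the rest of the system under gaming load (both figures are my assumptions, not from the thread):

```python
def runtime_hours(battery_wh, gpu_watts, rest_watts=15.0):
    """Estimated sustained runtime in hours at the given total draw."""
    return battery_wh / (gpu_watts + rest_watts)

mobile = runtime_hours(77.5, 35.0)    # GT 330M-class part at ~35 W
high_end = runtime_hours(77.5, 75.0)  # fastest non-SLI mobile part at ~75 W
print(round(mobile, 2), round(high_end, 2))  # → 1.55 0.86
```

Even with generous assumptions, the hotter part cuts gaming runtime roughly in half, which is the trade-off being described.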
Quote:
Originally Posted by Marvin
They could have put in a faster GPU and you'd get lower battery life and more heat. The 330M plays Crysis on High Quality:
I doubt you need more than that in a laptop with an 8 hour battery.
You're comparing a desktop GPU that draws 130 Watts to a laptop GPU that draws 35 Watts. The fastest non-SLI mobile GPU from NVidia draws 75W. Sure it has 128 cores like the 8800 but your battery would last less than half the time and your fan would have to run at maximum strength (Crysis reference) all the time when it was maxed out.
Apple only really had a few realistic options for keeping graphics performance high enough while maintaining the battery life. The fastest they probably could have gone for was the Radeon 5830 from ATI or the GTS 250M from NVidia. But even then, you're talking about a 25-40% speed increase over the 330M. That only matters when you get a game the GPU can't handle and it's the difference between playable and unplayable. When it's down to high graphics versus enthusiast graphics, it's really a non-issue.
Whatever. It's still based on that 2006 tech, which is a shame. The 5830 would have been nice. And I mean really nice. Could have dual-booted to DX11 Win7...
Link
For those interested in an awesome comparison of the new specs against the old, here you go.
Link
Great chart! Thanks. I hadn't realized some of the price changes and battery size increases between models.
330M is a rebrand of the 240M. 9600M GT was a rebrand of the 8600M GT. The rebranding doesn't mean identical though - the 9600M GT was faster than the 8600M GT.
The 330M is double the speed of the 9600M GT.
Sure, I'd love to see such benchmarks.
http://www.notebookcheck.net/NVIDIA-...M.22437.0.html
"The performance of the GT 330M is similar to the GeForce GT 240M and therefore located in the range of the Mobility Radeon HD 4650. The card supports DirectX 10.1 and all the features of the GT 230M / 240M (as it is based on the same GT216 core). The modern ATI Radeon HD 5650 offers DirectX 11 effects and performs better.
Modern and demanding DirectX 10 and 11 games (like Crysis or Risen) can only be played fluently with medium detail settings and resolution settings. Less demanding games like Sims 3 run in high details and resolutions."
Yessssss
What other brand runs Mac OS X? None but Apple. And certainly none with the same elegance and build quality.
Of course. But what he's saying he needs is different from what you mention. He needs a powerful computer.