I don't know about any of the technical stuff. I just know that in the short time I've had my MacBook, it has performed every task to perfection. Absolutely zero complaints, even with the stock 512 MB of RAM.
Comments
Originally posted by sandau
And on a related note, the MacBook Boot Camped with XP defaults to 128 MB of video RAM and goes up to 244 MB or thereabouts. Sure, it's shared, but with 2 GB of RAM that ain't shabby. Of course Apple crippled it in OS X, but I'm sure that'll be overcome sooner rather than later.
I thought I saw the graphics RAM had a minimum and maximum.
Originally posted by Chucker
The main weakness of the GMA950 is not the actual chip's capabilities, but the inherent low-bandwidth design due to shared memory.
This used to be a serious problem with integrated GPUs, making them painful to use for anything. However, with dual-channel DDR2 system RAM, the graphics bandwidth of the current MacBook is, I believe, higher than the G4 iBook it replaced.
Aurora, if you want people to take you seriously, try spelling correctly, using good grammar and sentence structure, and provide solid reasons why, apparently, everyone needs dedicated graphics. It would also help if you demonstrated that you are actually reading and comprehending what everyone else is saying. If you fail to do these things, you are simply trolling and everyone will rightly ignore you.
Originally posted by Mr. H
This used to be a serious problem with integrated GPUs, making them painful to use for anything. However, with dual-channel DDR2 system RAM, the graphics bandwidth of the current MacBook is, I believe, higher than the G4 iBook it replaced.
But the G4 iBook just had a Mobility Radeon 9550, on AGP. Not exactly top-notch anyway.
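For what it's worth, the bandwidth claim in this exchange can be sanity-checked with quick arithmetic. This is a rough sketch under stated assumptions: dual-channel DDR2-667 in the MacBook and single-channel PC2700 (DDR-333) system RAM in the iBook G4, ignoring that the iBook's Radeon also had its own dedicated VRAM bus.

```python
# Peak theoretical memory bandwidth: transfers/sec x bus width x channels.
# The module speeds below are assumptions, not measured figures.

def peak_bandwidth_gb_s(megatransfers_per_sec, bus_width_bits, channels):
    """Peak theoretical bandwidth in GB/s for a DDR-style memory bus."""
    bytes_per_transfer = bus_width_bits / 8
    return megatransfers_per_sec * 1e6 * bytes_per_transfer * channels / 1e9

macbook = peak_bandwidth_gb_s(667, 64, 2)  # dual-channel DDR2-667, shared with CPU
ibook = peak_bandwidth_gb_s(333, 64, 1)    # single-channel PC2700 system RAM

print(f"MacBook shared memory:  ~{macbook:.1f} GB/s")
print(f"iBook G4 system memory: ~{ibook:.1f} GB/s")
```

Even shared with the CPU, roughly 10.7 GB/s of raw bandwidth leaves the GMA950 considerably more headroom than the iBook's roughly 2.7 GB/s system bus, which is the gist of the claim above.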
Originally posted by Aurora
Interesting how the cheapest, slowest, least capable graphics have been spun into gold by the fanboys. If Apple put cat poo in a bag and put an Apple label on it, you guys would be telling everyone it's steak, and it's better than steak.
I hear lots of people say that, but given the choice I would take the Intel graphics in a heartbeat over the 9200 that came in the old Minis and the 9550 in the iBooks: 64 MB of RAM vs. 32, a faster GPU, and full support for the neat whistles and bells of the OS X GUI. And given the choice, I would rather spend an extra $100 on RAM than have the thing cost $100 more with 512 MB and 64 MB of discrete VRAM.
Originally posted by Chucker
For rendering, GPUs are usually only used for preview purposes (due to their performance benefits), not for the final results (due to their quality/precision disadvantages). As such, arguably, the GMA950 might be more than good enough even for their needs. The main weakness of the GMA950 is not the actual chip's capabilities, but the inherent low-bandwidth design due to shared memory; bandwidth, however, doesn't matter so much during live previews (like in rendering) as it does while continuously rendering new images in real time (like in games).
I do 3D stuff, and you're right, the GPU isn't used much at all for rendering unless you use, say, Maya's OpenGL preview or hardware rendering engine for certain things. However, it is used all the time for modelling the scene in the first place. If it can't handle enough geometry, then you have to render in layers.
Now, I tend to use low-res scenes because I prefer it that way, but even at that, I can get a 2,000-poly model up to level 2 subdivision (about 62,000 polys) on my G4 Mini with its 32 MB Radeon 9200; beyond that, it's struggling. The GMA has benchmarked slower than the 9200 in some apps, and if it's faster, it's not by much. Actually, I have access to a Mini with a GMA; I've been meaning to test it out.
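As a rough illustration of why subdivision gets heavy fast: under Catmull-Clark-style quad subdivision, each face splits into four per level, so face counts grow roughly as 4^n. This is a sketch under that assumption; the exact count depends on the mesh topology and the app's smoothing method, which is presumably why the poster's level-2 figure lands near 62,000 rather than the idealized number.

```python
# Idealized face count after n levels of quad subdivision,
# assuming each face splits into four per level (Catmull-Clark on quads).

def subdivided_faces(base_faces, levels):
    return base_faces * 4 ** levels

for level in range(4):
    print(f"level {level}: ~{subdivided_faces(2000, level):,} faces")
```

A 2,000-face cage reaches ~32,000 faces at level 2 and ~128,000 at level 3, so each extra level quadruples the load on the GPU's geometry handling.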
The issue I have is not so much that the integrated chips are bad, but that yet again Apple leaves no middle ground.
MacBook: £749 - low-end chip
MacBook Pro: £1,399 - good chip
Where the hell is the £999 middle-of-the-road chip?
Mac Mini: £449 - low-end chip
Intel iMac: £929 - good chip
Where the hell is the £699 middle-of-the-road chip?
provide solid reasons why, apparently, everyone needs dedicated graphics.
EXACTLY! Too often, the "OMG it needs a decent video card" crowd is allowed to get away with complaining that the video chip isn't "decent".
Yet they never define what fill rate, etc. would be "decent". And they don't specify how the "non decent" chip fails in ordinary use.
I still challenge any of the video-card complainers to tell the difference between the two video chips (integrated vs discrete) in a laptop without opening System Profiler or running something that requires high-end 3D transformations.
I'm tired of the "decent video card" obsession. Most of the people chanting about it don't even know how the thing works anyway.
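To put a number on "decent": here is a back-of-the-envelope check of whether the GMA950's fill rate is adequate for ordinary desktop use. The inputs are assumptions, not measurements: a 1280x800 panel, 60 Hz, an overdraw factor of 4 to allow for compositing and layered windows, and the often-quoted ~1.6 Gpixels/s peak fill rate for the GMA950.

```python
# Does ordinary 2D desktop use come anywhere near the GMA950's fill rate?
# All figures below are assumptions for illustration.
width, height, fps, overdraw = 1280, 800, 60, 4

# Pixels that must be filled per second, in Mpixels/s.
required_mpixels = width * height * fps * overdraw / 1e6

gma950_peak_mpixels = 1600  # often-quoted ~1.6 Gpixels/s peak

print(f"Required for desktop use: ~{required_mpixels:.0f} Mpixels/s")
print(f"GMA950 quoted peak:       ~{gma950_peak_mpixels} Mpixels/s")
```

Under these assumptions, ordinary use needs around 250 Mpixels/s, a small fraction of the chip's quoted peak, which supports the point that the complainers rarely show how the chip fails outside of demanding 3D workloads.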
Originally posted by hmurchison
Hmm, funny that said fan club is almost universally against your statements, or at the very least we realize that one solution doesn't fit all. Perhaps you should tell me why my mother needs a whiz-bang GPU if all she does is use the web and Office?
You're simply not going to win this argument; just agreeing to disagree is likely your best option. It has nothing to do with being a fanboy and everything to do with common sense and matching your tools to an actual need.
Hey genius, grandma doesnt go for the option
Originally posted by Aurora
Hey genius, grandma doesnt go for the option
In English please. Thank you.
Originally posted by Aurora
Hey genius, grandma doesnt go for the option
Least useful and comprehensible post of the week.
Oops, missed the period at the end...
Originally posted by Kickaha
At least everything is spelled right this time.
Oops, missed the period at the end...
And the apostrophe in the middle.
And regardless of that, I still don't get the underlying message, if any, of the post.
If, for whatever unimaginable reason, someone can't be bothered to write in proper English, the least they can do is make themselves clear.
Nope, he's from SC. Definitely not his native language.
Originally posted by Aurora
Interesting how the cheapest, slowest, least capable graphics have been spun into gold by the fanboys. If Apple put cat poo in a bag and put an Apple label on it, you guys would be telling everyone it's steak, and it's better than steak.
I think people like integrated graphics because they allow the computer to be cheap. If you don't need high end graphics, then it ends up being an effective way to save some dough. Extra dough is always sexy.
Originally posted by Chucker
I'm actually slightly unsure over agreeing with 2). For rendering, GPUs are usually only used for preview purposes (due to their performance benefits), not for the final results (due to their quality/precision disadvantages).
The Mac display layer is actually decent at handling OpenGL, and having a nice card allows your previews to be much nicer. Previews are a big part of 3D modeling and rendering. Interestingly, you need a much more insane GPU to get good 3D previewing in Windows, or some sort of uber-expensive proprietary card+driver. I assume this is due to Quartz Extreme.
Originally posted by Aurora
Hey genius, grandma doesnt go for the option
This would require two motherboard designs.
Originally posted by Splinemodel
Quote:
Originally posted by Aurora
Interesting how the cheapest, slowest, least capable graphics have been spun into gold by the fanboys. If Apple put cat poo in a bag and put an Apple label on it, you guys would be telling everyone it's steak, and it's better than steak.
I think people like integrated graphics because they allow the computer to be cheap. If you don't need high end graphics, then it ends up being an effective way to save some dough. Extra dough is always sexy.
To abuse a metaphor: if I want a romantic dinner, I'm definitely springing for the steak. But when I need to fertilize my garden, a sack of poo works just as well and costs a whole lot less. Only a very crazy and/or very rich person would use steak, even though steak makes a slightly better fertilizer.
My old, beloved iBook was great for writing stories, creating planning documents, and surfing the web. I had a machine at work to render video and handle other high-end tasks. I prefer consoles for gaming. My old iBook was a perfect cheap extension of my work computer.
For those who do not want integrated graphics, you do have the option to upgrade. It's called the MacBook Pro.
Originally posted by Kickaha
Hey, be nice. English might not be his native language.
Nope, he's from SC. Definitely not his native language.
South Carolina?
At the Apple store, the iDVD demo runs horribly. I have a number of guesses but not enough knowledge to make the speculation worthwhile. Still, I would be interested to know what people who actually own these things feel they can't do.
Originally posted by blackbird_1.0
South Carolina?
Yes. I lived in NC for 10 years until just a couple of months ago.
Originally posted by Kickaha
Yes. I lived in NC for 10 years until just a couple of months ago.
I speak English very well.
It's a joke, ferchrissakes.
Originally posted by Kickaha
It's a joke, ferchrissakes.
Okay, my apologies for taking it seriously. I figured as much, just wasn't sure.