Comments
It's a similar type of package sold to the developer of the game, and the developer will package it, test it, and get it working right with Cider before shipping it.
That's where I see the biggest problem. Making games cross-platform would be fine if developers didn't limit themselves to using Windows APIs. There's also the matter of whether they want to target Mac gamers at all when there are so few of us: they'd take on much more testing time and higher support costs for probably very little return. They would also have to license this Cider software, and why would they risk tying their product to something that might be unstable?
It's a nice idea but so is W3C compliance.
Time. It's just in too short supply. If one honestly has no other aspirations in life, and has no purpose, then gaming is a fun way to whittle away the hours. In my experience though, after the point that I started acquiring some honest goals and ambitions for my future, I really REALLY began to regret the time that I was spending on things that weren't helping me accomplish them.
I suppose your time spent posting here at AI is in some way helping you to accomplish your life goals? Perhaps you aspire to be a gossip columnist?
Anyway, I don't see what's so hard about installing Windows on a MacIntel and rebooting to play games. Yeah, it would be harder to play games while you work, but isn't that a good thing?
I suppose your time spent posting here at AI is in some way helping you to accomplish your life goals? Perhaps you aspire to be a gossip columnist?
Heh, good point. Aside from inanity (which is certainly a waste of time) AI forums provide me with a place to engage my mind and keep it well oiled. Can't reasonably reply to good logic when in a daze. As someone else said though, too much of a good thing is very bad.
I suppose your time spent posting here at AI is in some way helping you to accomplish your life goals?
Ever heard "Buy on rumor, sell on fact"? I have actually been surprised at how often insights provided by members here (and on Slashdot and others, for example) have affected my stock moves. Ironically, I have made some serious money on tech stocks for years. If you want the truth about tech companies, read what the geeks are saying.
Oh, and another vote for Celemourn's post. Although I do indulge the habit occasionally. All work and no play...
Is it really that bad without a dedicated graphics card? Seems such a pity.
The GMA 950 sucks at 3D.
The GMA 950 sucks at 3D.
http://www.nvidia.com/page/mxm.html
You would think that this type of thing would have become a standard a LONG time ago... Hope Apple decides to implement it.
http://www.nvidia.com/page/mxm.html
You would think that this type of thing would have become a standard a LONG time ago... Hope Apple decides to implement it.
You realize, of course, that this would make a MacBook thicker, heavier, more power-consuming and hotter?
You realize, of course, that this would make a MacBook thicker, heavier, more power-consuming and hotter?
Depends on whether Apple chose to use MXM 1, 2, or 3. If they went with the low-power MXM 1 or 2, then heat would be minimal. Footprint, I suspect, would be affected regardless, since it's no longer a chip on the board but a replaceable module, though I doubt the impact would be that great. The graphics chip has to have a certain amount of space for a heatsink etc. anyway, regardless of where it's positioned.
Ideally I'd like to see the MBP have MXM 3 slots, that way you can use whatever level of card you want (you can use a 1 or a 2 in a 3, but a 3 only fits in a 3).
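Just to make that compatibility rule concrete, here's a toy sketch. The `mxm_fits` function is made up purely for illustration (it's not from the MXM spec itself); it just encodes the rule above that a lower-numbered module fits an equal or larger slot, but never a smaller one.

```python
def mxm_fits(module_type: int, slot_type: int) -> bool:
    """True if an MXM module of the given type fits the given slot.

    Encodes the rule quoted above: a lower-numbered (physically smaller)
    module fits an equal or higher-numbered slot, but a module never
    fits a slot smaller than itself.
    """
    return module_type <= slot_type

# An MXM 3 slot takes any module; an MXM 3 module needs an MXM 3 slot.
assert mxm_fits(1, 3) and mxm_fits(2, 3) and mxm_fits(3, 3)
assert not mxm_fits(3, 2)
```

So a machine built with a Type 3 slot keeps the widest upgrade path, which is the whole argument for putting the big slot in the MBP.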
It's kind of like what the auto industry was doing a few years back: build 'em to break. A car was good for 5 or 6 years, then you basically had to replace the whole thing. Along came Toyota, Honda, and a few others, and they started building cars to last. Now look what's happening to GM and Ford. Apple is a higher-quality manufacturer anyway, so extending the longevity of their products, like what Toyota did, would behoove them. If I have to replace my laptop every 2 years because the graphics are no longer cutting the mustard, then why the heck should I go with a premium box? I can just get a junker $400 Windows discount special. There has to be a very compelling reason for consumers to pick you when you have the small bit of the market share.
I think Apple would benefit from this, and I don't feel that any minor size sacrifice would be that much of a problem, especially since, if you don't want it hot, you can just get a wimpy GPU.
And I'm solidly convinced (for now anyway) that the consumer would benefit from it. After I got my G3 Wallstreet back in '98, I SWORE that I would never get a laptop as my primary computer again, for the sole reason that the GPU is never upgradable. And then I went and built a PC... but that's another story, and a mistake I try to forget.
Anyway, for your consideration.
Footprint, I suspect, would be affected regardless, since it's no longer a chip on the board but a replaceable module, though I doubt the impact would be that great.
Why do you think socketed CPUs and GPUs are uncommon? Footprint.
The graphics chip has to have a certain amount of space for a heatsink etc. anyway, regardless of where it's positioned.
It currently just has a heatpipe connector. An MXM-style heatsink would take up a lot more vertical space, something Apple keeps trying to cut down on.
If I have to replace my laptop every 2 years because the graphics are no longer cutting the mustard, then why the heck should I go with a premium box? I can just get a junker $400 Windows discount special. There has to be a very compelling reason for consumers to pick you when you have the small bit of the market share.
Why would you replace your laptop that often?
Why do you think socketed CPUs and GPUs are uncommon? Footprint.
It currently just has a heatpipe connector. An MXM-style heatsink would take up a lot more vertical space, something Apple keeps trying to cut down on.
Why would you replace your laptop that often?
Because the graphics are no longer cutting the mustard. Personally, I get very frustrated by slow, choppy graphics, while I also thrive on running the newest software. I've resigned myself to the fact that if I want high-end graphics I have no choice but to go with a desktop or tower, but by gosh, they are annoyingly big, heavy, and non-portable.

As to the heatpipe, point taken. I haven't seen the guts of the MBP, so I'm not familiar with what kind of heat management it uses. But let me ask this: if it's the same GPU chip, why does being modular require better cooling? When you upgrade, I can see it becoming an issue, especially if the new GPU is significantly hotter, but just having it in a modular fashion I don't see as causing any different need for cooling. Of course, that could be part of the MXM specs (I looked at them many months ago and then promptly forgot what they were), so the point may still be valid. Then again, Apple has very good engineers, who have proven their ability to squeeze tons of junk into absurdly small spaces.

However, I see utility even without future upgrades in mind: if someone enjoys having a large display but doesn't need massive graphics capabilities, they can BTO a 15" or 17" MBP and select a 950 GPU to save money. If someone really REALLY wants powerful graphics but needs a small computer, they could get a MB with an X1800 Mobility board in it. (Yes, such a MB would probably be a bit thicker.)
There are always trade-offs in technology. I suspect that in this case they would be offset by the benefits though.
Consider as you like.
But let me ask this: if it's the same GPU chip, why does being modular require better cooling?
As far as I know, a soldered chip can be cooled better than a socketed one. The heat is simply transferred a lot easier.
But I'm not a physics guy.
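I'm not a thermal engineer either, but the usual back-of-envelope model is thermal resistances in series: every extra interface between the die and the heatsink (like a socket or module connector) adds its own resistance, and the die temperature rises with the total. A toy sketch, with completely invented resistance numbers just to show the shape of the argument:

```python
def junction_temp(ambient_c: float, power_w: float, resistances: list) -> float:
    """Die temperature for a chip dissipating power_w watts through a
    chain of thermal resistances (degrees C per watt) in series."""
    return ambient_c + power_w * sum(resistances)

# Invented numbers: a 20 W GPU in a 35 C case interior.
soldered = junction_temp(35.0, 20.0, [0.5, 1.5])         # die-to-board + heatpipe/heatsink
modular  = junction_temp(35.0, 20.0, [0.5, 0.25, 1.5])   # same, plus a connector interface

assert modular > soldered  # any extra interface runs the die hotter
```

Whether the extra interface matters in practice would depend on how small its resistance actually is, which is exactly the kind of thing we'd need an engineer (or the MXM spec's thermal section) to settle.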
Then again, Apple has very good engineers, who have proven their ability to squeeze tons of junk into absurdly small spaces.
By making compromises! Compromises such as soldering instead of socketing, getting rid of expansion bays (as a Wall Street PB owner, you probably remember those), downgrading from a dual-layer drive to a single-layer one, etc. Lots of compromises that lots of would-be customers frown upon.
The two primary problems I've experienced with gaming are:
A) A tendency to lose touch with reality. I experienced this firsthand when I was stationed in Korea in the Army. With nothing else to do really (unless I wanted to go get drunk), I spent countless hours behind the computer, engrossed in a fantasy world. It gets very hypnotic, to the point that if I now hear certain music that I played often while gaming, I remember with an almost obscene vividness the FANTASY WORLD that I was in at the time. No recollection at all of the actual physical location, only the artificial one. It's kind of spooky. I suspect that may be similar in some way to what many Vietnam-era veterans experience due to severe PTSD. So that's the first.
B) Time. It's just in too short supply. If one honestly has no other aspirations in life, and has no purpose, then gaming is a fun way to whittle away the hours. In my experience though, after the point that I started acquiring some honest goals and ambitions for my future, I really REALLY began to regret the time that I was spending on things that weren't helping me accomplish them. It's painful to be actively holding yourself back from your goals and have full knowledge of it. I think it comes from a sense of inertia in our lives: the feeling that it's better to stick with old habits than to work really hard psychologically on ourselves (or physically; ever try to lose 30 pounds?) to establish a new habit (like studying chemistry in the dorm room instead of the bar) that doesn't provide an immediate sense of gratification.
OK, back to the dorm-room chemistry! (Geez! Don't you HATE dorms?)
You played videogames during the Korean War?!
Was this some prehistoric Pong prototype or are you some sort of Time Traveler?!
Peddle your witchcraft somewheres else!
You played videogames during the Korean War?!
Was this some prehistoric Pong prototype or are you some sort of Time Traveler?!
Peddle your witchcraft somewheres else!
lmao
Hmm, maybe I never left the fantasy world?!
Actually, I was in Korea from November 2000 to November 2002.
Although, I DO have a really old RadioShack Pong-type game. You plug it into the TV and then turn knobs. It has a little speaker that goes *blip!* 8)
*edit* And that's "World of Witchcraft," thank you very much. ;-D <jab>
As far as I know, a soldered chip can be cooled better than a socketed one. The heat is simply transferred a lot easier.
But I'm not a physics guy.
By making compromises! Compromises such as soldering instead of socketing, getting rid of expansion bays (as a Wall Street PB owner, you probably remember those), downgrading from a dual-layer drive to a single-layer one, etc. Lots of compromises that lots of would-be customers frown upon.
Actually, I think we'd need an engineer to figure that one out. I don't see why a heat pipe couldn't be used on the graphics module. But we have only conjecture on that subject, I think.
As far as compromises go, very true. And yes, I like my expansion bays. You're right, that's definitely the direction they've been heading, and now that I consider it, additional upgrade options do lead away from that KISS principle they've been following.
As far as I know, a soldered chip can be cooled better than a socketed one.
Just as an afterthought to clarify the discussion, remember that MXM GPUs are on a daughterboard, rather than being socketed like a desktop processor. I'll give you the benefit of the doubt though and assume that's what you meant.