Can a MacBook reliably push a 24" display?

Posted in Genius Bar · edited January 2014
I got tired of waiting for Apple to introduce new ACDs with more inputs, so I picked up the Gateway FPD2485W this past weekend. Overall the monitor is great, but I have been noticing some performance issues recently. I am running the display via DVI, with the MacBook's lid closed, using the wireless Mighty Mouse and keyboard. I tend to leave my MacBook on all the time, letting it go to sleep when I close the lid; since I've had the monitor hooked up, I put it to sleep manually when I go to work. I am also running a 500GB Western Digital My Book hard drive via FireWire, if that makes a difference.

Everything worked great for the first few days, but eventually I started having performance issues. I get the spinning beach ball from time to time, and I have had problems with iTunes and iPhoto not closing. iTunes tends to be open all the time; I use it to watch video podcasts as well as listen to music. So my question is, will the MacBook's integrated video allocate more than 64MB of RAM, if needed? My MacBook has 2GB of RAM, so there should be plenty to go around. I do tend to have dozens of tabs open at once in Safari when I'm using NewsFire. Any suggestions? Thanks.
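For scale, here is a back-of-the-envelope sketch (illustrative only; it assumes 32-bit color and counts just the two single-buffered framebuffers, ignoring double buffering, textures, and video surfaces) of how little of a 64MB shared-memory allocation the plain framebuffers would consume:

```python
# Rough framebuffer arithmetic (illustrative, not Apple's actual
# driver behavior): how much of a 64 MB shared-memory allocation
# would the two desktops alone consume?

BYTES_PER_PIXEL = 4  # 32-bit color

def framebuffer_mb(width, height, bytes_per_pixel=BYTES_PER_PIXEL):
    """Memory for one framebuffer at the given resolution, in MB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

internal = framebuffer_mb(1280, 800)    # MacBook's built-in panel
external = framebuffer_mb(1920, 1200)   # 24" display at native res

print(f"Internal 1280x800:  {internal:.1f} MB")
print(f"External 1920x1200: {external:.1f} MB")
print(f"Both together:      {internal + external:.1f} MB of 64 MB")
```

If these rough numbers hold, ordinary desktop drawing shouldn't come close to exhausting the allocation; video playback surfaces and paging under memory pressure are likelier culprits for the beach balls.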

Comments

  • Reply 1 of 10
    Marvin Posts: 15,322 moderator
    I find the GMA slows down at 1600x1200 resolution and that's why I keep my 17" CRT at 1280x1024. Apple should have used better cards in these machines.



    Can you drop down to 1280x1024 (or the nearest widescreen equivalent) to see if you still experience the performance issues? If there's no difference, you can look for the problem elsewhere.
  • Reply 2 of 10
    Messiah Posts: 1,689 member
    Quote:
    Originally Posted by Marvin View Post


    Apple should have used better cards in these machines.



    The only major differentiator between the MacBook and MacBook Pro families is the graphics card. If you want to drive external monitors at high resolutions, go for the MacBook Pro.



    IMHO the fact that the MacBook can drive a 23" ACD at all is above and beyond the call of duty.
  • Reply 3 of 10
    Messiah Posts: 1,689 member
    Quote:
    Originally Posted by Galley View Post


    Any suggestions?



    Are you running any Rosetta-dependent apps?
  • Reply 4 of 10
    Galley Posts: 971 member
    Quote:
    Originally Posted by Messiah View Post


    Are you running any Rosetta-dependent apps?



    I don't think so. I was gonna try shutting down the MacBook at night to see if that helps. It seems to work OK after a reboot.
  • Reply 5 of 10
    sunilraman Posts: 8,133 member
    Quote:
    Originally Posted by Galley View Post


    ....So my question is, will the MacBook's integrated video allocate more than 64MB of RAM, if needed?...



    I don't think it can. The MacBook seems to be pretty darn snappy with 2GB of RAM running Intel-native apps, and even running Parallels-WinXP2Pro on top of that... I haven't tried screen spanning to a huge display, though... Given that the droplet effects, the RSS 3D visualizer screen saver, and 720p movies all run pretty smoothly, I'm impressed with the GMA 950, or with the video drawing in OS X 10.4.8 on Intel.
  • Reply 6 of 10
    Gon Posts: 2,437 member
    I run a 24" Dell with an iBook G4. In my understanding, the iBook's 32MB VRAM gets split in half when running an external display, so the 24" only has 16MB in use.



    Desktop is fine at 1920x1200.



    Lower-res (like 480p) video won't play fullscreen scaled to that resolution, though it will play fullscreen scaled to 1680x1050. Don't get me wrong: I don't expect a mid-speed G4 to give me smooth video scaled to the full HD resolution, but the picture doesn't appear at all; a mess of green triangles comes up instead. I blame lack of VRAM. If it were the processor, I'd at least get a full frame of video now and then, and VLC would tell me the performance is insufficient; definitely no green triangles.



    This problem wouldn't exist if Apple hadn't deliberately crippled the iBook hardware at a time when they were unable to produce a decent 12" PowerBook. What they did was ship crippled and non-crippled versions of basically the same hardware in different enclosures.



    Far from being "above and beyond the call of duty", I think being able to run up to 1920x1200 resolution properly is a basic requirement for any computer made in the last two years. If an Apple computer did not manage to do it, I'd regard it as an abysmal failure.
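Gon's lack-of-VRAM theory can be sanity-checked with rough numbers. This is illustrative only: it assumes the desktop framebuffer stays resident plus a single 32-bit video surface at the fullscreen output size, which is a simplification of how QuickTime or VLC actually allocate overlay surfaces:

```python
# Rough VRAM budget for the iBook scenario above (illustrative):
# 16 MB of VRAM available for the external display, holding the
# desktop framebuffer plus one 32-bit fullscreen video surface.

MB = 1024 * 1024

def surface_mb(width, height, bytes_per_pixel=4):
    """Memory for one 32-bit surface at the given size, in MB."""
    return width * height * bytes_per_pixel / MB

desktop = surface_mb(1920, 1200)  # desktop framebuffer, always resident

for w, h in [(1920, 1200), (1680, 1050)]:
    video = surface_mb(w, h)      # fullscreen video surface
    total = desktop + video
    fits = "fits" if total <= 16 else "does NOT fit"
    print(f"Fullscreen video at {w}x{h}: {total:.1f} MB of 16 MB ({fits})")
```

Under these assumptions, a 1920x1200 video surface on top of the desktop just overshoots 16MB while a 1680x1050 one just squeezes in, which matches the behavior Gon describes.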
  • Reply 7 of 10
    Chucker Posts: 5,089 member
    Quote:
    Originally Posted by Gon View Post


    Far from being "above and beyond the call of duty", I think being able to run up to 1920x1200 resolution properly is a basic requirement for any computer made in the last two years. If an Apple computer did not manage to do it, I'd regard it as an abysmal failure.



    When, in 2007, many laptops still ship without DVI, effectively making 1920x1200 useless (even at 1680x1050, VGA output is already low-quality; beyond that, you can pretty much forget about it), your demand seems ridiculous.
  • Reply 8 of 10
    Gon Posts: 2,437 member
    Quote:
    Originally Posted by Chucker View Post


    When, in 2007, many laptops still ship without DVI, effectively making 1920x1200 useless (even at 1680x1050, VGA output is already low-quality; beyond that, you can pretty much forget about it), your demand seems ridiculous.



    If I understand correctly, what you're saying here is that the error introduced by going through VGA is worse than the difference between 1680x1050 and 1920x1200. And that's just for a CRT. On LCDs the question of native resolution comes into play, so the error from VGA would need to be worse than both the drop in resolution and the upscaling from the lower-resolution signal to the native one. Sounds fishy.



    Just to make sure, I just now viewed a 2560x1920 photograph in full screen using Preview, first at 1920x1200 and then at 1680x1050, both over VGA.



    I doubt Preview uses the best possible scaling, as that would take more time; I'd get better results if I scaled the large photo to both resolutions in GIMP first. Still, the difference between the displayed pictures is quite massive.



    But what do I know? I'm only using this technology every day.
  • Reply 9 of 10
    Chucker Posts: 5,089 member
    Quote:
    Originally Posted by Gon View Post


    If I understand correctly, what you're saying here is that the error generated from going through VGA is worse than the difference between 1680x1050 and 1920x1200 resolutions. And that's just for a CRT.



    There's such a thing as a CRT with 1680x1050, let alone 1920x1200? Virtually all CRTs I know of aren't widescreen.



    Quote:

    On the LCD's the question of native resolution comes into play, so then the error from VGA would need to be worse than the drop in resolution and upscaling from the lower resolution signal to the higher. Sounds fishy.



    What I'm saying is that the conversion from digital (your graphics card) to analog (VGA) and back to digital (the screen) is lossy, twice. This is negligible on a resolution like 1280x1024, but on 1680x1050, it becomes noticeable, and on 1920x1200, it becomes downright ugly.



    Quote:

    Just to make sure, I just now viewed a 2560x1920 photograph in full screen using Preview, first in 1920x1200 and then on 1680x1050 over VGA.



    And what are you trying to say?



    Quote:

    But what do I know? I'm only using this technology every day.



    You're using a high-resolution widescreen monitor with an analog connector every day?



    Wow.
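Chucker's point about analog conversion getting worse at higher resolutions can be sketched with pixel-clock arithmetic. This is a rough estimate only: it approximates VESA blanking overhead as a flat 30% rather than using the actual timing standards, but the trend is what matters: the higher the pixel clock, the less time the DAC and cable have per pixel, so analog imperfections smear more detail.

```python
# Rough pixel-clock estimate (illustrative): a VGA signal carries one
# pixel per clock tick, and blanking intervals add roughly 30% overhead
# on top of the visible pixels. Higher clocks leave less time per
# pixel, so cable and DAC noise degrade the picture more.

BLANKING_OVERHEAD = 1.3  # crude approximation of VESA timing overhead

def pixel_clock_mhz(width, height, refresh_hz=60):
    """Approximate pixel clock for a given mode, in MHz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h} @ 60 Hz: ~{pixel_clock_mhz(w, h):.0f} MHz")
```

The estimated clock climbs from roughly 100 MHz at 1280x1024 toward 180 MHz at 1920x1200, which is consistent with Chucker's claim that VGA is fine at 1280x1024, noticeable at 1680x1050, and ugly at 1920x1200, and with Gon's counterpoint that "ugly" is still usable.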
  • Reply 10 of 10
    Gon Posts: 2,437 member
    Quote:
    Originally Posted by Chucker View Post


    There's such a thing as a CRT with 1680x1050, let alone 1920x1200? Virtually all CRTs I know of aren't widescreen.



    Did I say there was? Try to see the forest for the trees.
    Quote:

    What I'm saying is that the conversion from digital (your graphics card) to analog (VGA) and back to digital (the screen) is lossy, twice. This is negligible on a resolution like 1280x1024, but on 1680x1050, it becomes noticeable, and on 1920x1200, it becomes downright ugly.



    And what I'm saying is that it's not downright ugly when it's still far better than 1680x1050. If the native resolution weren't supported, I'd have a much worse picture.



    Would I want DVI? Sure. Does VGA work? Sure.
    Quote:

    You're using a high-resolution widescreen monitor with an analog connector every day?



    Wow.



    Yeah, thanks to Apple for screwing up the iBook. I'd upgrade to a headless desktop, but Apple doesn't have a midrange model. I've been waiting a year or two to see if they'd maybe come out with one, but no. I'll probably get a PC in a couple of months.