GT120 DVI->HDMI not showing native resolution
Hello everyone,
I'm having a lot of trouble with my new Mac Pro and its GT120 card driving a 24in LG L246WP LCD. The monitor only has HDMI, so I bought a DVI-to-HDMI cable, but OS X only shows HDTV resolution choices, and Windows in Boot Camp cuts off half of my desktop when set to its native 1920x1200.
To get around it I had to buy SwitchResX to force it to use something other than 1920x1080, etc. To make it even more confusing, I had this very LCD hooked up to an iMac for two years in extended mode (using the iMac's ATI 2600 card) and never had a single issue in OS X or Boot Camp.
Any help much appreciated.
Comments
I have the same monitor. Some drivers on some OSes just don't like it. It appears to be completely incompatible with Nvidia's Linux drivers, for example: as soon as the driver loads, the monitor goes into power-save mode and won't wake up.
VGA always works, of course. But it looks like crap at 1920x1200.
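On the Linux side, a common workaround when a driver mistrusts a panel's EDID like this is to tell the Nvidia driver to ignore the EDID and supply the timings by hand in xorg.conf. This is only a sketch under assumptions (the Identifier names are made up, and the modeline is the standard CVT reduced-blanking timing for 1920x1200 at 60 Hz); I haven't verified it against this exact panel:

```
# Hypothetical xorg.conf fragment -- identifiers are placeholders
Section "Monitor"
    Identifier "LG-L246WP"
    # CVT reduced-blanking modeline for 1920x1200@60
    Modeline "1920x1200" 154.00 1920 1968 2000 2080 1200 1203 1209 1235 +hsync -vsync
EndSection

Section "Device"
    Identifier "nvidia-card"
    Driver "nvidia"
    # Stop the driver from trusting the monitor's HDTV-flavored EDID
    Option "UseEDID" "false"
    Option "ModeValidation" "NoEdidModes"
EndSection

Section "Screen"
    Identifier "screen0"
    Device "nvidia-card"
    Monitor "LG-L246WP"
    Option "metamodes" "1920x1200 +0+0"
EndSection
```

If the rest of the card/cable path is healthy, `xrandr` should list the forced 1920x1200 mode after restarting X.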
Ugh, I really don't want to shell out cash to replace a working LCD (and a nice 8-bit MPVA display at that) JUST to get my crap GT 120 to play nice. A voice in my head told me "don't buy an HDMI-only LCD," but I didn't listen.
I guess another option is to shell out money for an ATI 3870/4870 and pray that works, considering I never had an issue with this monitor on my iMac and its ATI 2600 PRO card.
Oh, and yes, VGA works fine in both OS X and Vista/Win7, but I agree it looks blurry and unusable.
Has anyone out there used the 4870 with a DVI-to-HDMI cable and had success?
I would love to be stuck with this problem. Text I could read without a struggle! I am curious: if I were to purchase a TV or HDMI-compatible monitor, say 20 inch, could I actually get it to work natively at 720 or 768 vertical resolution? I have a 20-inch iMac, and the difference between 1050 and something in the 700s would be huge!
Trust me, I'm not lucky. I cannot get Boot Camp to recognize any real resolution (the screen gets chopped in half), and in OS X it refused to show the native 1920x1200, so I had to pay $20 more just to get it to work with my LCD.
I didn't mean to make light of your dilemma. The way things are going, I may have to pitch my 20 inch iMac and find another way. My right eye, the only one that really works, gets red and sore from the struggle I endure trying to deal with the tiny print on my screen. If I could purchase a cheap 19 inch TV and use the native 720 res, life would be much better. Maybe I can find an eMac with low miles on it.
I ask because I have the thing hooked up to a desktop PC using the "official" Nvidia driver and I've never had a problem in Windows, only other OSes.
They're rare, but you might be able to find a TV with a DVI port.
Go to http://forums.anandtech.com/ and search for your monitor; there are posts about it. All Windows stuff, but the problem is similar. It seems the LG wants to pretend it is an HDTV when paired with Nvidia, even with the DVI-to-HDMI adapter. In Windows, the official GeForce 8800 GTS driver has a checkbox labeled "Treat as HDTV," and it's checked by default. Rather than insert the picture, here is the link:
http://img62.imageshack.us/img62/112...nineboxma4.jpg
Sadly, this is of no help, right? You may be right in your previous post: either a new monitor or a new video card. You say the Mac Pro is new. If AppleCare wanted to do the right thing, they would let you get a different video card, since Nvidia is the problem and Apple put it there.
I actually followed a Windows hack I found on HardOCP: you start installing NVIDIA's newest driver package, stop it once it unzips everything, then hand-edit the .inf file to tell the driver to stop treating the display as a TV. That worked.
My issue is that I have found no info for the Mac side. I'm using that SwitchResX software and I'm still getting flicker/refresh issues even when I enter the factory timings.
I did hear that 10.5.6 breaks DDC/CI support and that 10.5.7 is supposed to restore it; I'm just hoping it comes out very soon!
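For the Mac side, the era-appropriate equivalent of the Windows .inf hack is a display override file, which OS X consults before trusting the panel's EDID. Below is a minimal, hypothetical sketch; the vendor ID 1e6d is LG's usual one, but the product ID 563f is a placeholder, so both should be read from IORegistryExplorer (or `ioreg`) for the actual panel:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<!-- Hypothetical file at:
     /System/Library/Displays/Overrides/DisplayVendorID-1e6d/DisplayProductID-563f
     (both IDs must match the real panel; the ones above are placeholders) -->
<plist version="1.0">
<dict>
    <key>DisplayProductName</key>
    <string>LG L246WP (forced PC timings)</string>
    <!-- A full fix would also carry a patched EDID (HDTV detailed timing
         replaced with 1920x1200@60) as data under an IODisplayEDID key -->
</dict>
</plist>
```

Tools like SwitchResX generate essentially this kind of override under the hood, which is why editing the override by hand can reproduce (or debug) what the utility does.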
Thanks for the info. I did contact Apple twice, and they just said, "Unfortunately Apple doesn't support the use of non-Apple adapters, and the new GT120 doesn't support HDMI output even through an adapter," which, without saying it outright, told me to get a new LCD.
I found that, too. I'll have to use it as soon as I get a new graphics card and have to update the driver on my PC. Damn this monitor (and it's a very nice monitor) for making computers think it is a television.