Can't get native resolution



Varmint260
April 10th, 2009, 12:56 AM
Hey, guys. I got a new video card for cheap (not high-end, but an awesome deal). I disabled the on-board graphics, installed the card, and then the drivers. I played some games at full native resolution for an hour, then shut the computer off. When I turned it back on, my computer suddenly decided it doesn't support my native resolution. It tells me my native resolution is either 1440x900 or 1600x1200, and I have no idea why. I forced the NVidia drivers to run 1680x1050, but every single game I have still won't switch to the right resolution. The only ones I can force into the right resolution are the ones that work with the -vidmode command. I've got the correct drivers for the monitor and the latest drivers from NVidia. Any ideas what I can do to fix this?
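
For reference, by "the -vidmode command" I mean a switch added to the game's shortcut target. In the game I use it with, it takes width, height, and refresh rate, so it ends up looking something like this (the executable name here is just a placeholder, and the exact syntax varies from game to game):

game.exe -vidmode 1680,1050,60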

legionaire45
April 10th, 2009, 01:32 AM
Try reinstalling the drivers?

Varmint260
April 10th, 2009, 06:45 AM
I've reinstalled the drivers for both the GPU and the monitor (in several different orders), and I've tried setting "custom resolutions," which did nothing. If it were something simple like that, I'm sure I'd have gotten it by now. It's like either Windows or NVidia's drivers suddenly decided I don't have the monitor I have.

itszutak
April 10th, 2009, 01:25 PM
What cable are you using? That's had an impact on what resolutions work for me in the past.

A VGA cable with a DVI converter was the only one that worked on my old, cheap monitor (which, by the way, died about six months after I bought it).

Varmint260
April 14th, 2009, 12:54 AM
I have a VGA cable running from a VGA monitor to an adapter and into a DVI port on the card.

Anyhow, I think it's working now. Not sure how, mind you. I made sure the cable connections were tight (I guess a bad cable connection could cause Windows to detect the wrong resolutions?) and I moved the connector to the DVI port closest to the bottom of the card.

Thanks for the suggestions, everyone! Question: SHOULD it matter which DVI port I connect my monitor to?

itszutak
April 14th, 2009, 08:50 AM
I have a VGA cable running from a VGA monitor to an adapter and into a DVI port on the card.

Anyhow, I think it's working now. Not sure how, mind you. I made sure the cable connections were tight (I guess a bad cable connection could cause Windows to detect the wrong resolutions?) and I moved the connector to the DVI port closest to the bottom of the card.

Thanks for the suggestions, everyone! Question: SHOULD it matter which DVI port I connect my monitor to?
On a properly made monitor, no.

But there's always a chance that a flaw passed inspection, leaving one port more likely to work than another.

Dr Nick
April 14th, 2009, 08:50 AM
If you have problems with it again, try using the nVidia Control Panel to force a custom resolution.


<.<


>.>


...ya damn varmint!

Varmint260
April 14th, 2009, 11:05 AM
On a properly made monitor, no.

But there's always a chance that a flaw passed inspection, leaving one port more likely to work than another.

I mean the DVI ports on the card, right? But it doesn't matter, as long as my monitor works.

And to Nick: I tried forcing a custom resolution, and it works quite well for Windows applications but not for gaming. The correct resolution wouldn't show up in the games' settings, and not all of my games accept .ini settings edits or -vidmode command lines. But since everything seems to be working now, I'm going to leave all of that alone.
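
In case it helps anyone searching later: by ".ini settings edits" I mean opening the game's config file in Notepad and typing the resolution in by hand. The file name and key names differ from game to game; the ones below are only placeholders to show the idea:

[Display]
; example key names only -- check your own game's config file for the real ones
ResolutionWidth=1680
ResolutionHeight=1050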

Thanks for the info, everyone!

itszutak
April 14th, 2009, 10:03 PM
I mean the DVI ports on the card, right? But it doesn't matter, as long as my monitor works.

And to Nick: I tried forcing a custom resolution, and it works quite well for Windows applications but not for gaming. The correct resolution wouldn't show up in the games' settings, and not all of my games accept .ini settings edits or -vidmode command lines. But since everything seems to be working now, I'm going to leave all of that alone.

Thanks for the info, everyone!
The card ports? I'm not sure, but I do know that my old nVidia card (running two monitors) used one port for the primary monitor and one for the secondary.

Warsaw
April 14th, 2009, 11:04 PM
Mine does that. Looking at the card, the right port is the primary and the left port is the secondary.

Uninstalling the monitor drivers might also fix it... most monitors these days are plug-and-play, so the monitor drivers could be conflicting with the video drivers somehow.

Varmint260
April 14th, 2009, 11:50 PM
Okay... looking at my card, I'm plugged into the left port. I started with the right port, and it worked the first time I started the computer after installing the drivers, but the second time it screwed up. I switched to the left port and it's been working fine ever since. I'll experiment with it some more another time. Maybe when I switched ports I just made a bigger effort to get a secure connection, in which case it wouldn't matter which port I use.

Amusing thought... maybe I'm just a huge n00b, 'cause this is my first PCIe x16 video card! The last video card I installed was a Radeon 9250 (plain PCI) in my mom's old Dell. Only one VGA port to worry about, thank goodness! Now there are two DVI ports and an S-Video output, and I'm completely and utterly lost! Nah, that can't be it.